blendedreality


Meta Quest 3 – seems very good

Firstly, yes, it looks like it's been a while since I posted on here, but that's the challenge when work life and social/hobby life are one and the same thing. I have to save all the really good stuff for my professional output. However, I bought myself what is my 16th VR or AR headset, the Meta Quest 3, and as a long-time user of such kit I wanted to share my personal (and only personal) thoughts about this bit of kit.

Quest 3
Quest 3 box
Quest 3
Headset and controllers
Quest 3
Looking like the Aliens from Toy Story 3 with 3 eyes wearing a Quest 3

Our Quest 2 has been kept in our kitchen, ready for anyone in the family to use, since it arrived. The kitchen is the only place with just about enough free space to wave arms and make the odd step safely in VR. Though, as I may have explained before, there is a dent in the fridge from a side fist I threw playing Superhot VR when I was training for my 2nd Degree Black Belt in Choi Kwang Do. The Quest 2 went straight to VR, obscuring the world (that was its job after all), but the guardian that we manually drew around the kitchen (often repeating that process) tended to keep us in bounds. Picking up the controllers and getting them on the right hands in the right way was always a bit fiddly, like plugging in a USB-A connector and getting it wrong the first two times despite there only being two ways to do it.

The Quest 3, on the other hand, starts with colour pass-through. You put it on (over glasses too, as they fit in the headset) and just see a slightly blurry version of where you actually are; it's much quicker to get in and stand in the right place. (The Quest 2 has pass-through, but it's very disorientating and B&W.) The new hand controllers are much easier to see and to pick up the right way, as they don't have the massive halo loops of plastic around them for sensors. The headset also does the guardian and scans around for things on its own; you don't have to (though you can) draw a line around yourself, which on the Quest 2 seemed like some sort of weird religious or mystical incantation involving a border of salt.

The Quest 3 is also lighter and smaller than the 2, or at least feels like it; I haven't weighed them. This makes the whole experience of just getting going much less faff and bother. It is a similar leap to the one from the days when we had to place external sensors and try and line everything up before playing anything.

Going into a VR game or application is pretty much the same across headsets, though it is noticeably crisper and faster/flashier, as all new kit tends to be. Games in particular are now being upgraded to have richer visuals, which in turn will start slowing it all down again. I plumped for the lower spec memory of 128GB as mostly the family plays Beat Saber and Pistol Whip, but I now have some games that seem to be 13GB, so it will eat it up more quickly now.

The Mixed Reality (MR) elements of the Quest 3, blending the pass-through view of the world with digital elements anchored in that view, are the clever bit. This is augmented reality, but calling it mixed reality is just as accurate. The view you see is virtual: it's a camera showing you the physical world, and the headset is able to scan the environment and build a digital mesh of it, knowing what are walls, surfaces and objects in the view. It does this in the example first steps app, asking you to look around while it overlays what it considers the mesh of the world. It's very Matrix, and deliberately visual; there's not really any need to show the user other than to intrigue them. The demo app then causes a break in the ceiling, with bits falling on the floor, giving a view of a space environment and another planet in the distance. Through this hole drops a spaceship that lands (in my case on the kitchen floor). Being in pass-through makes it easy to walk around without bumping into physical things, as you can see them, so you can get a better look up into the hole in the ceiling. Over the course of the demo, tribble-like fluffy things break through the walls and run around your space; they drop behind things like the breakfast bar in our kitchen and you can't see them unless you go and look over. The creatures also run along the walls and other surfaces. It really is quite effective as a cartoon comedy demo of full AR. As this is not holding up a tablet or phone, the field of view and the turning of your own head, and hence view, is 100% natural.

The wonderful old painting application Vermillion has had some upgrades, so now in full MR the easel and paint brushes can be in your own room, and even better, you can hang your paintings on your physical wall and they will stay there if you want them to. You can see family walking into the room and talk to them, though it's a little odd for them to know whether you can see them or not, having been used to giving the VR headset a wide berth (as the fridge didn't, above) 🙂

This Quest 3 for the kitchen liberates my Quest 2 for my gaming PC, which in turn retires the tethered Rift S. It also means we can play multiplayer games with two of us in the house. The main account can share their apps (though only on one headset). I shared the Quest 3 apps and unshared on the Quest 2, but we were able (with me logged into the Quest 2 and Predlet 2.0 using his account on the Quest 3) to play a good few rounds of the VR Walkabout Mini-golf and some multiplayer Beat Saber. I say a good few rounds, but the battery life is pretty short on both headsets. This can be sorted by attaching a very long USB cable to the power outlet, but that sort of removes the untethered nature of the headset.

The Quest 3 and its pass-through is obviously a much cheaper version of what the Apple Vision Pro is aiming to do, but it's out first, it works, and it has a library of interesting things to engage with. Though the really new stuff like Assassin's Creed and Asgard's Wrath II (Remake) was not out at release, so it's more for playing VR things and the odd MR update at the moment. I say out first, but pass-through has been the mission of the commercially available Varjo headsets for a while.

One other thing pass-through allows for is being able to see your phone (sort of, as it's a bit warped). This is very useful, as trying to buy things in the store in VR doesn't work; it needs the phone app to do all the verification. This used to mean taking the headset off; now it means double tapping the headset for pass-through and finding your phone. That's a silly use case, as it should just work in the environment or represent the phone and its OS in the environment, but that's payment rails for you.

In short, it's good, worth it and I like it. I am looking forward to more proper AR/MR apps and experiences, whilst enjoying the brand new ones like Samba De Amigo that just launched (again not new, as we played that a lot on the Sega Dreamcast with a set of maraca controllers way back)… and pose… Happy memories reborn 🙂

EyeBlend better than an Apple Vision Pro – still

In 2016 I wrote Cont3xt, the follow-up to Reconfigure, and part of that was an upgrade our lead character gets. She has access to an API that can manipulate the physical world like the virtual; she builds that up into a Unity application for her smartphone in the first book. In the second she gets an EyeBlend headset, state of the art (and still just a little ahead of Apple and co's attempts). So she ports the Unity app and upgrades the messaging system for this high bandwidth device and to enhance the experience. Rather like all the developers are doing right at the moment. Life imitates art 🙂 Read Reconfigure first, but this might be a topical time to check out Cont3xt whilst you wait for the Apple demo in a local store next year. (Just £0.99 each for these adventures 🙂 )

“An EyeBlend? They are not on general release yet? How did you get that?” Her eyes lit up wildly as Roisin took in the news.

“Oh you know, I have a few friends who hack for a living.” Alex replied with an air of comedy superiority.

“You sure do! The EyeBlend is an awesome piece of kit. It does full blended reality applications. It uses field effect projection with live object tracking!” She said enthusiastically. “They call it a hologram, though it really isn't as such. You can forgive that though as the effect is the same, apparently.” Roisin was on her tech turf again.

Digital humans evolve

Way back in the dim and distant era of 2009 I was exploring a lot of tools to help me build virtual environments with avatars and characters that could be animated, typically in Unity. 3D modelling is an art in its own right, and then rigging those models for animation and applying animations to them is another set of skills too. At the time I was exploring Evolver, which in the end was bought by Autodesk in 2011. A decade on and there is a new kid on the block from Epic/Unreal called MetaHuman. This cloud based application (where the user interface is also cloud streamed) runs in a browser and produces incredibly detailed digital humans. The animation rigging of the body is joined by very detailed facial feature rigging, allowing these to be controlled with full motion capture live in the development environment of Unreal. Having mainly used Unity, I found a lot of similarity in the high level workflow experience of the two, as they are the leading approaches to assembling all sorts of game, film and XR content. However, there was a bit of a learning curve.

I decided to generate a couple of characters and ended up making what to me feels like Roisin and the Commander from my Reconfigure novels. Creating them in the tool plays instant animations and expressions on the faces and is really quite a surreal experience. I installed Unreal on my gaming rig with its RTX gfx card and all the trimmings and set about seeing what I needed to do to get my characters working.

First there is an essential application called Quixel Bridge, which would have been really handy a decade ago, as it brokers the transfer of file formats between systems; despite standards being in place, there are some quirks when you move complex 3D rigged assets around. Quixel can log directly into the MetaHuman repository, and there is a specific plugin for the editor to import the assets to Unreal. Things in Unreal have a data and control structure called a Blueprint, a kind of configuration and flow model that can be used in a no-code (but still complex) way to get things working. You can still write C++ if needed, of course.

My first few attempts to get Roisin to download failed as the beta was clearly very popular. I only took a photo of the screen not a screen cap, a bit low quality but there is more to come.

Metahumans

However, eventually I got a download and the asset was there and ready to go. Unreal has a demo application with two MetaHumans in it, showing animation and lip synching and some nice camera work. Running this on my machine was a sudden rush to the future from my previous efforts with decade-old tech for things such as my virtual hospitals and the stuff on the TV slot I had back then.

Roisin?
Roisin from Reconfigure and Cont3xt in Unreal engine

The original demo video went like this

Dropping into the editor, and after a lot of shader compilation, I swapped Roisin in for the first MetaHuman by matching the location coordinates and moving the original away. Then in the Sequence controller, the film events list, I swapped the target actor from the original to mine and away we go.

This was recorded without the sound, as I was just getting to grips with how to render a movie rather than play or compile an application and then screen cap it instead. Short and sweet, but it proves it works. A very cool bit of tech.

I also ran the still image through the amusing AI face animation app Wombo.AI; this animates stills, rather than the above, which is animating the actual 3D model. I am not posting that, as they are short audio clips of songs and the old DMCA takedown bots tend to get annoyed at such things.

Now I have a plan/pipe dream to see if I can animate some scenes from the books, if not make the whole damn movie/TV series 🙂 There are lots of assets to try and combine in this generation of power tooling. I also had a go at this tutorial, one of many, that shows the use of live facial animation capture via an iPhone streamed to the MetaHuman model. I will spare Roisin the public humiliation of sounding like me and instead leave it to the tutorial video for anyone wishing to try such a thing.

Let's see where this takes us all 🙂

Amazing phone scanning and AR animation

In a break from the all day video calls and events I quickly downloaded Qlone to my iPhone. I also printed out the black and white scanning reference mat. This technique of a reference mat and a moving camera has been around for decades with varying degrees of success, so I was not sure how it would fare.

Superplastic

In about 1 minute of waving the phone around to fill the visual indicators on a virtual dome over this character, I had a scan of him: a full colour 3D model in the app. I was so impressed with the initial quality that I paid for the upgrade, which did this remarkable thing.

Bear in mind the figure is just a bit of plastic; I did not highlight any joints or add any context to the model, but magically it was just able to walk around and animate itself instantly.

It can export to a multitude of 3D formats, so I think I will have to get Unity back up and running again. Very impressed, well done Qlone.

Flying virtually to a real virtual place

We don't get to travel much at the moment, so being able to virtually visit places is fun. Of course, visiting places that exist, but don't exist, yet do, is even more fun 🙂

Today is a slightly odd day, as Predlet 1.0 is heading back to a face to face class at 6th form college for the first time since February and the 2020 lockdown. To avoid cramming into a train I need to drive her the 20 mins to college, so I am up bright and early; a good time to share this post, which I put on both Facebook and Twitter yesterday, but is something more memorable (for me anyway).

Microsoft Flight Simulator 2020 came out in mid August, and I have always been a fan of flight sims in general, but this one lets you fly anywhere in the world because it uses Microsoft's Bing photographic maps of the globe and combines that with a bunch of other data sources to create buildings, trees, traffic and even realtime weather for anywhere you choose to head to. The standard thing to do is jump in a plane and fly over your own house; we have all done it. You can pause the flight and jump off in a camera drone to explore the view, and it is quite fascinating. You can of course do this with things like Google Maps and Street View, and I have played with that in VR too, where I actually found myself walking out of the train station. However, the live digital twin of the world and the fun of actually flying different types of plane really give the flight sim something else. Hence, I thought I would visit somewhere that is real, but also fictitious or virtual, in one of my little planes.

When I wrote the first novel Reconfigure I set Roisin's adventure in a very British style edge-of-town/city environment with lots of access to countryside and some rural locations. Whilst the science of the coordinate system was consistent with itself, the entire place was a mix of images in my head of places near me or that I have visited. When it came to the follow-up Cont3xt I challenged myself to find a more exotic and remote location. As I wrote back then, I found an actual island for sale and used Google Earth to scout the location off the Brazilian coast. I used Street View to head to the local mainland town and wander around to get the feel of the place. Lots of the things in the book, despite it being sci-fi, are based on real things. It is worth noting Roisin has a real Twitter account (key in the original story), Marmite is almost a second character, she builds systems using Unity and even uses protocols like MQTT to solve some things. OK, the quantum computing part is a bit more off the chart, but it's all based on real theories extended a little to tell a good yarn.
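
For anyone curious what the MQTT bit actually means in practice, the pattern is publish/subscribe over named topics. A minimal in-process sketch of that pattern, in pure Python (the topic names are made up for illustration; a real deployment would use a broker such as Mosquitto and a proper client library):

```python
# Toy in-process sketch of MQTT-style publish/subscribe.
# Not the real protocol: just the topic-matching idea, including
# MQTT's '+' (single level) and '#' (remaining levels) wildcards.

class ToyBroker:
    def __init__(self):
        self.subscriptions = []  # list of (topic_filter, callback)

    def subscribe(self, topic_filter, callback):
        self.subscriptions.append((topic_filter, callback))

    def publish(self, topic, payload):
        # Deliver the message to every matching subscriber.
        for topic_filter, callback in self.subscriptions:
            if self._matches(topic_filter, topic):
                callback(topic, payload)

    @staticmethod
    def _matches(topic_filter, topic):
        f_parts, t_parts = topic_filter.split("/"), topic.split("/")
        for i, f in enumerate(f_parts):
            if f == "#":          # matches everything from here on
                return True
            if i >= len(t_parts):
                return False
            if f not in ("+", t_parts[i]):
                return False
        return len(f_parts) == len(t_parts)

broker = ToyBroker()
received = []
broker.subscribe("island/sensors/+", lambda t, p: received.append((t, p)))
broker.publish("island/sensors/temperature", "21.5")
broker.publish("island/lights/jetty", "on")  # not matched by the filter
```

The appeal for a story (and for IoT generally) is that the publisher and subscriber never need to know about each other, only about the topic.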

Along comes Flight Sim 2020 with the ability to fly anywhere in the world. The navigation map to select where to fly lets you click anywhere, but it works best if you have an airfield to start from; it feels more like a real journey. Whilst the main town in Cont3xt has an airport, there is also a grass landing strip on another island about 1 mile east of Roisin's island. I headed there.

Flying over my book's island http://cont3xtbook.co.uk
Landing strip

I took off, avoiding the trees at the end of the runway, and as I banked and levelled on the right course, there it was.

Flying over my book's island http://cont3xtbook.co.uk
heading to the island

I got quite a shiver down my neck and back as it took me back to the buzz of writing the books, the fun of the story and adventure, and the anticipation of the fun others might have reading it.

Flying over my book's island http://cont3xtbook.co.uk
Overview with the plane

The island in Cont3xt has a slightly larger building and a helipad, plus a large “container”, but of course those were the artefacts I added in the story.

I also then headed north, a trip that used boats in the book, toward the main town.

Flying over my book's island http://cont3xtbook.co.uk
Heading to town

And there it was: the jetty and the market shops that Roisin tries to stay out of trouble in while gathering supplies (including more Marmite).

Flying over my book's island http://cont3xtbook.co.uk
Town

I then flew back over the island again and landed (yes, actually landed, not crashed) the plane on the grass strip nearby.

Flying over my book's island http://cont3xtbook.co.uk
Heading back

The whole flight was awe-inspiring. I realise that it's more of an incredible personal experience for me as a writer, but it made me consider the sort of images and film-like sequences I saw in my head that I then wrote into the words. It made me smile doing it and writing this 🙂 I hope you get a kick out of it too.

Video trickery – lockdown fun

Lots more people are playing with/using video now with the lockdown. Every conference is a video conference. We are communicating with friends, family, customers and colleagues in a variety of ways and a stack of different contexts.

I sparked up FaceApp on my phone, something I had tinkered with a while back, and found that it now not only does mad crazy things with great accuracy to photos but can also do video for some of those effects. With still images, like this before and after example, or are they the other way around 🙂 you decide.

Before/after face app
Before/after faceapp
Before/after faceapp
Before/after faceapp?

The video though…. mind blowing stuff.

There is also lots of fun to be had with the Snap Camera virtual camera on PC and Mac, able to feed into any of the video conference tools you use by diverting your main camera to it first and then selecting the Snap Camera as your webcam.

Lots can be done with foreground and background replacement. I mean, who would turn up at a meeting in a Predator costume? (Like it's 2006.)

Snap predator

Like my Second Life avatar (below), I can't claim to have created this one, but it was great to see it on the list of Snapchat lenses. My SL avatar has a custom jacket (my original RL one) and a Feeding Edge T-shirt, so there are always tweaks to be made. Yes, SL is still there; yes, I still have 2 islands; yes, you should sign up/re-login.

Visiting Hursley and IQ

Back in the UGC world of SL in 2006 we had lots of interesting uses for logos and brands, like the Wimbledon Tennis flying towel and Wimbledon contact lenses (officially sanctioned, I should add). I dived into the Snapchat Lens Studio to see what I could quickly create, knowing full well there was a very rich 3D animation and code/no-code environment there that could be a career for some talented artists. I used one of the base models, tweaked it a bit and gave myself some shades with a subtle mention of 451 Research (now part of S&P Global Market Intelligence). Obviously that would all not fit on the glasses, but you get the idea.

Snapchat lens - lenses
451 snap camera lenses for video conferencing

I have always talked about putting logos and our own designs into game environments, like my Reconfigure car to advertise my novels, and before that, back in 2006, putting eightbar into games too (in the link above). Now I have a Feeding Edge logo in Animal Crossing: New Horizons. I am planning on doing a Reconfigure one too. Only a few pixels to work with, but it's fun to try. Our entire family is obsessed with this game at the moment; on top of all the other various games we play, it's a unifying experience as we tend to our islands, or live on one another's. (Hmmm, another SL-like reference.)

Animal crossing
Animal Crossing merch

Of course this lockdown is driving us all crazy, but having a bit of fun is essential. Back to Snapchat: it was good to see the Trolls movie had some official Snapchat lenses. When I tweeted this I was told I looked like Eddie Izzard 🙂

How's your day going?
Troll lens

The family joined in too, but I'm not posting them without permission 🙂

Having said that, all this tech is great, but I also felt the need to try a real life filter and broke out this costume left over from The Cool Stuff Collective TV show (I didn't wear it on the show, but at the wrap party). It was certainly a shock for my colleagues on video. The neighbours probably thought I had lost it too as I ran around the garden in it, doing a real life Animal Crossing manoeuvre.

Best snapchat filter ever, but it seems to have transitioned into the physical world.
Squawk

Stay safe everyone, have some fun with the tech and lark around a bit!

WFH – The past 20+ years – Metaverse anyone?

The recent pandemic situation is causing many offices to suddenly switch from co-location in a building to everyone working from home. 11 years ago, when I left IBM and started Feeding Edge, I was immediately a home worker, but in the preceding years, despite having an office and a base, most work was effectively remote. Being a metaverse evangelist in 2006 was all about people using virtual world and game technology to be able to communicate and understand one another at distance.

The projects in 1997, in the early days of the web, were built as a team in an office but were clearly very much about interacting with the world at large. For those of us in these industries, building and shaping the use of the internet we did not face a big bang switch over from office to home. We tended to grab and adapt or write whatever tools were available to work across physical and digital divides. That of course is a typical pioneering spirit and having been an evangelist for change I know not everyone is comfortable with that, and nor should they be.

A sudden switch from one thing to another is bad enough. I recently had to switch from Mac to Windows for work; it's annoying and irritating at best and very stressful at worst. Today we have the added external stress and personal worries about family and friends, as well as the future of businesses, layered on top.

The reality is, we do have the technology to communicate and share, we might not all have the right processes or social norms in place but the more we try, the better.

For many people teleconferences are the norm anyway; it's just that they usually commute to a desk or office in order to have those. However, many others will not. It is here we all need to be cognizant of how much the technology blocks and filters who we are. A voice-only teleconference is great for a presenter, or for the alphas who thrive on talking, but many people will not feel comfortable interjecting and cannot use body language to find a gap in a conversation. Video links are not really any better; for some, that strips them of their normal behaviour as they attempt to stare at the camera or try and get a level of eye contact akin to the physical world. Some people engage in text chat alongside audio and video, though often those who are better at talking do not pay attention to the text, and sometimes the text just becomes chatter in the background. There is an art and style to all this and people will find what suits them, just as in a physical situation we adapt to one another's signals.

It is these sorts of difficulties that led to exploring virtual worlds like Second Life. Some of that is shown in the album below of 600+ images from 2006-2009. This was a long and varied journey and one I have talked, written and presented on many, many times. Happy to share the tales; just ping me @epredator, or there are 11 years' worth of posts here, many related to metaverse concepts, or even read my Reconfigure series and get the gist of some of it in a modern sci-fi context.

Second Life History

It acted (and still does) as a teleconference in having shared sound in a space, but it has the presence of avatars to represent people, who get moved around, even doing simple things like sitting or standing or hovering next to people you know in a meeting. Instant visual feedback: who is there, where are they, what are they representing with the avatar. We used text chat, group and 1:1, whilst also having voice. We were able to bring artefacts into the environment dynamically, whiteboarding in a 3D space. All adding to the depth of communication. There are also, for those that can, ways to code and make virtual objects do something too. This is still entirely possible in Second Life and many other virtual environments. Now more than ever is the time for people to experiment.

This is also without even bringing in the tech and immersion of Virtual Reality, or the potential for Augmented Reality to allow us to blend our physical and digital presences in physical space. That of course requires people to don headsets and have the kit, but virtual world interaction does not need that; a laptop/tablet and an internet connection is all that is required. Carry on with the frustration of teleconference and video conference by all means, sometimes they are the ideal approach, but just consider what is causing the frustration of working remotely and investigate what might be a better way.

Good luck all, stay well.

Sci-fi coming true with AR and Quantum communication – Wake up World

Recently the fledgling AR industry got a bit of a boost with the announcement of Apple's ARKit. This bit of software layering is supported by Unity3D and Unreal, to name a few, and puts some AR elements directly into the OS of the iPhone and iPad. That of course is a bit of a problem for the many other AR toolkits that have been around for some time, but these things happen. Developers are already diving in and making interesting things, such as this Minecraft style tech demo.

The ability to position and accurately keep track of objects in the camera view of the world is a core part of what happens in my novels, including the “how to build stuff in Unity3D” parts of Roisin's inner dialogue. Of course, the bit where she gets to instrument the world is further off, but quantum communication is getting closer, and, well, that leaves just a little sliver of fiction left that powers the stories 🙂
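
The core trick behind keeping a virtual object pinned in the camera view is projecting its world position through the camera's pose every frame. A rough pinhole camera sketch of that idea (made-up numbers and a yaw-only camera for simplicity; this is not the ARKit API, which estimates a full 6-DoF pose from visual-inertial tracking):

```python
import math

def project_point(point_world, cam_pos, cam_yaw, focal_px, cx, cy):
    """Project a world-space 3D point into pixel coordinates for a
    pinhole camera at cam_pos, rotated cam_yaw radians about the Y axis.
    Returns None if the point is behind the camera."""
    # Translate into camera-relative coordinates.
    x = point_world[0] - cam_pos[0]
    y = point_world[1] - cam_pos[1]
    z = point_world[2] - cam_pos[2]
    # Rotate by -yaw so the camera looks down +Z.
    c, s = math.cos(-cam_yaw), math.sin(-cam_yaw)
    xr = c * x + s * z
    zr = -s * x + c * z
    if zr <= 0:
        return None  # behind the camera, nothing to draw
    # Perspective divide, then shift to the principal point (cx, cy).
    u = cx + focal_px * xr / zr
    v = cy - focal_px * y / zr
    return (u, v)

# An anchor 2 m straight ahead of the camera lands on the image centre.
print(project_point((0, 0, 2), (0, 0, 0), 0.0, 800, 640, 360))  # (640.0, 360.0)
```

As the camera pose updates each frame, re-running this projection is what makes the anchored object appear fixed in the physical world.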

These things add to the amazing amount of tech out there to keep track of and understand, something my IoT analyst day job keeps me very busy doing. Luckily (well, by design really) my research agenda incorporates AR and VR and whatever XR comes along, so I am still able to build on all these many years of being in the business of virtual. It is surprising how many companies are deep into it, but still keep it a little quiet, probably out of the same embarrassment or resistance we saw back in 2006 when showing Second Life as an integration platform with both people and real-world data; but oh, it had colourful graphics and game-like features, how could that be “serious”? It still makes me laugh to think of the resistance to it, as with the web, e-commerce and social media. The same is happening with blockchain, AI, IoT, autonomous vehicles. Lots of stuck-in-the-mud attitudes: it will never catch on… (a few years pass) oh look, my entire business has been disrupted… why didn't I listen…
Anyway, keep an open mind on tech and its interaction with society; it's not all version numbers and installations. People are in the mix and very much part of whatever ecosystem is forming. Why not read the sci-fi adventures whilst they are still fiction; look back in a few years and it will read like a documentary. (That's by design, not accident, BTW.)
Reconfigure is the place to start; there are some links on the right 🙂 Enjoy the future.

CES 2017 – Bucket list, tick

For many years I have seen the CES show appear in magazines, then TV and then of course all over social media. As a long time tech geek and early adopter I have always wanted to attend, but never been able to. In my corporate days getting approval for a train to London was a chore. As a startup I never had the time or money either, preferring to invest in gadgets like the Oculus Rift or paying for a Unity license so I could build things. On the TV show we talked about CES, and if we had gone to a 4th series it was on the cards.
This year, with my industry analyst role in IoT I was able to go. Of course as a work trip it was a bit different to just being able to take the show in.
CES 17
I had briefing after briefing with a bit of travel time in between for the 2 main days I was there. One thing that is not always obvious is just how big the show is. Firstly there is the Vegas convention centre with North and South Halls, which is bigger than most airports. It was so big that I only really got to visit the South Halls; the North Hall of cars, a motor show in its own right, eluded me. All the first day's meetings were around the South Halls. Day 2 was at the other areas of the show; the hotels have their own convention centres, and floors and suites also get rented out. The Venetian Sands, Bellagio and Aria all had lots going on, each as big as any UK show it seemed.
Walking around 9-10 miles a day at a trade show, and still not seeing everything, gives you an indication of the size.
#ces2017 steps
Again, I pretty much missed most of the expo floor due to meetings, but on the day I flew out I had an hour to pop back to the Sands main hall and see some things.
The split across the entire show, from giant corporate powerhouses to tiny startups with a single table, was amazing. I had assumed it was all the former, but the latter is heavily supported, and with Kickstarters and maker culture now mainstream it will continue to be really important.
One thing I was there to see was how much Augmented Reality was taking off, running parallel with the VR wave. There were a lot of glasses and of course the HoloLens and the industrial-focussed DAQRI Smart Helmet. It is still not really there as a consumer focus yet, though the Asus ZenFone AR, powered by Tango and Qualcomm, was announced but not on sale until later in the year (no date given), which may put true AR into people's hands.
#ces2017 day 1
HoloLens
#ces2017 day 1
DAQRI Smart Helmet

It was CES's 50th anniversary, which was fitting given I turn 50 this year too. It may not be the most exciting bucket list tick, but I have already done lots of mine, and need to refresh the list anyway. I am not sure that will include riding in this human size quad copter that you fly with a smart phone though!
#ces2017 day 1

I guess we best experience everything before these guys and their brethren take over.

#ces2017

Still, at least we can 3D print new parts for ourselves.

#ces2017

As you will see in this album, the whole place just becomes a blur of everything looking the same: lights, sound, people, attract loops etc. All very fitting for Vegas.

CES 17

@xianrenaud and I wrote a spotlight piece for 451 Research as a show roundup, CES 2017: connected, autonomous and virtual, which may end up outside the paywall, in case you do have access.

So that made a whirlwind start to this year. This time last year I had just published Cont3xt and was wondering what the next steps were going to be. This year I have stacks of IoT research and writing work to get on with, a 50th birthday not to get worried about, imminent wisdom tooth removal (yuk) and, all being well, a 2nd Degree/Il Dan black belt test in Choi Kwang Do. So onwards and upwards. Pil Seung!

Jersey Hackathon – Phone Breakathon

I had a trip to Jersey this weekend, sponsored by Jersey BCS, so that I could be an out of town judge at the #hackjsy game development event. The focus was for teams to build something game related in 36 hours. With my BCS Animation and Games specialist group hat/badge on, it made a lot of sense to go and see what was going on.
I also treated the trip as re-acquainting myself with travelling on business; the family have got used to me now being there all the time, so this was just a short first stint away. I also thought I would test out the new clothes for travel comfort. That test worked, but in a way I was not expecting. I have usually been wearing combat trousers, and my phone sits nicely in the leg pocket. Instead it was inside my new jacket/waistcoat arrangement. As I parked the car at the airport and hoofed my overcoat on with a hunch of the shoulders, my iPhone 6s Plus felt the urge to slide upwards out of the shiny new pocket and propel itself face down onto the floor. I knew it was not going to be too well, but I was surprised at just how smashed it made itself.
So much for not having a screen protector
It was completely unhappy with any sort of interaction. I couldn’t power it off with a slide either, but holding the power and home buttons together for a few seconds shut it down. The Jersey flight is only 40 minutes in a turboprop, but they don’t like phones being on. It is not so bad these days to be without a phone, as wi-fi is readily available if you have a laptop/pad etc., so I let home know I was not going to be texting, and Jersey know I was not going to be ignoring them if they called.
The hackathon was great though: 9 teams building stuff in all sorts of ways. JavaScript, Node.js, Python, Unity3D, Ruby, Stencyl and a Raspberry Pi all featured across the projects, and we had a hard time judging them down to 1st, 2nd, 3rd and the special WTF award.
Whilst there I got to talk to a lot of people from all over the island in different industries. It was great to catch up with the guys from vizuality, as they are making huge strides in installation experiences using VR, tracking users in a 10x10m space and providing headset visuals as they wander around.
I spent the Saturday hacking too. I looked a little at IBM Bluemix and its Unity API for text to speech, using my own book quotes to see if it could cope. They all still have trouble pronouncing Roisin though 🙂 I also then spent a bit more time on my Vuforia AR covers for the books. I decided that Reconfigure should have the variant of the block world view that Roisin sees and builds in her own Unity application.
Joining in with the spirit of #hackjsy
Then on Cont3xt I explored writing a scene changer, so that at certain intervals the models and view would swap. Initially I did that by toggling the image targets themselves, but that did not trigger them being re-detected, as Vuforia expects the same target to have the same content on it. However, swapping the game objects attached in the tree, turning them on and off, worked, just as the animation does. So I now have a little bit of authoring infrastructure that makes it easier to add multiple scenes and play through them.
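For anyone curious, that toggle-the-game-objects approach can be sketched as a small Unity C# behaviour. This is only a minimal sketch under my own assumptions; the script name, field names and interval are mine, not from the actual Cont3xt project. The image target stays active throughout, and only the child scene roots are switched on and off:

```csharp
using UnityEngine;

// Hypothetical scene cycler for an AR image target.
// Each "scene" is a GameObject parented under the tracked target;
// exactly one is active at a time, and they swap on a timer.
public class SceneCycler : MonoBehaviour
{
    public GameObject[] scenes;   // scene roots under the image target
    public float interval = 5f;   // seconds between scene swaps
    private int current = 0;

    void Start()
    {
        // Enable only the first scene to begin with.
        for (int i = 0; i < scenes.Length; i++)
            scenes[i].SetActive(i == current);
        InvokeRepeating(nameof(Next), interval, interval);
    }

    void Next()
    {
        scenes[current].SetActive(false);
        current = (current + 1) % scenes.Length;
        scenes[current].SetActive(true);
    }
}
```

Because the target itself is never disabled, Vuforia keeps its tracking lock while the content swaps underneath it.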
I was going to do something with the Leap Motion sensor too, but that fitted more with having an AR headset to interact with the book covers; with a broken phone and other judging work to do, I parked that one.
I also had a lot of conversations around IoT in various forms, and a bit of a chat about blockchain too. Jersey may only have 100,000 people on it, but there is a vibrant tech community there. It was a great trip, and the phone is now repaired (the Jersey shop wasn’t able to do the 6s Plus, but Apple Basingstoke did it in 1 hour). It also now has a proper case; I had avoided that for ages, never having broken the phone before. Rather than take a picture of the phone in a mirror to show the case, I sparked up the AR Unity app, put the Reconfigure picture on the phone, and then the Mac did its thing and rendered Roisin, holding her phone with a view of the world that she sees.
AR reaching out of the fixed phone + cover
Just to reiterate the loops within loops here: I am rendering an AR representation, using Unity3D and an iPhone, onto a digital version of the cover of the e-book that contains a story about Roisin discovering a way to see the world in terms of positions and labels, which she expands on by writing an application in Unity that runs on her iPhone. I will share this post on Twitter. The same Twitter (happy 10th birthday) that she uses to accidentally discover her new-found abilities when she types into the wrong window. Meta enough? 🙂
Anyway, well done everyone at #hackjsy: great organisation, great participation, great fellow judges and a great island. See you all again soon I hope.