augmented reality


Meta Quest 3 – seems very good

Firstly, yes, it looks like it’s been a while since I posted on here, but that’s the challenge when work life and social/hobby life are one and the same thing: I have to save all the really good stuff for my professional output. However, I bought myself what is my 16th VR or AR headset, the Meta Quest 3, and as a long-time user of such kit I wanted to share my personal (and only personal) thoughts about it.

Quest 3 box
Headset and controllers
Looking like the aliens from Toy Story 3, with three eyes, wearing a Quest 3

Our Quest 2 has been kept in our kitchen, ready for anyone in the family to use, since it arrived. The kitchen is the only place with just about enough free space to wave arms and take the odd step safely in VR. Though, as I may have explained before, there is a dent in the fridge from a side fist I threw playing Superhot VR when I was training for my 2nd Degree Black Belt in Choi Kwang Do. The Quest 2 went straight to VR, obscuring the world (that was its job after all), but the guardian that we manually drew around the kitchen (often repeating that process) tended to keep us in bounds. Picking up the controllers and getting them on the right hands the right way round was always a bit fiddly, like plugging in a USB-A connector and getting it wrong the first two times despite there only being two ways to do it.

The Quest 3, on the other hand, starts in colour pass through. You put it on (over glasses too, as they fit in the headset) and just see a slightly blurry version of where you actually are, so it’s much quicker to get in and stand in the right place. (Quest 2 has pass through, but it’s very disorientating and black and white.) The new hand controllers are much easier to see and to pick up the right way, as they don’t have the massive halo loops of plastic around them for sensors. The headset also sets up the guardian and scans around for things on its own; you don’t have to (though you can) draw a line around yourself, which on the Quest 2 seemed like some sort of weird religious or mystical incantation involving a border of salt.

The Quest 3 is also lighter and smaller than the 2, or at least feels like it; I haven’t weighed them. This makes the whole experience of just getting going much less faff and bother. It is a similar leap to the one from the days when we had to place external sensors and try to line everything up before playing anything.

Going into a VR game or application is pretty much the same across headsets, though it is noticeably crisper and faster/flashier, as all new kit tends to be. Games in particular are now being upgraded to have richer visuals, which in turn will start slowing it all down again. I plumped for the lower spec memory of 128Gb, as mostly the family plays Beat Saber and Pistol Whip, but I now have some games that seem to be 13Gb each, so that space will get eaten up more quickly now.

The Mixed Reality (MR) element of the Quest 3, blending the pass through view of the world with digital elements anchored in that view, is the clever bit. This is augmented reality, though calling it mixed reality is just as accurate. The view you see is virtual; it’s a camera showing you the physical world, and the headset is able to scan the environment and build a digital mesh of it, knowing which parts of the view are walls, surfaces and objects. It does this in the First Steps demo app, asking you to look around while it overlays what it considers the mesh of the world to be. It’s very Matrix, and deliberately visual; there is not really any need to show the user other than to intrigue them. The demo app then causes a break in the ceiling, with bits falling on the floor, giving a view of a space environment and another planet in the distance. Through this hole drops a spaceship that lands (in my case on the kitchen floor). Being in pass through makes it easy to walk around without bumping into physical things, as you can see them, so you can get a better look up into the hole in the ceiling. Over the course of the demo tribble-like fluffy things break through the walls and run around your space; they drop behind things like the breakfast bar in our kitchen and you can’t see them unless you go and look over. The creatures also run along the walls and other surfaces. It really is quite effective as a cartoon comedy demo of full AR. As this is not a case of holding up a tablet or phone, the field of view, and the turning of your own head and hence your view, is 100% natural.
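Under the hood the anchoring is mostly coordinate bookkeeping: the headset tracks its own pose in the room, so anything pinned to the scanned mesh can be re-expressed relative to the current view every frame. Here is a minimal sketch of that idea in plain numpy; it is purely illustrative, nothing to do with the actual Quest SDK, and the poses are made-up numbers.

```python
# Illustrative only: world anchoring as pose maths, not real headset SDK code.
import numpy as np

def pose(translation, yaw_degrees=0.0):
    """Build a 4x4 rigid transform from a translation and a yaw (Y axis) rotation."""
    t = np.eye(4)
    yaw = np.radians(yaw_degrees)
    t[:3, :3] = [[np.cos(yaw), 0.0, np.sin(yaw)],
                 [0.0,         1.0, 0.0],
                 [-np.sin(yaw), 0.0, np.cos(yaw)]]
    t[:3, 3] = translation
    return t

# Hypothetical result of the room scan: the demo spaceship is anchored to a
# point on the kitchen floor, expressed in world (room) coordinates.
world_from_anchor = pose([1.5, 0.0, -2.0])

# Two head poses as the wearer walks around the kitchen.
head_poses = [pose([0.0, 1.7, 0.0]), pose([1.0, 1.7, -1.0], yaw_degrees=30)]
for frame, world_from_head in enumerate(head_poses):
    # Re-express the anchored object in the headset's view space each frame.
    view_from_world = np.linalg.inv(world_from_head)
    view_from_anchor = view_from_world @ world_from_anchor
    print(f"frame {frame}: spaceship in view space at {view_from_anchor[:3, 3].round(2)}")
# The ship's world position never changes, so it appears glued to the floor
# however much you walk around it.
```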

The wonderful old painting application Vermillion has had some upgrades, so now, in full MR, the easel and paint brushes can be in your own room, and even better you can hang your paintings on your physical wall and they will stay there if you want them to. You can see family walking into the room and talk to them, though it’s a little odd for them, not knowing whether you can see them or not, having been used to giving the VR headset a wide berth (as the fridge didn’t, above) 🙂

This Quest 3 for the kitchen liberates my Quest 2 for my gaming PC, which in turn retires the tethered Rift S. It also means we can play multiplayer games with two of us in the house. The main account can share its apps (though only with one other headset). I shared the apps on the Quest 3 and unshared them on the Quest 2, but we were able (with me logged into the Quest 2 and Predlet 2.0 using his account on the Quest 3) to play a good few rounds of Walkabout Mini Golf in VR and some multiplayer Beat Saber. I say a good few rounds, but the battery life is pretty short on both headsets. This can be sorted by attaching a very long USB cable to a power outlet, but that somewhat removes the untethered nature of the headset.

This Quest 3 and its pass through is obviously a much cheaper version of what the Apple Vision Pro is aiming to do, but it’s out first, it works, and it has a library of interesting things to engage with. Though the really new stuff, like Assassin’s Creed and Asgard’s Wrath II (Remake), was not out at release, so it’s more about playing VR things and the odd MR update at the moment. I say out first, but pass through has been the mission of the commercially available Varjo headsets for a while.

One other thing pass through allows for is being able to see your phone (sort of, as it’s a bit warped). This is very useful, as trying to buy things in the store in VR doesn’t work; it needs the phone app to do all the verification. This used to mean taking the headset off; now it means double tapping the headset for pass through and finding your phone. That’s a silly use case, as it should just work in the environment, or represent the phone and its OS in the environment, but that’s payment rails for you.

In short, it’s good, worth it and I like it. I am looking forward to more proper AR/MR apps and experiences, whilst enjoying the brand new ones like Samba De Amigo that just launched (again not new, as we played that a lot on the Sega Dreamcast with a set of maraca controllers way back)… and pose… Happy memories reborn 🙂

EyeBlend better than an Apple Vision Pro – still

In 2016 I wrote Cont3xt, the follow-up to Reconfigure, and part of that was an upgrade our lead character gets. In the first book she has access to an API that can manipulate the physical world like the virtual, and she builds that up into a Unity application for her smartphone. In the second she gets an EyeBlend headset, state of the art (and still just a little ahead of Apple and co’s attempts). So she ports the Unity app and upgrades the messaging system for this high bandwidth device to enhance the experience, rather like all the developers are doing right at the moment. Life imitates art 🙂 Read Reconfigure first, but this might be a topical time to check out Cont3xt whilst you wait for the Apple demo in a local store next year. (Just £0.99 each for these adventures 🙂 )

“An EyeBlend? They are not on general release yet? How did you get that?” Her eyes lit up wildly as Roisin took in the news.

“Oh you know, I have a few friends who hack for a living.” Alex replied with an air of comedy superiority.

“You sure do! The EyeBlend is an awesome piece of kit. It does full blended reality applications. It uses field effect projection with live object tracking!” She said enthusiastically. “They call it a hologram, though it really isn’t as such. You can forgive that though as the effect is the same, apparently.” Roisin was on her tech turf again.

Guest Appearance – Games At Work dot Biz Ep 311

Hot on the heels of my BCS Animation and Games webinar on the games of 2020, I was delighted to be invited to pop along to my favourite podcast, Games At Work, as a guest to talk about my personal views on some of the things going on in the tech and games worlds, including AR and VR.

To hear us riff on a range of subjects, head over to the website and the show links, or look for Games at Work dot Biz in your favourite podcast repository.

I have had the honour of being on before; the last time was in June 2020, a mentally trying time that the podcast recording really helped with. It’s always a blast to record, and they kindly had me on years ago too, when I first published my sci-fi novels Reconfigure and Cont3xt. Given it is now on episode 311 you can tell this podcast is certainly not a fad or a flash in the pan, but a wonderfully produced and entertaining experience, despite my ramblings and book pitching. Enjoy.

Real to Real World Augmented Reality

As you may have noticed I think we can do more with Augmented Reality than simple data layers on top of the real world. The principles and technology of layering anything from anywhere make some things possible that can really help us.
An example of this concept is the problem of driving along the motorway behind a massive great lorry or van. You keep your distance, yet you have no real awareness of what’s happening just in front.

We already have an augmented reality of sorts in the rear view mirror, giving us situational awareness of what is behind.
So what if, whilst in your car, you could have an augmented reality forward view, a HUD that showed you what was in front of the obstruction using its view of the world? This is an augmented reality application, but not a traditional one. It is a magic lens, but it is real world to real world.
It is not without problems of course (how big, how much screen?), but it could be based on peer to peer networking between the vehicles. It also has the benefit that when you are in a traffic jam there is no reason not to zoom your view forward, hopping from car to car until you get to the problem in the road.
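Logically it does not need much more than each vehicle being willing to relay the frame it can see to the one behind it. A toy sketch of that hop-forward idea follows; the vehicle class, the text "frames" and the whole chain are invented purely for illustration, not any real vehicle protocol.

```python
# Toy sketch only: hopping a forward camera view along a chain of vehicles
# that share frames peer to peer. Nothing here is a real vehicle protocol.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Vehicle:
    name: str
    forward_frame: str                   # stand-in for a real video frame
    ahead: Optional["Vehicle"] = None    # peer link to the next vehicle in the queue

def forward_view(me: Vehicle, hops: int = 1) -> str:
    """Return the forward frame from `hops` vehicles ahead, or from the
    furthest vehicle reachable if the chain of peers runs out first."""
    current = me
    while hops > 0 and current.ahead is not None:
        current = current.ahead
        hops -= 1
    return current.forward_frame

# A small queue: me, behind a big van, behind the obstruction itself.
obstruction = Vehicle("breakdown truck", "lane closed, debris ahead")
van = Vehicle("big van", "clear view of the breakdown truck", ahead=obstruction)
me = Vehicle("my car", "the back doors of a big van", ahead=van)

print(forward_view(me, hops=1))   # what the van can see
print(forward_view(me, hops=5))   # zoom forward to the front of the jam
```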

Lego AR in the wild

It was fitting that on the last day of the holiday, wandering around Downtown Disney, we ended up in the Lego Innovations store. For some reason I nearly missed seeing the augmented reality kiosk that lets you see what’s inside your Lego box.
I have seen this in videos before, but it was good to have a go with what is an AR magic mirror in a commercial environment.
Certain Lego boxes are available near the kiosk and you just hold one up and away it goes, animating the contents of the box.
There were a few people who did not notice it either; once I started using it a queue formed 🙂 Dads were busy videoing it and kids were busy going “wow, look at that”.
Maybe this bodes well for the next generation of tech they will start using in the theme parks, as we saw lots of 3D (with glasses) of course, and a location-based treasure hunt service using a mobile.
Epcot did have this
An energy sweeping game. It used what looked like a metal detector, or a round shuffleboard stick, to move virtual energy “pucks” that had to be croupier-pushed to various stations. It was by Siemens, and was OK but a bit laggy.

Interesting control mechanisms: Cars, Rockets and Choppers

Props to @infiniteunity3d for tweeting this. It is an example of using a different control mechanism to enhance a virtual experience, in this case slot car racing. It is built in Unity3d, which once again shows the power of being able to just get on and try out the ideas rather than battle the middleware. (I am sure it had its moments though 🙂 )
Slot car racing is a very tactile experience; the friction of the pickup on the rheostat and the precision of trying to get just the right speed for a corner are the charm of slot racing. Scalextric was a big part of my childhood.

AlabSoftware have hooked the controller up to a set of slot racing cars. They did not just stick to a top-down view either; it is a full 3D model (which is what Unity3d is very good at).
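The interesting part is the mapping from the analogue trigger to the car on the virtual track. A very rough sketch of what that loop might look like follows; the constants, the easing and the deslot rule are all invented for illustration, not taken from the AlabSoftware build.

```python
# Rough illustration: driving a virtual slot car from a 0.0-1.0 throttle reading.
TRACK_LENGTH = 20.0    # metres of looping virtual track
MAX_SPEED = 8.0        # metres per second at full throttle
GRIP_LIMIT = 5.0       # above this speed the car deslots on a corner

def step(position, speed, throttle, on_corner, dt=0.1):
    """Advance the car one frame; returns (position, speed, deslotted)."""
    target = throttle * MAX_SPEED
    speed += (target - speed) * 0.3          # ease towards the target, like motor inertia
    if on_corner and speed > GRIP_LIMIT:
        return position, speed, True         # too fast into the corner: off the track
    position = (position + speed * dt) % TRACK_LENGTH
    return position, speed, False

# Simulate squeezing the trigger hard down the straight, then easing off too late.
position, speed = 0.0, 0.0
for frame, throttle in enumerate([1.0] * 20 + [0.4] * 10):
    on_corner = 8.0 < position < 12.0        # one corner section of the loop
    position, speed, deslotted = step(position, speed, throttle, on_corner)
    if deslotted:
        print(f"frame {frame}: came off at the corner doing {speed:.1f} m/s")
        break
```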
Nearly the first thing I built in Second Life was a slot car racing system. I say nearly because other things like the tennis got in the way. My initial thoughts were to build a construction set in a world that was by its very nature a construction set. Being able to provide limited resources (track pieces) and specialized types of car etc had the makings of an economy and toolkit within the economy and toolkit.
There is a definite movement towards more haptic control interfaces, none more so than the blend released on the world at CES 2010 with the Parrot AR.Drone helicopter: a wirelessly controlled flying toy/copter that you control with your iPhone, and that also provides video and data feedback to you with various heads-up displays and tricks. It is a form of AR, but like most AR the term gets used for all sorts of things.

Sticking a camera on a vehicle is of course not a new thing; DCJ put a wireless camera on his rocket at the launch day we had back in 2008, but the Parrot AR.Drone is a very nice evolution and combination of technology and design.

Immersive video live in 360 degrees

A long while back we had 360 degree panoramic webcams that let people direct what they see. Now of course we have Google Street View of the world, but I was impressed with this 360 video of NYC that you can drag around.

According to this article, live versions of Immersive Media’s technology are being used for an on-demand music service.
Of course once you have live 360 video coverage you can still instrument that display and provide virtual augmented reality over the top, which is an interesting twist I think.
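The overlay part is mostly a projection exercise: in an equirectangular 360 frame every direction around the camera already maps to a fixed pixel, so a labelled point of interest can be painted straight onto the live video. A back-of-envelope sketch, where the frame size and the points of interest are made up for the example:

```python
# Back-of-envelope sketch: pinning labels onto a live equirectangular 360 frame
# by converting a direction (bearing, elevation) into pixel coordinates.
FRAME_W, FRAME_H = 4096, 2048   # assumed equirectangular frame size

def to_pixel(bearing_deg, elevation_deg):
    """Map a direction to equirectangular pixel coordinates.
    bearing: -180..180 degrees (0 = straight ahead), elevation: -90..90."""
    x = (bearing_deg / 360.0 + 0.5) * FRAME_W
    y = (0.5 - elevation_deg / 180.0) * FRAME_H
    return int(x) % FRAME_W, int(y)

# Hypothetical points of interest around a street-level 360 camera in NYC.
overlays = {
    "coffee shop": (35.0, -5.0),     # slightly right of ahead, just below the horizon
    "rooftop gig": (-120.0, 20.0),   # behind-left and above the horizon
}
for label, (bearing, elevation) in overlays.items():
    print(label, "-> draw at pixel", to_pixel(bearing, elevation))
```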

Augmenting Augmented Reality

Whilst having a go with the 3DVia iPhone application I decided to see if I could use my Evolver model, upload it to 3DVia, and get myself composited into the real world. The answer is of course you can; it’s just a 3ds model! However, I then wondered what would start to happen if we started to augment augmented reality, even in the light sense. I had done some of this with the ARTag markers way back on eightbar, and also here, putting the markers into the virtual world and then viewing the virtual world through the magic window of AR.

That, of course, was a lot of messing about. This, however, was simple, handheld, and took only a few seconds to augment the augmentation.
AR me
Or augment my view of Second Life.

So these are the sorts of live 3D models we are likely to be able to move in realtime with the 3D elements in Layar very soon.
Likewise, we should be able to use our existing virtual worlds, as much as the physical world, to define points of interest.
I would like to see (and I am working on this a bit) an AR iPhone app that shows proxies of OpenSim objects in their relative positions, letting me tag up a place or a mirror world building with virtual objects. Likewise I want to be able to move the AR objects from my physical view and have their positions update in OpenSim.
All the bits are there, just nearly linked up 🙂
We know it works taking real world input and injecting it into a virtual world (that was Wimbledon et al.), or with marker-based approaches such as the one I did over a year ago with my Jedi mind numbers.

I am thinking that with things like the Second Life media APIs or direct OpenSim elements knowing what is going on in the world, combined with sharing services for models like 3DVia and wizards like Evolver, all jammed into smartphones like the iPhone, we have a great opportunity.
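To make that loop concrete, here is the shape of the sync I am after, sketched with an in-memory dictionary standing in for whatever the OpenSim side would really be. Every object name, position and function here is made up for illustration rather than being any existing API.

```python
# Conceptual sketch only: two-way sync between AR proxies and a virtual world.
# The dict below stands in for the region's object store; a real build would
# talk to OpenSim/Second Life through its actual APIs instead.

opensim_objects = {
    "slot_car_track": (128.0, 128.0, 21.0),
    "tennis_net":     (130.0, 140.0, 21.0),
}

def fetch_proxies():
    """Pull object positions to show as AR proxies (would be a web/API call)."""
    return dict(opensim_objects)

def push_move(object_id, new_position):
    """Write a position change made in the AR view back to the virtual world."""
    opensim_objects[object_id] = new_position

# 1. Draw the proxies in the AR overlay, relative to a chosen origin point.
ar_origin = (128.0, 128.0, 21.0)   # where I am standing, mapped into the sim
for object_id, pos in fetch_proxies().items():
    offset = tuple(round(p - o, 1) for p, o in zip(pos, ar_origin))
    print(f"draw proxy '{object_id}' at offset {offset}")

# 2. Drag the tennis net proxy two metres in the AR view, then push it back.
push_move("tennis_net", (130.0, 142.0, 21.0))
print("tennis_net now at", fetch_proxies()["tennis_net"])
```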

Bionic Eye – Augmented Reality on the iPhone (officially)

@kohdspace just tweeted this piece of news from Mobile Crunch.
After what amounted to some terms-of-service-violating hacks where people built AR applications for the iPhone, it seems that this Bionic Eye is an official, App Store approved version.
I don’t have an iPhone (yet), despite being a registered iPhone developer, but if the APIs are released and working I expect a huge flood of AR applications, and I suspect there are some very clever ideas yet to be implemented.
I want to be able to create my relationship between the data and the real world using a mirror world or control deck in 3D, somewhere like Second Life, and then be able to effectively push new AR overlays and contexts out to people.
I think this will feel a little like the renamed Operation Turtle, now called Hark, but for AR. In fact Hark is a handy name, isn’t it, as it already has the AR letters in it. There I go thinking out loud and inventing things.
I would love to hear if anyone has this Bionic Eye app already, or if anyone is working on a mashup with the virtual worlds we already have rather than just GPS overlays.