As you may have noticed, I think we can do more with Augmented Reality than simple data layers on top of the real world. The principles and technology of layering anything from anywhere make possible some things that can really help us.
As an example of this concept, consider the problem of driving along the motorway behind a massive great lorry or van. You keep your distance, yet you have no real awareness of what's happening just in front.
We already have an augmented reality device in the rear view mirror, giving us situational awareness of what is behind.
So what if, whilst in your car, you could have an augmented reality forward view, or HUD, that showed you what was in front of the obstruction using its view of the world? This is an augmented reality application, but not a traditional one. It is a magic lens, but it is real world to real world.
It is not without problems of course (how big? how much screen?), but it could be based on peer to peer networking between the vehicles. It also has the benefit that, when in a traffic jam, there is no reason not to zoom your view forward, hopping from car to car until you get to the problem in the road.
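The car-to-car hopping idea can be sketched as a simple chain walk over a peer-to-peer network, where each vehicle advertises its forward camera feed and the id of the vehicle it can see directly ahead. All the names and the feed URL scheme here are my own assumptions, just to show the hopping logic:

```python
from dataclasses import dataclass
from typing import Dict, List, Optional

@dataclass
class Vehicle:
    vehicle_id: str
    feed_url: str                   # where peers can fetch this car's forward video
    ahead_id: Optional[str] = None  # vehicle detected directly in front, if any

def forward_view_chain(start_id: str, peers: Dict[str, Vehicle],
                       max_hops: int = 10) -> List[str]:
    """Walk the peer-to-peer chain of forward cameras, car by car,
    returning the feed URLs from here towards the head of the queue."""
    feeds = []
    current = peers.get(start_id)
    hops = 0
    while current and current.ahead_id and hops < max_hops:
        nxt = peers.get(current.ahead_id)
        if nxt is None:
            break  # peer out of range or offline; chain ends here
        feeds.append(nxt.feed_url)
        current = nxt
        hops += 1
    return feeds

# Me, stuck behind a lorry, which in turn sees the car at the head of the jam.
peers = {
    "me":    Vehicle("me", "p2p://me/cam", ahead_id="lorry"),
    "lorry": Vehicle("lorry", "p2p://lorry/cam", ahead_id="car3"),
    "car3":  Vehicle("car3", "p2p://car3/cam"),  # head of the jam
}
print(forward_view_chain("me", peers))  # ['p2p://lorry/cam', 'p2p://car3/cam']
```

The `max_hops` cap is there because in a real jam the chain could be very long, and you only need to hop far enough to see the problem.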
It was fitting that on the last day of the holiday, wandering around Downtown Disney, we ended up in the Lego Innovations store. For some reason I nearly missed seeing the Augmented Reality kiosk that lets you see what's inside your Lego box.
I have seen this in videos before, but it was good to have a go with what is an AR Magic Mirror in a commercial environment.
Certain Lego boxes are available near the kiosk; you just hold one up and away it goes, animating the contents of the box.
There were a few people who did not notice it either, but once I started using it a queue formed 🙂 Dads were busy videoing it and kids were busy going wow, look at that.
Maybe this bodes well for the next-gen tech they will start using in the theme parks, as we saw lots of 3D (with glasses, of course) and a location-based service using a mobile for a treasure hunt.
Epcot did have this
An energy sweeping game. It used what looked like a metal detector or a round shuffleboard stick to move virtual energy “pucks” that had to be croupier-pushed to various stations. It was by Siemens, and was OK but a bit laggy.
Props to @infiniteunity3d for tweeting this. This is an example of using a different control mechanism to enhance a virtual experience. In this case it is slot car racing. It is built in Unity3d, which once again shows the power of just being able to get on and try out ideas rather than battle the middleware. (I am sure it had its moments though 🙂 )
Slot car racing is a very tactile experience, the friction of the pickup on the rheostat and the precision of trying to get just the right speed for a corner is the charm of slot racing. Scalextric was a big part of my childhood.
AlabSoftware have hooked the controller up to a set of slot racing cars. They did not just stick to a top-down view either; it is a full 3D model (which is what Unity3d is very good at indeed).
Nearly the first thing I built in Second Life was a slot car racing system. I say nearly because other things, like the tennis, got in the way. My initial thoughts were to build a construction set in a world that was, by its very nature, a construction set. Being able to provide limited resources (track pieces) and specialized types of car etc. had the makings of an economy and toolkit within the economy and toolkit.
There is a definite movement towards more haptic control interfaces, none more so than the blend created and released on the world at CES 2010 with the Parrot AR.Drone: a wirelessly controlled flying toy/copter that you control with your iPhone, and that also provides video and data feedback to you with various heads-up displays and tricks. It is a form of AR, but like most AR the term gets used for all sorts of things.
Sticking a camera on a vehicle is of course not a new thing; DCJ put a wireless camera on his rocket at the launch day we had back in 2008. But the Parrot AR.Drone is a very nice evolution and combination of technology and design.
This is a great promo done for an AR application featuring an iPhone. What is particularly interesting is the ability to interact with the virtual iPhone on the iPhone and spark up applications. Take a look :). It is by Ogmento.
A long while back we had 360-degree panoramic webcams that let people direct what they see. Now of course we have Google Street View of the world, but I was impressed with this 360 video of NYC that you can drag around.
According to this article, live versions of Immersive Media's technology are being used for an on-demand music service.
Of course once you have live 360 video coverage you can still instrument that display and provide virtual augmented reality over the top, which is an interesting twist I think.
That, of course, was a lot of messing about. This, however, was simple, handheld, and took only a few seconds to augment the augmentation.
Or augment my view of Second Life.
So these are the sort of live 3D models we are likely to be able to move in real time with the 3D elements in Layar very soon.
Likewise we should be able to use our existing virtual worlds, as much as the physical world to define points of interest.
I would like to see (and I am working on this a bit) an AR app that shows me proxies of OpenSim objects in relative positions, projected in an AR iPhone app, letting me tag up a place or a mirror-world building with virtual objects. Likewise I want to be able to move the AR objects from my physical view and update their positions in OpenSim.
All the bits are there, just nearly linked up 🙂
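The core of that two-way link is just a coordinate mapping: a proxy's position in the OpenSim region projects into the AR overlay's local frame, and a drag in the AR view maps back to a region position update. This is a minimal sketch of that round trip; the anchor point and the idea of a simple translation between frames are my assumptions, not a real OpenSim API:

```python
# Assumed anchor: the point in the OpenSim region (metres) that the
# AR view's origin corresponds to.
ORIGIN = (128.0, 128.0, 20.0)

def sim_to_ar(sim_pos, origin=ORIGIN):
    """Project an OpenSim region position into the AR overlay's local frame."""
    return tuple(s - o for s, o in zip(sim_pos, origin))

def ar_to_sim(ar_pos, origin=ORIGIN):
    """Inverse mapping: a drag in the AR view becomes a new region position."""
    return tuple(a + o for a, o in zip(ar_pos, origin))

# Moving a proxy 2m east and 1.5m south in the AR view, then writing the
# resulting region position back to the world.
new_ar_pos = (2.0, -1.5, 0.0)
print(ar_to_sim(new_ar_pos))  # (130.0, 126.5, 20.0)
```

A real version would also need rotation and scale between the frames, but the round trip (sim to AR and back) is what keeps the physical view and the in-world object positions in step.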
We know it works taking real-world input and injecting it into a virtual world (that was Wimbledon et al.), or marker-based approaches such as the one I did over a year ago with my Jedi mind numbers.
I am thinking that between things like the Second Life media APIs or direct OpenSim elements knowing what is going on in the world, combined with sharing services for models like 3DVia and wizards like Evolver, all jammed into smartphones like the iPhone, we have a great opportunity.
@kohdspace just tweeted this piece of news from Mobile Crunch.
After what amounted to some terms-of-service-violating hacks, where people built AR applications for the iPhone, it seems that this Bionic Eye is an official App Store approved version.
I don’t have an iPhone (yet), despite being a registered iPhone developer, but if the APIs are released and working I expect a huge flood of AR applications, and I suspect there are some very clever ideas yet to be implemented.
I want to be able to create my relationship between the data and the real world using a mirror world or control deck in 3d in somewhere like Second Life and then be able to effectively push new AR overlays and contexts out to people.
I think this will feel a little like the renamed Operation Turtle now called Hark but for AR. In fact Hark is a handy name isn’t it as it already has AR letters in it. There I go thinking out loud and inventing things.
I would love to hear if anyone has this bionic eye app already, or if anyone is working on a mashup with the virtual worlds we already have rather than just GPS overlays.
This just turned up on my twitter stream (via @jefftippett); it was reported in Fast Company (well worth a look at that article). It is obviously amazing; the pictures and subtitles say everything really.
Using technology of various sorts to provide input to, and get output from, a holographic display. Another glimpse of the future made real.
It has been intriguing to see and hear various bits and pieces about James Cameron’s Avatar project over the past few years. It has been kept pretty much under wraps. At the virtual worlds conference way back in September 2008 we heard the keynote conversation between Jon Landau and Corey Bridges of Multiverse, which alluded to the role of Multiverse in the project. Cameron and Landau are on the advisory board of Multiverse, though we know little more than that.
However, videos of an Augmented Reality component to the merchandising are appearing on YouTube.
It also looks like markers on the card are used as buttons, causing actions when they are covered from view by a finger pressing them. It's very impressive, I think you will agree.
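That occlusion-as-button trick can be sketched very simply: each frame the tracker reports which marker ids it can see, and a small button marker that just vanished while the main card marker is still tracked counts as a press. The marker names here are made up for illustration:

```python
BOARD = "card"                # the main marker that anchors the scene
BUTTONS = {"play", "stop"}    # small markers printed on the card as buttons

def pressed_buttons(prev_visible: set, now_visible: set) -> set:
    """Buttons whose markers just disappeared while the card is still seen.

    If the card itself is lost, the whole scene is out of view, so no
    occlusion should be treated as a press.
    """
    if BOARD not in now_visible:
        return set()
    return (prev_visible & BUTTONS) - now_visible

# Finger covers the "play" marker between two frames while the card stays tracked:
print(pressed_buttons({"card", "play", "stop"}, {"card", "stop"}))  # {'play'}
```

The check on the board marker is the important part: it distinguishes a deliberate finger press from simply pointing the camera away from the card.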
As I wrote the other day, AR really is what mobile devices are for. Today Brand Republic (which I saw via various tweets) wrote about this iPhone app that uses the 3GS to locate tube stations and lines (which by their very nature are hard to see, as they are underground).
I also recently tweeted: “tweet from future: They used to sit at keyboards and look at screens a few years ago, can you believe that? AR changed that”.
All that, combined with some haptics on the way (possibly) from Apple, and this market is set to explode I think. (Thanks to Koreen for bringing that one into my line of sight.)