It is interesting to watch for patterns and trends and to extrapolate the future. Sometimes things gain focus because they are mentioned a lot on multiple channels, and sometimes those things converge. Right at the moment it seems that Microsoft are doing all the right things in the emerging tech, game technology and general interest spaces.
First has been the move by Microsoft to make Minecraft even easier to use in schools.
They are of course following on from, and officially backing, the sort of approach the community has already taken. However, a major endorsement and drive to use things like Minecraft in education is fantastic news. The site is education.minecraft.net
And just to revert to type again, this is, as the video says, not about goofing around in a video game. This is about learning maths, history and how to communicate online. These are real metaverse tools that enable rich immersive opportunities, on a platform that kids already know and love. Why give them a not very exciting online maths test when they can use multiple skills together in an online environment?
Microsoft have also supported the high-end sciences, announcing a team-up with NASA to use the blended/mixed/augmented reality headset HoloLens to help in the exploration of Mars. This is shared-experience telepresence: bringing Mars to the room, desk or wall that several people are sharing in a physical space. The HoloLens and the next wave of AR are looking very exciting and, very importantly, they take an inclusive approach to the content. Admittedly each person needs a headset, but because they use the physical space, each person can see the same relative view of the data and experience as everyone else. They don't have to, of course :) as there is no reason for the headsets not to show different things in context for each user. Standard VR, however, is an immersed experience without much awareness of others. A very important point, I feel.
Microsoft have also published a paper and video about using depth scanning and live understanding and labelling of real-world objects. Something that the likes of Google Tango will also be approaching.
Slightly better than my 2008 attempt (I had to re-use this picture of the Nokia N95 again)
The real and the virtual are becoming completely linked now, showing how the physical universe is actually just another plane alongside all the other virtual and contextual metaverses.
It is all linked and not about isolation, but understanding and sharing of information, stories and facts.
We are on the cusp of a huge change in how we as humans interact with one another, with the world and with the things we create for one another. A bold statement, but one that stands up, I believe, by following some historical developments in technology and social and work related change.
The change involves all the great terms, Augmented Reality, Virtual Reality, Blended Reality and the metaverse. It is a change that has a major feature, one of untethering, or unshackling us as human beings from a fixed place or a fixed view of the world.
Here I am being untethered from the world in a freefall parachute experience, whilst also being on TV 🙂
All the great revolutions in human endeavour have involved either transporting us via our imagination to another place or concept (books, film, plays etc.) or transporting us physically to another place (the wheel, steam trains, flight). Even modern telecommunications fit into that bracket. The telephone or the video link transport us to a shared virtual place with other people.
Virtual worlds are, as I may have mentioned a few times before, ways to enhance the experience of humans interacting with other humans and with interesting concepts and ideas. That experience, up to now, has been a tethered one. We have spent the last few decades becoming reliant on rectangular screens. Windows on the world, showing us text, images, video and virtual environments. Those screens have in turn evolved. We have had large bulky cathode ray tubes, LED, plasma, OLED and various flat wall projectors. Yet the screens have always remained a frame, a fixed size, tethering us to a tunnel-vision view. The screen is a funnel through which elements are directed at us. We started the untethering process with wi-fi and mobile communications. Laptops, tablets and smartphones gave us the ability to take that funnel, a focused view of the world, with us.
More recent developments have led to the VR headsets. An attempt to provide an experience that completely immerses us by providing a single screen for each eye. It is a specific medium for many yet-to-be-invented experiences. It does, though, tether us further. It removes the world. That is not to say the Oculus Rift, Morpheus and HTC Vive are not important steps, but they are half the story of untethering the human race. Forget the bulk and the weight; we are good at making things smaller and lighter, as we have seen with the mobile telephone. The pure injection of something into our eyes and ears via a blinkering system feels, and is, more tethering. It is good at the second affordance, of transporting us to places with others, and it is where virtual worlds and the VR headsets naturally and obviously co-exist.
The real revolution comes from full blended reality, and realities. That plural is important. We have had magic lens and magic mirror Augmented Reality for a while. Marker-based and markerless ways to use one of the screens that we carry around, or have fixed in our living rooms, to show us digital representations of things placed in our environment. They are always fun. However, they are almost always just another screen within our screens. Being able to see, feel and hear things in our physical environment, wherever we are in the world, and have them form part of that environment, truly untethers us.
Augmented Reality is not new of course. We, as in the tech community, have been tinkering with it for years. Even this mini example on my old Nokia N95 back in 2008 starts to hint at the direction of travel.
The devices used to do this are obviously starting at a basic level, though we have the AR headsets, Microsoft HoloLens and the Google-backed Magic Leap, to start pushing the boundaries. They will not be the final result of this massive change. With an internet of things world, with the physical world instrumented and producing data, and with large cloud servers offering compute power at the end of a wireless connection to analyse that data and to visualize and interact with things in our physical environment, we have a lot to discover and to explore.
I mentioned realities, not just reality. There is no reason to only have augmentation of the physical world. After all, if you are immersed in a game environment or a virtual world you may actually choose, because you can, to shut out the world, to draw the digital curtains and explore. However, just as when you are engaged in any activity anywhere, things come to you in context. You need to be able to interact with other environments from whichever one you happen to be in.
Take an example, using Microsoft HoloLens, Minecraft and Skype. In today's world you would have Minecraft on your laptop/console/phone and be digging around, building, sharing the space with others. A Skype call comes in from someone you want to talk to. You window away from Minecraft and focus on Skype. It is all very tethered. In a blended reality, as HoloLens has shown, you can have Minecraft on the rug in front of you and Skype hanging on the wall next to the real clock. Things and data placed in the physical environment in a way that works for you and for them. However, you may want to be more totally immersed in Minecraft and go full VR. If something can make a small hole in the real world for the experience, then it can surely make an all-encompassing hole, thus providing you with only Minecraft. Yet, if it can place Skype on your real wall, then it can also place it on your virtual walls and bring that along with you.
This is very much a combination that is going to happen. It is not binary to be either in VR or not, in AR or not. It is either, both or neither.
It is noticeable that Facebook, who bought heavily into Oculus Rift, purchased Surreal Vision last month, a company who specialize in using instrumentation and scanning kit to make sense of the physical world and place digital data in that world. Up until now Oculus Rift, which has really led the VR charge since its Kickstarter (yes, I backed that one!), has been focussed on the blinkered version of VR. This purchase shows the intent to go for a blended approach. Obviously this is needed, as otherwise Magic Leap and HoloLens will quickly eat into the Rift's place in the world.
So three of the world's largest companies, Facebook, Google and Microsoft, have significant plays in blended reality and the "face race", as it is sometimes called, to get headsets on us. Sony have Project Morpheus, which is generally just VR, yet Sony have had AR applications for many years with PS Move.
Here is Predlet 2.0 enjoying the AR EyePet experience back in 2010 (yes, five years ago!)
So here it is. We are getting increasingly accurate ways to map the world into data, we have the world producing IoT data streams, we have ever more ubiquitous networks and we have devices that know where they are and what direction they are facing. We have high-powered backend servers that can make sense of our speech and of context. On top of that we have free-roaming devices to feed and overlay information to us; yes, a little bit bulky and clunky, but that will change. We are almost completely untethered. Wherever we are, we can experience whatever we need or want, and in the future we can blend that with more than one experience. We can introduce others to our point of view or keep our point of view private. We can fully immerse or just augment, or augment our full immersion. We can also make the virtual real with 3D printing!
That is truly exciting and amazing isn’t it?
It makes this sort of thing in Second Life seem an age away, but it is the underpinning for me and many others. Even in this old picture of the initial Wimbledon build in 2006 there is a cube hovering at the back. It is one of Jessica Qin’s virtual world virtual reality cubes.
That cube, and other things like it, provided an inspiration for this sort of multiple-level augmentation of reality. It was constrained by the tethered screen but was, and still is, remarkably influential.
Here my Second Life avatar, and so my view of a virtual world, is inside a 3D effect that provides a curved 3D view, not just a flat picture of something else. It is as if the avatar is wearing a virtual Oculus Rift, if that helps get the idea across.
This is from 2008, by way of a quick example of the fact that even then it could work.
It is important, when getting interested and excited about new things, to look at their lineage. Something I often did in my series of articles over the years in Flush Magazine.
So with the current rush of Virtual Reality gaming and experiences, and the slew of new kit, we have some very interesting near off-the-shelf kit such as this.
*warning: it contains violence (see where I am on that here)
However, back in the 90's we had VR kit like Virtuality (they must be kicking themselves for being too early for the masses, though that's a curse us early adopters and evangelists have to live with 🙂)
The headsets were quite heavy and the container you stood or sat in was not a treadmill but part of the sensing rig.
The graphics may look old fashioned and clunky but they were good experiences. When I was at poly/uni in Leicester getting ready to do a year out, this was the company I wanted to go and work for if I had the chance to. They sent me to IBM instead, though as you can see from the Virtuality company details they had very close ties with IBM. So it was close 🙂 It is also funny how things work out.
Still, it is very cool having all these headsets sat around me, and some that just work on my portable communication device too 🙂
A few days ago I realised it has been over 9 years since I first publicly blogged about how important I thought the principles of the metaverse and virtual worlds were going to be for both social and business uses. This post, pictured below for completeness, was a tipping point for some radical changes in many of our lives as part of Eightbar.
I had been working towards that sort of point of understanding since some very early work with virtual environments, and how people get to interact with one another in them, around 1999/2000 with SmartVR. We were trying to keep the social bond in our internal web and multimedia design group together when we were cast asunder to different locations by business pressures (or bad decisions, who knows!). I knew we had to have a sense of one another's existence aside from text in emails and instant messages. So we tried to build a merged version of both offices as if they were in one place. The aim was then to instrument those with presence and the ability to walk over to someone's desk and talk. Mirror world, blended reality and even internet of things. Yes, I know, a bit before its time! Cue music, "Story of my life" by 1D 🙂
We definitely had a technology and expectation bubble around 2006-2009. However that, as with all emerging technology, is all part of the evolution. The Gartner hype curve et al. What surprises me the most is that people still think, when a bubble like that bursts, that it's all over. That somehow everything that was learned in that time was pointless. "Are virtual worlds still a thing?" etc. I feel for the even earlier pioneers like Bruce Damer, who patiently put up with our ramblings as we all rushed to discover and feel for ourselves those things he already knew.
Increasingly I am talking to new startups and seeing new activity in the virtual space. The same use cases, the same sparks of creativity that we had in the previous wave(s), the same infectious passion to do something interesting and worthwhile. Sometimes this is somehow differentiated from the last wave of virtual worlds under the heading of virtual reality. The current wave is focussed on the devices, the removal of the keyboard and of a fixed screen. The Oculus Rift, HoloLens etc. However, that's a layer of new things to learn and experience on top of what we have already been through. After all, it's a virtual world you end up looking at, or blending with, in a VR headset!
I spend so much time looking forward and extrapolating concepts and ideas it is now very scary to look back and consider the experience I have gathered. The war stories and success stories, the concepts and ideas that I have tried. The emotional impact of virtual worlds. The evolution of the technology and of people’s expectation of that technology. The sheer number of people that have moved around in a 3d environment from an early age with things like Minecraft, who are now about to enter higher education and the workforce.
So I am left in a slightly bemused state as to what to do with this knowledge. With this all going so much more mainstream again, I am no longer working in a niche. Do I ply my trade of independent consulting, chipping away in odd places and helping and mentoring some of the new entrants in the market, or do I try and find a bigger place to spread the word?
At the same time though, knowing lots of things makes you realise how much you don't know. Impostor syndrome kicks in. Surely everyone must know all this stuff by now? It's obvious, and it stands up to logical reasoning, to try and connect with other people in as rich a way as possible. The network is there, the tech is there, the lineage is there. Though clearly not everyone gets it yet. I often wonder if the biggest naysayers I had to deal with on my journey so far have figured it out yet? It will probably turn out they are getting rich quick right now whilst I sit and ponder such things.
On the other side of the virtual coin, I know from my martial arts that constant practice and investigation leads to a black belt. In the case of Choi Kwang Do that's just over 3 years' worth. So how many Dans' worth of tech-equivalent experience does that put me at? 25 years in the industry professionally, but 30+ as a techie.
I still want to do the right thing, to help others level themselves up. I don't think I am craving fame and fortune; the ability to share and build is what drives me. If fame and fortune as a spokesperson or evangelist for some amazing idea or TV show reaches that end, then that's fine by me.
I am at a crossroads, my VR headset letting me look in all directions. I see half-built roads in many directions. What now? A well funded company that I can help build a great road with, or forging off down one of the other paths, seeing what happens on the way?
Of course that makes it seem like there is a clear choice on the plate. I suspect most of the well funded companies and corporations don't think they need any help, which is rather where I came in on this!
Needless to say I am always open to conversations, offers, partnerships, patronage, retainers and technology evangelist roles. There must be a slew of investors out there wondering what to put their money into, who need some sagely advice ;). Or that book… there is always that book… (600 images x 1,000 words each is 600,000 words just on these Second Life experiences. That's just the ones online; the offline ones double that! Not to mention the other places, games and self-built experiences!)
I have always liked a nice greenfield to start building on. Equally, that did not build itself; it was a massive shared team experience. No one has all the answers. Some of us are good at helping people find them though.
There is always something going on in science and emerging technology. However, some weeks just bring a bumper bundle of interesting things all at once. Here in the UK the biggest event had to be the near-total eclipse of the Sun. We had some great coverage on the TV, with Stargazing Live sending a plane up over the Faroe Islands to capture the total eclipse. I was all armed and ready with a homemade pinhole camera.
This turned out great, but unfortunately we were quite overcast here, so it was of little use as a camera. I also spent the eclipse at Predlet 2.0's celebration assembly. They had the eclipse on the big screen for all the primary school kids to see. Whilst we had the lights off in the hall it did not get totally dark, but it did get a bit chilly. It was great that the school keyed into this major event that demonstrates the motion of the planets. So, rather like the last one in 1999, I can certainly say I will remember where I was and what we were doing. (A conversation I had with @asanyfuleno on Twitter and Facebook.)
This brings me on to our technological change and the trajectory we are on. In 1999 I was in IBM Hursley with my fellow Interactive Media Centre crew. A mix of designers, producers and techies, and no suits. It was still the early days of the web and we were building all sorts of things for all sorts of clients; during that eclipse in particular it was some more work for Vauxhall cars. We downed tools briefly to look out across Hursley Park and see the dusk settle in and flocks of birds head to roost, thinking it was night.
It does not seem that long ago but… it is 16 years. When we were building those quite advanced websites, Amazon was just starting, Flickr was 6 years away, Twitter about 7 years away, Facebook a mere 5 (but with a long lead time), and we were only on Grand Theft Auto II, still a top-down, Pac-Man-like game. We were connected to lots of our colleagues on instant messaging, but general communications were phone, SMS and of course email. So we were not tweeting and sharing pictures, or, as people do now, streaming live feeds on Meerkat. Many people were not internet banking; trust in communications and computers was not high. We were pre dot-com boom/bust too. Not to mention that no one really had much internet access out and about, or at home. Certainly no wi-fi routers! We were all enthralled by the still excellent Matrix movie, the slide-down communicator-style Nokia phone in it being one of the iconic images of the decade.
NB. As I posted this I saw this wonderful Lego remake of the lobby scene, so I just had to add it to this post 🙂
It was a wild time of innovation and one many of us remember fondly, I think. People tended to leave us alone, as we brought in money doing things no managers or career vultures knew to jump on. So that eclipse reminds me of a time I set out on a path of trying to be in that zone all the time. Back then I was getting my first samples from a company that made 3D printers, as I was amazed at the principle, and I was pondering what we could do with designers that knew 3D and this emerging tech. We were also busy playing Quake and Unreal in shared virtual worlds across the LAN in our downtime, so I was already forming my thoughts on our connection to one another through these environments. Experiences that I still share today in a newer hi-tech world where patterns are repeating themselves, but better and faster.
That leads me to another movie reference, in the spirit of staying in this zone: this footage of a new type of Terminator T-1000 style 3D manufacturing. 3D printers may not be mainstream as such, but many more people now get the concept of additive manufacture: laying down layer after layer of material such as plastic, just as we made coil pots out of snakes of rolled clay when we were at school. A newer form of 3D printing went a little viral on the interwebs this week, from carbon3d.com. This exciting development pulls an object out of a resin. It is really the same layering principle, but done in a much more sophisticated way. CLIP (Continuous Liquid Interface Production) balances exposing the resin to oxygen or to UV light. Oxygen keeps it as a liquid (hence left behind) and targeted UV light causes the resin to become solid, via polymerization. Similar liquid-based processes use lasers fired into a resin; this one, though, slowly draws the object out of the resin, giving it a slightly more ethereal or sci-fi look. It is also very quick in comparison to other methods. Whilst the video is sped up, it is still a matter of minutes rather than hours to create objects.
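As a toy illustration of that oxygen/UV balance (all the numbers here are made up for the sketch, not Carbon3D's figures): a bit of resin stays liquid if it sits in the oxygen-rich "dead zone" near the window, and solidifies only where the targeted UV dose is high enough.

```python
# Toy model of the CLIP idea: resin cures where UV hits it, except in the
# oxygen-inhibited "dead zone" near the window, which stays liquid so the
# part can be drawn out continuously. Thresholds are illustrative only.

DEAD_ZONE_MM = 0.03   # hypothetical oxygen-inhibited layer thickness
CURE_THRESHOLD = 0.5  # hypothetical UV dose needed to polymerize

def cures(height_above_window_mm: float, uv_dose: float) -> bool:
    """A bit of resin solidifies only if it is above the oxygen dead zone
    and has received enough targeted UV light."""
    if height_above_window_mm <= DEAD_ZONE_MM:
        return False  # oxygen keeps it liquid, so it is "left behind"
    return uv_dose >= CURE_THRESHOLD

print(cures(0.01, 0.9))  # False: inside the dead zone, stays liquid
print(cures(0.10, 0.9))  # True: cured by UV, part of the solid object
print(cures(0.10, 0.1))  # False: no UV aimed here, stays liquid
```

The continuous draw works because that dead zone never cures, so the solid object is always being pulled away from a thin liquid interface rather than peeled off the window layer by layer.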
Another video doing the rounds, showing some interesting future developments, is one from the Google-funded Magic Leap. This is a blended reality/augmented reality company. We already have Microsoft moving into the space with HoloLens. Much of Magic Leap's announcements have not been as clearly defined as one might hope. There is some magic coming and it is a leap. Microsoft of course had a great pre-release of HoloLens: some impressive video, but some equally impressive testimonials and articles from journalists and bloggers who got to experience the alpha kit. The video appeared to be a mock-up, but a fairly believable one.
Magic Leap were set to do a TED talk but apparently pulled out at the last minute and this video appeared instead.
It got a lot of people excited, which is the point, but it seems even more of a mock-up video than any of the others. It is very well done, as the Lord of the Rings FX company Weta Workshop have a joint credit. The technology is clearly coming. I don't think we are there yet in understanding and getting this sort of precise registration and overlay. We will be, and one day it may look like this video.

Of course it's not just the tech but the design that has to keep up. If you are designing a game that has aliens coming out of the ceiling, it will have a lot less impact if you try and play outside, or in an atrium with a massive vaulted ceiling. The game has to understand not just where you are and what the physical space is like, but how to use that space. Think about a blended reality board game, or an actual board game for that matter. The physical objects to play Risk, Monopoly etc. require a large flat surface, usually a table. You clear the table of obstructions, set up and play. Now a projected board game could be done on any surface: Monopoly on the wall. It could even remove or project over things hung on the wall, obscure lights etc. It is relying on a degree of focus in one place. A fast-moving shooting game where you walk around or look around will be reading the environment, but the game design has to adjust what it throws at you so that it continues to make sense. We already have AR games looking for ghosts and creatures that just float around. They are interesting but not engaging enough.

Full VR doesn't have this problem, as it replaces the entire world with a new view. Even then there are lots of unanswered questions of design: how stories are told, cut scenes, attracting attention, user interfaces, reducing motion sickness etc. Blending with a physical world, where that world could be anywhere or anything, is going to take a lot more early adopter suffering and a number of false starts and dead ends.
It can of course combine with rapid 3D printing, creating new things in the real world that fit with the game or AR/BR experience. Yes, that's more complexity, more things to try and figure out. It is why it is such a rich and vibrant subject.
Just bringing it back a little to another development this week: the latest in the Battlefield gaming franchise, Battlefield Hardline, went live. This, in case you don't do games, is a 3D first-person shooter. Previous games have been military; this one is cops and robbers in a modern Miami Vice TV style. One of the features of Battlefield is the massive online combat. It features large spaces and it makes you feel like a small speck on the map. Other shooters, like Call of Duty, are more close in. The large expanse means Battlefield can focus on things like vehicles: flying helicopters and driving cars. Not just you, though; you can be a pilot and deliver your colleagues to the drop zone whilst your gunner gives cover.
This new game has a great online multiplayer mode called Hotwire that taps into vehicles really well. Usually game modes are capture the flag or holding a specific fixed point to win the game. In Hotwire you grab a car/lorry etc. and try and keep that safe. It means that you have to do some mad game driving, weaving and dodging. It also means that your compatriots get to hang out of the windows of the car trying to shoot back at the bad guys. It is very funny and entertaining.
What also struck me was the single-player game, called "Episodes". This deliberately sticks with a TV cop show format as you play through the levels. After a level has finished, the how-you-did page looks like Netflix, with a "next episode starts in 20 seconds" down in the bottom right. If you quit a level before heading to the main menu, it does a "next time in Battlefield Hardline" mini montage of the next episode. As the first cut scene played I got a Miami Vice vibe, which the main character then referenced right back. It was great timing, and an in-joke, but one for those of us of a certain age for whom Miami Vice was the show to watch. Fantastic stuff.
I really like its style. It also has a logo builder on the website, so in keeping with what I always do, I built a version of the Feeding Edge logo in a Hardline style.
I may not be great at the game, as I bounce around looking for new experiences in games, but I do like a good bit of customisation to explore.
If you are getting excited and interested, or just puzzling over what is going on with the Microsoft announcement about HoloLens, and can't wait the months/years before it comes to market, then there are some other options, very real, very now.
Just before Christmas I was very kindly sent a prototype of a new headset unit that uses an existing smartphone as its screen. It is called MergeVR. The first one like this we saw was the almost satirical take on the Oculus Rift that Google took with Google Cardboard: a fold-up box that let you strap your Android to your face.
The MergeVR is made of very soft, comfortable spongy material. Inside are two spherical lenses that can be slid laterally in and out to adjust to the divergence of your eyes and get a comfortable feel.
Rather like the AntVR I wrote about last time, this uses the principle of one screen split into two views. The MergeVR uses your smartphone as the screen, and it slides comfortably into the spongy material at the front.
Using an existing device has its obvious advantages. The smartphones already have direction sensors in them, and screens designed to be looked at close up.
MergeVR is not just about the 3D experience of Virtual Reality (one where the entire view is computer generated). It is, as its name Merge suggests, about augmented reality too. In this case it is an augmented reality taking your direct view of the world and adding data and visuals to it. This is known as a magic lens: you look through the magic lens and see things you would not normally be able to see. As opposed to a magic mirror, where you look at a fixed TV screen to see the effects of a camera feed merged with the real world.
The MergeVR has a slot for the iPhone (in my case) camera to see through. This makes it very different from some of the other Phone On Face (POF, a made-up acronym) devices. The extra free device I got with the AntVR, the TAW, is one of these non-pass-through POFs. It is a holder and lenses, with a folding mechanism that adjusts to hold the phone in place. With no pass-through it is just for watching 3D content.
Okay, so the MergeVR is able to let you use the camera, see the world, and then watch the screen close up without holding anything. The lenses direct each eye at its own half of the screen. One of the demo applications is instantly effective and has a wow factor. Using a marker-based approach, a dinosaur is rendered in 3D on the marker. Marker-based AR is not new, and neither is iPhone AR, but the stereoscopic hands-free approach, where the rest of the world is effectively blinkered for you, adds an extra level of confusion for the brain. Normally, if you hold a phone up to a picture marker, the code will spot the marker, work out its orientation and relative position in the view, then render the 3D model on top. So if you, or the marker, move, the model moves too. When holding the iPhone up you can of course still see around it, rather like holding up a magnifying glass (magic lens, remember). When you POF, though, your only view of the actual world is the camera view of the phone. So when you see something added and you move your body around, it is there in your view. It is only the slight lag, and the fact the screen is clearly not the same resolution or lighting as the real world, that stops you believing it totally.
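That marker-tracking loop can be sketched in a few lines. This is a hand-rolled 2D illustration of the principle, not the demo's actual code; in a real app the marker's position, scale and rotation would come from a computer-vision library each camera frame.

```python
import math

# Minimal sketch of the marker-based AR loop: given the marker's detected
# position, apparent size and rotation in the camera frame, transform the
# model's points so the model appears anchored to the marker. The
# "detection" values below are hand-made stand-ins.

def place_model(points, marker_x, marker_y, marker_scale, marker_angle_rad):
    """Rotate, scale and translate model points into screen space so the
    model tracks the marker as you (or the marker) move."""
    placed = []
    c, s = math.cos(marker_angle_rad), math.sin(marker_angle_rad)
    for x, y in points:
        rx, ry = x * c - y * s, x * s + y * c  # rotate with the marker
        placed.append((marker_x + rx * marker_scale,
                       marker_y + ry * marker_scale))  # scale and anchor
    return placed

# A two-point "dinosaur" anchored to a marker seen at (320, 240), unrotated:
print(place_model([(0, 0), (1, 0)], 320, 240, 50, 0.0))
# [(320.0, 240.0), (370.0, 240.0)]
```

Re-running this every frame with fresh marker values is what makes the model appear glued to the marker as either of you moves.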
The recently previewed Microsoft HoloLens and the yet-to-be-seen Google-funded Magic Leap are a next step, removing the screen. They let you see the real world, albeit through some panes of glass, and then use projection tricks near to the eye, probably very similar to Pepper's ghost, to adjust what you see and how it is shaded, coloured etc. based on a deep sensing of the room and environment. It is markerless, room-aware blended reality, using the physical and the digital.
Back to the MergeVR. It also comes with a Bluetooth controller for the phone, a small handheld device to let you talk to the phone. Obviously, when in POF mode you can't press any buttons on the touchscreen 🙂 Many AR apps and examples like DinoAR simply use your head movements and the sensors in the phone to determine what is going on. Other things, though, will need some form of user input. As the phone can see, it can see hands, but without a Leap Motion controller or a Kinect to sense the body, some simpler mechanism can be employed.
However, this is where MergeVR gets much more exciting and useful for any of us techies and metaverse people. The labs are not just thinking about the POF container but the content too. A Unity3D package is being worked on. This provides camera prefabs (rather like the Oculus Rift one) that split the Unity3D view into a stereo camera at runtime, with the right shape, size, perspective etc. for the MergeVR view. It provides extra access to the Bluetooth controller inputs too.
This means you can quickly build MergeVR 3D environments and deploy to the iPhone (or Droid). Combine this with some of the AR toolkits and you can make lots of very interesting applications, or simply add 3D modes to existing ones you have. With the new Unity3D 4.6 user interface system it will be even easier to create heads-up displays.
So within about 2 minutes of starting Unity I had a 3D view up on the iPhone in the MergeVR using Unity Remote. The only problem I had was using the USB cable for quick Unity Remote debugging, as the left hand access hole was a little too high. There is a side access on the right, but the camera needs to be facing that way. Of course, being nice soft material, I can just make my own hole in it for now. It is a prototype after all.
It’s very impressive, very accessible and very now (which is important to us early adopters).
Let’s get blending!
(Note the phone is not in the headset as I needed to take the selfie 🙂)
I backed a project called AntVR to produce another VR headset and it recently arrived. Whilst all these things may look the same, this takes a slightly different approach to the now famous Oculus Rift.
The box arrived via UPS, unfortunately there seems to have been a mistake on the shipping. Whilst this was included in the Kickstarter price I had to pay the delivery man another £35. That’s early adoption for you!
The main headset unit is about the same size and weight as an Oculus DK2. The main cables, rather than passing over your back, drop down either side of your cheeks.
The inside view is not two round lenses as with the Rift but two much flatter rectangular lenses. It fits over my regular glasses (though my varifocals are not ideal for any of these headsets).
The reason for the difference in lens shape is quite fundamental to the entire design. The AntVR uses one screen inside the headset, and the lenses direct the eye to focus on that screen. The Rift appears to have two screens, one for each eye, with a spherical image projected onto each. The AntVR is really a regular monitor looked at up close.
Whilst the latter is a lesser experience, it does have a major advantage. The AntVR has a mode button on it to let it just be a regular desktop screen. This makes for much less messing around (particularly in Windows) with which monitor is which. Often with the Rift you find yourself trying to half look at the regular desktop when working in extended mode, or swapping the primary monitor when an application doesn’t use direct mode.
So once in the AntVR you are looking at a big screen. The mode button on the unit lets you switch to side by side 3D. So anything, like YouTube, that has SBS video you can watch straight away: look at a normal view, navigate in the browser etc., switch to full screen, then press the button on the unit and you have 3D. I did find that on Windows 8.1 Internet Explorer seemed to refuse to show SBS video on YouTube; the scrub bar showed side by side images but the main window just the one. On Chrome it was fine. Initially I thought the headset was causing it, but I think it was just the browser being the browser it is!
The unit has two sliders to move the lenses outward and inwards laterally to adjust for your own eye comfort.
The headset acts as a mouse pointer with the accelerometers in it moving the mouse around, which is not always very helpful to start with.
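That pointer behaviour is essentially head orientation mapped onto screen coordinates. A guessed-at sketch of the mapping in Python (the gain value and function are my own illustration, not how the AntVR actually implements it):

```python
def head_to_pointer(yaw_deg, pitch_deg, screen_w=1920, screen_h=1080,
                    gain=20.0):
    """Map head yaw/pitch (degrees away from straight ahead) to a
    pointer position centred on the screen, clamped to its edges.
    gain is pixels of pointer travel per degree of head turn."""
    px = screen_w / 2 + yaw_deg * gain
    py = screen_h / 2 - pitch_deg * gain  # looking up moves the pointer up
    def clamp(v, hi):
        return max(0, min(hi, v))
    return clamp(px, screen_w - 1), clamp(py, screen_h - 1)

# Looking straight ahead parks the pointer dead centre.
# head_to_pointer(0, 0) → (960.0, 540.0)
```

The clamping is why it is “not always very helpful to start with”: turn your head too far and the pointer pins itself to a screen edge until you look back.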
The kit then diverges from the Rift in what it provides.
Firstly, it will plug into any HDMI source (needing USB to power it). Because it is not always in SBS 3D it doesn’t need the spherical processing the Rift has to do. Obviously that flexibility is a trade off against the Rift’s field of view and its comfort on the eyes.
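That “spherical processing” is the barrel distortion pre-warp the Rift applies so the image looks correct through its curved lenses; a flatter screen-and-lens design can largely skip it. The classic radial warp looks something like this (the k coefficients here are illustrative, not Oculus’s real values):

```python
def barrel_warp(u, v, k1=0.22, k2=0.24):
    """Pre-distort normalised screen coordinates (lens centre at 0,0)
    so the headset lens's pincushion distortion cancels out: points
    are pushed outward by a polynomial in r^2, the squared distance
    from the lens centre. k1/k2 are illustrative coefficients."""
    r2 = u * u + v * v
    scale = 1.0 + k1 * r2 + k2 * r2 * r2
    return u * scale, v * scale

# The centre of the view is untouched; points nearer the edge move out.
# barrel_warp(0.0, 0.0) → (0.0, 0.0); barrel_warp(0.5, 0.0) → (~0.535, 0.0)
```

Applying this to every pixel of every frame is the extra GPU work a flat-screen headset like the AntVR avoids, at the cost of the Rift’s wider field of view.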
The AntVR also comes with a peripheral. It is rather like some Klingon device: a few parts clip together and you have what looks like a gun controller.
This controller can also morph, with 2 handles plugged together, into a joypad. Both parts are charged by USB so are both active units.
In addition to the “gun” there is also a cleaning device, a small rubber bladder and nozzle for blowing dirt out of the lenses. They refer to this as the “hand grenade”.
All this came through the post from China. I suspect the delay getting here may have been some puzzled customs people.
There are demo games ready to download. These are designed to be 3D SBS; they are both horror/zombie ones. I am not sure if I configured things correctly, but the thumbstick on the back of the gun did allow me to move my character in an FPS environment. The gun trigger worked, but it was a little odd using my head to turn and look at the bad guys whilst holding a gun where it didn’t matter where I pointed it. I think I needed to plug a few more things in.
Another interesting feature is a little sliding trap door underneath the headset that lets you peer down and see your hands. Great for typing!
So far I have only tried the unit on my Windows laptop. For gaming, Elite Dangerous for example, the Rift DK2 with its head position sensor and spherical eye view is much more comfortable and immersive. However, getting the Rift running can be a pain with the many different ways Windows likes to mess with graphics cards and extended monitors.
The AntVR seemed to just work out of the box. The gun controller might be a bit over the top, but the ability to work with anything and look at any content without too much hassle is interesting.
Next up I need to write about the two smartphone-as-a-VR-screen devices I have here, one of which was an extra thrown into the box with the AntVR.
We are creeping towards devices that anyone can use without too much faffing about but we have not got there yet. Then of course we need the content. The environments, the user interfaces and the new ways of engaging.
It’s starting to get towards winter and the clocks have fallen back, so this month’s Flush Magazine, amongst other things, is all about snow, mountains, skiing and boarding. That gave me something snow related to consider for my 14th article for the magazine. (That now sounds like a lot!)
I thought I should cover a bit of history, as usual. Some tech reminiscing. This time it was Hungry Horace goes skiing. Anyone else remember that? 🙂
These sorts of flashbacks to things that felt special or unusual then act as a seed to zoom forward. This time I covered how snow has felt on screen in games and films through the years, from the simplistic to the “material point method for snow simulation” CGI research papers created by Disney in making films like Frozen.
I didn’t want to stick to purely virtual snow though, so I also looked at some other advances and where they could go in the near future to enhance our skiing experience. I did give a mention to SNOW the game, as that looks mighty impressive and has some real potential IMHO.
Thanks to everyone writing for Flush Magazine, it’s great to have so many good articles to nestle in between on a cold winter’s night. Huge thanks to @tweetthefashion for such awesome production and editing, and for still wanting me to write these journeys in history based futurism.
The full issue can be found below
Feeding Edge’s Oculus Rift DK2 (Development Kit version 2) arrived last week. I had an original Oculus Rift (pre-Facebook buyout) from the Kickstarter campaign. It is amazing looking back and seeing that I wrote about that in August 2013! How time flies!
The new DK2 is a big leap in technology, though it comes in a much less impressive cardboard box, rather than the foam filled attache case of the original one.
The new unit is smaller and lighter but boasts a huge increase in picture quality. There have been quite a few ergonomic tweaks too.
The wires to the unit now attach at the centre top, and the cable is fed over the top head strap so it doesn’t get in the way as much as the side dangling cable of the DK1.
The old control box from the DK1 is gone; this used to take all the connections and had a power switch etc. It was often the first thing to fall on the floor as the cables got tangled up.
The eye pieces are noticeably bigger too, presumably to get all that extra resolution and field of view in.
Aside from the resolution of the screens, the biggest change is the external tracking of the Rift’s position. DK1 knew you were turning and tipping, but if you stepped or moved side to side it had no idea. DK2 comes with a sensor that needs to be mounted and pointed at the Rift. It can, with software that bothers to use it, tell if you are near or far, right or left, up or down relative to the sensor. It is not a Kinect, but it does give 3D space positioning.
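At its simplest, positional tracking like this boils down to subtracting a calibrated “centre” pose from the currently sensed position and feeding that offset into the in-game camera on top of the rotational tracking. A minimal sketch in Python (the coordinate values are invented):

```python
def tracked_offset(sensed_pos, calibrated_pos):
    """Head offset from the calibrated 'centre' pose, both positions
    measured in the tracking camera's coordinate space (metres).
    Software using positional tracking applies this offset to the
    in-game camera on top of the rotational tracking."""
    return tuple(s - c for s, c in zip(sensed_pos, calibrated_pos))

# Leaning 10cm to the left and 5cm towards the sensor from centre:
dx, dy, dz = tracked_offset((-0.10, 0.0, 0.55), (0.0, 0.0, 0.60))
# → roughly (-0.10, 0.0, -0.05)
```

It is that small offset, updated every frame, that lets you lean around cockpit struts in a way the DK1 simply could not track.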
My neighbours may glance up and see this tripod and think it is facing out of the window, when in fact it is pointing at me!
Getting the Rift running on the Mac was a doddle as usual. With no control box the connections are simply an HDMI for the display and two USB connections, one for the camera device and one for the headset. The power lead runs into a tiny splitter block and a sync lead runs from that to the camera sensor.
It was more tricky getting it to run on Windows 8.1 though. My HP laptop has two gfx cards, and trying to persuade it to extend the display and put primary rendered graphics on the Rift is a bit of a pain. I ended up with the primary display turning itself 90 degrees and the secondary Rift display flipped 180 degrees in landscape. The demo/tuning application runs in direct mode so you don’t need any of this, but lots of the other things rely on extended desktop. You can of course make the Rift your primary display, but it is not then delivered as one image but a bit in the left eye and a bit in the right, so it makes launching things difficult.
However, after a bit of “well that’s just Windows isn’t it!” I sparked up the Elite Dangerous beta, which had been upgraded since the last time. Elite has a display-on-secondary option, and though this was not DK2 direct mode it was possible to make it work.
I was instantly blown away by the improvements to the visual resolution of DK2. My laptop managed to cope with the environment too; it may not be the best gaming rig in the world but it certainly worked. I launched out into space and got chills running down my spine with the sense of being there, as the shadow of the cockpit struts moved across my field of view when I turned away from the star I was docked near. It is really, really good. I mapped a button on the stick to re-centre the Rift view though, as there was a slight drift. However, with two beta products on Windows 8.1, that was not a huge problem.
I flew around for a while taking in the scenery, then got into a slight dogfight in space. Being able to pull tight turns but look up and backwards to spot the bad guy was really effective. However, I did suffer a little from not being able to see my controls. Using a joystick with loads of buttons is great, but I had not realised how much I relied on seeing button 7! So it did not take my adversary very long to break my ship down. In the previous beta of Elite the canopy would crack, then the ship would blow and you ejected. In the new version, before the cracking, I got sparks and smoke filling the cockpit. This was an incredible effect in the Rift. The combination of the sense of being there in the vastness of space with the confinement of sitting in a smoking capsule is something I will remember as a key moment in my gaming and VR experiences. The Rift makes sense in games like this, as you probably would have a helmet on anyway. That is something a lot of games designers could consider: rather than trying to pretend it’s not something you have on, embrace it and make it part of the game narrative.
I experienced a crash, in typical Windows style, whilst trying to dock at a space station. I can’t tell if it was the Rift, the game or the machine that did it, but at a key moment of concentration and full immersion the shock of being unplugged from the environment, to lose the cockpit and the space around me and have them replaced by half of the Windows desktop in each eye, was horrendous. It seems the Rift and Elite were generating a different type of gaming flow. I was not experiencing flow because the task had become automatic; I was concentrating on landing the ship in the right place, which required a lot of thought. However, mentally I was experiencing sensory flow. I had stopped considering the medium as a barrier, or as anything other than real. For reality to be turned off with no warning created a mental jolt. It’s a very strange feeling and one that took a few moments to analyse and think about. It shows the power of these imagination enhancers and our blended reality. I should be used to this stuff but it still hits me 🙂
Way back in 2006, when many of us started to use the persistent multi user networked virtual world with self created content (Second Life) in all sorts of places like business and education, we thought we were on the cusp of a virtual world revolution. As a metaverse evangelist in a large corporation it was often an uphill struggle to persuade people that communicating in what looked like a game environment was valuable and worthwhile. It is of course valuable and worthwhile, but not everyone sees that straight away. In fact some people went out of their way to stop virtual environments and the people that supported and pioneered their use. Being a tech evangelist means patiently putting up with the same non arguments and helping people get to the right decision. Some people of course just can’t get past their pride, but you can’t win them all.
For me and my fellow renegades it started small in Second Life
I also captured many of the events over the course of that first year in a larger set, along with loads of blog posts on eightbar, including this, my first public post, which always brings back a lot of memories of the time and place. It was a risky thing (if you are worried about career prospects and job security) to publicly post about something that you know is interesting and important but that is not totally supported.
One of the many ways people came to terms with virtual worlds was in the creation of mirror worlds. We all do it. In a world of infinite digital possibilities we ground ourselves by remaking our home, our office, our locale. We have meeting rooms with seats and screens for PowerPoint. This is not wrong. As I wrote in 2008, when we had a massive virtual worlds conference in London:
It seems we are seeing many more mirror world applications appear once more, and generally the builds are in Minecraft. Interestingly, Minecraft has many of the attributes of more detailed virtual worlds like Second Life and Opensim. It is a persistent virtual environment. It has the 8 bit block look and much less detailed and animated avatars, but it does have a very easy approach to building: you stack blocks. You still have to navigate the 3D space, get your first person view in the right place and drop a block. (Many of the objections in 2006 were from people who felt they could not navigate 3D space easily, something I think has started to melt away.) The relative resolution of a build in Minecraft is low; you are not building with pixels but with large textured blocks. This does have a great levelling effect though. The richer environments require other 3D and texturing skills to make things that look good. Minecraft is often compared to Lego. In Lego you can make very clever constructions, but they still look like Lego. This becomes much less of a visual design challenge for people less skilled in that art. A reduced palette and single block type means not having to understand how to twist graphic primitives, upload textures, consider lighting maps etc. It is a very direct and instant medium. Minecraft has blocks that act as switches and wires, which allow the creation of devices that react, but it does not have the feature that I used the most in Second Life and Opensim: being able to write code in the objects. There are ways to write code for Minecraft mods, but it is not as instant as the sort of code scripts in objects in more advanced virtual environments. Those scripts alter the world, respond to it, push data to the outside and pull it back to the inside. They allow for a more blended interaction with the rest of the physical and digital world, preserving state, creating user interfaces etc.
It is all stuff that can be done with toolsets like Unity3D and Unreal Engine etc., but those are full dev environments. Scripting is an important element unless you are just creating a static exhibit for people to interact in. Minecraft also lacks many of the human to human communication channels. It does not have voice chat by default, though it is side loaded on some platforms. It has text chat, but it is very console based and not accessible to many people. The social virtual worlds thrive not just on positional communication (as in Minecraft, you can stand near something or someone) but on other verbal and non verbal communication.
The popularity of Minecraft has led to some institutions using it, or wishing to use it, to create mirror worlds. Already there was the Ordnance Survey UK creation, which “is believed to be the biggest Minecraft map made using real-world geographic data.” (Claiming firsts for things was a big thing back in 2006/7 for virtual worlds.) It has just been updated to include houses, not just terrain. By houses, though, this is not a full recreation of your house to walk into, but a block showing its position and boundary. The map uses 83 billion blocks, each representing 25m of the UK, according to a story by the BBC.
The BBC also reported this week on the British Museum looking at recreating all its exhibits and its buildings in Minecraft. This is an unusual approach to their exhibits. It will be an interesting build, and they are asking the public to help. So once again Minecraft’s simplicity and accessibility will allow anyone to create the artefacts. It will, however, be a very low res, low texture experience. So it is a mirror world, in that it will be like the museum, but it is more of a model village approach: you admire the talent it took to make it in the medium. It seems more of an art project than a virtual world project. It is all good though 🙂
The BBC seem to be on a Minecraft vibe, just as they were on a Second Life vibe back in the last wave. Paul Mason and BBC Newsnight came and did a piece with me and the team of fellow virtual world pioneers in 2007. We were starting to establish a major sub culture in the corporate world of innovators and explorers looking into how we could use virtual worlds and how it felt. Counter culture of this nature in a corporation is not always popular. I have written this before, but the morning of the day we filmed this for the BBC I had my yearly appraisal. My management line were not keen on virtual worlds, nor the success, so gave me the lowest rating they could. I went straight from that (which is very annoying and disheartening) into an interview with Paul on camera. Sometimes you just have to put up with misguided people trying to derail you; other times you just get out of the situation and go do it elsewhere.
Anyway, Minecraft offers a great deal. It does allow mirror worlds, though it also allows for an otherworldly approach. Most blocks do not obey gravity: you can build up and then out, a platform with no supporting structure, and you can dig tunnels and go underground (the point of the mine in Minecraft) without worrying about collapse. You still have real world up and down, left and right though. Most virtual worlds do this. Disney Infinity, Project Spark, Second Life, Opensim, the new High Fidelity etc. are all still avatars and islands at their core. People might mess with the scale and size of things, such as the OS map or the hard drive machines and guitars mentioned in this BBC piece on spectacular builds.
I feel we have another step to take in how we interact at distance, or over time, with others. Persistent virtual worlds allow us to either be in them or, even better, to actually be them. My analogy here is that I could invite you into my mind and my thoughts, represented in a virtual environment. Those things may not be actual mirrors of the real world; they might be concepts and ideas represented in all sorts of ways. It may not mean gravity and ground need to exist. We are not going to get to this until we have all been through mirror world and virtual world experiences. That is the foundation of understanding.
Whilst many people got the virtual worlds of Second Life et al back in 2006, we were only continuing the ground preparation of all the other virtual world pioneers before us. Minecraft is the first experience to really lay the foundations for everyone (all very serious, though it is fun to play with too!). These simple 8 bit style virtual bricks are the training ground for the next wave of virtual worlds, user generated content and our experience of one another’s ideas. They may be mirror worlds, but they have great value. There is no point in me building an esoteric hospital experience when we need real examples to train on. However, there is a point to having a stylised galleon floating in space as the theme for the group counselling experience we created.
The test for the naysayers of “it looks like a game” is really dying off now it seems. Instead it will be “it doesn’t look like the right sort of game”. This is progress, and I still get a rush when I see people understand the potential we have to connect and communicate in business, art, education or just good old entertainment. We could all sit and scoff and say yeah, we’ve done that already, yes, we know that etc. I am less worried about that; there might be a little voice in me saying I told you so, or wow, how wrong were you to try and put a stop to this, you muppets, but it is a very quiet voice. I want everyone to feel the rush of potential that I felt when it clicked for me, whatever their attitude was before. So bring it on 🙂