

Augmented Reality about Augmented Reality

I thought it would be interesting to revisit some of the Augmented Reality tools that have Unity3d packages for them. The first I headed for was Vuforia, now owned by PTC. I am using the free version. It all worked straight from the install, though there are a lot of features to tinker with.
I wanted to use an image marker of my own, so naturally I used the book cover. There is a lot of AR and VR in the story so it all makes sense.
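In case it helps anyone trying the same thing, here is a minimal sketch of reacting to the image target being found, based on the ITrackableEventHandler pattern from the Vuforia Unity samples of that era. The characterRoot field is just an illustrative name, not my actual script.

```csharp
using UnityEngine;
using Vuforia;

// Attached to the ImageTarget for the book cover. When Vuforia reports the
// target as tracked the character model is switched on; when it is lost the
// model is hidden again. "characterRoot" is an illustrative field name.
public class BookCoverTarget : MonoBehaviour, ITrackableEventHandler
{
    public GameObject characterRoot;
    private TrackableBehaviour trackable;

    void Start()
    {
        trackable = GetComponent<TrackableBehaviour>();
        if (trackable != null)
            trackable.RegisterTrackableEventHandler(this);
    }

    public void OnTrackableStateChanged(TrackableBehaviour.Status previousStatus,
                                        TrackableBehaviour.Status newStatus)
    {
        bool found = newStatus == TrackableBehaviour.Status.DETECTED ||
                     newStatus == TrackableBehaviour.Status.TRACKED ||
                     newStatus == TrackableBehaviour.Status.EXTENDED_TRACKED;
        characterRoot.SetActive(found);
    }
}
```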
For the 3d models I used the current trial of Fuse/Mixamo, now part of Adobe. It lets you build a model and export it in a nice Unity3d-friendly format. You would be surprised how these various formats for models and animations hide so much complexity and weird tech problems though. Eventually I found a couple of animations that fitted the mood. This is not a full action sequence from the book, but it could be. I have some more vignettes in mind to experiment with. I am just upgrading my Windows machine so I can use the Kinect 2.0 for some of my own motion capture again. Finding the right animation, and then trying to edit it, is really hard work. The addition of motion, especially in humans, hits a different part of the brain I think. It makes lots of things look not quite right.
Augmented Reality on Augmented Reality
So here is the video of augmented reality on a cover of a physical copy of an ebook novel telling a story with lots of augmented reality in it, if that's not too meta. Roisin rather sassily taps her foot while the Commander lays down the law to her.

Reconfigure and Cont3xt are available right now to enjoy, and if at all possible, PLEASE write a review on Amazon, it makes a massive difference to the take up of the books.

Excited tech celebs and a countdown – Meta #AR

There is a countdown running and people are showing their excitement. This time it is not for an enclosed virtual reality headset but for a full computing platform Augmented Reality headset: Meta @metaglasses. I have talked and written a lot (including the two novels!) about augmented reality and what it means when it is done properly. This teaser video features many luminaries of tech, pop culture and business evangelising the product based on the demo they have seen. Scoble seems particularly gushing about it. He posted an hour's worth of that on Facebook after his intro demo, but as it's all embargoed he could only talk generally.
With Hololens, Magic Leap and now Meta (assuming it is in the same category, which at $600 – $3,000 it will be) and of course my own fictional EyeBlend, there is a lot going on that may leapfrog the VR wave this time around.

Meta has been around and developing for a while. Initially it seemed to have to define itself against Google Glass, which was really just a heads-up display, not full AR. The early videos showed bug-eyed, aviator-inspired glasses, but the latest pictures are more of a visor with a noticeable sensor bar.
will.i.am seems pretty interested in it too. As he says, it opens up the possibilities for the arts. So maybe he will help me make the blended reality movie or series for Reconfigure?
This sort of kit goes past what we would call gaming and entertainment, as it provides realtime feedback from the World and the Internet of Things, helping us see the stuff we can't normally see, but in situ. Again, with Meta it will depend on how they build a physical model of the World (if they do) in order to implement the digital in-place views. Some earlier videos show real world objects, such as the flat surface of a box, being used as a tracking canvas and a reference point for the digital content.
The countdown clock is ticking though, and the final hype is being ramped up. So it is exciting, however it turns out. Their countdown says 19 days left (best check the site as that is of course an out of date piece of text 🙂 )

Using the Real World – IoT, WebGL, MQTT, Marmite, Unity3d and CKD

All the technology and projects I have worked on in my career take what we currently have and create or push something further. Development projects of any kind will enhance or replace existing systems or create brand new ones. A regular systems update will tweak and fix older code and operations and make them fresher and newer. This happens even in legacy systems. In both studying and living some of the history of our current wave of technology, powered by the presence of the Internet, I find it interesting to reflect on certain technology trajectories. Not least to try and find a way to help grow these industries, and with a bit of luck actually get paid to do that. I find the idea of things finding out about other things fascinating.
For Predlet 2.0's birthday party we took them all karting. There was a spare seat going so I joined in. The karts are all instrumented enough that the lap times are automatically grabbed as you pass the line. Just that one piece of data for each kart is then formulated and aggregated. Not just with your group, but with the “ProSkill” ongoing tracking of your performance. The track knows who I am now I have registered. So if I turn up and race again it will show me more metrics and information about my performance, just from that single tag crossing the end-of-lap sensor. Yes, that's IoT in action, and we have had that for a while.
Great fun karting. Yay for being faster than 9 year olds :)
The area of Web services is an interesting one to look at. Back in 1997, whilst working on a very early website for a car manufacturer, we had a process to get at some data about skiing conditions. It required a regular CRON job to be scheduled to perform a secure FTP and grab the current text file containing all the ski resorts' snowfall, so that we could parse it and push it into a form that could be viewed on the Web, i.e. with a nice set of graphics around it. That is nearly 20 years ago, and it was a pioneering application. It was not really a service or an API to talk to. It used the available automation we had, but it started as a manual process: pulling the file and running a few scripts to try and parse the comma-delimited data. The data, of course, came from both human observation and sensors. It was collated into one place for us to use. It was a real World set of measurements, pulled together and then adjusted and presented in a different form over the Internet via the Web. I think we can legitimately call that an Internet of Things (IoT) application?
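Just to make that pattern concrete, here is a minimal C# sketch of the kind of parsing that job did: read a comma-delimited text file pulled down by FTP and turn each line into something a web page can render. The file name and field layout are invented for illustration; the original format is long gone.

```csharp
using System;
using System.IO;

// A rough sketch of the 1997 ski-report job: read a comma-delimited text
// file (already fetched by FTP) and emit an HTML table row per resort.
// The field names and layout are invented for illustration.
class SkiReportParser
{
    static void Main()
    {
        foreach (var line in File.ReadLines("ski_resorts.txt"))
        {
            var fields = line.Split(',');
            if (fields.Length < 3) continue; // skip corrupt rows, which did happen

            string resort = fields[0].Trim();
            string snowDepthCm = fields[1].Trim();
            string lastSnowfall = fields[2].Trim();

            Console.WriteLine($"<tr><td>{resort}</td><td>{snowDepthCm}cm</td><td>{lastSnowfall}</td></tr>");
        }
    }
}
```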
We had a lot of fancy and interesting projects then, well before their time, but they are templates for what we do today. Hence I am heavily influenced by those, and having absorbed a few years ago what may seem new today, I like to look to the next steps.
Another element of technology that features in my work is the way we write code and deploy it, in particular the richer, dynamic game-style environments that I build for training people in. I use Unity3d mostly. It has stood the test of time and moved on with the underlying technology. In the development environment I can place 3D objects and interact with them, sometimes stand alone, sometimes as networked objects. I tend to write in C# rather than Javascript, but it can cope with both. Any object can have code associated with it. It understands the virtual environment: where something is, what it is made of etc. A common piece of code I use picks one of the objects in the view, and then, using the mouse, the virtual camera view can orbit that object. It is still an interesting feeling to be able to spin around something that initially looks flat and 2D. It is like a drone's-eye view, hovering or passing over objects.
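For anyone curious, a minimal sketch of that orbit behaviour in Unity C# might look something like this; the target field and speed value are illustrative, not the exact script I use.

```csharp
using UnityEngine;

// A minimal orbit-camera sketch: pick a target object and let the mouse
// drag the camera around it. Attach to the camera; "target" and the
// rotateSpeed value are illustrative.
public class OrbitCamera : MonoBehaviour
{
    public Transform target;
    public float rotateSpeed = 120f;

    void LateUpdate()
    {
        if (target == null) return;

        if (Input.GetMouseButton(0))
        {
            float yaw = Input.GetAxis("Mouse X") * rotateSpeed * Time.deltaTime;
            float pitch = -Input.GetAxis("Mouse Y") * rotateSpeed * Time.deltaTime;

            // Orbit around the target: yaw about the world up axis,
            // pitch about the camera's own right axis.
            transform.RotateAround(target.position, Vector3.up, yaw);
            transform.RotateAround(target.position, transform.right, pitch);
        }

        transform.LookAt(target);
    }
}
```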
Increasingly I have had to get the Unity applications to talk to the rest of the Web. They need to integrate with existing services, or with databases and APIs that I create: user logons, question data sets, training logs etc. In many ways it is the same as back in 1997. The pattern is the same, yet we have a lot more technology to help us as programmers. We have self-describing data sets now. XML used to be the one everyone raved about: basically web-like tags around data to start and stop a particular data element. It was always a little too heavy on payload though. When I interacted with the XML data for the tennis ball locations at Wimbledon, the XML was too big for Second Life to cope with at the time. The data had to be mashed down a little, removing the long descriptions of each field. Now we have JSON, a much tighter description of data. It is all pretty much the same of course. An implied comma-delimited file, such as the ski resort weather, worked really well as long as the export didn't corrupt it. An XML version could be tightly controlled and parsed in a more formal, language-style way; JSON sits between the two. In JSON the data is just name:value, as opposed to XML's <name>value</name>. It is the sort of data format that I used to end up creating anyway, before we had the formality of this as a standard.
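To show the payload difference, the same (made up) ski resort record could look like this in each of the three formats:

```
CSV:   Val d'Isere,185,2016-02-12

XML:   <resort>
         <name>Val d'Isere</name>
         <snowDepthCm>185</snowDepthCm>
         <lastSnowfall>2016-02-12</lastSnowfall>
       </resort>

JSON:  { "name": "Val d'Isere", "snowDepthCm": 185, "lastSnowfall": "2016-02-12" }
```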
Unity3d copes well with JSON natively now. It used to need a few extra bits of code, but as I found out recently it is very easy to parse a web-based API in code, extract those pieces of information and adjust the contents of the 3d environment accordingly. By easy, I mean easy if you are a techie. I am sure I could help most people get to the point of understanding how to do this. I appreciate too that having done this sort of thing for years there is a different definition of easy.
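As a rough illustration only (the URL and the ObjectInfo shape are invented, and the exact request calls have shifted a little between Unity versions), pulling JSON from an API and using it to move an object can be as little as this:

```csharp
using System.Collections;
using UnityEngine;
using UnityEngine.Networking;

// A minimal sketch of pulling JSON from a web API and using it to adjust an
// object in the scene. JsonUtility only needs a [Serializable] class whose
// fields match the JSON; the URL and ObjectInfo shape here are made up.
public class ApiDrivenObject : MonoBehaviour
{
    [System.Serializable]
    public class ObjectInfo
    {
        public string name;
        public float x;
        public float y;
        public float z;
    }

    IEnumerator Start()
    {
        using (UnityWebRequest request = UnityWebRequest.Get("https://example.com/api/object/1"))
        {
            yield return request.SendWebRequest();

            if (string.IsNullOrEmpty(request.error))
            {
                ObjectInfo info = JsonUtility.FromJson<ObjectInfo>(request.downloadHandler.text);

                // Adjust the 3d environment from the parsed data.
                transform.position = new Vector3(info.x, info.y, info.z);
                name = info.name;
            }
        }
    }
}
```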
It is this grounding in pulling real World data, manipulating it, and serving it over the Internet to the Web that seems to be a constant pattern. It is the pattern of IoT and of Big Data.
As part of the ongoing promotion of the science fiction books I have written, I created a version of the view Roisin has of the World in the first novel, Reconfigure. In that she discovers an API that can transcribe and describe the World around her.
This video shows a simulation of the FMM v1.0 (Roisin’s application) working as it would for her. A live WebGL version that just lets you move the camera around to get a feel for it is here.

WebGL is a new target that Unity3d can publish to. Unity used to be really good because it had a web plugin that let us deploy rich 3d applications to any web browser, rather than just building for PC, Mac and tablets. Every application I have done over the past 7 years has generally had the web plugin version at its core to make life easier for the users. Plugins are dying, though, and are no longer supported on many browsers. Instead the browser itself has functions to draw things, move things about on screen etc. So Unity3d now generates the same thing the plugin provided, which used to be common code, as a mini version for each application that is published. It is still very early days for WebGL, but it is interesting to be using it for this purpose as a test, and for some other API interactions with sensors across the Web.
In the story, Roisin's interaction starts as a basic command line (but over Twitter DM), almost like the skiing FTP of 1997. She interrogates the API and figures out the syntax, then builds a user interface for it, using Unity3d of course. The API only provides names and positions of objects, hence the cube view of the World. Roisin is able to move virtual objects and the API then, using some quantum theory, is able to move the real World objects. In the follow-up, this basic interface gets massively enhanced, with more IoT-style ways of interacting with the data, such as MQTT for messaging instead of Twitter DMs as in the first book. All real World stuff, except the moving things around. All evolved through long experience in the industry, to explain it in relatively simple terms and then let the adventure fly.
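Purely to illustrate the MQTT pattern the follow-up leans on (this is not the fictional FMM code itself), a subscribe-and-publish sketch using M2Mqtt, a C# client library commonly dropped into Unity projects, might look like this; the broker address and topic names are made up.

```csharp
using System;
using System.Text;
using uPLibrary.Networking.M2Mqtt;
using uPLibrary.Networking.M2Mqtt.Messages;

// A rough MQTT sketch: subscribe to a topic and publish a command, instead
// of polling a Twitter DM inbox. Broker address and topic names are invented.
class MqttSketch
{
    static void Main()
    {
        var client = new MqttClient("broker.example.com");
        client.Connect(Guid.NewGuid().ToString());

        // React to messages as they arrive on the subscribed topic.
        client.MqttMsgPublishReceived += (sender, e) =>
            Console.WriteLine($"{e.Topic}: {Encoding.UTF8.GetString(e.Message)}");

        client.Subscribe(new[] { "fmm/objects" },
                         new[] { MqttMsgBase.QOS_LEVEL_AT_LEAST_ONCE });

        // Send a command out the same way.
        client.Publish("fmm/commands", Encoding.UTF8.GetBytes("list"));
    }
}
```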
I hope you can see the lineage of the technology in the books. I think the story and the twists and turns are the key though. The base tech makes it real enough to start to accept the storyline on top. When I wrote the tech parts and built the storyboard, they were the easy bits. How to weave in some intrigue, danger and peril was something else. From what I have been told, and what I feel, this has worked. I would love to know what more people think about it though. It may work as a teaching aid for how the internet works, what IoT is etc., for any age group, from schools to the boardroom? The history and the feelings of awe and anger at the technology are something we all feel at some point with some element of our lives too.
Whilst I am on the real World, though: one of the biggest constants in Roisin's life is the like-it-or-love-it taste of Marmite. It has become, through the course of the stories, almost a muse-like character. When writing you have to be careful with real life brands. I believe I have used the ones I have in these books as proper grounding in the real World. I try to be positive about everyone else's products, brands and efforts.
In Cont3xt I also added in some martial arts, from my own personal experience again, but adjusted a little here and there. The initial use of it in Cont3xt is not what you might expect when you hear martial art. I am a practitioner of Choi Kwang Do, though I do not specifically call any of the arts used in the book by that name, as there are times it is used aggressively, not purely for defence. The element of self-improvement is in there, but with a twist.
Without the background in technology over the years, seeing it evolve, and without my own gradual personal journey in Choi Kwang Do, I would not have had the base material to draw upon and to evolve the story on top of.
I hope you get a chance to read them, it’s just a quick download. Please let me know what you think, if you have not already. Thank you 🙂

It’s alive – Cont3xt – The follow up to Reconfigure

On Friday I pressed the publish button on Cont3xt. It whisked its way onto Amazon and the KDP Select processes. The text box suggested it might be 72 hours before it was live; in reality it was only about 2 hours, followed by a few hours of links getting sorted across the stores. I was really planning to release it today, but, well, it's there and live.
I did of course have to mention it in a few tweets, but I am not intending to go overboard. I recorded a new promo video, and played around with some of the card features YouTube now has to provide links. I am just pushing a director's cut version. The version here is a 3 minute promo. The other version is 10 minutes, explaining the current state of VR and AR activity in the real industry.


As you can see I am continuing the 0.99 price point. I hope that encourages a few more readers to take a punt and then get immersed in Roisin’s world.
Cont3xt is full of VR, AR and her new Blended Reality kit. It has some new locations and even some martial arts fight scenes. Peril, excitement and adventure, with a load of real World and future technology. What's not to like?
I hope you give it a download and get to enjoy it as much as everyone else who has read it seems to.
This next stage in the journey has been incredibly interesting and I will share some of that later. For now I just cast the book out there to see whether people will be interested now there is the start of a series 🙂

Making Movies – Flying around in Unity3d

Yes, ok, my latest obsession of writing #Reconfigure might make a great TV series or movie, but that's not what this is about. Instead it is about another set of skills I seem to be putting into practice.
Before that I spent a day editing up the school end-of-year video. I didn't shoot it, but I did use all the various messing around and comedy moments from the teachers and staff to put a montage together. It had a story, some pace, a step change or two. It was great fun. I have now been asked to edit a few more things together. I like editing moving images, it is fun. Though I am not charging for this; it is just to help people out.
At the same time I am in the middle of a little project to try and jazz up a regular presentation in the corporate space. I demoed a little virtual fly-through of a building, past a couple of characters and onto an image, using Unity3d. It turned into a slightly more complex epic, but nonetheless I am using lots of varied skills to make it work. The pitch will be pre-canned, but I have built the Unity environment so that it runs on rails, yet live. I have C# code in there to help do certain things like play video textures, run animation controllers and even use NavMesh agents to allow a character to walk around to a particular room (a rough sketch of that piece is below), plus lots of flying cameras. I had used the first few things a lot; it is stock Unity. However I had not really had a chance to use Unity animations. All my animations had been either remade FBX files or ones that I created in Cheetah3D. Once you imported an animation like that it was then sort of locked in place.
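Here is that rough NavMesh sketch; the scene needs a baked NavMesh, the character needs a NavMeshAgent component, and the roomTarget field is just an illustrative name, not my actual pitch code.

```csharp
using UnityEngine;
using UnityEngine.AI;

// A rough sketch of the NavMesh part of the pitch: send a character walking
// to a particular room, then do something when it arrives. "roomTarget" is
// an illustrative field; the scene needs a baked NavMesh.
public class WalkToRoom : MonoBehaviour
{
    public Transform roomTarget;
    private NavMeshAgent agent;
    private bool arrived;

    void Start()
    {
        agent = GetComponent<NavMeshAgent>();
        agent.SetDestination(roomTarget.position);
    }

    void Update()
    {
        // Trigger the next beat of the "on rails, but live" sequence on arrival.
        if (!arrived && !agent.pathPending &&
            agent.remainingDistance <= agent.stoppingDistance)
        {
            arrived = true;
            Debug.Log("Character has reached the room");
        }
    }
}
```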
For animation built inside the editor, though, Unity3d has its own dope sheet and curve view, and it is really handy.
The Animation tab brings up a different timeline. It is different from the Animator Controller screen that lets you string animations together and create transitions, e.g. moving into a jump or a run from a walk.
The Animation tab lets you select an object in the scene and creates a .anim for you. By default it then adds that anim to an animation controller and adds that controller to the object.
Unity3d Animation Timeline
The record mode then allows you to expand any or all of the properties of the object, usually transform position and rotation, but it can be colour, visibility, basically anything with a parameter. If you move the timeline and then move the object, a camera for instance, it creates the keyframes for the changed parameters. It removes any of the awkward importing of paths and animation ideas from another tool. It is not great for figures and people; they are still best imported as skinned meshes with skeletons, letting the import deal with all the complexity so you can add any stock animations to them. However, for a cut scene, or a zoom past to see a screen on a wall, it works really well.
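The Animation window does all of this interactively, but just to show what it is producing under the hood, the same sort of keyframing can be sketched in code with an AnimationClip and an AnimationCurve. This is a rough illustration only, using the legacy Animation component for brevity; an .anim created in the editor is normally the easier route.

```csharp
using UnityEngine;

// A rough scripted equivalent of recording keyframes in the Animation window:
// build a curve, put it in a clip and play it with the legacy Animation component.
public class ScriptedFlyBy : MonoBehaviour
{
    void Start()
    {
        var clip = new AnimationClip { legacy = true };

        // Keyframes for the object's local x position: start at 0, ease to 10 over 4 seconds.
        var curve = AnimationCurve.EaseInOut(0f, 0f, 4f, 10f);
        clip.SetCurve("", typeof(Transform), "localPosition.x", curve);

        var anim = gameObject.AddComponent<Animation>();
        anim.AddClip(clip, "flyby");
        anim.Play("flyby");
    }
}
```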
I have written a lot of complicated things in Unity3d, but never bothered using this part. It is great to find it working, and doing exactly what I needed it to do for my cinematic cut scenes.
You can have things zooming and spinning around a known path in no time. You have to break it down a little otherwise it resolves the easiest path through a wall or something, but it seems very robust to edit it afterwards.
The only pity is that it's not possible to lock a camera preview window open in stock Unity3d when selecting another object. Having got a camera pan just right, it's tricky if you want to move an object in the camera view: with the camera selected you see its preview, with the object selected you only see the object. I have not got around that yet; I just end up with trial and error.
No matter, it works, and it's awesomely useful.

Return of the camp fire – Blended Reality/Mixed Reality

Back in 2006, when we all got into the previous wave of Virtual Reality (note the use of previous, not 1st) and we tended to call them virtual worlds or the metaverse, we were restricted to an online presence via a small rectangular screen attached to a keyboard. Places like Second Life became the worlds that attracted the pioneers, explorers and the curious.
Several times recently I have been having conversations, comments, tweets etc. about the rise of the VR headsets, Oculus Rift et al, AR phone holders like Google Cardboard and MergeVR, and then the mixed reality future of Microsoft Hololens and the Google-backed Magic Leap.
In all the Second Life interactions and subsequent environments it has always really been about people. The tech industry has a habit of focusing on the version number, or a piece of hardware though. The headsets are an example of that. There may be inherent user experiences to consider but they are more insular in their nature. The VR headset forces you to look at a screen. It provides extra inputs to save you using mouse look etc. Essentially though it is an insular hardware idea that attempts to place you into a virtual world.
All our previous virtual world exploration has been powered by the social interaction with others. Social may mean educational, may mean work meetings, or may mean entertainment or just chatting. However it was social.
Those of us who took to this felt, as we looked at the screen, that we were engaged and part of the world. Those who did not get what we were doing only saw funny graphics on the screen. They did not make that small jump in understanding. There is nothing wrong with that, but that is essentially what we were all trying to get people to understand. A VR headset provides a direct feed to the eyes. It forces that leap into the virtual, or at least brings it a little closer; then you are ready to engage with people. Though at the moment most VR is engaging with content. It is still in showing-off mode.
We would sit around campfires in Second Life, in particular Timeless Prototype's very clever-looking one that adds seats/logs per person/avatar arriving. Shared focal points of conversation. People gathering, looking at a screen in front of them but being there mentally. The screen providing all the cues and hints to enhance that feeling. A very real human connection. Richer than a telephone call, less scary than a video conference, intimate yet stand-offish enough to meet most people's requirements if they gave it a go.
It is not such a weird concept as many people get lost in books, films and TV shows. They may be immersed as a viewer not a participant or they may be empathising with characters. They are still looking at an oblong screen or piece of paper and engaged.
With a VR headset, as I just mentioned, that is more forced. They will understand the campfire that much quicker and see the people and the interaction. There is less abstraction to deal with.
The rise of the blended and mixed reality headsets, though, changes that dynamic completely. Instead they use the physical world, one that people already understand and are already in. The campfire comes to you, wherever you are.
This leads to a very different mental leap. You can have three people sat around the virtual campfire in their actual chairs in an actual room. The campfire is of course replaceable with any concept, piece of information or simulated training experience. Those people each have their own view of the world and of one another, but it is added to with a new perspective: they see the side of the campfire they would expect.
It goes further though; there is no reason for the campfire not to coexist in several physical spaces. I have my view of the campfire from my office, you from yours. We can even have a view of one another as avatars placed into our physical worlds, or not. It's optional.
Just keeping with that physical blended model, it is very simple for anyone to understand and start to believe. Sitting in a space with one another sharing an experience like a campfire is a very human, very obvious one. For many, that is where this will sort of end: just run-of-the-mill information, enhanced views, cut-away engineering diagrams, point-of-view understanding etc.
The thing to consider, for those who already grok the virtual world concept, is that just as in Second Life you can move your camera around and see things from any point of view, from the other person's point of view, from a bird's-eye view, from close in or far away, the mixed reality headsets will still allow us to do that. I could alter my view to see what you are seeing on the other side of the campfire. That is the start of something much more revolutionary to consider whilst everyone catches up with the fact that this is all about people, not just insular experiences.
This is the real metaverse, a blend of everything anywhere and everywhere, connecting us as people, our thoughts, dreams and ideas. There are huge possibilities to explore, and more people will explore them once that little hurdle is jumped and they understand it's people; it's always been about people.

Microsoft Minecraft, Space and Semantic Paint

It is interesting looking for and spotting patterns and trends and extrapolating the future. Sometimes things come in and get focus because they are mentioned a lot on multiple channels, and sometimes those things converge. Right at the moment it seems that Microsoft are doing all the right things in the emerging tech, game technology and general interest spaces.
Firstly there has been the move by Microsoft to make Minecraft even easier to use in schools.

They are of course following on and officially backing the sort of approach the community has already taken. However a major endorsement and drive to use things like Minecraft in education is fantastic news. The site is education.minecraft.net
And just to revert to type again: this is, as the video says, not about goofing around in a video game. This is about learning maths, history, how to communicate online. These are real metaverse tools that enable rich immersive opportunities, on a platform that kids already know and love. Why provide them with a not very exciting online maths test when they can use multiple skills in an online environment together?
Microsoft have also supported the high-end sciences, announcing that they are teaming up with NASA to use the blended/mixed/augmented reality headset Hololens to help in the exploration of Mars. This is shared-experience telepresence: bringing Mars to the room, desk or wall that several people are sharing in a physical space. The Hololens and the next wave of AR are looking very exciting and, very importantly, they have an inclusive approach to the content. Admittedly each person needs a headset, but using the physical space each person can see the same relative view of the data and experience as the rest of the people. They don't have to, of course :) there is no reason for the screens not to show different things in context for each user. However, standard VR is an immersed experience without much awareness of others. A very important point, I feel.
Microsoft have also published a paper and video about using depth scanning and live understanding and labelling of real world objects, something that the likes of Google Tango will also be approaching.
Slightly better than my 2008 attempt (had to re-use this picture again of the Nokia N95)
Hackday5 Augmented Reality step 2


The real and virtual are becoming completely linked now. Showing how the physical universe is actually another plane alongside all the other virtual and contextual metaverses.
It is all linked and not about isolation, but understanding and sharing of information, stories and facts.

Untethering Humans, goodbye screens

We are on the cusp of a huge change in how we as humans interact with one another, with the world and with the things we create for one another. A bold statement, but one that stands up, I believe, by following some historical developments in technology and social and work related change.
The change involves all the great terms, Augmented Reality, Virtual Reality, Blended Reality and the metaverse. It is a change that has a major feature, one of untethering, or unshackling us as human beings from a fixed place or a fixed view of the world.

Hovering
Here I am being untethered from the world in a freefall parachute experience, whilst also being on TV 🙂

All the great revolutions in human endeavour have involved either transporting us via our imagination to another place or concept, books, film, plays etc. or transporting us physically to another place, the wheel, steam trains, flight. Even modern telecommunications fit into that bracket. The telephone or the video link transport us to a shared virtual place with other people.

Virtual worlds are, as I may have mentioned a few times before, ways to enhance the experience of humans interacting with other humans and with interesting concepts and ideas. That experience, up to now, has been a tethered one. We have spent the last few decades becoming reliant on rectangular screens: windows on the world, showing us text, images, video and virtual environments. Those screens have in turn evolved. We had large bulky cathode ray tubes, then LED, plasma, OLED and various flat wall projectors. The screens have always remained a frame, a fixed size, tethering us to a more tunnel-vision view. The screen is a funnel through which elements are directed at us. We started the untethering process with wi-fi and mobile communications. Laptops, tablets and smartphones gave us the ability to take that funnel, a focused view of a world, with us.

More recent developments have led to the VR headsets, an attempt to provide an experience that completely immerses us by providing a single screen for each eye. It is a specific medium for many yet-to-be-invented experiences. It does, though, tether us further. It removes the world. That is not to say the Oculus Rift, Morpheus and HTC Vive are not important steps, but they are half the story of untethering the human race. Forget the bulk and the weight; we are good at making things smaller and lighter, as we have seen with the mobile telephone. The pure injection of something into our eyes and ears via a blinkering system feels, and is, more tethering. It is good at the second affordance of transporting us to places with others, and it is where virtual worlds and the VR headsets naturally and obviously co-exist.

The real revolution comes from full blended reality and realities. That plural is important. We have had magic-lens and magic-mirror Augmented Reality for a while: marker-based and markerless ways to use one of these screens that we carry around, or have fixed in our living rooms, to show us digital representations of things placed in our environment. They are always fun. However they are almost always just another screen within our screens. Being able to see, feel and hear things in our physical environment wherever we are in the world, and have them form part of that environment, truly untethers us.
Augmented Reality is not new of course. We, as in the tech community, have been tinkering with it for years. Even this mini example on my old Nokia N95 back in 2008 starts to hint at the direction of travel.

Hackday5 Augmented Reality step 2

The devices used to do this are obviously starting at a basic level, though we have the AR headsets of Microsoft Hololens and the Google-backed Magic Leap starting to push the boundaries. They will not be the final result of this massive change. With an Internet of Things world, with the physical world instrumented and producing data, with large cloud servers offering compute power at the end of a wireless connection to analyse that data, and with the ability to visualize and interact with things in our physical environment, we have a lot to discover and to explore.

I mentioned realities, not just reality. There is no reason to only have augmentation of the physical world. After all, if you are immersed in a game environment or a virtual world you may actually choose, because you can, to shut out the world, to draw the digital curtains and explore. However, just as when you are engaged in any activity anywhere, things come to you in context. You need to be able to interact with other environments from whichever one you happen to be in.

Take an example, using Microsoft Hololens, Minecraft and Skype. In today's world you would have Minecraft on your laptop/console/phone and be digging around, building and sharing the space with others. A Skype call comes in from someone you want to talk to. You switch windows away from Minecraft and focus on Skype. It is all very tethered. In a blended reality, as Hololens has shown, you can have Minecraft on the rug in front of you and Skype hanging on the wall next to the real clock. Things and data placed in the physical environment in a way that works for you and for them. However, you may want to be more totally immersed in Minecraft and go full VR. If something can make a small hole in the real world for the experience, then it can surely make an all-encompassing hole, thus providing you with only Minecraft. Yet, if it can place Skype on your real wall, then it can also place it on your virtual walls and bring that along with you.
This is very much a combination that is going to happen. It is not binary to be either in VR or not, in AR or not. It is either, both or neither.

It is noticeable that Facebook, who bought heavily into Oculus Rift, purchased Surreal Vision last month, a company that specializes in using instrumentation and scanning kit to make sense of the physical world and place digital data in that world. Up until now Oculus Rift, which has really led the VR charge since its Kickstarter (yes, I backed that one!), has been focussed on the blinkered version of VR. This purchase shows the intent to go for a blended approach. Obviously this is needed, as otherwise Magic Leap and Hololens will quickly eat into the Rift's place in the world.
So three of the world's largest companies, Facebook, Google and Microsoft, have significant plays in blended reality and the “face race”, as it is sometimes called, to get headsets on us. Sony have Project Morpheus, which is generally just VR, yet Sony have had AR applications for many years with PS Move.
Here is Predlet 2.0 enjoying the AR EyePet experience back in 2010 (yes, five years ago!)

So here it is. We are getting increasingly accurate ways to map the world into data, we have the world producing IoT data streams, we have ever more ubiquitous networks, and we have devices that know where they are and what direction they are facing. We have high-power backend servers that can make sense of our speech and of context. On top of that we have free-roaming devices to feed and overlay information to us; yes, a little bit bulky and clunky, but that will change. We are almost completely untethered. Wherever we are we can experience whatever we need or want, and we can, in the future, blend that with more than one experience. We can introduce others to our point of view or keep our point of view private. We can fully immerse or just augment, or augment our full immersion. We can also make the virtual real with 3d printing!
That is truly exciting and amazing isn’t it?
It makes this sort of thing in Second Life seem an age away, but it is the underpinning for me and many others. Even in this old picture of the initial Wimbledon build in 2006 there is a cube hovering at the back. It is one of Jessica Qin’s virtual world virtual reality cubes.
Wimbledon 06 magic carpet
That cube and other things like it provided an inspiration for this sort of multiple-level augmentation of reality. It was constrained by the tethered screen but was, and still is, remarkably influential.
Jessica Qin Holocube
Here my Second Life avatar, and so my view of a virtual world, is inside a 3d effect that provides a view, a curved 3d view not just a picture, of something else. It is as if the avatar is wearing a virtual Oculus Rift, if that helps get the idea across.
This is from 2008, by way of a quick example of the fact that even then it could work.

So lets get blending then 🙂

VR – Everything old is new again – good!

It is important when getting interested and excited about new things to look at their lineage, something I often did in my series of articles over the years in Flush Magazine.
So, with the current rush of Virtual Reality gaming and experiences and the slew of new kit, we have some very interesting, nearly off-the-shelf kit such as this.
*warning it contains violence (see where I am on that here)

However, back in the 90's we had VR kit like Virtuality (they must be kicking themselves for being too early for the masses, though that's a curse us early adopters and evangelists have to live with 🙂)

The headsets were quite heavy and the container you stood or sat in was not a treadmill but part of the sensing rig.
The graphics may look old fashioned and clunky but they were good experiences. When I was at poly/uni in Leicester getting ready to do a year out, this was the company I wanted to go and work for if I had the chance. I was sent to IBM instead, though as you can see from the Virtuality company details they had very close ties with IBM. So it was close 🙂 It is also funny how things work out.
Still, it is very cool having all these headsets sat around me, and some that just work on my portable communication device too 🙂

Over 15 years experience in Virtual Worlds, 30+ in tech – What now?

A few days ago I realised it has been over 9 years since I first publicly blogged about how important I thought the principles of the metaverse and virtual worlds were going to be for both social and business uses. This post, pictured below for completeness, was a tipping point for some radical changes in many of our lives as part of Eightbar.
Second Life first post
I had been working towards that sort of understanding since some very early work, around 1999/2000 with SmartVR, on virtual environments and how people get to interact with one another in them. We were trying to keep the social bond in our internal web and multimedia design group together when we were cast asunder to different locations by business pressures (or bad decisions, who knows!). I knew we had to have a sense of one another's existence aside from text in emails and instant messages. So we tried to build a merged version of both offices as if they were in one place, with the aim of then instrumenting those with presence and the ability to walk over to someone's desk and talk. Mirror worlds, blended reality and even the Internet of Things. Yes, I know, a bit before its time! Cue music, “Story of my life” by 1D 🙂

We definitely had a technology and expectation bubble in 2006-2009. However that, as with all emerging technology, is all part of the evolution; the Gartner hype curve et al. What surprises me the most still is that people think when a bubble like that bursts that it's all over. That somehow everything that was learned in that time was pointless. “Are virtual worlds still a thing?” etc. I feel for the even earlier pioneers like Bruce Damer, who patiently put up with our ramblings as we all rushed to discover and feel for ourselves those things he already knew.
Increasingly I am talking to new startups and seeing new activity in the virtual space. The same use cases, the same sparks of creativity that we had in the previous wave(s), the same infectious passion to do something interesting and worthwhile. Sometimes this is somehow differentiated from the last wave of virtual worlds under the heading of virtual reality. The current wave is focussed on the devices, the removal of the keyboard and of a fixed screen: the Oculus Rift, HoloLens etc. However, that's a layer of new things to learn and experience on top of what we have already been through. After all, it's a virtual world you end up looking at, or blending with, in a VR headset!

I spend so much time looking forward and extrapolating concepts and ideas it is now very scary to look back and consider the experience I have gathered. The war stories and success stories, the concepts and ideas that I have tried. The emotional impact of virtual worlds. The evolution of the technology and of people’s expectation of that technology. The sheer number of people that have moved around in a 3d environment from an early age with things like Minecraft, who are now about to enter higher education and the workforce.

So I am left in a slightly bemused state as to what to do with this knowledge. With this all going so much more mainstream again I am no longer working in a niche. Do I ply my trade of independent consulting chipping away in odd places and helping and mentoring some of the new entrants in the market or do I try and find a bigger place to spread the word?

At the same time though, knowing lots of things makes you realise how much you don't know. Imposter syndrome kicks in. Surely everyone must know all this stuff by now? It's obvious, and it stands up to logical reasoning, to try and connect with other people in as rich a way as possible. The network is there, the tech is there, the lineage is there. Though clearly not everyone gets it yet. I often wonder if the biggest naysayers I had to deal with on my journey so far have figured it out yet? It will probably turn out they are getting rich quick right now whilst I sit and ponder such things.

On the other side of the virtual coin, I know from my martial arts that constant practice and investigation leads to a black belt. In the case of Choi Kwang Do that's just over 3 years' worth. So how many Dans' worth of tech-equivalent experience does that put me at? 25 years in the industry professionally, but 30+ as a techie.

I still want to do the right thing, to help others level themselves up. I don't think I am craving fame and fortune, but the ability to share and build is what drives me. If fame and fortune as a spokesperson or evangelist for some amazing idea or TV show reaches that end, then that's fine by me.

I am at a crossroads, my VR headset letting me look in all directions. I see half-built roads in many directions. What now? A well funded company that I can help build a great road with, or forging off down one of the other paths, seeing what happens on the way?
Of course that makes it seem like there is a clear choice on the plate. I suspect most of the well funded companies and corporations don't think they need any help, which is rather where I came in on this!

Needless to say I am always open to conversations, offers, partnerships, patronage, retainers and technology evangelist roles. There must be a slew of investors out there wondering what to put their money into, who need some sage advice ;). Or that book… there is always that book… (600 images x 1,000 words each is 600,000 words just on these Second Life experiences, and that's just the ones online; the offline ones double that! Not to mention the other places, games, and self-built experiences!)

I took this photo in April 2006 as part of sharing our journey.
The wilderness

I have always liked a nice greenfield to start building on. Equally that did not build itself, it was a massive shared team experience. No one has all the answers. Some of us are good at helping people find them though.

Right! Can we get on with this now?