

Excited tech celebs and a countdown – Meta #AR

There is a countdown running and people are showing their excitement. This time, not for an enclosed virtual reality headset but for a full computing platform Augmented Reality headset. It is Meta @metaglasses. I have talked and written a lot (including the two novels!) about augmented reality and what it means when it is done properly. This teaser video features many luminaries of tech, pop culture and business evangelising the product based on the demo they have seen. Scoble seems particularly gushing about it. He posted an hour's worth of that on Facebook after his intro demo, but as it's all embargoed he could only talk generally.
With Hololens, Magic Leap and now Meta (assuming it is in the same category, which at $600 – $3000 it will be) and of course my own fictional EyeBlend, there is a lot going on that may leapfrog the VR wave this time around.

This has been around and developing for a while. Initially it seemed to have to answer to Google Glass, which was really just a heads-up display, not full AR. The early videos show bug-eyed, aviator-inspired glasses, but the latest pictures are more visor-like with a noticeable sensor bar.
will.i.am seems pretty interested in it too. As he says, it opens up the possibilities for the arts. So maybe he will help me make the blended reality movie or series for Reconfigure?
This sort of kit goes past what we would call gaming and entertainment, as it provides real-time feedback from the world and the Internet of Things. Helping us see the stuff we can't see normally, but in situ. Again, with Meta it will depend on how they build the physical model, or if they do, of the world in order to implement the digital in-place views. Some earlier videos show real-world objects, a flat surface such as a box, being used for canvas tracking and as a reference point for the digital content.
The countdown clock is ticking though, and the final hype is being ramped up. So it is exciting, however it turns out. Their countdown says 19 days left (best check the site, as that is of course an out-of-date piece of text 🙂 )

Using the Real World – IoT, WebGL, MQTT, Marmite, Unity3d and CKD

All the technology and projects I have worked on in my career take what we currently have and push it further. Development projects of any kind will enhance or replace existing systems or create brand new ones. A regular systems update will tweak and fix older code and operations and make them fresh again. This happens even in legacy systems. Having both studied and lived some of the history of our current wave of technology, powered by the presence of the Internet, I find it interesting to reflect on certain technology trajectories. Not least to try and find a way to help grow these industries, and with a bit of luck actually get paid to do that. I find the idea of things finding out about other things fascinating.

For Predlet 2.0's birthday party we took them all karting. There was a spare seat going so I joined in. The karts are all instrumented enough that the lap times are automatically grabbed as you pass the line. Just that one piece of data for each kart is then formulated and aggregated. Not just within your group, but with the "ProSkill" ongoing tracking of your performance. The track knows who I am now I have registered. So if I turn up and race again it will show me more metrics and information about my performance, just from that single tag crossing the end-of-lap sensor. Yes, that's IoT in action, and we have had it for a while.
Great fun karting. Yay for being faster than 9 year olds :)
The area of Web services is an interesting one to look at. Back in 1997, whilst working on a very early website for a car manufacturer, we had a process to get at some data about skiing conditions. It required a regular CRON job to be scheduled and perform a secure FTP to grab the current text file containing all the ski resorts' snowfall, so that we could parse it and push it into a form that could be viewed on the Web, i.e. it had a nice set of graphics around it. That is nearly 20 years ago, and it was a pioneering application. It was not really a service or an API to talk to. It used the available automation we had, but it started as a manual process: pulling the file and running a few scripts to try and parse the comma-delimited data. The data, of course, came from both human observation and sensors. It was collated into one place for us to use. It was a real-world set of measurements, pulled together and then adjusted and presented in a different form over the Internet via the Web. I think we can legitimately call that an Internet of Things (IoT) application?
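The parsing step itself was simple. As a sketch of the idea (in Python rather than the scripts of the day, and with an entirely made-up file layout, since I no longer have the original):

```python
import csv
import io

# Hypothetical snapshot of the kind of comma-delimited file the CRON job
# pulled over FTP: resort name, snow depth in cm, date of last snowfall.
raw = """Val Thorens,185,1997-02-14
Zermatt,140,1997-02-12
Chamonix,95,1997-02-10"""

def parse_ski_report(text):
    """Turn the delimited feed into a list of dicts ready for templating."""
    rows = []
    for name, depth, date in csv.reader(io.StringIO(text)):
        rows.append({"resort": name, "depth_cm": int(depth), "updated": date})
    return rows

report = parse_ski_report(raw)
print(report[0])
```

Wrap that in a scheduled job that fetches the file first, and you have the whole 1997 pipeline in miniature.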
We had a lot of fancy and interesting projects then, well before their time, but that are templates for what we do today. Hence I am heavily influenced by those, and having absorbed what may seem new today, a few years ago, I like to look to the next steps.
Another element of technology that features in my work is the way we write code and deploy it. In particular the richer, dynamic game-style environments that I build for training people in. I use Unity3d mostly. It has stood the test of time and moved on with the underlying technology. In the development environment I can place 3D objects and interact with them, sometimes stand-alone, sometimes as networked objects. I tend to write in C# rather than Javascript, but it can cope with both. Any object can have code associated with it. It understands the virtual environment, where something is, what it is made of etc. A common piece of code I use picks one of the objects in the view and then, using the mouse, the virtual camera view can orbit that object. It is an interesting feeling still to be able to spin around something that initially looks flat and 2D. It is like a drone's-eye view. Hovering or passing over objects.
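The orbit behaviour itself boils down to simple circular maths. A minimal sketch in Python (the real thing is C# inside Unity; the function name and parameters here are mine, purely for illustration):

```python
import math

def orbit_position(target, radius, yaw_deg, height):
    """Return a camera position circling `target` at `radius`,
    rotated `yaw_deg` around the vertical (y) axis, lifted by `height`."""
    yaw = math.radians(yaw_deg)
    x = target[0] + radius * math.cos(yaw)
    z = target[2] + radius * math.sin(yaw)
    return (x, target[1] + height, z)

# Dragging the mouse maps to changing yaw_deg each frame;
# the camera is then pointed back at the target to complete the orbit.
print(orbit_position((0.0, 0.0, 0.0), 10.0, 90.0, 2.0))
```

Unity gives you helpers for this, but seeing the bare geometry explains why the spin feels like hovering around a fixed point.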
Increasingly I have had to get the Unity applications to talk to the rest of the Web. They need to integrate with existing services, or with databases and APIs that I create. User logons, question data sets, training logs etc. In many ways it is the same as back in 1997. The pattern is the same, yet we have a lot more technology to help us as programmers. We have self-describing data sets now. XML used to be the one everyone raved about. Basically web-like tags around data to start and stop a particular data element. It was always a little too heavy on payload though. When I interacted with the XML data for the tennis ball locations for Wimbledon, the XML was too big for Second Life to cope with at the time. The data had to be mashed down a little, removing the long descriptions of each field. Now we have JSON, a much tighter description of data. It is all pretty much the same of course. An implied comma-delimited file, such as the ski resort weather, worked really well, if the export didn't corrupt it. An XML version could be tightly controlled and parsed in a more formal, language-style way; JSON sits between the two. In JSON the data is just name:value, as opposed to XML's value wrapped in named tags. It is the sort of data format that I used to end up creating anyway, before we had the formality of this as a standard.
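To make the payload difference concrete, here is the same (made-up) tennis-ball fix expressed both ways, parsed with Python's standard library. The field names and values are mine, not the actual Wimbledon feed:

```python
import json
import xml.etree.ElementTree as ET

# One hypothetical ball position, in XML and in JSON.
xml_doc = '<ball><name>serve_042</name><x>1.2</x><y>0.9</y><z>11.5</z></ball>'
json_doc = '{"name": "serve_042", "x": 1.2, "y": 0.9, "z": 11.5}'

# XML: every value is wrapped in an opening and closing tag.
ball = ET.fromstring(xml_doc)
x_from_xml = float(ball.find("x").text)

# JSON: plain name:value pairs, types come for free.
record = json.loads(json_doc)
x_from_json = record["x"]

# Same data, but the JSON form carries noticeably fewer bytes of markup.
print(len(xml_doc), len(json_doc))
```

Multiply that per-field tag overhead across thousands of ball positions and you can see why the XML had to be mashed down for Second Life.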
Unity3d copes well with JSON natively now. It used to need a few extra bits of code, but as I found out recently it is very easy to parse a web-based API using code, extract those pieces of information and adjust the contents of the 3D environment accordingly. By easy, I mean easy if you are a techie. I am sure I could help most people get to the point of understanding how to do this. I appreciate too that having done this sort of thing for years there is a different definition of easy.
It is this grounding in pulling real-world data and manipulating it, from the Internet, and serving it to the Web, that seems to be a constant pattern. It is the pattern of IoT and of Big Data.
As part of the ongoing promotion of the science fiction books I have written, I created a version of the view Roisin has of the world in the first novel, Reconfigure. In that she discovers an API that can transcribe and describe the world around her.
This video shows a simulation of the FMM v1.0 (Roisin’s application) working as it would for her. A live WebGL version that just lets you move the camera around to get a feel for it is here.

WebGL is a new target that Unity3d can publish to. Unity used to be really good because it had a web plugin that let us deploy applications, rich 3D ones, to any web browser, not just builds for PC, Mac and tablets. Every application I have done over the past 7 years has generally had the web plugin version at its core to make life easier for the users. Plugins are dying and no longer supported on many browsers. Instead the browser has functions to draw things, move things about on screen etc. So Unity3d now generates the same thing as the plugin, which was common code, but creates a mini version for each application that is published. It is still very early days for WebGL, but it is interesting to be using it for this purpose as a test and for some other API interactions with sensors across the Web.
In the story, the interaction Roisin has starts as a basic command line (but over Twitter DM), almost like the skiing FTP of 1997. She interrogates the API and figures out the syntax, which she then builds a user interface for. Using Unity3d of course. The API only provides names and positions of objects, hence the cube view of the world. Roisin is able to move virtual objects and the API then, using some quantum theory, is able to move the real-world objects. In the follow-up, this basic interface gets massively enhanced, with more IoT-style ways of interacting with the data, such as MQTT for messaging instead of Twitter DMs as in the first book. All real-world stuff, except the moving things around. All evolved through long experience in the industry, to explain it in relatively simple terms and then let the adventure fly.
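A toy take on that names-and-positions feed shows how little is needed for the "cube view". This is purely illustrative; the actual syntax in the book is fictional and this line format is my own invention:

```python
def parse_object_feed(text):
    """Parse a feed where each line is 'name x y z' into a dict of
    object name -> (x, y, z) position, enough to draw a cube per object."""
    objects = {}
    for line in text.strip().splitlines():
        name, x, y, z = line.split()
        objects[name] = (float(x), float(y), float(z))
    return objects

feed = """mug 0.4 0.9 1.1
keys 0.1 0.9 1.3
door 2.0 0.0 0.0"""

scene = parse_object_feed(feed)
# "Moving" a virtual object is then just rewriting its coordinates,
# which in the story the API echoes back into the real world.
scene["mug"] = (0.6, 0.9, 1.1)
print(scene["mug"])
```

The interesting part is how far a flat name:position list gets you before you need anything richer, which is exactly the arc the books take towards MQTT-style messaging.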
I hope you can see the lineage of the technology in the books. I think the story and the twists and turns are the key though. The base tech makes it real enough to start to accept the storyline on top. When I wrote the tech parts and built the storyboard, they were the easy bits. How to weave some intrigue, danger and peril in was something else. From what I have been told, and what I feel, this has worked. I would love to know what more people think about it though. It may work as a teaching aid for how the Internet works, what IoT is etc. for any age group, from schools to boardroom? The history and the feelings of awe and anger at the technology are something we all feel at some point with some element of our lives too.
Whilst I am on the real world though: one of the biggest constants in Roisin's life is the like it or love it taste of Marmite. It has become, through the course of the stories, almost a muse-like character. When writing you have to be careful with real-life brands. I believe I have used the ones I have in these books as proper grounding in the real world. I try to be positive about everyone else's products, brands and efforts.
In Cont3xt I also added in some martial arts, from my own personal experience again, but adjusted a little here and there. The initial use of it in Cont3xt is not what you might think when you hear martial art. I am a practitioner of Choi Kwang Do, though I do not specifically call any of the arts used in the book by that name, as there are times it is used aggressively, not purely for defence. The element of self-improvement is in there, but with a twist.
Without the background in technology over the years and seeing it evolve, and without my own personal gradual journey in Choi Kwang Do, I would not have had the base material to draw upon to evolve the story on top of.
I hope you get a chance to read them, it’s just a quick download. Please let me know what you think, if you have not already. Thank you 🙂

It’s alive – Cont3xt – The follow up to Reconfigure

On Friday I pressed the publish button on Cont3xt. It whisked its way onto Amazon and the KDP Select processes. The text box suggested it might be 72 hours before it was live; in reality it was only about 2 hours, followed by a few hours of links getting sorted across the stores. I was really planning to release it today, but, well, it's there and live.
I did of course have to mention it in a few tweets, but I am not intending to go overboard. I recorded a new promo video, and played around with some of the card features on YouTube now to provide links. I am also pushing a director's cut version. The version here is a 3-minute promo. The other version is 10 minutes, explaining the current state of VR and AR activity in the real industry.


As you can see I am continuing the 0.99 price point. I hope that encourages a few more readers to take a punt and then get immersed in Roisin’s world.
Cont3xt is full of VR, AR and her new Blended Reality kit. It has some new locations and even some martial arts fight scenes. Peril, excitement and adventure, with a load of real-world and future technology. What's not to like?
I hope you give it a download and get to enjoy it as much as everyone else who has read it seems to.
This next stage in the journey has been incredibly interesting and I will share some of that later. For now I just cast the book out there to see whether people will be interested now there is the start of a series 🙂

Making Movies – Flying around in Unity3d

Yes, ok, my latest obsession of writing #Reconfigure might make a great TV series or movie, but that's not what this is about. Instead it is about another set of skills I seem to be putting into practice.
Before that I spent a day editing up the school end-of-year video. I didn't shoot it, but I did use all the various messing around and comedy moments from the teachers and staff to put a montage together. It had a story, some pace, a step change or two. It was great fun. I have now been asked to edit a few more things together. I like editing moving images; it is fun. Though I am not charging for this; it is just to help people out.
At the same time I am in the middle of a little project to try and jazz up a regular presentation in the corporate space. I demoed a little virtual fly-through of a building, past a couple of characters and onto an image, using Unity3d. It turned into a slightly more complex epic, but nonetheless I am using lots of varied skills to make it work. The pitch will be pre-canned, but I have built the Unity environment so that it runs on rails, but live. I have C# code in there to help do certain things like play video textures, animation controllers and even NavMesh agents to allow a character to walk around to a particular room, and lots of flying cameras. I had used the first few things a lot. It is stock Unity. However I had not really had a chance to use Unity animations. All my animations had been either ready-made FBX files or ones that I created in Cheetah3D. Once you imported an animation like that it was then sort of locked in place.
However, Unity3d has its own animation dope sheets and curve sheets, and it's really handy.
The Animation tab brings a different timeline. It is different to the Animation Controller screen that lets you string animations together and create transitions, e.g. a jump or run from a walk.
The animation tab lets you select an object in the scene and create a .anim for you. It then by default adds that anim to an animation controller and adds that to the object.
Unity3d Animation Timeline
The record mode then allows you to expand any or all of the properties of the object, usually transform position and rotation, but it can be colour, visibility, basically anything with a parameter. If you move the timeline and then move the object, like a camera for instance, it creates the key frames for the changed parameters. It removes any of the awkward importing of paths and animation ideas from another tool. It is not great for figures and people. They are still best imported as skinned meshes with skeletons, letting the import deal with all the complexity so you can add any stock animations to them. However, for a cut scene, or a zoom past to see a screen on a wall, it works really well.
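Under the hood, what record mode stores is a set of keyframes per property, and playback interpolates between them. A minimal linear version in Python shows the idea (Unity actually uses editable curves with tangents, not straight lines, and these names are mine):

```python
def sample(keys, t):
    """Linearly interpolate a property value from (time, value) keyframes."""
    keys = sorted(keys)
    if t <= keys[0][0]:
        return keys[0][1]       # clamp before the first key
    if t >= keys[-1][0]:
        return keys[-1][1]      # clamp after the last key
    for (t0, v0), (t1, v1) in zip(keys, keys[1:]):
        if t0 <= t <= t1:
            return v0 + (v1 - v0) * (t - t0) / (t1 - t0)

# A camera's x position keyed at three points on the timeline.
x_keys = [(0.0, 0.0), (2.0, 4.0), (5.0, 10.0)]
print(sample(x_keys, 1.0))  # halfway between the first two keys -> 2.0
```

Every recorded property gets its own track like this, which is why moving an object with the timeline scrubbed to a new point "just works".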
I have written a lot of complicated things in Unity3d, but never bothered using this part. It is great to find it working, and doing exactly what I needed it to do for my cinematic cut scenes.
You can have things zooming and spinning around a known path in no time. You have to break it down a little otherwise it resolves the easiest path through a wall or something, but it seems very robust to edit it afterwards.
The only pity is that it's not possible to lock a camera preview window open in stock Unity3d when selecting another object. Having got a camera pan just right, it's tricky if you want to move an object in the camera view. With the camera selected you see its preview; with the object selected you see the object. I have never got around that yet; I just end up with trial and error.
No matter, it works, and it's awesomely useful.

Return of the camp fire – Blended Reality/Mixed Reality

Back in 2006, when we all got into the previous wave of Virtual Reality (note the use of previous, not 1st) and we tended to call them virtual worlds or the metaverse, we were restricted to online presence via a small rectangular screen attached to a keyboard. Places like Second Life became the worlds that attracted the pioneers, explorers and the curious.
Several times recently I have been having conversations, comments, tweets etc. about the rise of the VR headsets, Oculus Rift et al, VR phone holders like Google Cardboard and MergeVR, and then the mixed reality future of Microsoft Hololens and Google's Magic Leap.
In all the Second Life interactions and subsequent environments it has always really been about people. The tech industry has a habit of focusing on the version number, or a piece of hardware though. The headsets are an example of that. There may be inherent user experiences to consider but they are more insular in their nature. The VR headset forces you to look at a screen. It provides extra inputs to save you using mouse look etc. Essentially though it is an insular hardware idea that attempts to place you into a virtual world.
All our previous virtual world exploration has been powered by the social interaction with others. Social may mean educational, may mean work meetings, or may mean entertainment or just chatting. However, it was social.
Those of us who took to this felt, as we looked at the screen, that we were engaged and part of the world. Those who did not get what we were doing only saw funny graphics on the screen. They did not make that small jump in understanding. There is nothing wrong with that, but that is essentially what we were all trying to get people to understand. A VR headset provides a direct feed to the eyes. It forces that leap into the virtual, or at least brings it a little closer. Then you are ready to engage with people. Though at the moment all the VR is engaging with content. It is still in showing-off mode.
We would sit around campfires in Second Life. In particular Timeless Prototype's very clever-looking one that adds seats/logs per person/avatar arriving. Shared focal points of conversation. People gathering looking at a screen in front of them but being there mentally. The screen providing all the cues and hints to enhance that feeling. A very real human connection. Richer than a telephone call, less scary than a video conference, intimate yet stand-offish enough to meet most people's requirements if they gave it a go.
It is not such a weird concept as many people get lost in books, films and TV shows. They may be immersed as a viewer not a participant or they may be empathising with characters. They are still looking at an oblong screen or piece of paper and engaged.
With a VR headset, as I just mentioned, that is more forced. They will understand the campfire that much quicker and see the people and the interaction. There is less abstraction to deal with.
The rise of the blended and mixed reality headsets, though, changes that dynamic completely. Instead they use the physical world, one that people already understand and are already in. The campfire comes to you. Wherever you are.
This leads to a very different mental leap. You can have three people sat around the virtual campfire in their actual chairs in an actual room. The campfire is of course replaceable with any concept, piece of information or simulated training experience. Those people each have their own view of the world and of one another, but it's added to with a new perspective; they see the side of the campfire they would expect.
It goes further though; there is no reason for the campfire not to coexist in several physical spaces. I have my view of the campfire from my office, you from yours. We can even have a view of one another as avatars placed into our physical worlds, or not. It's optional.
When just keeping with that physical blended model, it is very simple for anyone to understand and start to believe. Sitting in a space with one another sharing an experience, like a campfire, is a very human, very obvious one. For many that is where this will sort of end. Just run-of-the-mill information, enhanced views, cut-away engineering diagrams, point-of-view understanding etc.
The thing to consider, for those who already grok the virtual world concept, is that just as in Second Life you can move your camera around, see things from any point of view, from the other person's point of view, from a bird's-eye view, from close in or far away, the mixed reality headsets will still allow us to do that. I could alter my view to see what you are seeing on the other side of the campfire. That is the start of something much more revolutionary to consider, whilst everyone catches up with the fact that this is all about people, not just insular experiences.
This is the real metaverse, a blend of everything, anywhere and everywhere. Connecting us as people, our thoughts, dreams and ideas. There are huge possibilities to explore, and more people will explore them once that little hurdle is jumped to understand it's people, it's always been about people.

Microsoft Minecraft, Space and Semantic Paint

It is interesting looking at, and spotting, patterns and trends and extrapolating the future. Sometimes things come in and get focus because they are mentioned a lot on multiple channels, and sometimes those things converge. Right at the moment it seems that Microsoft are doing all the right things in the emerging tech space, game technology and general interest space.
Firstly has been the move by Microsoft to make Minecraft even easier to use in schools.

They are of course following on and officially backing the sort of approach the community has already taken. However a major endorsement and drive to use things like Minecraft in education is fantastic news. The site is education.minecraft.net
And just to revert to type again: this is, as the video says, not about goofing around in a video game. This is about learning maths, history, how to communicate online. These are real metaverse tools that enable rich, immersive opportunities, on a platform that kids already know and love. Why provide them with a not very exciting online maths test when they can use multiple skills in an online environment together?
Microsoft have also supported the high-end sciences, announcing a teaming up with NASA to use the blended/mixed/augmented reality headset Hololens to help in the exploration of Mars. This is using shared-experience telepresence. Bringing Mars to the room, desk or wall that several people are sharing in a physical space. The Hololens and the next wave of AR are looking very exciting, and very importantly they have an inclusive approach to the content. Admittedly each person needs a headset, but using the physical space each person can see the same relative view of the data and experience as the rest of the people. They don't have to, of course 🙂; there is no reason for screens not to show different things in context for each user. However, standard VR is an immersed experience without much awareness of others. A very important point, I feel.
Microsoft have also published a paper and video about using depth scanning and live understanding and labelling of real world objects. Something that things like Google Tango will be approaching.
Slightly better than my 2008 attempt (had to re-use this picture again of the Nokia N95)
Hackday5 Augmented Reality step 2


The real and virtual are becoming completely linked now. Showing how the physical universe is actually another plane alongside all the other virtual and contextual metaverses.
It is all linked and not about isolation, but understanding and sharing of information, stories and facts.

Untethering Humans, goodbye screens

We are on the cusp of a huge change in how we as humans interact with one another, with the world and with the things we create for one another. A bold statement, but one that stands up, I believe, by following some historical developments in technology and social and work related change.
The change involves all the great terms, Augmented Reality, Virtual Reality, Blended Reality and the metaverse. It is a change that has a major feature, one of untethering, or unshackling us as human beings from a fixed place or a fixed view of the world.

Hovering
Here I am being untethered from the world in a freefall parachute experience, whilst also being on TV 🙂

All the great revolutions in human endeavour have involved either transporting us via our imagination to another place or concept, books, film, plays etc. or transporting us physically to another place, the wheel, steam trains, flight. Even modern telecommunications fit into that bracket. The telephone or the video link transport us to a shared virtual place with other people.

Virtual worlds are, as I may have mentioned a few times before, ways to enhance the experience of humans interacting with other humans and with interesting concepts and ideas. That experience, up to now, has been a tethered one. We have spent the last few decades becoming reliant on rectangular screens. Windows on the world, showing us text, images, video and virtual environments. Those screens have evolved. We had large bulky cathode ray tubes, then LED, Plasma, OLED and various flat wall projectors. The screens have always remained a frame, a fixed size tethering us to a more tunnel-vision version. The screen is a funnel through which elements are directed at us. We started the untethering process with wi-fi and mobile communications. Laptops, tablets and smartphones gave us the ability to take that funnel, a focused view of a world, with us.

More recent developments have led to the VR headsets. An attempt to provide an experience that completely immerses us by providing a single screen for each eye. It is a specific medium for many yet-to-be-invented experiences. It does, though, tether us further. It removes the world. That is not to say the Oculus Rift, Morpheus and HTC Vive are not important steps, but they are half the story of untethering the human race. Forget the bulk and the weight; we are good at making things smaller and lighter, as we have seen with the mobile telephone. The pure injection of something into our eyes and ears via a blinkering system feels, and is, more tethering. It is good at the second affordance, of transporting us to places with others, and it is where virtual worlds and the VR headsets naturally and obviously co-exist.

The real revolution comes from full blended reality and realities. That plural is important. We have had magic lens and magic mirror Augmented Reality for a while. Marker-based and markerless ways to use one of these screens that we carry around or have fixed in our living rooms to show us digital representations of things placed in our environment. They are always fun. However, they are almost always just another screen in our screens. Being able to see, feel and hear things in our physical environment wherever we are in the world, and have them form part of that environment, truly untethers us.
Augmented Reality is not new of course. We, as in the tech community, have been tinkering with it for years. Even this mini example on my old Nokia N95 back in 2008 starts to hint at the direction of travel.

Hackday5 Augmented Reality step 2

The devices used to do this are obviously starting at a basic level. Though we have the AR headsets of Microsoft Hololens and Google's Magic Leap to start pushing the boundaries, they will not be the final result of this massive change. With an Internet of Things world, with the physical world instrumented and producing data, with large cloud servers offering compute power at the end of a wireless connection to analyse that data, and to be able to visualize and interact with things in our physical environment, we have a lot to discover and to explore.

I mentioned realities, not just reality. There is no reason to only have augmentation into a physical world. After all, if you are immersed in a game environment or a virtual world you may actually choose, because you can, to shut out the world, to draw the digital curtains and explore. However, just as when you are engaged in any activity anywhere, things come to you in context. You need to be able to interact with other environments from whichever one you happen to be in.

Take an example, using Microsoft Hololens, Minecraft and Skype. In today's world you would have Minecraft on your laptop/console/phone, be digging around, building, sharing the space with others. A Skype call comes in from someone you want to talk to. You window away from Minecraft and focus on Skype. It is all very tethered. In a blended reality, as Hololens has shown, you can have Minecraft on the rug in front of you and Skype hanging on the wall next to the real clock. Things and data placed in the physical environment in a way that works for you and for them. However you may want to be more totally immersed in Minecraft and go full VR. If something can make a small hole in the real world for the experience, then it can surely make an all-encompassing hole, thus providing you with only Minecraft. Yet, if it can place Skype on your real wall, then it can now place it on your virtual walls and bring that along with you.
This is very much a combination that is going to happen. It is not binary to be either in VR or not, in AR or not. It is either, both or neither.

It is noticeable that Facebook, who bought heavily into Oculus Rift, purchased Surreal Vision last month, who specialize in using instrumentation and scanning kit to make sense of the physical world and place digital data in that world. Up until now Oculus Rift, which has really led the VR charge since its Kickstarter (yes, I backed that one!), has been focussed on the blinkered version of VR. This purchase shows the intent to go for a blended approach. Obviously this is needed, as otherwise Magic Leap and Hololens will quickly eat into the Rift's place in the world.
So three of the world's largest companies, Facebook, Google and Microsoft, have significant plays in blended reality and the "face race", as it is sometimes called, to get headsets on us. Sony have Project Morpheus, which is generally just VR, yet Sony have had AR applications for many years with PS Move.
Here is Predlet 2.0 enjoying the AR Eyepet experience back in 2010 (yes five years ago !)

So here it is. We are getting increasingly accurate ways to map the world into data, we have the world producing IoT data streams, we have ever-increasing ubiquitous networks and we have devices that know where they are and what direction they are facing. We have high-power backend servers that can make sense of our speech and of context. On top of that we have free-roaming devices to feed and overlay information to us, yes a little bit bulky and clunky, but that will change. We are almost completely untethered. Wherever we are we can experience whatever we need or want; we can, in the future, blend that with more than one experience. We can introduce others to our point of view or keep our point of view private. We can fully immerse or just augment, or augment our full immersion. We can also make the virtual real with 3D printing!
That is truly exciting and amazing isn’t it?
It makes this sort of thing in Second Life seem an age away, but it is the underpinning for me and many others. Even in this old picture of the initial Wimbledon build in 2006 there is a cube hovering at the back. It is one of Jessica Qin’s virtual world virtual reality cubes.
Wimbledon 06 magic carpet
That cube and other things like it provided an inspiration for this sort of multiple-level augmentation of reality. It was constrained by the tethered screen but was, and still is, remarkably influential.
Jessica Qin Holocube
Here my Second Life avatar, and so my view of a virtual world, is inside a 3D effect that itself provides a view: a curved 3D view, not just a flat picture. It is as if the avatar is wearing a virtual Oculus Rift, if that helps get the idea across.
This is from 2008, by way of a quick example of the fact that even then it could work.

So let's get blending then 🙂

VR – Everything old is new again – good!

It is important when getting interested and excited about new things to look at their lineage. This is something I often did in my series of articles over the years in Flush Magazine.
So with the current rush of Virtual Reality gaming and experiences, and the slew of new kit, we have some very interesting near off-the-shelf kit such as this.
*warning it contains violence (see where I am on that here)

However, back in the 90s we had VR kit like Virtuality (they must be kicking themselves for being too early for the masses, though that's a curse we early adopters and evangelists have to live with 🙂).

The headsets were quite heavy and the container you stood or sat in was not a treadmill but part of the sensing rig.
The graphics may look old fashioned and clunky but they were good experiences. When I was at poly/uni in Leicester getting ready to do a year out, this was the company I wanted to go and work for if I had the chance. I was sent to IBM instead. Though, as you can see from the Virtuality company details, they had very close ties with IBM. So it was close 🙂 It is also funny how things work out.
Still, it is very cool having all these headsets sat around me, and some that just work on my portable communication device too 🙂

Over 15 years experience in Virtual Worlds, 30+ in tech – What now?

A few days ago I realised it has been over 9 years since I first publicly blogged about how important I thought the principles of the metaverse and virtual worlds were going to be for both social and business uses. This post, pictured below for completeness, was a tipping point for some radical changes in many of our lives as part of Eightbar.
Second Life first post
I had been working towards that sort of understanding since some very early work, around 1999/2000 with SmartVR, on virtual environments and how people interact with one another in them. We were trying to keep the social bond in our internal web and multimedia design group together when we were cast asunder to different locations by business pressures (or bad decisions, who knows!). I knew we had to have a sense of one another's existence aside from text in emails and instant messages. So we tried to build a merged version of both offices as if they were in one place. The aim was then to instrument those with presence and the ability to walk over to someone's desk and talk. Mirror worlds, blended reality and even the Internet of Things. Yes, I know, a bit before its time! Cue music, “Story of my life” by 1D 🙂

We definitely had a technology and expectation bubble around 2006-2009. However, that, as with all emerging technology, is all part of the evolution. The Gartner hype curve et al. What surprises me the most still is that people think when a bubble like that bursts that it's all over. That somehow everything that was learned in that time was pointless. “Are virtual worlds still a thing?” etc. I feel for the even earlier pioneers like Bruce Damer, who patiently put up with our ramblings as we all rushed to discover and feel for ourselves those things he already knew.
Increasingly I am talking to new startups and seeing new activity in the virtual space. The same use cases, the same sparks of creativity that we had in the previous wave(s), the same infectious passion to do something interesting and worthwhile. Sometimes this is differentiated from the last wave of virtual worlds under the heading of virtual reality. The current wave is focussed on the devices, the removal of the keyboard and of a fixed screen: the Oculus Rift, HoloLens etc. However, that's a layer of new things to learn and experience on top of what we have already been through. After all, it's a virtual world you end up looking at, or blending with, in a VR headset!

I spend so much time looking forward and extrapolating concepts and ideas it is now very scary to look back and consider the experience I have gathered. The war stories and success stories, the concepts and ideas that I have tried. The emotional impact of virtual worlds. The evolution of the technology and of people’s expectation of that technology. The sheer number of people that have moved around in a 3d environment from an early age with things like Minecraft, who are now about to enter higher education and the workforce.

So I am left in a slightly bemused state as to what to do with this knowledge. With this all going so much more mainstream again I am no longer working in a niche. Do I ply my trade of independent consulting chipping away in odd places and helping and mentoring some of the new entrants in the market or do I try and find a bigger place to spread the word?

At the same time though, knowing lots of things makes you realise how much you don't know. Imposter syndrome kicks in. Surely everyone must know all this stuff by now? It's obvious, and it stands up to logical reasoning, to try and connect with other people in as rich a way as possible. The network is there, the tech is there, the lineage is there. Though clearly not everyone gets it yet. I often wonder if the biggest naysayers I had to deal with on my journey so far have figured it out yet? It will probably turn out they are getting rich quick right now whilst I sit and ponder such things.

On the other side of the virtual coin, I know from my martial arts that constant practice and investigation leads to a black belt. In the case of Choi Kwang Do that's just over 3 years' worth. So how many Dan's worth of tech-equivalent experience does that put me at? 25 years in the industry professionally, but 30+ as a techie.

I still want to do the right thing, to help others level themselves up. I don't think I am craving fame and fortune; the ability to share and build is what drives me. If fame and fortune as a spokesperson or evangelist for some amazing idea or TV show reaches that end, then that's fine by me.

I am at a crossroads, my VR headset letting me look in all directions. I see half-built roads everywhere. What now? A well funded company that I can help build a great road with, or forging off down one of the other paths, seeing what happens on the way?
Of course that makes it seem like there is a clear choice on the plate. I suspect most of the well-funded companies and corporations don't think they need any help, which is rather where I came in on this!

Needless to say I am always open to conversations, offers, partnerships, patronage, retainers and technology evangelist roles. There must be a slew of investors out there wondering what to put their money into, who need some sage advice ;). Or that book… there is always that book… (600 images × 1,000 words each is 600,000 words just on these Second Life experiences, and that's just the ones online; the offline ones double that! Not to mention the other places, games, and self-built experiences!)

I took this photo in April 2006 as part of sharing our journey.
The wilderness

I have always liked a nice greenfield to start building on. Equally, that did not build itself; it was a massive shared team experience. No one has all the answers. Some of us are good at helping people find them though.

Right! Can we get on with this now?

A great week for science and tech, games, 3d printing and AR

There is always something going on in science and emerging technology. However, some weeks just bring a bumper bundle of interesting things all at once. Here in the UK the biggest event had to be the near-total eclipse of the Sun. We had some great coverage on the TV, with Stargazing Live sending a plane up over the Faroe Islands to capture the total eclipse. I was all armed and ready with a homemade pinhole camera.
Shredded wheat pinhole camera
This turned out great, but unfortunately we were quite overcast here so it was of little use as a camera. I also spent the eclipse at Predlet 2.0's celebration assembly. They had the eclipse on the big screen for all the primary school kids to see. Whilst we had the lights off in the hall it did not get totally dark, but it did get a bit chilly. It was great that the school keyed into this major event that demonstrates the motion of the planets. So, rather like the last one in 1999, I can certainly say I will remember where I was and what we were doing (a conversation I had with @asanyfuleno on Twitter and Facebook).
This brings me on to our technological change and the trajectory we are on. In 1999 I was in IBM Hursley with my fellow Interactive Media Centre crew. A mix of designers, producers and techies and no suits. It was still the early days of the web and we were building all sorts of things for all sorts of clients. In particular during that eclipse it was some more work for Vauxhall cars. We downed tools briefly to look out across Hursley park to see the dusk settle in and flocks of birds head to roost thinking it was night.
It does not seem that long ago but… it is 16 years. When we were building those quite advanced websites Amazon was just starting, Flickr was 6 years away, Twitter about 7 years away, Facebook a mere 5 (but with a long lead time), and we were only on Grand Theft Auto II, still a top-down Pac-Man-style game. We were connected to lots of our colleagues on instant messaging, but general communications were phone, SMS and of course email. So we were not tweeting and sharing pictures, or, as people do now, live streaming on Meerkat. Many people were not internet banking; trust in communications and computers was not high. We were pre dot-com boom/bust too. Not to mention no one really had much internet access out and about, or at home. Certainly no wi-fi routers! We were all enthralled by the still excellent Matrix movie. The phone in that film, the slide-down communicator-style Nokia, became one of the iconic images of the decade.
NB. As I posted this I saw this wonderful LEGO remake of the lobby scene so just had to add it to this post 🙂

It was a wild time of innovation and one many of us remember fondly, I think. People tended to leave us alone as we brought in money doing things no managers or career vultures knew to jump on. So that eclipse reminds me of the time I set out on a path of trying to be in that zone all the time. Back then I was getting my first samples from a company that made 3D printers, as I was amazed at the principle, and I was pondering what we could do with designers who knew 3D and this emerging tech. We were also busy playing Quake and Unreal in shared virtual worlds across the LAN in our downtime, so I was already forming my thoughts on our connection to one another through these environments. Those are experiences I still share today in a newer hi-tech world where patterns are repeating themselves, but better and faster.
That leads me to another movie reference, in the spirit of staying in this zone: footage of a new, Terminator T-1000 style, type of 3D manufacturing. 3D printers may not be mainstream as such, but many more people now get the concept of additive manufacture: laying down layer after layer of material such as plastic. It is the same principle as the coil clay pots we made out of snakes of rolled clay at school. A newer form of 3D printing went a little viral on the interwebs this week from carbon3d.com. This exciting development pulls an object out of a resin. It is really the same layering principle but done in a much more sophisticated way. CLIP (Continuous Liquid Interface Production) balances exposing the resin to oxygen or to UV light. Oxygen keeps it as a liquid (hence left behind), whilst targeted UV light causes the resin to become solid through polymerization. Similar liquid-based processes use lasers fired into a resin. This one, though, slowly draws the object out of the resin, giving it a slightly more ethereal or sci-fi look. It is also very quick in comparison to other methods. Whilst the video is going faster than actual speed, it is still a matter of minutes rather than hours to create objects.
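The speed difference comes down to simple arithmetic: a layer-by-layer printer's time scales with the number of layers, whilst a continuous pull scales directly with the object's height. A rough back-of-the-envelope sketch (the layer heights, per-layer times and pull rates here are illustrative assumptions, not Carbon3D's published figures):

```python
def layered_print_minutes(height_mm, layer_um=100, seconds_per_layer=6):
    """Conventional additive approach: total time scales with layer count."""
    layers = height_mm * 1000 / layer_um   # e.g. 100 micron layers
    return layers * seconds_per_layer / 60

def clip_print_minutes(height_mm, pull_mm_per_min=25):
    """Continuous pull out of the resin: time scales with height alone."""
    return height_mm / pull_mm_per_min

h = 50  # a 50 mm tall part
print(round(layered_print_minutes(h)), "min layered vs",
      round(clip_print_minutes(h)), "min continuous")
# With these assumed numbers: 50 min layered vs 2 min continuous
```

Even with generous assumptions for the layered printer, removing the per-layer pause is what turns "hours" into "minutes".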

Another video doing the rounds that shows some interesting future developments is one from the Google-funded Magic Leap. This is a blended reality/augmented reality company. We already have Microsoft moving into the space with Hololens. Much of Magic Leap's announcements have not been as clearly defined as one might hope. There is some magic coming, and it is a leap. Microsoft of course had a great pre-release of Hololens: some impressive video, but some equally impressive testimonials and articles from journalists and bloggers who got to experience the alpha kit. The video appeared to be a mock-up, but a fairly believable one.
Magic Leap were set to do a TED talk but apparently pulled out at the last minute and this video appeared instead.

It got a lot of people excited, which is the point, but it seems even more of a mock-up video than any of the others. It is very well done, as the Lord of the Rings FX company Weta Workshop have a joint credit. The technology is clearly coming. I don't think we are there yet in understanding and getting the sort of precise registration and overlays shown. We will, and one day it may look like this video.

Of course it's not just the tech but the design that has to keep up. If you are designing a game that has aliens coming out of the ceiling, it will have a lot less impact if you try and play outside, or in an atrium with a massive vaulted ceiling. The game has to understand not just where you are and what the physical space is like, but how to use that space. Think about a blended reality board game, or an actual board game for that matter. The physical objects to play Risk, Monopoly etc. require a large flat surface, usually a table. You clear the table of obstructions, set up and play. Now a projected board game could be done on any surface: Monopoly on the wall. It could even remove or project over things hung on the wall, obscure lights etc. It relies on a degree of focus in one place. A fast-moving shooting game where you walk around or look around will be reading the environment, but the game design has to adjust what it throws at you so it continues to make sense. We already have AR games looking for ghosts and creatures that just float around. They are interesting but not engaging enough. Full VR doesn't have this problem as it replaces the entire world with a new view. Even then there are lots of unanswered questions of design: how stories are told, cut scenes, attracting attention, user interfaces, reducing motion sickness etc. Blending with a physical world, where that world could be anywhere or anything, is going to take a lot more early adopter suffering and a number of false starts and dead ends.
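The "Monopoly on the wall" idea boils down to a surface-selection problem: the headset scans the room into a set of planes, and the game picks one that is flat enough and clear enough to host the board. A minimal sketch, assuming a hypothetical scanned-surface list (the tuple format and thresholds are invented for illustration):

```python
# Each scanned surface: (name, orientation, clear_area_m2)
def choose_play_surface(surfaces, min_area_m2=0.5,
                        allowed=("horizontal", "vertical")):
    """Pick the largest sufficiently clear surface for the projected board."""
    candidates = [s for s in surfaces
                  if s[1] in allowed and s[2] >= min_area_m2]
    return max(candidates, key=lambda s: s[2], default=None)

room = [("dining table", "horizontal", 1.2),
        ("hallway wall", "vertical", 2.0),
        ("bookshelf top", "horizontal", 0.2)]
best = choose_play_surface(room)
print(best[0])  # the wall wins on clear area
```

A real system would also have to reason about occluding pictures and lights on the chosen surface, which is exactly the harder design problem described above.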
It can of course combine with rapid 3D printing, creating new things in the real world that fit with the game or AR/BR experience. Yes, that's more complexity, more things to try and figure out. It is why it is such a rich and vibrant subject.
Just to bring it back a little to another development this week: the latest in the Battlefield gaming franchise, Battlefield Hardline, went live. This, in case you don't do games, is a 3D first-person shooter. Previous games have been military; this one is cops and robbers in a modern Miami Vice TV style. One of the features of Battlefield is the massive online combat. It features large spaces and makes you feel like a small speck on the map. Other shooters, like Call of Duty, are more close-in. The large expanse means Battlefield can focus on things like vehicles: flying helicopters and driving cars. Not just you, though; you can be a pilot and deliver your colleagues to the drop zone whilst your gunner gives cover.
This new game has a great online multiplayer mode called Hotwire that taps into vehicles really well. Usually game modes are capture the flag or holding a specific fixed point to win the game. In Hotwire you grab a car/lorry etc. and try and keep it safe. It means that you have to do some mad game driving, weaving and dodging. It also means that your compatriots get to hang out of the windows of the car trying to shoot back at the bad guys. It is very funny and entertaining.
What also struck me was the single-player game called “episodes”. This deliberately sticks with a TV cop show format as you play through the levels. After a level has finished, the “how you did” page looks like Netflix, with a “next episode starts in 20 seconds” prompt down in the bottom right. If you quit a level before heading to the main menu, it does a “next time in Battlefield Hardline” mini montage of the next episode. As the first cut scene played I got a Miami Vice vibe, which the main character then underlined by referencing it. It was great timing, and an in-joke, but one for those of us of a certain age for whom Miami Vice was the show to watch. Fantastic stuff.
I really like its style. It also has a logo builder on the website so in keeping with what I always do I built a version of the Feeding Edge logo in a Hardline style.
Battlefield Hardline Feeding Edge logo
I may not be great at the game, as I bounce around looking for new experiences in games, but I do like a good bit of customisation to explore.