Monthly Archives: July 2015


Return of the camp fire – Blended Reality/Mixed Reality

Back in 2006, when we all got into the previous wave of Virtual Reality (note the use of previous, not 1st) and tended to call them virtual worlds or the metaverse, we were restricted to an online presence via a small rectangular screen attached to a keyboard. Places like Second Life became the worlds that attracted the pioneers, the explorers and the curious.
Several times recently I have had conversations, comments, tweets and the like about the rise of VR headsets such as the Oculus Rift, AR phone holders like Google Cardboard and MergeVR, and the mixed reality future of Microsoft HoloLens and the Google-backed Magic Leap.
In all the Second Life interactions and subsequent environments it has always really been about people. The tech industry, though, has a habit of focusing on the version number or on a piece of hardware. The headsets are an example of that. There may be inherent user experiences to consider, but they are more insular in nature. A VR headset forces you to look at a screen. It provides extra inputs to save you using mouse look and the like. Essentially, though, it is an insular hardware idea that attempts to place you into a virtual world.
All our previous virtual world exploration has been powered by social interaction with others. Social may mean educational, work meetings, entertainment or just chatting. However you slice it, it was social.
Those of us who took to this felt, as we looked at the screen, that we were engaged and part of the world. Those who did not get what we were doing only saw funny graphics on the screen. They did not make that small jump in understanding. There is nothing wrong with that, but it is essentially what we were all trying to get people to understand. A VR headset provides a direct feed to the eyes. It forces that leap into the virtual, or at least brings it a little closer, leaving you ready to engage with people. At the moment, though, most VR is about engaging with content. It is still in showing-off mode.
We would sit around campfires in Second Life, in particular Timeless Prototype's very clever-looking one that adds seats/logs for each person/avatar arriving. Shared focal points of conversation. People gathering, looking at a screen in front of them but being there mentally, with the screen providing all the cues and hints to enhance that feeling. A very real human connection. Richer than a telephone call, less scary than a video conference, intimate yet standoffish enough to meet most people's requirements if they gave it a go.
It is not such a weird concept, as many people get lost in books, films and TV shows. They may be immersed as a viewer rather than a participant, or they may be empathising with characters. Either way, they are still looking at an oblong screen or a piece of paper and are engaged.
With a VR headset, as I just mentioned, that immersion is more forced. People will understand the campfire that much more quickly and see the people and the interaction. There is less abstraction to deal with.
The rise of blended and mixed reality headsets, though, changes that dynamic completely. Instead they use the physical world, one that people already understand and are already in. The campfire comes to you, wherever you are.
This leads to a very different mental leap. You can have three people sat around the virtual campfire in their actual chairs in an actual room. The campfire is of course replaceable with any concept, piece of information or simulated training experience. Those people each have their own view of the world and of one another, but it is added to with a new perspective: they see the side of the campfire they would expect.
It goes further though: there is no reason for the campfire not to coexist in several physical spaces. I have my view of the campfire from my office, you from yours. We can even have a view of one another as avatars placed into our physical worlds, or not. It's optional.
Keeping just to that physically blended model, it is very simple for anyone to understand and start to believe in. Sitting in a space with one another sharing an experience like a campfire is a very human, very obvious thing. For many, that is where this will more or less end: run-of-the-mill information, enhanced views, cutaway engineering diagrams, point-of-view understanding and so on.
The thing to consider, for those who already grok the virtual world concept, is that just as in Second Life you can move your camera around and see things from any point of view: from the other person's point of view, from a bird's eye view, from close in or far away. The mixed reality headsets will still allow us to do that. I could alter my view to see what you are seeing on the other side of the campfire, as in the little sketch below. That is the start of something much more revolutionary to consider whilst everyone catches up with the fact that this is all about people, not just insular experiences.
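To make that shared viewpoint idea a little more concrete, here is a minimal, purely hypothetical sketch of one way it could be modelled; none of it comes from any particular headset SDK. Each participant sits around one shared virtual campfire while staying in their own physical room, and borrowing someone else's viewpoint is just rendering from their seat instead of yours.

```python
import math

# Hypothetical sketch: one shared campfire frame, many physical rooms.
# Each participant has a seat (a bearing and distance around the fire);
# their view is simply their pose expressed in the shared campfire frame.

class Participant:
    def __init__(self, name, angle_deg, distance_m):
        self.name = name
        self.angle = math.radians(angle_deg)  # bearing around the fire
        self.distance = distance_m            # how far from the fire they sit

    def position(self):
        # Position in the shared campfire frame (fire at the origin).
        return (self.distance * math.cos(self.angle),
                self.distance * math.sin(self.angle))

def view_from(viewer, target=None):
    # By default you render from your own seat; passing another participant
    # "moves the camera" to their side of the fire, much like the free camera
    # in Second Life.
    who = target or viewer
    x, y = who.position()
    return {"viewer": viewer.name, "camera_at": (round(x, 2), round(y, 2))}

me = Participant("me", angle_deg=0, distance_m=1.5)
you = Participant("you", angle_deg=180, distance_m=1.5)

print(view_from(me))        # my own side of the fire
print(view_from(me, you))   # borrowing your viewpoint across the fire
```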
This is the real metaverse, a blend of everything anywhere and everywhere, connecting us as people, our thoughts, dreams and ideas. There are huge possibilities to explore, and more people will explore them once that little hurdle is jumped to understand that it's people; it has always been about people.

BBC Micro:bit – It’s taken a while but worth it

I have always been keen that kids learn to code early on. Not to make them all software engineers, but to help find those who can be, and also to help everyone get an appreciation of what goes into the devices, games and things we all totally depend on now.
Way back in the last ever Cool Stuff Collective (in 2011!) I did a round-up of all things tech coming in the future. I begged kids to hassle their teachers to let them learn to program.
It was looking as if the Raspberry Pi was going to be the choice for a big push into schools. It is probably even why our show was seen as too common to be allowed to show the early version. I was keen to have the Pi on but that didn’t work out. We had already done the Arduino on a previous show and that seemed to work really well.
So I was very pleased to see the BBC micro:bit arrive on the scene last week. The Raspberry Pi is a great small computer; it is, however, quite a techie thing to get going, as you have to be managing operating systems, drivers and so on (though things like the Kano make that much, much easier).
Arduinos are great for being able to plug things into, though they require some circuitry and connections on a breadboard before it starts to make sense. The programming environment allows a lot of things to be achieved, but it is a microcontroller, not a full working computer.
microbit
The new BBC micro:bit is a step on from the Arduino. It has an array of LEDs as its own display, and it also has Bluetooth and a collection of sensors already onboard.
I was very pleased to see the involvement of Technology Will Save Us in the design. I met and shared the stage with the co-founder of this maker group, Daniel Hirschman @danielhirschman, at the GOTO conference in Amsterdam a couple of years ago. He knows his stuff and is passionate about how to get kids involved in tech, so I know that this is a great step and a good piece of kit.
The plan to roll this out to every Year 7 student is great, though predlet 1.0 is just moving up out of that year. So I will just have to buy one when we can.
Microsoft have a great video to describe it too.

I am sure STEMnet ambassadors in the field of computing will be interested and hopefully excited too. This is important, as it is not just code, it is physical computing: understanding sensors and what can and can't be done. It is an internet of things (IoT) device and it is powered by a web-based programming environment. It is definitely of the moment and the right time for it. Oh, and it's not just an "app" 🙂
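To give a flavour of that physical computing, here is a minimal sketch in MicroPython, one of the languages the micro:bit can be programmed in; treat it as illustrative rather than the exact editor the schools roll-out will use. It reads the onboard button, accelerometer and temperature sensor and shows the result on the LED array.

```python
# A minimal micro:bit MicroPython sketch: react to the onboard button,
# accelerometer and temperature sensor using only the built-in LED display.
from microbit import *

while True:
    if button_a.is_pressed():
        display.scroll(str(temperature()))   # show the onboard temperature in C
    elif accelerometer.was_gesture("shake"):
        display.show(Image.HEART)            # respond to a shake gesture
    else:
        display.show(Image.ASLEEP)
    sleep(200)
```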

Emotiv Insight – Be the borg

Many years ago a few of us had early access, or nearly early access, to some interesting brainwave-reading equipment from Emotiv. This was back in 2005. I had a unit arrive at the office, but then it got locked up in a cupboard until the company lawyers said it was OK to use. It was still there when I left, but it did make its way out of the cupboard by 2010. It even helped some old colleagues get a bit of TV action.
It was not such a problem with my US colleagues who at least got to try using it.
I got to experience and show similar brain-controlled toys on TV with the NeuroSky-based Mindflex. This lets the player try to relax or concentrate to control air fans that blow a small foam ball around a course.
So in 2013, when Emotiv announced a Kickstarter for its new headset, in this case the Insight, I signed right up.
We (in one of our many virtual world jam sessions) had talked about interesting uses of the device to detect user emotions in virtual worlds. One such use: if your avatar, and how you chose to dress or decorate yourself, caused discomfort or shock to another person (wearing a headset), their own display of you could be tailored to make them happy. It's a modern take on the emperor's new clothes, but one that can work. People see what they need to see. It could have worked at Wimbledon this year when Lewis Hamilton turned up to the royal box without a jacket and tie. In a virtual environment that "shock" would mean any viewers of his avatar would indeed see a jacket and tie.
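Purely as an illustration, here is a hypothetical sketch of that per-viewer tailoring. None of it is the Emotiv SDK; read_discomfort() is a stand-in for whatever affective signal a headset (or any other sensor) actually provides, and the threshold is made up.

```python
# Hypothetical sketch of the emperor's-new-clothes idea: each viewer gets
# their own presentation of the same avatar, tuned by their own reaction.

def read_discomfort(viewer):
    # Placeholder for an affective signal in [0.0, 1.0]; not a real SDK call.
    return viewer.get("discomfort", 0.0)

def render_avatar_for(viewer, avatar):
    # If the raw outfit pushes this viewer's discomfort past a threshold,
    # swap in a locally acceptable outfit; the avatar's owner changes nothing.
    presentation = dict(avatar)
    if read_discomfort(viewer) > 0.6:
        presentation["outfit"] = viewer.get("preferred_outfit", "jacket and tie")
    return presentation

hamilton = {"name": "Lewis", "outfit": "no jacket, no tie"}
steward = {"name": "royal box steward", "discomfort": 0.9,
           "preferred_outfit": "jacket and tie"}
fan = {"name": "casual fan", "discomfort": 0.1}

print(render_avatar_for(steward, hamilton))  # sees a jacket and tie
print(render_avatar_for(fan, hamilton))      # sees the outfit as worn
```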
It has taken two years, but the Emotiv Insight headset has finally arrived. As with all early adopter kit, though, there are some hiccups. Michael Rowe, one of my fellow conspirators from back in the virtual world heyday, has blogged about his unit arriving.
Well here is my shiny boxed up version complete with t-shirt and badge 🙂
Emotiv insight has arrived
The unit itself feels very slick and modern. Several contact sensors spread out over your scalp, and an important reference sensor on the bone behind the ear (avoid your glasses) needs to be in place.
You do look a bit odd wearing it 🙂 but we have to suffer for the tech art.
Emotiv headset
I charged the unit up via the USB cable and waited for the light to go green. I unplugged the device and hit the on switch (near the green light). Here I then thought it was broken. It is something very simple, but I assumed, never assume, that the same light near the switch would be the power light. That was not the case. The "on" light is a small blue/white LED above the logo on the side. It was right under where I was holding the unit. Silly me.
Emotiv insight power light
I then set about figuring out what could be done, if it worked (once I saw the power light).
This got much trickier. A lot of Kickstarters each have their own way of communicating, and when you have not seen something appear for two years you tend not to be reading it every day or following every link and nuance.
So the unit connects with Bluetooth, but not just any Bluetooth: it uses the new Bluetooth Low Energy (BTLE). This is a low-power upgrade to Bluetooth that many older devices do not support. I checked my MacBook Pro and it seemed to have the right stack. (How on earth did NASA manage to get the probe to Pluto over nine years without the comms tech all breaking!)
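As an aside, a quick way to confirm a BTLE device is at least advertising is to scan for it. Here is a minimal sketch using the cross-platform bleak Python library (my choice, not part of Emotiv's tooling), and the name match on "insight" is only a guess at what the headset advertises.

```python
# A minimal BTLE scan: list nearby advertising devices and flag anything
# whose advertised name looks like the Insight headset (name match is a guess).
import asyncio
from bleak import BleakScanner

async def find_headset():
    devices = await BleakScanner.discover(timeout=5.0)
    for d in devices:
        print(d.address, d.name)
        if d.name and "insight" in d.name.lower():
            print("Found a likely headset:", d.name)

asyncio.run(find_headset())
```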
I logged into the Emotiv store and saw a control panel application. I tried to download it, but having selected Mac it bounced me to my user profile saying I needed to be logged on. It is worrying when the simplest of web applications is not working while you have this extremely complex brain-reading device strapped to your head! It seems the Windows one would download, but that was no good to me.
I found some SDK code via another link, but a lot of that seemed to be emulation. It also turns out that the iOS 8 updates mashed up the Bluetooth LE support in any apps they had, so it doesn't work on those now. Still, some Android apps seem to be operational.
I left it a few days, slightly disappointed but figured it would get sorted.
Then today I found a comment about joining the Google Plus group (another flashback to the past) and a helpdesk post saying to go to https://cpanel.emotivinsight.com/BTLE/
As soon as I did that, it downloaded a plugin, and then I saw Bluetooth pair with the device with no fuss. A setup screen flickered into life and it showed that sensors were firing on the unit.
The webpage looked like it should then link somewhere, but it turns out there is a menu option hidden in the top left-hand corner, with a submenu "detections". That let me get to a few web-based applications that at last responded. I did not have much luck training and using it yet, as often the apps would lock up, throw a warning message and the Bluetooth would drop. I did however get a trace; my brain is in there and working. I was starting to think it was not!
Emotiv Insight Brain trace
So I now have a working brain interface, I am Borg, it’s 2015. Time to dig up those ideas from 10 years ago!
I am very interested in what happens when I mix this with headsets like the Oculus Rift. Though it is a pity that Emotiv seem to want to charge loyal backers $70 for an Emotiv Unity plugin that apparently doesn't work in Unity 5.0 yet? I think they may need some software engineering and architecture help. Call me, I am available at the moment!
I am also interested, given how light the unit is, in tracing what happens in Choi Kwang Do training and whether I can aim for a particular mindset trace, i.e. we always try to relax to generate more speed and power in things that instinct tells you to tense up for. Of course that will have to wait a few weeks for my hip to recover. The physical world is a dangerous place and I sprained my left hip tripping over the fence around the predpets. So it's been a good time to ponder interesting things to do whilst not sitting too long at this keyboard.
As ever, I am open for business. If you need to understand what is going on out here in the world of new technology and where it fits, that is what Feeding Edge Ltd does: it takes a bite out of technology so you don't have to.

Space travel and fighting – Elite plus Mortal Kombat Predator

I occasionally like to flow between gaming experiences and think about what they might be like as a whole. I have been playing more Elite Dangerous, this time the early access version on Xbox One. Obviously I have talked about the game before, and the Kickstarter, and the Oculus Rift and so on. Strangely, though, I am finding the "sofa version" even more engaging. There is a lot of sitting around involved in flying from place to place, and it works really well on the joypad and the big TV too.
Yesterday one of the other game styles, the one-on-one fighting game, finally got the character upgrade/DLC that I mistakenly thought was there at release. Mortal Kombat has been trickling out the DLC and now the Yautja/Predator set has arrived. Obviously I have quite an affinity with the character and the lore, and I think they have done a really good job.
So I did this little Xbox montage mashup showing one of the characters in the story flying to a place to take part in a (gory comic book violence) fight. It was all recorded by just shouting "Xbox, record that" at the Kinect, then using Upload Studio on the Xbox to stitch it together.
I used the training room for the Predator vs Predator fight, as it looks a bit like a docking bay. It's more of a short poem than a feature film, ideal for our internet attention spans. It shows off some of the nice Predator work, both with and without the facemask. Some plasma cannon action in there too.

Of course, as with all great expansive games like Elite, it would be great to be able to get out of the ship and do stuff like this. I guess that may come, though No Man's Sky may beat them to it 🙂

Microsoft Minecraft, Space and Semantic Paint

It is interesting looking at and spotting patterns and trends and extrapolating the future. Sometimes things come in and get focus because they are mentioned a lot on multiple channels, and sometimes those things converge. Right at the moment it seems that Microsoft are doing all the right things in emerging tech, game technology and the general interest space.
First has been the move by Microsoft to make Minecraft even easier to use in schools.

They are of course following on and officially backing the sort of approach the community has already taken. However a major endorsement and drive to use things like Minecraft in education is fantastic news. The site is education.minecraft.net
And just to revert to type again, this is, as the video says, not about goofing around in a video game. This is about learning maths, history, and how to communicate online. These are real metaverse tools that enable rich immersive opportunities on a platform that kids already know and love. Why provide them with a not very exciting online maths test when they can use multiple skills in an online environment together?
Microsoft have also supported the high-end sciences, announcing that they are teaming up with NASA to use the blended/mixed/augmented reality headset HoloLens to help in the exploration of Mars. This is shared-experience telepresence: bringing Mars to the room, desk or wall that several people are sharing in a physical space. The HoloLens and the next wave of AR are looking very exciting and, very importantly, they take an inclusive approach to the content. Admittedly each person needs a headset, but using the physical space each person can see the same relative view of the data and experience as the rest of the people. They don't have to, of course 🙂; there is no reason for the screens not to show different things in context for each user. Standard VR, however, is an immersed experience without much awareness of others. A very important point, I feel.
Microsoft have also published a paper and video about using depth scanning and live understanding and labelling of real-world objects, something that things like Google Tango will be approaching too.
Slightly better than my 2008 attempt (I had to reuse this picture of the Nokia N95 again).
Hackday5 Augmented Reality step 2


The real and the virtual are becoming completely linked now, showing how the physical universe is actually just another plane alongside all the other virtual and contextual metaverses.
It is all linked, and it is not about isolation but about understanding and sharing information, stories and facts.