Reconfigure – time to read and edit

Last Friday I finished writing the story that I had in my head for Reconfigure. I surprised myself at the speed it came out, but when you have to do something, you have to do it! I also learned a lot about how I thread ideas together. The story is set now, and the tools and tech in it are, for the most part, genuine and real. Constructing it from the initial plan to the first draft has felt just like a software project. Architecture, Internal Design, Code. It now differs in that there is not a “compiler” to help spot any errors. There is of course spell check. This stage is one of reading it, lots of it, it would seem: checking it makes sense, meets the spec, and then adding to or taking away from it to create a good user experience. As it has a basis in fact there is real world continuity checking. There is also speculative science fiction continuity that I can double check too.
My biggest surprise was in the elements of the story, the little mental pictures and feelings that I discovered on the way. I had the structure, which was fun to conceive, but the colouring in has been an interesting mix of conscious thought and of Flow.
When I started to re-read it for the first time I was not totally sure what it was going to sound like in my head. A few times I have written articles, looked at the end result and started again; same structure, just different words. I was hoping that was not the case this time. Of course this is all potentially self-deluding, but I like it so far.
Another thing I was not expecting was building the potential for a follow-up, if not a series. The core elements have several threads that seem to be brewing as I re-read the text. I was thinking this would be done and dusted.
I am looking ahead to just getting it out there, probably just on the Amazon Kindle store. I don’t think I can justify making actual print copy books, though services like Lulu do offer print on demand. So I have to think about pricing. There seems, as with the app store, to be lots of discussion of people not paying over $0.99 or 99p for a book; discussions of going free, just for the publicity, or of making something reassuringly expensive because it has enough words per pence. I don’t know, but I think I will release it and see if I can charge for it. Just enough that maybe my social media friends and colleagues might feel happy to sponsor me by buying a copy.
I have started to consider cover designs; an early one, just an alpha (rather like the text so far), was this from last night.
Exploring cover art ideas for #reconfigure
It was made using the iPhone app Typorama. I will have to rebuild the image in the right size and shape and may make the cubes in Unity3D instead of using the stock image, that being more in keeping with the story.
I started to work on the back cover hook, the abstract. It was interesting trying to find a balance between telling the story and hinting at it, introducing the central character and allowing her strengths and vulnerabilities to surface in a paragraph. I still have work to do on that but may share it in another post.

Forza 6 – 10 years in the making

Today Forza 6 arrived on the Xbox One. I have been a long time fan of Forza, now in its 10th year. A decade of what, at the time it came out, was considered to be a poor man’s Gran Turismo. It has, in my opinion, surpassed all expectations and been a fantastic franchise. The driving feel and the exhilaration are always spot on. I have also been a big fan of the decal customisation. It was a little annoying moving to the Xbox One and not having the various logos I had created a few years before, but I still carried on and re-created most of my ‘art’ work on the cars in Forza 5. I had a nice Cool Stuff Collective TV logo back on the 360 and Forza 4; this time, instead, I created a Choi Kwang Do logo.
I was very pleased, once I got to the main menu in Forza 6, to see that the custom decals and the layouts for specific cars could now be imported from Forza 5.
I did my first races in a very old Impreza using one of the community designs.
Forza 6
Now, though, I am back in a 2005 Impreza WRX complete with some Feeding Edge words and a logo on the back, along with a bright CKD logo. So thank you Turn 10 🙂 It now means my drivatar will be appearing in random races advertising CKD across the world. Also, in any network race I am in I can show off the logo for my martial art of choice and advertise Feeding Edge at the same time. It is an odd concept that there is still a lot of mileage in (excuse the pun).
Forza 6 CKD scooby
It was also quite amusing to hear James May and Richard Hammond’s voices, albeit labelled as “Automotive Journalist” as opposed to Top Gear presenters. I guess all the voice work was done before the demise of Top Gear as it previously existed.
**Update Here is some video of one of the cars in action. It looks better in the flesh as this has been Xbox DVR captured, sent to OneDrive and then uploaded to YouTube 🙂

***Update
A longer video using Forzavista, some driving and a photo of a newer Scooby, with a black spoiler to represent a belt and orange wing mirrors to represent a belt tag 🙂

**Update 21/9 I noticed during night races another level of detail that impressed me. I have not noticed headlights behave quite so headlighty before. This time my car had different bulbs and cast a different light to the other car on the track with me.
Forza6 headlights
I am going to reward myself with a massive driving session this weekend after I complete the first draft of this #reconfigure novel. Nearly there, the chequered flag is waving and I can see the finish line.
**Update I just finished the novel!

Making Movies – Flying around in Unity3d

Yes, ok, my latest obsession of writing #Reconfigure might make a great TV series or movie, but that’s not what this is about. Instead it is about another set of skills I seem to be putting into practice.
Before this I spent a day editing the school end of year video. I didn’t shoot it, but I did use all the various messing around and comedy moments from the teachers and staff to put a montage together. It had a story, some pace, a step change or two. It was great fun. I have now been asked to edit a few more things together. I like editing moving images; it is fun. I am not charging for this though, it is just to help people out.
At the same time I am in the middle of a little project to try and jazz up a regular presentation in the corporate space. I demoed a little virtual fly through a building, past a couple of characters and onto an image using Unity3d. It has turned into a slightly more complex epic, but nonetheless I am using lots of varied skills to make it work. The pitch will be pre-canned, but I have built the Unity environment so that it runs on rails, yet live. I have C# code in there to help do certain things like play video textures, drive animation controllers and even NavMesh agents to allow a character to walk around to a particular room, plus lots of flying cameras. I had used the first few of those a lot; they are stock Unity. However I had not really had a chance to use Unity animations. All my animations had been either ready-made FBX files or ones that I created in Cheetah3D. Once you imported an animation like that it was then sort of locked in place.
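As a rough illustration of how little code the NavMesh part needs, here is a minimal sketch. It is not the actual project script and the names are made up; it assumes a scene with a baked NavMesh and a character that already has a NavMeshAgent component.

```csharp
// Minimal sketch: send a character to a room via the NavMesh.
// "roomTarget" is a hypothetical empty GameObject placed in the destination room.
using UnityEngine;
using UnityEngine.AI; // in older Unity versions NavMeshAgent lives directly in UnityEngine

public class WalkToRoom : MonoBehaviour
{
    public Transform roomTarget;   // drag the destination marker in via the Inspector

    private NavMeshAgent agent;

    void Start()
    {
        // The NavMeshAgent does all the pathfinding; we only hand it a destination.
        agent = GetComponent<NavMeshAgent>();
        agent.SetDestination(roomTarget.position);
    }
}
```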
Back to those animations though: Unity3d has its own animation dope sheet and curve editor, and it’s really handy.
The Animation tab brings up a different timeline. It is different from the Animator Controller screen that lets you string animations together and create transitions, e.g. a jump or a run from a walk.
The Animation tab lets you select an object in the scene and creates a .anim for you. By default it then adds that .anim to an animation controller and attaches that to the object.
Unity3d Animation Timeline
The record mode then allows you to expand any or all of the properties of the object, usually transform position and rotation, but it can be colour, visibility, basically anything with a parameter. If you move the timeline and then move the object, a camera for instance, it creates the key frames for the changed parameters. It removes any of the awkward importing of paths and animations from another tool. It is not great for figures and people; they are still best imported as skinned meshes with skeletons, letting the import deal with all the complexity so you can add any stock animations to them. However, for a cut scene, or a zoom past to see a screen on a wall, it works really well.
I have written a lot of complicated things in Unity3d, but never bothered using this part. It is great to find it working, and doing exactly what I needed it to do for my cinematic cut scenes.
You can have things zooming and spinning around a known path in no time. You have to break it down a little, otherwise it resolves the easiest path, straight through a wall or something, but it seems very robust to edit afterwards.
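For the “on rails, but live” part, something along these lines is enough to kick a recorded camera animation off at the right moment. This is only a sketch, not the project code, and the state name is just an example matching whatever the .anim happens to be called.

```csharp
// Sketch: trigger a recorded camera fly-through when the presenter presses space.
using UnityEngine;

public class PlayCutScene : MonoBehaviour
{
    public Animator cameraAnimator;            // the Animator the Animation tab created on the camera
    public string stateName = "FlyPastScreen"; // example state/clip name, not a real project asset

    void Update()
    {
        // Fire the cut scene on demand so the pitch can run live but stay on rails.
        if (Input.GetKeyDown(KeyCode.Space))
        {
            cameraAnimator.Play(stateName);
        }
    }
}
```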
The only pity is that it’s not possible to lock a camera preview window open in stock Unity3d when selecting another object. Having got a camera pan just right, it’s tricky if you want to move an object in the camera view: with the camera selected you see its preview, with the object selected you lose it. I have not got around that yet, I just end up with trial and error.
No matter, it works, and it’s awesomely useful.

Reconfigure – That book update

On Wednesday I embarked on the start of a journey, like many others have taken, to try and write a book. A science fiction near future adventure, currently called Reconfigure. Firstly, thank you for all the advice and mentoring and support. This whole author thing is something that many people have tried at various times with varying degrees of success, so thank you for sharing. I am writing in a bit of a frenzy at the moment as I am in some sort of zone. I start at 9:30am and kind of wake from the blitz of typing at 1pm. That leaves some time to decompress and then chase all the other pieces of work and jobs that might end up paying their way at some point.
Reconfigure whiteboard
Whilst not obsessing over the whole word count thing, I am following some great advice to just go with it, but to keep it in mind. If there is stuff there, it needs to come out. The flow I am experiencing is very much like those really productive coding sessions, when you are just on it and away you go. I made a plan at the start, a storyboard with some key moments and discoveries within the story. I am mostly sticking to that, but as things pop up in this voyage of discovery I do pursue them.
The underpinning of the story is technology and approaches I know and that I use all the time. It is working as a substrate on which to grow the more esoteric elements of the story, the unusual, limit-breaking adventure elements.
Like everything I do there is an element of obsession to try something and see where it goes. Well, after five days of blasting through I seem to have something that might just work. At about 25k words so far it’s feeling about 1/4 or 1/3 of the way through. If it was a TV series, and I know it would work as one, this would be the start of the regular adventures. So right now it depends how many scrapes, twists and quirks I throw in on the ever increasing discovery arc.
Just like a good DVD with extras I have a couple of endings. One I really like but just don’t want to do and another that might be a bit of a cheat.
I am aware that finishing the main body is just part of the whole thing. Editing down and making sure there are no continuity errors or spelling/grammar errors will be very time consuming. Still, I have no publisher to worry about so I will, as usual, do my own thing and see what happens.
Now time for a bit of background research that extends what is currently the book’s operating system. 🙂

Just got to try it – Book time

We just spent the past 2 weeks in a villa in Mallorca on our family holiday. It was, as with all holidays, great fun and a change of scenery and pace, though I was still on the lookout for some interesting new projects and had a few calls to discuss things. It was also a time to reflect.
Untitled
I spent a lot of time doing my Choi patterns out in the sun, as well as us all swimming, eating and playing games like Exploding Kittens and Forgotten Dessert. We travelled around a bit in the hire car and visited some interesting places: caves, mountains and beaches. I managed to read an entire book too. I finally got to reading Andy Weir’s The Martian. It was an odd dichotomy to read about the claustrophobia of being stranded on an unforgiving planet, having to use maker and engineer wits to overcome the problems he faced. The technical detail in it was superb; the maths and calculations around how much water, oxygen, fuel, soil, potatoes etc. he needed kept whirring in my brain whilst in this baking heat and relaxed family atmosphere.
We were not cut off completely, we had WiFi in the villa, so the predlets were constantly streaming Minecraft and other vloggers. Of an evening I found CBS Action on the TV and got the predlets into watching Star Trek: The Original Series, followed by MacGyver, which they had never seen before.
Somewhere this all seemed to come together in a blinding flash of inspiration and I suddenly realized I had a scifi story in my head. One I needed to write.
I know, it’s something loads of people do, but I do love writing. I realized that my technical know-how spanning so many years had value in bringing some genuine flavour to any story. Also I specialise in the near future with emerging technology; often in articles, posts and tweets I extrapolate a little further. I have played a lot of games, watched a lot of films, and like to riff on ideas. My book reading is of course not great, I am not an avid reader, but I think that means the things I have read have a bigger impact on me.
So I came back home, sat down and started. Having mused over the idea for the second week of the holiday I thought I had best storyboard it first. After all, this is like trying to build software: you need to architect it first, based on the requirements. These requirements I had, mostly. The base premise, some of the development and a potential end.
Yesterday was just day 1; it may take years, but I got 2 chapters, 5,500 words in total, out. As I wrote, it flowed; it was like watching a boxed set on TV. I didn’t totally know what was going to happen and things cropped up that needed to be addressed. Again this was very much like coding. You just do it, find the flow and ride it. I hope I can continue to find the flow. I didn’t have a specific character in mind but yesterday they just arrived on the virtual paper I was typing into.
Of course there is no money in this, but whilst I wait and chase those customers and even potentially a regular job I might as well try something that I should have done a long time ago.
It seems that despite my wide-ranging experience, both technically and in sharing with people and doing the right thing, I am not finding the right avenues to express those talents and get paid properly.
So my plan is to get it done and just self publish on Amazon or some such place. If it works, great; if not, then it’s another experience. I also get to keep my tech skills up without writing too much software, as the book needs to use the right tools and work in the right way. So I figured this is another excuse to keep up to date with everything.
I started off with Feeding Edge thinking I would write a book about the virtual world experiences in corporate life, some great positives to share but also some ridiculous negative human behaviour in big organisations too. It is the latter that stops me writing that factual book. I may do it after this fiction one, but I think the scars are still too fresh to want to open those up again.
So here we go, let’s see what happens with this little experiment. If a massive piece of traditional work appears requiring my full undivided attention, well, at least I have started 🙂
The working title of this is “Reconfigure”. I have not checked whether it is already used elsewhere, but I am not married to the title. It’s a real world sci-fi thriller, full of true technical detail and science fact but asking some challenging questions of the world.
Right, time to write 🙂

25 years ago today – The start of an interesting career

A quarter of a century ago I started my first full time job at IBM. August 13th 1990. I had a sandwich year there doing the same job but this had more permanence about it. Indeed it was pretty permanent as I stayed at that same company for 19 years.
The tech world was very different back then. It was much simpler to just be a programmer or just be a systems administrator etc.
We worked on green screen terminals like this.
IBM 3277 Display
They were dumb terminals, no storage, just a command line CRT display. Everything was run and compiled remotely on the main machines. That doesn’t seem strange now that we have the cloud as a concept, but when you had cut your teeth, as I had, on home computers it was a strange experience. Scheduling batch jobs to compile code overnight, coming in in the morning to find out if you had typed the thing correctly.
We had intercompany messaging, a sort of intranet email system, but it was certainly not connected to the outside world and neither were we. We did have access to some usenet-style forums internally and they were great sources of information and communal discovery. There were a few grumpy pedants but it was not a massive trollfest.
As a programmer you had very little distraction or need to deal with things that were above your pay grade. The database admin and structure, for instance, well, that was someone else’s role. Managing the storage, archiving, backup etc. were all different specialist roles.
The systems we built were huge, lots of lines of code but in reality we were just building stock control systems. They were mission critical though.
After writing some code, a module to meet a specific requirement, we had code inspections. Groups of more experienced people sat and read printed-out listings of the code, checking it for consistency and formatting even before a compile job was run. It was also a time to dry run the code, sitting with a pen and paper working out what a piece of code did and how to make it better or catch more errors.
We wrote in PL/1, a heavily typed, no-frills language. It changed over time but there were not many tricks to learn, no funny pointers or reflection of variables, no pub/sub messaging or hierarchy problems. It was a great way to learn the trade.
Very rapidly though this rigidity of operation changed, as over the next few years we moved to building client server applications where the newly arriving PCs had things they could do themselves, with drag and drop interfaces like OS/2 (IBM’s version of Windows). I generally ended up in the new stuff as, well, that’s what I do!
We had an explosion of languages to choose from and to work with, lots of OO; Smalltalk was all the rage but we were busy also doing C and then C++ as well. Then before you knew it, or before I knew it, we had the web to play with. From 1997 on everything changed. It started with a bit of email access but very soon we had projects with proper access to the full interweb.
Building and creating in this environment was almost the polar opposite of the early years of small units of work and code inspections. We all did everything, everywhere, on every platform. Scripting languages, web server configs, multiple types of content management, e-commerce products, app server products, portal products. We had Java on the back end and front end but also lots of other things to write with, rules based languages and lots and lots of standards that were not yet standards. Browsers that worked with some things and not others, corporate clients dictating one random platform over another which we then built to.
It was a very anarchic yet incredibly interesting time.
The dotcom bubble burst happened quite naturally, but what it destroyed was mostly business models and over-eager investment. For the tech industry it actually helped slow down some of the mad expansion so we could all get to grips with what we had, and it ultimately led to people starting to connect with one another and to the power of the web for communication.
Of course now we are gathering momentum again; lots of things are developing really quickly. The exciting buzz of things that are nearly mass market, but not quite, of competing platforms and open source implementations all competing to be the best pre-requisite for your project.
Everyone who does any of this needs to be full stack now. For those of us who grew with this and survived the roller coaster it is a bit easier, I think. Though not having those simpler times to reflect on might actually be a blessing for this generation. Not thinking they are not supposed to be messing around with the database because that’s not their job might be good for them.
One thing is certain: despite some nice cross platform systems like Unity3d, the rest of the tech has not got any easier. Doing any new development involves sparking up a terminal emulator, working just like those green screens back in the 90s. Some interesting but ultimately arcane incantations to change a path variable or attempt to install the correct pre-req packages still flourish under the covers. If that’s what you came from then it just feels like home.
Doing these things, as I did today, installing node.js for some server-side scripting and a Maven-based web server, then some variants of a Java SDK followed by some security key generation, it certainly felt like this is still an engineering subject. You can follow the instructions but you do need to know why. The things we do are the same, input, process, output with a bit of validation and security wrapped around. There are just so many more ways to explore and express the technical solution now.
If I was starting now, at 18 and entering the workforce, I wonder what I would be doing. I hope that I would have tinkered with all this stuff as much as ever, just with more access to more resources, more help on the web, and probably have known a lot more. I would have been a junior full stack developer rather than a junior programmer. Oh, and I wouldn’t have to wear a suit, shirt and tie to sit at the desk either 🙂

Pepper’s IGhost

A few days ago on Facebook I saw a post about a build of a visual trick that makes a smartphone look like it has a hologram floating above it (props to Ann Cudworth for sharing it). It is of course not really a hologram but a version of Pepper’s Ghost, using a trick of light passing through an angled piece of transparent material. This allows the eye to see an object floating, or looking transparent and ethereal.
The video shows how this all works and how to build one.

I did have a go with an old CD case but I found the plastic shattered way too much, so instead I used a slightly more cuttable acetate sheet. I made a quick prototype that was a little bit wonky, but it still worked.
"Holographic" acetate
There are lots of these types of videos out on YouTube.
There are some commercially available versions of this, such as the Dreamoc, which works in a similar way but with the pyramid the other way up.

There are lots of other examples where a visual trick fools our brains into thinking something is truly 3D and floating in space. It’s all done with mirrors 🙂
Some of you may remember the Time Traveler arcade game.

This used a projection onto a bowl-shaped mirror. The effect is also used in the visual trick you sometimes see in gadget and joke shops, such as this one from Optigone.

There are some fascinating tricks here, and of course Microsoft Hololens and Magic Leap will be using “near eye light fields”, which are slightly more complex arrangements than a bit of acetate on an iPhone, but we can appreciate some of the magical impact they may have by looking at these simpler optical illusions.
Our ability to do more with light, and not just deal with the flat 2D plane of a TV screen or of a single photo, is definitely advancing. Recent cameras such as the Lytro, which is a light-field camera, treat multiple layers of light as important, just as the near eye light fields bounce light through multiple screens of colour and different angles to create their effect.
Whilst the word hologram is sometimes overused, I think what matters is how it makes us feel as a human when we look at something. The mental leap, or the trick of our brain, that causes us to think something is really there is fascinating. If we think it is, well… it is.
At the moment we are still focussed (no pun intended) on altering the images that travel into our eyes and the way the eye works, with its lens and optic nerve, to reach the brain. It is only a matter of time before we go deeper, straight to the brain. Already there are significant advances being made in helping those with no eyesight or restricted eyesight to have a digital view of the world directed and detected in different ways. So it may be that our true blended and augmented reality is to further augment ourselves. There are quite a few ethical issues and trust issues to consider there.
Anyway, back to amazing the predlets with Pepper’s IGhost; time to build a bigger one for the iPad!
**Update Just after posting I made a larger one for the iPad. The predlets enjoyed watching the videos in a darkened corner.
Predlets enjoying holographic effect on ipad
Then maybe an interactive Unity3d live version.
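If I do build that, the rough shape would be four cameras around the subject, each rendering into its own quarter of the screen so the device can sit under the acetate pyramid. Here is a minimal sketch, with made-up names and a quadrant layout that would need tuning to the actual pyramid, rather than anything built yet.

```csharp
// Sketch of a live Pepper's Ghost rig: four cameras at 90 degree intervals,
// each drawing into one quarter of the screen, all looking at the same subject.
using UnityEngine;

public class PeppersGhostRig : MonoBehaviour
{
    public Transform subject;    // the object to appear "floating"
    public Camera[] cameras;     // four cameras assigned in the Inspector
    public float distance = 2f;  // how far each camera sits from the subject

    void Start()
    {
        for (int i = 0; i < 4; i++)
        {
            // Place each camera on a circle around the subject, looking inwards.
            float angle = i * 90f * Mathf.Deg2Rad;
            Vector3 offset = new Vector3(Mathf.Sin(angle), 0f, Mathf.Cos(angle)) * distance;
            cameras[i].transform.position = subject.position + offset;
            cameras[i].transform.LookAt(subject);

            // Give each camera a quarter of the screen and a plain black background,
            // which is what reflects best in the acetate.
            cameras[i].rect = new Rect((i % 2) * 0.5f, (i / 2) * 0.5f, 0.5f, 0.5f);
            cameras[i].clearFlags = CameraClearFlags.SolidColor;
            cameras[i].backgroundColor = Color.black;
        }
    }
}
```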

Return of the camp fire – Blended Reality/Mixed Reality

Back in 2006, when we all got into the previous wave of Virtual Reality (note the use of previous, not 1st) and we tended to call them virtual worlds or the metaverse, we were restricted to online presence via a small rectangular screen attached to a keyboard. Places like Second Life became the worlds that attracted the pioneers, explorers and the curious.
Several times recently I have been having conversations, comments, tweets etc. about the rise of the VR headsets, Oculus Rift et al, AR phone holders like Google Cardboard and MergeVR, and then the mixed reality future of Microsoft Hololens and the Google-backed Magic Leap.
In all the Second Life interactions and subsequent environments it has always really been about people. The tech industry has a habit of focusing on the version number, or a piece of hardware though. The headsets are an example of that. There may be inherent user experiences to consider but they are more insular in their nature. The VR headset forces you to look at a screen. It provides extra inputs to save you using mouse look etc. Essentially though it is an insular hardware idea that attempts to place you into a virtual world.
All our previous virtual world exploration has been powered by the social interaction with others. Social may mean educational, may mean work meetings, or may mean entertainment or just chatting. However, it was social.
Those of us who took to this felt, as we looked at the screen, that we were engaged and part of the world. Those who did not get what we were doing only saw funny graphics on the screen. They did not make that small jump in understanding. There is nothing wrong with that, but that is essentially what we were all trying to get people to understand. A VR headset provides a direct feed to the eyes. It forces that leap into the virtual, or at least brings it a little closer, ready then to engage with people. Though at the moment most VR is about engaging with content; it is still in showing off mode.
We would sit around campfires in Second Life, in particular Timeless Prototype’s very clever looking one that adds seats/logs per person/avatar arriving. Shared focal points of conversation. People gathering, looking at a screen in front of them but being there mentally, the screen providing all the cues and hints to enhance that feeling. A very real human connection. Richer than a telephone call, less scary than a video conference, intimate yet standoffish enough to meet most people’s requirements if they gave it a go.
It is not such a weird concept as many people get lost in books, films and TV shows. They may be immersed as a viewer not a participant or they may be empathising with characters. They are still looking at an oblong screen or piece of paper and engaged.
With a VR headset, as I just mentioned, that is more forced. They will understand the campfire that much quicker and see the people and the interaction. There is less abstraction to deal with.
The rise of the blended and mixed reality headsets, though, changes that dynamic completely. Instead they use the physical world, one that people already understand and are already in. The campfire comes to you, wherever you are.
This leads to a very different mental leap. You can have three people sat around the virtual campfire in their actual chairs in an actual room. The campfire is of course replaceable with any concept, piece of information or simulated training experience. Those people each have their own view of the world and of one another, but it is added to with a new perspective; they see the side of the campfire they would expect.
It goes further though; there is no reason for the campfire not to coexist in several physical spaces. I have my view of the campfire from my office, you from yours. We can even have a view of one another as avatars placed into our physical worlds, or not. It’s optional.
When just keeping with that physical blended model it is very simple for anyone to understand and start to believe. Sitting in a space with one another sharing an experience, like a campfire, is a very human, very obvious one. For many that is where this will sort of end: just run of the mill information, enhanced views, cut-away engineering diagrams, point of view understanding etc.
The thing to consider, for those who already grok the virtual world concept, is that just as in Second Life you can move your camera around and see things from any point of view, from the other person’s point of view, from a bird’s eye view, from close in or far away, the mixed reality headsets will still allow us to do that. I could alter my view to see what you are seeing on the other side of the campfire. That is the start of something much more revolutionary to consider, whilst everyone catches up with the fact that this is all about people, not just insular experiences.
This is the real metaverse, a blend of everything anywhere and everywhere, connecting us as people, our thoughts, dreams and ideas. There are huge possibilities to explore, and more people will explore them once that little hurdle is jumped: it’s people, it has always been about people.

BBC Micro:bit – It’s taken a while but worth it

I have always been keen that kids learn to code early on. Not to make them all software engineers but to help find those that can be, and also to help everyone get an appreciation of what goes into the devices, games and things we all totally depend on now.
Way back in the last ever Cool Stuff Collective (in 2011!) I did a round-up of all things tech coming in the future. I begged kids to hassle their teachers to learn to program.
It was looking as if the Raspberry Pi was going to be the choice for a big push into schools. It is probably even why our show was seen as too common to be allowed to show the early version. I was keen to have the Pi on but that didn’t work out. We had already done the Arduino on a previous show and that seemed to work really well.
So I was very pleased to see the BBC Micro:bit arrive on the scene last week. The Raspberry Pi is a great small computer; it is, however, quite a techie thing to get going, as you have to be managing operating systems, drivers etc. (though things like the Kano make that much, much easier).
Arduinos are great for being able to plug things into, though they require some circuitry and connections on a breadboard to start to make sense of. The programming environment allows a lot of things to be achieved, but it is a microcontroller, not a full working computer.
microbit
The new BBC micro:bit is a step on from the Arduino. It has an array of LED lights on it as its own display but also has Bluetooth and a collection of sensors already onboard.
I was very pleased to see the involvement of Technology Will Save Us in the design. I met and shared the stage with the co-founder of this maker group, Daniel Hirschman (@danielhirschman), at the GOTO conference in Amsterdam a couple of years ago. He knows his stuff and is passionate about how to get kids involved in tech, so I know that this is a great step and a good piece of kit.
The plan to roll this out to every Year 7 student is great, though predlet 1.0 is just moving up out of that year. So I will just have to buy one when we can.
Microsoft have a great video to describe it too.

I am sure STEMnet ambassadors in the field of computing will be interested and hopefully excited too. This is important, as it is not just code, it is physical computing, understanding sensors and what can and can’t be done. It is an internet of things (IoT) device and it is programmed with a web-based programming environment. It is definitely of the moment and the right time for it. Oh, and it’s not just an “app” 🙂

Emotiv Insight – Be the Borg

Many years ago a few of us had early access, or nearly early access, to some interesting brain wave reading equipment from Emotiv. This was back in 2005. I had a unit arrive at the office, but then it got locked up in a cupboard until the company lawyers said it was OK to use. It was still there when I left, but it did make its way out of the cupboard by 2010. It even helped some old colleagues get a bit of TV action.
It was not such a problem with my US colleagues who at least got to try using it.
I got to experience and show similar brain controlled toys on the TV with the NeuroSky Mindflex game. This allows the player to try to relax or concentrate to control air fans that blow a small foam ball around a course.
So in 2013, when Emotiv announced a Kickstarter for the new headsets, in this case the Insight, I signed right up.
We (in one of our many virtual world jam sessions) had talked about interesting uses of the device to detect user emotions in virtual worlds. One such use was that if your avatar, and how you chose to dress or decorate yourself, caused discomfort or shock to another person (wearing a headset), their own display of you could be tailored to make them happy. It’s a modern take on the emperor’s new clothes, but one that can work. People see what they need to see. It could have worked at Wimbledon this year when Lewis Hamilton turned up to the royal box without a jacket and tie. In a virtual environment that “shock” would mean any viewers of his avatar would indeed see a jacket and tie.
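In virtual world terms the idea itself is tiny; the hard part is the emotion detection. Here is a purely hypothetical sketch, none of it Emotiv SDK code, where some other component feeds in a discomfort score for the viewer and their own client simply swaps which outfit it renders.

```csharp
// Hypothetical sketch of the "emperor's new clothes" idea.
// The discomfort score would come from whatever reads the viewer's headset;
// this only changes what that viewer's client renders, not the avatar itself.
using UnityEngine;

public class TailoredAvatarView : MonoBehaviour
{
    public GameObject casualOutfit;        // what the avatar owner actually chose
    public GameObject formalOutfit;        // the jacket-and-tie version
    public float discomfortThreshold = 0.6f;

    // Called by whatever is measuring the viewer's emotional state (0..1).
    public void OnViewerDiscomfort(float discomfort)
    {
        bool upset = discomfort > discomfortThreshold;
        // Only the viewer's own rendering changes; everyone else is unaffected.
        casualOutfit.SetActive(!upset);
        formalOutfit.SetActive(upset);
    }
}
```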
It has taken 2 years but the Emotiv Insight headset has finally arrived. As with all early adopter kit though, there are some hiccups. Michael Rowe, one of my fellow conspirators from back in the virtual world heyday, has blogged about his unit arriving.
Well here is my shiny boxed up version complete with t-shirt and badge 🙂
Emotiv insight has arrived
The unit itself feels very slick and modern. Several contact sensors spread out over your scalp, and an important reference sensor on the bone behind the ear (avoiding your glasses) needs to be in place.
You do look a bit odd wearing it 🙂 but we have to suffer for the tech art.
Emotiv headset
I charged the unit up via the USB cable and waited for the light to go green, then unplugged the device and hit the on switch (near the green light). Here I thought it was broken. It is something very simple, but I assumed, never assume, that the same light near the switch would be the power light. That was not the case. The “on” light is a small blue/white LED above the logo on the side. It was right under where I was holding the unit. Silly me.
Emotiv insight power light
I then set about figuring out what could be done, if it worked (once I saw the power light).
This got much trickier. A lot of Kickstarters have their own way of communicating, and when you have not seen something appear for 2 years you tend not to be reading it every day, following every link and nuance.
So the unit connects with Bluetooth, but not any Bluetooth; it is the new Bluetooth Low Energy (BTLE). This is a low power upgrade to Bluetooth that many older devices do not support. I checked my Macbook Pro and it seemed to have the right stack. (How on earth did NASA manage to get the probe to Pluto over 9 years without the comms tech all breaking!)
I logged into the Emotiv store and saw a control panel application. I tried to download it, but having selected Mac it bounced me to my user profile saying I needed to be logged on. It is worrying that the simplest of web applications is not working when you have this extremely complex brain reading device strapped to your head! It seems the Windows one would download, but that was no good.
I found some SDK code via another link but a lot of that seemed to be emulation. It also turns out that the iOS 8 updates broke the Bluetooth LE versions of any apps they had, so it doesn’t work on those now. Still, some Android apps seem to be operational.
I left it a few days, slightly disappointed but figured it would get sorted.
Then today I found a comment about joining the Google Plus group (another flashback to the past) and a helpdesk post saying go to https://cpanel.emotivinsight.com/BTLE/
As soon as I did that, it downloaded a plugin, and then I saw Bluetooth pair with the device with no fuss. A setup screen flickered into life and it showed sensors were firing on the unit.
The webpage looked like it should then link somewhere, but it turns out there is a menu option hidden in the top left hand corner, with a submenu “detections”. That let me get to a few web based applications that at last responded. I did not have much luck training and using it yet, as often the apps would lock up, throw a warning message and the Bluetooth would drop. I did however get a trace; my brain is in there and working. I was starting to think it was not!
Emotiv Insight Brain trace
So I now have a working brain interface; I am Borg; it’s 2015. Time to dig up those ideas from 10 years ago!
I am very interested in what happens when I mix this with headsets like the Oculus Rift. Though it is a pity that Emotiv seem to want to charge loyal backers $70 for an Emotiv Unity plugin, which apparently doesn’t work in Unity 5.0 yet? I think they may need some software engineering and architecture help. Call me, I am available at the moment!
I am also interested, given how light the unit is, in tracing what happens in Choi Kwang Do training and in aiming for a particular mindset trace, i.e. we always try to relax to generate more speed and power in situations that instinct tells you to tense up for. Of course that will have to wait a few weeks for my hip to recover. The physical world is a dangerous place and I sprained my left hip tripping over the fence around the predpets. So it’s been a good time to ponder interesting things to do whilst not sitting too long at this keyboard.
As ever, I am open for business. If you need to understand what is going on out here in the world of new technology and where it fits, that is what Feeding Edge Ltd does, taking a bite out of technology so you don’t have to.