

Xbox One Kinect Two

The title almost sounds like a football score, but we are very close to the release of the next wave of consoles. I am particularly interested in the Xbox One for a number of reasons.
1. Most of my gaming has gravitated towards the 360, so I am geared up for the next-gen franchises like the fantastic-looking Forza 5.
2. The Kinect 2.0 has some features that are going to be really useful for any Choi Kwang Do applications.
3. Unity 3d has a tie in with Xbox One development.
4. Proper cloud processing: utility computing looks like it is part of Microsoft's plan. Not just streaming games from elsewhere, but farming off processing to large servers and delivering results back. (Grid computing, as we used to call it 🙂 )

As a Unity developer, whilst much of what I do is private rather than publicly available, I am really interested in being able to deploy to the Xbox One. It opens up a lot of possibilities from a research point of view and may lead to some extra commercial work.
I have applied for the ID@Xbox scheme, which is there to help developers get on board with the Xbox One. Eventually any Xbox One will be a potential piece of development kit, which is great news, but at the moment they are still in the old console model of needing a special Xbox One to develop on. Unity3d have announced free versions of Pro to go with those kits. As a Pro licence owner already, I really just need access to the kit.
In particular, you can see the difference in how the Kinect 2.0 can deal with the human form in this video.
***fixed the link as it was the wrong kinect video 🙂

It offers a richer skeleton, complete with real shoulders, but also tools already in place to look at weight distribution, muscle tension and limb acceleration, and, interestingly, heart rate read from the face.
You can see that this provides me with a whole lot more tech vocabulary to be able to analyse what we do in Choi Kwang Do and provide a training aid, or training mirror. Compare this to where I am up to with Kinect 1.0, as in this previous post (one of my virtual/gaming technology Use Case examples).
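To give a flavour of the sort of measure richer joint data enables: limb acceleration can be estimated from nothing more than successive joint positions. Here is a minimal Unity C# sketch of that idea; the joint transform, the names and the smoothing factor are my own placeholder assumptions, not part of any Kinect SDK.

```csharp
using UnityEngine;

// Hypothetical example: estimate the acceleration of a tracked joint
// (e.g. a hand) by finite differences over successive frames.
// "joint" stands in for whatever transform your Kinect integration
// drives - this is not an official Kinect SDK API.
public class LimbAcceleration : MonoBehaviour
{
    public Transform joint;          // transform driven by the skeleton tracker
    public float smoothing = 0.2f;   // simple low-pass factor to tame jitter (a guess)

    private Vector3 lastPosition;
    private Vector3 lastVelocity;
    public Vector3 Acceleration { get; private set; }

    void Start()
    {
        lastPosition = joint.position;
    }

    void Update()
    {
        float dt = Time.deltaTime;
        if (dt <= 0f) return;

        // First derivative: velocity from the change in position.
        Vector3 velocity = (joint.position - lastPosition) / dt;

        // Second derivative: acceleration from the change in velocity,
        // smoothed because raw skeleton joint data is noisy.
        Vector3 rawAccel = (velocity - lastVelocity) / dt;
        Acceleration = Vector3.Lerp(Acceleration, rawAccel, smoothing);

        lastPosition = joint.position;
        lastVelocity = velocity;
    }
}
```

The same pattern would work for any joint the sensor reports, which is what makes the richer Kinect 2.0 skeleton so interesting for technique analysis.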
I am not sure if I meet Microsoft’s requirements to be called a developer, but then most of what I do never fits on any of these forms that I have to fill in 🙂 If I do, and I get access to a dev kit, that is great. Either way this is much more useful than the alternative platforms, so I am hopeful.
As it is, my Xbox One has been on preorder since they were launched, so I am hoping the post delivers it promptly on 22nd November, just 10 days away. That is the fun side of playing, but, as ever, it is also part of learning and understanding the world of technology.

Use Case 2 – real world data integration – CKD

As I am looking at a series of boiled-down use cases of virtual world and gaming technology, I thought I should return to the exploration of body instrumentation and the potential for feedback in learning a martial art such as Choi Kwang Do.
I have of course written about this potential before, but I have built a few little extra things into the example using a new Windows machine with a decent amount of power (HP Envy 17″) and the Kinect for Windows sensor, with the Kinect SDK and Unity3d package.
The package comes with a set of tools that let you generate a block man based on the joint positions. The controller piece of code also has some options for turning on the user map and skeleton lines.
In this example I am also using Unity Pro, which allows me to position more than one camera and have each of those render to a texture on another surface.
You will see the main block man appear centrally “in world”. The three screens above him are showing a side view of the same block man, a rear view and, interestingly, a top-down view.
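For anyone wanting to recreate that multi-view setup, this is roughly how it works in Unity Pro: each extra camera renders into a RenderTexture, which is then displayed on an in-world quad. A minimal sketch with my own illustrative names; the resolution and placement are arbitrary.

```csharp
using UnityEngine;

// Sketch of the Unity Pro multi-view trick: an extra camera
// (side, rear or top-down) renders into a texture that is shown
// on a quad "screen" above the block man.
public class ViewScreen : MonoBehaviour
{
    public Camera viewCamera;   // e.g. a camera looking down at the block man
    public Renderer screenQuad; // the in-world surface acting as a monitor

    void Start()
    {
        // Render-to-texture needs a Pro licence in this era of Unity.
        var rt = new RenderTexture(512, 512, 16);
        viewCamera.targetTexture = rt;           // camera now draws into the texture
        screenQuad.material.mainTexture = rt;    // quad displays that camera's view
    }
}
```

One instance of this per extra view (side, rear, top-down) is all it takes.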
In the bottom right is the “me” with lines drawn on. The Kinect does the job of cutting out the background. All this was recorded live, running in Unity3d.
The registration of the block man and the joints isn’t quite accurate enough at the moment for precise Choi movements, but this is the old Kinect; the new Kinect 2.0 will no doubt be much, much better, as well as being able to register your heart rate.

The cut-out “me” is a useful feature, but you can only have that projected onto the flat camera surface; it is not a thing that can be looked at from left/right etc. The block man, though, is made of actual 3d objects in space. The cubes are coloured so that you can see joint rotation.
I think I will reduce the size of the joints and try to draw objects between them, to give him a similar definition to the cut-out “me”, as sketched below.
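One way to do that, under my own assumptions about the scene setup, is to stretch a cylinder between each pair of connected joint cubes every frame:

```csharp
using UnityEngine;

// Illustrative sketch: draw a "bone" between two joint cubes by
// positioning, orienting and stretching a cylinder between them.
public class LimbSegment : MonoBehaviour
{
    public Transform jointA;   // e.g. the shoulder cube
    public Transform jointB;   // e.g. the elbow cube
    public float thickness = 0.05f;

    void LateUpdate()
    {
        Vector3 a = jointA.position;
        Vector3 b = jointB.position;

        // Place the segment halfway between the joints...
        transform.position = (a + b) * 0.5f;
        // ...point its up-axis along the limb (Unity cylinders run along Y)...
        transform.up = (b - a).normalized;
        // ...and stretch it to span the gap (a default cylinder is 2 units tall).
        transform.localScale = new Vector3(thickness,
                                           (b - a).magnitude * 0.5f,
                                           thickness);
    }
}
```

Attach one of these per limb, wired to the relevant pair of joint transforms, and the block man gains a connected outline.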
The point here, though, is that game technology and virtual world technology can give a different perspective on a real-world interaction. Seeing techniques from above may prove useful, and is not something that can easily be observed in class. If that applies to Choi Kwang Do then it applies to all other forms of real-world data. Seeing from another angle, exploring and rendering in different ways can yield insights.
It is also data that can be captured and replayed, transmitted and experienced at a distance by others. Capture, translate, enhance and share. It is something to think about: what different perspectives could you gain on data you have access to?

Dear BBC I am a programmer and a presenter let me help

I was very pleased to see that Tony Hall, the new DG of the BBC, wants to get the nation coding. He plans to “bring coding into every home, business and school in the UK”. http://www.bbc.co.uk/news/technology-24446046
So I thought, as I am lacking a full time agent in the TV world, I should throw my virtual hat in the ring to offer to work on the new programme that the BBC has planned for 2015.
It is not the first time I have offered assistance to such an endeavour, but this is the most public affirmation of it happening.
So why me? Well, I am a programmer and have been since the early days of the shows on TV, back in ZX81/C64/BBC Model A/B/Spectrum days. I was initially self-taught through listings in magazines and general tinkering before studying to degree level, and then pursuing what has been a very varied career, generally involving new tech each step of the way.
I was lucky enough to get a TV break with Archie Productions and the ITV/CITV show The Cool Stuff Collective, well documented on this blog 😉 In that I had an emerging technology strand of my own. The producers and I worked together to craft the slot, but most of it was driven by the things about the changing world and maker culture that I spend my time sharing with C-level executives and at conferences.
It was interesting getting the open source Arduino, along with some code, on screen in just a few short minutes. It became obvious there was a lot more that could be done to help people learn to code. Of course these days we have many more ways to interact too. We do not have to just stick to watching what is on screen; that acts as a hub for the experience. Code, graphics, artwork, running programs etc. can all be shared across the web and social media. User participation, live and in sync with on-demand, can be very influential. Collections of ever-improving assets can be made available, then examples of how to combine them put on TV.
We can do so much with open source virtual worlds, powerful accessible tools like Unity3d and of course platforms like the Raspberry Pi. We also have a chance to explore the creativity and technical challenges of user generated content in games, and next-gen equipment like the Oculus Rift. Extensions to the physical world with 3d printers, augmented reality and increasingly blended reality offer scope for innovation and invention by the next generation of technical experts and artists. Coding and programming is just the start.
I would love to help; it is such an important and worthy cause for public engagement.
Here is a showreel of some of the work.

There is more here, and some writing and conference lists here.
So if you happen to read this and need some help on the show get in touch. If you are an agent and want to help get this sort of thing going then get in touch. If you know someone who knows someone then pass this on.
This blog is my CV, though I do have a traditional few pages if anyone needs it.
Feeding Edge, taking a bite out of technology so you don’t have to.
Yours Ian Hughes/epredator

Oculus Rift arrived

Yesterday was a very interesting day all round. The main event for me was the arrival of my Oculus Rift development kit. This is a Virtual Reality (VR) headset. We have had these before, but not ones that are this (relatively) cheap and that just plug into any machine and work with Unity3d!
It came in a cool case too. Some nice big clicking catches on it like something James Bond would get from Q.
Oculus rift arrived
I had to dash out and get a Mini DisplayPort to DVI adapter as I had that frustrating moment we have all had at some time when you realise a tiny piece of cable is missing. I only had a VGA adapter that I usually need for projectors at presentations. Handily there is a Mac store in Basingstoke, and Apple kit is also stocked in the nearby PC World, so I had that covered in minutes.
I ran the Tuscany demo straight away. This is a small, but beautifully rendered, building in the Italian countryside on the coast. Despite having experienced this sort of stuff a lot before, I was suitably amazed and excited by it.
I did not hog it though and got each member of 3 generations in the house to have a go.
I was really happy that the predlets got how cool it was too. There was a suitable amount of bemusement and wonder.

The unit has two screens, one for each eye, and the ability to track head movements. These head movements can then inform a game or application that you have moved your view.
The Unity3d integration is the library that is able to talk to the Rift, but it is also a couple of prefabs (bundles of reusable code) that let you change the normal single-view camera to one that generates two views from slightly different angles to then feed to the Rift screens (one per eye). Elegant and simple. It also has a controller prefab that takes the input from the headset movement and feeds that into the Unity environment.
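Conceptually the stereo camera prefab boils down to something like the sketch below. This is not the actual Oculus prefab code, just my own minimal illustration of the idea: two cameras offset by half the eye separation, each rendering to one half of the screen, with the headset orientation applied to the whole rig.

```csharp
using UnityEngine;

// Minimal illustration of what a Rift-style stereo rig does.
// Not the Oculus SDK prefab - just the underlying idea.
public class StereoRig : MonoBehaviour
{
    public Camera leftEye;
    public Camera rightEye;
    public float eyeSeparation = 0.064f; // ~64mm interpupillary distance

    void Start()
    {
        // Offset each eye camera sideways from the rig's centre.
        leftEye.transform.localPosition  = Vector3.left  * (eyeSeparation * 0.5f);
        rightEye.transform.localPosition = Vector3.right * (eyeSeparation * 0.5f);

        // Each camera renders to one half of the display.
        leftEye.rect  = new Rect(0f,   0f, 0.5f, 1f);
        rightEye.rect = new Rect(0.5f, 0f, 0.5f, 1f);
    }

    // In the real integration the controller prefab feeds the headset's
    // orientation in here every frame.
    public void SetHeadOrientation(Quaternion headsetRotation)
    {
        transform.localRotation = headsetRotation;
    }
}
```

The real integration also applies a lens distortion shader to each eye's image, which is why using the supplied prefabs rather than rolling your own is the sensible route.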
So it is not going to take too much to test some of the applications I have and make them Rift-enabled, though I am assuming they have to be native application builds, not web ones.
It was not the only tech of the day though as in the morning Predlet 1.0 went and experienced indoor skydiving at Airkix.

Something I remember well from the TV show 🙂
She had a blast doing that.
I had a go on the iRacing sim rig they had there too; this is basically a high-end PC with a hydraulic chair and some force feedback steering.
We have yet to try the brush boarding or the Skiplex slope (a skiing conveyor belt).
Just to juxtapose those, though, predlet 1.0 finished her Revell glue-together-and-paint kit of the Titanic. Don’t worry though, it was all perfectly normal, as predlet 2.0 was completing levels on Lego Indiana Jones 2 on the Xbox 🙂

Jumping into LEAP

I had originally thought I would not bother with a LEAP controller. However new technology has to be investigated. That is what I do after all 🙂
I ordered the LEAP from Amazon and it arrived very quickly. So whilst I might not have been an early adopter, and was not on the developer programme, it is not hard to get hold of one now. It is £69.99, but that is relatively cheap for a fancy peripheral.
Giving LEAP a go
The self-proclamation on the box is interesting: “The remarkably accurate, incredibly natural way to interact with your computer”. My first impressions are that it is quite accurate. However, as with all gesture-based devices, there is no tactile feedback, so you have to sort of feel your way through space to get used to where you are supposed to be.
Leap
However the initial setup demonstration works very well, giving you a good sense of how it is going to work.
It comes with a few free apps via Airspace and access to another ecosystem to buy some more.
The first one I clicked on was Google Earth, but it was a less than satisfying experience as it is not that obvious how to control it, so you end up putting the world into a Superman-style spin before plunging into the ocean.
I was more impressed with the nice target catching game DropChord (which has DoubleFine’s logo on it). This has you trying to intersect a circle with a chord and hit the right targets to some blasting music and glowing visuals. It did make my arms ache after a long game of it though!
What was more exciting for me was to download the Unity3d SDK for LEAP. It was a simple matter of dropping the plugin into a Unity project and then importing a few helper scripts.
The main one, Leap Unity Bridge, can be attached to a game object. You then configure it with a few prefabs that will act as fingers and palms, press run and (if you have the camera pointing the right way) you see your objects appear as your fingers do.
Many of the apps on Airspace are particle-pushing creative expression tools, so creating an object that is a particle generator for fingers immediately gives you the same effect.
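If you wanted to roll that by hand rather than use the bridge script, the LEAP C# API makes it fairly direct. A rough sketch, assuming the LEAP Unity plugin is imported; the millimetre-to-scene scale factor is my own guess, and the bridge script handles all this more carefully.

```csharp
using UnityEngine;
using System.Collections.Generic;
using Leap; // from the LEAP Unity plugin

// Rough sketch: spawn a prefab (e.g. a particle emitter) at each
// detected fingertip, every frame. A fuller version would also
// remove objects for fingers that are no longer tracked.
public class FingerParticles : MonoBehaviour
{
    public GameObject fingertipPrefab;
    public float scale = 0.001f; // mm -> scene units (assumption, tune to taste)

    private Controller controller = new Controller();
    private Dictionary<int, GameObject> tips = new Dictionary<int, GameObject>();

    void Update()
    {
        Frame frame = controller.Frame(); // latest tracking data
        foreach (Finger finger in frame.Fingers)
        {
            if (!tips.ContainsKey(finger.Id))
                tips[finger.Id] = (GameObject)Instantiate(fingertipPrefab);

            Leap.Vector p = finger.TipPosition; // millimetres, relative to the device
            tips[finger.Id].transform.position =
                new Vector3(p.x, p.y, p.z) * scale;
        }
    }
}
```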
Leap unity
It took about 10 minutes to get it all working (6 of those were downloading on my slow ADSL).
The problem I can see at the moment is that, while pointing is a very natural thing to do and works great, the pointing is of course relative to where the LEAP is placed. So you need a lot of visual feedback and large buttons (rather like Kinect) in order to make selections. Much of that is easier with touch or with a mouse.
Where it excels though is in visualisation and music generation, where you get a sense of trying to master a performance and feel you have an entire space to play with, rather than limiting yourself to trying to select a button or window on a 2d screen, which is a bit (no) hit and miss.
I spent a while tinkering with Chordion Conductor that lets you play a synth in various ways. The dials to adjust settings are in the top row and you point and twirl your finger on the dials to make adjustments. It is a fun and interesting experience to explore.
Just watch out where you are seen using the LEAP. You are either aggressively pointing at the screen, throwing gang signs or testing melons for ripeness in a Carry on Computing style.
I am looking forward to seeing if I can blend this with Unity3d and my Oculus Rift when the Rift arrives though 🙂

Sensing movement – BitGym

I thought I should try the Unity modules for a new motion sensing library called BitGym: a Unity/webcam-based motion sensing toolkit that works with depth as well as lateral movements.

It, like many Unity3d packages, is just so straightforward to get working. You simply import the package, then add a script to an object, and there you have it.
An interesting twist though is that you are not locked into the laptop. With Unity able to target different platforms as a runtime (with the right licences), you can build iOS and Android sensing apps.
I have yet to explore how or if this includes skeleton tracking. It was pure Unity3d. The way it should be done 🙂
With a single-plane camera it is of course going to be tricky to determine size, shape and motion in the detail of physical form you get from the Kinect, which is really a 3d scanner. However, for some sports applications and rep counters it may well be all that is needed. Some further investigation is on the list of things to do.
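I do not know BitGym's internals, but the general principle of webcam motion sensing is easy to sketch in Unity with frame differencing; the names and threshold numbers here are my own arbitrary assumptions, not BitGym's API.

```csharp
using UnityEngine;

// Generic webcam motion sensing sketch (not BitGym's actual API):
// compare successive frames and report how much of the image changed.
public class WebcamMotion : MonoBehaviour
{
    public float threshold = 0.1f;   // per-pixel change needed to count (assumption)
    public float MotionLevel { get; private set; } // 0..1 fraction of moving pixels

    private WebCamTexture cam;
    private Color32[] previous;

    void Start()
    {
        cam = new WebCamTexture(160, 120); // low resolution keeps it cheap
        cam.Play();
    }

    void Update()
    {
        if (!cam.didUpdateThisFrame) return;

        Color32[] current = cam.GetPixels32();
        if (previous != null && previous.Length == current.Length)
        {
            int moved = 0;
            for (int i = 0; i < current.Length; i++)
            {
                // Cheap colour difference between frames (max 765 = 3 * 255).
                int diff = Mathf.Abs(current[i].r - previous[i].r)
                         + Mathf.Abs(current[i].g - previous[i].g)
                         + Mathf.Abs(current[i].b - previous[i].b);
                if (diff > threshold * 765f) moved++;
            }
            MotionLevel = (float)moved / current.Length;
        }
        previous = current;
    }
}
```

A rep counter, for instance, could just count MotionLevel rising above and falling back below a level.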
The toolkit is available for anyone to download and experiment with, but that is on a non-commercial licence. Obviously full licences need to be bought to do any more with the kit, but it certainly is worth looking at if you are interested in such things.

Yet more future arrived in the post – Raspberry Pi

This has been a good week: first my Makie arrived, and now the Raspberry Pi has arrived on my doorstep. I was a little behind ordering it, so ended up in one of the later waves. It is odd seeing it in the flesh, as I originally wanted to feature it nice and early on The Cool Stuff Collective because it is an important development. That didn’t work out. Still, here it is now.
Raspberry pi out of box
I suffered a few technical hitches though.
Raspberry pi time
Firstly I couldn’t find any USB keyboards, nor adapters for a PS/2 keyboard to USB. I thought I would risk my desktop wireless Logitech; though I figured that would be a non-starter, it was worth a go. Secondly my card reader was not happy with the 4gb flash card, so I ended up using my wife’s newer MBP as its built-in one worked fine. (You have to use a flash card to put the operating system on.) Having sorted that, I took the power supply out of the box to find it was a Euro plug, not a 3-pin UK one. The one travel adapter that works backwards didn’t. Again I was saved by my wife’s Blackberry charger with micro USB.
I followed some instructions that suggested that Fedora was the place to start, and got it up and running.
It's alive
After a bit of messing with the config.txt to fit the cheap HDMI TV I was plugging into, I was all set to get going, but it refused to accept any of the passwords I had set. After a quick Google I found that I was working from some out-of-date info and that the Debian instance of Linux was the one to use. The joy of open source is choice, but it is also its curse.
A few minutes later, a new image on the card and away we went.
Trying again :)
At the time our house was full of kids on holiday running around and screaming, so I did not do anything “serious”, but I had to try the Commodore 64 emulation to bring things full circle.
That's more like it, 64 pi
Next up is to finish the 3d printer (RepRap) and print a case for the RPi, and see what makes sense to let the predlets loose on. At the moment it is an operating system tinkerer's paradise: just install x with y and of course don’t forget z. Still, it all has to start somewhere.
A suitable kid-friendly instance that doesn’t need lots of arcane command lines known only to the Linux gurus, or need another machine constantly on hand to Google for best practice, will no doubt appear.
It’s a good start.

Join in NOW on government future of ICT plans

There is currently a government consultation on the future of ICT open to all to comment on. It is here. There is an online form to fill in with just 4 pages.
Any of you concerned with the future of computer science and technology education in the UK really should fill this in, firstly as an individual, but also try to get official responses from your organisations and employers.
All the comments do add to the weight of concern we all have.
My response was mainly that this is about people being able to show how the technology works, using easily accessible resources, not about a huge IT spending requirement. It is the passion of the tech industry, maker culture and experienced professionals that will make the difference. Eventually that will mean more technology and science related teaching.
It turned out I was response number 22, so I seem to have got in early. I would be interested to see what the numbers are like, as this is something almost everyone I know in the tech and games industry feels very strongly about. So let's try to add to the weight of the argument with some positive suggestions to the government.
(The BCS is heavily involved, hence us doing it as individuals as well; just mention you are BCS or IET, a STEMnet Advisor etc. in the form.)
I will be running for office before you know it 🙂

Use the Force (feedback) – Haptic Driving

For Christmas I got the wonderful new Mad Catz force feedback steering wheel and pedals for the Xbox 360. You have to be careful buying steering wheels, as many of them are just regular controllers, i.e. the haptic feedback from them is merely a rumble.
Mad catz FFB wheel
This wheel (like the discontinued Microsoft one before it, and along with the very expensive other ones) exploits the fact that the physics models in games are accurate enough to generate forces back out to devices. When you drive a real car, different surfaces and different speeds generate a different load and feel on your control of the car. A force feedback wheel emulates some of that. The subtlety of such control may not be important for many gamers, but for the sheer enjoyment of car racing of any sort it really is a must.
Dirt 3 generates some amazingly violent ruts and bumps as you hammer through the rally terrain, F1 2011 generates load on the corners and vibration in the dirty air of the cars in front, and Forza 4 becomes an even more sublime experience.
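As a purely illustrative sketch of the idea (the IWheelDevice interface, its SetForce method and the constants are all hypothetical stand-ins, not any real wheel API): a game's physics can derive a steering force from the tyre's slip angle, which is why the wheel loads up in a corner and goes light when the front tyres let go.

```csharp
using UnityEngine;

// Illustrative only: turning a physics model's tyre state into a force
// for a FFB wheel. Nothing here is a real device API.
public class ForceFeedbackSketch
{
    public interface IWheelDevice
    {
        void SetForce(float normalised); // -1..1, sign gives direction
    }

    // Self-aligning torque rises with slip angle up to the grip limit,
    // then falls away - the "wheel goes light" feel when you overdrive it.
    public static float SteeringForce(float slipAngleDeg, float load01,
                                      float gripLimitDeg)
    {
        float ramp = Mathf.Clamp(slipAngleDeg / gripLimitDeg, -1f, 1f);
        float over = Mathf.Max(0f, Mathf.Abs(slipAngleDeg) - gripLimitDeg);
        float falloff = Mathf.Exp(-over / gripLimitDeg);
        return ramp * falloff * load01;
    }

    // Called each physics step with the front tyres' current state.
    public static void Update(IWheelDevice wheel, float slipAngleDeg, float load01)
    {
        wheel.SetForce(SteeringForce(slipAngleDeg, load01, 6f)); // 6 degrees: made-up grip limit
    }
}
```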
Force feedback wheel
Another key element is the separation of control of speed and braking (with your feet), for which pedals allow a more obvious analogue travel, but even more important is the gear changing. Whilst I love driving games, I can never really be bothered with manual gear changes on joypads. This wheel has both flappy paddles and an interchangeable left or right gearstick. The gears are sequential, something most of us don’t have in cars but that works great in games.
This separation of control allows you to leave a car in a higher (or lower) gear to control traction with the accelerator. In the higher-end cars that spin out at a moment's notice, driving with the gears can help find the balance.
Whilst a sofa is not a Recaro seat, and other things are missing (actual speed and danger, lateral and vertical movement), it still elevates the experience.
I found my lap times suffered initially, but as I switched off my mental map of the joypad and patched in my driving brain they started to improve. I have already shaved seconds from the excellent “star in a reasonably priced car” rivals challenge on Forza 4.
Now, I have driven various fast car experiences, a bit of rally too and of course the higher-end simulators like Pure Tech that we visited in series 2 of Cool Stuff Collective, and I can say that this generation of console and force feedback wheel is pretty stunning, exciting and enjoyable.
We have of course had this sort of tech for ages; the PC fraternity have a variety of controls, and I had a Sidewinder stick in the 90s. However, as tech gets more accessible and the games designers take account of it, or the engines do that for them, some of our gaming experiences will be elevated even further.
The more the digital becomes physical and breaks that plane of the screen, distributes into real life the better.

Hassle your teachers to ask to learn to… program – programme

Happy new year everyone. n.b. I updated the title to use both program and programme, to avoid confusion over my not-so-subtle play on words 🙂 This New Year's Eve, over on CITV, was the last in this series of Cool Stuff Collective. It is running for the next month on ITV Player here.
It was notable for a double custard pie on the wall of fame, but aside from that we did a different future tech slot. I was intrigued as to what was going to make the edit, as I think we could have done an entire show with all the things Vicky and I went through.
I talked about robotics in 2012 and covered ASIMO, Nao and also exoskeletons. This was a general robotics discussion and its spin-offs, plus a small piece about artificial intelligence. We did this as a talking piece, with footage of the various things played in between, as they were going to be impossible to squeeze into the school. In transport I also talked about the Toyota car with a giant OLED customisable surface. All very big things.
I finally got to talk about maker culture too, and spin off from the open source ideas to things like Sugru that let you hack things better, i.e. physical hacking.

I have my headphone mute button that I enhanced a while ago, and I demonstrated altering a PS3 controller button using it. It is wonderful stuff and fits nicely into maker culture.
However the closing statements were my wish for the year. To paraphrase….
Education and teaching of computer programming needs to be done properly. We have got stuck with ICT, which is important, but it is about using computers, not building with them. All the gadgets we show and all the blockbuster games we talk about need to be built. So I want everyone to hassle their teachers to ask to be taught computer programming, as it is one of the most important skills that will be needed in the future. Without it we will not be making the next generation of gadgets or fantastic games. (That of course also relates to other business areas: banks, medical establishments etc.) Knowing how to build is important; knowing how things work, not just being a user, is important.
A much-used quote that was a bit too heavy for the show: “if you are not programming you are being programmed”.
I think 2012 will be a year when many more things drive this point forward, building on all the work people have been doing up to now to get this message across. So if we can keep pushing we may have a chance to save the economy and everyone's future, and have a rewarding career for people too. It's all good!
Anyway, that's this series done. I may do a new year retrospective on all the pieces of the puzzle that I have talked about this series, but it's all on this blog 🙂
It’s also a major part of my speaking engagements for the next few months at least.
More TV for me? Well I hope so; it is a great honour to be able to share all this stuff on all sorts of media. Let's see what 2012 brings (BTW I am my own agent at the moment 😉 )