CKD supercars

Further to my post on digital art, I spent a good few hours last night on a version of the Choi Kwang Do logo so that I could plaster it on my custom paint work cars in Forza 5 on Xbox One. This is quite an undertaking as this is not pixel art: various coloured stickers have to be manipulated to form the design. The font is not quite right, but as the cars are often fast moving it is the general impression that is important.
****UPDATE 13/12/13: added video
Here is a mini video clip of all the various pieces of sticker used in the design.

What I really need to prove is that the Drivatar (that is, my driving habits in an AI car) is actually using my design. This seems to be the case on the local machine for certain, as my Drivatar and car paint work were appearing in other people’s races on the console. So if you see a CKD logo on a Subaru, Mini, LaFerrari or Viper let me know 🙂 It could be that this is the next frontier of in-game advertising, rather like a tweet or retweet can become: it is no longer restricted to delivering your designs to people only when you race in multiplayer.
It is approaching 5 years since I first put the Feeding Edge logo out for a test run on a T-shirt and a few other places 🙂 Here are some of the older ones.
There is so much scope over and above banner ads and in game billboards to get a message across. Sounds like another use case I need to isolate and write up?

Art and technology

You may have noticed that whilst I am primarily a techie I do like to dabble and explore the potential of art with technology. I am well aware though, and often have to explain, that I am not a graphic designer or an artist. My creative toolkit is code and components. It may even be considered an unusual combination of applications, a sort of collage of code. Nearly everything needs some sort of interface though. I am constantly frustrated that my vision of what something should look like is often not quite what I manage to achieve with the tools I have to hand. However, I have often thought that with a limited number of pixels on a screen and a palette of colours, with enough time and effort it must be possible to make something close to perfect as an image. Of course the key there is time. It is really just a rewrite of the concept of an infinite number of monkeys with typewriters generating the complete works of Shakespeare!
I was struck, this week, by an iPad-painted image of Morgan Freeman that was doing the rounds of social media. I had only seen the finished image. Whilst everything was saying it was a painting, it seemed just too good, so I was amazed but skeptical. Then last night I saw the video below showing the stages of the painting captured as it happened. This is truly stunning.

It is the work of Kyle Lambert, and looking at his portfolio it is hard not to feel that he has reached a level very few other people ever will. What is great though is that he links to a lot of tutorials he has created and various articles, so he is not keeping this talent to himself.
I would have thought a picture of the quality of the Morgan Freeman one would have put me off ever trying, but instead I am more inspired to learn more about artistic techniques now and try to level up my coder art just a little bit.
Even tools like Forza 5 let us express ourselves artistically through the medium of the automobile. The tools (as in previous Forzas) are a decal-based approach, similar to Second Life prims, that you have to shape and skew, layering up to create the look you want.
I have now upgraded my Feeding Edge logo and predator face a bit more, and my flagship car (which was the Ferrari 430) is now the LaFerrari. The predator mask has always worked well (IMHO) on cars with front scoops like the Subaru Impreza.

However the LaFerrari has a brilliant cut out front that is even better. The back of the car has hardly any surface so I popped on a very small Feeding Edge logo under the Ferrari logo.
Ok so it’s not a 200 hour Morgan Freeman iPad painting, but it’s a start 🙂

Early adopting an Xbox one

Friday 22nd November was the release day of the Xbox One. As an early adopter and Xbox fan (though I own all the consoles) I had my preorder in with Amazon since May. I sometimes think I should try the midnight queueing in the cold at a store for the ultimate release day excitement, but instead I opt for the pain of waiting all day for a delivery to arrive. To be fair to Amazon it did arrive on day one, but only at 4pm.
I was joking around on the day that it was a race between @elemming, flying back from a work trip to Dubai, and the Xbox One arriving. @ids pointed out on twitter that this sounded like a Top Gear race 🙂
The Xbox actually arrived 20 minutes before @elemming, due to some flight delays out of Dubai caused by rain! However she had the last laugh in the race: having unboxed the console, plugged in the HDMI and audio cables and got it all going, I was stuck in the mandatory day one patch downloading sequence.
#xboxone won the Dubai race with @elemming by 20 mins or so
though @elemming may have lost the race with the delivery but she beat the mandatory software patch #xboxone
I have to say that this day one experience, or any day for that matter, was incredibly annoying. It is of course part of the point of early adoption that you end up waiting on progress bars. It, like waiting for a delivery, is a good exercise in patience!
My internet is slow; BT have still yet to provide Infinity to me despite this being an enabled area. I have had to leave work uploads to a code management system running for 9 hours. So seeing a brand new machine sat there demanding a mandatory 600Mb patch before it would do anything was strange. It did work for me though; it took a while, but then I was in the new dashboard. It plays a nice video to welcome you, and I wonder if that video was part of the 600Mb “patch”; if it was then I could have done without it!
The new Kinect 2.0 worked really well straight away as I signed in and picked up my Xbox Live account. I took the Day One giant QR code for Fifa 14 and the camera picked it up straight away and set about the digital download of the game. This of course was going to take ages too, as it was 10GB. Oh well, I thought I would leave it downloading in the background anyway. I had the discs of my preordered games: Call of Duty Ghosts, Ryse Son of Rome and Forza 5.
So, excitedly, now I had a brand new patched console, I put the disc in the drive. There it was, Forza 5. It started to install; no more playing from the disc, you have to fill up your hard drive. However I was met with a message that this also needed a mandatory patch. It then told me the “patch” was 6GB! With my 4Mbps line that would be about 4 hours. So I did not expect to be able to play anything on day one, as it was already late in the day. It seemed each game required a huge update for day one. I tried COD and Ryse and got the same message.
I left it for an hour or so and cooked tea, put the predlets to bed, etc.
I checked Google and a few people suggested a fix: turn off your internet!
So I did! I cancelled the Forza installation and started again. The game then installed from the disc and started up. I was greeted with the awesome introduction/training of getting to drive the McLaren P1 around a street circuit, feeling the new features of the controller with independent rumbling on the triggers. This means the wheel spin of acceleration or sliding feels different to the ABS judder of the brakes. I did a few races and was pleased with the depth of integration of Top Gear, especially the 3rd race around the Top Gear style “simulation” of London. I won’t spoil it for you, but if you like Top Gear/Clarkson/cars it’s brilliant.
I was eager to try each game now I was disconnected. Each disc needed to do its hard disk install. Those install times seemed rather slow but they worked.
Ryse, as a new game, was instantly visually impressive. It is a hack and slash game with a relatively simple dynamic, a bit like a 3D version of Streets of Rage or Double Dragon. That is a good thing BTW! It is rated 18 for its violence, unlike Call Of Duty, which is now a 16 for swearing and violence. You engage as a Roman centurion in sword and shield combat. Most of that occurs in real time. It relies on timing blocks, shield breaks and slashing with the sword. Eventually you wear down the opponent and a skull appears over their head. Then it is time for an execution move. Here, when you trigger it, everything goes into close up slow motion with lots of cinematic camera moves. The moves become QTEs (quick time events). A subtle colour flash around the victim indicates the yellow or blue buttons need to be pressed. Each time you do it the game speeds up and performs a suitable move. Before long you are seeing limbs lopped off, swords driven up, through and out of a victim, and various stomps onto a downed opponent. It sounds horrible, but it has a sort of boom, boom, bash, take that bad guy feel to it. The viciousness of this is very much in the style of 300 or Gladiator. I am not sure it needs to be an 18, though I have yet to complete the game. It is great fun though.
Finally I popped COD Ghosts in and was greeted with the usual things we see in the franchise. I think knowing that the exact same game was out on the previous generation meant I was not instantly blown away. However I have played all the COD single player stories all the way through since the original one. I know it will drag me in.
After getting to play a bit on day one, despite Microsoft’s best efforts to stop me with the patch sizes and the mandatory nature of those (which was clearly not necessary, as the games ran fine!), I thought I would reconnect to the net. I sparked it up and popped Forza 5 back in. I was expecting to trigger a background download, but instead it went back to saying I couldn’t play until 6GB was downloaded. This was surprising. The previous Xbox 360 did do patches, but they were always really small and quick, 5 minutes max. It was the Sony PS3 that annoyed me with its mandatory patches: you wanted to have a quick go on something and it would stop you and demand you patch, for hours. I double checked my Xbox packaging that Sony had not bought out the Xbox and applied this draconian patching to it!
I gave up and started all the downloads. It only does one at a time and queues the others. Obviously the pipe into the house is way too slow (come on BT, fix it please!!!!), but it seems to have a go at downloading one thing, then swaps to another in the queue. I am not sure if there is a setting (who can tell in the confusathon of the interface) to ask it to finish one thing first, but the end result after a night’s sleep was this.
Still updating after all night d/l ! Microsoft gone all sony on me #xboxone
That’s right, only Ryse Son of Rome had “completed”. Fifa was at 91% but was apparently able to start, Forza was at 93%, COD 97% and the demo of Kinect Sports only 28%. I checked the Xbox network settings and despite a 4Mbps line it was claiming 1Mbps. I guess the servers were somewhat slammed by a million Xboxes trying to patch!
I clicked on Fifa 14, as it was ready to start. It started, but then said it needed to patch some more, and dropped back to 90%!
I was pleased then to just shout at the Kinect “Xbox Turn Off” (and was prompted with a yes/no), so I said yes, definitely! The downloads still happen when the box is off, so I thought I would let the poor thing concentrate!
So I am not overly impressed with all this extra delay, which of course is compounded by horrendous internet speeds. I do not understand how, after all the online/offline fuss and reversals of policy, Microsoft managed to pick the worst of both worlds.
Anyway, back to Forza 5. It is all working really well now (I am just dreading any more game stopping patches).
Forza 5 has utterly stunning models of the cars with lots of information, voice overs from all the Top Gear team and much more. Back once again is the really enjoyable car painting. Each Forza generation I end up spending a bit of time adding to the cars. Forza 3 to 4 allowed an import of the vinyl sets, but this generation was a start-from-scratch moment.
Forza 5 #xboxone feeding edge livery scooby
I have once again created a predator style face, similar to but different from previous versions. It still works really well on the front air scoop of the scooby. I have my Feeding Edge company logo and g33k script work too.
One of the interesting things the Xbox One provides is recording of game clips. This used to be available only within particular games; now any game will record clips. If something interesting happens you can bark out to the Kinect “Xbox record that!” and it saves the last 30 seconds of gameplay. You can set up a recorder for longer, and you can still use Forza’s replay to go back and find the moment. Once you have the moment recorded you can go to Upload Studio (another app you have to download!) and do some basic editing. This includes being able to record a voice over or a picture-in-picture combination. It is going to be interesting to see what people do with this. I had a scary near miss at Spa (I have full damage turned on in Forza 5, so this would have been race ending if I had hit the wall). I also added the “night vision” intro template as a predator style reference.

All the cars in Forza 5 are supposed to be not just in-game AI but Drivatars. When you play it collects data on how you drive, your style and pace. It then sends “you” to other games when you are not racing online. It seems a lot of early Drivatars want to brake going up the hill at Eau Rouge. It has a blind crest at the top, and the compression on the slight right hander leading up to it unsettles the car, but don’t brake so much 🙂
I think the Drivatars may take their paint jobs with them too. There is an option to turn off accepting paint work. That means, in theory, when I am not driving, my Feeding Edge company car with my epred and company logo is busy turning up in people’s living rooms around the world. Up until now that had only happened if I took my car into an online race. I expect some more advercar Drivatars will start appearing very soon.
There is a great video here of a real car on the real track hitting that corner and nearly losing it about 40 seconds in. I think it shows how far simulations are getting, yet still being playable.

Finally, Fifa 14. We are not really a football family, though Predlet 2.0 is growing to love footy. Fifa 14 (and I have had a lot of other Fifa games) is really slick. The players seem to be a much more complicated physical model, as running into one another, rather than just slide tackles, can cause all sorts of fouls. We have had some great games as 2.0 has got the hang of the mechanics of Fifa, mostly from playing the iPad version. Predlet 1.0 is also getting to grips with it. It’s a great game, and was free with the Xbox One Day One edition. That was a good call; it is just a pity they didn’t pre-install it so that you could plug and play in seconds.
There is a lot more to experience and figure out. Do I bother plugging the Sky box into the HDMI pass through, so the Xbox acts as a pre-Sky filter? I am not sure that gives me much, especially as each box in my set up has an optical output to the 5.1 speaker amp. There is a “beta” setting on the Xbox to enable surround sound to be passed through, but that means it is going to have to come down HDMI, get processed by the Xbox and be pumped out. It sounds like a lot of messing around, though it would remove the channel selection on the AV receiver. However it also means the Xbox has to be turned on all the time the TV is? Oh well, I’d best go and experiment. Feeding Edge, taking a bite out of technology so you don’t have to 🙂

Xbox One Kinect Two

The title almost sounds like a football score but we are very close to the release of the next wave of consoles. I am particularly interested in the Xbox One for a number of reasons.
1. Most of my gaming has gravitated towards the 360 so I am geared up for the next gen franchises like the fantastic looking Forza 5.
2. The Kinect 2.0 has some features that are going to be really useful for any Choi Kwang Do applications.
3. Unity 3d has a tie in with Xbox One development.
4. Proper cloud processing; utility computing looks like it is part of Microsoft’s plan. Not just streaming games from elsewhere, but farming off processing to large servers and delivering results back. (Grid computing, as we used to call it 🙂 )

As a Unity developer, whilst much of what I do is private rather than publicly available, I am really interested in being able to deploy to the Xbox One. It opens up a lot of possibilities from a research point of view and may lead to some extra commercial work.
I have applied for the ID@Xbox scheme, which is to help developers get on board with the Xbox One. Eventually any Xbox One will be a potential piece of development kit, which is great news, but at the moment they are still in the old console model of needing a special Xbox One to develop on. Unity3d have announced free versions of Pro to go with those kits. As a Pro licence owner already, I really just need access to the kit.
In particular, you can see the difference in how the Kinect 2.0 can deal with the human form in this video.
***fixed the link as it was the wrong Kinect video 🙂

It has a richer skeleton, complete with real shoulders, but also the tools already in place to look at weight distribution, muscle tension, limb acceleration and, interestingly, heart rate from the face.
You can see that this provides me with a whole lot more tech vocabulary to be able to analyse what we do in Choi Kwang Do and provide a training aid, or training mirror. Compare this to where I have got to with the Kinect 1.0, as in this previous post (one of my virtual/gaming technology use case examples).
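As a rough illustration of the kind of measurement this vocabulary enables (purely my own sketch, not Kinect SDK code, and the frame rate and coordinates below are made up), limb acceleration can be estimated from successive tracked joint positions using finite differences:

```python
# Hypothetical sketch: estimating the acceleration of a tracked joint
# (e.g. a fist) from three consecutive Kinect-style 3D positions,
# using finite differences. Positions are assumed to arrive at a
# fixed frame rate (the Kinect delivers roughly 30 fps).

def velocity(p0, p1, dt):
    """Average velocity vector between two 3D positions."""
    return tuple((b - a) / dt for a, b in zip(p0, p1))

def acceleration(p0, p1, p2, dt):
    """Acceleration from two consecutive velocity estimates."""
    v01 = velocity(p0, p1, dt)
    v12 = velocity(p1, p2, dt)
    return tuple((b - a) / dt for a, b in zip(v01, v12))

FPS = 30.0
DT = 1.0 / FPS

# Three consecutive (x, y, z) samples of a fist punching forward (metres)
samples = [(0.0, 1.2, 0.5), (0.0, 1.2, 0.6), (0.0, 1.2, 0.8)]
a = acceleration(*samples, DT)
```

The same differencing, applied across all joints, would be one crude way to spot whether a technique accelerates through the target or slows before impact.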
I am not sure if I meet Microsoft’s requirements to be called a developer, but then most of what I do never fits on any of these forms that I have to fill in 🙂 If I do, and I get access to a dev kit, that is great. Either way this is much more useful than the alternative platforms, so I am hopeful.
As it is, my Xbox One has been on preorder since they were launched, so I am hoping the post delivers it promptly on 22nd November, just 10 days away. That is the fun side of playing, but as ever it is also part of learning and understanding the world of technology.

Rocksmith 2014 – Now that is clever learning software!

A few days ago a new version of my favourite console application arrived: Rocksmith 2014. I had seen the press release video of some of the new things in Rocksmith and they did look interesting. For once my expectations have been exceeded. Rocksmith 2014’s session mode is probably the best piece of software I have ever seen or used. That is a pretty big claim to make, but it makes playing the guitar just feel right.
They have improved a lot of other things about the regular tune learning too. Just in case you are not aware, this is a guitar “game” where you plug a real electric guitar into your console/PC, and the device acts both as an amp and as a guide. It knows the notes you are playing.
So to learn a tune, the notes stream at you, indicating the fret and the string (colour coded as well as positional). If you hit the note, great; it will start adding more notes and chords in to level you up. Miss and it will start to make life simpler. This is with real music, the original versions, in every style under the sun.
The new version provides challenges in each song and coaches you through. There is less focus on the score, though score attack mode still exists. It is about the playing, and about it feeling and sounding right.
The previous version had a way to isolate a section or a riff and repeat it, first slowly then speeding up. The new version has this, but with lots more options: the number of repeats, the initial speed, the level jumps etc., all easy to adjust. It also lets you hit a button whilst playing the whole song and jump to riff repeating wherever you are in that song, something that was awkward to navigate to before.
All the old DLC from the previous version is ported to the new version, though… they rather cynically charge you another £7 or so to do so. If you have bought a lot of tunes then I would have expected some loyalty bonus. The additional tunes are now also £1.99, which seems steep, but then they are not just the music but the structure of the song and how to learn it. If you think how much guitar lessons might cost, it is minimal really.
The real extra star of the show though is session mode. However it has been built, it provides, to me at least, a fantastic backing band for any guitar noodling I feel like doing.

I love playing the blues scale. In the past I have tried playing along with famous tunes on CDs and tapes; obviously that never gets to the standard of the stars of the blues world playing those tunes. I have also had backing tracks with books etc.; they are fine if you are trying a fixed thing to play. Session mode lets you load a band, in lots of styles and types. Blues has 5 or 6 specific types on its own, let alone Rock, Metal, Indie etc. In the set up you can pick a key, a scale and things around tempo and the relaxed or rigid nature of your backing band.
Session Mode
Then you start to play. Initially the band are doing nothing, but the first few notes on the scale you have chosen (which is highlighted on the fretboard on screen) start to set the band in motion. The drummer and bass kick in at the sort of pace and intensity you start with. Even just playing the scale slowly gets them going. Before long you find a little riff to play and maybe some chords, and before you know it the other instruments, keyboards and guitars, are joining in, filling your room, in this case, with the blues. If you have set it to a more formal structure the band will start making the changes as in a 12 bar blues, and the onscreen scale suggests the notes that work in those step changes. Of course you can play what you want. Speed up or slow down and the band goes with you, and it doesn’t just feel like a load of loops being triggered. There seems to be a more complex analysis of the notes you play that gets the band doing their thing. You can almost nod to them when you want to step it up or hit a quiet solo. It is stunning.
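I can only guess at what the game actually does internally, but one building block of that kind of note analysis is easy to sketch: checking whether a played note belongs to the chosen scale. This is just standard music theory (the minor blues scale is the intervals 0, 3, 5, 6, 7 and 10 semitones above the root), not Ubisoft’s algorithm:

```python
# Hypothetical sketch of one ingredient of a session-mode style
# analysis: does a detected note belong to the chosen scale?
# The minor blues scale is the set of pitch classes {0, 3, 5, 6, 7, 10}
# measured in semitones from the root note.

NOTE_NAMES = ["A", "A#", "B", "C", "C#", "D", "D#", "E", "F", "F#", "G", "G#"]
MINOR_BLUES = {0, 3, 5, 6, 7, 10}  # intervals in semitones from the root

def in_scale(note, root, scale=MINOR_BLUES):
    """True if `note` falls in `scale` built on `root` (names as above)."""
    interval = (NOTE_NAMES.index(note) - NOTE_NAMES.index(root)) % 12
    return interval in scale

# The notes of the A minor blues scale: A, C, D, D#, E, G
print([n for n in NOTE_NAMES if in_scale(n, "A")])
```

A real implementation would of course also have to detect the pitch from the guitar signal and weigh timing and phrasing, but scale membership like this is presumably one signal the backing band could react to.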
I am not a musician by any stretch of the imagination, but I have tried guitar many times. The previous Rocksmith taught me a lot, something it continues to do in learning songs. However session mode is the delight. It is a very clever, very patient, non judgemental band. Experimenting with riffs and scales in lots of styles is truly enlightening and relaxing. It is not about trying to hit a note, keep up, make a score or copy a sound, but instead finding self expression through making music.
It is not without its downsides, but one does have to suffer for one’s art, darling.
Ow #rocksmith2014
My fingers have now hardened a bit again, you just have to soak blisters in lemon juice and keep playing on them.
There was something else very cool about 2014: it is even better for 2 guitars now. Predlet 2.0 plugged his little electric guitar in and we jammed in session mode. He explored what happened as he hit notes fast and slow, and we even did a bit of riff turn taking. A right noise for everyone else in the house, but we had fun. We then moved on and made more racket playing Billy Idol’s White Wedding, which is a bit of a fave tune of ours 🙂
I am not on their payroll, just a very enthusiastic fan. If you like guitar you have to get this!

Flush Magazine 10 – Airbags and ice skating

I took inspiration from @elemming’s recent fall ice skating, in which she broke her wrist, for this month’s article in Flush Magazine. It seemed a few things came to mind around safety, and in particular the use of airbags in all sorts of places.
You can read it and see the really nice layout; once again a great job making my words and ideas look awesome on the page. Thank you again @tweetthefashion for all the hard work getting this issue out there.
The direct link is here


As usual it’s quite a mix but does share a common theme: Mars landings, motorbikes and ice 🙂
I hope you enjoy the article, and the rest of the excellent magazine.

Use Case 2 – real world data integration – CKD

As I am looking at a series of boiled down use cases for virtual world and gaming technology, I thought I should return to the exploration of body instrumentation and the potential for feedback in learning a martial art such as Choi Kwang Do.
I have of course written about this potential before, but I have now built a few extra things into the example, using a new Windows machine with a decent amount of power (HP Envy 17″), the Kinect for Windows sensor with the Kinect SDK, and a Unity3d package.
The package comes with a set of tools that let you generate a block man based on the joint positions. The controller piece of code has some options for turning on the user map and skeleton lines.
In this example I am also using Unity Pro, which allows me to position more than one camera and have each of those render to a texture on another surface.
You will see the main block man appear centrally “in world”. The three screens above him show a side view of the same block man, a rear view and, interestingly, a top down view.
In the bottom right is the “me” with lines drawn on. The Kinect does the job of cutting out the background. All this was recorded live, running in Unity3d.
The registration of the block man and the joints isn’t quite accurate enough at the moment for precise Choi movements, but this is the old Kinect; the new Kinect 2.0 will no doubt be much better, as well as being able to register your heart rate.

The cut out “me” is a useful feature, but you can only have that projected onto the flat camera surface; it is not something that can be looked at from left or right. The block man, though, is made of actual 3D objects in space. The cubes are coloured so that you can see joint rotation.
I think I will reduce the size of the joints and try to draw objects between them to give him a similar definition to the cutout “me”.
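To give a flavour of the sort of measurement the block man makes possible (my own hypothetical sketch, not the Kinect SDK’s actual API; the joint names and coordinates are illustrative), the angle at a joint such as the elbow can be computed from three tracked positions, which could help judge the extension of a punch:

```python
# Hypothetical sketch: computing the angle at a joint (e.g. the elbow)
# from three tracked 3D positions, shoulder -> elbow -> wrist.
# A nearly straight arm gives an angle close to 180 degrees, so this
# could be one way to measure how fully a punch extends.
import math

def joint_angle(a, b, c):
    """Angle at point b (degrees) formed by segments b->a and b->c."""
    v1 = [a[i] - b[i] for i in range(3)]
    v2 = [c[i] - b[i] for i in range(3)]
    dot = sum(x * y for x, y in zip(v1, v2))
    mag = math.dist(a, b) * math.dist(c, b)
    return math.degrees(math.acos(dot / mag))

shoulder, elbow, wrist = (0, 1.4, 0), (0.3, 1.4, 0), (0.3, 1.4, 0.3)
print(round(joint_angle(shoulder, elbow, wrist)))  # a 90-degree bend
```

The same calculation from the top down camera’s point of view is exactly the sort of perspective a class instructor cannot easily get.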
The point here though is that game technology and virtual world technology are able to give a different perspective on a real world interaction. Seeing techniques from above may prove useful, and is not something that can easily be observed in class. If that applies to Choi Kwang Do then it applies to all other forms of real world data. Seeing from another angle, exploring and rendering in different ways, can yield insights.
It is also data that can be captured and replayed, transmitted and experienced at a distance by others. Capture, translate, enhance and share. It is something to think about: what different perspectives could you gain on data you have access to?

A simple virtual world use case – learning by being there

With my metaverse evangelist hat on, I have for many years, in presentations and conversations, tried to help people understand the value of using game style technology in a virtual environment. The reasons have not changed, they have grown, but a basic use case is being able to experience something, to know where something is or how to get to it, before you actually have to. The following is not to show off any 3D modelling expertise; I am a programmer who can use most of the tool sets. I put this “place” together mainly to figure out Blender, to help the predlets build in things other than Minecraft. With a new Windows laptop complementing the MBP, I thought I would document this use case by example.
Part 1 – Verbal Directions
Imagine you have to find something, in this case a statue of a monkey’s head. It is in a nice apartment. The lounge area has a couple of sofas leading to a work of art in the next room. Take a right from there and a large number of columns lead to an ante room containing the artefact.
What I have done there is describe a path to something. It is a reasonable description, and it is quite a simple navigation task.
Now let’s move from words, or a verbal description of placement, to a map view. This is the common one we have had for years: top down.
Part 2 – The Map
A typical map; you will start from the bottom left. It is pretty obvious where to go: two rooms up, turn right and keep going and you are there. This augments the verbal description, or can work just on its own. Simple, and quite effective, but it filters a lot of the world out in simplification, mainly because maps are easy to draw. It requires a cognitive leap to translate to the actual place.
Part 3 – Photos
You may have often seen pictures of places to give you a feel for them. They work too. People can relate to the visuals, but it is a case of you get what you are given.
The entrance
The lounge
The columned corridor
The goal.
Again, in a short example this allows us to get quite a lot of place information into the description. “A picture paints a thousand words”. It is still passive though.
A video of a walkthrough would of course be an extra step here; that is just more pictures, one after the other. Again though it is directed. You have no choice in how to learn, how to take in the place.
Part 4 – The virtual
Models can very easily now be put into tools like Unity3d and published to the web to be walked around. If you click here, you should get a Unity3d page, and after a quick download (assuming you have the plugin 😉 if not, get it!) you will be placed at the entrance to the model, which is really a 3D sketch, not a full on high end photorealistic rendering. You may need to click to give it focus before walking around. It is not a shared networked place, it is not really a metaverse, but it has become easier than ever to network such models and places if sharing is an important part of the use case (such as in the hospital incident simulator I have been working on).
The mouse will look around, and ‘w’ will walk you the way you are facing (‘s’ is backwards, ‘a’ and ‘d’ side to side). Take a stroll in and out, down to the monkey and back.
I suggest that now you have a much better sense of the place: the size, the space, the odd lighting. The columns are close together; you may have bumped into a few things. You may have lingered on the work of art. All of these tiny differences are putting this place into your memory. Of course finding this monkey is not the most important task you will have today, but apply the principle to anything you have to remember, conceptual or physical. Choosing your own way through such a model or concept is simple but much more effective, isn’t it? You will remember it longer and maybe discover something else on the way. It is not directed by anyone; your speed, your choice. This allows self reflection in the learning process, which reinforces understanding of the place.
Now imagine this model made properly, with nice textures and lighting, a photorealistic place, and pop on a VR headset like the Oculus Rift, which in this case is very simple with Unity3d. Your sense of being there is enhanced even further, and it only takes a few minutes.
It is an obvious technology, isn’t it? A virtual place to rehearse and explore.
Of course you may have spotted that this virtual place, whilst in Unity3d to walk around, provided the output for the map and for the photo navigation. Once you have a virtual place you can still do things the old way if that works for you. It’s a virtual virtuous circle!

Dear BBC I am a programmer and a presenter let me help

I was very pleased to see that Tony Hall, the new DG of the BBC, wants to get the nation coding. He plans to “bring coding into every home, business and school in the UK”. http://www.bbc.co.uk/news/technology-24446046
So I thought, as I am lacking a full time agent in the TV world, I should throw my virtual hat into the ring and offer to work on the new programme that the BBC has planned for 2015.
It is not the first time I have offered assistance to such an endeavour, but this is the most public affirmation of it happening.
So why me? Well, I am a programmer and have been since the early days of computing shows on TV, back in the ZX81/C64/BBC Model A/B/Spectrum era. I was initially self-taught through listings in magazines and general tinkering before studying to degree level, then pursuing what has been a very varied career, generally involving new tech at each step of the way.
I was lucky enough to get a TV break with Archie Productions and the ITV/CITV show The Cool Stuff Collective, well documented on this blog πŸ˜‰ In that I had an emerging technology strand of my own. The producers and I worked together to craft the slot, but most of it was driven by things that I spend my time sharing with C-level executives and at conferences about the changing world and maker culture.
It was interesting getting the open source Arduino, along with some code, on screen in just a few short minutes. It became obvious there was a lot more that could be done to help people learn to code. Of course, these days we have many more ways to interact too. We do not have to stick to just watching what is on screen; that can act as a hub for the experience. Code, graphics, artwork, running programs and so on can all be shared across the web and social media. User participation, live and in sync with on-demand content, can be very influential. Collections of ever-improving assets can be made available, then examples of how to combine them put on TV.
We can do so much with open source virtual worlds, powerful accessible tools like Unity3d and, of course, platforms like the Raspberry Pi. We also have a chance to explore the creativity and technical challenges of user-generated content in games, and of next-gen equipment like the Oculus Rift. Extensions to the physical world with 3D printers, augmented reality and increasingly blended reality offer scope for innovation and invention by the next generation of technical experts and artists. Coding and programming is just the start.
I would love to help; it is such an important and worthy cause for public engagement.
Here is a showreel of some of the work.

There is more here and some writing and conference lists here
So if you happen to read this and need some help on the show get in touch. If you are an agent and want to help get this sort of thing going then get in touch. If you know someone who knows someone then pass this on.
This blog is my CV, though I do have a traditional few pages if anyone needs it.
Feeding Edge, taking a bite out of technology so you don’t have to.
Yours Ian Hughes/epredator

Talking heads – Mixamo, Unity3d and Star Wars

High end games have raised people’s expectations of any experience they take part in that uses game technology. Unity3d lets any of us build a multitude of applications and environments, but it also exposes us to the breadth of skills needed to make interesting, engaging environments.
People, avatars and non-player characters are among the hardest things to get right. The complexity of building and texturing a mesh model of a person is beyond most people. Once built, the mesh then has to have convincing bone articulation to allow it to move. That in turn leads to needing animations and program control of those joints. If it is complicated enough with a body, it gets even trickier with the face. If a character is supposed to talk up close in an environment, the animations and structure required are even more complex. Not only that, but if you are using audio, the acting or reading has to be convincing and fit the character. So a single avatar needs design, engineering, voice-over and production skills all applied to it. Even for people willing to have a go at most of those trades and skills, that is a tall order.
So it is great that companies like Mixamo exist. They already have some very good free rigged, animatable people in the Unity store, which help us small operators get to some high-end graphic design quickly. They have just added to their portfolio of cool things with Mixamo Face Plus.

They have a Unity plugin that can capture realtime face animation using video from a webcam. So now all techies have to do is figure out the acting skills and voice work to make these characters come alive. I say “all” πŸ™‚ it is still a mammoth task, but a few more tools in the toolbox can’t hurt.
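Mixamo do not publish their internals, but the general technique behind webcam-driven face animation can be sketched. A tracker measures facial landmarks each frame, and those measurements are normalised into blendshape weights (0 to 1) that deform the face mesh. Here is a hypothetical Python illustration of that normalisation step, assuming you have calibrated neutral and maximum landmark distances beforehand:

```python
def blendshape_weight(distance, neutral, maximum):
    """Normalise a tracked landmark distance (e.g. the gap between the lips)
    into a 0..1 blendshape weight, given calibrated distances for the
    neutral (closed) and maximum (fully open) expressions."""
    if maximum <= neutral:
        raise ValueError("maximum must exceed the neutral distance")
    weight = (distance - neutral) / (maximum - neutral)
    return max(0.0, min(1.0, weight))  # clamp to the valid weight range

# Each frame, the tracker's measurements drive the face rig.
# Hypothetical lip-gap readings in millimetres over four frames:
frame_distances = [2.0, 5.0, 8.0, 12.0]
weights = [blendshape_weight(d, neutral=2.0, maximum=10.0) for d in frame_distances]
# weights: [0.0, 0.375, 0.75, 1.0]
```

In a real pipeline there is one such weight per expression channel (jaw open, brow raise, smile and so on), plus smoothing over frames, but the core idea is this simple per-frame mapping from measurement to weight.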
They have created a really nice animated short film using Unity which shows the result of this technology, blended with all the other clever things you can do in Unity3d. Mind you, take a look at the number of people in the credits πŸ™‚

Even more high-end, though, is this concept video using realtime, high-quality rendered characters and live performance motion capture in the Star Wars universe.

The full story is here, with direct quotes from Lucasfilm at a Technology Strategy Board meeting at BAFTA in London. So maybe I will be able to make that action movie debut after all. There is of course a vector here to consider: the interaction of humans across distances, mediated by computer-generated environments (or virtual worlds, as we like to call them πŸ˜‰).