Get the next generation educated – making games

Today NESTA published a report, written by Ian Livingstone OBE and Alex Hope, into the educational and institutional shortcomings and development needs to ensure that ‘the UK can be transformed into the world’s leading talent hub for video games and visual effects.’ I realise the UK angle is rather parochial, but it is important for all countries and organisational units to pay attention to this.
Nesta Next Gen
(picture from NESTA, with Little Big Planet copyright Sony Computer Entertainment)
The full report is here.
The report attracted my attention with my BCS animation and games development group hat on. The BCS (formerly called the British Computer Society), as an IT professional organisation, seeks to offer an umbrella for the growth of skills and the nurturing of talent and interest in the technology sector. The BCSAGD group is trying to help promote awareness of the depth of skills and talent in the games industry, as much as it is trying to help the games industry map to what goes on in the apparently more serious IT industries.
Having worked across all these industries, and now also encouraging kids into technology via The Cool Stuff Collective TV show, I know that whilst games have some specific challenges, the core of the industry has the same profile as any computing-based industry. The challenges of design, code, test and run, and having to deal with constant disruptive innovation, apply to all industries. Games benefit, though, from being things that people can see and create. You don’t teach music by showing the inside of a piano, but it’s great to know how it works to get the best out of it.
I think the reach of the BCS into traditional games industries is pretty low, yet when you read this report’s findings you can’t help but notice we are all heading for the same noble goals.

The following is an extract of the key recommendations from the above report.

Schools

Bring computer science into the National Curriculum as an essential discipline.
Sign up the best teachers to teach computer science through Initial Teacher Training bursaries and ‘Golden Hellos’.
Use video games and visual effects at school to draw greater numbers of young people into STEM and computer science.
Set up a one-stop online repository and community site for teachers for video games and visual effects educational resources.
Include art and computer science in the English Baccalaureate. Encourage art-tech crossover and work-based learning through school clubs.
Build a network of STEMNET and Teach First video games and visual effects Ambassadors.
Introduce a new National Video Games Development and Animation Schools Competition.
Design and implement a Next Generation of Video Games and Visual Effects Talent Careers Strategy.
Provide online careers-related resources for teachers, careers advisers and young people.

Universities, Colleges and Vocational education
Develop kitemarking schemes, building on Skillset accreditation, which allow the best specialist HE courses to differentiate themselves from less industry-relevant courses.
HEFCE should include industry-accredited specialist courses in their list of ‘Strategically Important and Vulnerable’ subjects that merit targeted funding. Industry commits to these courses through industrial scholarships and support for CPD for lecturers.
Raise awareness of the video games and visual effects industries in the eyes of STEM and arts graduates.
Give prospective university applicants access to meaningful information about employment prospects for different courses.
Develop a template for introducing workplace simulation into industry-accredited video games and visual effects courses, based on Abertay University’s Dare to be Digital competition.
Leading universities and FE colleges sponsor a high-tech creative industries University Technical College (UTC), with clear progression routes into HE.
Kitemark FE courses that offer students the best foundation in skills and knowledge to progress into Higher Education.

Training and continuous professional development
Skillset Creative Media Academies and e-skills UK’s National Skills Academy for IT to work with industry to develop specialist CPD training for video games and visual effects industries.
Support better research-oriented university-industry collaborations in video games and visual effects.
Continue to treat the 18 visual effects occupations on the Government’s shortages list as shortage occupations.

So take a look at the report and see if there is anything, either as a BCS member, as a teacher, as a volunteer, as a parent or as a leading employer that you can do to push these goals.

Having seen the enthusiasm of my kids (the predlets) for creating with technology at a young age we really should level up the sort of education opportunities we give young people from the earliest of ages. Scary for the teachers I am sure but there are lots of us out here to help.

3d printing in the desert, stand alone.

You cannot fail to be impressed with the innovation and future potential of this.

Markus Kayser – Solar Sinter Project from Markus Kayser on Vimeo.

@asanyfuleno spotted it on the Llewblog this weekend.
We had been talking about 3d printing and modelling things in situ in unusual environments. Then this popped up.
When I talk about 3d printing to people and suggest that it can be distributed across the planet to solve all sorts of issues, the challenge of power and raw materials is usually brought up. Here Markus Kayser powers everything with the sun: a bit of electricity to drive the steppers, and the focused power of the sun to melt the abundant raw material of sand. This is also combined with some human intervention. Tools do not have to be completely automated if a quick bit of hands-on solves the problem.
It’s like a real life Minecraft crafting table!

Metameets 2011 – part 1 ( of n )

This year’s Amsterdam-based Metameets had a diverse and intriguing group of people present and attending online. It was also running in parallel with the 48-hour machinima festival MMIF. When I was asked if I would moderate the event I did not take much convincing, and I am very glad that I did. MC’ing an event can be done in two ways: stand up, introduce the speakers and sit down again, or pay attention to it all and try to explore some threads in what has been said. I tried to do the latter, which meant paying total attention all the time. This is mentally tiring, but balanced by the fact that everything, without exception, was intensely interesting!


One of the key things about Metameets is the diversity and clashing of worlds, and some new perspectives. This is the vision that Joja brings in organising and gathering us all for this event. Our two opening keynotes started that ball rolling: a corporate HR catalyst for change and a legendary games developer.


The opening speaker was Ian Gee from Nokia. He is new to virtual worlds and was introduced to them by Tim Goree. Tim is a fellow corporate metaverse evangelist, so it is great to see people like Ian getting the message and taking it somewhere new. Ian is Director of Organisational Development at Nokia. He spoke as an observer of the pioneering and exploratory directions many of the people at Metameets take, and of the responsibilities and drivers we all have to explore. He was very much of the opinion that change needs to be radical. It is the “let’s do things a different way” attitude many virtual world people have that attracted him to us, as it gels with his challenges to organisations: why is the working day an 8-hour block, why is the management structure a pyramid, and so on. Ian also described the avatar representations and life-altering experiences many of us are having as a gentle form of metanoia.

Next up was Noah Falstein. He is a long-time game designer with a fantastic reputation, and he did not disappoint with his presentation either. It is rare that “pure” games development gets anywhere near virtual worlds. Many virtual worlds people are not gamers; their attraction was not via games. Many games people don’t understand why virtual world interaction works or what the point of bothering is. Noah is that rare crossover. The early part of the talk really put it all in perspective around evolution: how early we really are in the whole scheme of things. He used many unusual organic pictures of long-dead creatures to set the scene. He then dived into showing everyone Habitat from 1985. This was one of the first environments to use home computers (the Commodore 64) and a modem to connect people with a graphical virtual world, in this case a 2d sprite-based one. However, Noah pointed out that back in 1985 the adverts for the game had to fully explain that it was real people on the other end of the game. We had not advanced enough, or had the technical and cultural awareness, to know that sort of thing was possible. He talked about the leap this made in people’s understanding and expectation, the subsequent evolution(s) of virtual worlds and games, and how some evolutionary avenues are dead ends while others combine into much more. He has a stealth startup and was not able to share exactly what he was doing with that, but I got the impression that it was something of both a AAA game lineage and a virtual world/social interaction/UGC project. That is going to be very exciting (and I am sure it will be, whatever it is. Just look up his games CV!)

So we had two presentations on seismic changes, both making analogies to evolution and exploration combined with a risk-taking, pioneering attitude. So I knew we were off on the right track as a conference.


We then had some focus in the agenda on Opensim. This was great, as it was scheduled as some technical discussion for the techies in the room and some business discussion for those who just want to use virtual worlds like Opensim.

First up was Justin Clark-Casey, a core Opensim developer known to many of us. I know Justin from back in the IBM days, and we also live in the same area. It was good to see him get up and explain his vision of where Opensim is going. Much of this was focused on the gradual evolution to the generic interconnected hypergrid, where your own Opensim is able to interact with everyone else’s. This does happen now, but architecturally the hand-over of you to another sim is evolving, and Justin could see a time when certain common services (akin to DNS) would be available to broker this, and a gap in the market for them. It shows just how complicated evolving a grid is, and the potential security implications and stability problems. At the same time it all looked like it was going in the right direction.

(Picture from JustinCC’s presentation)

Next Ilan Tochner from Kitely stepped up to discuss his company’s approach to creating virtual worlds on demand. In this case they use Opensim, though the principle can be used to drive any appropriate platform. Ilan’s vision is to be the telecom company of the virtual world, providing some middle ground of communication/hosting etc., though without the rip-off data charges 🙂 The principle is that you go to the site and press a button (a large one to attract your attention); this provisions a virtual world for you. You then dive into it and invite others. When you don’t need it anymore you either park it (in storage, which is very cheap compared to leaving a server running) or delete it. There are a few companies trying this, but Kitely and Ilan are very active in driving standards and supporting the growth of the industry with a more open approach. Ilan had some interesting directions for the viewers too: mirroring what happened with the web and mobile in the early days, and moving to transcoding of any viewer code to a suitable renderable platform. Plugin- and client-free access. You can see why this would appeal both to users (no messing around) and to a service provider like Kitely. If virtual world y suddenly appears, you need to be able to provision a y and get users logged on with y’s client. We had lots of back-room chat about standards, common approaches, and generic versus specialist interfaces. All good, and the right sort of conversations to have.

So there was a clear link and need here to push certain functions and responsibilities (like a generic avatar service?) into a real layer. From both a service provider of generic virtual worlds and a developer of the core of one, the requirement and the evolutionary path were similar.

Coming up: education, AR and the path to the future to round off Day 1!

Real to Real world Augmented Reality

As you may have noticed I think we can do more with Augmented Reality than simple data layers on top of the real world. The principles and technology of layering anything from anywhere make some things possible that can really help us.
An example of this concept is the problem of driving along the motorway behind a massive great lorry or van. You keep your distance, yet you have no real awareness of what’s happening just in front.

We already have a form of augmented reality in the rear-view mirror, giving us situational awareness of what is behind.
So what would happen if, whilst in your car, you could have an augmented reality forward view, a HUD that showed you what was in front of the obstruction using that vehicle’s view of the world? This is an augmented reality application, but not a traditional one. It is a magic lens, but it is real world to real world.
It is not without problems of course (how big, how much screen), but it can be based on peer-to-peer networking between the vehicles. It also has the benefit that when in a traffic jam there is no reason not to zoom your view forward, hopping from car to car until you get to the problem in the road.
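The car-to-car hopping boils down to walking a chain of peers, each of which knows the vehicle directly ahead of it. Purely as a toy sketch of that idea (all names are hypothetical, and in a real system each hop would hand back a video stream rather than an id):

```javascript
// Walk forward from a starting vehicle, collecting each peer ahead in turn
// until we reach the front of the chain (the vehicle at the incident).
function forwardViewChain(vehicles, startId) {
  const byId = new Map(vehicles.map(v => [v.id, v]));
  const chain = [];
  let current = byId.get(startId);
  while (current && current.ahead) {
    current = byId.get(current.ahead);
    if (!current) break; // peer dropped out of range
    chain.push(current.id); // stand-in for subscribing to that vehicle's camera
  }
  return chain;
}

const traffic = [
  { id: "you", ahead: "lorry" },
  { id: "lorry", ahead: "car3" },
  { id: "car3", ahead: null }, // the vehicle at the front of the jam
];
// forwardViewChain(traffic, "you") → ["lorry", "car3"]
```

The interesting engineering is of course in the parts the sketch waves away: discovering which vehicle is actually "ahead", and relaying the video without a central server.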

A powerful part of the gaming recipe – Graphics

This video is doing the rounds and was used at the FCVW conference showing the powerful rendering capabilities of Cryengine 2, Unity3d and Unreal engines.

I think, of all the elements in this, the live link to Maya at the start, showing the boned and rigged wireframe being manipulated, starts to open up people’s appreciation of what goes into these engines.
Also, with the fantastic level of graphic quality you need a fantastic level of modelling and texturing. Awesome lighting still needs to be placed on set in the right place. All this is also before you get to the layer of gameplay, the way that a multiplayer environment might synchronise its data, the rules of any game, the testing of the entire product, the sound design, the music, etc.
A graphics and physics middleware selection is important, and these render tests really show the potential, but there is a lot more.
Having said that, though, lots of the rest is more standard software engineering and IT architecture. It has to blend with the rich visuals of the user experience. Many techies have traditionally not been interested in the front end; it’s too obvious when it goes slightly wrong, I think 🙂
Anyway enjoy the video, think under the graphics and see what amazing creations can be put together for work and play.

Gadget Show Live 2011 – the long version

I cut a longer version of the Gadget Show Live video, this time from scratch, with a Garage Band tune from loops and some sound effects. Going through it was a good memory jogger if nothing else. It also makes me appreciate even more what the pros do on The Cool Stuff Collective.

The show day I went to was the press/professional day, so it was probably a more subdued affair than the 20,000-people-per-day paying public days. For a start the arena show was out of bounds as the guys were rehearsing. So no crossing of gadget and games show streams there then.
When I arrived I went straight to the back of the hall and found the UK computer museum; they had a huge stand for each decade, and lots of that is in the video above.
There were a lot of 3D TVs and laptops, mostly with glasses. The Nvidia stand had a huge 101-inch 3d TV playing Crysis 2 on a high-end PC, and it was VERY impressive. The LG 3d projector show was good too, though it was more for the pure quality of the picture than the 3D effect, which seemed somewhat subtle as they played the Tron light cycle scene.
Elonex were there in full force, with a huge number of tablets and e-readers. For a UK company they are right up there in the rankings, getting more market share than many of the major electronics giants.
The vehicle test tracks were great: lots of electric bikes and scooters. The TP Scoot was particularly impressive. Very stable, very fast and road legal. I skipped going on the Segways as I have already done that 🙂 fun as they are, they are of no use here as they are not street legal. We had a go on a smaller electric scooter that was only £100 and that was fun too, but the TP was my favourite.
Sugru were at the show and it was great to see them there; the hack-anything putty was getting lots of attention over near the future tech section.
In future tech (4 small stands by one wall) there was laser object capture, a robot Guitar Hero player, a working 3d printer, Neurosky and some robotic kits. The commercial stuff is great, but they could do with a much bigger future tech section. I guess that’s what the Wired Nextfests are for though 🙂
In the gaming zone, for some reason, people had to queue to enter the 3DS zone. Matt did point out that we were wasting time queuing as I already had one, had been to the launch party and we had had it on the show. So in dipping out of the queue I managed to rip my trousers. Not very rock and roll, but it was funny. It somewhat restricted my movements, but luckily the badges we all had came with safety pins, so I risked it and pinned what I could back in place.
When we found the new games section, where Gears of War 3 and Duke Nukem were running, I had to take a phone call about sorting out an Opensim server that was having a few issues. So I stood there looking at Gears of War 3 (awesome!), talking about Opensim and rebooting a remote machine via my iPhone, wearing my TV g33k t-shirt with the crotch of my trousers held together with pins. It was, not to put too fine a point on it, a surreal moment. Ninty, you owe me some trousers!
Game had a big store in the game zone; they had pretty much everything at full price, just a couple of pounds off Crysis 2. That seemed a bit of a pity, though it saved me spending any money.
There were a lot of motion gaming chairs at the show, which looked very expensive. One of the more interesting cockpit versions, though, featured a projected dome to give a 180-degree view from a regular projector source, using a convex mirror. That in 3D would be awesome, though complicated.
The paintball experts were great to talk to. They have a great laser tag system (as in the video) and they also run kids’ games using foam non-exploding pellets (like a fancy Nerf). They showed a whistling sniper bullet for regular paintball that was very cool, and their custom markers were stunning. 25 rounds per minute, precise control.
It was odd seeing Flip on the Cisco stand on the day that Cisco closed the doors on the poor little thing. I am sure they are getting some flak today!
I was looking forward to the Vuzix AR glasses, but they were not working when I popped in.
There was a very strange couple of stands for some wristband that is like an acupressure band, but with a “hologram” on it instead. This was just odd.
The venue was certainly full of some amazing kit, though equally, if you pay attention to gadgets, hang around in Best Buy and read the tech press, it was probably less surprising. Still well worth the trip.

Introducing open source to kids TV – yes really!

I really enjoyed the chance to explain something really important on this week’s Cool Stuff Collective. The core of the piece this week was the principle of Open Source collaboration. I had started to lead up to this concept with the Wikipedia piece a few weeks ago, showing that anyone can get involved and contribute, not just consume, on the web.
The way to approach open source, though, had to be something other than the “traditional” software applications such as the Linux operating system. Whilst it is one of the most advanced and technically rich exemplars of this self-organisation and support ecosystem, it’s really not compelling enough for kids.
The open source libraries for the Xbox Kinect, however, are spot on. It is a triumphant story of the explorers out there seeing what they could do with what is already an amazing piece of consumer technology. It was the big Christmas hit only a few weeks ago, so most people can relate to it and what it does in the context of the Xbox. Many of the viewers will have played with one too.
The speed with which the open source community gathered and hacked the Kinect, released the code, and then started building more and more things highlights how quickly disruptive innovation can side-swipe large corporate entities. In the first few days of the hacking, Microsoft took a “not with my box of bunnies” approach. Legal proceedings were threatened, etc. Somewhere, somehow, there was someone with enough sense to stand back and say… “wait a minute, at the very least this is selling even more Kinects; people are buying Kinects who don’t have Xboxes”. After all, no harm was being done really; the Kinect was not being stolen, and it was not a DRM issue. The thing has a USB plug on it! Now it may all have been calculated to frown and then embrace the hacks, but however it has worked out, Microsoft come out pretty well having decided to join the party rather than stop it. Whilst not specifically part of the open source movement(s), they are releasing a home hacking kit.
The choice of how to work with your Kinect on a computer is a varied one, but just for the record (as we did not give any names/URLs out on the show):
I used (and hence was helping to support) the Libfreenect piece of software on my Mac. All the info you budding hackers need is at
This let me show Sy the depth-of-field display running on a Mac. The left-hand colour picture reflects distance, one of the key points of the Kinect in sensing movement over and above a regular webcam. I was not altering any code, just showing what was available at its most basic level.
I also demoed the audio hack of a theremin, the Therenect, by Martin Kaltenbrunner of the Interface Culture Lab. I bumped into this demo via a serendipitous conversation about what a theremin actually is and how it works, just before putting this piece together. Martin is also one of the inventors of the ARTag and TUIO integrations that I used in the AR show in Unity3d, and of the brilliant Reactable that I hope will be in the final Big Gadget Adventure film towards the end of series 2. (So a friend of the show, as his stuff just works whenever I try it!)
There are of course lots more things going on, and so many good examples of people working on the Kinect and hooking up other free and accessible pieces of code, and more importantly sharing them. @ceejay sent me this link on Twitter after the show aired.

Hopefully next (and for the final recording of the series) I will get to do the Opensim piece, more open source wonderfulness to build upon this and the previous conversations.
Many people are not aware just how complicated Open Source is as a concept, and the implications it has as part of any ecosystem. It is a threat and an opportunity, a training ground for new skills, a hobby, and a political minefield of egos, subcultures and competing interests. What comes out of the early days of an open source project is usually very rough, but it works. If it does not work quite right, you change it and contribute back. We have yet to see the ultimate long-term effects of open source in a networked world. We have, though, seen it make massive changes to the software industry, but the principles of gathering, sharing and building apply to way more than our geeky business. It is about governments, banks, manufacturing and even the legal system. It is, not to put too much pathos on this, the will of the people. (Just not always the same people who consider themselves in charge, or market leaders.)
Open source projects also tend to spring up in response to a popular commercial event: challenging Windows with Linux, as an example. Without something big and unwieldy, or not done quite how people really want it done, an open source movement will not form with enough passion and gravitas. That is not to say that people do not release lots of things as open source. You write code and share it, build and show, etc., but that is open sourcing, and not the complexity of an open source movement, I think.
So, a heavy subject once you drill down, but it is the future and it’s already here.
Open source is messy; it’s about people, and it tends not to fit all the preconceptions of a product. However, people tend to expect a product to work and be supported the same as if they had paid for it. Which is why there actually is a financial and business opportunity in wrapping open source up and providing labelled versions and services with appropriate licensing. The people that build still need to eat, and be recognised for their work too. So it is by no means just a load of free stuff on the internet, but you are free to join in, and I hope some kids will be inspired to at least take a look, or ask their parents and teachers about the social implications of all this too.

Browser to unity3d communication and back again

I have been doing a fair bit of unity3d recently again. In particular I have been looking at the ways it can take data and parameters. It’s all in the unity documentation but there were a few subtle things I missed first time around that I thought it worth sharing, as much for my memory as anything.
The first useful thing is that the unity plugin is able to simply talk back to the browser and call JavaScript functions in the page. So in C#, for example, I am able to do this.

void Awake ()
{
    Application.ExternalCall( "callbackunity", "The game says hello!" );
}

Where the page that the unity3d is served from has

<script type="text/javascript">
function callbackunity(arg)
{
    alert(arg);
}
</script>
Obviously you can do more than just an alert so I looked at what I can send back into unity3d from the page and started to do this in the calling page.

<script type="text/javascript">
function callbackunity(arg)
{
    var unity = unityObject.getObjectById("unityPlayer");
    unity.SendMessage("Rezzer", "inboundfunction", "item1~item2~item3");
}
</script>

This gets the unity plugin module and calls a function called inboundfunction that is attached to a unity game object called Rezzer. It is only able to pass a single value, so I send a string with a known separator (~) to be able to send more than one piece of data.
So the flow is that unity loads, awakes and then makes a callback to the page which then injects the data into unity.
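The delimiter convention on the page side can be sketched in plain JavaScript. The helper names here are mine, not part of Unity’s API; the only real convention is the ~ character, which means values containing a ~ need to be rejected or escaped before sending:

```javascript
// Pack several values into the single string SendMessage can carry.
function packForUnity(values) {
  for (const v of values) {
    // Unity's Split('~') has no escaping, so refuse ambiguous values up front.
    if (String(v).includes("~")) throw new Error("value contains delimiter: " + v);
  }
  return values.join("~");
}

// Mirror of the Split('~') call on the C# side.
function unpackFromUnity(packed) {
  return packed.split("~");
}

const message = packForUnity(["item1", "item2", "item3"]);
// message === "item1~item2~item3", ready for unity.SendMessage("Rezzer", "inboundfunction", message)
```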
On the unity side I have this sort of function

void inboundfunction(string indata)
{
    string[] words = indata.Split('~');

    data1 = words[0];
    moredata2 = words[1];
    anotherpiece3 = words[2];
}

At first this all did not work (of course). I thought I had not put the right script on the right object, as I was getting a callback to the page but not injecting the data into the unity object.
This turned out to be something quite simple in the end. The unity docs example shows var unity = unityObject.getObjectById("UnityContent"); however, the page generated out of unity3d that I added my code to used a different label for the Unity plugin in the setup. It called it "unityPlayer". So my failing code was because the js in the webpage was not picking up the right object. As we know, computers need to have everything exact.
This was almost code blindness. I was thinking it was getting the unity object (of course it was, how could it do anything other?), but it’s an obvious schoolboy error: "UnityContent" <> "unityPlayer" 🙂
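Since the whole problem was a lookup silently returning nothing, a defensive wrapper would have surfaced the mismatch immediately. A hedged sketch (getUnityObject and the registry object are stand-ins for unityObject.getObjectById and the page’s embedded players, not real API):

```javascript
// Fail loudly if the requested plugin id is not registered,
// instead of later calling SendMessage on undefined.
function getUnityObject(registry, id) {
  const unity = registry[id];
  if (!unity) {
    throw new Error('No Unity plugin registered under id "' + id + '"');
  }
  return unity;
}

// Simulated registry standing in for the page's embedded players.
const registry = { unityPlayer: { SendMessage: function () {} } };

getUnityObject(registry, "unityPlayer");    // fine
// getUnityObject(registry, "UnityContent"); // would throw, exposing the wrong label
```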
Once that little bug was sorted out it was all plain sailing. The parameters I pass as item1, item2, item3 etc. are generated by the PHP from the Drupal site that embeds the unity3d content. So I can send anything to the unity piece based on which page is being served and by whom.
One of the other things I do, though, is use direct WWW communication to a server from inside the unity3d. The initial set-up code is to establish some start parameters; once running, communication is not via the browser but a direct line to the server instead.
That all just works as documented, though you have to make sure you are serving from and talking to the same server, or dealing with the cross-domain policy documents that help protect web apps from rogue applications in browsers.
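For the cross-domain case, the Unity web plugin looks for a crossdomain.xml policy file at the root of the server being called. A minimal, deliberately permissive sketch looks like this; for anything real you would lock the domain down rather than use the wildcard:

```xml
<?xml version="1.0"?>
<cross-domain-policy>
    <!-- allows WWW calls from content served on any domain; restrict for production -->
    <allow-access-from domain="*"/>
</cross-domain-policy>
```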
This is all very basic code really, but if you are not from a web world it can seem a little unusual.
e.g. in .cs in unity3d

IEnumerator sendIt ()
{
    // Create a form object
    WWWForm form = new WWWForm ();
    // set up fields
    form.AddField ("data1", "some data");
    form.AddField ("userscore", score );

    // Create a download object
    WWW download = new WWW ("http://someurltohandlethings", form);

    // Wait until the download is done
    yield return download;

    if (download.error != null) {
        Debug.Log (download.error);
        print ("Error downloading: " + download.error);
    } else {
        print ("Sent");
    }
}

Because the request must not block the main game loop, this is an enumerator function (a coroutine), which means you have to call it like this when you want to send anything

StartCoroutine (sendIt ());

As most of my quick unity3d pieces had been in .js this StartCoroutine was not as obvious, though it is in the Unity docs.
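For anyone coming from the web side, a rough JavaScript analogy may help: StartCoroutine is essentially a scheduler driving an iterator, resuming the function each time the thing it yielded on has finished. The generator and the trivial scheduler below are mine for illustration only, not Unity code:

```javascript
// The body pauses at yield, just as sendIt pauses at "yield return download".
function* sendIt(log) {
  log.push("request started");
  yield "waiting for download"; // Unity would resume this once the WWW object completes
  log.push("request finished");
}

// Minimal stand-in for Unity's coroutine scheduler: keep resuming until done.
function startCoroutine(gen) {
  let step = gen.next();
  while (!step.done) step = gen.next();
}

const log = [];
startCoroutine(sendIt(log));
// log is now ["request started", "request finished"]
```

The real scheduler resumes the coroutine across frames rather than in a tight loop, but the shape of the control flow is the same, and it is why a plain call to sendIt() does nothing on its own.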
The URL I call has some PHP that gets at the form elements, but we are in normal web territory then.

$somedata = $_POST["data1"];
$somescore = $_POST["userscore"];

Of course all the error handling and data handling can be (and is) much more elegant, but this all seems to work very nicely, and the core of the communication I am able to drop in anywhere.

Feeding Edge is 2 Years old

Another significant milestone today: Feeding Edge Ltd is now two years old. It is something I am extremely proud of, and when I reflect back on this year it has been so varied; there have been some challenges, but the worst of those have been resolved. For the most part it has been such an entertaining and stimulating year it’s hard to think of it all packed into 12 months.
feeding edge 2nd birthday
(When I added the second flame to last year’s I used Photoshop CS5 puppet warp on the flames. It’s amazing: it puts a mesh over part of the image and you edit it like you would a 3d mesh.)
A year ago I could not have imagined where I am at today, the diversity of which would not really have fitted into any other company. The ability to go with the flow and trust in serendipity and gut feeling has been incredibly useful. If companies let the creativity of their employees flow, rather than focus on control and crackdown, then I am sure we would be generating some fantastic innovations and growth in business. Though, selfishly, if everyone does that then it makes it much harder for me.

So this year I have a few sparklers (though some customer names and projects are not public)

  • Consulted on virtual worlds and games for the government
  • Built a complex system of Second Life and Opensim interactions with Drupal and a Java model for medical training
  • Toured washing away cave paintings at conferences and gatherings all over the UK and elsewhere, including Finland and Ireland
  • Appeared on shows in Second Life and given many talks too
  • Started to get the ball rolling as Chairman of the BCS Animation and Games SG
  • Been a port of call for references and direction as a virtual world advisor to startups
  • Built a Drupal-based social and political hub as a proof of concept
  • Reviewed games on Game People, like Kinectimals and GT5 in 3d

The ultimate highlights though have to be:

  • Forming the as-yet-in-stealth social games and transmedia company, getting seed funding and filing the patent for the idea
  • Being given the chance to work on kids’ TV, inspiring the next generation with future technology, with The Cool Stuff Collective

The games company is a mix of architecture, design and directing some development, and is very much hands-on with the technology. The concept for our first product still amazes me and I am very proud of it. With a bit of luck we will get bigger very soon and can deliver an even more amazing rendition of the concept; to my partners in all this I say a huge thank you. I want to write more about what we are doing, but now is not the time or place. I still have a stack of code to write, but my coding partner out there is doing some awesome work making sense of the ideas we come up with for implementation.
The Cool Stuff Collective has been an amazing journey too, from the first conversation about being a technical advisor to being thrown into the studio to present. Now midway through series 2, yesterday I was out with the crew filming at the Pure Tech racing simulators, then dashing down to the Intech hands-on science centre. Being able to inspire or interest the next generation of techies, and maybe reach some of their parents, with tech that is already here but seems like science fiction has been an incredible honour.
Look at the list of things we have covered:

3D printing, haptics, the AR.Drone, 3D scanning, MMO Lego, AR, Kinect, mind control, SMART tables, eReaders, 3D cameras and glasses, Unity3d/Evolver games dev, cloud computing, Wikipedia, Photoshop, laser holographic projection…
Still to come are solar flares and OpenSim, and for the outside video we have now done indoor skydiving, indoor snowboarding, racing simulators, science gadgets and a planetarium.
So I have ended up on Wikipedia and have over 20 TV recordings under my belt now. I have a showreel of sorts with its own page here, and my new business cards say, amongst the blurb, TV Presenter. (I think that's valid now, isn't it?)
When people ask what it is I do and what Feeding Edge does, I think it all sums up in "Taking a bite out of technology so you don't have to".
That is because my mission is pushing things forward: thinking about the whole, not just the design or the tech but the social implications of it, while mashing in the fact that things should entertain and engage us as humans.
So what does next year bring? Well, for me, more of the same is the answer.
I am often asked how I have time to do all the things I do. The answer is I don't; sometimes things have to slide a bit. Whilst many things seem diverse, they are linked. I play games: looking at them for review, to spot trends, to see how they might be used in other gamification contexts, and for enjoyment. Then I write about them, present about them and even build them. It's all part of the flow. The same goes for the other emerging tech: if you are interested in 3D virtual worlds, then how to create 3D content, how to experience 3D content and how to use 3D environments to reach an audience naturally becomes part of everyday life.
Then there is the social media side of things. I tweet, blog, share photos on Flickr, and put game achievements up on Facebook and Raptr. It is both a personal sharing of what's going on for those who need to know or are curious, and a social experiment in how it feels to do these things and the impact they have on my life. Having that personal experience lets me share it with others and with companies, and get them to the good part of this communication revolution rather than stagnating.
People I know often say to me that they only understand a third of my tweets. That is great, as probably that third was for their benefit and the other two thirds for others. Mixing business, social, tech and existence on one channel in 140 characters is still fascinating. It is a microcosm of the whole of what I do with Feeding Edge.
So to all my customers, partners, competitors, friends, mentors and fellow virtual world evangelists I say a huge thank you for all your support.
Right, back to it. Now, what was I doing again?

He has done it again! Telepresence this time

This guy Johnny Chung Lee, now a "rapid evaluator" at Google, is amazing, and I love how he approaches things. Way back he did the Wiimote hacks that created 3D motion control by wearing the sensor bar and keeping the Wiimote steady. As Feeding Edge's tagline is "Taking a bite out of technology so you don't have to", you can see why I have an affinity for his work!
This time he has hacked together a $500 telepresence robot. These things keep popping up, so he is definitely "bang on trend". They are a weird combination: a physical avatar used to navigate physical space and "be" somewhere. Like all new ideas they may seem daft, and I did have a fleeting "that's stupid" moment. However, as I have mentioned before, when I see something and think that, I know I have to look further into the idea, especially when serendipity subconsciously shouts about the subject.
I bumped into this because I was looking at the more commercial telepresence robots from vgocom. This version, though, uses the iRobot Create (a Roomba-derived platform).

They had featured a telepresence ER/A&E robot on BBC Horizon, and combined with a piece we filmed on Wednesday for The Cool Stuff Collective, where I was trapped in a video box, it all started to link up. It was further reinforced by a conversation about driving robots from a virtual world that started that very same afternoon!
There is a nice circular element to this: you will notice that the great Johnny Chung Lee's blog featuring this quick build is called Procrastineering, and its tagline, which is something I live by, is "giving in to productive distractions". It brilliantly sums up the flow of serendipity, and the combination of tech, art and ideas mixed with human conversation, that seems to lead in a positive direction. I know it is not for everyone and it is a seat-of-the-pants existence, but for me it feels right.