Gameduino 2 vs Raspberry Pi?

Often when I use examples referring to the Arduino microcontroller board I am asked about the Raspberry Pi. It is often implied that the Pi, with much more function, is “better” than the delightful simplicity of the Arduino.
Arduinos are essentially programmable switches. They require almost no additional setup. They let people learn to code simple programs that, with the addition of a few wires, turn lights on and off, make sounds etc. The Arduino blends the physical and the digital (and is in my TV showreel). In its raw form its simplicity is the key.
The Raspberry Pi is the other end of the spectrum. It is a full computer. Hence it can do everything the Arduino does and lots more, but… it takes a different level of setup.
Typically getting a Pi running requires the user to do some basic system administration: getting an operating system (picking one from the many available) and then plugging in a keyboard and monitor. Once you do, though, you have lots of choice of languages, development environments etc.
Both are great, so this is really not a platform war.
Recently I received an additional shield for the Arduino. (A shield is an extra bit of hardware providing extra function to the Arduino.) This one is called the Gameduino 2.

Many other shields provide things like network connectivity, more ports etc. This Kickstarter-funded shield instead adds a screen to the Arduino. However it is much more than that. The screen is touch sensitive and there is also a tilt sensor. In addition it provides hardware to put interesting visuals on the screen. It extends the Arduino development environment with some more library calls and is specifically designed (as the name indicates) for building games.
I have looked at the extensive tutorial code. It is brilliantly straightforward.


This will simply get the details of any touch interaction on the screen.
This might be straightforward to do on other devices like iOS, but the low cost and the simplicity of the environment for developing or prototyping a touch-based application are fantastic. All the Xcode setup, developer registration etc. needed just to begin to tinker is obviously off-putting for many people.
There is much more to the Gameduino 2 though: things like hardware sprites, audio and other game goodness.
Not only that, to quote the Kickstarter information: “Does it work on the Raspberry PI? Yes, it hooks up directly to SPI port on C13 on the Pi. And Raspberry Pi software support is done, so the GD library and samples/demos/games all run fine on the Pi.” So you can combine both worlds.
Right at the moment, though, one of my Raspberry Pis is being a dedicated set-top box for the TV, running OpenELEC, a version of the XBMC open source media centre, something the Arduino certainly can’t do 🙂
Either way go out, get one or both and enjoy coding and sharing.

Rocksmith 2014 Session mode

I am a big fan of Rocksmith; it has been a breath of fresh air as a way to learn and enjoy guitar, and now they are ramping up for the next edition of it. Luckily all the downloaded songs are still transferable. (Well, less luck and more a necessity!)
I often use it as an example in my talks like the one I am giving this Wednesday at BCS Shropshire.
An interesting new feature has been added though, and so my talk has to evolve to keep up: Session mode, with a reactive and generative backing band. You play and they keep up. This is really interesting and I can’t wait to see how it pans out with my guitar tinkering. The E3 conference saw Jerry Cantrell of Alice in Chains take to the stage. Just listen to how, once he has managed to go through a few customisation options using voice, the band kicks in and responds to his playing.


Goto; Amsterdam part 2 of 2 – Some Choi then Makers gather

Day 2 (part 1 is here) of Goto started very early in the morning for me. I woke up and thought, hmmm, I should do my Choi Kwang Do stretches and patterns, not realizing it was only 5am. Still, it made me feel pretty good after the slightly heavy night out previously. Conferences are weird time shifts too; the intensity of being in conference mode needs something to balance it, and this did. Besides, I was going to be talking about Choi in my presentation and I had not been to class since the Saturday. It was now Wednesday!
Room with a view and an iMac  :)
So I entered the morning keynote pretty refreshed and ready to hear some interesting things.
The twitter wall was up and running again, as were my tweets. The wifi was rock solid the whole conference too!
Twitter wall #gotoams that was a well timed shot :)
First up was Martin Fowler, author of many books I have owned and read on patterns, UML etc. He had picked a couple of the talks he keeps in his kit bag. For pure software engineers these were probably very useful. Schemas still being there when there is “no schema” made sense, as at some point anything needs a structure put on it.
The tracks for the day were “It’s all about the people, stupid”, “Agile: Closing the Loop”, “Hard Things Made Easy”, “Mobile”, “Case Studies”, “Legacy Systems” and our “Emerging Interfaces” track.
I stuck with “It’s all about the people” so that I could hear Linda Rising (@risinglinda) talk again. She talked about the power of the agile mindset. This was nothing about the Agile development approach, but really about human motivations and how they get messed up depending on how they are addressed. Linda cited an experiment that gave an easy test to a group of students. After the test the group was divided in two by a subtle difference. This was not revealed until the rest of the story had been told. Instead Linda introduced fixed and agile thinking groups: fixed being the attitude that any task, intelligence, talent etc. cannot be improved, so you stick with what you have got and make the most of it; versus an agile mindset that is not fixed, but is intrigued and motivated by the challenge and by the effort of aiming to improve.
In the story the fixed group were asked if they wanted to take a new easy test or a new hard test. They all chose easy again. The effort/agile group chose harder tests, thriving on the challenge.
There were several elements to the research Linda recounted, but it showed that the fixed mindset tends to measure itself against others being worse; assuming it can’t improve, it maximises others’ flaws. The agile mindset looked for challenges, understood that failure was a learning experience and enjoyed the entire process, comparing only to themselves and wanting to coach others to join them.
Now it turns out the only difference between the groups in the experiment was that the fixed group were handed their results and told that they were very clever. The agile group were handed the results and told they must have worked very hard. There are lots of examples of this, but also of how fixed thinking tends to be destructive. The “rank and yank” approach of Enron and other corporates that seek to measure and find “the best” and cut the others out leads to a set of people only wanting to not be in the bottom of the pool. This was compared to organisations like Southwest Airlines, who seek to grow people and help them get better at whatever they do.
This is all out there in research that is obviously ignored, as it is a bit scary. However, linking back to my morning Choi exercises, in CKD there is no competition. We all want to learn, we want to grow and improve ourselves and help others. Nothing is ever wrong; it is a way to learn to do it better. Instructors are helped to understand how to give positive reinforcement and to praise effort. I don’t often hear “you are brilliant” used about people in the art; instead, “that was a great effort”. Find your limit and push a little past it, then a little more. Just strive to get better, not to be the best. It is so simple and effective, and it works.
(It has got me pondering an evolution of my blended learning piece of the pitch that features CKD and dive more into the similarity with how to do any good team growth and nurturing based on the CKD experience.)
The next presenter was Simon Brown on “Sustainable competence – the people vs process and technology”. This was more of a consulting experience presentation, but about the same subject: how and where it works to let people take an agile approach. It was also important to point out that Agile as a buzzword did not mean quick, nor did it mean sorting things out without the complications of design, build and test. In fact the examples were all of how teams that trust one another and are self-organised take time. It is something that needs to be trusted to get on with itself. I had flashbacks to previous teams and how we tried to do that (without the Agile word). Always a corporate control freak would try and crush it at the wrong time.
A spot of lunch and then it was me. 50 minutes of Cool Stuff Collective, games tech, 3D printing etc. It is my same slide deck, in a slightly different order, but it is here, and if you were there it might make sense 🙂 I felt the crowd were engaged and enjoying it. There were some interesting shows of hands, or not, to some of my questions to see who did what where. 80% of people knew about 3D printing, but the viral nature of RepRap was a surprise to many.
I was really glad that all of us presenting had some freaky and interesting things to say. In particular, next up after I had shown some custard pies being thrown (usually quite a hard act to follow), Daniel Hirschman (@danielhirschman) had more than enough to follow that madness. He has several angles to his work. As an artist and physical designer he has a different perspective to developers. However he also wants the world to learn to code, to be a maker, to hack. This is a very cool combination. He is a fan of the Arduino and of Processing, and builds real things with them.
This was fantastic: all built with Arduino and some other hacks to make a corner shop a musical instrument for a beer advert by his company Hirsch and Mann Ltd. Check out their other work, like the Turin interactives at the science museum.

However he also showed lots of the work of his educational company Technology Will Save Us, which makes kits with Arduino and the like to let kids, or any makers, play with an idea and build some interesting things. His final mad example was Bright Eyes, for which he got a Kickstarter going and raised some funds.

(We speculated that Andy Piper would have been one of the backers, and yes he is :))
These came out later at the party. They were very popular.
We then changed tack to several lightning 10-minute talks. We had Kinect for shop windows being demoed, and Dan (@mintsource) showed a clever WebSockets-based local-network distributed pub quiz with real prizes. I missed out by 1 point on a prize, grrr. Dan also showed Leap Motion working.
I did a quick piece on Unity3D and hospitals; it was great to be able to talk a bit about code and how it worked. It was good for my own brain, but also good to not just be the crazy virtual world guy 🙂
It was a maker fest really 🙂 It all seemed to fly by, and lots of people wanted to talk afterwards too, so it seemed to hit the nail on the head.
I had not mentioned that this conference had lots of breaks, good 30-minute ones. Not a quick 10 minutes to dash to the next talk, but ones to stop, chat, reflect etc. Its pacing was really good. They have been doing it a while though.
The final keynote was different in that we all stood up. The chairs had gone. The speaker was Mike Lee (@bmf), talking about the app universe after the big bang. It was a war story presentation, and he admitted to being a bit jet lagged after the alternative WWDC conference he had run. He is “Mayor of Appsterdam” and brought a typically ebullient American delivery, blended with a love of the art and culture in Amsterdam. His main thing was “don’t make games”: basically he was saying it is not going to make you rich and it is too hard. He is making games; he is suffering for his art. He managed to get his plug in at the end, but as it is an educational game, or at least one that tries to blend learning and fun, it is worth a look. It was entertaining and depressing in equal measure, but finished with the line “let’s go drink beer”.
We all stayed at the venue for a while as it was meet the speakers time, and as a speaker I was there to be met πŸ™‚
Then it drifted back to what must have become a very expensive bar bill at the hotel.
As mentioned, the Bright Eyes came out, but they also went head to head with a Google Glass rig (and won).
Google glass meets kickstarter #brighteyes #gotoams
It was also very cool that Dave Thomas, the father of OTI and of VisualAge Smalltalk and Java, also took to them 🙂
Behind the lens flare that's Dave Thomas (visual age) wearing led #brighteyes 100+ LEDs playing patterns #gotoams
Anyway I had some awesome chats with people, made some great contacts, enjoyed what I heard and had a great trip.
So thank you again GotoCon and Trifork.

IGBL – Learning from Games Based Learning

Last week I popped over to Dublin for the Irish Symposium on Games Based Learning at the Dublin Institute of Technology. It is a few years since I was last there, at Metameets, listening and talking virtual worlds.
There is a very influential core of virtual world people there and many of them were at the conference, so it is always good to have a real world/SL meetup.
Dr Pauline Rooney from DIT opened up the conference and it felt like I was in the right place.
The opening keynote was by Dr Nicola Whitton from Manchester Metropolitan University: “Game Based Learning in an Age of Austerity”. I then knew I was in the right place, because everything Dr Whitton said resonated with so many projects. The key theme was that you don’t have to spend millions of pounds making games for learning, and they don’t all have to be technical deliveries of games. Making a game out of making a game can be as beneficial as putting people through the end product.
She showed an example of the sort of thing that happens in medical sims and games: lots of huge expensive graphics and realism with a text multiple choice question on top. Having had to build some things like that to meet requirements, but having also subverted that with simpler ideas and gaming concepts, I sat there nodding in agreement.
#igbl2013 keynote take home points
The other point as you can see in the take home points above was that it is ok to play. Humans learn through play and it still gets regarded as not serious enough. Tales of people expecting to put away childish things and then how restrictive their thinking gets when faced with exploring and creating in a playful way.
After a quick coffee I chose to stay in the same room for the next track.
Neil Peirce from the Learnovate Centre, Trinity College Dublin presented his academic paper “Game Based Learning for Early Childhood”. I did not get so much from this, though I am sure the research was solid; really I needed to read the paper. However, this was an academic conference and I am not technically an academic, so it was not aimed at me 🙂
I was more interested in “An investigation into utilising a Therapeutic Exergame to improve the Rehabilitation Process”, where the Waterford Institute of Technology had worked with physiotherapists and used a Kinect, Unity and Zigfu to explore repetition of exercise in the home. This appeared successful, and fitted well with my experience of Kinect and how we can broadcast and work together physically using these devices.
Next up was a great Prezi presentation by S. Cogan from the National College of Ireland. He was explaining “engagement through gamification”. Whilst the word gamification is now tired and overused, he presented it from the point of view of not knowing what it was or how he was going to use it, and shared what worked and didn’t work in getting the students he lectured to be bothered about class. The positive results were that he got much better attendance and pass marks, but also that he himself felt a greater sense of achievement making some of the activities more game-like. It was very inspirational to hear a younger lecturer looking to change things around a bit, but not completely overturn the system. It all just made sense, and sharing what failed (wrong rewards, too much game in some things etc.) was very refreshing.
After lunch it was two-hour workshop time, so I popped along to “Developing games for Learning using Kinect”. Now I was not sure what to expect; I thought I might be able to help out as well as have a bit of fun myself. I had not fully realized that it was Stephen Howell, the creator of the very cool Kinect for Scratch. It was great to hear @saorog explain Scratch to the audience of mostly non-techy people and get them programming in the same way it’s done with kids, then hit everyone with the simplicity of using the Kinect in Scratch. It was very well done and very well delivered; he is a clever bloke 🙂 He also showed us his Leap Motion controller in action 🙂
Then there was a bit of Irish culture in the evening.
That was followed by almost no sleep, as my hotel room had no double glazing, so the ongoing partying in Camden Street and the subsequent bottle deliveries pretty much filled all the early morning…
Elfeay and I did our little pitch: “I am a Gamer. Not because I don’t have a life, but because I choose to have many”. We each did an 8-minute chat on how we have found our various tribes, and how games, games technology and culture have led us to places that then feed into more games and games culture. My example being the journey into Choi Kwang Do via games and tech, plus a few other things thrown in 🙂 It provoked some good workshop-style discussion too. It was also great fun to do in that format, and huge thanks to Elfeay for getting it all on the roster 🙂
It also dovetailed nicely into some Pecha Kucha sessions, where slides are timed and there are a given number in a given time slot. It is almost the presentation equivalent of a Vine or a tweet: concise, well planned to fit and delivering a lot in a small package.
Dudley Turner from the University of Akron in the States did a great piece on “developing a quest-based game for university student services”: making the discovery of what is available to new students on campus through a mix of alternate reality pieces of information, like emails and mini websites, linking a narrative to real-world treasure hunting using Aurasma AR location-specific tags to get them to places. It is all part of a course that they have to take anyway, but this gets them out and about and engaged with the places they need to be at, not just reading and looking at maps.
Next was P. Locker presenting “The snakes and ladders of playing at design: Reflections of a museum interactive designer, game inventor and exhibition design educator”. This was refreshing as it was not really about the tech end of things, but the core of play and interaction: physical installations, and how being a board game designer helped create museum pieces and teach others how to make them. There are elements of practicality and robustness in the physical aspects to consider, as well as the learning aspects.
Finally, last but by no means least, S. Comley from the University of Falmouth talked about “Games based learning in the Creative Arts”. It was surprising to hear that the arts were not heavily involved in the use of games in the learning process. It seems the human tendency to stick with what you know hits everywhere. It will be interesting to see where her research takes her, as this was a precursor to a much bigger piece of work, stating the problem and the potential benefits, in particular getting cross-discipline interaction in academic arts.
That was it for the main tracks. I missed a lot as lots was on at the same time but it was all brilliant.
It ended with a keynote from Fiachra O Comhrai of Gordon Games: “Using game science to engage employees and customers in the learning, knowledge sharing and innovation process”. What was interesting here was that Gordon Games was formed by non-gamers. Mr Comhrai said he formed the company and then decided to look at some games. I felt slightly uneasy at that, but good on them for doing it anyway. As a long-time gamer who both plays and analyses what is going on, it felt that my 40 years of gaming experience was being summed up as something you pick up in a weekend on Xbox. I am not sure that is what he meant. Also much of their work is in call centres, and though we did not get to see many examples of Gordon Games, we did get to see some fun videos from other people.
After that some of us Second Lifers headed off for lunch before I dived into a taxi and headed home.
Here we all are looking very shifty 🙂

@elfeay @acuppatae @inishy @d_dreamscape @iClaudiad
So thank you Dublin, and thank you everyone at the conference. It was a blast and very inspiring to be with so many clever and passionate people.

Training in a virtual hospital + zombies

It is not very often I get to write in much detail about some of the work that I do, as often it is within other projects and not always for public consumption. My recent MMIS (Multidisciplinary Major Incident Simulator) work for Imperial College London, and in particular for Dave Taylor, is something I am able to talk about. I have known Dave for a long while, through our early interactions in Second Life when I was at my previous company being a Metaverse Evangelist and he was at NPL. Since then we have worked together on a number of projects, along with the very talented artist and 3D modeller Robin Winter. If you do a little digging you will find Robin has been a major builder of some of the most influential places in Second Life.
Our brief for this project was to create a training simulation to deal with a massive influx of patients to an already nearly full hospital. The aim is that several people running different areas of the hospital have to work together to make space and move patients around to deal with the influx of new patients. It is about the human cooperation and following protocol to reach as good an answer as possible. We also had a design principle/joke of “No Zombies”.

Much of this sort of simulation happens around a desk at the moment, in a more role-play, D&D type of fashion. That sort of approach offers a lot of flexibility to change the scenario and to throw in new things. In moving to an online virtual environment version of the same simulation activity, we did not want to lose that flexibility.
Initially we prototyped the whole thing in Second Life. Robin built a two-floor hospital and created the atmosphere and visual triggers that participants would expect in a real environment.

Note this already moves on from sitting around a table focussing on the task, and starts to place it in context. Something else to note, however, is that the environment and the creation of it can become a distraction from the learning and training objective. It is a constant balance between real modelling, game-style information and finding the right triggers to immerse people.

For me the challenge was how to manage an underlying data model of patients in beds in places, and of inbound patients, with a simple enough interface to allow bed management to be run by people in world. An added complication was that specific timers and delays needed to be in place. Each patient may take more or less time to be moved depending on their current treatment. So you can request that a patient is moved, but you may then have to wait a while until the bed is free. Incoming patients to a bed also have a certain time to be examined and dealt with before they can possibly be moved again.

A more traditional object-orientated approach might be for each patient to hold their own data and timings, but I decided to centralise the data model for simplicity. The master controller in world decides who needs to change where and sends a message to any interested party to do what they need to do. That meant the master controller held the data for the various timers on each patient and acted as the state machine for the patients.
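The shape of that pattern is easier to see outside LSL. Here is a minimal sketch in C++ (all names and types here are mine, for illustration, not the actual script code): the controller owns every patient's state and fans each change out to whatever is listening, whether that is a bed, a light or a notice board.

```cpp
#include <functional>
#include <map>
#include <string>
#include <vector>

// Patient states the controller tracks centrally.
enum class PatientState { Waiting, Moving, InBed, UnderObservation };

struct Patient {
    PatientState state = PatientState::Waiting;
    int secondsRemaining = 0;  // time left in the current state
};

// Central controller: owns all patient data and fans state changes
// out to any listener (beds, lights, notice boards).
class MasterController {
public:
    using Listener = std::function<void(const std::string& patientId, PatientState)>;

    void AddListener(Listener l) { listeners_.push_back(std::move(l)); }

    void SetState(const std::string& patientId, PatientState s, int duration) {
        patients_[patientId] = Patient{s, duration};
        for (auto& l : listeners_) l(patientId, s);  // notify everyone interested
    }

    PatientState GetState(const std::string& patientId) {
        return patients_[patientId].state;
    }

private:
    std::map<std::string, Patient> patients_;
    std::vector<Listener> listeners_;
};
```

The point of the centralisation is that only one place ever mutates patient state; everything else just reacts.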
In order to have complete flexibility of hospital layout, I also made sure that each hospital bay was not a fixed point. This meant dynamically creating patients, beds and equipment at the correct point in space in world. I used the concept of a spawn point. Uniquely identified bay names placed as spawn points around the hospital meant we could add and remove bays and change the hospital layout without changing any code, making this as data driven as possible. Multiple scenarios could then be defined with different bay layouts and hospital structure, and with different types of patients and time pressures, again without changing code. The ultimate aim was to be able to generate a modular hospital based on the needs of the scenario. We stuck to the basics though: a fixed environment (as it was easy to move walls and rooms manually in Second Life) with dynamic bays that rezzed things in place.
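The spawn point idea can be sketched like this (again illustrative C++ rather than the real LSL): the layout is just named entries in a lookup, so a scenario references bays by name and nothing in the code changes when bays are added or removed.

```cpp
#include <map>
#include <string>

// A position in the world (simplified).
struct Vec3 { float x, y, z; };

// The hospital layout as pure data: uniquely named bays mapped to the
// spawn positions where beds, patients and equipment get created.
class SpawnRegistry {
public:
    // Placing a spawn point in the world registers its bay name here.
    void Register(const std::string& bayName, Vec3 pos) { bays_[bayName] = pos; }

    // A scenario asks for a bay by name; unknown names simply fail,
    // so layouts and scenarios can vary independently of the code.
    bool TryGetBay(const std::string& bayName, Vec3& outPos) const {
        auto it = bays_.find(bayName);
        if (it == bays_.end()) return false;
        outPos = it->second;
        return true;
    }

private:
    std::map<std::string, Vec3> bays_;
};
```

A scenario file then only needs bay names, patient types and timings; the registry turns names into places at run time.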
This meant I could actually build the entire thing in an abstract way on my own plot of land, also as a backup.
I love to use shapes to indicate the function of something in context. The controller is the box in this picture. The wedge shape is the data. They are close to one another physically. The tori are the various bays and beds. The flat planes represent the whiteboard. They are grouped in order and in place. You can be in the code, and think about object responsibility through shape and space. It may not work for everyone but it does for me. The controller has a lot of code in it that also has a more traditional structure. One day it would be nice to be able to see and organise that in a similar way.

This created a challenge in Second Life, as there is only 64k of memory available to each script. Here I had a script dealing with a variable number of patients, around 50 or so. Each patient needed data for several timer states and some identification and description text. Timers in Second Life are a one-per-script sort of structure, so instead I had to use a timer loop to update all the individual timers and check for timeouts on each patient. Making the code nice and readable with lots of helper functions proved not to be the ideal way forward: the overhead of tidiness was more bytes of the stack getting eaten up. So gradually the code was hacked back to inline operations of common functions. I also had to initiate a lookup in a separate script object for the longer pieces of text, and ultimately yet another to match patients to models.
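The single-timer workaround looks roughly like this, sketched in C++ with hypothetical names: one tick walks every patient's countdown and reports which ones have expired, so one timer event services all fifty patients.

```cpp
#include <map>
#include <string>
#include <vector>

// One shared tick services every patient's countdown - the workaround
// for the one-timer-per-script limit, sketched here outside LSL.
class TimerMultiplexer {
public:
    void StartCountdown(const std::string& patientId, int seconds) {
        remaining_[patientId] = seconds;
    }

    // Called once per timer event; returns the patients whose countdown
    // expired on this tick, so the controller can advance their state.
    std::vector<std::string> Tick(int elapsedSeconds) {
        std::vector<std::string> expired;
        for (auto it = remaining_.begin(); it != remaining_.end();) {
            it->second -= elapsedSeconds;
            if (it->second <= 0) {
                expired.push_back(it->first);
                it = remaining_.erase(it);
            } else {
                ++it;
            }
        }
        return expired;
    }

private:
    std::map<std::string, int> remaining_;  // patient id -> seconds left
};
```

The trade-off is coarser timing resolution (everything rounds to the tick interval), which is fine for move and examination delays measured in minutes.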

The prototype enabled the customers (doctors and surgeons) to role play the entire thing through and helped highlight some changes that we needed to make.
The most notable was that of indicating to people what was going on. The atmosphere of pressure of the situation is obviously key. Initially the arriving patients were sent directly to the ward or zone indicated in the scenario configuration. This meant I had to write a way to find the next available/free bed in a zone. This also had to be generic enough to deal with however many beds had been defined in the dynamic hospital. Patients arrived, neatly assigned to various beds. Of course, as a user of the system this was confusing: who had arrived where? Teleporting patients into bays is not what normally happens. To try and help I added non-real-world indicators, lights over beds etc., meaning a glance across a ward could show the new patients that needed to be dealt with.
If a patient arrived automatically but there was no bed, they were placed on one of two beds in the corridor: a sort of visual priority queue. That was a good mechanism to indicate overload and pressure for the exercise. However we were left with the problem of what happened when that queue was full. The patients had become active and arrived in the simulation but had nowhere to go. In game terms this is of course a massive failure, so I treated it as such and held the patients in an invisible error ward, but put out a big chat message saying this patient needed dealing with.
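That fallback chain can be sketched like so (illustrative C++, not the production code): try the ward's beds in order, overflow into the two corridor beds, and only then fail into the invisible error ward.

```cpp
#include <string>
#include <vector>

// Where an arriving patient ended up.
enum class Placement { WardBed, CorridorBed, ErrorWard };

struct Bed {
    std::string name;
    bool occupied = false;
};

// Admission sketch: take the next free ward bed, overflow to the two
// corridor beds (a visual priority queue), and as a last resort hold
// the patient in the invisible "error ward" and shout for help in chat.
Placement Admit(std::vector<Bed>& ward, std::vector<Bed>& corridor) {
    for (auto& b : ward)
        if (!b.occupied) { b.occupied = true; return Placement::WardBed; }
    for (auto& b : corridor)
        if (!b.occupied) { b.occupied = true; return Placement::CorridorBed; }
    // Game-terms failure: nowhere visible to put them.
    return Placement::ErrorWard;
}
```

Because the bed lists are just whatever the scenario spawned, the same search works for any number of beds in any layout.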
I felt it was too clunky to have to walk around the ward keeping an eye out, so, as I had a generic messaging system that told patients and beds where to be, I was able to make more than one object respond to a state change. This led to a quick build of a notice board in the ward. At a glance, red, green and yellow statuses on beds could be seen. Still, I was not convinced this was the right place for that sort of game-style pressure. It needed a different admissions process, one controlled by the ward owners. They would need to be able to still say “yes, bring them in to the next available bed” (so my space-finding code would still work) or direct a patient to a specific bed.
The overall bed management process once a patient was “in” the hospital:
SL Hospital
The prototype led to the build of the real thing. It was a natural path of migration to Unity3D, as I knew we could efficiently build the same thing in a way that lets people simply use web browsers to access the hospital. I also knew that using Exit Games’ Photon server I could get client applications talking to one another in sync. From a software engineering point of view I knew that in C# I could create the right data structures and underlying model to replicate the Second Life version, but in a much better code structure. It also meant I could build a lot more user interface elements more simply, as this was a dedicated client for this application. HUDs work in Second Life for many things, but ultimately you are trying to stop things happening: you don’t want people building or moving things etc. In a fixed and dedicated Unity client you can focus on the task. Of course Second Life already had a chat system and voice, so there was clearly a lot of extra to build in Unity, but there is more than one way to skin a virtual cat.

The basic hospital and patient bed spawning points connected via Photon in Unity were actually quite quick to put together, but as ever the devil is in the detail. Second Life is a server-based application that clients connect to. In Unity you tend to have one of the clients act as a server, or you have each client responsible for something and let the others take a ghosted synchronisation. Or a mixture, as I ended up with. Shared network code is complicated. Understanding who has control and responsibility for something, when it is distributed across multiple clients, takes a bit of thought and attention.

The easiest one is the player character. All the Unity and Photon examples work pretty much the same way. Using the Photon toolkit you can instantiate a Unity object on a client and have that client as the owner. You then have the parameters or data that you want to synchronise defined in a fairly simple interface class. The code for objects has two modes that you cater for. The first is being owned and moved around; the other is being a ghosted object owned by someone else, just receiving serialised messages about what to do. There are also RPC calls that can be made asking objects on other clients to do something. This is the basis for everything shared across clients.
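The two-mode shape of those objects can be sketched generically (plain C++ standing in for the Photon/C# code; `Stream`, `Serialize` and the rest are my stand-in names, not Photon's API): when the object is owned it writes its state into the stream, and every ghosted copy on other clients reads that state back out.

```cpp
#include <deque>

// Stand-in for the network serialisation stream (illustrative only).
struct Stream {
    std::deque<float> values;
    void Send(float v) { values.push_back(v); }
    float Receive() { float v = values.front(); values.pop_front(); return v; }
};

// A networked object with the two modes described above.
class NetworkedPlayer {
public:
    bool isMine = false;  // true on the owning client only
    float x = 0, y = 0;   // the state being synchronised

    void Serialize(Stream& s) {
        if (isMine) {      // owner: publish authoritative state
            s.Send(x);
            s.Send(y);
        } else {           // ghost: consume the owner's state
            x = s.Receive();
            y = s.Receive();
        }
    }
};
```

The same `Serialize` routine runs on every client; only the ownership flag decides which branch executes, which is exactly why shared network code needs care over who is responsible for what.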
For the hospital though I needed a large control structure that defined the state of all the patients and the things that happened to them. It made sense to have the master controller built into the master client. In Unity and Photon, the player that initiates a game and allows connection of others is the master. Helpfully there are data properties you can query to find this sort of thing out. So you end up with lots of code that is “if master do x else do y”.
Whoever initiates the scenario then becomes the admin for these simulations. This became a helpful differentiation. I was able to provide some overseeing functions, some start, stop pause functions only to the admin. This was something that was a bit trickier in SL but just became a natural direction in Unity/Photon.
One of my favourite admin functions was to turn off all the walls, just for the admin. Every other client can still see the walls, but the admin has super powers and can see what is going on everywhere.
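Because hiding walls is a purely local change, no networking is involved at all; only the admin's own view is affected. Something along these lines (tagging the walls is my own convention here):

```csharp
using UnityEngine;

// Sketch: disabling renderers locally gives the admin x-ray vision
// without sending a single network message - other clients are untouched.
public class AdminWallToggle : MonoBehaviour
{
    public void SetWallsVisible(bool visible)
    {
        foreach (GameObject wall in GameObject.FindGameObjectsWithTag("Wall"))
            wall.GetComponent<Renderer>().enabled = visible;
    }
}
```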

No walls

This is a harder concept for people to take in when they are used to Second Life or have a real world model in their minds. Each client can see or act on the data however it chooses. What you see in one place does not have to be an identical view to what others see. Sometimes that fact is used to cheat in games, removing collision detection or being able to see through walls. Here, however, it is a useful feature.

This formed the start of an idea for some more non-real-world admin functions to help monitor what is going on, such as camera textures that let you see things elsewhere. As an example, wards can be looked at top down in 2D, or more like a CCTV camera seeing in 3D. Ideally the admin is detached from the process. They need to see the mechanics of how people interact, not be restricted to the physical space. Participants, however, need to be restricted in what they can see and do in order to add the elements of stress and confusion that the simulation is aiming for.


Unity gave me a chance to redesign the patient arrival process. Patients did not just arrive in the bays; instead I put up a simple window of arrivals, showing patient numbers and where they were supposed to be heading. Though a very simple technique, this seemed to help give all participants a general awareness that things were happening. 10-15 entries suddenly arriving in quick succession, seemingly at the door to the hospital, triggers more awareness than lights turned on in and around beds. The lights and indicators were still there, as we still needed to show new patients and ones that were moving. When a patient was admitted to a bed I put in the option to specify a bed or to just say next available one. In order to know where the patient had gone, the observation phase is now additionally indicated by a group of staff around the patient.

I had some grand plans, using the full version of Unity Pro, to use the pathfinding code to have non-player character (NPC) staff dash towards the beds and to have people walking to and fro around the hospital for atmosphere. This turned out to be a bit too much of a performance hit for lower spec machines, though it is back on the wish list. It was fascinating seeing how the pathfinding operated. You are able to define buffer zones around walls and indicate what can be climbed or what needs to be avoided. You can then give an object an end point and off it goes, dynamically recalculating paths, avoiding things and dealing with collisions, giving doors enough room and, if you do it right (or wrong), leaping over desks, beds and patients to get where it needs to go 🙂
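The pathfinding side boils down to handing an agent a destination and letting Unity do the rest. A minimal sketch, assuming Unity's NavMeshAgent of the time (the `StaffMember` class is made up):

```csharp
using UnityEngine;

// Sketch of the NPC staff idea: point a NavMeshAgent at a bed and Unity's
// (then Pro-only) pathfinding recalculates the route dynamically.
public class StaffMember : MonoBehaviour
{
    private NavMeshAgent agent;

    void Start()
    {
        agent = GetComponent<NavMeshAgent>();
    }

    public void DashToBed(Transform bed)
    {
        // The agent respects the baked NavMesh: buffer zones around walls,
        // what can be climbed, what must be avoided.
        agent.SetDestination(bed.position);
    }
}
```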

One of the biggest challenges was voice. Clearly people needed to communicate; that was the purpose of the exercise. I used a voice package that packaged voice messages across the network using Photon. This was not spatial voice in the way people were used to with Second Life. However I made some modifications, as I already had roles for people in the simulation. If you were in charge of A&E I had to know that. So role selection became an extra feature, not used in SL where it was implied. This meant I could alter the voice code to deal with channels: A&E could hear A&E. Also the admin was able to act as a tannoy and be heard by everyone. This then started to double up as a phone system; A&E could call the Operating Theatre channel and request they take a patient. Initially this was a push-to-talk system. I made the mistake of changing it to an open mic. That meant every noise or sound made was constantly sent to every client, and the channel implementation meant the code chose to ignore things not meant for it. This turned out (obviously) to massively swamp the Photon server when we had all our users online. So that needs a bit of work!
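The channel idea is really just a filter applied on each receiving client, which is exactly why open mic swamped the server: every packet still travels to everyone before being dropped. A hypothetical sketch (channel names and the notion of a packet channel are mine, not the voice package's API):

```csharp
// Sketch: every client receives every voice packet, but only plays
// those on its own channel or the admin tannoy. The filtering happens
// AFTER network delivery - hence the bandwidth problem with open mic.
public class VoiceChannelFilter
{
    public string myChannel;             // e.g. "A&E", set at role selection
    public const string Tannoy = "ALL";  // admin broadcast channel

    public bool ShouldPlay(string packetChannel)
    {
        return packetChannel == myChannel || packetChannel == Tannoy;
    }
}
```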

Another horrible gotcha was that I needed to log data. Who did what and when was important for the research. As this was in Unity I was able to create those logging structures and their context. However, because we were in a web browser I was not able to write to the file system. So the next best solution was to have a logging window for the admin that they could at least cut and paste the log from. (This was to avoid having to set up another web server and send the logs to it over HTTP, as that was added overhead for the project to manage.) I created the log data and a simple text window that all the data was pumped to. It scrolled, you could get an overview and you could simply cut and paste. I worked out the size was not going to break any data limits, or so I thought. In practice, however, this text window stopped showing anything after a few thousand entries. Something I had not tested far enough. It turns out that text is treated the same as any other vertex-based object, and there are limits to the number of vertices an object can have. So it stopped being able to draw the text, even though lots of it was scrolled off screen; the definition of the object had become too big. i.e. this isn’t like a web text window. It makes sense but it did catch me out as I was thinking it was “just text”.
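In hindsight the workaround is to keep the full log in a plain data structure and only feed the most recent chunk to the on-screen text object. A sketch of that idea (the visible-line limit is a guess for illustration, not a measured Unity constant):

```csharp
using System.Collections.Generic;

// Sketch: keep the whole log for cut-and-paste, but only hand the last
// N lines to the vertex-limited text component for drawing.
public class AdminLog
{
    private readonly List<string> entries = new List<string>();
    private const int VisibleLines = 200; // stay under the mesh vertex ceiling

    public void Add(string entry)
    {
        entries.Add(entry);
    }

    // Only this trimmed view goes to the on-screen text object.
    public string VisibleText()
    {
        int start = System.Math.Max(0, entries.Count - VisibleLines);
        return string.Join("\n",
            entries.GetRange(start, entries.Count - start).ToArray());
    }
}
```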

An interesting twist was the generic noticeboard that gave an overview of the dynamic patients in the dynamic wards. This became a bigger requirement than the quick extra helper it was in Second Life. As a real hospital would have a whiteboard, with patient summaries and various notes attached to it, it made sense to build one. This meant that you would be able to take some notes about a patient, or indicate they needed looking at or had been seen. It sounds straightforward, but the note taking turned out to be a little more complicated. Bear in mind this is a networked application: multiple people can have the same noticeboard open, yet it is controlled by the master client. Notes typed in needed to update the model and the changes be sent to others. Yes, it turned out I was rewriting Google Docs! I did not go quite that far, but I did have to indicate if someone else had edited the things you were editing too.
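The edit flow can be sketched as an RPC that rebroadcasts each change to every client with the board open (the method and parameter names are invented for illustration; the RPC mechanism is Photon's):

```csharp
using UnityEngine;

// Sketch: note edits are broadcast to all clients so every open copy of
// the board stays in sync, and a clash with your own edit can be flagged.
public class Noticeboard : Photon.MonoBehaviour
{
    public void EditNote(int patientId, string text)
    {
        photonView.RPC("NoteChanged", PhotonTargets.All, patientId, text);
    }

    [RPC]
    void NoteChanged(int patientId, string text, PhotonMessageInfo info)
    {
        // Update the local model; if this client is mid-edit on the same
        // note, show an indicator that someone else has touched it.
    }
}
```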
We had some interesting things relating to the visuals too. Robin had made a set of patients of various sizes, shapes and genders. However, with 50 patients or so (as there can be any number defined), and each one described in text, “a 75 year old lady” etc., it was very tricky to have all the variants that were going to be needed. I had taken the view that it would have been better to have generic morph-style characters in the beds to avoid content bloat. The problem with “real” characters is they have to match the text (that’s editorial control), and you also need more than one of each type. If you have a ward full of 75 year old ladies and there are only 4 models it is a massive uncanny valley hit. The problem then balloons when you start building injuries like amputations into the equation. Very quickly something that is about bed management can become about the details of a virtual patient. In a fully integrated simulation of medical practice and hospital management that can happen, but the core of this project was the pressure on beds. i.e. in air traffic control terms we needed to land the planes; the type of plane was less important (though still had a relevance).

It is always easy to lose sight of the core learning or game objective amongst the details of what can be achieved in virtual environments. There is a time and cost to more content and more code. However I think we managed to get a good balance with the release we have, and we can now work on the tweaks to tidy it up and make it even better.

The application has also been of interest to schools. We had it on the Imperial College stand at the Bang education festival, for which I had to make an offline version. I was hoping to simply use Unity’s offline web publish. This is supposed to remove the need for any network connection or plugin validation. It never worked properly for me though; it always needed the network. I am not sure if anyone else is having that problem, but don’t rely on it. That meant I then had to build standalone versions for Mac and Windows. Not a huge step, but an extra set of targets to keep in line. I also had to hack the code a bit. Photon is good at doing “offline” and ignoring some of the elements, but I was relying on a few things like how I generated the room name to help identify the scenario. In offline mode the room name is ignored and a generic name is put in place. Again quite understandable, but it caused me a bit of offline rework that I probably could have avoided.
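The offline switch itself is tiny; it is the knock-on behaviour, such as the generic room name, that needs catering for. A sketch assuming the PUN API of the time (`Launcher` and the flag are my own naming):

```csharp
using UnityEngine;

// Sketch: PUN's offline mode short-circuits the network entirely, but
// details like the room name no longer behave as they do online.
public class Launcher : MonoBehaviour
{
    public bool standalone; // set for the offline schools build

    void Start()
    {
        if (standalone)
            PhotonNetwork.offlineMode = true;   // no server; generic room name
        else
            PhotonNetwork.ConnectUsingSettings("v1.0");
    }
}
```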

In order to make it a bit more accessible Dave wrote a new scenario with some funnier ailments. This is where we broke our base design principle and yes, we put zombies in. I had the excellent Mixamo models and a free Gangnam Style dance animation. Well, it would have been silly not to put them in. So in the demos, if the kids started to drift off, the “special” button could get pushed and there they were.
I have shared a bit of what it takes to build one of these environments. It has got easier, but that means the difficult things have hidden themselves elsewhere.

If you need to know more about this and similar projects, the official page for the application is here

Obviously if you need something like this built, or talked about to people, let me know 🙂

Rez Day again – Reflection on 7 years of joy and pain

All the joys of trying to move physical home, my focus on training in Choi Kwang Do, plus the Predlets being on holiday meant my Rez Day in Second Life nearly passed me by! It has been just over 7 years now since diving into SL, and that has been a catalyst for a great number of changes and opportunities (and quite a few threats) in my life and the lives of many of my friends and colleagues.
I am still amazed at the power of what happened back in 2006: the power of people to gather and share in a virtual space. The creativity and buzz is something that many of us will never fully experience again.
This may seem crazy but this formed part of a customer briefing on security!
We really didn’t know the potential (which is still there); we just knew that exploring code and shapes, interactions with people etc. was going to take us somewhere. I mean, what do you do when your scripted space hopper rolls away into someone else’s space?
Postcard from Second Life
There are not many pieces of software that you can look back on and say that of. Of course the software, the networks etc. were just enablers for people to communicate and explore.
I have a collection of photos of real events that happened in world. These are as memorable to me as any other photo or holiday snap. Real people doing real things, just mediated through bits and bytes.
I am still amazed though at the fear and negative vibes that many of us endured, and in some cases still do, from the actions we took online with one another. It is hard to see why, when something is actually so positive, people need to act against it. Not act against Second Life, but against the freeform organisation of others. I doubt anyone who has experienced this will ever be able to fully share the details of their particular lows. Many are deeply personal. Those acting to destroy such a positive wave of energy knew full well what they were doing; who knows, they may think they have won some fairground prize. In reality they have lost something, probably strengthened something, and in a way done us all a favour.

What do we take as a positive from that though? Well, anything that generates that much passion, both for and against, is not just another fad, another niche. It is obviously tapping into some deep needs in humans: to communicate and share, to gather together, or on the other side of the virtual coin, to control and break things that they do not understand.
Unfortunately as it was the people not the software that was the problem, it was the people’s potential that got attacked not the bits and bytes.
2006-2009 was a technology bubble for virtual worlds, it was also a cultural bubble. We had to go through that and experience the joy and pain of it all in order to be here for the next wave. Culture takes a while to change but we are seeing much more sharing, much more open source. A realisation the power is in people not organisational structures. You can have a balance. You can have rank, a meritocracy. You can have rules but yet creative freedom within them.
What happened for a small group of us in virtual worlds back then, called eightbar, was that we all worked to improve our own understanding of what the potential was, but in doing that we wanted to help others who were on the same journey. It was not about money or power. It was not about glory or control. That is hard to understand for many people who were not feeling the buzz.
I am now able to reflect on what we did instinctively, for the good of the art so to speak, by seeing how the martial art I study works. This does not feel a tenuous link, as the conversations we have resonate with those of 7 years ago for me.
A martial art is a meritocracy. Through your personal goals and willingness to better yourself at your own level you earn belts. In Choi Kwang Do this is not through beating people or competing. It is not through negative comments about how badly you are doing, how much you missed a target. Instead it is active encouragement to enjoy mistakes, to evolve and reach a goal. Lining up in belt rank order is never to say those with the higher belt are better. They are more experienced, but they still learn, they still add to their skills. This is where it may lose some people though. We have control. A person, usually with lots of experience, will run a session. They will give commands, set tasks. We do them. It may seem regimented, yet each person is aiming to improve their technique, to learn and evolve. The person in charge at the time is also improving their own knowledge through observation, helping others with positive pointers. People step up to lead and are allowed to lead because they have earned the respect of everyone. However they never, ever apply any ego to that.
If I was to head back to 2006 and the wave of awesome virtual world discoveries, the teamwork and the sense of adventure we all had, I am pretty sure I would do it all again. For me now, having seen another very positive gathering of humans trying to explore something exciting, yet applying some structure, I may have been able to help us be something a little stronger, to deal with those less enlightened individuals. Maybe even to help them achieve more. The martial art is focussed on defence against a threatening force though. Without the potential for threats, without a counter to all this positivity, it would not need to exist. So maybe we just enjoyed such a productive and interesting time because we were up against some people who were not so interesting or productive 🙂
This year I still did some work in Second Life too. I was part of a build of a hospital experience that was about dealing with a mass influx of patients. The doctors and hospital staff (real ones) had to decide which patient to move where to make space for the sudden influx. There was a lot of code I had to design and put in place to be able to make patients deteriorate, take time to move from one place to another, deal with multiple decisions, provide visual feedback etc. We did all this in order to help lock down the requirements, then went on to build a web-based version in Unity3d, networked with voice too. We have then used it to inspire some kids; we even added a few friendly dancing zombies. Kids love zombies. It is a fine line between playing and being serious. Virtual worlds and games bounce across that line, twist it, warp it and sometimes rub it out altogether. Still, they can be used for so many reasons, and why wouldn’t you?

Sharing Arduino experiences with STEMnet ambassadors

The STEMnet programme here in the UK gives people who are interested in sharing their expertise in STEM subjects with the next generation some support to do so. Schools ask their local STEMnet coordinator for help and volunteers step up and go along.
Yesterday at the Intech science centre (our local hub for STEMnet) I helped run an ambassadors intro into Arduino (and also a little bit of Scratch too).
Intech had bought 8 Arduino starter kits. These are fantastic combinations of components and projects that have now become more official within the Arduino community.
Unboxing arduino
The packaging and collection of it is very professional and, whilst still all based on open source, it provides way more than I was expecting in the pack. Before seeing the kit I had thought we could start with the basics of the blinking light (a standard piece of code) using the onboard LED, then build the flashing light, then cut and paste and build 2, basically following the 3 minute piece we did on Cool Stuff Collective.
The basic presentation I used was this one; it was not designed to be overly pretty, as it was just a catalyst to get things going.
I also did not know what the experience of each of the attendees was likely to be. As it was, we had a great and helpful mix of people: some who knew lots and were very advanced hardware engineers, some traditional IT professionals and programmers, and some very enthusiastic newcomers to the subject (but technically literate).

Arduinointech from Ian Hughes

***update 13:23 14/2 (Just uploading again as it seems the Slideshare conversion repeated some words and removed others!)
The aim of the pitch was to suggest the basics (inspired by the basics in Choi Kwang Do)
I thought most kids would initially want to get into programming because of games. There is an instant disconnect on seeing all the code and the effort needed for an AAA title, which can be quite off-putting.
So I settled on the light switch as a real world blended reality example, layering on that the Arduino is a switch we can control, and that the basics of input, process, output, or sense, decide, respond, are the fundamentals of everything. So if you get a basic piece of the experience dealing with getting some user input, deciding what to do and then making a change with an output, you cover an awful lot of ground very quickly.
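That sense/decide/respond loop is exactly the shape of a minimal Arduino sketch, for example a button driving the onboard LED (the pin numbers are just one possible wiring, not from the starter kit projects):

```cpp
// Sketch of sense / decide / respond: a button (input) lights an LED (output).
const int buttonPin = 2;   // example wiring: pushbutton to digital pin 2
const int ledPin = 13;     // the Arduino's onboard LED

void setup() {
  pinMode(buttonPin, INPUT);
  pinMode(ledPin, OUTPUT);
}

void loop() {
  int pressed = digitalRead(buttonPin);  // sense
  if (pressed == HIGH) {                 // decide
    digitalWrite(ledPin, HIGH);          // respond: light on
  } else {
    digitalWrite(ledPin, LOW);           // respond: light off
  }
}
```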
Very often as techies and engineers we all see the intricate detail, we become very aware of everything that we don’t know, how complex details can be. However if we treat the knowledge as a set of repeating patterns, like a fractal image we can talk about a basic building block without worrying about the complexity of the overall picture. After all you have to start somewhere.
Anyway, a huge thank you to Sarah at Intech for hosting, for getting all the kit, and for asking me to help in the first place 🙂 A huge thank you to the group of Ambassadors that braved the potential snowstorm, dived in, all had a go and got everything working in the few hours we had. It helped to debug what we need to tell students and other ambassadors.

Born to be wild – how to break a guitar

As I may have mentioned before, I like guitar games 🙂 Rocksmith has been a great advance and forms part of many of the talks I give, including the one I will be giving a week today at BCS Hampshire as part of the Animation and Games Development specialist group’s events.
Last week I dived back into Rocksmith as some new DLC arrived in the form of Steppenwolf’s “Born to be Wild”. As I tweeted at the time, this felt like a circle completed. “Born to be Wild” was one of the tunes in the original Quest For Fame guitar game. It featured in an amusing encore where a very gruff, knuckle-dragging cartoon character shouts at you “PLAY STEPPENWOLF!!!!” and you then launch into a tennis racquet strumming rendition. One of those video game phrases that has become part of some of our vocabulary.

There are 3 new songs in the DLC; Born to be Wild is in the middle of this video 🙂
Now of course this is for a real guitar, not plastic buttons or cricket bats. It would appear though that my poor old Fender Squier (a cheap version of a Stratocaster) had had enough. It was starting to wander out of tune after every song and felt a bit funny. Then there was a loud shattering noise and it sort of exploded. After 23 years the inner metal parts seemed to have given up the ghost.
Corroded gubbins on 23 year old cheap strat
The whammy bar nut sheared, but many of the parts holding the strings in place were none too happy either. It was a sad moment, but these things happen. I pondered buying a more expensive replacement guitar, but thought instead I would pop down to Argos (of all places) and get another cheap(ish) Fender. So I came back with this
Shiny new fender starcaster for #rocksmith
Now I know a bad workman always blames his tools, but my old guitar was actually making life a lot more difficult for me. I had forgotten that about 20 years ago (in its 23 year life) the string guide at the top of the neck had broken. I went and got a new piece from a proper guitar shop but it was a bit too large. As I was only tinkering I thought it would be OK, but it meant all the strings were a lot higher from the fret board than they should have been. (It should have been cut or filed down, but I didn’t have anything that would deal with it.) This meant extra pressure and a fraction of a second of extra time to push the strings down.
The new guitar now feels amazing, even for a £99 one. The strings are much lower and chord changes just flow a little better. As it was a similar style of guitar, everything else was pretty much the same. This change was instantly measurable too. As Rocksmith is scoring your playing, there was a noticeable uplift. I still can’t play really properly, but I am getting closer 🙂
So it does pay to have something decent to use as a tool for the job, but equally working with what you have makes you appreciate improvements more.

Under pressure – games and simulations XCOM

This week I have been looking into how we get one of the virtual worlds projects to put some pressure on the people taking part in a simulation, but not make things so over the top that they give up. There are lots of gaming examples for entertainment that place layers of pressure on people, so there is lots of inspiration to draw upon. In particular though, the latest incarnation of XCOM, Enemy Unknown provides a lot of pointers.
XCOM is a turn based strategy game. Non-gamers who may think it is all fast-moving running around and trigger reactions may be surprised to find that sometimes games offer a bit of time and thought. XCOM gets you onto a battlefield with a small squad of soldiers against an enemy alien force of invaders. Different units have different abilities on both sides, and how you use these and how you take your turns is the key. In many ways this is like a game of chess. You can take as long as you like to assess the situation and make each of your turn based allocations of movement.
Just these battles alone provide a degree of stress induction. When you are stressed you make worse decisions, so you start to work out when to act carefully or rashly to deal with an ever changing situation. Probability and chance throw curve balls as you decide to take a percentage shot at an alien, only to be let down by the hidden dice rolling against you. You also have to consider when is the time for a tactical withdrawal. Soldiers that survive missions get promoted and gain extra abilities. Lose a high ranking squad and you are on the back foot with a group of rookies.
This on-the-ground tactical part of the game is the major part of the experience, but there is a higher level strategy, as you have to finance this ongoing war and research new weapons and engineering facilities. There are so many facets to this, and resources are so tight, that you have to make some big decisions that impact the flow of the game. Do you focus on putting up satellites and fighters to deal with incoming ships, direct scientists to research defensive capabilities, or power up weapons for the soldiers? How much do you expand your base, considering the power consumption of new facilities?
The multi faceted nature of what has to be considered is what makes this game work particularly well. You may be in the thick of it in a combat situation, considering the intricate detail of which move or weapon to use but you have in the back of your mind what needs to be done before the next mission and what to focus on there.
As with many games though, the odds are against you; interestingly, the game even has an impossible setting, indicating that the alien invasion will be successful, but how long can you survive the inevitable loss? This harks back to the original arcade game Space Invaders, where you are pretty much never going to win; you run out of money or time, and you always lose.
So the key here is that you need to feel that you can make a difference, but you have to know that there are other forces at work, other decisions that need to be made in and around the specific detail of any event. Any simulation needs to layer the pressures on the participants. That may be actual decisions or levels of detail of information that inform the narrative but aim to overload and confuse just to the point of stress. Too much and there is almost no point proceeding.
Balance in games and simulations is the key. Atmosphere and a story let humans fill in some of their own levels of detail as they engage with it. Whilst there may be an algorithm at work, obscuring it, making it less obvious and throwing in random elements helps a great deal.
XCOM is a great game (as was its 1995 original). Turn based strategy games still work too which is great to see.

A real educational game that rocks

I am typing this with some very sore fingertips on my left hand, as yesterday Rocksmith arrived in the post from Amazon. I know this has been out stateside for a year but it has finally hit these shores. If you have not seen Rocksmith, or been dubious about it, it is a music game where you play along with tunes on a guitar. Hey, that sounds like Rockband/Guitar Hero! Well, this time you get to use your real guitar and play the full notes (hence the sore fingers).

I have dabbled with guitar since I bought a Fender Squier and an amp with my first pay check. I never played anything much, though I know a few chords and the blues scale is always nice to wibble around.
Loving music and not being quite able to match my guitar playing to the music, I sort of lapsed. Music games were a great way to get back into playing something in some way. I wrote about the early, pre-Rockband experiences back in 2007 on eightbar, which then got picked up by the Boston Globe in 2008!
I had noticed that my ability to play rhythm had improved using the basic Rockband instruments. It is why I upgraded to the 102 button guitar, to get closer to a real guitar with a game, and I even got to play a bit on TV on the Cool Stuff Collective.
I recently finished the tour mode of Rockband 3. (I had been playing for ages but not bothered with the tour too much; it was always a party game.)
Feeding Edge is also embedded in Rockband, both as tattoos and as our band’s name and logo, as a bit of fun.
Feeding edge rockband 3 style
Having finished Rockband 3 it seemed time to upgrade to the all new Rocksmith. My Fender was still sitting in the corner and Predlet 1.0 has also started guitar lessons at school.
I plugged the guitar in, with the lead that is jack to USB, into the Xbox, loaded it up and was instantly amazed at how well it worked. You are straight into a tuning section and the guitar was spot on in seconds. Clearly it was picking up the right signals!
Like Rockband/Guitar Hero, the notes drive down towards you. They are coloured based on the string, and a picture of the guitar fret board shows what’s about to be played. When the notes arrive you pluck them.
The really clever part is the auto-adjusting skill level. If you start to get the basic single notes and rhythm it will start to add more. In the career mode it gently eases you through songs, but you start instantly with the Rolling Stones’ Satisfaction, with the iconic sounds of that tune. Predlet 1.0 dived straight in and was really happy to play the song. She already had the guitar basics from her lessons, so knowing strings and frets made it easier, as it did with me.
I carried on the career line for a while, then just dived into playing House of the Rising Sun. This is where I saw it adjust to my level very quickly. It gave a few notes and before I knew it, it was showing the proper chord progression: chords that were stored away in the back of my brain, C, A, E etc. So there I was, playing rhythm guitar on my old electric with a responsive backing track that, if things went slightly wrong, would turn down the level, let me get my act together, and ease back into it again. That makes this the best of all music games.
Now, is it orange peel I rub on my fingertips to harden them? Ouch!