I was pleased to get a chance to head to London last Friday for the launch of Kate Russell’s new book Working the Cloud. Amazingly, this is Kate’s first book, which, given how often she has provided interesting tech news on TV shows like BBC Click, is quite a surprise. Mind you, I have not written a book yet either 😉
So yes, I was at the launch party along with other media people and fellow tweeters, but I wouldn’t write about something unless I thought it had something going for it. This book and its supporting material certainly do.
So what is the book all about? It is a fantastic curated collection of useful tools to help anyone who has not yet fully realised the power of the web in getting things done. Starting a business, communicating with customers, creating interesting content and more: it is all covered.
It is not merely a collection of URLs; each entry explains the benefits of a particular tool or web application. Each section also offers more than one alternative and is peppered with tips. This is because Kate is writing about things she has used, is using, and in some cases has stopped using for a better alternative.
This is really an expression of Maker culture, but aimed at a more mainstream ability to just get on with things. Too often people want to read lots of instructions for each application, making decisions as if there were no way to change afterwards. I think the book points to a spirit of doing. There is a stack of tools out there; they are accessible, often free or very cheap, and there is no excuse for not using them.
To prove the point that there is more than one way to UV unwrap a feline polygon model, Kate has done more than just publish a book. There is the ongoing companion website http://workingthecloud.biz/ and apps for mobile devices too.
The cover of the book itself is an augmented reality trigger for Aurasma’s app too. It is worth a look (and a listen) if you see a physical copy of the book in a shop. The intro (without sound on) can raise a few eyebrows if someone is looking over your shoulder 😉
Kate also did a virtual book signing. Oddly I missed this as I was in a Choi Kwang Do class at the time. However, it was a transmedia gig using a Google Hangout.
There were some tweets with @andypiper about the other first virtual book launch, back in 2006 in Second Life, which I remember very well 🙂
This is of course a different sort of event 🙂 So we are not going to take Kate’s first away from her 😉 Though the next edition of the book must have some 3D virtual world communication in it. Who knows, maybe I can guest write that 🙂
I am looking forward to Kate’s next book, some science fiction for the new Elite game, which she crowdfunded on Kickstarter. It again proves that she is doing all the things in the book, not just writing about them.
This week has been a very busy one of sharing ideas with all sorts of people in all sorts of places. It started Monday with the BCS Animation and Games Specialist Group AGM, with a bit of the usual admin, making sure we are all happy with who is on the committee and inviting a new member to the fold. Then I got to do another presentation. As I had already done my blended learning with games one a few weeks before, I talked about how close we might be to a holodeck, mostly based on my article in Flush Magazine on the subject. Afterwards, though, the discussion with some of the game development students turned into a tremendous ideas jam session. It was a very cool moment.
Tuesday I did a repeat visit (after two years) to BCS Birmingham for the evening and gave the ever-evolving talk on getting tech into TV land for kids. I have to de-video the pitch and pop it onto Slideshare, as I use footage from The Cool Stuff Collective and I don’t have the media rights to put those bits up online. I am always asked why that is, and I don’t have a good answer, as it is out of my hands.
Wednesday was a quick trip to Silicon Roundabout to talk funding, games and startups. Getting off the tube at Old Street and walking to Ozone for a coffee-based meeting felt like stepping into a very vibrant hub of startup and tech activity, which of course it is.
Thursday was an early start and off to London for the BCS members convention. It was good to catch up with a lot of people I have met over the years through this professional body. It was refreshing to see a definite new approach by the BCS, with some great presentations, lots of tweeting and some interesting initiatives.
The most important was the work the BCS Academy has done (along with industry) to get Computer Science recognised by the Department for Education as the fourth science in the new syllabus. They really are going to start teaching programming from age five up! Result!
The day had already started well when I was honoured to receive an appreciation award from the BCS for the work I do as chair of Animation and Games. As I had done two talks in the two days before, this was very timely 🙂
Friday was a trip to help on the Imperial College London stand at the massive education science fair The Big Bang Exhibition.
ExCeL was buzzing with thousands of visitors.
There were tech zones, biology zones, 3D printers, Thrust SSC and even an entire maker zone.
The robot Rubik’s Cube solver was there in all its glory too.
Space was well represented with real NASA astronauts chatting and drawing quite a crowd.
The Imperial College London stand was based around the medical side of things and on the corner of that stand was a Feeding Edge Ltd production 🙂
There will be a proper post on this, but here is a link to the official line on the Multidisciplinary Major Incident Simulator.
This Unity3D and Photon Cloud solution (that I wrote the code for) allows staff to practise a kind of air traffic control situation: dealing with a massive influx of patients, having to shuffle beds and decide who goes where.
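The heart of that “air traffic control for beds” idea can be sketched in a few lines. This is purely an illustrative Python sketch under names I have invented (Patient, Ward and so on), not the actual Unity3D/Photon code from the simulator:

```python
import heapq
from dataclasses import dataclass
from itertools import count

@dataclass
class Patient:
    name: str
    severity: int  # higher = more urgent

class Ward:
    """Toy model of assigning an influx of patients to limited beds."""

    def __init__(self, beds):
        self.free_beds = list(beds)
        self.waiting = []      # heap ordered by -severity (most urgent first)
        self._order = count()  # tie-breaker so equal severities stay in arrival order

    def admit(self, patient):
        heapq.heappush(self.waiting, (-patient.severity, next(self._order), patient))

    def assign_next(self):
        """Give the most urgent waiting patient the next free bed, if any."""
        if not self.waiting or not self.free_beds:
            return None
        _, _, patient = heapq.heappop(self.waiting)
        bed = self.free_beds.pop(0)
        return (patient.name, bed)

ward = Ward(beds=["A1", "A2"])
ward.admit(Patient("minor cut", severity=1))
ward.admit(Patient("major trauma", severity=9))
print(ward.assign_next())  # → ('major trauma', 'A1'): most urgent gets a bed first
```

The real simulator adds configurable scenarios (zombies included) and multiplayer state shared over Photon, but the underlying juggling problem is this kind of priority queue.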
It was great being able to show this to a younger generation at the exhibition. Many came to the stand because they were interested in medicine, but a few came over to talk tech 🙂 They wanted to be programmers and build games. We had some good chats about Minecraft too. Many of them appreciated the zombies that we threw into the configurable scenarios too 🙂 If just a few per cent of the people who came to the fair get excited about science, it will be worth it. It seemed there was a new vibrant interest in all the sciences, and things are not as gloomy as they may sometimes appear.
Now it’s time to get my Dobok on and go and celebrate the opening of Hampshire’s first full-time Choi Kwang Do facility and help get more people interested in the wonders of Choi. Phew!
Yes, I know some of us of a certain web age still think CGI is “Common Gateway Interface”, which gets us thinking in Perl and about POST and GET, but CGI is now commonly known as Computer-Generated Imagery. Things have certainly changed over the last few years with respect to what can actually be generated and just how good it has got.
Last night I saw the new Galaxy chocolate ad. Normally if there is some CGI my brain goes into “how did they do that” mode, travelling around the uncanny valley. This ad, however, I watched and thought, wow, that actress looks just like Audrey Hepburn. A quick google revealed that it was Audrey Hepburn, the CGI version. Watching it again I can see that it is CGI, though not all the time.
The other ad doing the rounds that is obviously CGI (unless horses really can moonwalk) is 3’s dancing pony. Strangely it is the real-life bits that jar with me on this one. The close-up on what are model hooves feels out of place compared to the smoothness of the rest of it.
It is nice to see 3 offering a pony dance mixer as a YouTube application. It is a lot more Monty Python than slick CGI, but it’s worth a few moments to have a go 🙂 Mine is here
What I saw today, though, whilst researching more about holodecks, was this great twist on the sort of building projection projects that animate entire cityscapes. Here the projection canvas is a person. This is really brilliant, I think; lots of the images certainly stick with you after watching it. It works by scanning the person (who then sits very still); the scan informs the projector how to deliver and map the textures onto the surface, the same way designers use a 3D application to create a UV map of the textures and even baked lighting.
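For anyone curious what a UV map actually is, here is a toy example: a deliberately crude planar projection in Python (the function name is mine, for illustration). Real unwrapping tools, and a body-scan rig, are far cleverer, but the principle of giving every surface point a 2D texture address is the same:

```python
def planar_uv_map(vertices):
    """Project 3D vertices straight down one axis into 2D UV coordinates.

    A crude planar projection: drop the z component and normalise x and y
    into the 0..1 UV square, so each vertex gets a 2D texture address.
    """
    xs = [v[0] for v in vertices]
    ys = [v[1] for v in vertices]
    min_x, min_y = min(xs), min(ys)
    span_x = (max(xs) - min_x) or 1.0  # avoid dividing by zero on flat spans
    span_y = (max(ys) - min_y) or 1.0
    return [((x - min_x) / span_x, (y - min_y) / span_y) for x, y, _ in vertices]

# A tilted quad in 3D maps onto the four corners of the texture square.
quad = [(0, 0, 5), (2, 0, 5), (2, 1, 4), (0, 1, 4)]
print(planar_uv_map(quad))  # → [(0.0, 0.0), (1.0, 0.0), (1.0, 1.0), (0.0, 1.0)]
```

In the projection-mapping case the mapping runs the other way, from texture to body, but it is the same correspondence the scan is there to establish.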
Props to @andypiper for tweeting this to me today. Whilst I had my virtual self in a Unity3D build, I had not had a surf around for a few days. Andy sent me an article about Myo, a gesture control armband. It is currently available for preorder, and it appears to work by detecting muscle movement in the hand and lower arm to act as a blended reality controller. It will be a simpler version of the technology used in the latest prosthetic arms (I am assuming).
What it does do is free you from a camera-based approach, which means you can gesture anywhere. That becomes relevant if the thing you are controlling is not a computer screen but another active device. Their website shows an AR.Drone.
It mixes sensing of electrical muscle signals with general motion sensing, and connects via Bluetooth. It looks like they are providing an API and toolkit for Windows and Mac from day one 🙂
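I have not seen the actual toolkit, but this sort of SDK is usually event-driven: you register a callback for a recognised pose and the device fires it. A hypothetical Python sketch (every name here is invented for illustration; this is not the real Myo API):

```python
# Hypothetical sketch of an event-driven gesture SDK. All names
# (ArmBand, on_pose, feed) are invented for illustration only.
class ArmBand:
    def __init__(self):
        self._handlers = {}

    def on_pose(self, pose_name):
        """Decorator: register a callback to run when a named pose is detected."""
        def register(fn):
            self._handlers[pose_name] = fn
            return fn
        return register

    def feed(self, pose_name):
        """Pretend the EMG classifier has just recognised this pose."""
        handler = self._handlers.get(pose_name)
        if handler:
            handler()

band = ArmBand()
events = []

@band.on_pose("fist")
def take_off():
    events.append("drone: take off")  # e.g. send a command to the drone

@band.on_pose("wave_out")
def land():
    events.append("drone: land")

band.feed("fist")
band.feed("wave_out")
print(events)  # → ['drone: take off', 'drone: land']
```

The appeal is exactly what the video shows: the "camera" is your own forearm, so the same two callbacks could fly a drone in a field just as easily as a character on a screen.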
I am very tempted to pre-order, but as we have an impending house (and business) move, keeping track of what I have on long-term order and changing things is enough of a nightmare already. If only I could just wave my arm and solve it all… hey wait a minute…..