I am not sure how many people remember Rock 'Em Sock 'Em Robots and all their variants. Here is a version of the original advert for them.
There were small figures connected to mechanical rods with thumb-press buttons. You could move them back and forth and side to side, aiming to punch the chin of the other bot and make its head pop up into the air.
I did not have this version, but I did have a human-shaped boxing set. Those figures were more free moving, not stuck in a ring, and had a counter of how many hits they had taken.
All in all they were great fun.
Which is why I know I would love to have a go on Robot Combat League. I just happened to bump into it while channel hopping. I initially thought it was Robot Wars, which had arena combat between very different radio-controlled wheeled vehicles carrying big hammers and circular saws.
It turned out to be something different: a giant game of Rock 'Em Sock 'Em.
In Robot Combat League a team of two people controls each robot. One moves it backwards, forwards and side to side. The other wears a rig that translates their arm movements into the bot's punches.
The movement around is a bit odd, as the robot is not really free standing. The legs are fairly cosmetic, as the bot is held on a large frame. The punching is then based on timing and targeting. Each bot is designed to look different and have a different style to it, but they are all rigged with sensors for pyrotechnics, so lots of sparks fly when a good hit is made. That was all good but could obviously get a bit samey. It then turned out that the robots were actually able to damage one another: one bot swung into the arm of another, broke a hydraulic line, and the robot was disabled, left dripping robot blood onto the arena floor for the cheering and jeering audience.
All very exciting blended reality fighting.
This is of course what happens in the film Real Steel. Which is one potential future of combat to consider.
The fact these are just engineered puppets, just like the toy of the 60s, puts a human element into the fight. It is people wanting to win against other people. I couldn't help but wonder what would happen if we were able to have a fully free-standing robot able to accept all the subtleties of human movement, and how we would be able to use that in Choi Kwang Do as a training aid 🙂 There is one problem to consider, though, in mirroring human movement. If a person makes contact with something or someone during a technique, then that technique, where they land and what comes next all flow from the situation. The return impact is important. Punching into the air means that after contact you and the robot are no longer in sync. Your fist follows through; theirs stops on the armour of the other one. Giving you haptic feedback so you feel what has happened would be a good start, but of course then you will be shoved around, pushed etc., so why have the robot at all 😉
The real excitement will be to have suitable fighting intelligence, one that can free-form fight against another bot. Then of course we find ourselves approaching Asimov's territory and robot rights. Is it OK to watch robots destroy one another, as opposed to remote-control machines?
There is no doubt, though, that this phase of robot combat is an interesting one to watch. I am just off to get the welding kit out 🙂
Or we could just build it in Minecraft – look at this 🙂
Yesterday I made what has become an annual pilgrimage to the NEC in Birmingham for the press and preview day of The Gadget Show Live. Initially we started going to this for The Cool Stuff Collective as a recce for the show. Now, though, it is just out of interest in all things tech.
The show has a lot of televisions and wireless speakers, and a massive Windows stand too. There are also the regulars such as Game, and lots of manufacturers of cases for iPhones etc.
However, there are always some more interesting corners to explore. This year there were more 3D printers than usual, some great real holography and some very cool, very fast RC cars. A quick video, shot and edited solely on the iPhone 5, has some of the highlights.
This 3d printed guitar body was cool.
A water-powered battery
Sphero, a ball that you drive around with your iPad
Some great collectibles
A brand new Star Trek game (which looked fantastic)
The Igloo with a sensor-based gun to give 360-degree gaming (I would like one of these, please!)
Some gadgets to help you fly out of the water and the crazy bouncy spring shoes too.
So a whole bunch of stuff. It only took a couple of hours to see everything, but I got there early and the halls were not as packed as they will be on public days.
Still a lot of interesting stuff even when you live on this sort of information 24/7 🙂
This week has been a very busy one of sharing ideas with all sorts of people in all sorts of places. It started Monday with the BCS Animation and Games specialist group AGM, with a bit of the usual admin, making sure we are all happy with who is on the committee and inviting a new member to the fold. Then I got to do another presentation. As I had already done my blended learning with games one a few weeks before, I talked about how close we might be to a holodeck, mostly based on my article in Flush Magazine on the subject. Afterwards the discussion with some of the game development students turned into a tremendous ideas jam session. It was a very cool moment.
Tuesday I did a repeat visit (after two years) to BCS Birmingham for the evening and gave the ever-evolving talk on getting tech into TV land for kids. I have to de-video the pitch and pop it onto Slideshare, as I use footage from The Cool Stuff Collective and I don't have the media rights to put those bits up online. Something I am always asked about is why? I don't have a good answer, as it is out of my hands.
Wednesday was a quick trip to Silicon Roundabout to talk funding, games and startups. Getting off the tube at Old Street and walking to Ozone for a coffee-based meeting felt like stepping into a very vibrant hub of startup and tech activity. Which it is, of course.
Thursday was an early start and off to London for the BCS members convention. It was good to catch up with a lot of people I have met over the years through this professional body. It was very refreshing to see a definite fresh approach by the BCS with some great presentations, lots of tweeting and some interesting initiatives.
The most important was the work the BCS Academy has done (along with industry) to get Computer Science recognised by the Department for Education as the fourth science in the new syllabus. They really are going to start teaching programming from five years up! Result!
The day had already started well when I was honoured to receive an appreciation award from the BCS for the work I do as chair of Animation and Games. As I had done two talks in the two days before, this was very timely 🙂
Friday was a trip to help on the Imperial College London stand at the massive education science fair The Big Bang Exhibition.
ExCeL was buzzing with thousands of visitors.
There were tech zones, biology zones, 3D printers, Thrust SSC and even an entire maker zone.
The robot Rubik's Cube solver was there in all its glory too.
Space was well represented with real NASA astronauts chatting and drawing quite a crowd.
The Imperial College London stand was based around the medical side of things and on the corner of that stand was a Feeding Edge Ltd production 🙂
There will be a proper post on this but here is a link to the official line on the Multidisciplinary Major Incident Simulator
This Unity3d and Photon Cloud solution (that I wrote the code for) allows staff to practise a kind of air traffic control situation, dealing with a massive influx of patients, having to shuffle beds and decide who goes where.
It was great being able to show this to a younger generation at the exhibition. Many came to the stand because they were interested in medicine. However, a few came over to talk tech 🙂 They wanted to be programmers and build games. We had some good chats about Minecraft too. Many of them appreciated the zombies that we threw into the configurable scenarios 🙂 If just a few per cent of the people who came to the fair get excited about science, it will be worth it. It seemed, though, that there was a new vibrant interest in all the sciences, and things are not as gloomy as they may appear sometimes.
Now it’s time to get my Dobok on and go and celebrate the opening of Hampshire’s first full-time Choi Kwang Do facility, and help get more people interested in the wonders of Choi. Phew!
Props to @andypiper for tweeting this to me today. Whilst I had my virtual self in a Unity3d build, I had not had a surf around for a few days. Andy sent me an article about Myo, a gesture-control armband. It is currently available for preorder, and it appears to be a way of detecting muscle movement in the hand and lower arm to act as a blended reality controller. It will be a simpler version of the technology used in the latest prosthetic arms (I am assuming).
What it does do is free you from a camera-based approach, which means you can gesture anywhere. That becomes relevant if the thing you are controlling is not a computer screen but another active device. Their website shows an ARDrone.
It mixes sensing of electrical muscle signals with general motion sensing, and connects via Bluetooth. It looks like they are providing an API and toolkit for Windows and Mac from day one 🙂
I am very tempted to pre-order, but as we have an impending house (and business) move, keeping track of what I have on long-term order and changing things is enough of a nightmare already. If only I could just wave my arm and solve it all… hey, wait a minute…
Last night many of us watched on Ustream as Sony unveiled the PS4. As I tweeted at the time, it was great to see Ustream in such a high-profile event, as we streamed a Second Life and real-world interview at Wimbledon back in 2007. Just a few of us with portable cameras and a willingness to try something different.
Sony had a pretty impressive stage set-up. There was a very large main screen, but they went for a CAVE-type approach: around the screen, the side walls formed part of the experience. It was a bit like Microsoft’s IllumiRoom?
There was a massive influx of tweets when the first person on said “over the next 2 hours we will…”, along with shouts of “get on with it please!”
Anyway, they did get on with it. The PS4 is pretty much a ramped-up general-purpose computer again. It has significant extra memory and additional processors to do some of the extra tasks. One of the things they mentioned was that it would be able to download patches and updates in the background, i.e. unlike the PS3, which sits and downloads patches without allowing you to do anything else. Of course this really should be what they do. As part of this they announced being able to play games whilst they are downloading. So it is down to the design process to give you the bits you need at the start, not just a loading-screen game as we used to see in the 8-bit days.
The new controller was unveiled; it is pretty much the old controller with a touch pad. A significant feature was a big light on the front. They said this was to help identify one another’s controllers more easily. That was not really the whole story, as it was then announced that a dual-camera sensor would ship with the unit and would be able to see the controllers, i.e. they stuck the glowing PS Move lights into the controllers 🙂 We will have to see what benefit that gives.
Sharing and connecting featured heavily. Being able to access content anywhere and targeted content. “A dynamic preference driven path through the world of content” apparently! The PS Vita was often mentioned too as the perfect way to enjoy streamed content when the main TV could not be used. Remote play.
Then there was cloud streaming. It had to feature, as Sony had bought Gaikai. It is a core part of the social experience too, it would seem. So if something is on the PS Store you can play it straight away, no need to download (or pay), making any game have a free trial period. As network speeds increase this makes a great deal of sense. Of course, it then does make you wonder why you need a massively powerful console device if rendering is done remotely. However, the best of both worlds can’t hurt, can it?
The PS4 controller now gets a share button, a physical share button rather than the web-page ones of Facebook and the like. This lets you share whatever you are doing with whoever you like, in all the various permutations. They have put a dedicated video-compression processor on board that basically encodes the last n minutes (I think n is 15) of whatever you are doing. You hit share and it throws the video to the world. This is to cut out all editing, apparently. I hope it does not remove replay options in games, as being able to recreate a moment from data and view it from anywhere is still great fun. Everyone replaying the same in-game view of the same thing may get a little… samey.
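That “last n minutes” behaviour is essentially a rolling buffer: new footage goes in one end and the oldest falls off the other. Here is a minimal Python sketch of the idea (the buffer size and the string “frames” are just illustrative stand-ins, not anything Sony has described):

```python
from collections import deque

# A rolling buffer that only ever keeps the most recent N items,
# discarding the oldest automatically - the same shape of idea as
# hardware that continuously encodes the last n minutes of gameplay.
BUFFER_SIZE = 4  # stand-in for ~15 minutes of encoded footage

buffer = deque(maxlen=BUFFER_SIZE)

for frame in ["f0", "f1", "f2", "f3", "f4", "f5"]:
    buffer.append(frame)  # once full, old frames fall off the front

# After six "frames" of footage, only the last four remain to share.
print(list(buffer))  # ['f2', 'f3', 'f4', 'f5']
```

The nice property is that hitting “share” at any moment costs nothing extra: the recent past is already sitting in the buffer.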
Other than the PS Vita all the other second screens got a mention. So we will have to see how designers use these.
**Updated I almost forgot (it was a long show): they extended the social aspect and the remote play so that you can hand over the reins of a game to someone somewhere else in the world. It is a bit like when one of the predlets asks me to do a bit of a level for them and hands me a controller. Whether many gamers want to hand over control is another matter. I am reminded of a patent idea @martinjgale and I had years ago at the old company. It was one that detected your level of inability to complete a level, or your frustration, and patched in a remote guide or helper to get you past your moments of despair and enjoy the rest of the game. It was a great idea, but somehow it didn’t get through the review process as they regarded it as silly. Well… 🙂
The main UI looked very Xbox 360, with the current trend for huge great video icons laid out in Mondrian-style patterns, some animated, primarily so that our clumsy touch interfaces can target them. Still, it looked nice.
Then we were treated to a whole host of games, probably in-game footage, though where cut scenes end and game begins is not always clear. We did, though, get straight into first-person shooting and then first-person driving. Great genres, now shinier and smoother. Not a massive leap in game design, though.
So I am sure the PS4 will be great, and I am sure when they finally release it I will have to add to the console collection. The test will be whether it has great content that is worth all this sharing. That of course is down to the games industry. They have a new box ready for massive triple-A titles. As long as they don’t forget the innovation that brings us awesome experiences like Minecraft – which the talking heads video seems to indicate 🙂
Here are the games, though.
The new edition of Flush Magazine has just gone live, and it features a little departure from my normal format: an interview I did with my friend Mike Edwards (@asanyfuleno) about his fascinating project gathering people together to build a revolutionary new electric racing superbike. It starts at page 105 (this is a direct link to it).
There are loads of other interesting articles in the magazine as usual.
I am really pleased we got this done, as we had talked about it for ages. It would be great to get a behind-the-scenes documentary going about this project too. Mike is also taking a crowdsourcing route, as we mention in the article, to get the project stages moving. This is all hot off the presses, so look out for his helpusinnovate.com, and if you happen to be intrigued or can help then I am sure Mike would love to hear from you 🙂
Solving some of the challenges for a race bike will have a direct knock-on for commercial everyday use. I have been pondering an electric car replacement for the shorter journeys when we finally move house in the next few weeks. I was surprised at how few options there were, and there is still a heavy early-adopter premium to consider. One major manufacturer’s car looked less expensive, with its on-the-road price much lower than others; it then turned out you have to rent the battery and pay a monthly fee. So there is definitely a need for some innovation in the electric vehicle business to make it work for people.
Hopefully Mike’s project will make for some very exciting racing, and I look forward to seeing an awesome independent bike take on the factory machines (petrol and electric).
This month marks the fourth year of Feeding Edge Ltd (est. 2009), and it has been interesting to dig into the Twitter archives for my @epredator account and look back at some of the things that have happened.
I thought in celebration I would generate a tag cloud using Processing and WordCram. It took a bit of massaging of memory levels, as there are a lot of words 🙂
This seems very representative of most of the time I have had Feeding Edge up and running. It is, after all, two thirds of the tweets time-wise, and I am probably a more prolific tweeter (and retweeter, as RT is the biggest word 🙂 ) now than ever.
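The frequency-counting step behind a cloud like this is simple enough to sketch. WordCram itself is Processing/Java and does the layout and typography; this Python stand-in (with a made-up three-tweet archive, not my real one) just shows the word-count stage that decides which words get the big type:

```python
import re
from collections import Counter

# Hypothetical tweet archive - in practice this would be the
# downloaded archive for the account, thousands of tweets long.
tweets = [
    "RT @someone great demo of unity3d",
    "Off to film with the predlets",
    "RT @another unity3d and arduino fun",
]

words = Counter()
for tweet in tweets:
    # Lowercase, pull out word-like tokens, skip single characters.
    for word in re.findall(r"[a-z0-9@']+", tweet.lower()):
        if len(word) > 1:
            words[word] += 1

# The most common words get the biggest type in the cloud.
print(words.most_common(3))
```

A cloud generator then maps each count to a font size, which is exactly why RT dominates a retweeter's archive.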
I obviously have talked about the Predlets a lot, and talked to some particular people more than others. It is interesting that in general the words look positive as I try not to be too negative, even back when things were very difficult.
So happy birthday Feeding Edge 🙂 right back to the Unity3d development then 🙂
The STEMnet programme here in the UK gives people who are interested in sharing their expertise in STEM subjects with the next generation some support to do so. Schools ask their local STEMnet coordinator for help, and volunteers step up and go along.
Yesterday at the Intech science centre (our local hub for STEMnet) I helped run an ambassadors’ intro to Arduino (and a little bit of Scratch too).
Intech had bought eight Arduino starter kits. These are fantastic combinations of components and projects that have now become more official within the Arduino community.
The packaging and collection of it is very professional, and whilst still all based on open source it provides way more than I was expecting in the pack. Before seeing the kit I had thought we could start with the basics of the blinking light (a standard piece of code) using the onboard LED, then build the flashing light, then cut and paste and build two, basically following the three-minute piece we did on Cool Stuff Collective.
The basic presentation I used was this one; it was not designed to be overly pretty, as it was just a catalyst to get things going.
I also did not know what the experience of each of the attendees was likely to be. As it was, we had a great and helpful mix of people: some who knew lots and were very advanced hardware engineers, some traditional IT professionals and programmers, and some very enthusiastic newcomers to the subject (but technically literate).
***update 13:23 14/2 (Just uploaded again as it seems the Slideshare conversion repeated some words and removed others!)
The aim of the pitch was to suggest the basics (inspired by the basics in Choi Kwang Do).
I thought most kids would want to get into programming initially because of games. There is an instant disconnect between seeing all the code and the effort needed for an AAA title, which can be quite off-putting.
So I settled on the light switch as a real-world blended reality example, layering on that the Arduino is a switch we can control, but that the basics of input, process, output – or sense, decide, respond – are the fundamentals of everything. So if you get a basic piece of the experience dealing with getting some user input, deciding what to do and then making a change with an output, you cover an awful lot of ground very quickly.
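That sense/decide/respond loop can be written down in a few lines. This is a plain-Python stand-in for the Arduino button-and-LED sketch (no hardware; the functions standing in for digitalRead and digitalWrite are simulated for illustration):

```python
# Simulated light switch: sense an input, decide what to do,
# respond with an output - the input/process/output loop.

def read_button(pressed: bool) -> bool:
    """Sense: stand-in for reading the button pin."""
    return pressed

def decide(button_down: bool, light_on: bool) -> bool:
    """Process: toggle the light each time the button is down."""
    return (not light_on) if button_down else light_on

def set_light(on: bool) -> str:
    """Respond: stand-in for writing to the LED pin."""
    return "LED ON" if on else "LED OFF"

light = False
states = []
for press in [True, False, True]:  # a sequence of button samples
    light = decide(read_button(press), light)
    states.append(set_light(light))

print(states)  # ['LED ON', 'LED ON', 'LED OFF']
```

On the real board the loop body lives in `loop()` and the stand-ins become pin reads and writes, but the shape of the program is identical, which is the point of the teaching pattern.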
Very often as techies and engineers we all see the intricate detail; we become very aware of everything that we don’t know and how complex the details can be. However, if we treat the knowledge as a set of repeating patterns, like a fractal image, we can talk about a basic building block without worrying about the complexity of the overall picture. After all, you have to start somewhere.
Anyway, a huge thank you to Sarah at Intech for hosting, for getting all the kit and for asking me to help in the first place 🙂 A huge thank you too to the group of ambassadors who braved the potential snowstorm, dived in, all had a go and got everything working in the few hours we had. It helped to debug what we need to tell students and other ambassadors.
I tweeted about this the other day, but after it came up in the Q&A session at yesterday’s blended reality pitch I realised I had not put anything here about this interesting device.
The QUMARION is rather like the posable wooden mannequins that artists use to practise drawing figures.
It is instead fully instrumented with sensors to work with a digital description of a human skeleton.
So as you pose the figure, that translates to poses in the 3D modelling package.
A purist 3D designer may regard that as undermining their skills in manipulating and understanding the interface on a 2D screen. However, this came up as an answer to a question about blended reality, as I was talking about how sometimes the technology can get in the way, and other times it disappears and lets us use what we know to enhance an experience.
The QUMARION is rather like using the real guitar in Rocksmith; it may be an appropriate tool for understanding and communicating with an application.
I know that when I use 3D packages there is a barrier in having to deal with a mental translation of a 2D representation. Being able to just pose a physical device and explain what is needed physically would work for me.
A long while ago I was trying to make some tennis animations for a well-known Second Life project. I found myself standing and looking in a mirror, performing the action, then sitting down and making that action work on a very simple digital rig, before tuning it so that it looked better for the screen. I had no motion capture, which would obviously have helped in the first place, but for the extra artistic interpretation and subtle tweaks a hands-on device would also have helped a great deal.
Now this device is only input, as far as I know, so there is an obvious extension in using it as an output device too. If I mocap a move, but then the device can play that back in physical steps and frames, then I could tweak and enhance it. Obviously in games there are some moves that just don’t exist; you can’t get certain flips and jumps happening. You can, however, start with a basis of what you can do.
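Playing a captured move back in steps would amount to interpolating between stored poses. A minimal sketch, assuming each pose is just a list of joint angles (the three-joint rig and the keyframe values here are made up for illustration):

```python
def lerp_pose(pose_a, pose_b, t):
    """Linearly interpolate between two poses, each a list of
    joint angles in degrees, at parameter t in [0, 1]."""
    return [a + (b - a) * t for a, b in zip(pose_a, pose_b)]

# Two hypothetical keyframes for a simple three-joint arm rig.
guard = [10.0, 90.0, 0.0]
punch = [80.0, 20.0, 5.0]

# Step the mannequin through the move frame by frame - each
# frame could be sent to the device as the next physical pose.
frames = [lerp_pose(guard, punch, i / 4) for i in range(5)]
print(frames[2])  # the halfway "pause button" pose: [45.0, 55.0, 2.5]
```

Stepping through `frames` one at a time is exactly the physical pause button idea: the mannequin can hold any intermediate pose for as long as you need to study it.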
Again, of course, this relates to studying the forms in Choi Kwang Do. A physical, but digitally recorded, recreation may help someone even more to understand. Also, a mannequin can be made to hold a position partway through the transition from one move to another that a person bound by the laws of physics cannot. It becomes a physical pause button.
Another extension to the idea addresses the fact that this restricts you to one rig. A component model that lets you build any size or shape of joints to create the armature for any creation would be incredible. Combine that with the ability to 3D print the components in the first place, put them together, have that create the rigged model and then animate away. There are some fantastic opportunities for people to create interesting things as this approach evolves.
I am a big fan of Skylanders and its blending of physical and virtual play, and it seems that they just keep coming up with new things. Last year we had Skylanders Giants, with new larger characters that also lit up (using induction current from the portal that you place them on). This year they have taken the physical side of things a bit further with this.
Swap Force gives the characters interchangeable components. You pick a set of legs and a top and plug them together. This makes for many more combinations and ways to play. It is of course ramping up the character-collectible side of things (as we discussed in Wesley’s podcast) and the “need” to buy more things, but they are toys that kids can play with.
Maybe they are getting one step closer to me being able to print the extra pieces that I win in game?