For many years I have seen the CES show appear in magazines, then on TV and then of course all over social media. As a long-time tech geek and early adopter I have always wanted to attend, but never been able to. In my corporate days getting approval for a train to London was a chore. As a startup I never had the time or money either, preferring to invest in gadgets like the Oculus Rift or paying for a Unity license so I could build things. We talked about CES on the TV show, and if we had gone to a fourth series a visit was on the cards.
This year, with my industry analyst role in IoT I was able to go. Of course as a work trip it was a bit different to just being able to take the show in.
I had briefing after briefing, with a bit of travel time in between, for the two main days I was there. One thing that is not always obvious is just how big the show is. Firstly there is the Las Vegas Convention Center, with North and South Halls, which is bigger than most airports. It was so big that I only really got to visit the South Halls; the North Hall of cars, a motor show in its own right, eluded me. All the first day's meetings were around the South Halls. Day 2 was at the other areas of the show: the hotels have their own convention centres, and floors and suites get rented out too. The Venetian Sands, Bellagio and Aria all had lots going on, each seemingly as big as any UK show.
Walking 9-10 miles a day at a trade show, and still not seeing everything, gives you an indication of the size.
Again, with all the meetings I pretty much missed the expo floor, but on the day I flew out I had an hour to pop back to the Sands main hall and see some things.
The split across the entire show, from giant corporate powerhouses to tiny startups with a single table, was amazing. I had assumed it was all the former, but the latter is heavily supported, and with Kickstarter and maker culture now mainstream it will continue to be really important.
One thing I was there to see was how much Augmented Reality was taking off, running parallel with the VR wave. There were a lot of glasses, and of course the HoloLens and the industrially focussed DAQRI Smart Helmet. It is still not really there as a consumer focus yet, though the Asus ZenFone AR, powered by Tango and Qualcomm, was announced. It will not be on sale until later in the year (no date given), but it may put true AR into people's hands.
DAQRI Smart Helmet
It was CES’s 50th anniversary, which was fitting given I turn 50 this year too. It may not be the most exciting bucket list tick, but I have already done lots of mine, and need to refresh the list anyway. I am not sure it will include riding in this human-sized quadcopter that you fly with a smartphone though!
I guess we had best experience everything before these guys and their brethren take over.
Still, at least we can 3D print new parts for ourselves.
As you will see in this album, the whole place just becomes a blur of everything looking the same: lights, sound, people, attract loops etc. All very fitting for Vegas.
So that made a whirlwind start to this year. This time last year I had just published Cont3xt and was wondering what the next steps were going to be. This year I have stacks of IoT research and writing work to get on with, a 50th birthday not to get worried about, imminent wisdom tooth removal (yuk) and, all being well, a 2nd Degree/Il Dan black belt test in Choi Kwang Do. So onwards and upwards. Pil Seung!
Controlling robots directly with your own physical motions is something we see a lot in sci-fi. Films like Real Steel have battling bots tearing one another apart in the ring. As a kid I had a version of the battling robot boxing game. Rock ’Em Sock ’Em Robots was the main contender. It had mechanical bots on the end of some push rods, locked into the boxing ring, where thumb presses aimed to hit the other bot square on the chin, lifting his head up for the win. My boxers were free-moving versions, but looked like real people. The trolley wheels on the base of the 12-inch figures meant lots of positioning and jostling as you tried to move freely and get those punches in. Of course all that went out of the window when the digital fighting games arrived and Street Fighter et al. blew away the physical fighting toys.
Now the bots are fighting back, and @FamilyGamerTV has some coverage here of motion-controlled, free-moving fighting bots that are soon to hit the shelves. The video explains it all, but it seems the motion is directional, with hooks and uppercuts, not just the single thumb-pressing motion of the old-fashioned fighting bots.
Big Robots, as they are not so imaginatively called, add a little spice to the radio control market. Now if we could just make them a little bigger we would be ready to fight Godzilla when he walks out of the ocean Pacific Rim style.
I tweeted about this the other day, but after it came up in the Q&A session at yesterday’s blended reality pitch I realized I had not put anything more here about this interesting device.
The QUMARION is rather like the posable wooden mannequins that artists use to practise drawing figures.
It is instead fully instrumented with sensors to work with a digital description of a human skeleton.
So as you pose the figure, that translates into poses in the 3D modelling package.
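To make that translation concrete, here is a minimal sketch of how a sensor-laden mannequin could drive a digital rig. Everything here (the joint list, `read_sensors`, `apply_pose`) is hypothetical, for illustration only; it is not the actual QUMARION SDK.

```python
# Hypothetical sketch: mapping physical joint sensor readings onto a
# digital skeleton. Names and structure are illustrative, not the
# real QUMARION software.

SKELETON = ["neck", "l_shoulder", "l_elbow", "r_shoulder", "r_elbow",
            "l_hip", "l_knee", "r_hip", "r_knee"]

def read_sensors():
    # Stand-in for polling the mannequin: each instrumented joint
    # reports (pitch, yaw, roll) in degrees. A real device would
    # return live angles; here we return a neutral pose.
    return {joint: (0.0, 0.0, 0.0) for joint in SKELETON}

def apply_pose(rig, readings):
    # Copy each physical joint angle straight onto the matching
    # bone of the 3D package's rig.
    for joint, angles in readings.items():
        rig[joint] = angles
    return rig

rig = apply_pose({}, read_sensors())
```

The appeal is that there is no mathematics for the artist at all: the mapping is one-to-one, pose in hand to pose on screen.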
A purist 3D designer may regard that as undermining their skills in manipulating and understanding the interface on a 2D screen. However, this came up as an answer to a question about blended reality, as I was talking about how sometimes the technology can get in the way, while other times it disappears and lets us use what we know to enhance an experience.
The QUMARION is rather like using the real guitar in Rocksmith; it may be an appropriate tool for understanding and communicating with an application.
I know that when I use 3D packages there is a barrier in having to deal with a mental translation of a 2D representation. Being able to just pose a physical device and explain what is needed physically would work for me.
A long while ago I was trying to make some tennis animations for a well-known Second Life project. I found myself standing and looking in a mirror, performing the action, then sitting down and making that action work on a very simple digital rig, before tuning it so that it looked better on screen. I had no motion capture, which would obviously have helped in the first place, but it is for the extra artistic interpretation and subtle tweaks that a hands-on device would have helped a great deal.
Now, this device is input only as far as I know, so there is an obvious extension in using it as an output device too. If I mocap a move, and the device can then play that back in physical steps and frames, I could tweak and enhance it. Obviously in games there are some moves that just don’t exist; you can’t get certain flips and jumps happening. You can, however, start with a basis of what you can do.
Again, of course, this relates to studying the forms in Choi Kwang Do. A physical, but digitally recorded, recreation may help someone even more to understand. Also, a mannequin can be made to hold a position that may be a transition from one move to another that a person bound by the laws of physics cannot. It becomes a physical pause button.
One limitation of the idea is that this restricts you to a single rig. A component model that lets you build any size or shape of joints to create the armature for any creation would be incredible. Combine that with the ability to 3D print the components in the first place, put them together, have that create the rigged model and then animate away. There are some fantastic opportunities for people to create interesting things as this approach evolves.
Happy new year everyone. N.b. I updated the title to use program and programme to avoid confusion, as my not so subtle play on words 🙂 This New Year’s Eve over on CITV was the last in this series of Cool Stuff Collective. It is running for the next month on ITV Player here.
It was notable for a double custard pie on the wall of fame, but aside from that we did a different kind of future tech slot. I was intrigued by what was going to make the edit, as I think we could have done an entire show with all the things Vicky and I went through.
I talked about robotics in 2012 and covered Asimo, Nao and also exoskeletons. This was a general robotics discussion and its spin-offs, plus a small piece about artificial intelligence. We did this as a talking piece, with footage of the various things played in between, as they were going to be impossible to squeeze into the school. In transport I also talked about the Toyota car with a giant OLED customisable surface. All very big things.
I finally got to talk about maker culture too, spinning off from the open source ideas into things like Sugru that let you hack things better, i.e. physical hacking.
I have my headphone mute button that I enhanced a while ago and I demonstrated altering a PS3 controller button using it. It is wonderful stuff and fits nicely into maker culture.
However the closing statements were my wish for the year. To paraphrase….
Education and teaching of computer programming needs to be done properly. We have got stuck with ICT, which is important, but it is about using computers, not building with them. All the gadgets we show and all the blockbuster games we talk about need to be built. So I want everyone to hassle their teachers and ask to be taught computer programming, as it is one of the most important skills that will be needed in the future. Without it we will not be making the next generation of gadgets or fantastic games. (That of course also applies to other business areas, banks, medical establishments etc.) Knowing how to build is important; knowing how things work, not just being a user, is important.
A much-used quote that was a bit too heavy for the show: “if you are not programming you are being programmed”.
I think 2012 will be a year when many more things drive this point forward, building on all the work people have been doing up to now to get this message across. So if we can keep pushing we may have a chance to save the economy and everyone’s future, and have a rewarding career for people too. It’s all good!
Anyway, that’s this series done. I may do a new year retrospective on all the pieces of the puzzle that I have talked about this series, but it’s all on this blog 🙂
It’s also a major part of my speaking engagements for the next few months at least.
More TV for me? Well, I hope so; it is a great honour to be able to share all this stuff on all sorts of media. Let’s see what 2012 brings (BTW I am my own agent at the moment 😉 )
I hope you all had a good Christmas holiday. It’s Friday already and I am only just blogging about the penultimate Cool Stuff Collective, which started airing on Christmas Eve! We ventured into yet more unusual tech, using a telepresence robot as both the prop and the subject of the piece.
So for this one I got to talk to Vicky via Skype, but I was presenting through a controllable camera and screen using a Mantarobot.
This piece of kit contains a netbook, but it is encased in a motorised column that allows control signals to be directed to it via a Skype plugin, so you can drive around (with parking sensors sending info back to the driver). You appear on the screen, and you can also pan and tilt the camera on the device, which gives much more freedom than a regular webcam.
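As a rough sketch of the kind of safety logic such a base needs, the parking sensor readings can veto a remote drive command before the motors act on it. This is entirely my own illustration; it is not the Mantarobot's actual protocol or API.

```python
# Hypothetical sketch of a telepresence drive-command check: the remote
# driver's command is overridden if the parking sensors report an
# obstacle too close ahead. Illustrative only, not the real Mantarobot
# control software.

def safe_command(command, sensor_ranges_cm, min_clearance_cm=30):
    # Stop forward motion when any parking sensor sees an obstacle
    # closer than min_clearance_cm; other commands pass through.
    if command == "forward" and min(sensor_ranges_cm) < min_clearance_cm:
        return "stop"
    return command

# Clear path: the drive command passes through unchanged.
print(safe_command("forward", [120, 95, 80]))   # forward
# Obstacle at 20 cm: forward motion is vetoed.
print(safe_command("forward", [120, 20, 80]))   # stop
```

The point is that the remote driver only ever sends intent; the robot itself stays responsible for not hitting anything, which matters on a slippery school floor.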
Being used to expressing what I need to say through a digital medium, it was an interesting feeling to know there was an actual physical presence, a real avatar, at the other end. The school floor was a bit slippery for the device, so when I was driving around in rehearsals I kept facing not quite the way I wanted to. I am sure with practice that could be rectified.
I am still not a fan of the video conference though. It was made harder by the fact that I was also being filmed from over my shoulder.
It creates a weird sensation even when you know the person you are talking to. It was interesting to contrast this with the experience of talking to Vicky as a Panda avatar, rather than me as a laptop on wheels. The former was more relaxed and normal. It almost felt as if there was an “oh no, what has happened to you?” feel from the telepresence bot. One of the questions from the audience was “could you make it look more real?” I suggested some tinsel.
That said though it has its place, it works and it was using easily available technology. It is not just Mars Rovers that telepresence and ultra remote control can work with.
The show is on the ITV Player for a few weeks, and the last show goes out tomorrow, New Year’s Eve, though for some reason we are only on CITV and have lost the ITV slot. It is a roundup and a bit of a prediction show, so I will see what made the edit and talk about that tomorrow 🙂
The something different in the title refers both to the content of the future tech slot and also to its format again this week. Saturday saw the first showing of this week’s Cool Stuff Collective (available now on the ITV Player), featuring the rather unusual KASPAR robot and one of his creators, Dr Ben Robins.
For this future tech slot I got to be a full-on TV presenter, i.e. I did the intro, the links into and out of the item, and interviewed Dr Robins. It was great fun to do, and as I had a raging sore throat at the time it was good that someone else could do the talking.
However (my TV presenting CV enhancement aside), how we did the item is less important than what the item was about.
KASPAR is part of an EU project looking at the minimal expressions needed in a robot rather than aiming for 100% realism. In a way this lives in and around the uncanny valley rather than trying to cross it.
Dr Robins’ specialism, though, is in using various techniques to help children with varying levels of autism engage with other people, initially mediated through KASPAR.
The robot is not an autonomous one (though it could be); instead it is a mix of responsive sensors, touch-sensitive skin and the skill of the mediator puppeting him in an interaction with someone.
We had a good chat in the “green room” (a.k.a. Staff room) before the piece as often robots are considered as replacing people.
Dr Robins was keen to point out that in the delicate human relationships that need to form in helping those with autism, the robot becomes a conduit for the teacher to engage with the student, because this is about therapy, not technology.
The look of KASPAR seemed to divide both the crew and the audience, from fascination to being a little freaked out by him. Yet it is precisely the open nature and blend of tech and features that appears to endear him to the students Dr Robins works with.
It was a real honour to feature something that is part of some very different research in robotics, aimed at making a real difference to people’s lives.
Dr Robins was also a great sport in the “oooh touchy” storming off that he did, as he is a massive Asimov fan but the script asked him to play otherwise just for the long running joke. He did this with great charm and a smile on his face too. You can read more about KASPAR and the wider project here.
I have been looking at a lot of robot advances and various clever kits. Some inspired by nature, some just plain brute force servo controlled kits.
This, however, is something that is uncannily amazing: a Rubik’s Cube solver robot. Solving a cube is relatively straightforward as a maths puzzle. This one, though, has to visually examine the cube, solve it, and then physically manipulate the cube to carry out the answer.
The Cubestormer II is built, as you can see, using LEGO (four MINDSTORMS NXT kits), with a Samsung Galaxy S II smartphone running the app to analyse the cube and trigger off the solving, i.e. easily accessible technology combined in a very clever way.
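To show why the maths side is the "straightforward" part: a cube is just six faces of nine stickers, a move is a fixed permutation of those stickers, and a solver is a search for the sequence of moves that restores a scrambled state. A tiny sketch of that representation and one face turn (my own illustration, nothing to do with the NXT or phone software):

```python
# Minimal sketch of a Rubik's cube state: six faces, nine stickers
# each, stored row-major. Illustrative only -- not the Cubestormer's
# actual code.

def rotate_face_cw(face):
    # Rotate one 3x3 face a quarter turn clockwise.
    return [face[6], face[3], face[0],
            face[7], face[4], face[1],
            face[8], face[5], face[2]]

def turn_U(cube):
    # A clockwise turn of the top layer: the U face rotates and the
    # top rows of the four side faces cycle F -> L -> B -> R -> F.
    c = {f: list(s) for f, s in cube.items()}
    c["U"] = rotate_face_cw(cube["U"])
    c["F"][0:3] = cube["R"][0:3]
    c["R"][0:3] = cube["B"][0:3]
    c["B"][0:3] = cube["L"][0:3]
    c["L"][0:3] = cube["F"][0:3]
    return c

solved = {f: [f] * 9 for f in "UDFBLR"}
state = solved
for _ in range(4):
    state = turn_U(state)
assert state == solved  # four quarter turns restore the start
```

The hard part the robot adds is everything around this: reading the sticker colours with the phone camera and turning the answer back into physical gripper motions.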