future


It does what? Vessyl – Knows what is in your drink?

It is getting really difficult, even as a future tech specialist, to separate science fact from science fiction sometimes. Some of the devices that have appeared over the past few years seemed unlikely only a decade ago. Not impossible, I try to avoid thinking anything is, but unlikely to be in everyone’s hands at a reasonable price. Today I saw a link to this Vessyl. It has a name with the sort of clever internet spelling we have seen since Flickr and the like. Vessyl is a container to drink out of. It is selling as a pre-order for just $99. That’s a lot for a stylishly designed mug for your coffee.
However, it claims, and this is where I feel a little skeptical, that you can pour any drink into it and it will detect everything about that drink, from the amount of caffeine to the calories, the type of drink and even some brand names.
The idea being it will track your hydration, talk to your smartphone and generally be a good admin assistant to your personal wellbeing.
If this is actually true, then it is brilliant. $99 for general purpose deep sensing and network capability, plus something to hold your coffee?
If it is not true then it is also brilliant. The production values of the video, the cool chic nature of the product visualisations and the website, and the product name are great.
If it’s a scam to get $99 out of people, well then the internet is obviously an evil place and we should all not use it 😉
I feel I want to order one just to support such a clever fabrication of a product and company, if that is what it is. Like supporting a clever artist. I feel I want to order one because it might work and hence be fascinating to explore. I don’t see why they are focusing on selling us drinks containers when they have apparently got sensors that could probably win the “tricorder” X Prize, though.

I look forward to watching the reveal either way.

A new wave of tech

The next few months are going to see a second wave of technology that I am very interested in.
The first is the Windows version of Kinect 2.0. This is the consumer-packaged full body sensor that uses the same base as the Xbox One version. The original Kinect from the Xbox 360 was just a USB device, so the maker and hacker community were exploring how to use it on regular machines before the official Windows development version arrived. When the Xbox One launched, its Kinect 2.0 came with a completely different plug, making it impractical to explore. The Xbox One was slated to be a development machine for all (consoles at the moment have specific machine models for developers and a different one for consumers). This development kit has not been forthcoming (it may or may not turn up later). All this means I could not continue the work I wanted to do with the Kinect for Choi Kwang Do.
The new Kinect sensor sees the body better, in particular shoulders, and also weight transfer. So I have had to pre-order a Kinect 2.0 for Windows. It is due in September. I am hoping it will all work with Unity3d again!
I have applied to ID@Xbox, the developer scheme, but a one-person company working on a non-game-related piece of work on the fringes of the games industry probably didn’t flag up as a priority 🙂
Talking of Unity3d, its 4.6 release is rapidly approaching. This upgrade will feature the much-needed GUI changes. I love Unity3d development, but anything with buttons or sliders and GUI layouts is so incredibly awkward it is hard to see how we get anything working. The new GUI system is going to treat GUI objects just like any other object. They will appear in the scene during development. At the moment GUI objects only show up when you run, making getting things lined up and working a bit of a black art. This also sets us up for Unity3d 5.0, a major new release. Let’s just hope all my code still works!
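To give a flavour of the change, here is a minimal sketch of my own (the class and field names are hypothetical, not from any real project): the old OnGUI approach draws its controls in code every frame, so there is nothing in the scene view to line up, whereas the 4.6-style UI elements sit on a Canvas in the scene and just get wired up like any other component.

```csharp
// Hypothetical example contrasting immediate-mode GUI with the Unity 4.6 UI system.
using UnityEngine;
using UnityEngine.UI; // the new 4.6 UI namespace

public class HudExample : MonoBehaviour
{
    // Old style: OnGUI draws the button in code every frame, so nothing exists
    // in the scene view to line up while editing.
    void OnGUI()
    {
        if (GUI.Button(new Rect(10, 10, 140, 30), "Old style button"))
            Debug.Log("Immediate-mode button pressed");
    }

    // New style: the Button and Text are GameObjects on a Canvas in the scene,
    // dragged into these fields in the Inspector like any other object.
    public Button startButton;
    public Text statusLabel;

    void Start()
    {
        startButton.onClick.AddListener(() => statusLabel.text = "Pressed!");
    }
}
```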

The third big piece of kit is the Oculus Rift DK2 (Development Kit 2).
Now owned by Facebook, but still approaching construction in the same way the original Kickstarter did. This is really exciting as the first Oculus Rift was, and still is, a liberating experience. With DK2 the resolution of the screens is much higher, now 960×1080 per eye. It also comes with an additional relative position sensor, one that can, like Kinect, see where the headset is in the space in front of it. It is fully supported by a Unity3d library too (as with the previous version). It is not the only VR headset heading to the market, but it is the one you can get your hands and eyes on.

The future of broadcasting via Super 8, Wimbledon, CKD Road Trip, Twitch and Tango – Flush 13

Another exciting Monday morning as the new edition of Flush The Fashion magazine goes live.
As usual there are loads of great articles and also some great prizes to be won in the magazine. This time Kano Raspberry Pi kits 🙂
On page 74 I have my article “We go live in 3.2.1…” which once again looks great with the layout and the images that @tweetthefashion put together.
I discuss the evolution of home movies from the 70’s and Super 8 film to the amazing changes in being able to broadcast live right now to everyone. On the journey are the experiences of trying to do some early multi-platform broadcasting and virtual worlds elements at Wimbledon (which starts today, so that’s good timing) to the ability to just say “Xbox, broadcast” to share your game play with commentary to anyone, anywhere. Being an emerging tech section I also have to consider the wonderful Project Tango from Google too.
The full magazine is here


The iPad version will be live very soon.
This will get you to my little article

Forza Horizon 2 and Crackdown – E3

Anyone who has been to one of my talks will know I always mention Forza on the Xbox as an example of some interesting ways that games technology is used and can be used to share information. It means I can show some really nice pictures and cover any subject. I am also a big car racing fan. So I was made up when this trailer and the extra information appeared yesterday about Forza Horizon 2.

I also know from experience that it is going to be a brilliant game. Forza 5 is a track-based racing game. The original Horizon let you drive around a free-roam area, racing and discovering things with the same dynamics and setup as Forza 5. Horizon is one of Predlet 1.0’s favourite games too.
It looks as if the Drivatars will be free roaming too, which will be interesting. Collecting that data and maintaining the illusion of real players’ driving styles in such a free environment is quite tricky to pull off.
From the video there is also the statement that you can drive anywhere. In Horizon there were a lot of very strong fences, unlike in Just Cause 2 or Test Drive Unlimited where you could drive across any field and over a cliff. So that looks like it’s been rectified. Free-roaming games are the best sort for a gameplay magpie like myself as I dip in and out of various games. The ones that hold my attention let me go off and make my own experiences work. Driving fits that pretty well though, as the constant flow and balance of cars are a microcosm of free roaming in their own right.
As part of the E3 bonanza Turn 10 also released the “free” patch to Forza 5 to give us (back) the Nürburgring in its entirety

I dived straight in and did an 8 minute lap in a Ferrari 458 and lo and behold I ended up 41st in the world 🙂

That will not last, but it was a pretty good time. Whilst it is a very unforgiving track it can be learned, and the previous Forza on the Xbox 360 had the track. They have definitely upped the detail on it; the video mentions laser scanning and the most accurate model yet. No mean feat. It does suggest it’s a free update, and I applaud them for delivering it, however as there were not very many tracks in Forza 5 compared to Forza 4 the next-gen upgrade felt a little short on content. Now releasing the Ring at E3 to help promote Horizon 2… well it’s almost as if that free content was held back as part of a marketing plan? Of course it was. Still, I am pleased it is here and look forward to September’s release.
Another game I was very, very pleased to see was Crackdown making a return. The original game was one of my all-time favourites (back in 2007 I said I thought it might be good 🙂 ).
Crackdown 2 was OK, but this reboot, with the original creator running it, will hopefully get back to its roots while jazzed up to current tech.
The trailer is CGI but anyone who has played Crackdown will get a little tingle in the back of the neck watching this. “We like Dominoes!”

An interesting game tech workshop in Wales

Last week I took a day out from some rather intense Unity3d development to head off to Bangor in North Wales. My fellow BCS Animation and Games Dev colleague Dr Robert Gittins invited me to keynote at a New Computer Technologies Wales event on Animation and Games 🙂
It is becoming an annual trip to similar events and it was good to catch up with David Burden of Daden Ltd again as we always both seem to be there.
As I figured that many of the people there were going to be into lots of games tech already, I did not do my usual type of presentation, well not all the way through anyway. I decided to help people understand the difference between development in a hosted virtual world like Second Life and developing from scratch with Unity3d. This made sense as we had Unity3d on the agenda and there were also projects from Wales that were SL related, so I thought it a good overall intro.
I have written about the difference before back here in 2010 but I thought I could add a bit extra in explaining it in person and drawing on the current project(s) without sharing too much of things that are customer confidential.

Why SL development is not Unity3d development from Ian Hughes

I did of course start with a bit about Cool Stuff Collective and how we got Unity3d on kids’ TV back on the Halloween 2010 edition. This was the show that moved us from CITV to prime Saturday morning ITV.
I added a big slide of things to consider in development that many non-game developers and IT architects will recognise. Game tech development differs in content from a standard application, but the infrastructure is very similar. The complication is in the “do something here” boxes of game play and the specifics of real-time network interaction between clients, which is different from many client-server type applications (like the web). A rough sketch of that difference is below.
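Just to illustrate that point with a sketch of my own (nothing from the actual slides, and all the names are made up): a web-style service does its work when a request arrives, while a game server runs a continuous tick loop, applying the gameplay logic and pushing the shared state out to every connected client.

```csharp
// Hypothetical console sketch: request/response versus a real-time game tick loop.
using System;
using System.Collections.Generic;
using System.Threading;

class GameServerSketch
{
    // Made-up shared world state: player id -> position along a track.
    static readonly Dictionary<string, float> positions = new Dictionary<string, float>
    {
        { "alice", 0f },
        { "bob", 5f }
    };

    static void Main()
    {
        // Web style: do work only when a client asks a question.
        Console.WriteLine(HandleRequest("alice"));

        // Game style: a fixed tick loop (here 20 ticks per second) that runs the
        // "do something here" gameplay logic and pushes state to every client.
        for (int tick = 0; tick < 60; tick++)
        {
            foreach (var id in new List<string>(positions.Keys))
                positions[id] += 0.1f;              // gameplay update

            BroadcastState(tick);                   // push, don't wait to be asked
            Thread.Sleep(50);                       // 50 ms per tick
        }
    }

    static string HandleRequest(string playerId) =>
        playerId + " is at " + positions[playerId];

    static void BroadcastState(int tick)
    {
        // In a real game this would go out over the network to each client;
        // here we just log every 20th tick.
        if (tick % 20 == 0)
            Console.WriteLine("tick " + tick + ": alice=" + positions["alice"] + " bob=" + positions["bob"]);
    }
}
```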

After that I flipped back from tech to things like Forza 5 and in-game creation of content, Kinect and Choi Kwang Do, Project Spark and of course the Oculus Rift. I was glad I popped that in as it became a theme throughout the pitches and most people mentioned it in some way, shape or form 🙂

It was great to see all the other presentations too. They covered a lot of diverse ground.

Panagiotis Ritsos from Bangor University gave some more updates on the challenges of teaching and rehearsing language interpretation in virtual environments with EVIVA/IVY, the Second Life projects and now the investigations into Unity3d.

Llyr ap Cenydd from Bangor University shared his research on procedural animation and definitely won the prize for the best visuals as he showed his original procedural spider and then his amazing Oculus Rift deep sea experience with procedurally generated animations of dolphins.
Just to help in case this seems like gobbledegook: very often animations have been “recorded”, either by someone or something being filmed in a special way that takes their movements and makes them available digitally as a whole. Procedural generation instead uses sense and response to the environment and the construction of the thing being animated. Things are not recorded but happen in real time, because they have to. An object can be given a push or an impulse to do something, and the rest is discovered by the collection of bits that make up the animated object. It is very cool stuff!
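A tiny sketch of what that means in practice, purely my own illustration rather than Llyr’s code: nothing below is a recorded animation clip, the tail swing is computed every frame from how fast the creature is actually moving, so giving the body a push automatically changes how it animates.

```csharp
// Hypothetical Unity component: no recorded animation clip, the motion is derived
// each frame from how the object is actually moving.
using UnityEngine;

public class ProceduralSwimmer : MonoBehaviour
{
    public Transform tail;          // a child transform acting as a tail joint
    public float swingAngle = 30f;  // maximum tail swing in degrees
    public float swingRate = 4f;    // base oscillation rate

    Vector3 lastPosition;

    void Start()
    {
        lastPosition = transform.position;
    }

    void Update()
    {
        // Sense: measure the actual speed of the body this frame.
        float speed = (transform.position - lastPosition).magnitude / Time.deltaTime;
        lastPosition = transform.position;

        // Respond: faster movement means faster, larger tail swings. Push the
        // body with an impulse elsewhere and the animation changes by itself,
        // nothing was ever "recorded".
        float phase = Time.time * swingRate * (1f + speed);
        tail.localRotation = Quaternion.Euler(0f, Mathf.Sin(phase) * swingAngle, 0f);
    }
}
```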

Just before the lunch break we had Joe Robins from Unity3d, the community evangelist and long-term member of the Unity team, show us some of the new things in Unity 5 and have a general chat about Unity. He also ran a Q&A session later that afternoon. It was very useful as there is always more to learn or figure out.
We all did a bit of a panel, with quite a lot of talk about educating kids in tech and how to just let them get on with it alongside the teachers, rather than waiting for teachers to have to become experienced programmers.
After lunch it was Pikachu time, or Pecha Kucha, whatever it is called 🙂 http://www.pechakucha.org – 20 slides, each shown for 20 seconds, in a fast-fire format. It is really good; it covers lots of ground and raises lots of questions.

David Burden of Daden Ltd went first with VR – the Second Coming of Virtual Worlds, exploring the sudden rise of VR and where it fits in the social adoption and tech adoption curves. A big subject, and of course VR is getting a lot of press just as virtual worlds did. It is all the same, but with different affordances of how to interact. They co-exist.

Andy Fawkes of Bohemia Interactive talked about the Virtual Battlespace – From Computer Game to Simulation. His company has the Arma engine that was originally used for Operation Flashpoint, and now has a spin-off with the cult classic DayZ. He talked about the sorts of simulations in the military space that are already heavily used and how that is only going to increase. An interesting question was raised about the impact of increasingly real simulations; his opinion was that no matter what we do currently, we all still know the difference and that the real effects of war are drastically different. The training is about the procedures to get you through that effectively. There has been concern that drone pilots, who are in effect doing real things via a simulation, are too detached from the impact they have. Head to the office, fly a drone, go home to dinner. A serious but interesting point.

Gaz Thomas of The Game HomePage then gave a sparky talk on How to entertain 100 million people from your home office. Gaz is a budding new game developer. He has made lots of quick-fire games. Not trained as a programmer, he wanted to do something on the web, set up a website, but then started building games as ways to bring people to his site. This led to some very popular games, but he found he was cloned very quickly and now tries to get the mobile and web versions released at the same time. It was very inspirational and great to see such enthusiasm and get-up-and-go.

Ralph Ferneyhough of the newly formed Quantum Soup Studios talked about The New AAA of Development – Agile, Artistic, Autonomous. This was a talk about how being small and willing to try newer things is much more possible, and more needed, than the constant churn in the games industry of the sequel to the sequel of the sequel. The sums of money involved and the sizes of projects lead to stagnation. It was great to hear from someone who has been in the industry for a while branching out from corporate life. A fellow escapee, though from a different industry vertical.

Chris Payne of Games Dev North Wales gave the final talk on Hollywood vs VR: The Challenge Ahead. Chris works in the games industry and for several years has been a virtual camera expert. If you have tried to make cameras work in games, or played one where it was not quite right, you will appreciate this is a very intricate skill. He also makes films and pop videos. It was interesting to hear about the challenges that attempting to do 360 VR films is going to pose for what is a framed 2D medium. Chris showed a multi-camera picture of a sphere with lenses poking out all around it, rather like the Star Wars training drone on the Millennium Falcon that Luke tries his lightsabre against. This new camera shoots in all directions. Chris explained though that it was not possible to build one that was stereoscopic. The type of parallax and offsets that are needed can only really be done post filming. So a lot has to be done to make this giant 360 footage able to be interacted with in a headset like the Rift. However that is just the start of the problems. As he pointed out, the language of cinema, the tricks of the trade, just don’t work when you can look anywhere and see anything. Sets can’t have crew behind the camera as there is no behind the camera. Storytellers have to consider if you are in the scene and hence acknowledged, or a floating observer; focus pulls to gain attention don’t work. Instead game techniques to attract you to the key story elements are needed. Chris proposed that as rendering gets better it is more likely that VR movies are going to be all realtime CGI in order to get around the physical problems of filming. It is a fascinating subject!

So it was well worth the 4am start to drive the 600-mile round trip and be back by 10pm 🙂

Game mechanics are not …

I often get involved in conversations and projects about how to engage people with technology. This is, of course, often using game technology to connect people, to collect and show information or just to hang out and chat or meet. It covers nearly every project I do. It is as applicable to a TV show like the Cool Stuff Collective as it was to corporate life’s use of virtual worlds. There is always a line of confusion though. It is around what enables the technology and what you actually do with it.

“A pack of cards is not a game”

Building with any technology, whether it is pieces of paper with pictures on or with virtual environments across a network does not mean something is a game yet.
The best reference for all this is of course Raph Koster’s A Theory of Fun for Game Design, which I suggest everyone read 🙂

This book sets out a lot of the principles of what makes something interesting to do, to play. How and when repetition and skill gets trumped by boredom and familiarity.
I don’t want to get embroiled in the discussion of what “Gamification” is (too late), but very often it focuses on producing the equivalent of a pack of cards. The mechanics of a potential game. This needs to happen, but at its heart the question should be what are people going to do with the pack of cards.
Cards are a good example as most people have seen or played cards at some point. They are an easy form of technology to understand. There is a mathematical order to them, there is a visual design component. A real pack of cards also has a tactile element. Yet that pack of cards can be used for thousands of different types of game. It is the game mechanic that defines the game and the cards are just a small (but essential) component in the mix. I am sure many people have sat down with their family on holiday with a pack of cards and said “right, what do we play then?”. The cards don’t tell you, they are just a medium in which to operate. So you can’t always expect people to simply play in a freeform environment. Some will of course; some people find enjoyment or mischief in any environment, which leads to types of gameplay.
Card games very often feature chance. Luck plays a big part, but so can skill and experience. If you think of high-stakes poker games, millions of pounds/dollars change hands on the turn of a card. However it is the ability to read people, to bluff and double bluff, as much as the ability to calculate odds, that applies. Yet the same cards can be used to play a simple game of snap. A reaction game: visual matches, fast reactions, maybe a little sleight of hand as the cards are turned over.
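A minimal sketch of my own to make that point concrete (the rules are deliberately trivial): one shared deck (the technology) and two completely different games (the mechanics) built on top of it.

```csharp
// Hypothetical console sketch: one Deck, two different games built on it.
using System;
using System.Collections.Generic;
using System.Linq;

static class Deck
{
    static readonly string[] Suits = { "S", "H", "D", "C" };

    // 52 cards (ranks 2..14, ace high), shuffled.
    public static List<(int rank, string suit)> Shuffled(Random rng) =>
        Enumerable.Range(2, 13)
                  .SelectMany(r => Suits.Select(s => (rank: r, suit: s)))
                  .OrderBy(_ => rng.Next())
                  .ToList();
}

class CardGames
{
    static void Main()
    {
        var rng = new Random();

        // Game 1: "high card" – pure luck, a single rule.
        var deck = Deck.Shuffled(rng);
        var (a, b) = (deck[0], deck[1]);
        Console.WriteLine($"High card: {a.rank}{a.suit} vs {b.rank}{b.suit} -> " +
            (a.rank == b.rank ? "draw" : a.rank > b.rank ? "player 1 wins" : "player 2 wins"));

        // Game 2: "snap" – the same cards, but now the mechanic is spotting matching ranks.
        deck = Deck.Shuffled(rng);
        for (int i = 1; i < deck.Count; i++)
        {
            if (deck[i].rank != deck[i - 1].rank) continue;
            Console.WriteLine($"Snap! {deck[i - 1].rank}{deck[i - 1].suit} and {deck[i].rank}{deck[i].suit}");
            break;
        }
    }
}
```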
Cards also have emergent game play. I know as a kid I used cards to lay out race tracks for Matchbox cars. People attempt balancing games making houses of cards, and of course there are magic tricks.
So approaching a game of any sort we need to not just ask how the technology will work but find a few seeds of an idea of who is interacting with what and why. Is there jeopardy? Is there luck? Is there teamwork? However we should also not restrict ourselves to tightly defining rules, allowing gameplay to be discovered.
Discovery though only really comes with familiarity or the need to break the rules. So any game needs some rules, some structure and an idea, or most people (not all) are going to drop out or be less bothered. Something has to matter. There are lots of triggers to make things matter, but they don’t just happen. Someone has to make something, relate it to something.
Having been a gamer for many years I find I still play games both as a player and as a developer, and I also realise I look at them as a game designer too. Why did that make me feel I needed to continue? What was so cool about that?
We all played games as kids, made up games, set rules and parameters, then found ways around them in a spirit of fun. Tapping into that as adults is much harder. People are less willing to think about why. So like many skills we all have it; it’s a question of unwrapping the gift of play and exploring it 🙂

Google Project Tango

If you have ever looked at anything in the emerging technology world you may well have spotted how Google have gathered some of the best minds in the business to create this fantastic project.
A phone, a handheld object that has a complete sense of the environment it is in. Seeing the depth and the 3d nature of the world, not just GPS which only tells you where you are.
It is going to be fascinating to see what else can be done with this. I am really blown away by how cool this is, and the people involved are names that I have a great deal of respect for too, which makes me even more excited. Johnny Lee did some fantastic work with the Wiimote a few years back (2008!)

Anyway check out Project Tango and ponder the blended reality future this gives us.

Feeding Edge – Five years

It is getting quite exciting now to see that five whole years have passed since I started Feeding Edge Ltd. Soon I will have to say Feeding Edge Est. 2009 in everything. I was looking at what the anniversary gift is for five years and it appears to be wooden things. That was quite funny as really it was the whole Second Life experience that precipitated the formation of this company and stepping out of corporate life. The social implications of being able to communicate and share and work globally clashed with more rigid old-fashioned corporate structures. Some were less able to see the benefit of this evolution than others. I experienced some very strange actions by people worried about their power base across the company. I am constantly amazed by how many of the core of the movement we had back then left the old company in various ways too. The first thing you create in Second Life, BTW, is usually a wooden box. So in a way I pre-empted this 5th anniversary celebration some time ago!
Predicting the future is of course very tricky. The past 5 years have been so diverse and interesting that I can only hope the next 5 are anywhere near that. I also wouldn’t change what happened in the lead up to Feeding Edge. 2006-2009 will remain an amazingly vibrant and overall positive experience.
We just got back from a very exciting snowboarding and skiing holiday as a family. The first time the predlets have experienced that much snow and height. I sing the praises of virtual, of game experiences etc but not at the expense of physical experiences too. It is hard not to come away with a positive outlook on the future when you spend the week in places like this.

The thrill of learning or mastering a new skill, of seeing things in a different way, a mix of adrenalin and gravity all add to the mix. Next time we go I may have to take more hi-tech things with me like GPS goggles. Just need the phone companies to stop charging mad rates for data! 15Mb for £1.99; I turned that on my iPhone for about 2 minutes and it was used up!

I have of course not been just doing one job or thing all these years. Just to reflect (for my own sense of worth and 5 yearly appraisal….)
32+ speaking engagements
39 TV show recordings of Cool Stuff Collective
9 game reviews on Gamepeople
10 articles in Flush magazine
1 academic paper Virtual worlds, augmented reality, blended reality.
Helped form and chair the BCS animation and games development group.
Became a STEMnet ambassador and helped my fellow ambassadors.
Plus a lot of Unity3d and Second Life work, including the ongoing virtual hospital, to keep my coding eye in.

My office has moved too as has the family home. Change being the only constant in this industry. Something that we all have to roll with.
As a family we have also taken up Choi Kwang Do, now well over 2 years into that journey. I am learning how to be a coach and instructor and hopefully applying some of the peaceful attitudes we live by in the art to work and life.
Looking back 5 years the massive explosion in accessible technology has been the most exciting thing. Many of the things that I have been an early adopter of are now part of everyday life for many people. Sharing experiences online being the greatest change I have witnessed. We still have a way to go to help people evolve and learn how to share, or not. Of course I hope that the next wave will help people explore richer environments on top of the text and photos/videos that exist now.
I am also very happy that nearly everyone who connects online is becoming a gamer. Whilst many attitudes to games are still rooted in 3d shooters, we now have the social gamers who see different styles of games and engage. They are not all the best games in the world, we do of course have the downside of being rinsed for money in pay to play. However again the etiquette and acceptable side of the business will settle.
It is not all good news out there though. We still have vast inequality in the world. Huge poverty problems, even in so-called developed countries. The richest are still gathering more and more of the planet’s resources and wealth. In part that is driven by technology: the ability to digitally transfer assets, trade and make deals, many hidden away in systems we have no access to. We have the spying scandals, mass surveillance of the public, because it is so easy to do as we are so connected.
Amongst all this though I am optimistic we will achieve some equality through the sheer abundance of technology and it blending with the human spirit. As we have seen with open source movements people can work together on things that they care about regardless of location on the planet. If that is the case then just as the bad guys, criminals and corrupt can leverage it, so can the forces of good.
Knowing what is out there, what is interesting to experience and use, and sharing it with people who may find an even more positive use for it is my mission. Taking a bite out of technology so you don’t have to is the company tag line. Looking, feeling, sharing and understanding the implications and the possibilities is my job. I don’t see any sign of running out of any of these things just yet!
I also hope that the predlets continue to be as inspired and amazed by it all too. They have an exciting future ahead of them.


The biggest thanks for the support over the past 5 years goes to my wife. She too changed jobs from our old firm, a few years after me. The move was to a much more rewarding yet much more responsible and higher-level position. She commutes every day to London and is the financial stability the family needs amongst the constant churn of my small company life.

So, future, I see you hiding there. Let’s get on with it shall we?

DRM our eyeballs?

I had already posted about how I was a little surprised at the lack of streaming media integration and the inability to get the devices I have to do what I need with the Roku streaming box. I tweeted that it seemed media companies would not be happy until they DRM our eyeballs. A screen is a screen; if you can see it you can pirate it, either passively (pointing a camera at it) or actively by bypassing the DRM/electronics/cable blocks etc. It is all a bit of a waste of everyone’s time and money. Though the hardware providers are raking in the cash, as I found with the apparently non-standard Apple charger cable in my car that the iPhone, after a recent update, now delights in telling me is a non-standard accessory that won’t work properly. It’s a cable! A bit of wire 🙁
Anyway, this is a journey to try and get to legal content, using consumer-facing devices. Not hacking, pirating etc. Though it seems it is much easier to pirate and hack than it is to get to stuff that you are entitled to view or actually own in the first place.
I thought I would try an Apple TV. We have lots of Apple devices, phones, pads and MacBook Pros after all. This was now not just for the early morning attempts to watch BBC Breakfast which started this, but now to push live and on-demand TV into my newly refurbished home office. Working at home all the time I need the background noise of radio or TV. Peace and quiet are not always conducive to getting things done (for me anyway).

We have Sky and a full subscription in the house, and a recent order for multiroom means we now have the online Sky Go Extra with a whole 4 devices able to be registered to view the Sky channels we pay for.
So Apple TV is definitely a lot slicker and less janky than Roku seems to be. To be fair though I already have an Apple account so I didn’t have to do the extra account creation steps. However Roku won’t let you use it unless you register, whereas Apple TV was happy to just start up.
I thought I would try and push iPlayer from my older MacBook Pro that I use mainly for testing. After all you can, according to the blurb, just mirror the screen to Apple TV. Well you can… but not on an older MacBook Pro, for no real reason I can see. I then tried on my newer MacBook Pro and sure enough it worked. The trouble is my main MBP is often doing other things like compiling code and running Unity3d development, so that’s a non-starter.
I tried using iPlayer on the iPhone and just hitting AirPlay in the application. Sure enough it worked 🙂 So I wondered if I could do the same mirroring with my Android Samsung Galaxy tablet. A bit of hunting around and it seems there are applications to beam existing content but nothing to just show the screen as is, and hence bypass any of the apparently inconsistent checks for what you can watch where. This was not overly successful. Certainly not something I bothered chasing up.
I looked around and found that there is a mirroring application that sends the screen to Roku and Apple TV from PC or Mac. This is called AirParrot. I downloaded the trial and used it on the 2009 MBP and sure enough it found the Apple TV and sent the screen to the Apple TV connected to the normal HDMI TV. I had to install an extra driver to get it to send sound as well though. The application is only $9.99 so I also bought a license for my Windows 8 laptop.
The Windows laptop is a gaming-spec one and only a few months old, so I decided to use that rather than the older MBP. This was mainly when I tried to use Sky Go on the MBP. I logged in as a subscriber to Sky and then attempted to AirParrot to the Apple TV. However the machine seemed to be suffering from having to download over the wifi/internet and then send over wifi to the TV, so it all got a bit choppy. On Windows and a more powerful machine it seemed to be not too bad. This was despite the MBP being on the 5GHz network and the Windows machine on 2.4GHz.
I changed Windows to extend the desktop to the extra TV and AirParrot does a perfectly acceptable job of pushing content. It is not perfect HD and I am still doing the sound directly on the PC, but it works. I now have TVCatchup playing Channel 5’s The Wright Stuff as I write this.

I went back to Sky Go and downloaded Iron Man 3 onto the Sky Go application on Windows. The aim being to reduce the double network effect of streaming in and pushing back out again. This worked fine on an AirParrot mirror; again it had a lot of compression going on, though AirParrot has a few quality settings to tinker with later. It is good enough to have the films on in the background.
Then it dawned on me that I really should move the Xbox 360 to the office. The 360 has the Sky application on it and so I would not need to use the Apple TV and this extra hop. Unfortunately I can’t even try this until next month. Why? Well Sky Go claims “Watch TV from Sky however you like on your compatible mobile, tablet or laptop wherever you are in the UK & Ireland with an internet connection”. Which it does, but… With our multiroom subscription we are allowed 4 devices to be registered. As soon as you log into Sky on a laptop or phone and hit play, that device then takes up one of the four slots. Apparently you can only have 2 of those 4 logged on at any one time too. In all my experiments I had gone in to manage the list and I removed 1 device we don’t use, leaving the original iPad and another Windows laptop I experimented with over Christmas. Using the Windows box and the Mac to test Sky Go and mirroring means I had refilled 2 slots. However, the ridiculous terms stop you changing more than 1 device a month. I can see they are worried we might all just keep chopping and changing, but they should look at Apple’s device limits!
I am asking Sky nicely if they will reset my devices now I know the configuration, so I will see what happens there.

So what works after all that?

Live BBC 1 iPlayer
iPhone/iPad AirPlay mirror screen to Apple TV
Web on new MacBook Pro, live BBC 1, mirror screen to Apple TV
Web on old MBP + AirParrot application to Apple TV
Web on Windows + AirParrot application to Apple TV

What doesn’t work
Roku iPlayer application has no live stream
Roku and Twonky Beam won’t mirror BBC 1 live from iPhone/Android
Apple TV has no native iPlayer app

With Roku and Apple TV there are native applications to get at various services. The one I really cannot understand is why Sky’s Now TV is not unified with the Sky Go service. Mind you, with the ludicrous device limit and management setup this probably is not a surprise. I should be able to just log on to the Roku and register that device as one of my 4 Sky Go devices with a normal Sky ID?
***Update 16/1/14 Credit where credit is due for good customer service. Having emailed Sky and explained I had maxed out my 4 slots but would prefer to remove the machines I tested with, they cleared the required slots and I now have the Xbox 360 connected back to Sky, so I no longer need to beam Sky Go from one device to the Apple TV. Next up I am trying XBMC and Play To on Windows, freeing up the Apple TV to replace the Roku, and we will just use the iPhones/iPads for iPlayer. Not totally ideal but certainly getting close to the solution.

Each streaming device should really have TVCatchup or equivalent for all the terrestrial channels. Excuses about rights management just do not hold water as the combinations exist to see these things on a screen, and in particular on “mobile” devices like laptops and phones. However static devices attached to a large non-mobile screen are very limited, unless you spend lots of money doubling up subscriptions or download illegal DRM-free copies of movies. It seems, as per usual, DRM is just costing us all. Until they get to install a restrictor chip in our brains to stop us receiving content that they own, it is all pretty much a waste of time.

Grow me a virtual universe – No Man’s Sky

I was excited to see the new project from Hello Games get a lot of blog and press time over the last few weeks. If you don’t know who Hello Games are, they are an indie development team who brought us the delightful cartoon fun of Joe Danger. A guy on a motorbike leaping and ducking around a number of tricky levels. It is hard not to bump into Joe Danger on every platform now, but I remember their stories of toil and trouble at Develop, where they had houses mortgaged to the hilt in a last-ditch attempt to get the game out. Whilst they have capitalised on and followed up Joe Danger, they have also been hard at work on an incredibly diverse project.
Joe Danger is a cute, side scrolling reaction game. It has a certain something, using an existing format but making it more fun, rather like Angry Birds has done.
The new work though is stunning in its claims. They have built a sandbox universe, not a sandbox city, desert or planet. No, a huge space of planets with detail on those planets. The claim is that everything is procedurally generated. Just to clarify, that means that rather than sitting drawing everything by hand, it is code that generates the places in a defined style. Behaviours of creatures, races, vehicles etc. can also be put into this. Way back, Elite had a procedurally generated universe of planet names. However you did not visit those planets and swim in their oceans. It was also not a shared space with others online. Terrain, and in particular trees, or anything of a more fractal nature, are often procedurally generated, like the famous Barnsley fern. This ramps that up a whole level. Of course in fractal maths complexity is the same at every level of abstraction, so the detail of a leaf is the same as the pattern of the universe. It, rather like probability maths, tends to go against our common sense approach to the world. However when you ponder it a bit longer it does start to make sense.
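To make the Barnsley fern reference concrete, here is a minimal sketch of my own (nothing to do with Hello Games’ actual engine): four affine transformations, chosen at random with fixed probabilities, generate an endlessly detailed fern from a handful of numbers rather than any hand-drawn artwork.

```csharp
// Hypothetical console sketch of the Barnsley fern: four affine maps plus
// fixed probabilities are enough to generate the whole plant, point by point.
using System;

class BarnsleyFern
{
    static void Main()
    {
        var rng = new Random(42);
        double x = 0, y = 0;

        for (int i = 0; i < 100000; i++)
        {
            double r = rng.NextDouble();
            double nx, ny;

            if (r < 0.01)      { nx = 0.0;                  ny = 0.16 * y; }                     // stem
            else if (r < 0.86) { nx = 0.85 * x + 0.04 * y;  ny = -0.04 * x + 0.85 * y + 1.6; }   // ever smaller leaflets
            else if (r < 0.93) { nx = 0.20 * x - 0.26 * y;  ny = 0.23 * x + 0.22 * y + 1.6; }    // largest left leaflet
            else               { nx = -0.15 * x + 0.28 * y; ny = 0.26 * x + 0.24 * y + 0.44; }   // largest right leaflet

            x = nx; y = ny;

            // Each (x, y) is one point of the fern; plot them all to see the plant.
            if (i % 10000 == 0) Console.WriteLine($"{x:F3}, {y:F3}");
        }
    }
}
```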
With this sort of background information the video below gets even more interesting. Procedural generation at an atomic level. A world whose development is influenced by your presence in it. A place that is not technically infinite, but so big it would take more than a lifetime to explore. That is all very exciting stuff!

There is a good article with some more detail and a proper interview, rather than my speculation based on how I would do this, over at Destructoid
I am looking forward to this one though. Well done Hello Games, welcome to the metaverse industry 🙂