From ICE to EV – Test driving Nissan Leaf

Just before Christmas my Subaru Impreza had a seemingly catastrophic coolant problem. I arrived at my school governor training (4 miles away) in a cloud of steam from the bonnet. The car needed a low loader to get it home, but all that took a while to arrange, so it was lucky I had the course to attend.
We had been considering a new car. We have a Honda FRV which has 6 seats, loads of room, is automatic etc. Then we have my Scooby, which doesn’t really get used as much. As I work from home almost all the time, most of my journeys are to take @elemming to the train station and pick her up, driving to the supermarket or shops and, most importantly, heading off to Choi. All of which are about 6 mile round trips.
I have an interest in tech of course, so electric vehicles (EVs) always sounded interesting, but there is a problem with making the entire family early adopters just for the sake of early adopting and spending significant money.
We had looked at normal Internal Combustion Engine (ICE) cars. All very nice, but very few were going to come close to my driving experience with the Scooby, so I figured all the new cars would disappoint in some way. I have had the Scooby since before predlet 1.0 was born, so that’s 11 years. It has seen some major changes in life: predlets born, leaving corporate life, moving house, taking up a martial art etc. It has also been very reliable. However, things have to move on.
We took a punt and test drove a Nissan Leaf. This was a particularly good test drive as they gave it to us for 7 days. Having an EV for a week gives you a chance to see if it really does fit in with the family and our needs. It let me do some experiments too.

The first objection most people have to an EV is the range. Unless you buy a £100k Tesla you are going to be getting a car with a 100 mile range, i.e. a round trip to somewhere 50 miles away. When you are used to 250-350 miles in an ICE that seems not great. However, remember I only needed this for the short 6 mile round trips. You can get a lot of those out of 100 miles of battery, plus they end back at home, where a recharge can occur if need be. In addition we still have the petrol Honda FRV for those long journeys. So really the range is only a problem if the other car is in use and one of us needs to do a longer journey. If this situation is going to occur it can be planned for, as many EV suppliers offer petrol hire cars as part of the package. Of course this is assuming you couldn’t charge on a journey. Which you can.
So on these short hops locally an EV is ideal. It has plenty of charge and plenty of scope to just jump in and use it.
One of my concerns, that any car would just not be as entertaining as my Scooby, was kicked right into touch straight away. Even a relatively low end car like the Leaf has a fantastic feel to it. It feels very light, a responsive lightness not a flimsy lightness. It also accelerates. The 0-60 on the Leaf is about 7.5 seconds. My Scooby was about 5.9 when it was new. However… the profile of an EV means it has constant torque through the entire range. There is no specific gearing. Even an ICE automatic changes gear, or has kick down to accelerate. The Leaf just accelerates, whatever speed it is going. With no engine noise, and very little road or air noise, the speedometer is really the only indication of speed. You hit 50, 60 and 70 very quickly on the motorway, but once at them the speeds almost feel the same. This is a good thing: it means cruising at any speed is easy and acceptable. You are not between gears or struggling to justify the 5th gear change.
My drive to Hedge End I did firstly just normally, in regular drive mode, having not topped the car up after the station and school run. So I was on about 85/90% battery, and I very quickly experienced the anxiety that all EV drivers get. In an ICE you have a petrol needle; it’s not very accurate but you know you have a few miles left after any warning light or hitting the red. Up until that point you just drive. Then you maybe have to eke it out a bit more to make it home or to a petrol station. In the EV you have a lot of information, accurate information, telling you battery levels, power usage and estimated range. If you hit the aircon/heating and floor the accelerator on the motorway you see your projected range tumble. The climate control knocks 7 miles off the range. I realised I was burning too much as I reached what would have been a half way point, and headed for home, this time with the eco button and full mode B energy recovery. As I approached home the warning lights started flashing. I only had 15 miles of range left. Panic! The sat nav asked if I needed directions to a charger! PANIC!!!
Of course this is all slightly ridiculous as by now I am 3 miles from home. Even with climate control on full I would make it, but it does play on your brain.
The next day I thought I would try the same journey but top up to 100% at home before I went. Home charging is just plugging into a 13 amp socket (though you can install higher rate, quicker chargers for a few hundred pounds). It means if you go to a friend’s house and need a top up you can (though that’s a bit cheeky; I am not sure we have the social etiquette for that sorted out yet).
This time I drove on the motorway back down to the south coast on full Eco (where the max power is dropped and the effects of the throttle are lessened) and mode B, where more engine braking is applied when you ease off and the battery is charged with that power.

So we have the simplicity of just putting it in drive and pressing the accelerator, but we also have the complexity and thought of energy management and recovery: easing off before coming to a halt to get your green lights lit on the left of what would be a rev counter.
I got to the same point with plenty of return charge but still drove back on Eco. Because of the perceived speed it didn’t seem to matter, in a cruise on the motorway, to have turned the wick down on the car. The energy management and recovery made for a fun game. Something Nissan have spotted, rewarding you with a Christmas tree/lives indicator that gradually fills up as you do good work. It sounds mad but it works.
I managed to max it out on one journey. Yay!

On the second Hedge End trip I was slightly less freaked out by the battery indicator, but when I got to Winchester services, as the car had an Ecotricity card in the sunglasses holder, I thought I would try a rapid top up charge.
There were 2 bays, both without cars in them, so it was a good opportunity to experiment. I held the RFID card up and chose my rapid charge mode. The nozzle comes with the plug; it was a massive device compared to the home charger as it had lots of sliders and locks. Once engaged, charging started.


I was on about 33% and went for a coffee and a comfort break. It took about 15 minutes, and I was not waiting around tutting.
I was impressed that Costa reminded me I was driving a Leaf, with its little Eco xmas tree game to play in the car, by drawing a similar motif on the top of my coffee.

I returned to the car and went to cancel the charge, and it asked me to swipe the card again. This of course makes sense, to stop people unplugging you out of spite. I disconnected and saw I now had 66% charge (64% when I took the photo and faffed around beforehand). Not bad!

So for the 10 miles home I dropped the eco and the full energy recovery. Pulling onto the motorway it was like having a brand new car as it woke up from its space cruising stasis and went full alien monster.
When I got home I had used 20% of the battery already. So it really hits home how to moderate driving when needed, not accelerating quite so hard. Much more so than in a petrol car, where you don’t really think about it much until the needle gets low.
One of the other gadgets on the Leaf is a great set of cameras surrounding the car. The dash display that doubles as the GPS, radio and everything else switches to the cameras automatically for reversing, but can also be switched to camera view when heading forwards at low speed or stationary.
A reversing camera makes a lot of sense, but also the ability for it to generate an apparent birds eye view is fantastic for parking in bays.
Here I am safely parked and stationary with the forward camera working. It is in black and white as it is in night sight mode. You can see in the dark!

The birdseye view can be replaced with a camera on the front left quarter, i.e. the turning blind spot, by pressing a button.
Reversing also provides the angle of trajectory superimposed, Augmented Reality style, on the reversing camera. Here I am parked, but in reverse, on our driveway, showing the obstacles (in colour this time).

We had a top of the range Leaf so it had everything on it: a spoiler with a solar panel to charge the second battery (a regular car battery used for lights etc.), voice control, bluetooth hook up (it understood my iPhone playlists when plugged into the USB), heated seats etc. It had GPS built in and knew where a particular network of chargers was, to navigate to. Full ownership gets you hooked up (along with your charger) to CARWINGS, which is Nissan’s internet of things network for automotive. This lets you control the car settings via an app, asking it to warm up the interior in the morning. Or you can, as I did, just use the timer. When on charge it will draw house electricity to power the AC and warm the car for a set time and temperature. There is of course no point going out and starting the engine, as it doesn’t heat up like an ICE.
I enjoyed looking at the energy information screen quite a lot. It felt like playing Elite Dangerous or Eve Online balancing the power usage.

I was also entertained by the graphic for “don’t put a baby seat in the front”, which was the most extreme one I had seen, the middle picture almost looking like a theme park ride icon.

So it seems that I am quite taken with the Leaf, it has a very particular set of skills 😉
We are going to test drive the BMW i3, though that offers a range extender, a motorbike engine and a 9 litre tank, to give another 50 or so miles. However I am not sure that is worth us getting, as it would almost never be used except in extreme circumstances, when it might be better to hire a car or take the train. Also BMW only seem to offer a 30 minute test drive, and after that, if they think you are worthy, a few hours’ test drive. Our nearest BMW dealer is in Eastleigh too, which is a pain. However BMW, and all car people, you need to ramp up the test-drive time!
Let’s see what happens next 🙂

Happy 2015 – Flush 16 – Are you Santa?

I hope everyone is getting ready for a great 2015 and had a good break (if you got one). For a first post of the year I am going to start early and mention Christmas :). This though is just to let you know that issue 16 of Flush The Fashion magazine is live, and my slightly esoteric “Hello, Are you Santa Claus?” article this issue looks at probability, how that features in quantum theory, and takes a light-hearted look at multiple universe theory.
The whole magazine is cool and interesting as usual from @tweetthefashion. I say this each time but it is still true, the amount of effort and skill that goes into making this magazine is fantastic and it’s an honour to have my content included.
If you look at the embed below you will find me on page 22, but just before that there are some great images of food skateboarding by Benoit Jammes, called Skitchen.
So where else do you get such an eclectic combination of quantum physics and skating fruit?
See what you think.


The direct link to my article to get your brain kicking in for the new year is here
Now would be a good time to kick in with my epredator theme tune to introduce it 🙂
Now to work out what 2015 is going to bring. It looks like there will be a lot of VR and AR with so many headsets and combinations of tech emerging. It is very exciting and it all needs to be powered by virtual world content. So another year of metaverse evangelizing before it becomes the norm then 🙂

Theme tune

A while back I invested in the Unwinnable Kickstarter campaign to create a weekly quirky look at video games. I picked as one of my rewards a personal theme tune composed by George Collazo. I was intrigued at what might come out of such an exercise. Those of us getting a tune done filled in an interesting set of questions. I got a tweet just now from George saying my tune was ready. I clicked play and loved it, then read what George had used as inspiration.
“I tried to combine super spy James bond style coolness with pure Bruce Lee badassey with Enter the Epredator.” Yes, tech and martial arts in an awesome tune!
Thankyou George! I love it.

2.3.4 dum diddle um dee dum dum, dum diddle um de darm.

Primary school computing – Barefoot

The past few years have seen a real recognition that computational thinking needs to be part of everyone’s education from a very young age. The UK national curriculum had some massive changes to introduce this, which is great, but it was not always matched with resources or training to help anyone who was not a techie or programmer understand it, let alone be able to teach it.
I like to make people feel comfortable with technology and explain how it fits into what they know, how it can improve, or threaten, their experience. So I was very happy to see this initiative. Barefoot Computing is an initiative from BCS, The Chartered Institute for IT, supported by Computing at School. It seems to be a complete system of curated content that is not solely about programming or tech.
Instead it looks at how to transition what teachers know and teach already and frame that as the basis of the next steps in teaching computing. It then provides free resources to help teach: lesson plans, Scratch programming assets etc.
It is also not simplistic; it is not a “just code” approach. One of the most important and striking uses of a word as an entire category is “tinkering”. The fact that the lesson plans and resources include specifics about letting go, allowing exploration by the students to see what happens, struck me as very different.
Each of the plans is tagged with specific helpful icons that indicate the point of the lesson. These in their own right are very useful and telling about how this is all structured.
I have captured an image from the site, just as an example to be able to talk through, but go and take a look; there is a free registration. It is worth us all knowing the resource is there, to share with schools and colleagues. Or better still, get one of us STEMnet and Barefoot Computing volunteers to come into school and share it with the teaching staff.
Comp thinking
Here we have the core approaches to computational thinking.
Tinkering, finding out what happens and discovering new things is the first on the list!
Debugging sounds a bit more techie, but it is spotting what’s wrong, if anything, and fixing it.
Persevering. This is not something you usually see in “tech” presentations. Something not working, not operating as you expected, is a normal part of the business. It is an important aspect of life.
Collaborating. Working with others is also essential.
You can see these are not computing specific. They apply to every walk of life and experience. In my martial art, Choi Kwang Do, perseverance is a key principle. We tinker and debug as we learn how to perform our moves, adjust, apply a what-if approach, and finally we work together in a friendly environment. This just happens to be focussed on programming, but it is so much more!
comp concepts
Clearly this next layer looks more like the technology words we use in the industry, but again many of them are applicable in non keyboard related instances. Breaking problems down, spotting patterns, making things repeatable are all key elements.
All these are setting the scene before the specifics that make up computer science. The nuts and bolts of what we do as programmers and architects. Now if you came at computing with this list below it would be a lot more overwhelming, even though each element is relatively simple as a concept.
all
Very often as techies this is what we hit people with. Often in order to help them understand the beauty and elegance in the art of programming, or as a weapon to maintain a technical dominance.
In all these icons though my favourite is the variables one. It specifically shows the word lives and 3 hearts. The number of lives you have in a game is a variable. It goes down (sometimes up if you hit a bonus). For me the variable was the part of the aha moment, when all computing became clear for me.
I started computing at school to work out how Space Invaders worked. We were writing a program in class dealing with electricity bills. It had a loop to recalculate a bill based on different inputs. It was a variable we were setting. It all seemed like a weird mess of symbols until it clicked that the thing we were setting, and how it stayed set or got changed by the code, was the same as the score in Space Invaders! It was also the same as the position of each invader, where the player was etc… Kaboom! A life-changing understanding of something that seems laughably simple when you spot it and then apply the pattern repeatedly to other situations.
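To illustrate the idea (this is not the actual school exercise, just a sketch in C#), the whole concept is tiny: a named value that the code sets and then changes as the game plays out.

```csharp
using System;

class Lives
{
    static void Main()
    {
        // 'lives' is a variable: set once, then changed by the code,
        // just like a score or an invader's position.
        int lives = 3;

        lives = lives - 1; // the player is hit
        lives = lives + 1; // bonus life collected

        Console.WriteLine(lives); // back to 3
    }
}
```

Spot that pattern once and it applies everywhere: the score, each invader's position, the player's position, all just variables being set and changed.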
So, a big thumbs up from me and I will be helping predlets school, and any others that need it with this set of resources as a starting point.

Black Belt experience

Last Sunday, 7th December 2014, I achieved another life ambition and graded to receive my Choi Kwang Do Il Dan (1st Degree) black belt. What is interesting about this is that the ambition and the reason to get a black belt have turned out to be very different in almost every way from my initial thoughts. So this is not a post to say “look at me, aren’t I clever”, which is what many qualifications end up being for. Instead it is about a transformation, a journey and a whole set of other ideas.

As a kid I thought martial arts looked amazingly cool. The ’70s posters of Bruce Lee and the occasional glimpses of stylish fighting on TV certainly left their mark on me. We didn’t really have much in the way of martial arts near my home. I also, and this might sound strange, wanted to avoid learning them because the techniques were so potentially deadly. I never experienced full-on physical bullying for very long at school as I was always willing to fight back if need be. Being relatively mild mannered, it was usually a shock to a potential bully if I retaliated. I thought I might learn things and then turn to the dark side and go and use them, or go all vigilante.
As I got older I then wished I had started younger, and was probably put off by the fact that the later I started, the less high a level I would be able to attain. If something was worth doing I wanted to make sure I was really, really good at it. All very odd reasons and excuses not to do something.
My friend Jamie and I got a self defence book from our local library when we were 8 years old and occasionally practiced what you do if someone comes at you with a bat or a knife. It was all very informal, and a bit scary, but us would-be superheroes had to learn some stuff 🙂
Zoom on a good few years: I was feeling very overweight (and actually was very overweight) and I also wanted predlet 2.0 to get a chance at a martial art that I never had. Something father and son could do together. I thought there would be no chance of me rocking up at a place of fighting in the state I was in, so, strangely, I trained for about a year to get ready to go and do a martial art. That, and the tech I used, features in the article I wrote in Flush The Fashion in 2012.
A leaflet for Choi Kwang Do in predlet 2.0’s school bag got my attention. It was coming up to his 5th birthday and I was intrigued, as Choi Kwang Do looked like something new. I like new!
The trepidation and nervous excitement with which I took myself and predlet 2.0 to our first session is something I will not forget. 40+ years of wanting to try a martial art but not doing so, plus a year of getting in some sort of shape and learning some fight moves so I didn’t get whooped straight away, all bundled together.
Within seconds of walking into South Coast CKD and meeting Sabunim Webster for the first time, along with all the other students and instructors, I realised how completely and utterly incorrect my perceptions of all martial arts had been. Everyone was calm, happy and, above all, friendly. I had nothing to worry about with respect to getting beaten up or schooled in front of predlet 2.0. CKD just isn’t like that. So having stepped into an art with thoughts of gaining some sort of prowess, a degree of domination and aggression, to get something, to win something… all that was suddenly not what I wanted to do, and we were in a place where that was not what it was all about.
This particular AHA! moment is only the second such one I remember. The first was when I suddenly realised what you could do with computer code and how it all worked, from just one little concept. It is when something clicks, and the excitement that there is a world of answers out there waiting to be discovered and visited. That set me on my career. It led to a degree in Information Technology, twenty years in IBM with some amazing people and equally amazing projects and firsts in the world. It still keeps me going today and is the cornerstone of Feeding Edge Ltd.
My CKD aha moment has not had quite so many years to bed in but is unfolding nicely. The learning in Choi has proved to be so much more than just punches and kicks. It has been, and continues to be, constant personal challenges. We all learn how to defend ourselves in what is a very dramatic, fast and potentially deadly way. That, though, is just a small part of the journey and experience. We learn to help our own bodies physically and mentally improve. We have to think faster, think and operate calmly under all sorts of conditions. We learn how to help others learn. Teaching and sharing is done in a completely positive way. We all aim to point out the good things, offer a tweak in technique or attitude, then strengthen with more positive comments. That creates a virtuous circle. It means all the things we learn feel good. Frustration, anger, pressure to conform etc. are all out of the equation. We all push ourselves just a little bit more each time because it is a safe environment to explore what we can do. As there is no competition, there is no need to hunker down into particular ways in order to survive.
The family nature, and subsequent family involvement in Choi (all 4 of us do it now), was a major stabilising factor when we upped and moved house. Whilst we had to say goodbye to South Coast CKD, we got to say hello to Basingstoke CKD and to Master Scrimshaw. (Master is a term used for anyone with a 5th Dan, i.e. 5 levels of black belt, or above.) That family atmosphere and friendliness we experienced on day 1 applies everywhere. So it is always great to meet and train with new people, and you know it is going to be good and welcoming.
As the months pass for each belt rank we all learn a piece of syllabus. Each thing we learn builds on everything that we have learned before. At a grading you show your belt syllabus. You don’t go to a grading unless you do know it. So gradings are not exams; they are not there to trick you, but to offer a place to focus on what you know. Before you know it though, 3 or more years of training, of experiencing the art form, of practice and sharing, lead to a black belt grading. This is where I found myself last Sunday.
I would say that most people, even if they are not into “fighting”, action movies or martial arts, get what a black belt can mean. It is, rather like a university degree, a qualification. It is not obscure, in that clearly it is about physical effort. It is not a quota system either: you do the work, you can achieve a black belt, but you do have to do the work, for real.
I was excited and nervous, but very focussed on getting to the black belt grading. I wanted to make sure I had prepared mentally and physically for it. I received lots of help and focussed training from Master Scrimshaw and my fellow students and assistant instructors. So there we were, lining up, doing the same pledge and principles we do in every class. I was thinking about everything that had led me to this point. Yet I was not thinking about me and what I needed to get out of it. I felt pressure in that I did not want to fail, or mess up, but I felt an overwhelming need to do this for everyone I have trained with, taught and shared time with in Choi. I was expecting to switch to my “performance” persona and brain. The one that I go to on stage or when doing TV. That is a kind of focussed mental state, but directed outwards, towards the subject and the crowd. That, however, did not happen. I wasn’t there to put on a show. Instead I felt something else. It was as if the original aha moment came back. I was very much there and doing the grading, making adjustments when the automatic elements were not tweaked quite right, but I was also off experiencing the potential. I had some time to ponder all this as I was grading with some people who were doing their 2nd and 3rd Dan black belts. This meant we all did my stuff in each section, then I stood down whilst they continued to do the 2nd Dan syllabus, then finally the 3rd Dan was left to do his thing. Keeping warmed up, keeping on task, was tricky because I have tended to always just carry on with things, as we do in class. Short breaks then intense action. Whilst watching them and seeing what was coming in the next few years I got to understand that it was the aha moment and joy of the future journey, of learning more and those constant achievements, that felt so unusual. I then related it back to my passion for tech and for understanding where it fits in the world and where it is going.
I had always thought my tech aha moment was purely in the past, but I realise that it happens to me every day as I explore the future.
When the grading was done and we stood lined up, Master Brophy, who brought CKD to the UK, built it up, and to whom we all owe a lot, explained to us that the black belt is not just a test of memory and physical ability; it is also a test of character and much more. As he explained it, I felt I had a true understanding of his words.
It is my black belt, it has my name on it, but it is not just my qualification to crow about, or to wave to prove my worth (as we have to do with school and university qualifications). It is the sum of the life experiences that have led up to this point; it is the sum of all the effort of all those I have trained with in whatever capacity. It is the start of the next part of an even more interesting journey. It is an indication that anything can be overcome, and that just getting on with it, applying a positive attitude to things and not getting disheartened, just works.
So this is a constant and eternal thankyou to everyone. As we say in CKD, Pil Seung! (certain victory).

Unity 4.6 – A new UI

I have always really enjoyed developing in Unity3D, as you may have noticed from my various posts and projects. Watching it evolve into such a major platform has been really interesting too. Yesterday Unity moved from version 4.5 to 4.6. Now in most systems a minor point release is no big deal. In fact the whole patch and push versioning, with lack of backwards compatibility, that bedevils the tech industry is something I find I have to battle with on a regular basis. A change in version numbers… no thanks. Except this is the exception that proves the rule.
Unity3d is great at allowing a developer to start placing objects in an environment, attaching behaviours and code to those objects. A lot can be done with point and click, drag and drop, but there is also room for real lines of code and proper software engineering. In fact that is essential. I can, however, with ease, make a 3d character animated and moving around, colliding with scenery etc. Up to now, though, if I wanted to use any other form of user input with a Graphical User Interface I had to really go back in time.
GUI creation in 4.5, as in the earlier releases, has been awful. In code, in the OnGUI loop, you had to try and describe layouts and positions where much of what you put was implied. So you have to describe the user interface and its behaviour with code that merges function with display, whilst not being able to see where things are until you actually run it. This is the opposite of most of Unity3d, which lets you see where things are in the development environment both when running and when not.
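A minimal sketch of that old pattern (the class name and pixel values here are just for illustration): the entire menu is re-declared in code every frame, and nothing is visible in the editor until you press play.

```csharp
using UnityEngine;

// Pre-4.6 immediate-mode GUI: the whole menu is redescribed every frame.
public class OldStyleMenu : MonoBehaviour
{
    void OnGUI()
    {
        // Hard-coded pixel rectangles; you only see the result at runtime,
        // and any values tweaked in play mode are lost when you hit stop.
        if (GUI.Button(new Rect(10, 10, 150, 40), "Start"))
            Debug.Log("Start pressed");

        if (GUI.Button(new Rect(10, 60, 150, 40), "Options"))
            Debug.Log("Options pressed");
    }
}
```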
I have lost track of the number of times I have tried a “fancy” layout of buttons, one that starts nicely, with flowing positions based on screen resolution, only to get lost in the code and resort to fixed x,y positions, which are generally hit and miss for the first few times. In Unity3d you can expose parameters to allow compound prefab objects to be configured without code changes. I have ended up with UI helper objects that change my spacing and positions at runtime, so I can tweak the numbers to try and shuffle buttons around. Unity3d is great at letting you alter runtime values, and also great at going back to the originals when you stop running. Unfortunately this works against you on UIs. You have to tweak the values then remember to write them down on a piece of paper before you hit stop, so that you can apply them in the code for next time, once you get it ‘right’.
That may or may not make sense, but take my word for it, it was a right pain. Let alone what you then had to do to try and alter the look of the buttons, sliders etc. with the GUI.Skin, which is like CSS but much, much worse. So all my research projects have had plain buttons. I am not a graphic designer, and I was not easily able to explain the UI process or how to work with designers to make it nicer.
All that is now gone though. The old way still works in 4.6 (which is a relief from a phased improvement position) but a new shiny UI creation tool and framework is now, at last, public. It has long been promised; by long I mean literally years! I am sure when I get to the depths of it there will be a few things that cause grief, but that’s programming, folks!
Now the UI exists as a real type of Unity3d visual object. If you create a layout area you can see it; a button is there all the time. Each of the objects is part of a proper event system, but many of the normal functions are also exposed as parameters. If you want a button to do something when pushed you tell it, in its setup, which function to call.
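A rough sketch of that wiring in code (the class and method names are my own invention): with the 4.6 UI the Button is a scene object, and a script just registers a handler with its onClick event, equivalent to filling in the On Click () list in the Inspector.

```csharp
using UnityEngine;
using UnityEngine.UI;

// 4.6-style UI: the Button lives in the scene; code only wires up behaviour.
public class ContinueButtonWiring : MonoBehaviour
{
    public Button continueButton; // assigned by dragging the button in via the Inspector

    void Start()
    {
        // Same effect as adding an entry to the button's On Click () list.
        continueButton.onClick.AddListener(OnContinue);
    }

    void OnContinue()
    {
        Debug.Log("Continue pressed");
    }
}
```

Either way, the display and layout stay in the editor where you can see them, and the code only carries the behaviour.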
Previous UI buttons were only ever placed on the view rectangle, like a Heads Up Display (HUD). I have often needed things to be interactive elements in a 3d environment but to act like buttons. Again I have worked around this, but now a UI canvas and all its actions, roll overs etc. can be detached from being a HUD and placed into the environment. In my hospital environment this means I can have a better AR style UI at each bed. I have that at the moment but it does not have all the nice feedback and customisation a UI element can have.
UnityScreenSnapz017
The other major change is how the UI elements can be typed and hence changed. Each element can be a regular button, slider etc but they can also be a graphic or even the result of another camera rendering a live texture.
Here is the game view (though not running) of the demo UI. The continue button is highlighted, so on the right are its properties, events etc. The mini menu is showing the choices, just for that button, on how to transition on normal events such as rollover and click. Here it is set to animation, hence it uses Mecanim and can potentially go flying around the screen twirling. Simpler transitions, such as just a colour tint or a sprite swap, can also be defined.

This was possible before, but again it was a lot of messing around. The old system of what to do on a rollover etc. was buried in the skin code or overridden in the onGUI. Now it makes use of the regular Mecanim animation system. This state machine allows definition of all sorts of things, and it is how we make characters jump, dive into rolls, duck etc. It makes sense to have that for more complex UI transitions and states. In multiplayer environments I use a Mecanim state change value to send over the network to make sure any mirrored/ghosted network object acts in the same way. So I guess now I will be able to use that for UIs too, to keep players in sync with certain modal activity.
Anyway, 4.6 and the new UI are obviously a precursor to the much bigger Unity 5.0 coming very soon. However it has dropped at the right time for the next phase of a project, so I get to use it in anger and re-engineer something that just about works into something much better.
As I tweeted, this is a great xmas gift, thank you Unity 🙂

Landing on a Comet ! Wow!

Yesterday I sat glued to the internet video stream from ESA HQ in Germany, tensely waiting to hear, via the orbiter Rosetta, that the lander Philae had made it to the surface. When it did, and the cheers started to ripple around the feed, I had a definite lump in my throat and a tear in my eye. I was watching it with predlet 2.0. He had asked if he could play Minecraft and for once I said no, watch this! Of course, without the history behind it all, it was just a few people sitting looking at a screen, no commentary. So I tried to explain the enormity of what we were waiting for. The fact the probe launched (10 years ago) before he was born is pretty amazing, and the fact that the project started 20 years ago equally so. Though that makes less impact. I think my enthusiastic cheer when it landed probably got it across to him though.
It is truly awe inspiring that people have gathered together across countries to put together a massive project to land a tiny machine on a comet 300 million miles away in order to then perform some experiments and determine what the makeup of the comet is.
It transcends all the ridiculous austerity measures and the greed of those with the world's wealth. This helps the human race and gives some sense of purpose. We need more of this!
This image came from the surface of the comet!
So we watched this on a 70Mbps fibre optic internet stream, live via a web browser on a PS4 games console, whilst also second-screening BBC News on iPlayer on a small touchscreen wifi enabled device (iPhone), whilst also following the tweets from the lander's account too. I was sharing it with my son in person but also with my friends around the world who were also tuned in. 10 years ago, when the probe launched, none of the ways we engaged with the landing existed in any mainstream form.
I am sure there are some naysayers who will argue we need to sort this planet out first, but I think we need to sort this planet out as well! This sort of project provides hope and unity. It inspires generations (just as the moon landings did to me when I was just a toddler). That makes it a great investment. We learn lots about our past and help figure out our future, but also the will and the passion can be turned back to earth, as long as we don't let commercial greed take over.
What an awesome experience, and I hope, despite the 3 bounces, that the lander continues to fulfil its mission. Though as the scientists have been saying when asked what happens now, the answer is: we don't know, that's why we do this sort of science, to find out.

Some fun with space science and physics

Last week on Prof Brian Cox's excellent TV show Human Universe a wonderful real life experiment was shown. It is common at school to learn about gravity, about mass and weight. Often the original concepts from Galileo around dropping objects are covered and considered. We now see footage from space and weightlessness.
However, in this experiment a giant vacuum hall is used to show the difference air resistance makes. With a heavy thing like a bowling ball and a light thing like a bunch of feathers, we are taught from our own observations that the ball will drop faster. In this clip though the air is removed from the environment and we get to see the ball and feathers drop at the same rate. It is spooky, despite knowing the science. You can see from the scientists' reaction how special it is to see this. It is real life magic at work.

I have shown the predlets and they are bemused and intrigued too.
I have always wanted the predlets to feel the wonder of science. We spend a lot of time on screens looking at the virtual. This is all great, but the real thing can be even more amazing.
We just took delivery of a proper telescope this week to be able to check out the night sky. This was in part from being at a friend's and all the kids wanting to see the moon close up with their scope, in part because of the wonder at the size of the universe from playing space games on Oculus Rift like Elite Dangerous, and the influence of TV shows like Human Universe.
I remember looking up in awe at all the constellations and planets, and of course the moon when I was about 7 and armed with a pair of binoculars.
This box arrived from Amazon

After a little bit of assembly it looked like this.

The equatorial mount on it is, if you are used to 3d applications, really interesting. Three axes of movement, but up is not always up :). You get this in 3d when your rotations are relative, not absolute.
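That relative-rotation feeling can be shown with a tiny bit of maths (my own illustration, nothing to do with the telescope's actual geometry): in 3d, the order you apply rotations in changes where you end up pointing, which is exactly why "up" stops being up once the mount is tilted.

```python
import math

def rot_x(deg):
    """Rotation matrix about the x axis."""
    a = math.radians(deg)
    return [[1, 0, 0],
            [0, math.cos(a), -math.sin(a)],
            [0, math.sin(a), math.cos(a)]]

def rot_y(deg):
    """Rotation matrix about the y axis."""
    a = math.radians(deg)
    return [[math.cos(a), 0, math.sin(a)],
            [0, 1, 0],
            [-math.sin(a), 0, math.cos(a)]]

def apply(m, v):
    """Multiply a 3x3 matrix by a 3-vector."""
    return [sum(m[i][j] * v[j] for j in range(3)) for i in range(3)]

up = [0, 0, 1]
# Tilt 90 degrees about x, then 90 degrees about y...
a = apply(rot_y(90), apply(rot_x(90), up))
# ...versus the same two tilts in the other order.
b = apply(rot_x(90), apply(rot_y(90), up))
print([round(c, 6) for c in a])
print([round(c, 6) for c in b])
```

The two results point along completely different axes, so each tilt really is relative to the orientation you are already in.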
There are lots of things to consider setting it up, but just with the basics I pointed it at the moon for us last night. We used the 20mm lens that came with it. We got a pretty full frame moon.
Taking a photo was tricky as I did not have a camera mount, and only the iPhone. However, jiggling it around a bit, I ended up with this

The predlets were somewhat aghast seeing the craters this close. Obviously they have seen photos before, but seeing it yourself certainly does something different. It did when I was a kid, so fingers crossed.
Later I spent some time exploring how to line the thing up, adjust the right bits and pieces and there is lots more to explore 🙂
Also as a side effect, as kids always like to play with the boxes of things, we ended up with a lot of cardboard just on the day school asked for some to be brought in for a makefest that they are doing in years 3/4 🙂
TV/internet/games etc might be considered sedentary and lazy, but they can be used to inspire and explore and raise awareness.

Digital Snow – Flush Magazine – 15

It’s starting to get towards winter and the clocks have fallen back, so this month’s Flush Magazine, amongst other things, is all about snow, mountains, skiing and boarding. So I had something snow related to consider for my 14th article for the magazine. (That now sounds like a lot!)
I thought I should cover a bit of history, as usual. Some tech reminiscing. This time it was Hungry Horace goes skiing. Anyone else remember that? 🙂
These sorts of flashbacks to things that felt special or unusual back then act as a seed to zoom forward. This time I covered how snow has felt on screen in games and films through the years, from the simplistic to the “material point method for snow simulation” CGI research papers created by Disney in making films like Frozen.
I didn’t want to stick to purely virtual snow though, but also cover some other advances and where it could go in the near future to enhance our skiing experience. I did give a mention to SNOW the game, as that looks mighty impressive and has some real potential IMHO.
Thanks to everyone writing for Flush Magazine, it’s great to have so many good articles to nestle in between on a cold winter’s night. Huge thanks to @tweetthefashion for such awesome production and editing and for still wanting me to write these journeys in history based futurism.
The full issue can be found below


I am zooming around on page 116 and the iOS version is here
All the past articles and links to them are here on the writing portfolio page
Right, where are my snow boots!

Oculus Rift – DK2 – Elite Dangerous – Breaking the Flow

Feeding Edge’s Oculus Rift DK2 (Development Kit version 2) arrived last week. I had an original Oculus Rift (pre-Facebook buyout) from the Kickstarter campaign. It is amazing looking back and seeing that I wrote about that in August 2013! How time flies!
The new DK2 is a big leap in technology, though it comes in a much less impressive cardboard box, rather than the foam filled attache case of the original one.

The new unit is smaller and lighter but boasts a huge increase in picture quality. There have been quite a few ergonomic tweaks too.

The placing of the wires to the unit is now in the centre top, and the cable is fed over the top head strap so it doesn’t get in the way as much as the side-dangling cable of the DK1.

The old control box from the DK1 is gone; this used to take all the connections and a power switch etc. It was often the first thing to fall on the floor as the cables got tangled up.
The eye pieces are noticeably bigger too, presumably to get all that extra resolution and field of view in.

Aside from the resolution of the screens, the biggest change is the external tracking of the Rift’s position. DK1 knew you were turning and tipping, but if you stepped or moved side to side it had no idea. DK2 comes with a sensor that needs to be mounted and pointed at the Rift. It can, with software that bothers to use it, tell if you are near or far, right or left, up or down relative to the sensor. It is not a Kinect, but it does give 3d space positioning.
My neighbours may glance up and see this tripod and think it is facing out of the window, when in fact it is pointing at me!

Getting the Rift running on the Mac was a doddle as usual. With no control box, the connections are simply an HDMI for the display and two USB connections, one for the camera device and one for the headset. The power lead runs into a tiny splitter block and a sync lead runs from that to the camera sensor.
It was more tricky getting it to run on Windows 8.1 though. My HP laptop has 2 gfx cards, and trying to persuade it to extend the display and put primary rendered graphics on the Rift is a bit of a pain. I ended up with the primary display turning itself 90 degrees and the secondary Rift flipped 180 degrees in landscape. The demo/tuning application runs in direct mode so you don’t need any of this, but lots of the other things rely on an extended desktop. You can of course make the Rift your primary display, but it is not then delivered as one image but as a bit in the left eye and a bit in the right, so it makes launching things difficult.
However, after a bit of “well, that’s just Windows isn’t it!” I sparked up the Elite Dangerous beta. This had upgraded since the last time. Elite has a display-on-secondary option, and though this was not DK2 direct mode it was possible to make it work.
I was instantly blown away by the improvements to the visual resolution of DK2. My laptop managed to cope with the environment too; it may not be the best gaming rig in the world but it certainly worked. I launched out into space and got chills running down my spine with the sense of being there, as the shadow of the cockpit struts moved across my field of view when I turned away from the star I was docked near. It is really, really good. I mapped a button on the stick to re-centre the Rift view though, as there was a slight drift. However, with two beta things running on Windows 8.1, it was not a huge problem.
I flew around for a while taking in the scenery, then got into a slight dogfight in space. Being able to pull tight turns but look up and backwards to spot the bad guy was really effective. However I did suffer a little from not being able to see my controls. Using a joystick with loads of buttons is great, but I did not realise how much I had relied on seeing button 7! So it did not take my adversary very long to break my ship down. In the previous beta of Elite the canopy would crack, then the ship would blow and you ejected. In the new version, before the cracking I got sparks and smoke filling the cockpit. This was an incredible effect in the Rift. The combination of the sense of being there in the vastness of space, with the confinement of sitting in a smoking capsule, is something I will remember as a key moment in my gaming and VR experiences. The Rift makes sense in games like this as you probably would have a helmet on anyway. That is something a lot of games designers could consider: rather than trying to pretend it’s not something you have on, embrace it and make it part of the game narrative.
I experienced a crash, in typical Windows style, whilst trying to dock at a space station. I can’t tell if it was the Rift, the game or the machine that did it, but at a key moment of concentration and full immersion the shock of being unplugged from the environment, to lose the cockpit and the space around me and have them replaced by half of the Windows desktop in each eye, was horrendous. It seems the Rift and Elite were generating a different type of gaming flow. I was not experiencing flow because the task had become automatic; I was concentrating on landing the ship in the right place. It required a lot of thought. However, mentally I was experiencing sensory flow. I had stopped considering the medium as a barrier, or as anything other than real. For reality to be turned off with no warning created a mental jolt. It’s a very strange feeling and one that took a few moments to analyse and think about. It shows the power of these imagination enhancers and our blended reality. I should be used to this stuff but it still hits me 🙂