Happy 6th Birthday Feeding Edge Ltd

It is now 6 years ago this month that I started Feeding Edge Ltd! 6, six, VI, Tasset, 3×2 etc etc. However you do the numbers, the time has flown by. I like to look back at all the things that would not have happened, or would have been very unlikely, if I had not left my then job of 20 years back in 2009. Of course I still do what I did then, with a lot more freedom to operate and explore. It has downsides too. A corporate guaranteed regular high end salary pays for a lot of things, though you have no time to do them. A larger company has people able to cover for you, to add to projects and to find other interesting projects. It does of course also have the chance for middle management to act in odd ways and for project scopes to be messed about by sales people.

So as a single person company I have to tread a balance between not over tendering or getting involved in too many projects, but also laying the foundations and seeds for other longer term chances. It is, as the photo above of my WWE wrestler shows, a fight, with the chance of some scars 🙂
Right at the moment I am waiting on a multi year development project to kick off. Unity3d and lots of other stuff; more of this sort of thing. However, with external organisations interacting, and me being at the end of the chain, the process has dragged on since before Christmas. It is my decision to stay with it, but commercially in the short term it is painful. It is a gamble, as I could be spending time waiting for something that doesn’t happen. The previous phase of the project meant I had to technically leave Feeding Edge for a few months as the project did not accommodate “sub contracting”. This ended up just costing me personally, as I got enrolled in a pension for a few months and had left by the time I got any paperwork, so never had a chance to reject the pension. Just one of those things.
However the worst feeling was not working for Feeding Edge. Really it is still my company, but officially, on the tax books etc., I was employed elsewhere on a sort of zero hours contract. I don’t think I will be doing that again. It just means more paperwork and less income 🙂
So while I am waiting I try not to go off selling my services or getting wrapped up in projects. I know full well that if I do I will end up not being able to deliver either. There is no such thing as a short term piece of work for me usually, as I get embroiled in the whole project very quickly. Quick contract coding, no strings attached, is not really my style.
So downtime between paying work is spent exploring and learning. I try to do something interesting and learn something every day.
I am of course available to do other things like TV, Radio, Writing etc. In many ways they make the ideal counterbalance for development work. When you are building things you tend to have to block time out to code. I gathered together all my Flush Magazine articles into a portfolio book recently. It was really interesting to see it as an entire body of work.

That is only from a few years of writing, so maybe, just maybe, I need to write a book after all? A different book to the one I expected to be writing about leaving corporate life 6 years ago and telling all, and one that instead focuses on the style of writing I have arrived at with Flush, looking at past patterns and experiences and projecting them forward as a sort of Historic Futurism.
After an interesting Twitter discussion with friends old and new today I was still as evangelical as ever about the metaverse and the exciting times ahead, as VR and AR headsets demand virtual environments and it starts to become even more normal to expect that sort of interaction. It’s not going away, and neither am I!
Right, time to go and get ready to learn and teach my chosen martial art, Choi Kwang Do. For over 3 years now this has been a major part of my life and my family’s life. I have to share this picture as Sabunim Close came to visit Basingstoke CKD this week. He, at 17, was accepted to a Korean university to pursue a degree in Choi Kwang Do. He upped sticks and went there knowing full well it was the right path for him, as the rest of us did 🙂 It was great to see him return on holiday and come and visit us all. Pil Seung (certain victory)

CKD has provided the social contact that working at home and in isolation (despite lots of digital contacts) lacks, as well as reinforcing my own guiding principles of trying to do the right thing and help people.
So, on top of spending more quality family time, the freedom to operate, the chance to have done 3 series of kids TV, presenting to lots of groups of interesting people, exploring startup life, learning, exploring and building virtual environments, I have achieved a lifetime ambition of a black belt (a life changing experience) and I also get to teach and mentor. Not a bad move was it in 2009 🙂

MergeVR – a bit of HoloLens but now

If you are getting excited and interested, or just puzzling over what is going on with the Microsoft announcement about HoloLens, and can’t wait the months/years before it comes to market, then there are some other options, very real, very now.
Just before Christmas I was very kindly sent a prototype of a new headset unit that uses an existing smartphone as its screen. It is called MergeVR. The first one like this we saw was the almost satirical take on Oculus Rift that Google took with Google Cardboard, a fold up box that let you strap your Android to your face.

MergeVR is made of very soft, comfortable spongy material. Inside are two spherical lenses that can be slid in and out laterally to adjust to the divergence of your eyes and get a comfortable feel.
Rather like the AntVR I wrote about last time, this uses the principle of one screen split into two views. The MergeVR uses your smartphone as the screen, and it slides comfortably into the spongy material at the front.
Using an existing device has its obvious advantages. The smartphones already have direction sensors in them, and screens designed to be looked at close up.
MergeVR is not just about the 3d experience of Virtual Reality (one where the entire view is computer generated). It is, by its very name Merge, about augmented reality. In this case it is an augmented reality taking your direct view of the world and adding data and visuals to it. This is known as a magic lens. You look through the magic lens and see things you would not normally be able to see. As opposed to a magic mirror, where you look at a fixed TV screen to see the effects of a camera view merging with the real world.

The iPhone (in my case) camera has a slot to see through in the MergeVR. This makes it very different from some of the other Phone On Face (POF – made up acronym) devices. The extra free device I got with the AntVR, the TAW, is one of these non pass through POFs. It is a holder and lenses, with a folding mechanism that adjusts to hold the phone in place. With no pass through it is just for watching 3d content.


AntVR TAW
Okay, so the MergeVR is able to let you use the camera, see the world, and then watch the screen close up without holding anything. The lenses make your left eye look at the right half and the right eye at the left half. One of the demo applications is instantly effective and has a wow factor. Using a marker based approach, a dinosaur is rendered in 3d on the marker. Marker based AR is not new, neither is iPhone AR, but the stereoscopic hands free approach, where the rest of the world is effectively blinkered for you, adds an extra level of confusion for the brain. Normally if you hold a phone up to a picture marker, the code will spot the marker, the orientation of the marker and its relative position in the view, then render the 3d model on top. So if you, or the marker, moves, the model is moved too. When holding the iPhone up you can of course still see around it, rather like holding up a magnifying glass (magic lens remember). When you POF, though, your only view of the actual world is the camera view of the phone. So when you see something added and you move your body around, it is there in your view. It is only the slight lag, and the fact the screen is clearly not the same resolution or lighting as the real world, that causes you to not believe it totally.
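That "spot the marker, get its pose, render the model on top" loop can be sketched in a few lines. This is plain Python with made-up function names, not the API of any real AR toolkit; real marker detection (as libraries like ARToolKit do it) is stubbed out here.

```python
# Sketch of the marker-based AR loop described above: detect the marker's
# pose relative to the camera, then transform the model's points by that
# same pose so the model "sticks" to the marker as you or the marker move.
import math

def detect_marker(frame):
    """Stub: a real detector analyses the camera frame and returns the
    marker's position and orientation. Here we just invent a pose."""
    return {"position": (0.0, 0.0, 2.0), "yaw": math.pi / 2}

def place_model(model_points, pose):
    """Rotate the model about the y axis by the marker's yaw, then
    translate it to the marker's position."""
    c, s = math.cos(pose["yaw"]), math.sin(pose["yaw"])
    px, py, pz = pose["position"]
    placed = []
    for x, y, z in model_points:
        rx, rz = c * x + s * z, -s * x + c * z  # y-axis rotation
        placed.append((rx + px, y + py, rz + pz))
    return placed

# Each camera frame: detect, then re-place the model at the new pose.
pose = detect_marker(frame=None)
placed = place_model([(1.0, 0.0, 0.0)], pose)
print(placed)
```

Because this runs every frame, moving the phone (and so changing the detected pose) moves the rendered model to match, which is exactly what makes the effect feel anchored to the real world.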
The recently previewed Microsoft HoloLens and the yet to be seen Google funded Magic Leap are a next step, removing the screen. They let you see the real world, albeit through some panes of glass, and then use projection tricks near to the eye, probably very similar to Pepper’s ghost, to adjust what you see and how it is shaded, coloured etc. based on a deep sensing of the room and environment. It is markerless, room aware, blended reality, using the physical and the digital.

Back to the MergeVR. It also comes with a bluetooth controller for the phone, a small hand held device to let you talk to the phone. Obviously the touch screen when in POF mode means you can’t press any buttons 🙂 Many AR apps and examples like the DinoAR simply use your head movements and the sensors in the phone to determine what is going on. Other things though will need some form of user input. As the phone can see, it can see hands, but without a Leap Motion controller or a Kinect to sense the body, some simpler mechanism can be employed.
However, this is where MergeVR gets much more exciting and useful for any of us techies and metaverse people. The labs are not just thinking about the POF container but the content too. A Unity3d package is being worked on. This provides camera prefabs (rather like the Oculus Rift one) that split the Unity3d view into a stereo camera at runtime, with the right shape, size, perspective etc. for the MergeVR view. It provides extra access to the bluetooth controller inputs too.
This means you can quickly build MergeVR 3d environments and deploy to the iPhone (or Droid). Combine this with some of the AR toolkits and you can make lots of very interesting applications, or simply add 3d modes to existing ones you have. With the new Unity3d 4.6 user interfaces, heads-up displays will be even easier.
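The essence of what such a stereo camera prefab does can be sketched simply: take one mono camera, make two eye cameras offset by half the interpupillary distance, and give each a half-screen viewport. This is an illustrative Python sketch with invented names, not the MergeVR package's actual Unity API.

```python
# Side-by-side (SBS) stereo split: one mono camera becomes two eye
# cameras, each rendering to half of the phone screen.

IPD = 0.064  # interpupillary distance in metres (a typical adult average)

def stereo_cameras(mono_pos, right_vector, ipd=IPD):
    """Offset the mono camera by half the IPD along its right vector,
    once in each direction, and assign half-screen viewports."""
    half = ipd / 2.0
    left_eye = tuple(p - half * r for p, r in zip(mono_pos, right_vector))
    right_eye = tuple(p + half * r for p, r in zip(mono_pos, right_vector))
    # Normalised viewport rects (x, y, width, height): left eye draws to
    # the left half of the screen, right eye to the right half.
    left_viewport = (0.0, 0.0, 0.5, 1.0)
    right_viewport = (0.5, 0.0, 0.5, 1.0)
    return (left_eye, left_viewport), (right_eye, right_viewport)

(left, lvp), (right, rvp) = stereo_cameras((0.0, 1.6, 0.0), (1.0, 0.0, 0.0))
print(left, right)  # two eye positions 64mm apart, centred on the original
```

In Unity terms the viewport rects correspond to each camera's `rect` property; the prefab saves you from wiring this up (and the perspective matrices) by hand.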
So within about 2 minutes of starting Unity I had a 3d view up on the iPhone in MergeVR using Unity Remote. The only problem I had was using the USB cable for quick Unity Remote debugging, as the left hand access hole was a little too high. There is a side access on the right, but the camera needs to be facing that way. Of course being nice soft material I can just make my own hole in it for now. It is a prototype after all.
It’s very impressive, very accessible and very now (which is important to us early adopters).
Let's get blending!

(Note the phone is not in the headset as I needed it to take the selfie 🙂)

Another VR headset – AntVR

I backed a project called AntVR to produce another VR headset and it recently arrived. Whilst all these things may look the same, this takes a slightly different approach to the now famous Oculus Rift.
The box arrived via UPS, unfortunately there seems to have been a mistake on the shipping. Whilst this was included in the Kickstarter price I had to pay the delivery man another £35. That’s early adoption for you!

The main headset unit is about the same size and weight as an Oculus DK2. The main cables, rather than passing over your back, drop down either side of your cheeks.


The inside view is not two round lenses as with the Rift, but 2 much flatter rectangular lenses. It fits over my regular glasses (though varifocals are not ideal for any of these headsets).

The reason for the difference in lens shape is quite fundamental to the entire design. The AntVR uses one screen inside the headset, and the lenses direct the eye to focus on that screen. The Rift appears to have 2 screens, one for each eye, with a spherical image projected onto each. The AntVR is really a regular monitor looked at up close.
Whilst the latter is a lesser experience it does have a major advantage. The AntVR has a mode button on it to let it just be a regular desktop screen. This makes for much less messing around (particularly in windows) with which monitor is which. Often with the Rift you find yourself trying to half look at the regular desktop when working in extended mode or swapping the primary monitor when an application doesn’t use direct mode.
So once in the AntVR you are looking at a big screen. The mode button on the unit lets you switch to side by side 3d. So anything, like YouTube, that has SBS video you can watch straight away: look at a normal view, navigate in the browser etc., switch to full screen, then press the button on the unit and you have 3d. I did find on Windows 8.1 that Internet Explorer seems to refuse to show SBS video on YouTube. The scrub bar shows side by side images but the main window just the one. On Chrome it was fine. Initially I thought the headset was causing it, but I think it was just the browser being the browser it is!
The unit has two sliders to move the lenses outward and inwards laterally to adjust for your own eye comfort.
The headset acts as a mouse pointer with the accelerometers in it moving the mouse around, which is not always very helpful to start with.
The kit then diverges from the Rift in what it provides.
Firstly it will plug into any HDMI source (needing USB to power it). Because it is not always in SBS 3D it doesn’t need the spherical processing the Rift has to do. Obviously that flexibility is a trade off with the field of view of the Rift and the comfort factor on the eyes.
The AntVR also comes with a peripheral. It is rather like some Klingon device, a few parts clip together and you have what looks like a gun controller.



This controller can also morph, with 2 handles plugged together, into a joypad. Both parts are charged by USB so are both active units.
In addition to the “gun” there is also a cleaning device, a small rubber bladder and nozzle for blowing dirt out of the lenses. They refer to this as the “hand grenade”.

All this came through the post from China. I suspect the delay getting here may have been some puzzled customs people.
There are demo games that are ready to download. These are designed to be 3d SBS; they are both horror/zombie ones. I am not sure if I configured things correctly, but the thumbstick on the back of the gun did allow me to move my character in an FPS environment. The gun trigger worked, but it was a little odd using my head to turn and look at the bad guys while holding a gun where it didn’t matter which way I pointed it. I think I needed to plug a few more things in.
Another interesting feature is a little sliding trap door underneath the headset that lets you peer down and see your hands. Great for typing!
So far I have only tried the unit on my Windows laptop. For gaming, Elite Dangerous for example, the Rift DK2 with its head position sensor and spherical eye view is much more comfortable and immersive. However getting the thing running can be a pain with so many different ways Windows likes to mess with graphics cards and extended monitors.
AntVR seemed to just work out of the box. The gun controller might be a bit over the top, but the ability to work on anything and look at any content without too much hassle is interesting.
Next up I need to write about the two smartphone-as-a-VR-screen devices I have here, one of which was an extra thrown into the box with the AntVR.
We are creeping towards devices that anyone can use without too much faffing about but we have not got there yet. Then of course we need the content. The environments, the user interfaces and the new ways of engaging.

From ICE to EV – Test driving Nissan Leaf

Just before Christmas my Subaru Impreza had a seemingly catastrophic coolant problem. I arrived at my school governor training (4 miles away) in a cloud of steam from my bonnet. The car needed a low loader to get it home, but all that took a while, so it was lucky I had the course to attend.
We had been considering a new car. We have a Honda FRV which has 6 seats and loads of room, is automatic etc. Then we have my scooby, which doesn’t really get the chance to be used as much. As I work from home almost all the time, most of my journeys are to take @elemming to the train station and pick her up, driving to the supermarket or shops and, most importantly, heading off to Choi. All of which are about 6 mile round trips.
I have an interest in tech of course, so electric vehicles (EVs) always sounded interesting, but there is a problem with causing the entire family to be early adopters if it is just for the sake of early adopting and spending significant money.
We had looked at normal Internal Combustion Engine (ICE) cars. All very nice, but very few things were going to be close to my driving experience with the Scooby. Hence all the new cars I figured would disappoint in some way. I have had the Scooby since before predlet 1.0 was born, so that’s 11 years. It has seen some major changes in life: predlets born, leaving corporate life, moving house, taking up a martial art etc. It has also been very reliable. However things have to move on.
We took a punt and test drove a Nissan Leaf. This was particularly good test drive wise as they gave it to us for 7 days. Having an EV for a week gives you a chance to see if it really does fit in with the family and our needs. It let me do some experiments too.

The first objection most people have to an EV is the range. Unless you buy a £100k Tesla you are going to be getting a car that has a 100 mile range, i.e. a 50 mile trip out and back. When you are used to 250-350 miles in an ICE that seems not great. However, remember I only needed this for the short 6 mile round trips. You can get a lot of those on 100 miles of battery, plus they return home, where a recharge can occur if need be. In addition we already still have a petrol Honda FRV for those long journeys. So really the range is only a problem if the other car is in use and one of us needs to do a longer range journey. If this situation is going to occur it can be planned for, as many EV suppliers offer petrol hire cars as part of the package. Of course this is assuming you couldn’t charge on a journey. Which you can.
So on these short hops locally an EV is ideal. It has plenty of charge and plenty of scope to just jump in and use it.
One of my concerns, that any car would just not be as entertaining as my Scooby, was kicked right into touch straight away. Even a relatively low end car like the Leaf has a fantastic feel to it. They feel very light, in a responsive lightness not a flimsy lightness. They also accelerate. The 0-60 on the Leaf is about 7.5 seconds. My Scooby was about 5.9 when it was new. However… the profile of an EV means it has constant torque through the entire range. There is no specific gearing. Even an ICE automatic changes gear, or has kick down to accelerate. The Leaf just accelerates, whatever speed it goes. With no engine noise, and very little road or air noise, the speedometer is really the only indication of speed. You hit 50, 60 and 70 very quickly on the motorway, but once at them the speeds almost feel the same. This is a good thing; it means cruising at any speed is easy and acceptable. You are not between gears or struggling to justify the 5th gear change.
My drive to Hedge End I did firstly just in regular drive mode, having not topped the car up after the station and school run. So I was on about 85/90% battery, and very quickly experienced the anxiety that all EV drivers get. In an ICE you have a petrol needle; it's not very accurate, but you know you have a few miles left after any warning light or hitting the red. Up until that point you just drive. Then you maybe have to eke it out a bit more to make it home or to a petrol station. In the EV you have a lot of information, accurate information, telling you battery levels, power usage and estimated range. If you hit the aircon/heating and floor the accelerator on the motorway you see your projected range tumble. The climate control knocks 7 miles off the range. I realised I was burning too much as I reached what would have been a halfway point and headed for home, this time with the eco button and full mode B energy recovery. I approached home and the warning lights started flashing. I only had 15 miles left of range. Panic! The sat nav asked if I needed directions to a charger! PANIC!!!
Of course this is all slightly ridiculous as by now I am 3 miles from home. Even with climate control on full I would make it, but it does play on your brain.
The next day I thought I would try the same journey but top up to 100% at home before I went. Home charging is just plugging into a 13 amp plug (though you can install higher rate quicker chargers for a few hundred pounds). It means if you go to a friend’s house and need a top up you can (though that’s a bit cheeky; I am not sure we have the social etiquette for that sorted out yet).
This time I drove on the motorway back down to the south coast on full Eco (where the max power is dropped and the effects of the throttle are lessened) and mode B where more engine braking is applied when you ease off and the battery is charged with that power.

So we have the simplicity of just putting in drive and pressing the accelerator, but we have the complexity and thought of energy management and recovery. Easing off before coming to halt to get your green lights lit on the left of what would be a rev counter.
I got to the same point with plenty of return charge but still drove back on Eco. Because of the perceived speed it didn’t seem to matter in a cruise on the motorway to have turned the wick down on the car. The energy management and recovery made for a fun game, something Nissan have spotted by rewarding you with a Christmas tree/lives indicator that gradually fills up as you do good work. It sounds mad but it works.
I managed to max it out on one journey – Yay!

On the second Hedge End trip I was slightly less freaked out by the battery indicator but as I got to Winchester services, and as the car had an Ecotricity card in the sunglasses holder I thought I would try a rapid top up charge.
There were 2 bays, both without cars in them, so it was a good opportunity to experiment. I held the RFID card up and chose my rapid charge mode. The nozzle comes with the plug; it was a massive device compared to the home charger, as it had lots of sliders and locks. Once engaged, charging started.


I was on about 33% and went for a coffee and a comfort break. It was about 15 minutes and I was not waiting around tutting.
I was impressed that Costa reminded me I was driving a Leaf and had my little Eco xmas tree game to play in the car, by drawing on the top of my coffee with a similar motif.

I returned to the car and went to cancel the charge, and it asked me to swipe the card again. This of course makes sense, to stop people unplugging you out of spite. I disconnected and saw I now had 66% charge (64% when I took the photo and faffed around beforehand). Not bad!

So for the 10 miles home I dropped the eco and the full energy recovery. Pulling onto the motorway it was like having a brand new car as it woke up from its space cruising stasis and went full alien monster.
When I got home I had used 20% of the battery already. So it really hits home how to moderate driving when needed, not accelerating quite so hard. Much more so than a petrol car, where you don’t really think about it much until the needle gets low.
One of the other gadgets on the Leaf is a great set of cameras surrounding the car. The dash display that doubles as the GPS, radio and everything else switches to cameras automatically for reversing, but can also be switched to camera when at low speed/stationary heading forwards.
A reversing camera makes a lot of sense, but also the ability for it to generate an apparent birds eye view is fantastic for parking in bays.
Here I am safely parked and stationary with the forward camera working. It is in black and white as it is in night sight mode. You can see in the dark!

The birdseye view can be replaced with a camera on the front left quarter, i.e. the turning blind spot, by pressing a button.
Reversing also provides the angle of trajectory superimposed, Augmented Reality style, on the reversing camera. Here I am parked, but in reverse, on our driveway, showing the obstacles (in colour this time).

We had a top of the range Leaf, so it had everything on it: a spoiler with a solar panel to charge the second battery (a regular car battery used for lights etc.), voice control, bluetooth hook up (it understood my iPhone playlists when plugged into the USB), heated seats etc. etc. It had GPS built in and knew where a particular network of chargers were to navigate to. Full ownership gets you hooked up (along with your charger) to CARWINGS, which is Nissan’s internet of things network for automotive. This lets you control the car settings via an app, asking it to warm up the interior in the morning. Or you can, as I did, just use the timer. When on charge it will draw house electricity to power the AC and warm the car for a set time and temperature. There is of course no point going out and starting the engine as it doesn’t heat up like an ICE.
I enjoyed looking at the energy information screen quite a lot. It felt like playing Elite Dangerous or Eve Online balancing the power usage.

I also was entertained by the graphic for the “don’t put a baby seat in the front” which was the most extreme one I had seen. The middle picture almost looking like a theme park ride icon.

So it seems that I am quite taken with the Leaf, it has a very particular set of skills 😉
We are going to test drive the BMW i3, though that offers a range extender, a motorbike engine and 9l tank, to give another 50 or so miles. However I am not sure if that is worth us getting, as it would almost never be used except in extreme circumstances, when it might be better to hire a car or take the train. Also BMW only seem to offer a 30 min test drive, and after that, if they think you are worthy, a few hours test drive. Our nearest BMW dealer is in Eastleigh too, which is a pain. However BMW, and all car people, you need to ramp up the test-drive time!
Let's see what happens next 🙂

Happy 2015 – Flush 16 – Are you Santa?

I hope everyone is getting ready for a great 2015 and had a good break (if you got one). For a first post of the year I am going to start early and mention Christmas :). This though is just to let you know that issue 16 of Flush The Fashion magazine is live and my slightly esoteric “Hello, Are you Santa Claus?” article this issue looks at probability and how that features in Quantum theory and a light hearted look at multiple universe theory.
The whole magazine is cool and interesting as usual from @tweetthefashion. I say this each time but it is still true, the amount of effort and skill that goes into making this magazine is fantastic and it’s an honour to have my content included.
If you look at the embed below you will find me on page 22, but just before that there are some great images of skateboarding food by Benoit Jammes, called Skitchen.
So where else do you get such an eclectic combination of quantum physics and skating fruit?
See what you think.


The direct link to my article, to get your brain kicking in for the new year, is here.
Now would be a good time to kick in with my epredator theme tune to introduce it 🙂
Now to work out what 2015 is going to bring. It looks like there will be a lot of VR and AR with so many headsets and combinations of tech emerging. It is very exciting and it all needs to be powered by virtual world content. So another year of metaverse evangelizing before it becomes the norm then 🙂

Theme tune

A while back I invested in the Unwinnable Kickstarter campaign to create a weekly quirky look at video games. I picked as one of my rewards a personal theme tune composed by George Collazo. I was intrigued at what might come out of such an exercise. Those of us getting a tune done filled in an interesting set of questions. I got a tweet just now from George saying my tune was ready. I clicked play and loved it, then read what George had used as inspiration.
” I tried to combine super spy James bond style coolness with pure Bruce Lee badassey with Enter the Epredator.” Yes tech and martial arts in an awesome tune!
Thank you George! I love it.

2.3.4 dum diddle um dee dum dum, dum diddle um de darm.

Primary school computing – Barefoot

The past few years have seen a real recognition that computational thinking needs to be part of everyone’s education from a very young age. The UK national curriculum had some massive changes to introduce this, which is great, but it was not always matched with any resources or training to help anyone who was not a techie or programmer understand it, let alone then be able to teach it.
I like to make people feel comfortable with technology and explain how it fits into what they know, and how it can improve, or threaten, their experience. So I was very happy to see this initiative. Barefoot Computing is an initiative from BCS, The Chartered Institute for IT, supported by Computing at School. It seems to be a complete system of curated content that is not solely about programming or tech.
Instead it looks at how to transition what teachers know and teach already, and frame that into the basis of the next steps in teaching computing. It then provides free resources to help teach: lesson plans, Scratch programming assets etc.
It is also not simplistic; it is not a “just code” approach. One of the most important and striking uses of a word as an entire category is “tinkering”. The fact that the lesson plans and resources include specifics about letting go, allowing exploration by the students to see what happens, struck me as very different.
Each of the plans is tagged with specific helpful icons that indicate the point of the lesson. These in their own right are very useful and telling about how this is all structured.
I have image captured from the site, just as an example to be able to talk through, but go and take a look; there is a free registration. It is worth us all knowing the resource is there, to share with schools and colleagues. Or better still, get one of us, the STEMnet and Barefoot Computing volunteers, to come into school and share it with the teaching staff.
Comp thinking
Here we have the core approaches to computational thinking.
Tinkering, finding out what happens and discovering new things is the first on the list!
Debugging sounds a bit more techie, but it is spotting what's wrong with anything and fixing it.
Persevering. This is not something you usually see in “tech” presentations. Something not working, not operating as you expected, is a normal part of the business. It is an important aspect of life.
Collaborating. Working with others is also essential.
You can see these are not computing specific. They apply to every walk of life and experience. In my martial art, Choi Kwang Do, perseverance is a key principle. We tinker and debug as we learn how to perform our moves, adjust, apply a what if approach, and finally we work together in a friendly environment. This just happens to be focussed on programming, but it is so much more!
comp concepts
Clearly this next layer looks more like the technology words we use in the industry, but again many of them are applicable in non keyboard related instances. Breaking problems down, spotting patterns, making things repeatable are all key elements.
All these are setting the scene before the specifics that make up computer science. The nuts and bolts of what we do as programmers and architects. Now if you came at computing with this list below it would be a lot more overwhelming, even though each element is relatively simple as a concept.
all
Very often as techies this is what we hit people with. Often in order to help them understand the beauty and elegance in the art of programming, or as a weapon to maintain a technical dominance.
In all these icons though, my favourite is the variables one. It shows the word lives and 3 hearts. The number of lives you have in a game is a variable. It goes down (and sometimes up if you hit a bonus). For me the variable was the heart of the aha moment, when all computing became clear to me.
I started computing at school to work out how Space Invaders worked. We were writing a program in class dealing with electricity bills. It had a loop to recalculate a bill based on different inputs. It was a variable we were setting. It all seemed like a weird mess of symbols until it clicked that the thing we were setting, and how it stayed set or got changed by the code, was the same as the score in Space Invaders! It was also the same as the position of each invader, where the player was etc… Kaboom! A life-changing understanding of something that seems laughably simple when you spot it and then apply the pattern repeatedly to other situations.
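To make that aha moment concrete, here is a tiny sketch in C# (the tariff, readings and names are all made up for illustration). The same idea of a variable drives both the recalculated bill from that school exercise and a lives counter in a game.

```csharp
using System;

class VariableAha
{
    static void Main()
    {
        // An electricity bill recalculated in a loop: the same sum,
        // with the variable holding a different value each time round.
        double pricePerUnit = 0.15;                 // made-up tariff
        double[] meterReadings = { 100, 250, 75 };  // made-up inputs

        foreach (double units in meterReadings)
        {
            double bill = units * pricePerUnit;
            Console.WriteLine($"{units} units -> bill {bill:F2}");
        }

        // The score or lives in Space Invaders is exactly the same idea:
        // one variable, set once, then changed by the code as events happen.
        int lives = 3;
        lives -= 1;  // hit by an invader
        Console.WriteLine($"Lives left: {lives}"); // prints "Lives left: 2"
    }
}
```

Once you see that both are just values being set and changed, the pattern applies everywhere.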
So, a big thumbs up from me, and I will be helping the predlets' school, and any others that need it, with this set of resources as a starting point.

Black Belt experience

Last Sunday, 7th December 2014, I achieved another life ambition and graded to receive my Choi Kwang Do Il Dan (1st Degree) black belt. What is interesting is that the ambition and reason to get a black belt have turned out to be very different in almost every way from my initial thoughts. So this is not a post to say "look at me, aren't I clever", which is what many qualifications end up being for. Instead it is about a transformation, a journey and a whole set of other ideas.

As a kid I thought martial arts looked amazingly cool. The 70s posters of Bruce Lee and the occasional glimpses of stylish fighting on TV certainly left their mark on me. We didn't really have much in the way of martial arts near my home. I also, and this might sound strange, wanted to avoid learning them because the techniques were so potentially deadly. I never experienced full-on physical bullying for very long at school as I was always willing to fight back if need be. Being relatively mild mannered, it was usually a shock to a potential bully if I retaliated. I thought I might learn things and then turn to the dark side and go and use them, or go all vigilante.
As I got older I then wished I had started younger, and was probably put off by the fact that the later I started, the lower the level I would be able to attain. If something was worth doing, I wanted to make sure I was really, really good at it. All very odd reasons and excuses not to do something.
My friend Jamie and I got a self-defence book from our local library when we were 8 years old and occasionally practised what you do if someone came at you with a bat or a knife. It was all very informal, and a bit scary, but we would-be superheroes had to learn some stuff 🙂
Zoom forward a good few years: I was feeling very overweight (and actually was very overweight) and I also wanted predlet 2.0 to get a chance at a martial art that I never had. Something father and son could do together. I thought there would be no chance of me rocking up at a place of fighting in the state I was in, so, strangely, I trained for about a year to get ready to go and do a martial art. That, and the tech I used, feature in the article I wrote for Flush The Fashion in 2012.
A leaflet for Choi Kwang Do in predlet 2.0's school bag got my attention. It was coming up to his 5th birthday and I was intrigued, as Choi Kwang Do looked like something new. I like new!
The trepidation and nervous excitement with which I took myself and predlet 2.0 to our first session is something I will not forget. 40+ years of wanting to try a martial art but not doing so, plus a year of getting in some sort of shape and learning some fight moves so I didn't get whooped straight away, all bundled together.
Within seconds of walking into South Coast CKD and meeting Sabunim Webster for the first time, along with all the other students and instructors, I realised how completely and utterly incorrect my perceptions of all martial arts had been. Everyone was calm, happy and above all friendly. I had nothing to worry about with respect to getting beaten up or schooled in front of predlet 2.0. CKD just isn't like that. So, having stepped into an art with thoughts of gaining some sort of prowess, a degree of domination and aggression to get something, to win something… all that was suddenly not what I wanted, and we were in a place where that was not what it was all about.
This particular AHA! moment is only the second such one I remember. The first was when I suddenly realised what you could do with computer code, and how it all worked, from just one little concept. It is when something clicks, and there is the excitement that a world of answers is out there waiting to be discovered and visited. That set me on my career. It led to a degree in Information Technology, and twenty years in IBM with some amazing people and equally amazing projects and world firsts. It still keeps me going today and is the cornerstone of Feeding Edge Ltd.
My CKD aha moment has not had quite so many years to bed in, but it is unfolding nicely. The learning in Choi has proved to be so much more than just punches and kicks. It has been, and continues to be, a constant personal challenge. We all learn how to defend ourselves in what is a very dramatic, fast and potentially deadly way. That, though, is just a small part of the journey and experience. We learn to help our own bodies physically and mentally improve. We have to think faster, and to think and operate calmly under all sorts of conditions. We learn how to help others learn. Teaching and sharing is done in a completely positive way. We all aim to point out the good things, offer a tweak in technique or attitude, then strengthen with more positive comments. That creates a virtuous circle. It means all the things we learn feel good. Frustration, anger, pressure to conform and so on are all out of the equation. We all push ourselves just a little bit more each time because it is a safe environment to explore what we can do. As there is no competition, there is no need to hunker down into particular ways just in order to survive.
The family nature of Choi, and our subsequent family involvement (all 4 of us do it now), was a major stabilising factor when we upped and moved house. Whilst we had to say goodbye to South Coast CKD, we got to say hello to Basingstoke CKD and to Master Scrimshaw. (Master is a term used for anyone with a 5th Dan, i.e. 5 levels of black belt, or above.) That family atmosphere and friendliness we experienced on day 1 applies everywhere. So it is always great to meet and train with new people, and you know it is going to be good and welcoming.
As the months pass for each belt rank we all learn a piece of syllabus. Each thing we learn builds on everything that we have learned before. At a grading you show your belt syllabus. You don't go to a grading unless you do know it. So gradings are not exams; they are not there to trick you, but to offer a place to focus on what you know. Before you know it, though, 3 or more years of training, of experiencing the art form, of practice and sharing lead to a black belt grading. This is where I found myself last Sunday.
I would say that most people, even if they are not into "fighting", action movies or martial arts, get what a black belt can mean. It is, rather like a university degree, a qualification. It is not obscure, in that clearly it is about physical effort. It is not a quota system either: you do the work, you can achieve a black belt, but you do have to do the work, for real.
I was excited and nervous, but very focussed on getting to the black belt grading. I wanted to make sure I had prepared mentally and physically for it. I received lots of help and focussed training from Master Scrimshaw and my fellow students and assistant instructors. So there we are, lining up, doing the same pledge and principles we do in every class. I am thinking about everything that led me to this point, yet I am not thinking about me and what I needed to get out of it. I felt pressure in that I did not want to fail or mess up, but I felt an overwhelming need to do this for everyone I have trained with, taught and shared time with in Choi. I was expecting to switch to my "performance" persona and brain, the one that I go to on stage or when doing TV. That is a kind of focussed mental state, but directed outwards, towards the subject and the crowd. That, however, did not happen. I wasn't there to put on a show. Instead I felt something else. It was as if the original aha moment came back. I was very much there and doing the grading, making adjustments when the automatic elements were not tweaked quite right, but I was also off experiencing the potential. I had some time to ponder all this as I was grading with some people who were doing their 2nd and 3rd Dan black belts. This meant we all did my stuff in each section, then I stood down whilst they continued with the 2nd Dan syllabus, and finally the 3rd Dan was left to do his thing. Keeping warmed up and keeping on task was tricky, because I have always tended to just carry on with things, as we do in class: short breaks then intense action. Whilst watching them, and seeing what was coming in the next few years, I got to understand that it was the aha moment, the joy of the future journey, of learning more and of those constant achievements, that felt so unusual. I then related it back to my passion for tech and for understanding where it fits in the world and where it is going.
I had always thought my tech aha moment was purely in the past, but I realise that it happens to me every day as I explore the future.
When the grading was done and we stood lined up, Master Brophy, who brought CKD to the UK, built it up, and to whom we all owe a lot, explained to us that the Black Belt is not just a test of memory and physical ability but also a test of character and much more. I felt I had a true understanding of his words.
It is my black belt, it has my name on it, but it is not just my qualification to crow about, or to wave to prove my worth (as we have to do with school and university qualifications). It is the sum of the life experiences that have led up to this point, and the sum of all the effort of everyone I have trained with in whatever capacity. It is the start of the next part of an even more interesting journey. It is an indication that anything can be overcome, and that just getting on with it, applying a positive attitude and not getting disheartened, just works.
So this is a constant and eternal thank you to everyone. As we say in CKD: Pil Seung! (certain victory).

Unity 4.6 – A new UI

I have always really enjoyed developing in Unity3D, as you may have noticed from my various posts and projects. Watching it evolve into such a major platform has been really interesting too. Yesterday Unity moved from version 4.5 to 4.6. Now, in most systems a minor point release is no big deal. In fact the whole patch-and-push versioning, with its lack of backwards compatibility, that bedevils the tech industry is something I find I have to battle with on a regular basis. Get excited about a change in version numbers… no thanks. Except this is the exception that proves the rule.
Unity3d is great at allowing a developer to start placing objects in an environment and attaching behaviours and code to those objects. A lot can be done with point and click, drag and drop, but there is also room for real lines of code and proper software engineering. In fact that is essential. I can, however, with ease, make a 3d character animated, moving around, colliding with scenery etc. Up to now, though, if I wanted to use any other form of user input with a Graphical User Interface I had to really go back in time.
GUI creation in 4.5, and in all the earlier releases, has been awful. In code, in the OnGUI loop, you had to try to describe layouts and positions, where much of what you put was implied. So you have to describe the user interface and its behaviour with code that merges function with display, whilst not being able to see where things are until you actually run it. This is the opposite of most of Unity3d, which lets you see where things are in the development environment both when running and when not.
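For anyone who never suffered it, here is a rough sketch of that old immediate-mode style (the button labels and positions are invented for illustration). Everything is re-described in code every frame, and nothing shows in the editor until you hit play:

```csharp
using UnityEngine;

// Old-style (pre-4.6) immediate-mode GUI: layout, look and behaviour
// are all mixed together in code, re-run every frame in OnGUI.
public class OldMenu : MonoBehaviour
{
    void OnGUI()
    {
        // Fixed x,y positions, usually arrived at by trial and error.
        if (GUI.Button(new Rect(10, 10, 150, 40), "Continue"))
        {
            Debug.Log("Continue pressed");
        }
        if (GUI.Button(new Rect(10, 60, 150, 40), "Quit"))
        {
            Application.Quit();
        }
    }
}
```

Note how the same call both draws the button and reports its click, which is exactly why function and display could never be separated.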
I have lost track of the number of times I have tried a "fancy" layout of buttons, one that starts nicely, with flowing positions based on screen resolution, only to get lost in the code and resort to fixed x,y positions, which are generally hit and miss for the first few times. In Unity3d you can expose parameters to allow compound prefab objects to be configured without code changes. I have ended up with UI helper objects that change my spacing and positions at runtime, so I can tweak the numbers to try to shuffle buttons around. Unity3d is great at letting you alter runtime values, and also great at going back to the originals when you stop running. Unfortunately this works against you on UIs. You have to tweak the values, then remember to write them down on a piece of paper before hitting stop, so that you can apply them in the code for next time once you get it 'right'.
That may or may not make sense, but take my word for it, it was a right pain. Let alone what you then had to do to try to alter the look of the buttons, sliders etc. with GUI.Skin, which is like CSS but much, much worse. So all my research projects have had plain buttons. I am not a graphic designer, and I was not easily able to explain the UI process, or how to work with designers, to make it nicer.
All that is now gone though. The old way still works in 4.6 (which is a relief from a phased-improvement position), but a new shiny UI creation tool and framework is now, at last, public. It has long been promised, and by long I mean literally years! I am sure when I get into the depths of it there will be a few things that cause grief, but that's programming, folks!
Now the UI exists as a real type of Unity3d visual object. If you create a layout area you can see it; a button is there all the time. Each of the objects is part of a proper event system, and many of the normal functions are exposed as parameters. If you want a button to do something when pushed, you tell it in its setup which function to call.
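That hook-up is normally done in the Inspector, but as a sketch the same wiring can be done from code too (the class, field and method names here are my own invention), using the new UnityEngine.UI namespace:

```csharp
using UnityEngine;
using UnityEngine.UI;

// New 4.6 UI: the Button is a real scene object with a proper event
// system; code just subscribes to its onClick event, rather than
// drawing and polling the button itself every frame.
public class MenuActions : MonoBehaviour
{
    public Button continueButton;   // dragged in via the Inspector

    void Start()
    {
        continueButton.onClick.AddListener(OnContinue);
    }

    void OnContinue()
    {
        Debug.Log("Continue pressed");
    }
}
```

The display of the button lives in the scene, and the behaviour lives in a listener, which is the separation the old OnGUI approach never allowed.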
Previous UI buttons were only ever placed on the view rectangle, like a Heads Up Display (HUD). I have often needed things to act like buttons but be interactive elements in a 3d environment. Again I have worked around this, but now a UI canvas, with all its actions, rollovers etc., can be detached from the HUD and placed into the environment. In my hospital environment this means I can have a better AR-style UI at each bed. I have that at the moment, but it does not have all the nice feedback and customisation a UI element can have.
[Screenshot: the new Unity 4.6 UI editor]
The other major change is how the UI elements can be typed, and hence changed. Each element can be a regular button, slider etc., but they can also be a graphic, or even the result of another camera rendering a live texture.
Here is the game view (though not running) of the demo UI. The Continue button is highlighted, so on the right are its properties, events etc. The mini menu is showing the choices, just for that button, on how to transition on normal events such as rollover and click. Here it is set to animation, hence it uses Mecanim and the button can potentially go flying around the screen twirling. Simpler transitions, such as just a colour tint or a sprite swap, can also be defined.

This was possible before, but again it was a lot of messing around. The old system of what to do on a rollover etc. was buried in the skin code or overridden in OnGUI. Now it makes use of the regular Mecanim animation system. This state machine allows definition of all sorts of things, and it is how we make characters jump, dive into rolls, duck etc. It makes sense to have that for more complex UI transitions and states. In multiplayer environments I use a Mecanim state change value to send over the network, to make sure any mirrored/ghosted network object acts in the same way. So I guess now I will be able to use that for UIs too, to keep players in sync with certain modal activity.
Anyway, this 4.6 release and its new UI is obviously a precursor to the much bigger Unity 5.0 coming very soon. However, it has dropped at the right time for the next phase of a project, so I get to use it in anger and re-engineer something that just about works into something much better.
As I tweeted, this is a great xmas gift, thank you Unity 🙂

Landing on a Comet ! Wow!

Yesterday I sat glued to the internet video stream from ESA HQ in Germany, tensely waiting to hear, via the orbiter Rosetta, that the lander Philae had made it to the surface. When it did, and the cheers started to ripple around the feed, I had a definite lump in my throat and a tear in my eye. I was watching it with predlet 2.0. He had asked if he could play Minecraft and for once I said no, watch this! Of course, without the history behind it, all it was was a few people sitting looking at a screen, with no commentary. So I tried to explain the enormity of what we were waiting for. The fact the probe launched (10 years ago) before he was born is pretty amazing, and the fact that the project started 20 years ago equally so, though that makes less impact. I think my enthusiastic cheer when it landed probably got it across to him though.
It is truly awe inspiring that people have gathered together across countries to put together a massive project to land a tiny machine on a comet 300 million miles away in order to then perform some experiments and determine what the makeup of the comet is.
It transcends all the ridiculous austerity measures and the greed of those with the world's wealth. This helps the human race and gives some sense of purpose. We need more of this!
This image came from the surface of the comet!
[Image: photo from the surface of the comet]
So we watched this on a 70Mbps fibre-optic internet stream, live via a web browser on a PS4 games console, whilst also second-screening BBC News on iPlayer on a small touchscreen wifi-enabled device (an iPhone), and following the tweets from the lander's account too. I was sharing it with my son in person, but also with my friends around the world who were also tuned in. 10 years ago, when the probe launched, none of the ways we engaged with the landing existed in any mainstream form.
I am sure there are some naysayers who will argue we need to sort this planet out first, but I think we need to sort this planet out as well! This sort of project provides hope and unity. It inspires generations (just as the moon landings did for me when I was just a toddler). That makes it a great investment. We learn lots about our past and help figure out our future, but the will and the passion can also be turned back to earth, as long as we don't let commercial greed take over.
What an awesome experience, and I hope, despite the 3 bounces, that the lander continues to fulfil its mission. Though, as the scientists have been saying when asked what happens now, the answer is: we don't know, and that's why we do this sort of science, to find out.