We had a new door fitted here at the family home/offices of Feeding Edge Ltd. With it we needed a new doorbell. It used to be that we had a choice of a few mechanical rings and buzzes, but the digital age brought MIDI-like ringtones. I had always wanted to build a “record your own” doorbell but never got around to it. So I was very happy to wander into Homebase and see the Byron Wirefree portable MP3 doorbell unit.
It is a normal doorbell, but with a USB port and the ability to play a 10-second MP3. Of course, having this choice is actually tricky: what do we do now, what can we put on there? Music is not a great choice as it may be confused with the radio. I did try the hook from Geekin’ by Will.i.am, “get your geek on get your get your geek on”. In the end, though, I went for a rich set of sounds from Star Trek and used Audacity to edit together a mash-up of Starfleet messages and red alerts. It is very effective, though I am not sure the entire family shares my enthusiasm.
There may be an embarrassment factor in opening the door when the bell has been rung and the red alert is still going off, followed by a self-destruct message 🙂 We will see how it goes. I can of course always change it.
I managed to capture a bit of it in a Vine 🙂 You need to click the unmute button in the top left for the full effect, otherwise it's just a flashing light 😉
I guess next would be a playlist of sounds and door tones? Any suggestions? I might get the Predlets to record a chorus of “Door! DOOOOR! Dad!!!! DOOOOOR” 🙂
Sometimes an idea gets stuck somewhere in the back of your mind. It sits there and occasionally raises its head to bother you. One way to sort it out is to write it down, and even better share it with others. Sometimes though these ideas and thoughts bundle themselves together and if you think at an abstract level you can link them and get them all out in one go.
That is how I ended up writing the latest Flush Magazine article, in which I go “From Koi carp to the Xbox One in a Parkour style free run of ideas”. It was really about the Xbox DRM sledgehammer solution, but it did seem to offer a pattern, or a modern fable, to consider with technology adoption and replacement.
My words have once again been beautifully presented by @tweetthefashion: some fantastic artwork and presentation, wonderfully in keeping with the odd direction this article took.
Have a look here, from page 110 on, but also don’t forget the rest of the magazine 🙂
Yesterday was a very interesting day all around. The main event for me was the arrival of my Oculus Rift development kit. This is a Virtual Reality (VR) headset. We have had these before, but not ones that are this cheap (relatively) and that just plug into any machine and work with Unity3d!
It came in a cool case too. Some nice big clicking catches on it like something James Bond would get from Q.
I had to dash out and get a Mini DisplayPort to DVI adapter, as I had that frustrating moment we have all had at some time when you realize a tiny piece of cable is missing. I only had a VGA adapter, which I usually need for projectors at presentations. Handily there is a Mac store in Basingstoke, and Apple kit is also stocked in the nearby PC World, so I had that covered in minutes.
I ran the Tuscany demo straight away. This is a small, but lovingly rendered, building in the Italian countryside and on the coast. I was, despite having experienced this sort of thing a lot before, suitably amazed and excited by it.
I did not hog it though and got each member of 3 generations in the house to have a go.
I was really happy that the predlets got how cool it was too. There was a suitable amount of bemusement and wonder.
The unit comprises two screens, one for each eye, plus the ability to track head movements. These head movements can then inform a game or application that you have moved your view.
The Unity3d integration is the library that is able to talk to the Rift, but it is also a couple of prefabs (bundles of reusable code) that let you change the normal single-view camera to one that generates two views from slightly different angles to feed to the Rift screens (one per eye). Elegant and simple. It also has a controller prefab that takes the input from the headset movement and feeds that into the Unity environment.
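The two-views-from-one-head idea that the camera prefab handles can be sketched outside Unity. This is a minimal illustration of the geometry only, not the actual Oculus or Unity3d API; the function name, the yaw-only rotation and the 64 mm interpupillary distance (IPD) default are all my own assumptions.

```python
import math

def eye_positions(head_pos, yaw_radians, ipd=0.064):
    """Place two virtual cameras half an IPD either side of the head.

    head_pos: (x, y, z) tracked head position.
    yaw_radians: head rotation about the vertical axis (yaw only, for brevity).
    Returns (left_eye, right_eye) positions to render the two views from.
    """
    # The head's local "right" axis, derived from yaw alone.
    right = (math.cos(yaw_radians), 0.0, -math.sin(yaw_radians))
    half = ipd / 2.0
    left_eye = tuple(h - half * r for h, r in zip(head_pos, right))
    right_eye = tuple(h + half * r for h, r in zip(head_pos, right))
    return left_eye, right_eye

# Head at standing height, looking straight ahead: the eyes end up 32 mm
# either side of the head position.
left, right = eye_positions((0.0, 1.7, 0.0), 0.0)
print(left, right)
```

A real rig also applies a per-eye projection and lens-distortion correction before the images hit the screens; this sketch covers only the camera placement half of the job.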
So it is not going to take too much to test some of the applications I have and make them Rift-enabled, though I am assuming they have to be published as native applications, not web ones.
It was not the only tech of the day though as in the morning Predlet 1.0 went and experienced indoor skydiving at Airkix.
Something I remember well from the TV show 🙂
She had a blast doing that.
I had a go on the iRacing sim rig they had there too; this is basically a high-end PC with a hydraulic chair and some force-feedback steering.
We have yet to try the brush boarding or the Skiplex slope (a skiing conveyor belt).
Just to juxtapose those, though, Predlet 1.0 finished her Revell glue-together-and-paint kit of the Titanic. Don’t worry though, it was all perfectly normal, as Predlet 2.0 was completing levels on Lego Indiana Jones 2 on the Xbox 🙂
As I was on the roster at the rather excellent GOTO conference, I was asked by Ben Linders a few questions on virtual environments and where they fit in the world of software engineering. This turned into a slightly longer interview, which has just gone live. It is always interesting to frame the answers about virtual worlds in a slightly different context or industry, but of course when it comes to the software industry, the fact that this is software anyway gives the conversation a nice meta element.
The article is live here.
Along with my super g33k bio picture 🙂
It is also cool that you can now just search “epredator” on InfoQ and tadah! 🙂
I guess this may seem more of me going on about the same things, but there is a reason for that. These technologies make a huge difference to a lot of use cases. I am getting more calls and questions again about how these may work (post bubble). Ignoring the possibilities for anyone or any industry could be costly this time around. I just want to help 🙂
You may have seen a number of posts and tweets on the interwebs about next week on YouTube (thanks @moehlert, who I noticed saying it first). If not, then you are in for a treat. Next week is #geekweek 🙂
It says “Come hang out at YouTube Geek Week and celebrate geek culture with a whole week of new vids, series premieres, epic collabs, and top tens from more than 100 channels across YouTube.”
That includes an experimental resurrection of the game/CGI/green-screen show called Knightmare
If only we had all the clips of the Cool Stuff Collective super g33k sections online we could have joined in. (I can’t put them up as I don’t own them so just have to make do with the showreels)
I had originally thought I would not bother with a LEAP controller. However new technology has to be investigated. That is what I do after all 🙂
I ordered the LEAP from Amazon and it arrived very quickly. So whilst I might not have been an early adopter, and was not on the developer programme, it is not hard to get hold of one now. It is £69.99, but that is relatively cheap for a fancy peripheral.
It is interesting, the self-proclamation on the box: “The remarkably accurate, incredibly natural way to interact with your computer”. My first impressions are that it is quite accurate. However, as with all gesture-based devices, there is no tactile feedback, so you have to sort of feel your way through space to get used to where you are supposed to be.
However, the initial setup demonstration works very well, giving you a good sense of how it is going to work.
It comes with a few free apps via Airspace and access to another ecosystem to buy some more.
The first one I clicked on was Google Earth, but it was a less than satisfying experience, as it is not that obvious how to control it, so you end up putting the world into a Superman-style spin before plunging into the ocean.
I was more impressed with the nice target catching game DropChord (which has DoubleFine’s logo on it). This has you trying to intersect a circle with a chord and hit the right targets to some blasting music and glowing visuals. It did make my arms ache after a long game of it though!
What was more exciting for me was to download the Unity3d SDK for LEAP. It was a simple matter of dropping the plugin into a Unity project and then importing a few helper scripts.
The main one, the Leap Unity Bridge, can be attached to a game object. You then configure it with a few prefabs that will act as fingers and palms, press run and, if you have the camera pointing the right way, you see your objects appear as your fingers do.
Many of the apps on Airspace are particle-pushing creative-expression tools. So creating an object that is a particle generator for fingers immediately gives you the same effect.
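What the bridge does each frame can be sketched in plain Python. This is a hypothetical illustration of the pattern, not the real LEAP or Unity API: the function name and the frame format are my own inventions, and each "scene object" here is just a stored position where a real scene would instantiate a prefab (such as a particle emitter).

```python
def sync_finger_objects(scene_objects, frame_fingers):
    """Keep one scene object per tracked finger.

    scene_objects: dict finger_id -> position (our stand-in for prefab instances).
    frame_fingers: dict finger_id -> fingertip position reported this frame.
    """
    # Move (or spawn) an object for every finger the sensor reports.
    for finger_id, tip_pos in frame_fingers.items():
        scene_objects[finger_id] = tip_pos
    # Remove objects for fingers that are no longer tracked.
    for finger_id in list(scene_objects):
        if finger_id not in frame_fingers:
            del scene_objects[finger_id]
    return scene_objects

objs = {}
sync_finger_objects(objs, {1: (0, 0, 0), 2: (1, 2, 3)})  # two fingers appear
sync_finger_objects(objs, {2: (1, 2, 4)})  # finger 1 lost, finger 2 moves
print(objs)  # {2: (1, 2, 4)}
```

The spawn/move/remove loop is the whole trick: swap the stored positions for particle-emitter prefabs and you get the glowing-fingertips effect those Airspace apps produce.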
It took about 10 minutes to get it all working (six of those were spent downloading on my slow ADSL).
The problem I can see at the moment is that pointing is a very natural thing to do, and it works great, though of course the pointing is relative to where the LEAP is placed. So you need a lot of visual feedback and large buttons (rather like with Kinect) in order to make selections. Much of that is easier with touch or with a mouse.
Where it excels, though, is in visualisation and music generation, where you get a sense of trying to master a performance and feel you have an entire space to play with, not limiting yourself to trying to select a button or window on a 2D screen, which is a bit (no) hit and miss.
I spent a while tinkering with Chordion Conductor that lets you play a synth in various ways. The dials to adjust settings are in the top row and you point and twirl your finger on the dials to make adjustments. It is a fun and interesting experience to explore.
Just watch out where you are seen using the LEAP. You are either aggressively pointing at the screen, throwing gang signs or testing melons for ripeness in a Carry On Computing style.
I am looking forward to seeing if I can blend this with my Oculus Rift and Unity3d when it arrives though 🙂