Lots more people are playing with/using video now with the lockdown. Every conference is a video conference. We are communicating with friends, family, customers and colleagues in a variety of ways and a stack of different contexts.
I sparked up FaceApp on my phone, something I had tinkered with a while back, and found that it now not only does mad crazy things to photos with great accuracy but can also apply some of those effects to video. With still images, like this before and after example, or maybe the other way around 🙂 you decide.
The video though… mind-blowing stuff.
There is also lots of fun to be had with the Snap Camera virtual camera on PC and Mac, which can feed into any of the video conference tools you use: divert your main camera to it first, then select Snap Camera as your webcam.
Lots can be done with foreground and background replacement. I mean, who would turn up at a meeting in a Predator costume? (Like it's 2006.)
Like my Second Life avatar (below), I can't claim to have created this one. But it was great to see it on the list of Snapchat lenses. My SL avatar has a custom jacket (my original RL one) and a Feeding Edge T-shirt, so there are always tweaks to be made. Yes, SL is still there; yes, I still have 2 islands; yes, you should sign up/re-login.
Back in the UGC world of SL in 2006 we had lots of interesting uses for logos and brands, like the Wimbledon tennis flying towel and Wimbledon contact lenses (officially sanctioned, I should add). I dived into the Snapchat Lens Studio to see what I could quickly create, knowing full well there was a very rich 3D animation and code/no-code environment that could be a career for some talented artists. I used one of the base models, tweaked it a bit and gave myself some shades with a subtle mention of 451 Research (now part of S&P Global Market Intelligence). Obviously that would all not fit on the glasses, but you get the idea.
I have always talked about putting logos and our own designs into game environments, like my Reconfigure car to advertise my novels, and before that, back in 2006, putting eightbar into games too (in the link above). Now I have a Feeding Edge logo in Animal Crossing: New Horizons. I am planning on doing a Reconfigure one too. Only a few pixels to work with, but it's fun to try. Our entire family is obsessed with this game at the moment; on top of all the other various games we play, it's a unifying experience as we tend to our islands or live on one another's. (Hmmm, another SL-like reference.)
Of course this lockdown is driving us all crazy, but having a bit of fun is essential. Back to Snapchat: it was good to see the Trolls movie had some official Snapchat lenses. When I tweeted this I was told I looked like Eddie Izzard 🙂
The family joined in too, but I'm not posting theirs without permission 🙂
Having said all this tech is great, I also felt the need to try a real-life filter and broke out this costume left over from The Cool Stuff Collective TV show (I didn't wear it on the show, but at the wrap party). It was certainly a shock for my colleagues on video. The neighbours probably thought I had lost it too as I ran around the garden in it, doing a real-life Animal Crossing manoeuvre.
Stay safe everyone, have some fun with the tech and lark around a bit!
The recent pandemic situation is causing many offices to suddenly switch from co-location in a building to everyone working from home. 11 years ago, when I left IBM and started Feeding Edge, I was immediately a home worker, but in the preceding years, despite having an office and a base, most work was effectively remote. Being a metaverse evangelist in 2006 was all about people using virtual world and game technology to communicate and understand one another at a distance.
The projects in 1997, in the early days of the web, were built as a team in an office but were clearly very much about interacting with the world at large. For those of us in these industries, building and shaping the use of the internet, we did not face a big-bang switchover from office to home. We tended to grab and adapt, or write, whatever tools were available to work across physical and digital divides. That of course is a typical pioneering spirit, and having been an evangelist for change I know not everyone is comfortable with that, nor should they be.
A sudden switch from one thing to another is bad enough; I recently had to switch from Mac to Windows for work, and it's annoying and irritating at best and very stressful at worst. Today we have the added external stress and personal worries about family and friends, as well as the future of businesses, layered on top.
The reality is, we do have the technology to communicate and share. We might not all have the right processes or social norms in place, but the more we try, the better.
For many people teleconferences are the norm anyway; it's just that usually they commute to a desk or office in order to have them. However, many others will not be used to them. It is here we all need to be cognizant of how much the technology blocks and filters who we are. A voice-only teleconference is great for a presenter, or for the alphas who thrive on talking, but many people will not feel comfortable interjecting and cannot use body language to find a gap in a conversation. Video links are not really any better; for some, they strip away normal behaviour as people attempt to stare at the camera or try to get a level of eye contact akin to the physical world. Some people engage in text chat alongside audio and video, though often those who are better at talking do not pay attention to the text, and sometimes the text just becomes chatter in the background. There is an art and style to all this and people will find what suits them, just as in a physical situation we adapt to one another's signals.
It is these sorts of difficulties that led to exploring virtual worlds like Second Life. Some of that is shown in the album below of 600+ images from 2006-2009. This was a long and varied journey and one I have talked, written and presented on many, many times. Happy to share the tales, just ping me @epredator, or see the 11 years' worth of posts here, many related to metaverse concepts, or even read my Reconfigure series and get the gist of some of it in a modern sci-fi context.
It acted (and still does) as a teleconference in having shared sound in a space, but it has the presence of avatars to represent people, which get moved around, even doing simple things like sitting or standing or hovering next to people you know in a meeting. Instant visual feedback: who is there, where are they, what are they representing with the avatar. We used text chat, group and 1:1, whilst also having voice. We were able to bring artefacts into the environment dynamically, whiteboarding in a 3D space. All adding to the depth of communication. There are also, for those who can, ways to code, to make virtual objects do something too. This is still entirely possible in Second Life and many other virtual environments. Now more than ever is the time for people to experiment.
This is also without even bringing in the tech and immersion of Virtual Reality, or the potential for Augmented Reality to allow us to blend our physical and digital presences in physical space. That of course requires people to don headsets and have the kit, but virtual world interaction does not need that. A laptop/tablet and an internet connection is all that is required. Carry on with the frustration of teleconferences and video conferences by all means, sometimes they are the ideal approach, but just consider what is causing the frustration of working remotely and investigate what might be a better way.
It's World CKD Day and founder Grand Master Choi's birthday, so here are the bulk of 8 years' worth of articles about our family journey in the art up to now, to celebrate. Not including all the Flickr photos, tweets and Facebook posts! http://www.feedingedge.co.uk/blog/category/ckd/ In these I use science and tech to help measure techniques, evangelise and show a passion to teach the art. Many long-term friendships formed. Positive health changes. A real, we-can-do-it-together family experience. It helped our kids with the house move and change of schools. It also helped me resolve a lot of things when I finally found the art. Even more important has been seeing the evolution of students' confidence and abilities in all elements of life. Watching this growth is wonderful. The art and all the volunteers and students act as a substrate or scaffold to achieve this. Just as we each strive to improve, aiming at perfection, we know we will always have more to learn, more to improve. The journey is never ending. The art itself, by its collective nature, will also strive to be better, but it will also make mistakes, hopefully address them, learn from them, and correct as a whole, just as each student does in their own journey. Enjoy your journey everyone. Pil Seung!
I am a techie, always have been, but often my work is primarily writing and speaking about emerging tech, more than making it work. As you will know if you have read Reconfigure or Cont3xt, I view everything as a fractal, so the level of complexity required to deal with code and hardware is often mirrored looking at the interaction between systems, companies and markets. Though as a programmer first and foremost, the patterns and the detail of making things work (or not) is where the true art and beauty is. As they say, those who can't do, teach; those who can't teach, write about it. I like to make that a full loop instead: those who can teach and can write have to be able to "do" in the first place. Trust me, I'm a doctor 🙂
This weekend I dived into a build of the SparkFun Advanced Autonomous Kit, added to the Sphero RVR (already a well instrumented and interesting bit of kit that I was very happy to have backed on Kickstarter and mentioned in this post on Basingstoke Techjuice). I didn't do any step-by-step images of the build, as I spent a lot of the time trying to get my aging eyes to deal with the tiny, tiny screws, bolts and wire connectors. I was primarily following the guidance from the SparkFun build tutorial.
The additional kit comprises a Raspberry Pi Zero W and a preconfigured SD card for its OS. A camera with 2 servos, 2 time-of-flight sensors and a GPS board provide extra inputs, plus some connector boards to combine the inputs on a mux board and a custom serial shield for the Pi. The shield has an RVR/Serial switch to make it easier to plug directly into the USB port from another host computer.
The kit mounts on a plate that can be added to or taken off the RVR. The key advice was to ensure the UART power lead from the board to the RVR was connected the right way around, with the red wire on the 5V side, to ensure no blue smoke.
The worst problem I had with the hardware, other than everything being so small and fiddly, was the ribbon cable from the camera to the Pi. I had managed to pull the small retainers out to put the new cable into the camera and push the clamp back down, but on the Pi I managed to pull the black retainer out completely and it was very difficult to get back in, and may even be a little broken, but seems to be holding. The camera already had a ribbon cable attached, which was initially confusing, but it was clear the one in the tutorial matched the one in the box, as it needed to be narrower than a standard one to fit the smaller connector on the Pi.
It was suggested to set up the Pi first, but also said it could be done afterwards; I went for the latter, as doing s/w config was not as exciting and interesting as building the thing. Once built it all seemed connected and I was getting a nice green light on the Pi. The SD card needed files edited on it to put in the network details, and one other config edit was required. I managed to find an SD card adapter and was able to edit the files on the MacBook Pro. I had thought I could do the setup via the iPad Pro, as that's where the RVR app is, but this is of course outside the normal RVR educational environment, so the Mac was easier.
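For anyone trying the same headless setup, the usual Raspberry Pi OS convention is to drop a wpa_supplicant.conf into the SD card's boot partition (along with an empty file named ssh to enable SSH on first boot). A minimal sketch, assuming a WPA2 home network; the SSID, passphrase and country code here are placeholders, not my actual details:

```conf
# boot partition: wpa_supplicant.conf (picked up by the OS on first boot)
country=GB
ctrl_interface=DIR=/var/run/wpa_supplicant GROUP=netdev
update_config=1

network={
    ssid="YourNetworkName"
    psk="YourPassphrase"
}
```

The kit's own proxy-file mechanism may differ, but this is the stock fallback if it doesn't take.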
After several attempts and rechecks of everything, it seemed the machine was not going to connect to the mesh network. The proxy file that was supposed to copy the details over was failing (or just not happening). So I fell back to option 2: connect the Mac to the serial port board and SSH in. That did not work straight away; I did not see the device when plugged in via the shield, so I tried the USB ports on the Pi itself. The one I could reach, as a sensor was in the way, didn't seem to work either. Then I read that it is the middle port that needs to be used, and found this article (and the additional steps) very helpful to get it working and then to be able to switch back to the serial shield too. I SSHed into the device, ran sudo raspi-config and entered the network details remotely, and got a pleasing bing on my watch as a new device joined my mesh network. I was expecting to have 2.4/5GHz wifi trauma, but it all worked out.
It was not all quite as straightforward as when I started to put the thing together, but as I have lots of kit, cables, adapters etc., and residual knowledge of what remotely connecting into Linux via SSH actually is, it was like being back home. Though, again, as in my novels, my dislike of overly convoluted "all you have to do is type xxxxxyyy.xxx..xx..ss.www" remains. I can do it, always have to look things up, understand why I am doing what I do, but it's still based on adjusting to the needs of the tech, not the other way around.
Connected and seemingly stable, I then ran through some of the Python-based tests indicated in the article from SparkFun. It was here I was brought back to our old favourites: permissions. I tested the servos, camera, GPS and time-of-flight sensors, and all had various permission errors. It may be I have not got something quite connected, or ran something out of step, possibly missed some sudo prefixes and created some dodgy files. However, I am not overly worried; it is pretty much put together, and now that I am back in the s/w config arena, checking a few logs and hacking a few things should get me back on track next time I spend some time on it.
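A quick way to rule out the group-membership side of permission errors like these is to check which of the usual hardware groups the login user actually belongs to. This is just my own sketch, not from the SparkFun tutorial; the group names are the common Raspberry Pi OS ones and are assumptions, so check /etc/group on the device:

```python
import grp
import pwd

# Groups that typically gate hardware access on Raspberry Pi OS.
# These names are assumptions; verify against /etc/group on the Pi.
REQUIRED = ["gpio", "i2c", "spi", "video", "dialout"]

def groups_of(user):
    """All groups the named user belongs to, including the primary group."""
    primary = grp.getgrgid(pwd.getpwnam(user).pw_gid).gr_name
    secondary = [g.gr_name for g in grp.getgrall() if user in g.gr_mem]
    return set([primary] + secondary)

def missing_groups(member_of, required=REQUIRED):
    """Return the required groups the user is not yet a member of."""
    return [g for g in required if g not in member_of]

# e.g. missing_groups(groups_of("pi"))
# fix with: sudo usermod -aG gpio,i2c pi   (then log out and back in)
```

If that list comes back non-empty, a usermod and a fresh login is usually cheaper than chasing sudo-created file ownership.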
Then it will be into proper code again, with the Sphero Python SDK for the device. It looks as if there is enough on the iPad Pro to be able to engage and code with that, which will be preferable. I am stuck with a work Windows machine now, so the Mac won't be travelling with me; my iPad Pro, on the other hand, will.
Wish me luck 🙂 I will try and not make it take over the world. Yes, doing this will inform something interesting in book 3, I am sure of it, as well as helping my analyst day job and any educational stuff I do for STEMnet and BCS 🙂 It also felt really good to be getting hands-on with hardware and software again, especially something that didn't work first time.
Way back in the mists of time I wrote how utterly ridiculous the EA lockdown of accounts was, effectively removing any parental control in favour of a lazy and restrictive approach. That post, "A mess with Origin, EA and Xbox", has been the most commented and read post of all of mine here on Feeding Edge. It was written in 2014; it's now 2020 and still it gets comments from annoyed parents wondering what the heck is going on.
Yesterday Predlet 2.0 turned 13. He was both looking forward to, and then greeted by, a coming of age with the EA online system of bureaucracy. On sparking up Need for Speed he was informed he qualified for a magical teen account. After a bit of email address admin on my side we managed to accept it.
It means I will not be able to easily validate whether EA ever gets off this blanket removal of any parental choice, which I should imagine forces many parents to help their kids lie about their age on online accounts, simply to make life easier for EA.
Predlet 2.0 will remember EA doing this for a long while; whilst he is overjoyed at coming of age in the eyes of a games company, he and his friends are not as big fans of EA as a brand as they might have been.
I am all for controls on what can and can't be used online by minors, but equally it is ultimately parental controls that should be key, so that from an early age we can help kids come to terms with the online world through guidance and openness.
Another decade has passed into the annals of history. I know that applies to every unit of time, from the smallest of microseconds (and below), but people do like a good pattern and boundary to think about, don't they? I am no different, and a nice patterned number like 2020 seems pretty cool. Bearing in mind that growing up I used to read the wonderfully named 2000AD comic every week.
Here we were told tales of Judge Dredd in Mega-City One, Strontium Dog et al. Of course the comic was not about the year 2000, but back in 1977 the far-off turn of the century seemed a glowing future. So to be in 2020, despite some of its Brazil-like qualities, is pretty awesome.
Whilst I am slipping back into the '70s and '80s here, this should really be about what happened over the past 10 years, so here goes.
I started 2010 as the first full year out of corporate life, having quit in Feb 2009 to start Feeding Edge. There were a few months of things I am probably still not able to write about, and it was quite stressful, but I started 2010 with a bang, diving into a stack of projects, writing lots of code, startup land. I even wrote 7 predictions for the decade just passed. Well, you have to if you have a blog and it's rolling into 2010.
I suggested "Keep Walking. Obviously the big one to consider is what used to be called mobile. Everything is mobile now. We already have relatively easy access to 3G and wi-fi to allow our technology to be untethered. Of course this needs to be wrestled away from the anti-competitive telecoms companies. The thing stifling growth is the wayward charging mechanisms." I think that stood up OK; the real game changer is now going to be 5G and what software-defined networking does. SIM cards? e-SIMs are already a thing and there is plenty of non-SIM-based networking.
Also "Brands crossing digital borders. Engagement with people where they happen to want to be, online and offline, will have to increase. It will not be enough, as back in the early web, to just leave your website lying around to be found. Business has to become a travelling exhibit, a movable market stall that can be adjusted and placed wherever people are or want to be. Digitally, distance knows no bounds, but you need more than a signpost or banner ad. Active guides, persuaders, dare I say salespeople? Maybe I am referring to my evangelist brethren though? People who know the territory." I think this is probably a tick too, especially with the social media influencers now.
Also "3D Printing. 10 years should be enough for this to become "mainstream" as by then the transmission of 3D content and design with the associated rules and regulations, kite marks, certifications etc. will start to be in place. Why move goods all over the planet when you can make them locally? It really is a no brainer." Whilst not sat in our homes, mainstream style, the number of companies engaged in this is pretty impressive.
Check the rest out on the link; not least, I quote the current POTUS: "not interested in money but it helps to use it to keep score".
The end of 2010 and into 2011 saw the weird switch I had into kids' TV. The Cool Stuff Collective and the entire crew at Archie Productions made for the most entertaining work-based thing I have done with a team. I fully documented the fun and tech from this first post on. Yes, 3D printing on kids' TV in 2010. So, 39 episodes over 3 series. I think that's a pretty good run, and lots of people still get a little surprised in the few-minute intros I often do now.
In 2012, after I had slumped a bit and then gone mad getting fit, partly as a reaction to losing Dad and realising what my kids would go through if I didn't, we found Choi Kwang Do, which has been a major part of all our family lives this decade. We now have 3 2nd Dan black belts and 1 1st Dan in the house: 1 chief instructor, 2 assistant instructors and a leadership. 3rd Dan approaches rapidly this coming year for Predlet 2.0 and me. I can't say enough about the Choi family that we are part of.
In 2013 I was also carrying on the build of the Unity-based training hospital for Imperial College. This had started in Second Life, a great place to prototype and build live, but then got way more serious and complicated. I still look at the code and wonder how it wasn't a team of 20 people building it!
Also in 2013 the new (and still going version) of Rocksmith arrived. It is still the best use of software and gaming I have ever come across. Utterly fantastic (if you want to play electric guitar that is).
Jan 2014 saw my most popular post ever on anything. I complained at the mess that is EA and its management, or just plain locking out, of kids' accounts. It still gets comments and reads. The stunning inability of a company like EA to solve letting parents make choices for their kids, based on applying US law to us all, was, and is, ridiculous. Predlet 2.0 turns 13 in a few weeks and the problem will (I hope) magically go away. The post though… well, long may it help.
In April 2015 I blogged, partly in frustration about my longevity in the tech space and partly to get myself going again after a bit of a downturn in jobs and contracts. As I said then: "So I am left in a slightly bemused state as to what to do with this knowledge. With this all going so much more mainstream again I am no longer working in a niche. Do I ply my trade of independent consulting chipping away in odd places and helping and mentoring some of the new entrants in the market or do I try and find a bigger place to spread the word?"
In August 2015, whilst on holiday, I got a call about a potential new role, the one I started in April 2016. I was not sure it was going to take that long to start, but on September 3rd 2015 I started to write what I consider to be my best professional achievement, Reconfigure, and its follow-up Cont3xt. I had not originally planned to write fiction; a war story of the life of a metaverse evangelist 2006-2009 was more likely. However, I dived in and found the daily unrolling of the plot the most rewarding part of both books. The links are on the sidebar, or just head to Reconfigure and Cont3xt.
April 2016 I started as an IoT Analyst at 451 Research, where I still am, though as of today working for a much bigger corporate owner/partner. Writing is like code, with less compiling or interpreting to do. File formats for things like 3d models or animations also do not figure. I do miss programming properly, though I still have plenty of tech projects on the go for fun and for education.
Other very notable things from the past decade: we moved house from Warsash to Basingstoke, plus a lot of renovations and building. I served as a school governor for a few years. Jan went from strength to strength in her career once shot of IBM. We celebrated our 20th wedding anniversary; it's the 25th this year! I also became a Doctor of Technology in 2018!
As a family we travelled the world. Japan was stunning (my 50th), as were the northern lights and the ice hotel (Jan's birthday treat). We took an RV around Canada. We toured LA, San Fran, Yosemite, Death Valley, Vegas and the Grand Canyon in a car. I have ended up in Vegas, San Jose, Boston, Raleigh, Chicago, Barcelona, Madrid, China and a few other places with work.
We are on our 3rd electric car with the latest Nissan Leaf. We also have a robot Japanese toilet and a deep Japanese bath to help us relax, inspired by the trip.
Despite all the Choi, I realised I had put on way too much weight in 2018, ending up at over 120Kg, so last year started with an intense 800-calorie diet and a stack more Choi and walking, which dropped me to around 95Kg before I decided to take a rest in April. Then I slipped over in the kitchen at Easter, gave myself a severe concussion and sort of lost the ability to push that particular agenda for a while, so I am back at 105Kg now, which is still OK, but 2020 is another burst of fitness/weight loss that I know I can do, as I have done it before. Having dropped over 25Kg I figure I can do another 15 or so.
So what do the roaring '20s have on offer? Who knows; maybe early retirement due to the rip-roaring success of a Netflix series of Reconfigure, a fantastic lottery win, or who knows what. The tech is going to be brilliant, the games are going to be awesome and life is going to rock.
There has, in the gaming world, been a lot written, tweeted and said about Google's new online cloud-based gaming service, Stadia. It has turned out to be pretty much what I expected it to be, but that doesn't stop it also delivering some fresh letdowns.
The principle of just using a controller and a "dumb" screen with a fast internet connection to a service has clearly worked for things like Netflix and Amazon Prime Video; we are able to watch UHD content flawlessly (most of the time).
However, a game service such as Stadia is a bit more complicated, as unlike a film, a game is rendering content as needed based on input from the user's controller; it is not simply a stream of known content that can be cached or buffered. After all, in watching a film, when you press play, if the film starts 0.5 seconds later to give the buffering a chance to provide smooth images based on network conditions, you just would not notice. In a game you do, so latency and any delays are utterly obvious.
In IoT, especially the industrial internet of things, we discuss edge vs cloud and latency, and processing data as close to the source as possible is an obvious solution, rather than round-tripping to the cloud.
A game is pretty much an ideal case for edge computing, and has been for many years. That does not mean the edge needs to be isolated; you cache and render locally but may get elements of data remotely, like the position of other players. Those few streamed coordinates are very different to a full-frame 4K image that is rendered remotely, followed by another and another at 60fps.
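To put rough numbers on that difference, here is a back-of-envelope sketch; the compression ratio, player count and update sizes are illustrative assumptions of mine, not Stadia's actual figures:

```python
# Back-of-envelope: remotely rendered video vs syncing a little game state.
# All constants are illustrative assumptions, not measured Stadia numbers.

WIDTH, HEIGHT, FPS = 3840, 2160, 60
BYTES_PER_PIXEL = 3        # 24-bit colour, before compression
COMPRESSION = 200          # assumed video codec compression ratio

raw_bps = WIDTH * HEIGHT * BYTES_PER_PIXEL * FPS * 8
video_mbps = raw_bps / COMPRESSION / 1e6

PLAYERS, UPDATES_PER_SEC = 16, 30
BYTES_PER_UPDATE = 24      # e.g. id + x, y, z floats + orientation

state_kbps = PLAYERS * UPDATES_PER_SEC * BYTES_PER_UPDATE * 8 / 1e3

print(f"streamed 4K/60 video: ~{video_mbps:.0f} Mbps, every frame latency-critical")
print(f"synced player state:  ~{state_kbps:.1f} kbps, rendering stays local")
```

Even with generous compression the remote-render stream is three orders of magnitude heavier than the coordinates a locally rendered game needs, and every one of those frames sits on the latency path.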
Hence my experience of Stadia is this…
Glitches in sound, dropped frames, intermittent pauses. I have not captured the worst of it, as if you capture in game via Stadia it captures at source, and the videos are perfect. So that indicates it is not the end infrastructure for the game but the network between me and it. Google suggests that with the Chromecast box that plugs into the TV you also use an Ethernet cable and plug straight into the router. I do this with the Xbox One (due to its wireless interface being so bad), but the PS4, Switch and various iPads, PCs and Macs are all wireless and have no trouble with "networked" games or downloads. That of course is all because the content is local; the rendering is local. My network is a very robust mesh network, with many kinks ironed out working from home and being big consumers of digital content over the years. We also have 70Mb broadband.
I wanted the cloud approach to work; in fact we had OnLive on The Cool Stuff Collective back in 2011, as part of a piece where I described what cloud computing was, 9 years ago. OnLive was trying to get cloud gaming going. In emerging tech we are well used to the ebb and flow of trying something and it being too early, waiting, then it returns (look at VR and virtual worlds), and eventually it takes, but at the moment Stadia doesn't seem to have it sorted.
It is not just the intermittent breaks in smooth content that are a problem at the moment though. I have a Stadia Pro subscription, one where you apparently get games under an all-you-can-eat subscription. The only trouble is there are 2 games: Samurai Shodown, a retro beat 'em up (fast moving, so lag matters), and Destiny 2, whose redeeming feature is that it has cross-platform character saves. The rest you buy (and there are not really very many of those either that I don't have on other platforms already). Nonetheless I re-bought Red Dead Redemption 2 just to try and support the platform and see what it could do. I have several times been greeted with this experience.
The game thinks it has lost the controller; the controller is still able to summon up Stadia content, stop the game etc., but none of the buttons worked.
I also stepped away from RDR 2, having paused it for a relatively short period, just a few minutes whilst I sorted something out, and returned to this.
A Stadia message saying I had been dumped out of the game, where the mission had not completed, so it was not saved. Some RDR 2 missions take a while, and between the choppiness and this, the level of frustration destroys the experience.
I was also expecting to be able to play on my iOS devices, but I can't yet. On the PC the system downscales, so it will not do 4K images (not that it is real 4K anyway).
Friending people doesn't work. Purchases of content (should you want any) have to be made in a mobile app, not in the system, and replays and captures are only viewable on mobile. The list goes on.
Microsoft and Amazon are waiting in the wings with their streaming services. For me, Microsoft, with my Xbox Game Pass library and purchased library, will be a bigger draw, though the network issues will be the same for any provider. Yes, OK, 5G might help, but in reality that's a way off yet from running at full pelt. Latency and physics will always remain an issue (unless we deal with quantum physics, that is!)
A few weeks ago we upgraded our EV, a Nissan Leaf, to the newest version of the Leaf Tekna. It is now our 3rd Leaf, having been very early adopters over 5 years ago. The last two we had were pretty similar, but the new Leaf has undergone some significant changes, and they are all good I have to say.
The shape and outside styling is a little more angular with cuts and flashes of bodywork and lights, which is pretty much the styling of most of the Nissan range now. In particular the rear lights wrap around to the side of the car.
The changes on the interior are a little more obvious with a new centre console and a few of the buttons have shifted around, like the eco button and strangely the start button. A new steering wheel has a squared off base, again like many other Nissans in the range.
Our previous two cars had a Type 1 charger, and we have an installed cable unit; this new car has Type 2. I thought we would easily be able to upgrade the cable/charge unit, but it seems the various government grants to install a charger do not apply to upgrades. So it is around £900 to get a new charging unit, even though the installation would just be switching the head unit, not the 6-hour job it is normally scheduled to be. However, a range of adapters exists, so for £100 we have a solution in place. Not ideal, but it made the scheduling of the handover from one car to the other less complex.
The big changes are really quite impressive though. We have ProPilot on the car, which is a modern set of driver assistance tools. It is not quite a Tesla in the self-driving sense, but it certainly helps get closer to not having to do the driving. Now purist drivers and petrol heads (I think I am/was one) might baulk at the thought of not being in complete control, but we have had a lot of computer-controlled safety aids on vehicles for many years: traction control, ESP, ABS braking and even sat nav. ProPilot, when on a motorway, allows the car to maintain its lane, speed and distance from any car in front, handling the acceleration, braking and steering. You do have to keep hands on the wheel though. If you want to overtake, you indicate and start to steer; the car then accelerates to get past the car you are overtaking. It is particularly good when the cars in front start to brake. I was hovering over the brake pedal on my first longish journey, and it started to brake for a stoppage just before I was about to hit the pedal. It needs white lines to follow and lets you know if it can see them; it also lets you know if it thinks it is following a car in front. You can set your comfort distance from the car in front as well, via a 1, 2 or 3 bar system. It can feel a little odd, as it makes small corrections in steering that maybe as a driver I would not bother with.
Knowing whether to trust code and systems like this is a tricky one. However, on the way back from our recent holiday I got to drive the hire car back to the airport; it was a diesel stick-shift Nissan but did have some of the smart features and sensors. It reminded me what an archaic system a stick shift is. Yes, it gives you plenty to do, but it forces us as drivers to adjust to the design of the engine and gearbox. It seemed a right pain. Then, when we got back to Gatwick, I got to drive our 2007 petrol automatic Honda people carrier back to Basingstoke. This has almost no driver aids or sensors, except a reversing sensor. It does have cruise control, but I have never used it as it has no idea what is going on in front of it. It is a blind lump of metal. A lot has changed in the past 12 years. When I got back I popped into the new Leaf and headed to the supermarket. With all the sensors running, just on local roads, it felt brilliant, but it was also faster, more fun and much more agile and light to drive than both the ICE cars that day.
One of the other tricks the Leaf has is one-pedal driving. EVs don't need gears due to the torque of the electric motor, hence no clutch, but they do have a stop and a go pedal. With a switch on the console you can change to one-pedal driving, using only the accelerator. If you lift off the accelerator the car slows down, applying the brakes itself as needed. This feels even weirder than the self-driving motorway ProPilot, but it actually works really well. It is still using all its senses to understand what is in front, even being able to emergency stop, or at least start the process quicker than the driver can. You can still apply the brakes; it's not that they are turned off, BTW. As you approach a roundabout or junction and ease off, you quickly start to feel how much to reduce the pedal by; it almost feels as if the car learns from that, not that it really does. It is also using regenerative braking to recharge the battery, something they have got better and better at over the years. In stop-and-go traffic jams it removes almost all the hassle, aside from the delay. Gently creeping forward and stopping dead as you lift off the pedal works really well. I certainly don't miss the clutch, brake, try-not-to-stall, apply-handbrake, repeat malarky.
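The core idea of one-pedal driving is surprisingly simple: one pedal position maps to both drive torque and braking. A minimal sketch of that mapping, assuming a made-up threshold below which the car brakes (this is my own illustration, not Nissan's e-Pedal logic):

```python
# Illustrative one-pedal mapping: a single pedal position commands both
# drive torque and regenerative braking. The 0.15 threshold is an
# arbitrary assumption, not a real Nissan calibration value.

def pedal_to_torque(pedal: float, threshold: float = 0.15) -> float:
    """Map pedal position (0.0 = fully lifted, 1.0 = floored) to a
    torque demand from -1.0 (max regen braking) to +1.0 (max drive)."""
    if pedal >= threshold:
        # Above the threshold, scale linearly up to full drive torque.
        return (pedal - threshold) / (1.0 - threshold)
    # Below it, blend in regenerative braking: the further you lift
    # off, the harder the car slows, down to a dead stop.
    return -(threshold - pedal) / threshold

print(pedal_to_torque(1.0))   # 1.0  -> floored: full drive
print(pedal_to_torque(0.15))  # 0.0  -> coasting point
print(pedal_to_torque(0.0))   # -1.0 -> fully lifted: full regen braking
```

That "feel for how much to reduce the pedal by" is essentially learning where the coasting point sits and how steep the braking side of the curve is.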
We did not opt for the self-parking option on the car, as another 5k was a bit steep for something that is so easy to park anyway with its all-round camera and sensors. It even looks out the back of the car both ways for you as you reverse, to detect a sudden oncoming car or pedestrian.
Another nice thing about the vehicle is that it now has Apple CarPlay. Plug the phone in and I get music apps like Spotify through the awesome Bose sound system. It also reads out incoming text messages, which can be responded to via voice commands too. It is a little confusing having the in-car voice control with Siri loaded alongside it, but I like it.
EVs always have the range question thrown at them. It is not a major problem for us, as most journeys in this car are the shorter town ones, which is why we switched in the first place, but both range and power have been upped. In the last one I used to press the eco button on the steering wheel to turn eco off and get a turbo boost to leave a roundabout if needed. So far, even in full eco and full energy recovery, this car launches pretty much the same, it seems. There is a difference with eco on or off in the amount of pedal travel on the accelerator and how much power it delivers, but it is still very nippy. Having been a long-time Subaru Impreza WRX owner, that thing launched with a very impressive 0-60, but the Leaf off the line feels faster and lighter. EV power delivery is pretty constant, so yes, it would fall away after the 0-30, but I think many people don't realise just how quick these things can be. It is how heavy-footed you are, or how fast you drive, that impacts the range. The last Leaf on full charge indicated a guessed range of 110 miles. This could easily be halved on a motorway at 70. This Leaf indicates around 150 miles on charge, but it doesn't seem to tail off on the motorways quite as much. I am sure it can easily do 100-120 miles, probably the full 150 on country roads.
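The guessed range is really just battery capacity times driving efficiency, which is why speed and heavy-footedness matter so much. A back-of-the-envelope sketch, where the 40 kWh pack size and the miles-per-kWh figures are my own assumptions for illustration, not official Nissan numbers:

```python
# Back-of-the-envelope EV range estimate. The pack capacity and the
# efficiency figures below are assumptions for illustration only.

PACK_KWH = 40.0  # assumed usable battery capacity


def estimated_range(miles_per_kwh: float, pack_kwh: float = PACK_KWH) -> float:
    """Range in miles for a given driving efficiency (miles per kWh)."""
    return pack_kwh * miles_per_kwh


town = estimated_range(3.8)      # gentle town driving
motorway = estimated_range(2.8)  # sustained 70 mph
print(f"town: ~{town:.0f} miles, motorway: ~{motorway:.0f} miles")
```

With those assumed efficiencies you get roughly 150 miles around town and a bit over 110 on the motorway, which lines up with the tail-off I see between the dashboard guess and sustained 70 mph driving.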
I was very happy that the Robosen T9 transforming robot that I backed on Kickstarter arrived this weekend. I have backed a lot of interesting things on the crowdfunding site and most have proven to be pretty good.
T9, though, made me smile and laugh the most, combined with a slight tingle on the back of the neck. It is not a net-connected AI, but it is a clever piece of mechanical engineering, as it does actually transform from a driving car to a walking and dancing robot.
The instructions and details were a little on the light side, as I have not yet figured out the angles and coordinates for the 22 joints in the programming environment. I asked a leg to move, got the wrong angle, and it fell over 🙂
Nonetheless, just asking it to transform is pretty cool I think 🙂 It has lots of preset manoeuvres; it even does push-ups 🙂
I was recently asked to write a piece for the BCS (the professional organisation that many UK techies belong to, formerly known as the British Computer Society, but now more global). I have been the chair of the Animation and Game Specialist Group for a few years, and obviously I have worked with elements of game tech for a good few years. That now includes, at 451 Research, the elements of Augmented and Virtual Reality having an impact on the world of IoT. AR is the user interface for IoT, and is built with game-engine workflows and tools as much as any leading-edge game.
The piece covers a stack of things that seemed to come together and influence me to be interested in tech, much of it sci-fi based. It was a world where we didn't have the sort of access to technology we have today, but just enough to hint at where we were heading.
Feel free to go and have a read and reminisce here with The Memoirs of a Bedroom Coder. I had pondered writing a factual book about what influenced me and this weird career path I have taken, but instead I bundled some of those attitudes and thoughts up into Reconfigure and Cont3xt as science fiction. (Both will be free from Saturday 24th August to 29th August, if you can't bear to part with 99p/99c to read them.)
Meanwhile, this weekend, whilst checking out the awesome cars at Carfest South on Sunday, the family will also be enjoying some live music by 80s icons Boy George & Culture Club and The Human League. So I am sure my (nearly) 52-year-old brain will be flashing back and pondering the future at the same time.
If you need me the rest of the Bank Holiday weekend I will be mostly with my head in an Oculus Rift or PSVR enjoying the fantastic No Man’s Sky update bringing it to VR after 3 years.
Or out in the garden playing the US game of cornhole: chucking bean bags 30 feet into a hole, in competition. It's a real sport, you know! It's as low tech as I am going to go :).