This weekend I was finally able to purchase a digital copy of the film version of Ready Player One. It is something we saw as a family at the cinema, but I had been wanting to take another look. I don’t buy many individual movies any more as, with Sky, Netflix and Amazon, things are usually easy to access as needed. It seems the US got the home release first, another weird thing to be doing in the 21st century; slowly rolling out digital/streaming availability is so archaic and only really leads to piracy and lost revenue.
I had to make a choice on how to get to a digital version of the movie. Do I pay Sky for its Buy & Keep, where they also send a Blu-ray to go with the download? Or buy a Blu-ray and attempt to figure out the film industry approach to digital, which, as with the release schedule, is pretty archaic? Instead I went for Amazon Video. This we have on all our TVs in the house, computers, pads and phones, alongside Netflix. The last movies I bought were also on Amazon, John Wick 2 and before that The Raid 2, so it’s ideal to have next door to those.
I sparked up Amazon on the TV and home cinema and started to enjoy spotting even more of the references from my childhood and growing up through the 80s as a gamer. Plus of course the whole depth of engagement in VR is what flows through my head most of the time as a metaverse evangelist. Really though, it is this targeted set of references and in-jokes for me and my generation of geeks that I love. Wade’s DeLorean with what seems to be a Knight Rider front grille, Duke Nukem aiming a launcher during the first battle scene, Hello Kitty and friends waddling along the bridge next to Wade; it goes on.
Being in VR though, with the headsets, suits and running walkways, I thought I should at least try and watch it on my Oculus Go. I paused the TV stream and popped on the headset that is always next to me. I hoped an Amazon app had made it to the store, but no. Instead, using the web browser, I headed to Amazon Video, logged in, found my movie, clicked play…. nothing. It was a little deflating. Then I noticed the “Use Desktop version” button in the top right and bingo, it worked. I picked up where we left off at the start of the race, with characters like Street Fighter’s Ryu wandering off to his car and the Akira motorcycle razzing into view. Selecting full screen gave a nice curved view of the 2D movie, enough that I had to turn my head to look across the picture a little. Proper front row stuff. Not as immersed as the main characters are in the OASIS, but a nice blend of traditional film with new view tech, and all a little bit meta 🙂
Another thing I noticed, which I think I had forgotten, was that when Wade first appears he has green hair, which he then dynamically changes to show avatar customisation in the OASIS. For me that’s a big “yay”, because I spent a long while with green hair in Second Life before finding my predator avatar. My green spiked hair was a thing I identified with, and it even got 3D printed. I also often referenced it in corporate presentations, as the choice to have green hair is a subtle prod at conformity and social norms, just on the border of what would be acceptable to many in an office. It is not as full-on as extreme body shapes or cartoon looks. The predator AV took this a stage further in those conversations. I also use green hair in almost every avatar on every platform, as it’s a much quicker mental attachment than finding or trying to build a complex Yautja avatar. PS Home famously (from my point of view) did not allow green hair. Very odd!
So, seeing Wade in a movie about VR, the 80s and my generation, my ego naturally assumed that any research the production team did may, just may, have come across my green-haired metaverse evangelizing 🙂
If you like Ready Player One, or even if you don’t, you will still like my own VR and AR storytelling, with real-world gaming and tech references, in Reconfigure and Cont3xt. It’s got a love of Marmite in it too 🙂
These years just tick away, don’t they? Last year finished in a big rush of work, with reports to get out before I took the rest of my holiday that was sitting there ready to expire. I was really pleased that I got my AR and VR long-form report out for 451 customers. These are 6k or so words and pictures, but I was not expecting to be in a position to share Augmented and Virtual Reality insights quite as much as I have this past year. It is what I know, and what I feel of course, but there are a lot more things in IoT, especially industrial IoT, to try and stay on top of. However, AR is the UI for IoT, so there it is. It was over 18 years ago, at the turn of the century, that I was trying to get a shared avatar space of our offices working with presence of people! Then along came Second Life, which made things a lot easier in 2006.
2017 was supposed to be the VR year, and it sort of was, but AR is hot on its tail, and finally Magic Leap, on the 20th December, unveiled their AR goggles. Yes, that was just a few days AFTER I managed to ship the AR/VR report with the words “we shall have to wait and see what they do” in it. Though the shutdown of Tango, which I did get in as an amendment, waved the flag that something would happen.
No prices, no dates other than 2018, but it’s a jump in technology with light field (hopefully).
We are also very close to Ready Player One hitting the screen, which will be the main point of reference for escapist VR, with Spielberg directing. So it seems all the future-thinking stuff we have all been doing is going mainstream to some degree or other.
Apple ARKit and Google ARCore put AR tracking into the hands of everyone with every mainstream device too.
Maybe a few more people will spot my novels and enjoy them with all the AR and VR, IoT and Quantum theory in them 🙂
Recently the fledgling AR industry got a bit of a boost with the announcement of Apple’s ARKit. This bit of software layering is supported by Unity3d and Unreal, to name a few, and puts some AR elements directly into the OS of the iPhone and iPad. That of course is a bit of a problem for the many other AR toolkits that have been around for some time, but these things happen. Developers are already diving in and making interesting things, such as this Minecraft-style tech demo.
The ability to position and accurately keep track of objects in the camera view of the world is a core part of what happens in my novels, including the “how to build stuff in Unity3d” parts of Roisin’s inner dialogue. Of course, the bit about how she gets to instrument the world is further off, but quantum communication is getting closer, and, well, that leaves just a little sliver of fiction left that powers the stories 🙂
These things add to the amazing amount of tech out there to keep track of and understand, something my IoT analyst day job keeps me very busy doing. Luckily (well, by design really) my research agenda incorporates AR and VR and whatever XR comes along, so I am still able to build on all these many years of being in the business of virtual. It is surprising how many companies are deep into it, but still keep it a little quiet, probably out of the same embarrassment or resistance we saw back in 2006 when showing Second Life as an integration platform with both people and real-world data. But oh, it had colourful graphics and game-like features; how could that be “serious”? It still makes me laugh to think of the resistance to it, as with the web, e-commerce and social media. The same is happening with blockchain, AI, IoT and autonomous vehicles. Lots of stuck-in-the-mud attitudes: it will never catch on… (a few years pass) oh look, my entire business has been disrupted… why didn’t I listen…
Anyway, keep an open mind on tech and its interaction with society. It’s not all version numbers and installations; people are in the mix and very much part of whatever ecosystem is forming. Why not read the sci-fi adventures whilst they are still fiction? Look back in a few years and they will read like a documentary. (That’s by design, not accident, BTW.) Reconfigure is the place to start; there are some links on the right 🙂 Enjoy the future.
I had a trip to Jersey this weekend, sponsored by Jersey BCS, so that I could be an out-of-town judge at the #hackjsy game development event. The focus here was for teams to build something game related in 36 hours. With my BCS Animation and Games specialist group hat/badge on, it made a lot of sense to go and see what was going on.
I also treated the trip as a reacquainting of myself with travelling on business, the family getting a chance to see I am not there all the time, but just for a short first stint. I also thought I would test out the new clothes for travel comfort. That test worked, but in a way I was not expecting. I have usually been wearing combat trousers, and my phone sits nicely in the leg pocket. Instead it was inside my new jacket/waistcoat arrangement. As I parked the car at the airport and hoofed my overcoat on with a hunch of the shoulders, my iPhone 6s Plus felt the urge to slide upwards out of the shiny new pocket and propel itself face down onto the floor. I knew it was not going to be too well, but I was surprised at just how smashed it made itself.
It was completely unhappy with any sort of interaction. I couldn’t power it off with a slide either. I tried the power and home buttons together for a few seconds and it shut down. The Jersey flight is only 40 minutes in a turboprop, but they don’t like phones being on. It is not so bad these days to be without a phone, as if you have a laptop/pad etc. wi-fi is readily available, so I let home know I was not going to be texting and Jersey know I was not going to be ignoring them if they called.
Whilst there I got to talk to a lot of people from all over the island in different industries. It was great to catch up with the guys from vizuality as they are making huge strides in the areas of installation experiences using VR. Tracking users in a 10x10m space and providing headset visuals as they wander around.
I spent the Saturday hacking too. I looked a little into IBM Bluemix and its Unity API for text to speech, using my own book quotes to see if it could cope. They all still have trouble pronouncing Roisin though 🙂 I also then spent a bit more time on my Vuforia AR covers for the books. I decided that Reconfigure should have the variant of the block world view that Roisin sees and builds in her own Unity application.
Then on Cont3xt I explored writing a scene changer, so that at certain intervals the models and view would swap. Initially I did that by toggling the image targets, but that did not trigger the re-viewing of them, as Vuforia expected the same target to have the same stuff on it. However, swapping the game objects attached in the tree, turning them on and off, worked, just as the animation does. So I now have a little bit of authoring infrastructure that makes it easier to add multiple scenes and play through them.
I was going to do something with the Leap Motion sensor too, but that fitted more with having an AR headset to interact with the book covers; with a broken phone and other judging work to do, I parked that one.
I also had a lot of conversations around IoT in various forms, and a bit of a chat about blockchain too. Jersey may only have 100,000 people on it, but there is a vibrant tech community there. It was a great trip, and the phone is now repaired (the Jersey shop wasn’t able to do a 6s Plus, so Apple Basingstoke did it in 1 hour). It also now has a proper case. I had avoided that for ages, never having broken the phone before. Rather than take a picture of the phone in a mirror to show the case, I sparked up the AR Unity application, put the Reconfigure picture on the phone, and then the Mac did its thing and rendered Roisin, holding her phone with a view of the world that she sees.
Just to reiterate the loops within loops here. I am rendering an AR representation, using Unity3d and an iPhone, onto a digital version of the cover of the ebook that contains a story about Roisin discovering a way to see the world in terms of position and labels, which she expands on by writing an application in Unity that runs on her iPhone. I will share this post on Twitter. The same Twitter (happy 10th birthday) that she uses to accidentally discover her new-found abilities when she accidentally types into the wrong window. Meta enough? 🙂
Anyway, well done everyone at #hackjsy, great organization, great participation, great fellow judges and a great island. See you all again soon I hope.
Yesterday I popped along, on just an expo pass, to London’s ExCeL centre for the Wearable Technology Show. Despite its name, it is not all about wrist watches that check your heart rate. The organisers have recognised the wider implications of things getting more instrumented, more data flowing and more business opportunities in the Internet of Things industry pattern. I had several reasons to pop along; on the main one there will be a little more later. I alluded to some big changes in a tweet. I don’t mean to tease, but I am going to anyway.
One of the reasons I went to the show was to see some of the sports tracking wearables. This was more in keeping with the title of the show. I was interested in both the physical monitoring, taking it past heart rate, and how the coaching software was shaping up to make sense of the data. I was also interested in anything that helped track the type of movement. Both of these come from my training and teaching in Choi Kwang Do. A lot of the newer body monitoring kit was being built into skin-tight performance clothing. That seems a good idea in general; for a martial art, not having things on our wrists, yet getting some great training and tuning feedback during cardio and PACE training, is going to be useful. There was only really one body movement tracker, related to boxing. It was based on the accelerometer principle, combined with an app that told you how many punches you were throwing, at what rate, and also the speed of each punch. It is not out for a while, but it will be interesting to try this version with our more unusual martial arts moves. It is called Corner.
Another reason to go, just in general, because it’s what I have worked with for years, was the VR and AR aspects of the show. There were a lot of headsets, both fully immersive Rifts and Gear VRs, and also lots of Pepper’s ghost (not really a hologram) heads-up displays. There were some interesting uses to track warehouse goods in a HUD, and also projection onto a surface to avoid the need for glasses at all.
One company I spent a bit of time talking to, and taking the demo of, was vTime.
The preamble was good, the demo was great, and I wish them all the best of luck. It is a 3D chat room, avatar based, where the users choose a lovely rendered scene to sit in and converse, soon to share pictures etc. It is claiming to be a sociable, not social, network application. It is targeted at mobile first. I kept hearing the Zuckerberg quote about how one day people will just sit around a virtual campfire. Of course I was taken back to 2006, but tried not to get all grumpy and remind them everything old is new again. The campfire was a key part of the imagery and the experience we had online back then. Though then we could get up and walk about in the free-form environment. In fact we still can, and SL had VR support, i.e. two cameras, one for each eye. I still liked this new application, and if people connect and enjoy it, then so be it 🙂 I would dive in, but it was focussed on Gear VR and I only have my iPhone (and a load of things to drop that into to get stereo vision).
The full post this was in (to date it) is here.
Also, Rita J. King was our embedded storyteller/journalist and wrote tales from the fire pit to explain the rise of the virtual community powered by people, avatars and Second Life. The PDF of the story is here, and should be useful reading for the next wave of virtual environments, headset or otherwise. I would say it is essential reading, in fact 🙂
The Economist had a stand showing a great use of virtual worlds. They have a reconstruction, from photos and other data, of the Mosul artefacts that have been destroyed in the conflicts in the Middle East. The VR was a little old fashioned, but the principle and content were good.
It was fun talking to the marker-based AR developers as they showed me things like book covers coming alive. Once again, I had to let them know I knew a little about it, and of course I tried to sell the idea in my books to people. If I had thought about it, I would have had a stand to show sci-fi novels about VR and AR and IoT with Reconfigure and Cont3xt 🙂
I mentioned serendipity in the title. I had tweeted my location as I got to ExCeL, but then not checked Twitter as I was going around the exhibits. As I tweeted that I was leaving, I saw that my colleague from way back, Martin Gale, was at the show, speaking and presenting. It was fantastic to catch up, see how well he is doing, and quote the Fast Show (“Ah, Ted”) multiple times. Thank you for the coffee 🙂 It made a great end to a fantastic day out. A day in which I got to practise a little of what I will be doing in the very near future, around the subjects that I will be immersed in. I guess in startup terms someone would call it a pivot. There I go teasing again. Watch this space, if you are interested 🙂
I thought it would be interesting to revisit some of the Augmented Reality tools that have Unity3d packages for them. The first I headed for was Vuforia, now owned by PTC. I am using the free version. It all worked straight from the install, though there are a lot of features to tinker with.
I wanted to use an image marker of my own, so naturally I used the book cover. There is a lot of AR and VR in the story so it all makes sense.
For the 3D models I used the current trial of Fuse/Mixamo, now part of Adobe. It lets you build a model and export it in a nice friendly Unity3d FBX format. You would be surprised how these various formats for models and animations hide so much complexity and weird tech problems though. Eventually I found a couple of animations that fitted the mood. This is not a full action sequence from the book, but it could be. I have some more vignettes in mind to experiment with. I am just upgrading my Windows machine so I can use the Kinect 2.0 for some of my own motion capture again. Finding the right animation, and then trying to edit it, is really hard work. The addition of motion, especially in humans, hits a different part of the brain I think. It makes lots of things look not quite right.
So here is the video of augmented reality on the cover of a physical copy of an ebook novel telling a story with lots of augmented reality in it, if that’s not too meta. Roisin rather sassily taps her foot while the Commander lays down the law to her.
Reconfigure and Cont3xt are available right now to enjoy, and if at all possible, PLEASE write a review on Amazon, it makes a massive difference to the take up of the books.
Wow. It is seven years since I started Feeding Edge Ltd. That is quite a long while, isn’t it? The past year has been a more difficult one, with less work in the pipeline for most of it. It has meant I have had to take stock and look to do other things whilst the world catches up. It does seem strange, given the dawn of the new wave of Virtual Reality and Augmented Reality, that I have not managed to find the right people to engage my expertise and background in both regular technology and virtual worlds. In part that was because I was focussing on one major contract; when that dried up, suddenly there was nowhere to go. It is starting from zero again to build up and sell who I am and what I do.
My other startup work has always been ticking along under the covers, as we try and work the system to find the right person with the right vision to fund what we have in mind. It is a big and glorious project, but it all takes time. Lots of no, yes but, and even a few let’s do it, followed by oh hang on, can’t now, other stuff has come up.
On the summer holiday, as I had a good long think about whether to give this all up and try to find a regular position back in corporate life, I was hit with a flash of inspiration for the science fiction concept. It was so obvious that I just had to give it a go. That is not to say I would not accept a well-paid job with slightly more structure to help pay my way. However, the books flowed out of me. It was an incredibly exciting end to the year. Learning how to write and structure Reconfigure, how to package and build the ebook and the print version, how to release and then try to promote it. I have learned so much doing it that helps me personally, helps my business and will also help in any consulting work I do in the future. I realised too that the products of both Reconfigure and Cont3xt are like a CV for me. They represent the state of the virtual world, virtual reality, augmented reality and Internet of Things industries, combined with the coding and use of game tech that comes directly from my experiences, extrapolated for the purpose of storytelling.
Write what you know, and that appears to be the future, in this case the near future.
This year I have also been continuing my journey in Choi Kwang Do, this time with a black suit on as a head instructor. It has led me to give talks to schools on science and why it is important, with Choi Kwang Do as a backdrop and hook for them. I am constantly trying to evolve as a teacher and a student. Once again the reflective nature of the art was woven into the second book, Cont3xt. I did not brand any of the martial arts action in the book as Choi Kwang Do, as that might misrepresent the art and I don’t want to do that, but it did influence the start of the story with its more reflective elements. Later on a degree of poetic licence kicked in, but the feeling of performing the moves is very real.
I have continued my pursuit of the unusual throughout the year. The books as a product provide, rather like the Feeding Edge logo has in the past, a vehicle to explore ideas.
I still really like my Forza 6 book-branded Lambo, demonstrating the concept of in-world digital product placement.
If you have read the books (and if not, why not? They are only 99p) you will know that Roisin likes Marmite. Why not? I like Marmite; again, write what you know. It became a vehicle and an ongoing thread in the stories, and even a bit of a calling card. It is a real-world brand, so that can be tricky, but I think I use it in a positive way, as well as showing that not everyone is a fan. So it is just another real-world hook to make the science fiction elements believable. So I was really pleased when I saw that Marmite had a print-your-own-label customisation. It is print-on-demand Marmite, just as my books are print on demand. It uses the web and the internet to accept the order, and then there is physical delivery. I know it’s a bit meta, but that’s the same pattern Roisin uses; just the physical movement of things is a little more quirky 🙂
I have another two jars on the way. One for Reconfigure and one for Roisin herself.
I am sure she will change her own Twitter icon from the regular jar to one of these later as @axelweight. Yes, she does have a Twitter account; she had to, otherwise she would not have been able to accidentally tweet “ls -l” and get introduced to the world-changing device @RayKonfigure, would she?
All this interweaving of tech and experience, in this case related to the books, is what I do and have always done. I hope my ideas are inspirational to some, and one day obvious to others. I will keep trying to do the right thing, be positive and share as much as possible.
I am available to talk through any opportunities you have, anytime. epredator at feedingedge.co.uk or @epredator
Finally, last but not least, I have to say a huge thank you to my wife Jan @elemming. She has the pressure of the corporate role, one that she enjoys, but still it is pressure. She is the major breadwinner. You can imagine how many 99p books you have to sell to make any money to pay for anything. She puts up with the downs whilst we wait for the ups. Those ups will re-emerge; this year has shown that to me. No matter how bleak it looks, something happens to offer hope. I have some new projects in the pipeline, mostly speculative, but with all these crazy ideas buzzing around, something will pop one day.
As we say in Choi Kwang Do – Pil Seung! which means certain victory. Happy lucky 7th birthday Feeding Edge 🙂
All the technology and projects I have worked on in my career take what we currently have and create or push something further. Development projects of any kind will enhance or replace existing systems or create brand new ones. A regular systems update will tweak and fix older code and operations and make them fresher and newer. This happens even in legacy systems. In both studying and living some of the history of our current wave of technology, powered by the presence of the Internet, I find it interesting to reflect on certain technology trajectories. Not least to try and find a way to help grow these industries, and with a bit of luck actually get paid to do that. I find the idea of things finding out about other things fascinating. For Predlet 2.0’s birthday party we took them all karting. There was a spare seat going, so I joined in. The karts are all instrumented enough that the lap times are automatically grabbed as you pass the line. Just that one piece of data for each kart is then formulated and aggregated, not just with your group, but with the “ProSkill” ongoing tracking of your performance. The track knows who I am now I have registered, so if I turn up and race again it will show me more metrics and information about my performance, just from that single tag crossing the end-of-lap sensor. Yes, that is IoT in action, and we have had it for a while.
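Even that single end-of-lap reading per kart is enough to build up useful metrics. Here is a minimal sketch of that sort of aggregation in Python; the kart IDs and times are invented for illustration, and the real ProSkill system obviously does far more:

```python
from collections import defaultdict

def lap_stats(crossings):
    """Given (kart_id, lap_time_seconds) readings from the line sensor,
    aggregate lap count, best lap and average lap per kart."""
    laps = defaultdict(list)
    for kart, t in crossings:
        laps[kart].append(t)
    return {
        kart: {
            "laps": len(times),
            "best": min(times),
            "average": round(sum(times) / len(times), 2),
        }
        for kart, times in laps.items()
    }

# Invented sample data: one reading per line crossing
readings = [("kart7", 31.2), ("kart7", 29.8), ("kart12", 33.1), ("kart7", 30.1)]
print(lap_stats(readings)["kart7"])
```

One timestamped tag crossing per lap, and the rest is just bookkeeping on the server side; that is the whole trick of a lot of IoT applications.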
The area of web services is an interesting one to look at. Back in 1997, whilst working on a very early website for a car manufacturer, we had a process to get to some data about skiing conditions. It required a regular CRON job to be scheduled to perform a secure FTP to grab the current text file containing all the ski resorts’ snowfall, so that we could parse it and push it into a form that could be viewed on the Web, i.e. with a nice set of graphics around it. That is nearly 20 years ago, and it was a pioneering application. It was not really a service or an API to talk to. It used the available automation we had, but it started as a manual process: pulling the file and running a few scripts to try and parse the comma-delimited data. The data, of course, came from both human observation and sensors. It was collated into one place for us to use. It was a real-world set of measurements, pulled together and then adjusted and presented in a different form over the Internet via the Web. I think we can legitimately call that an Internet of Things (IoT) application?
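The shape of that job is easy to sketch. Roughly this, in Python; the field layout is my guess at the sort of thing the ski feed contained, not the original scripts:

```python
import csv
from io import StringIO

# Hypothetical feed layout: resort,snow_depth_cm,fresh_snow_cm
FEED = """Verbier,120,15
Chamonix,95,8"""

def parse_feed(text):
    """Parse the comma-delimited snowfall file into records."""
    return [
        {"resort": row[0], "depth_cm": int(row[1]), "fresh_cm": int(row[2])}
        for row in csv.reader(StringIO(text))
    ]

def to_html(records):
    """Push the parsed data into a form that can be viewed on the Web."""
    items = "".join(
        f"<li>{r['resort']}: {r['depth_cm']}cm base, {r['fresh_cm']}cm fresh</li>"
        for r in records
    )
    return f"<ul>{items}</ul>"

print(to_html(parse_feed(FEED)))
```

The CRON scheduling and FTP fetch sat around that core; the fetch, parse and present pattern is the same one any modern API consumer follows.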
We had a lot of fancy and interesting projects then, well before their time, that are templates for what we do today. Hence I am heavily influenced by those and, having absorbed a few years ago what may seem new today, I like to look to the next steps.
Increasingly I have had to get the Unity applications to talk to the rest of the Web. They need to integrate with existing services, or with databases and APIs that I create: user logons, question data sets, training logs etc. In many ways it is the same as back in 1997. The pattern is the same, yet we have a lot more technology to help us as programmers. We have self-describing data sets now. XML used to be the one everyone raved about: basically web-like tags around data to start and stop a particular data element. It was always a little too heavy on payload though. When I interacted with the XML data for the tennis ball locations at Wimbledon, the XML was too big for Second Life to cope with at the time. The data had to be mashed down a little, removing the long descriptions of each field. Now we have JSON, a much tighter description of data. It is all pretty much the same of course. An implied comma-delimited file, such as the ski resort weather, worked really well, if the export didn’t corrupt it. An XML version would be able to be tightly controlled and parsed in a more formal-language style; JSON is between the two. In JSON the data is just name: value pairs, as opposed to XML’s <name>value</name>. It is the sort of data format that I used to end up creating anyway, before we had the formality of this as a standard.
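The payload difference is easy to demonstrate. Here is one hypothetical ball-position record in each of the three formats (a toy illustration, not the actual Wimbledon feed):

```python
import json

# The same record three ways: implied CSV, self-describing JSON, tagged XML
csv_row = "12,4.31,0.92,1.05"
json_row = json.dumps({"id": 12, "x": 4.31, "y": 0.92, "z": 1.05})
xml_row = ("<ball><id>12</id><x>4.31</x>"
           "<y>0.92</y><z>1.05</z></ball>")

# CSV names nothing, JSON names each field once, XML names each field twice
for name, payload in [("csv", csv_row), ("json", json_row), ("xml", xml_row)]:
    print(f"{name}: {len(payload)} bytes")
```

CSV is the tightest but carries no field names at all; XML is the heaviest because every field is wrapped in an opening and closing tag; JSON sits between the two, which is exactly why it took over for this kind of work.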
Unity3d copes well with JSON natively now. It used to need a few extra bits of code, but as I found out recently it is very easy to parse a web-based API using code, extract those pieces of information, and adjust the contents of the 3D environment accordingly. By easy, I mean easy if you are a techie. I am sure I could help most people get to the point of understanding how to do this. I appreciate too that, having done this sort of thing for years, I have a different definition of easy.
It is this grounding in pulling real-world data and manipulating it, from the Internet, and serving it to the Web that seems to be a constant pattern. It is the pattern of IoT and of Big Data.
As part of the ongoing promotion of the science fiction books I have written, I created a version of the view Roisin has of the world in the first novel, Reconfigure. In it she discovers an API that can transcribe and describe the world around her.
This video shows a simulation of the FMM v1.0 (Roisin’s application) working as it would for her. A live WebGL version that just lets you move the camera around to get a feel for it is here.
WebGL is a new target that Unity3d can publish to. Unity used to be really good because it had a web plugin that let us deploy rich 3D applications to any web browser, not just build for PC, Mac and tablets. Every application I have done over the past 7 years has generally had the web plugin version at its core, to make life easier for the users. Plugins are dying and no longer supported on many browsers. Instead, the browser has functions to draw things, move things about on screen etc. So Unity3d now generates the same thing as the plugin, which was common code, but creates a mini version for each application that is published. It is still very early days for WebGL, but it is interesting to be using it for this purpose as a test, and for some other API interactions with sensors across the Web.
In the story, Roisin’s interaction starts as a basic command line (but over Twitter DM), almost like the skiing FTP of 1997. She interrogates the API and figures out the syntax, which she then builds a user interface for. Using Unity3d, of course. The API only provides names and positions of objects, hence the cube view of the world. Roisin is able to move virtual objects, and the API then, using some quantum theory, is able to move the real-world objects. In the follow-up, this basic interface gets massively enhanced, with more IoT-style ways of interacting with the data, such as MQTT for messaging instead of the Twitter DMs of the first book. All real-world stuff, except the moving things around. All evolved through long experience in the industry, to explain it in relatively simple terms and then let the adventure fly.
I hope you can see the lineage of the technology in the books. I think the story and the twists and turns are the key though. The base tech makes it real enough to start to accept the storyline on top. When I wrote the tech parts and built the storyboard, they were the easy bits. How to weave in some intrigue, danger and peril was something else. From what I have been told, and what I feel, this has worked. I would love to know what more people think about it though. It may work as a teaching aid for how the internet works, what IoT is etc., for any age group, from schools to the boardroom? The history, and the feelings of awe and anger at the technology, are something we all feel at some point with some element of our lives too.
While I am on the real world, though: one of the biggest constants in Roisin's life is the love-it-or-hate-it taste of Marmite. It has become, through the course of the stories, almost a muse-like character. When writing you have to be careful with real-life brands. I believe I have used the ones I have in these books as proper grounding in the real world. I try to be positive about everyone else's products, brands and efforts.
In Cont3xt I also added in some martial arts, from my own personal experience again, but adjusted a little here and there. The initial use of it in Cont3xt is not what you might think when you hear martial arts. I am a practitioner of Choi Kwang Do, though I do not specifically call any of the arts used in the book by that name, as there are times it is used aggressively, not purely for defence. The element of self-improvement is in there, but with a twist.
Without the background in technology over the years, and seeing it evolve, and without my own personal gradual journey in Choi Kwang Do, I would not have had the base material to draw upon and evolve the story on top of.
I hope you get a chance to read them, it’s just a quick download. Please let me know what you think, if you have not already. Thank you 🙂
Yes, OK, my latest obsession of writing #Reconfigure might make a great TV series or movie, but that's not what this is about. Instead it is about another set of skills I seem to be putting into practice.
Before that I spent a day editing the school end-of-year video. I didn't shoot it, but I did use all the various messing around and comedy moments from the teachers and staff to put a montage together. It had a story, some pace, a step change or two. It was great fun. I have now been asked to edit a few more things together. I like editing moving images, it is fun. Though I am not charging for this; it is just to help people out.
At the same time I am in the middle of a little project to try and jazz up a regular presentation in the corporate space. I demoed a little virtual fly-through of a building, past a couple of characters and on to an image, using Unity3d. It turned into a slightly more complex epic, but nonetheless I am using lots of varied skills to make it work. The pitch will be pre-canned, but I have built the Unity environment so that it runs on rails, yet live. I have C# code in there to help do certain things like play video textures, plus animation controllers and even NavMesh agents to allow a character to walk to a particular room, and lots of flying cameras. I had used the first few of those a lot; they are stock Unity. However, I had not really had a chance to use Unity animations. All my animations had been either premade FBX files or ones that I created in Cheetah3D. Once you imported an animation like that it was sort of locked in place.
However, Unity3d has its own animation dope sheet and curve editor, and it's really handy.
The Animation tab brings up a different timeline. It is separate from the Animator Controller screen that lets you string animations together and create transitions, e.g. into a jump or run from a walk.
The Animation tab lets you select an object in the scene and creates a .anim clip for you. By default it then adds that clip to an animator controller and attaches that to the object.
Record mode then allows you to expand any or all of the properties of the object, usually transform position and rotation, but it can be colour, visibility, basically anything with a parameter. If you move the timeline and then move the object, a camera for instance, it creates the keyframes for the changed parameters. It removes all the awkward importing of paths and animation ideas from another tool. It is not great for figures and people; they are still best imported as skinned meshes with skeletons, letting the importer deal with all the complexity so you can add stock animations to them. However, for a cut scene, or a zoom past to see a screen on a wall, it works really well.
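Under the hood, a recorded clip is just keyframes on properties that get interpolated at playback time. A rough, standalone Python sketch of the idea, using simple linear interpolation rather than Unity's actual tangent-based curves, and with made-up camera keyframes:

```python
def sample(keyframes, t):
    """Interpolate a property value at time t from (time, value) keyframes.

    Unity's curves use tangents (Hermite interpolation); plain linear
    interpolation is the simplest illustration of how a handful of
    recorded keyframes becomes smooth motion."""
    keyframes = sorted(keyframes)
    if t <= keyframes[0][0]:
        return keyframes[0][1]          # clamp before the first key
    if t >= keyframes[-1][0]:
        return keyframes[-1][1]         # clamp after the last key
    for (t0, v0), (t1, v1) in zip(keyframes, keyframes[1:]):
        if t0 <= t <= t1:
            f = (t - t0) / (t1 - t0)    # fraction of the way between keys
            return v0 + f * (v1 - v0)

# A camera's x position, recorded at 0s, 2s and 5s.
cam_x = [(0.0, 0.0), (2.0, 4.0), (5.0, 10.0)]
print(sample(cam_x, 1.0))   # 2.0, halfway between the first two keys
```

This is also why a path needs breaking down into enough keys: with only a start and end key, the interpolation is a straight line between them, regardless of what walls are in the way.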
I have written a lot of complicated things in Unity3d, but never bothered using this part. It is great to find it working, and doing exactly what I needed it to do for my cinematic cut scenes.
You can have things zooming and spinning around a known path in no time. You have to break the path down a little, otherwise the interpolation resolves to the easiest route, straight through a wall or something, but it seems very robust to edit afterwards.
The only pity is that it's not possible to lock a camera preview window open in stock Unity3d when selecting another object. Having got a camera pan just right, it's tricky if you want to move an object in the camera's view. With the camera selected you see its preview; with the object selected you see the object. I have not got around that yet, and just end up with trial and error.
No matter, it works, and it's awesomely useful.
Back in 2006, when we all got into the previous wave of Virtual Reality (note the use of previous, not 1st), we tended to call them virtual worlds or the metaverse, and we were restricted to online presence via a small rectangular screen attached to a keyboard. Places like Second Life became the worlds that attracted the pioneers, explorers and the curious.
Several times recently I have been having conversations, comments, tweets etc. about the rise of VR headsets such as the Oculus Rift, AR phone holders like Google Cardboard and MergeVR, and then the mixed reality future of Microsoft HoloLens and Magic Leap.
In all the Second Life interactions and subsequent environments it has always really been about people. The tech industry has a habit of focusing on the version number or a piece of hardware, though. The headsets are an example of that. There may be inherent user experiences to consider, but they are more insular in their nature. A VR headset forces you to look at a screen. It provides extra inputs to save you using mouse-look etc. Essentially, though, it is an insular hardware idea that attempts to place you into a virtual world.
All our previous virtual world exploration has been powered by social interaction with others. Social may mean educational, work meetings, entertainment or just chatting. However, it was social.
Those of us who took to this felt, as we looked at the screen, that we were engaged and part of the world. Those who did not get what we were doing only saw funny graphics on the screen. They did not make that small jump in understanding. There is nothing wrong with that, but that is essentially what we were all trying to get people to understand. A VR headset provides a direct feed to the eyes. It forces that leap into the virtual, or at least brings it a little closer, ready then to engage with people. Though at the moment most VR is engaging with content. It is still in showing-off mode.
We would sit around campfires in Second Life. In particular Timeless Prototype's very clever-looking one that adds seats/logs per person/avatar arriving. Shared focal points of conversation. People gathered looking at a screen in front of them, but being there mentally. The screen provided all the cues and hints to enhance that feeling. A very real human connection. Richer than a telephone call, less scary than a video conference, intimate yet standoffish enough to meet most people's requirements if they gave it a go.
It is not such a weird concept, as many people get lost in books, films and TV shows. They may be immersed as a viewer, not a participant, or they may be empathising with characters. They are still looking at an oblong screen or piece of paper, and engaged.
With a VR headset, as I just mentioned, that is more forced. They will understand the campfire that much quicker and see the people and the interaction. There is less abstraction to deal with.
The rise of the blended and mixed reality headsets, though, changes that dynamic completely. Instead they use the physical world, one that people already understand and are already in. The campfire comes to you, wherever you are.
This leads to a very different mental leap. You can have three people sat around the virtual campfire in their actual chairs in an actual room. The campfire is of course replaceable with any concept, piece of information or simulated training experience. Those people each have their own view of the world and of one another, but it is added to with a new perspective: they see the side of the campfire they would expect.
It goes further though; there is no reason for the campfire not to coexist in several physical spaces. I have my view of the campfire from my office, you from yours. We can even have a view of one another as avatars placed into our physical worlds, or not. It's optional.
Just keeping with that physically blended model, it is very simple for anyone to understand and start to believe. Sitting in a space with one another sharing an experience, like a campfire, is a very human, very obvious thing. For many, that is where this will sort of end: just run-of-the-mill information, enhanced views, cut-away engineering diagrams, point-of-view understanding etc.
The thing to consider, for those who already grok the virtual world concept, is that just as in Second Life you can move your camera around and see things from any point of view, from the other person's point of view, from a bird's eye view, from close in or far away, so the mixed reality headsets will still allow us to do that. I could alter my view to see what you are seeing on the other side of the campfire. That is the start of something much more revolutionary to consider, whilst everyone catches up with the fact that this is all about people, not just insular experiences.
This is the real metaverse, a blend of everything, anywhere and everywhere, connecting us as people, our thoughts, dreams and ideas. There are huge possibilities to explore, and more people will explore them once that little hurdle is jumped: it's people, it's always been about people.