The recording of my BCS Animation and Games pitch is now live on the site and on YouTube. Unfortunately only the live audience got to see the larger avatar I was using to deliver this pitch, as the Zoom presentation feed is only of the shared screen with a tiny top-left insert of the avatar and my dulcet and somewhat stuttering tones. It took a few minutes to get revved up, and to many I might be stating the obvious, but it is surprising how many people don't get the importance of user-generated content and/or expression in virtual worlds and games.
The official site is here, but I have added the YouTube video below to save a click or two.
Way back in the dim and distant era of 2009 I was exploring a lot of tools to help me build virtual environments with avatars and characters that could be animated, typically in Unity. 3D modelling is an art in its own right, and then rigging those models for animation and applying animations to them is another set of skills too. At the time I was exploring Evolver, which in the end was bought by Autodesk in 2011. A decade on and there is a new kid on the block from Epic/Unreal called MetaHuman. This cloud-based application (where the user interface is also cloud streamed) runs in a browser and produces incredibly detailed digital humans. The animation rigging of the body is joined by very detailed facial feature rigging, allowing these to be controlled with full motion capture live in the Unreal development environment. Having mainly used Unity, I found a lot of similarity in the high-level workflow experience of the two, as they are the leading approaches to assembling all sorts of game, film and XR content. However there was a bit of a learning curve.
I decided to generate a couple of characters and ended up making what to me feels like Roisin and the Commander from my Reconfigure novels. Creating them in the tool plays instant animations and expressions on the faces, which is a really quite surreal experience. I installed Unreal on my gaming rig with its RTX graphics card and all the trimmings and set about seeing what I needed to do to get my characters working.
First there is an essential application called Quixel Bridge that would have been really handy a decade ago, as it brokers the transfer of file formats between systems; despite standards being in place there are some quirks when you move complex 3D rigged assets around. Quixel Bridge can log directly into the MetaHuman repository, and there is a specific plugin for the editor to import the assets into Unreal. Things in Unreal have a data and control structure called a Blueprint, a kind of configuration and flow model that can be used in a no-code (but still complex) way to get things working. You can still write C++ if needed, of course.
My first few attempts to get Roisin to download failed as the beta was clearly very popular. I only took a photo of the screen rather than a screen capture, so it is a bit low quality, but there is more to come.
However, eventually I got a download and the asset was there and ready to go. Unreal has a demo application with two MetaHumans in it showing animation and lip-syncing and some nice camera work. Running this on my machine was a sudden rush to the future from my previous efforts with decade-old tech for things such as my virtual hospitals and the stuff on the TV slot I had back then.
The original demo video went like this:
Dropping into the editor, and after a lot of shader compilation, I swapped Roisin with the first MetaHuman by matching the location coordinates and moving the original away. Then in the Sequencer, the film events list, I swapped the target actor from the original to mine, and away we go.
This was recorded without the sound, as I was just getting to grips with how to render a movie rather than playing or compiling an application and then screen-capturing it instead. Short and sweet, but it proves it works. A very cool bit of tech.
I also ran the still image through the amusing AI face animation app Wombo.AI, which animates stills rather than animating the actual 3D model as above. I am not posting that as they are short audio clips of songs and the old DMCA takedown bots tend to get annoyed at such things.
Now I have a plan/pipe dream to see if I can animate some scenes from the books, if not make the whole damn movie/TV series 🙂 There are lots of assets to try and combine in this generation of power tooling. I also had a go at this tutorial, one of many, that shows the use of live facial animation capture via an iPhone streamed to the MetaHuman model. I will spare Roisin the public humiliation of sounding like me and instead leave it to the tutorial video for anyone wishing to try such a thing.
Last weekend, Friday 3rd July and Saturday 4th July 2020, the virtual world Sansar hosted the Glastonbury Shangri-La set of gigs called Lost Horizon. I decided I really should attend, as it looked such an epic line-up: an attempt to put on constant music from 3pm to 3am over several stages for two days, in VR.
Sansar was originally a Linden Lab (of Second Life fame) spin-off, focussed on using newer tech to get people into the environment. I was in there back in 2017 importing a Tilt Brush sculpture into my own space. I had been in the early beta, but it was all under NDA so I didn't post photos until it went live.
Sansar was recently sold off to Wookie Ventures, but I was pleased to see that my Sansar account from back then still worked, and I had the "look, I have been here since year dot" style level on my profile.
I had a quick look around the new Sansar in the days before the event, you know, to get the avatar in good shape and not look a total n00b moving around etc. It had changed a lot, very much in keeping with the style of VR apps, with pop-up menus you can place in space or float from your wrist. It also looked and sounded great.
Incidentally, for some reason all the photos I took from within the Oculus wrist band menu came out square; it all looked much better in VR and in motion.
I visited the popular 2077 cyberpunk world too, again all as prep. All very much in keeping with the whole scene, and great to see Max Headroom (look him up kids) on a screen for my generation of geeks.
The real star though was the set of gigs: dance music, some urban house or something I don't even know, and some live bands performing all manner of wonderful tunes.
Now this is where my experience may have got a little more freaky. I had my Oculus Rift S ready to go for the visuals, but I also wore my Woojer haptic vest, which hits your body with bass lines and clever sensory perception through sound, and that fed into my Nuraphone in-ear and over-ear headphones, which are tuned to my individual hearing pattern. It's a lot of clobber to be wearing, but it had to be tried. (Note this is all bought and paid for, no product placement here!)
As I logged into Sansar and arrived at the Nexus I immediately got a blast through all the kit. When I teleported to the first stage I arrived at the landing zone, a little walk from the stage, but already the dance music bass line was hitting me physically as well as sounding great. I had to turn the haptics down a little to start off with, to get used to it as I approached the arena.
Once in, and this applied to all the shows I went to, the DJ was green-screened into the booth, so whilst we were all avatars, the DJ was themselves doing their thing.
Everyone else was an avatar of many different forms, from the vanilla basic first avatar to some incredible creations. I couldn't find my normal predator kit so I went with my second option of green hair and some cyberpunk kit. I also picked up a set of animations and dance moves free from the shop, but went on to buy a whole lot more to add to the fun.
The music and the feeling were incredibly intense, and I tried all the stages in my first three hours at the event. I liked the DJ stuff, but the live bands were even better.
I dropped out after three hours just to make sure I wasn't shaking myself to pieces, and to grab some food. I also mentioned it to Predlet 2.0; he installed Sansar and got set up. He didn't set his VR up, but Sansar supports desktop too.
He was having a blast with silly avatars too. We were at the Fatboy Slim gig, as you can see below. Fatboy Slim is waving in the background. I am rocking a guitar that I went and bought, as I was enjoying the actual bands, and Predlet 2.0 was a dancing shark for this one.
In Sansar you have a choice about how much you want to interact. I love having lots of dance moves and gestures and stringing them together in time with the music; it was clear others were just listening, leaving the avatar to dance, but it's all good. Predlet 2.0's shark liked doing head spins.
You can mix between the first-person view or, as in these photos, orbiting the camera around. If you are dancing it's best to see what you are doing with the avatar from the outside.
Occasional crashes or re-sharding sent me back to my own little room, but I stayed for another three or four hours on the first day. That's a lot of VR and haptic feedback and banging tunes in your head, I have to say. I was worn out 🙂
Some of the venues got us all flying around with low gravity or bouncing pads, so I got to be all rock and roll on a rooftop.
I got an awful lot of moves and animations, many from names I know well from Second Life. These all strung together in a wonderfully esoteric way and the puppetry of dancing got more and more fun.
It was also clear that there were an awful lot of people with years in Sansar: intricate avatars, lots of YouTubers streaming too. Equally there were some people who apparently had never been in any virtual world or game before. These people were sort of amusing in that they would arrive and shout to their friends, "Can you hear me?", "Is that you?", "Look what I can do!", which was moderately funny, but then they would start pointing out weirdness in avatars thinking no one could hear them. I had left my channel open, so I could. There were also some people, as in any crowd, looking for a fight, swearing or being abusive; they were politely reminded by Sansar helpers/bouncers to chill out and that it was in fact PG-13. I saw a few of those less capable of enjoying a space with others leave, or get booted for being griefers. It's 2006 all over again; we used to have to do the same at the IBM Wimbledon Second Life event.
I saw a lot of bands, so many I can't name them all, but on day 2, probably about five hours in, I caught UK metal/punk/street band PENGSHUi and they absolutely rocked. If nothing else it was the mix of styles meshing and thrashing my haptic Woojer. It was certainly time to break out the guitar. Yes, I even bought the album afterwards, the first music I have bought for years, as a memory of the event.
I spent about 10-12 hours in total out of the 24 available. The Woojer and Nuraphones added a great deal to my personal experience. I know a lot of other people were enjoying it, and the production was fantastic. It is the best virtual event I have been to; not the longest, as two weeks running Wimbledon in SL would probably take that record, and that was different. This was pure escapism: rocking music and a bunch of fellow mad virtual world enthusiasts and n00bs alike coming together in this weird locked-down pandemic.
If I have not already posted all the photos I managed to take above, they are in this album, but go check out the videos and other official content, not least PENGSHUi (yay, I see my avatar!)
Or one of the DJs, like Fatboy Slim (there were multiple shared rooms; looks like I wasn't in the filmed room this time 🙂 )
Usually by this time of year I would have done several presentations on stage, or a webinar or some such thing. This year, of course, we have all been locked down. We got to 75 days on our chalk board before one of the predlets (following social distancing) walked into the park to meet a friend. That is a long time. We are lucky to have space in the house and garden and lots of things to do, but it clearly has a mental impact.
I have spent hours and hours walking around the garden, listening initially to music, then getting really into audiobooks, but also properly regularly listening to the Games At Work.Biz podcast on a Monday. The dulcet tones of my friends Michael Rowe, Michael Martine and, more recently, Andy Piper talking about all the things in tech that I love was like hanging out and shooting the breeze. Instead of talking I was walking, in order to maintain some level of fitness and get some vitamin D. I had been tweeting about the show and engaging in some extra conversation as thoughts arose, so it was funny over the last few episodes to get a name check.
I got asked to guest on the show last Friday, which was utterly fantastic. However, I realised that I had not really been doing much public speaking, so I started to doubt I had the words or thoughts worthy of the show. Equally, it's about stuff I care about, so there was no way I was going to say no. Instead I bought a new cardioid mic with a spring scissor boom and a pop filter. Yay for the tech. I have been on the show before, after I published the novels, but that was November 2015! Episode 126. It was part of a wide set of publication promotions I tried.
I signed into Skype with my Audio Hijack app configured to send my audio channel to the communal Dropbox. We had a Slack full of suggested news items (including the ones I added). Then we piled into recording, having had a bit of a thread and a final item in place. Four of us on a podcast might get tricky, but we all got to say what we needed to say. It was really good fun. It was great timing too, as we all went through an amazing time with virtual worlds in 2006-2009 and I was attending Augmented World Expo and some of its events that week. So very fresh in the short-term memory combined with long-term memory 🙂
The time flew by; it was like the pub getting time called, but for the time I was on it was mentally refreshing. Of course then I was nervous about listening back to it: would it still give me the great buzz that got me into the week? I am pleased to say it did. I sort of filtered out what I said and took even more notice of what the guys said. It is very different to be in listener mode as opposed to broadcast mode. That is part of the reason Zoom calls are so tricky: most of the time we are receiving info, but we have open cameras so are in broadcast mode. That is tiring.
My participation, professional and social, in virtual worlds over many years mixes my tech geek inquisitive builder side with the deep emotional impact of how interactions feel. I still clearly remember events from 2006 that occurred in what we still call the virtual. I also still feel the waves of joy and excitement as a fledgling industry grew, and the pain of the inevitable bubble bursting. I remember the people who I have grown up with in virtual worlds, friendships that transcend time and space. I also deeply feel the fear and jealousy that was directed at many of us despite attempting to be inclusive. It was shrouded in the apparent lack of seriousness that comes from games technology, but was really a fear of missing out in human-to-human communication and of power structures altering. I do think about and mourn where we could have been with the virtual, right now, when we need it the most, but equally we now have social media deeply adopted. The world is moving, and now everyone has to find ways to communicate that work for them.
I was honoured to be invited to gather around a firepit and chat in Second Life last weekend. It was at 2am my time on Sunday morning. If you were part of that great time around 2006 you will recognise these wonderful people from Billions of Us. I have remained an SL resident and island owner all these years, so it was not a resurrection event to sign on and pop along. However, I felt… nervous… Why? I was just going to have a private chat with people who I share social media space with all the time. The reason was we were going to talk about the old days. I have all those emotions tied up in that period of time, now also compounded with being locked down at home for the past 57 days.
In order to help deal with some of the demons I felt I needed to upgrade my avatar. My original predator avatar, with striped leather jacket and then a subsequent Feeding Edge t-shirt, is from a very different time. SL has changed to be able to cope with 3D mesh models now; I was a collection of graphic primitives cunningly sculpted and scripted by Sythia Veil. However, I was a processor hog. I had visited a sim and a script had told me I was using up 4% of the processing power on the server, on my own! The client tells you your avatar weight (as in processing required), and I was bloated, old and fat at over 50k. I went and found a new and brilliantly built predator avatar on the store, a detailed mesh with a weight of just 5k. I adjusted it to have a Feeding Edge logo attached at low cost, and also placed clickable versions of both Reconfigure and Cont3xt on my hips, like six-shooters. Changing a long-standing avatar is not without another range of emotions, but it helped me think about the future not the past, and damn… it looks good, with and without the mask. Well done to undercover for the build!
I had tweeted about this and was thrilled that Games at Work dot biz talked about it on the podcast. A link very much to the original eightbar crew in 2006 with Andy Piper, Michael Rowe and Michael Martine. It was also great to hear that Andy has headed into an avatar rebuild too. You see, it is 14 years ago and still very relevant, very personal, and we are all very connected. It is also a wonderful podcast, and the Michaels are now at episode 272 of the weekly event (I will let that number sink in!)
Anyway, Sunday 2am rolled into play; I teleported to the fire pit and was there with everyone. It turned out I had not tested the avatar in public, so I was over 8 feet tall, but I sat down rather than editing it there.
We talked for an hour solid about the good old days and the future, the relevance of virtual worlds to our reality now. I shared how it all got started for me and the chain of serendipitous events and wonderful people that gathered around IBM eightbar. We then took a quick tour of some art installations on the fledgling Billions of Us islands.
Then we said our goodbyes and I returned home. Both to my own island in SL and then to my office.
That "returning home" is way more emotionally loaded than I would normally expect it to be. It was late, I was tired, but… unlike all this torrent of video calling (and audio), which brings people to my screen in my office, I felt that for the first time in 57 days I had been somewhere, with people, friends old and new. When I woke up in the morning I still felt I had been out. It is this feeling that I have had a lot with virtual world and game experiences, but the fact it happened in these crazy pandemic times and was so liberating and so deeply felt means I pity anyone still messing around trying to come to terms with those funny people with their funny avatars and crazy ideas.
Virtual worlds have not gone away. There are many more. Please consider that this might be a great mental getaway for many of you. I don't care which one, and I know community building takes time, but get into one, take a look, and really think about how it feels.
It was my tribe, it was on a subject we all knew, but this is very, very real. That one hour made me feel better, really very much better than I have for a long while. If it can do that for me, maybe it can do it for you. One hour of respite, just chatting or wandering around a virtual world. Let me know how you get on.
Lots more people are playing with/using video now with the lockdown. Every conference is a video conference. We are communicating with friends, family, customers and colleagues in a variety of ways and a stack of different contexts.
I sparked up FaceApp on my phone, something I had tinkered with a while back, and found that it now not only does mad crazy things with great accuracy to photos but can also do video for some of those effects. With still images, like this before and after example, or are they the other way around? 🙂 You decide.
The video though…. mind blowing stuff.
There is also lots of fun to be had with the Snapchat virtual camera on PC and Mac, able to feed into any of the video conference tools you use by diverting your main camera to it first and then selecting the Snap Camera as your webcam.
Lots can be done in foreground and background replacement. I mean, who would turn up at a meeting in a predator costume? (Like it's 2006.)
Like my Second Life avatar (below), I can't claim to have created this one. But it was great to see it on the list of Snapchat lenses. My SL avatar has a custom jacket (my original RL one) and a Feeding Edge t-shirt, so there are always tweaks to be made. Yes, SL is still there; yes, I still have two islands; yes, you should sign up/re-login.
Back in the UGC world of SL in 2006 we had lots of interesting uses for logos and brands, like the Wimbledon Tennis flying towel and Wimbledon contact lenses (officially sanctioned, I should add). I dived into the Snapchat Lens Studio to see what I could quickly create, knowing full well there is a very rich 3D animation and code/no-code environment in there that could be a career for some talented artists. I used one of the base models, tweaked it a bit, and gave myself some shades with a subtle mention of 451 Research (now part of S&P Global Market Intelligence). Obviously that would all not fit on the glasses, but you get the idea.
I have always talked about putting logos and our own designs into game environments, like my Reconfigure car to advertise my novels, and before that, back in 2006, putting eightbar into games too (in the link above). Now I have a Feeding Edge logo in Animal Crossing: New Horizons. I am planning on doing a Reconfigure one too. Only a few pixels to work with, but it's fun to try. Our entire family is obsessed with this game at the moment; on top of all the other various games we play, it's a unifying experience as we tend to our islands, or live on one another's. (Hmmm, another SL-like reference.)
Of course this lockdown is driving us all crazy, but having a bit of fun is essential. Back to Snapchat: it was good to see the Trolls movie had some official Snapchat lenses. When I tweeted this I was told I looked like Eddie Izzard 🙂
The family joined in too, but not posting them without permission 🙂
Having said that, all this tech is great, but I also felt the need to try a real-life filter and broke out this costume left over from The Cool Stuff Collective TV show (I didn't wear it on the show, but at the wrap party). It was certainly a shock for my colleagues on video. The neighbours probably thought I had lost it too, as I ran around the garden in it, doing a real-life Animal Crossing manoeuvre.
Stay safe everyone, have some fun with the tech and lark around a bit!
The recent pandemic situation is causing many offices to suddenly switch from co-location in a building to everyone working from home. Eleven years ago, when I left IBM and started Feeding Edge, I was immediately a home worker, but in the preceding years, despite having an office and a base, most work was effectively remote. Being a metaverse evangelist in 2006 was all about people using virtual world and game technology to be able to communicate and understand one another at a distance.
The projects in 1997, in the early days of the web, were built as a team in an office but were clearly very much about interacting with the world at large. Those of us in these industries, building and shaping the use of the internet, did not face a big bang switch-over from office to home. We tended to grab and adapt, or write, whatever tools were available to work across physical and digital divides. That of course is a typical pioneering spirit, and having been an evangelist for change I know not everyone is comfortable with that, nor should they be.
A sudden switch from one thing to another is bad enough; I recently had to switch from Mac to Windows for work, and it's annoying and irritating at best and very stressful at worst. Today we have the added external stress and personal worries about family and friends, as well as the future of businesses, layered on top.
The reality is, we do have the technology to communicate and share, we might not all have the right processes or social norms in place but the more we try, the better.
For many people teleconferences are a norm anyway; just usually they commute to a desk or office in order to have those. However, many others will not have that routine. It is here we all need to be cognizant of how much the technology blocks and filters who we are. A voice-only teleconference is great for a presenter, or for the alphas who thrive on talking, but many people will not feel comfortable interjecting and cannot use body language to find a gap in a conversation. Video links are not really any better; for some, they strip away their normal behaviour as they attempt to stare at the camera or try to get a level of eye contact akin to the physical world. Some people engage in text chat alongside audio and video, though often those who are better at talking do not pay attention to the text, and sometimes the text just becomes chatter in the background. There is an art and style to all this, and people will find what suits them, just as in a physical situation we adapt to one another's signals.
It is these sorts of difficulties that led to exploring virtual worlds like Second Life. Some of that is shown in the album below of 600+ images from 2006-2009. This was a long and varied journey, and one I have talked, written and presented on many, many times. I am happy to share the tales: just ping me @epredator, browse the 11 years' worth of posts here, many related to metaverse concepts, or even read my Reconfigure series and get the gist of some of it in a modern sci-fi context.
It acted (and still does) as a teleconference in having shared sound in a space, but it has the presence of avatars to represent people, who get moved around, even doing simple things like sitting or standing or hovering next to people you know in a meeting. Instant visual feedback: who is there, where are they, what are they representing with the avatar? We used text chat, group and 1:1, whilst also having voice. We were able to bring artefacts into the environment dynamically, whiteboarding in a 3D space. All adding to the depth of communication. There are also, for those that can, ways to code and make virtual objects do something too. This is still entirely possible in Second Life and many other virtual environments. Now more than ever is the time for people to experiment.
This is also without even bringing in the tech and immersion of Virtual Reality, or the potential for Augmented Reality to allow us to blend our physical and digital presences in physical space. That of course requires people to don headsets and have the kit, but virtual world interaction does not need that. A laptop/tablet and an internet connection is all that is required. Carry on with the frustration of teleconference and video conference by all means, sometimes they are the ideal approach, but just consider what is causing the frustration of working remotely and investigate what might be a better way.
This weekend I finally was able to purchase a digital copy of the film version of Ready Player One. It is something we saw as a family at the cinema, but I had been wanting to take another look. I don't buy many individual movies any more, as with Sky, Netflix and Amazon things are usually easy to access as needed. It seems the US got the home release first, another weird thing to be doing in the 21st century; slowly rolling out digital/streaming availability is so archaic and only really leads to piracy and lost revenue.
I had to make a choice on how to get to a digital version of the movie. Do I pay Sky for its Buy & Keep, where they also send a Blu-ray to go with the download? Or buy a Blu-ray and attempt to figure out the film industry's approach to digital, which, as with the release schedule, is pretty archaic? Instead I went for Amazon Video. This we have on all our TVs in the house, computers, pads and phones, alongside Netflix. The last movies I bought were also on Amazon, John Wick 2 and before that The Raid 2, so it's ideal to have it next door to those.
I sparked up Amazon on the TV and home cinema and started to enjoy spotting even more of the references from my childhood and growing up through the 80s as a gamer. Plus of course the whole depth of engagement in VR is what flows through my head most of the time as a metaverse evangelist. Really though, it is this targeted set of references and in-jokes for me and my generation of geeks that I love. Wade's DeLorean with what seems to be a Knight Rider front grille, Duke Nukem aiming a launcher during the first battle scene, Hello Kitty and friends waddling along the bridge next to Wade; it goes on.
Being in VR though, with the headsets, suits and running walkways, I thought I should at least try and watch it on my Oculus Go. I paused the TV stream and popped on the headset that is always next to me. I hoped an Amazon app had made it to the store, but no. Instead, using the web browser, I headed to Amazon Video, logged in, found my movie, clicked play… Nothing. It was a little deflating. Then though I noticed the "Use Desktop version" button in the top right, and bingo, it worked. Picking up where we left off at the start of the race, with characters like SF's Ryu wandering off to his car and the Akira motorcycle razzing into view. Selecting full screen gave a nice curved view of the 2D movie, enough that I had to turn my head to look across the picture a little. Proper front-row stuff. Not immersed as much as the main characters are in the OASIS, but a nice blend of traditional film with new view tech, and all a little bit meta 🙂
Another thing I noticed, which I think I had forgotten, was that when Wade first appears he has green hair, which he then dynamically changes to show avatar customisation in the OASIS. For me that's a big "yay", because I spent a long while with green hair in Second Life before finding my predator avatar. My green spiked hair was a thing I identified with, and it even got 3D printed. I also often referenced it in corporate presentations, as the choice to have green hair is a subtle prod at conformity and social norms, just on the border of whether it would be acceptable to many in an office. It is not as full-on as extreme body shapes or cartoon looks. The predator AV took this a stage further in the conversations. I also use green hair in almost every avatar on every platform, as it is a much quicker mental attachment than finding or trying to build a complex Yautja avatar. PS Home famously (from my point of view) did not allow green hair. Very odd!
So, seeing Wade in a movie about VR, the 80s and my generation, my ego naturally assumed that any research the production team did may, just may, have come across my green-haired metaverse evangelizing 🙂
If you like Ready Player One, or don't, you will still like my own VR and AR storytelling, with real-world gaming and tech references, in Reconfigure and Cont3xt. It's got a love of Marmite in it too 🙂
These years just tick away, don't they? Last year finished in a big rush of work with reports to get out before I took the rest of my holiday that was sitting there ready to expire. I was really pleased that I got my AR and VR long-form report out for 451 customers. These are 6k or so words and pictures, but I was not expecting to be in a position to share Augmented and Virtual Reality insights quite as much as I have this past year. It is what I know, and what I feel of course, but there are a lot more things in IoT, especially in industrial, to try and stay on top of. However, AR is the UI for IoT, so there it is. It was over 18 years ago that I was trying to get a shared avatar space of our offices working, with presence of people, at the turn of the century! Then along came Second Life, which made things a lot easier in 2006.
2017 was supposed to be the VR year, and it sort of was, but AR is hot on its tail, and finally Magic Leap, on the 20th December, unveiled their AR goggles. Yes, that was just a few days AFTER I managed to ship the AR/VR report with the words "we shall have to wait and see what they do" in it. Though the shutdown of Tango, which I did get in as an amendment, waved the flag that it would happen.
No prices, no dates other than 2018, but it is a jump in technology with light field (hopefully).
We are also very close to Ready Player One hitting the screen, which will be the main point of reference for escapist VR with Spielberg directing. So it seems all the future thinking stuff we have all been doing is going mainstream to some degree or other.
Apple ARKit and Google ARCore put AR tracking into the hands of everyone with every mainstream device too.
Maybe a few more people will spot my novels and enjoy them with all the AR and VR, IoT and Quantum theory in them 🙂