As many people who pop along here will know, back in 2006 (and even before that) I was already passionate about explaining and sharing what can be done with things like game technology and virtual worlds to help with human communication. That includes entertainment (which is often regarded as not a serious business), office-type business, education, industrial digital twins, the list goes on. Metaverse is a handy word to bundle all this together, and whilst not tightly defined I think many of us know what we mean. This month I wrote a piece for the BCS, which is available to read here, and they have nicely highlighted the phrase “Travelling and actual teleportation aside, do you think that we should be able to find better ways to feel we are with people and to have shared experiences?” Whatever you think the Metaverse is evolving into, just consider whether the way you are doing things now is the best it could possibly be given the tech available. Things evolve and move, not always replacing everything behind them but adding.
Having said that, it is worth reminding everyone that the original work in 2006 in Second Life was mostly about pulling data from other sources into the environment and doing things with it. That eventually led to building Wimbledon's 2006 court, with the ball trajectory from Hawkeye rendered there.
There is an ongoing discussion about interoperability of Metaverse or virtual world platforms. These conversations range from whether it is desirable to have objects from world x appear in world y, and then often hit the barrier that platform x has a completely different rendering engine to platform y. The former is a style and even a business question: why does an environment of any type exist, real, virtual, augmented, projected, whatever? This then leads to the technical wars around content being “in” a development or build environment.

Importing meshes, textures, animation rigs etc. is one way to look at Metaverse evolution, but I think it is also worth considering the potential for things like remote rendering. If the unifying standard for content sharing were actually containers, of potentially any size, shape and level of detail, then anything from environment x could appear in environment y and even interact with it to varying degrees, depending on permissions or needs. My Nvidia RTX ray traced shiny car could appear in the most basic of 3d environments because I am rendering the car at my end, and the virtual world is just processing a container and a stream. This is after all what we do when we video chat: each of our cameras is doing some work and being hosted by a basic integration layer. Yes, I know 3d and collisions and physics etc. all make it more complex, but it is worth considering that other ways of working can exist beyond the build and compile we are used to from games. Also, from what I remember of Snow Crash, avatars are rendered by the person's own device, so those with less computing power have more basic avatars in world. And we already did some of this back in 06/07; it just takes a different way of thinking.
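To make the container idea a little more concrete, here is a minimal, purely illustrative sketch. Every name in it (`RenderContainer`, `HostWorld`, `composite_frame`) is my own assumption for the sake of the example, not any real platform's API: the host world never imports a mesh, it just admits an opaque box with a position, size and permissions, and composites whatever pixel stream the owner's machine sends into that box.

```python
from dataclasses import dataclass, field

# Hypothetical sketch of remote rendering via "containers": the host
# world reserves space and permissions; all heavy rendering stays on
# the remote owner's device, which just streams frames into the box.


@dataclass
class RenderContainer:
    owner: str
    position: tuple          # (x, y, z) in the host world's coordinates
    size: tuple              # bounding box the host reserves for the stream
    can_interact: bool = False  # permission granted by the host world


@dataclass
class HostWorld:
    name: str
    containers: list = field(default_factory=list)

    def admit(self, container: RenderContainer) -> bool:
        # No mesh, texture or rig import at all; just reserve the space.
        self.containers.append(container)
        return True

    def composite_frame(self, container: RenderContainer, frame: bytes) -> str:
        # A real system would blend the remote-rendered pixels (plus depth)
        # into the host's own render; here we only describe what happened.
        return f"{self.name}: drew {len(frame)} bytes at {container.position}"


world = HostWorld("basic-3d-world")
car = RenderContainer(owner="epredator", position=(10, 0, 5), size=(4, 2, 2))
world.admit(car)
print(world.composite_frame(car, b"\x00" * 1024))
# prints: basic-3d-world: drew 1024 bytes at (10, 0, 5)
```

The point of the sketch is that the RTX-rendered car and the most basic world only need to agree on the container contract, not on engines, shaders or asset formats, much like a video call only agrees on a pixel stream.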
Talking of thinking, literally, I was invited to join the Games at Work dot Biz podcast again, and I was more than happy to give a shout out to a book by Prof Richard Bartle on virtual worlds and our god-like powers over them. Check out the podcast, the links are here and on any player you have, and give it a rating to help the guys too please.
Well, what an interesting day yesterday was, as Facebook went all out on being a Metaverse company and renamed itself to Meta. It also marked a point that crosses my straight, unbiased industry analyst day job with my constancy of being a metaverse evangelist, a title I had in 2006 and maintain. These comments are mine alone and not those of the company I work for. I have lots more to share and discuss on all things Metaverse, but that's something that people have to pay for nowadays.
However, I will say that I was really pleased to see a slick presentation on all things Metaverse from a multi-billion dollar company like Facebook. Last year the same happened to some degree with Microsoft and Mesh at Ignite. The conversation has moved, for many people, from “nah, we don't want to interact like that ever” to “no one company should own all this”. The Metaverse the industry and society needs does, as was said in the Meta keynote, need to let people connect with both the physical and multiple virtual realities, for a multitude of reasons and with many available ways to be who we are. I have, for many years, talked about how we have had to adjust to the limitations of technology to allow our communication at distance. We still type on QWERTY keyboards, a layout designed in part to stop physical typewriter hammers jamming by spacing out the commonly used letters in words. Not that we need to get rid of this form of typing, but to hold it up as the only way to interact with technology and data seems a little bit odd, doesn't it?
Back a decade or so ago I created a slide for a presentation on virtual worlds and the metaverse that was aimed at showing how far we might be able to go and why. It was in part a presentation about the fact it is not all avatars and islands. Those work, but there are many more things that come together. There are many forms of input, including neural interfaces (see below at the left), and there are combinations of virtual to virtual, virtual to real (AR) and, as I said at the end, virtual to real to real, which is virtual objects becoming physical again (3d printing etc.). It is about the connecting of people, a wetware grid, but threaded by narrative and context. The Meta presentation often showed these as quick interactions, throwing a basketball, looking at a design etc., but there is of course a longer thread, a digital thread if you like. It's all very fractal in its nature; the patterns repeat at any scale.
One thing to note though is that everything feeds back into the entire loop; the arrows are multi-directional. You might 3d print a “device” or prop you found in a virtual world interaction, but that itself might be an input device for another experience. Imagine if you 3D printed a modelling clay tool you bought in a virtual world, then made a clay sculpture that you took a photo of for Flickr, or even scanned the sculpture in and made it a virtual object as a one-off that you sold on an NFT marketplace, which then saw it used as a prop in a virtual world movie, and so on. I used to talk about printing off a penny whistle and joining in with a virtual band at an Irish pub in Second Life. I hope you get the gist.
One key thing though, and it is slightly annoying that Facebook have nicked the word, but as you see below the entire loop is the meta loop (c 2009).
I know many of us old guard or original gangster, however OG you want to be, might be annoyed at the mega corp going Meta on us and riding high on our hard work persuading the world this is a good idea and the way we are all going. However, it is the way we are going, so we get to say “I told you so”, rather than multi-billionaires. Money clearly isn't everything 🙂 We, and I include all my fellow Eightbar people and surrounding community from back in 2006, were already riding on top of some Double OG work. It's the nature of waves of societal acceptance and tech development. The hope is that businesses and experiences that are born of the Metaverse will thrive and bring entertainment, education, excitement and benefit to the world. By metaverse businesses I mean not the ones building the underlying tech platforms, but the ones thriving on top of them, which then go on to influence the wave after this, just as Facebook and co are today. It's certainly not going away, so my son's birth certificate from 14 years ago saying father's occupation “Metaverse Evangelist” might turn out to be of use to him someday.
I mentioned above that the physical world is an important part of the metaverse, and in the Facebook/Meta pitch there was an instrumented house where the mugs and teapot that were carried in real life updated their position in the digital model. Well, that the other way around is the premise of my Reconfigure and Cont3xt novels. They are very metaverse, IoT, quantum, AR and VR related, but our hero Roisin gets to manipulate the physical world by altering the digital world. It works for an adventure, but it also keeps the “sci-fi set this afternoon” concept alive, and it is not yet all done and dusted 🙂 Go take a look, it's very reasonably priced and might spark a few ideas on the back of the future that Meta are pushing. More here, or see the ads on the right.
The recording of my BCS Animation and Games pitch is now live on the site and on YouTube. Unfortunately only the live audience got to see the larger avatar I was using to deliver this pitch, as the Zoom presentation feed is only of the shared screen, with a tiny top-left insert of the avatar and my dulcet and somewhat stuttering tones. It took a few minutes to get revved up, and to many I might be stating the obvious, but it is surprising how many people don't get the importance of user generated content and/or expression in virtual worlds and games.
The official site is here, but I have added the YouTube video below to save a click or two.
Way back in the dim and distant era of 2009 I was exploring a lot of tools to help me build virtual environments with avatars and characters that could be animated, typically in Unity. 3D modelling is an art in its own right, and then rigging those models for animation and applying animations to them is another set of skills too. At the time I was exploring Evolver, which in the end was bought by Autodesk in 2011. A decade on, and there is a new kid on the block from Epic/Unreal called MetaHuman. This cloud based application (where the user interface is also cloud streamed) runs in a browser and produces incredibly detailed digital humans. The animation rigging of the body is joined by very detailed facial feature rigging, allowing these to be controlled with full motion capture live in the development environment of Unreal. Having mainly used Unity, I found a lot of similarity in the high level workflow experience of the two, as they are the leading approaches to assembling all sorts of game, film and XR content. However, there was a bit of a learning curve.
I decided to generate a couple of characters and ended up making what to me feel like Roisin and the Commander from my Reconfigure novels. Creating them in the tool plays instant animations and expressions on the faces and is quite a surreal experience. I installed Unreal on my gaming rig with its RTX gfx card and all the trimmings and set about seeing what I needed to do to get my characters working.
First there is an essential application called Quixel Bridge, which would have been really handy a decade ago, as it brokers the transfer of file formats between systems; despite standards being in place, there are some quirks when you move complex 3D rigged assets around. Quixel can log directly into the MetaHuman repository, and there is a specific plugin for the editor to import the assets into Unreal. Things in Unreal have a data and control structure called a Blueprint, which is a kind of configuration and flow model that can be used in a no-code (but still complex) way to get things working. You can still write C++ if needed, of course.
My first few attempts to get Roisin to download failed, as the beta was clearly very popular. I only took a photo of the screen, not a screen capture, so it is a bit low quality, but there is more to come.
However, eventually I got a download and the asset was there and ready to go. Unreal has a demo application with two MetaHumans in it, showing animation, lip syncing and some nice camera work. Running this on my machine was a sudden rush to the future from my previous efforts with decade-old tech, for things such as my virtual hospitals and the stuff on the TV slot I had back then.
The original demo video went like this
Dropping into the editor, and after a lot of shader compilation, I swapped Roisin with the first MetaHuman by matching the location coordinates and moving the original away. Then in the Sequence controller, the film events list, I swapped the target actor from the original to mine, and away we go.
This was recorded without the sound, as I was just getting to grips with how to render a movie rather than play or compile an application and then screen capture it. Short and sweet, but it proves it works. A very cool bit of tech.
I also ran the still image through the amusing AI face animation app Wombo.AI; this animates stills, rather than the above, which is animating the actual 3d model. I am not posting that, as they are short audio clips of songs and the old DMCA takedown bots tend to get annoyed at such things.
Now I have a plan/pipe dream to see if I can animate some scenes from the books, if not make the whole damn movie/TV series 🙂 There are lots of assets to try and combine in this generation of power tooling. I also had a go at this tutorial, one of many that show the use of live facial animation capture via an iPhone streamed to the MetaHuman model. I will spare Roisin the public humiliation of sounding like me and instead leave it to the tutorial video for anyone wishing to try such a thing.
Last weekend, Friday 3rd July and Saturday 4th July 2020, the virtual world Sansar hosted the Glastonbury Shangri-La set of gigs called Lost Horizon. I decided I really should attend, as it looked like such an epic lineup and an ambitious attempt to put on 3pm-3am constant music over several stages for 2 days, in VR.
Sansar was originally a Linden Lab (of Second Life fame) spin-off, focussed on using newer tech to get people into the environment. I was in there back in 2017, importing a Tilt Brush sculpture into my own space. I had been in the early beta, but it was all NDA so I didn't post photos until it went live.
Sansar was recently sold off to Wookie Ventures, but I was pleased to see that my Sansar account even from back then still worked, and I had the “look I have been here since year dot” style level on my profile.
I had a quick look around the new Sansar in the days before the event, you know, to get the avatar in good shape and not look a total n00b moving around etc. It had changed a lot, very much in keeping with the style of VR apps, with pop-up menus you can place in space or float from your wrist. It also looked and sounded great.
Incidentally, for some reason all the photos I took from the Oculus wrist band menu came out square; it all looked much better in VR and moving.
I visited the popular 2077 cyberpunk world too, again all as prep. All very much in keeping with the whole scene, and it was great to see Max Headroom (look him up, kids) on a screen for my generation of geeks.
The real star though was the set of gigs: some dance music, some urban house or something, I don't even know, and some live bands performing all manner of wonderful tunes.
Now this is where my experience may have got a little more freaky. I had my Oculus Rift S ready to go for the visuals, but I also wore my Woojer haptic vest, which hits your body with bass lines and clever sensory perception through sound, and that fed into my Nuraphone in-ear and over-ear headphones, which are tuned to my individual hearing pattern. It's a lot of clobber to be wearing, but it had to be tried. (Note: this is all bought and paid for, no product placement here!)
As I logged into Sansar and arrived at the Nexus, I immediately got a blast through all the kit. When I teleported to the first stage I arrived at the landing zone, with the stage a little walk away, but already the dance music bass line was hitting me physically as well as sounding great. I had to turn the haptics down a little to start off with, to get used to it as I approached the arena.
Once in, and this applied to all the shows I went to, the DJ, in this case, was green-screened into the booth, so whilst we were all avatars, the DJ was themselves, doing their thing.
Everyone else was an avatar of many different forms, from the vanilla basic starter avatar to some incredible creations. I couldn't find my normal Predator kit, so I went with my second option of green hair and some cyberpunk kit. I also picked up a set of animations and dance moves free from the shop, but went on to buy a whole lot more to add to the fun.
The music and the feeling was incredibly intense and I tried all the stages in my first 3 hours at the event. I liked the DJ stuff but live bands were even better.
I dropped out after 3 hours, just to make sure I wasn't shaking myself to pieces, and to grab some food. I also mentioned it to Predlet 2.0; he installed Sansar and got set up. He didn't set up his VR, but Sansar supports desktop too.
He was having a blast with silly avatars too. We were at the Fat Boy Slim gig, as you can see below. Fat Boy Slim is waving in the background. I am rocking a guitar that I went and bought, as I was enjoying the actual bands and Predlet 2.0 was a dancing shark for this one.
In Sansar you have a choice about how much you want to interact. I love having lots of dance moves and gestures and stringing them together in time with the music; it was clear others were just listening, leaving the avatar to dance, but it's all good. Predlet 2.0's shark liked doing head spins.
You can mix between the first person view or, as in these photos, orbiting the camera around. If you are dancing it's best to see what you are doing with the avatar from the outside.
Occasional crashes or re-sharding sent me back to my own little room, but I stayed for another 3 or 4 hours on the first day. That's a lot of VR and haptic feedback and banging tunes in your head, I have to say. I was worn out 🙂
Some of the venues got us all flying around with low gravity or bouncing pads, so I got to be all rock and roll on a rooftop.
I got an awful lot of moves and animations, many from names I know well from Second Life. These all strung together in a wonderfully esoteric way and the puppetry of dancing got more and more fun.
It was also clear that there were an awful lot of people with years in Sansar, intricate avatars, and lots of YouTubers streaming too. Equally, there were some people who apparently had never been in any virtual world or game before. These people were sort of amusing in that they would arrive and shout to their friends, “can you hear me?”, “Is that you?”, “Look what I can do!”, which was moderately funny, but then they would start pointing out weirdness in avatars, thinking no one could hear them. I had left my channel open, so I could. There were also some people, as in any crowd, looking for a fight, swearing or being abusive; they were politely reminded by Sansar helpers/bouncers to chill out and that this was in fact PG-13. I saw a few of those less capable of enjoying a space with others leave, or get booted for being griefers. It's 2006 all over again; we used to have to do the same at the IBM Wimbledon Second Life event.
I saw a lot of bands, so many I can't name them all, but on day 2, probably about 5 hours in, I caught UK metal/punk/street band PENGSHUi and they absolutely rocked. If nothing else it was the mix of styles messing with and thrashing my haptic Woojer. It was certainly time to break out the guitar. Yes, I even bought the album afterwards, the first music I have bought for years, as a memory of the event.
I spent about 10-12 hours in total out of the 24 available. The Woojer and Nuraphones added a great deal to my personal experience. I know a lot of other people were enjoying it, and the production was fantastic. It is the best virtual event I have been to, not the longest, as 2 weeks running Wimbledon in SL would probably take that record, and that was different. This was pure escapism: rocking music and a bunch of fellow mad virtual world enthusiasts and n00bs alike coming together in this weird locked-down pandemic.
If I have not already posted above all the photos I managed to take, they are in this album, but go check out the videos and other official content, not least PENGSHUi (Yay, I see my avatar).
Or one of the DJs, like Fatboy Slim (there were multiple shared rooms; it looks like I wasn't in the filmed room this time 🙂 )
Usually by this time of year I would have done several presentations on stage, or a webinar or some such thing. This year, of course, we have all been locked down. We got to 75 days on our chalk board before one of the predlets (following social distancing) walked into the park to meet a friend. That is a long time. We are lucky to have space in the house and garden and lots of things to do, but it clearly has a mental impact.
I have spent hours and hours walking around the garden, listening initially to music, then getting really into audiobooks, but also properly regularly listening to the Games at Work dot Biz podcast on a Monday. The dulcet tones of my friends Michael Rowe, Michael Martine and, more recently, Andy Piper talking about all the things in tech that I love was like hanging out and shooting the breeze. Instead of talking I was walking, in order to maintain some level of fitness and get some vitamin D. I had been tweeting about the show and engaging in some extra conversation as thoughts arose, so it was funny over the last few episodes to get a name check.
I got asked to guest on the show last Friday, which was utterly fantastic. However, I realised that I had not really been doing much public speaking, so I started to doubt I had the words or thoughts worthy of the show. Equally, it's about stuff I care about, so there was no way I was going to say no. Instead I bought a new cardioid mic with a spring scissor boom and a pop filter. Yay for the tech. I have been on the show before, after I published the novels, but that was November 2015! Episode 126. It was part of a wide set of publication promotions I tried.
I signed into Skype with my Audio Hijack app configured to send my audio channel to the communal Dropbox. We had a Slack full of suggested news items (including the ones I added). Then we piled into recording, having had a bit of a thread and final item in place. Four of us on a podcast might get tricky, but we all got to say what we needed to say. It was really good fun. It was great timing too, as we all went through an amazing time with virtual worlds in 2006-2009 and I was attending Augmented World Expo and some of its events that week. So, very fresh in the short term memory, combined with long term memory 🙂
The time flew by; it was like the pub getting time called, but for the time I was on it was mentally refreshing. Of course then I was nervous about listening back to it; would it still give me the great buzz that it gave me going into the week? I am pleased to say it did. I sort of filtered out what I said and took even more notice of what the guys said. It is very different to be in listener mode as opposed to broadcast mode. Part of the reason Zoom calls are so tricky is that most of the time we are receiving info, but we have open cameras so are also in broadcast mode. That is tiring.
My participation, professional and social, in virtual worlds over many years mixes my tech geek inquisitive builder side with the deep emotional impact of how interactions feel. I still clearly remember events from 2006 that occurred in what we still call the virtual. I also still feel the waves of joy and excitement as a fledgling industry grew, and the pain of the inevitable bubble bursting. I remember the people who I have grown up with in virtual worlds, friendships that transcend time and space. I also deeply feel the fear and jealousy that was directed at many of us, despite our attempting to be inclusive. It was shrouded in the apparent lack of seriousness that comes from games technology, but it was really a fear of missing out in human to human communication and of power structures altering. I do think about and mourn where we could have been with the virtual, right now, when we need it the most, but equally we now have social media deeply adopted. The world is moving, and now everyone has to find ways to communicate that work for them.
I was honoured to be invited to gather around a firepit and chat in Second Life last weekend. It was at 2am my time on Sunday morning. If you were part of that great time around 2006 you will recognise these wonderful people from Billions of Us. I have remained an SL resident and island owner all these years, so it was not a resurrection event to sign on and pop along. However, I felt… nervous… Why? I was just going to have a private chat with people I share social media space with all the time. The reason was we were going to talk about the old days. I have all those emotions tied up in that period of time, now also compounded with being locked down at home for the past 57 days.
In order to help deal with some of the demons, I felt I needed to upgrade my avatar. My original Predator avatar, with striped leather jacket and then a subsequent Feeding Edge t-shirt, is from a very different time. SL has changed to be able to cope with 3d mesh models now; I was a collection of graphic primitives cunningly sculpted and scripted by Sythia Veil. However, I was a processor hog. I had visited a sim and a script had told me I was using up 4% of the processing power on the server, on my own! The client tells you your avatar weight (as in processing required) and I was bloated, old and fat at over 50k. I went and found a new and brilliantly built Predator avatar in the store, a detailed mesh with a weight of just 5k. I adjusted it to have a Feeding Edge logo attached at low cost and also placed clickable versions of both Reconfigure and Cont3xt on my hips, like six shooters. Changing a long-standing avatar is not without another range of emotions, but it helped me think about the future not the past, and damn… it looks good, with and without the mask. Well done to undercover for the build!
I had tweeted about this and was thrilled that Games at Work dot biz talked about it on the podcast. A link very much to the original eightbar crew in 2006, with Andy Piper, Michael Rowe and Michael Martine. It was also great to hear that Andy headed into an avatar rebuild too. You see, it was 14 years ago and it is still very relevant, very personal, and we are all still very connected. It is also a wonderful podcast, and the Michaels are now at episode 272 of the weekly event (I will let that number sink in!)
Anyway, Sunday 2am rolled into play; I teleported to the fire pit and was there with everyone. It turned out I had not tested the avatar in public, so I was over 8 foot tall, but I sat down rather than editing it there.
We talked for an hour solid about the good old days and the future, the relevance of virtual worlds to our reality now. I shared how it all got started for me and the chain of serendipitous events and wonderful people that gathered around IBM eightbar. We then took a quick tour of some art installations on the fledgling Billions of Us islands.
Then we said our goodbyes and I returned home. Both to my own island in SL and then to my office.
That “returning home” is way more emotionally loaded than I would normally expect it to be. It was late, I was tired, but… unlike all this torrent of video calling (and audio), which beams people to the screen in my office, I felt that for the first time in 57 days I had been somewhere, with people, friends old and new. When I woke up in the morning I still felt I had been out. It is this feeling that I have had a lot with virtual world and game experiences, but the fact it happened in these crazy pandemic times, and was so liberating and so deeply felt, makes me pity anyone still messing around trying to come to terms with those funny people with their funny avatars and crazy ideas.
Virtual worlds have not gone away; there are many more now. Please consider that this might be a great mental getaway for many of you. I don't care which one, and I know community building takes time, but get into one, take a look, and really think about how it feels.
It was my tribe, and it was on a subject we all knew, but this is very, very real. That one hour made me feel better, really very much better than I have for a long while. If it can do that for me, maybe it can do it for you. One hour of respite, just chatting or wandering around a virtual world. Let me know how you get on.
Lots more people are playing with, or using, video now with the lockdown. Every conference is a video conference. We are communicating with friends, family, customers and colleagues in a variety of ways and in a stack of different contexts.
I sparked up FaceApp on my phone, something I had tinkered with a while back, and found that it now not only does mad, crazy things with great accuracy to photos but can also do video for some of those effects. With still images, like this before and after example, or are they the other way around? 🙂 You decide.
The video though… mind-blowing stuff.
There is also lots of fun to be had with the Snap Camera virtual camera on PC and Mac, which can feed into any of the video conference tools you use: divert your main camera to it first, then select the Snap Camera as your webcam.
Lots can be done with foreground and background replacement. I mean, who would turn up at a meeting in a Predator costume? (Like it's 2006.)
Like my Second Life avatar (below), I can't claim to have created this one. But it was great to see it on the list of Snapchat lenses. My SL avatar has a custom jacket (my original RL one) and a Feeding Edge t-shirt, so there are always tweaks to be made. Yes, SL is still there; yes, I still have 2 islands; yes, you should sign up/re-login.
Back in the UGC world of SL in 2006 we had lots of interesting uses for logos and brands, like the Wimbledon Tennis flying towel and Wimbledon contact lenses (officially sanctioned, I should add). I dived into the Snapchat Lens Studio to see what I could quickly create, knowing full well there is a very rich 3d animation and code/no-code environment there that could be a career for some talented artists. I used one of the base models, tweaked it a bit and gave myself some shades with a subtle mention of 451 Research (now part of S&P Global Market Intelligence). Obviously that would all not fit on the glasses, but you get the idea.
I have always talked about putting logos and our own designs into game environments, like my Reconfigure car to advertise my novels, and before that, back in 2006, putting eightbar into games too (in the link above). Now I have a Feeding Edge logo in Animal Crossing: New Horizons. I am planning on doing a Reconfigure one too. Only a few pixels to work with, but it's fun to try. Our entire family is obsessed with this game at the moment; on top of all the other various games we play, it's a unifying experience as we tend to our islands, or live on one another's. (Hmmm, another SL-like reference.)
Of course this lockdown is driving us all crazy, but having a bit of fun is essential. Back to Snapchat, it was good to see the Trolls movie had some official Snapchat lenses. When I tweeted this I was told I looked like Eddie Izzard 🙂
The family joined in too, but not posting them without permission 🙂
Having said that, all this tech is great, but I also felt the need to try a real life filter and broke out this costume left over from The Cool Stuff Collective TV show (I didn't wear it on the show, but at the wrap party). It was certainly a shock for my colleagues on video. The neighbours probably thought I had lost it too, as I ran around the garden in it, doing a real life Animal Crossing manoeuvre.
Stay safe everyone, have some fun with the tech and lark around a bit!
The recent pandemic situation is causing many offices to suddenly switch from co-location in a building to everyone working from home. 11 years ago, when I left IBM and started Feeding Edge, I immediately became a home worker, but in the preceding years, despite having an office and a base, most work was effectively remote. Being a metaverse evangelist in 2006 was all about people using virtual world and game technology to be able to communicate and understand one another at distance.
The projects in 1997, in the early days of the web, were built as a team in an office but were clearly very much about interacting with the world at large. For those of us in these industries, building and shaping the use of the internet we did not face a big bang switch over from office to home. We tended to grab and adapt or write whatever tools were available to work across physical and digital divides. That of course is a typical pioneering spirit and having been an evangelist for change I know not everyone is comfortable with that, and nor should they be.
A sudden switch from one thing to another is bad enough. I recently had to switch from Mac to Windows for work; it's annoying and irritating at best, and very stressful at worst. Today we have the added external stress and personal worries about family and friends, as well as the future of businesses, layered on top.
The reality is, we do have the technology to communicate and share, we might not all have the right processes or social norms in place but the more we try, the better.
For many people teleconferences are the norm anyway; they just usually commute to a desk or office in order to have them. However, many others will not have that experience. It is here we all need to be cognizant of how much the technology blocks and filters who we are. A voice-only teleconference is great for a presenter, or for the alphas who thrive on talking, but many people will not feel comfortable interjecting and cannot use body language to find a gap in a conversation. Video links are not really any better; for some, that strips them of their normal behaviour as they attempt to stare at the camera or try to get a level of eye contact akin to the physical world. Some people engage in text chat alongside audio and video, though often those who are better at talking do not pay attention to the text, and sometimes the text just becomes chatter in the background. There is an art and style to all this, and people will find what suits them, just as in a physical situation we adapt to one another's signals.
It is these sorts of difficulties that led to exploring virtual worlds like Second Life. Some of that is shown in the album below of 600+ images from 2006-2009. This was a long and varied journey, and one I have talked, written and presented on many, many times. I am happy to share the tales, just ping me @epredator, or read the 11 years worth of posts here, many related to metaverse concepts, or even read my Reconfigure series and get the gist of some of it in a modern sci-fi context.
It acted (and still does act) as a teleconference in having shared sound in a space, but it has the presence of avatars to represent people, which get moved around, even doing simple things like sitting or standing or hovering next to people you know in a meeting. Instant visual feedback: who is there, where are they, what are they representing with their avatar? We used text chat, group and 1:1, whilst also having voice. We were able to bring artefacts into the environment dynamically, and whiteboard in a 3d space. All adding to the depth of communication. There is also, for those that can, a way to code and make virtual objects do something too. This is still entirely possible in Second Life and many other virtual environments. Now more than ever is the time for people to experiment.
This is also without even bringing in the tech and immersion of Virtual Reality, or the potential for Augmented Reality to allow us to blend our physical and digital presences in physical space. That of course requires people to don headsets and have the kit, but virtual world interaction does not need that; a laptop/tablet and an internet connection is all that is required. Carry on with the frustration of teleconference and video conference by all means, sometimes they are the ideal approach, but just consider what is causing the frustration of working remotely and investigate what might be a better way.