

Metaverse and GenAI webinar for BCS

This month was the AGM for the BCS Animation and Games specialist group, which I have been chairing for a very long while now. I gave a presentation from a personal viewpoint (this is not a work presentation, as I make clear in the disclaimers, though it is of course the field I work in) on the advances in Metaverse and GenAI content creation. The full YouTube version is below, and the link to the blurb and bio (and the video) at the BCS is here.

We are always looking for presenters to come and share ideas with our specialist group around all things games, animation, metaverse, esports and so on, so if you are interested, ping me: there is a slot waiting for you. We sometimes get a big crowd, other times a smaller one, but with the videos published like this it can be a useful thing to do and share.

For those of you who don’t know, BCS (formerly the British Computer Society), The Chartered Institute for IT, is a UK-based professional body, with worldwide membership, for anyone in the tech industry. It covers all levels, from those just getting going in the business to Fellows with vast amounts of experience and a willingness to help. It was part of my professional certification whilst at IBM, and I then became a certifier there too. Volunteering and sharing ideas, such as this presentation, is one of the many ways to get involved (though you don’t have to). It benefits you as an individual, but it also elevates tech roles within the enterprises and organizations you work in.

You can find more at BCS, The Chartered Institute for IT (bcs.org)

Reconfigure – The Movie, nearly

Having written my books Reconfigure and Cont3xt, I am always using them and the story I created as a way to explore other mediums. I started drawing a few years ago to see if I could maybe go manga with it, which included buying some mini drawing mannequins, action figures that come with some tech and weapon props. I created some MetaHumans, and I have a Unity-based mocap suit from a Kickstarter, to see if I could produce anime. Audio voice generation has been another thing I tried. Each of these projects would take a significant amount of time, effort and probably cost too, but delving in as a mini proof of concept gives me an appreciation of what would be needed.

Midjourney and the AI image generators have offered another option to explore, and very quickly too. One of the challenges is getting the same character to appear in each generated image (i.e. a storyboard). I decided to sort that out later in this POC, though I did suggest that the actress to play Roisin should be Ruby Rose, to get a potentially consistent likeness.

Midjourney experiments
Roisin at the start of Reconfigure in a bit of a last minute situation getting off the grid

Having got this image, which was pretty good for a first attempt with very little text in the prompt, I popped over to runway.ml and tried its new Gen-2 image-to-video generation. This time with NO PROMPT at all! It decided she should look up and stare into the distance, which in the book she is doing as she is being chased, very aware of her surroundings as she tries to outwit those after the amazing tech she has discovered. They only generate short clips on the free accounts, but those few seconds I found quite astounding and bewitching.

GenAI is obviously on a major drive into all sorts of areas, but the ability to potentially use tools like this to create a film version of my books in my own time and on my own budget is emerging as a strong contender, unless Netflix or Prime decide they would like to do it instead 🙂

EyeBlend better than an Apple Vision Pro – still

In 2016 I wrote Cont3xt, the follow-up to Reconfigure, and part of that was an upgrade our lead character gets. She has access to an API that can manipulate the physical world like the virtual; in the first book she builds that up into a Unity application for her smartphone. In the second she gets an EyeBlend headset, state of the art (and still just a little ahead of Apple and co’s attempts). So she ports the Unity app and upgrades the messaging system for this high-bandwidth device to enhance the experience, rather like all the developers are doing right at the moment. Life imitates art 🙂 Read Reconfigure first, but this might be a topical time to check out Cont3xt whilst you wait for the Apple demo in a local store next year. (Just £0.99 each for these adventures 🙂 )

“An EyeBlend? They are not on general release yet? How did you get that?” Her eyes lit up wildly as Roisin took in the news.

“Oh you know, I have a few friends who hack for a living.” Alex replied with an air of comedy superiority.

“You sure do! The EyeBlend is an awesome piece of kit. It does full blended reality applications. It uses field effect projection with live object tracking!” She said enthusiastically. “They call it a hologram, though it really isn’t as such. You can forgive that though as the effect is the same, apparently.” Roisin was on her tech turf again.

Exploring AI writing – Blackadder meets the office, car haiku and discussing the Metaverse

My social feeds were getting full of people trying out a version of the GPT-3 AI for text by OpenAI at https://chat.openai.com/chat so I thought I would have a look at it. It is as amazing as the AI art generation: not a sentient being, but a tool that, when asked the right questions, gives interesting answers.

Whilst essay-style responses are interesting, I asked it a rather odd question with a style slant to it, and this is what I got. It is not hilariously funny, but just consider that the characters are as you would expect them to be. There is a sense of trying to get out of doing something, then getting caught out and sent back to the beginning.

“Write a Blackadder sketch set in a 20th century office environment”

Blackadder: (sitting at his desk, staring at a stack of papers) Baldrick, I’m drowning in paperwork.

Baldrick: (leaning against the doorframe) Yes, sir.

Blackadder: (sighing) It seems like no matter how much I do, there’s always more to do. And on top of that, the boss is breathing down my neck.

Baldrick: (nodding) Yes, sir.

Blackadder: (slamming his fist on the desk) I need a break, Baldrick. I need to get out of this office and do something fun for a change.

Baldrick: (smirking) I’ve got just the thing, sir. How about we play a game of darts?

Blackadder: (raising an eyebrow) Darts? In the office? Are you mad, Baldrick?

Baldrick: (shrugging) Why not, sir? It’ll be a nice change of pace, and it’ll take your mind off all this paperwork.

Blackadder: (considering) Hmm, I suppose you have a point. But we’ll have to be careful not to break anything.

Baldrick: (grinning) Don’t worry, sir. I’ll go get the darts.

(Baldrick exits the room and returns a moment later with a dartboard and a set of darts)

Blackadder: (picking up a dart) Alright, Baldrick. Let’s see who the better marksman is.

(They start playing darts, with Blackadder proving to be the better player)

Blackadder: (throwing a dart and hitting the bullseye) Ha! Another bullseye, Baldrick. You’ll have to try harder than that if you want to beat me.

Baldrick: (throwing a dart and missing the board entirely) Oops, sorry sir.

Blackadder: (rolling his eyes) Never mind, Baldrick. At least we’re having fun.

(Just then, the boss enters the room)

Boss: Blackadder! What is going on in here?

Blackadder: (turning to face the boss) Ah, just taking a break from all the paperwork, sir. Playing a friendly game of darts.

Boss: (scowling) Darts? In the office? This is highly inappropriate, Blackadder. I expect better from you.

Blackadder: (apologetically) My apologies, sir. It won’t happen again.

Boss: (sternly) See that it doesn’t. Now, get back to work, both of you.

(Blackadder and Baldrick sheepishly return to their desks and resume their work, with Blackadder grumbling about the unfairness of it all)

It’s not just long form that this bot can produce; I also asked it for something shorter and to the point.

“Write a haiku about cars”

Cars speed down the road,

Metal beasts with roaring engines,

Nature takes a backseat.

Next I chose to mix it with some AI image generation (this time using Photoleap on the iPhone).

“Create a play with two people discussing the Metaverse, one person is positive the other person is negative”

Act I:

Person 1: Have you heard about the Metaverse? It’s this amazing virtual world where anything is possible.

Person 2: Yeah, I’ve heard of it. But it’s just a bunch of hype. It’s never going to happen.

Person 1: Are you kidding me? The technology is already here. It’s just a matter of time before we can all access it.

Person 2: I don’t know. It sounds like a utopia, but I think it’s more likely to become a dystopian nightmare.

Person 1: That’s just fear-mongering. The Metaverse has the potential to revolutionize the way we live and interact with each other.

Act II:

Person 2: But what about the consequences? What happens when people start spending all their time in the virtual world instead of the real one?

Person 1: That’s a valid concern, but I think it’s a risk worth taking. The potential benefits are just too great to ignore.

Person 2: But what about the loss of privacy? In the Metaverse, everything about us will be out in the open for anyone to see.

Person 1: That’s true, but I think the benefits will outweigh the drawbacks. The Metaverse will allow us to connect with people in ways we never thought possible.

Act III:

Person 2: But what about the cost? Creating and maintaining the Metaverse is going to be incredibly expensive.

Person 1: That’s a short-sighted view. The Metaverse has the potential to generate huge amounts of wealth and create countless new jobs.

Person 2: But what about the impact on the environment? The amount of energy needed to power the Metaverse will be enormous.

Person 1: Again, I think the benefits will outweigh the drawbacks. The Metaverse will allow us to explore new frontiers and push the boundaries of what’s possible.

Act IV:

Person 2: I still have my doubts. The Metaverse sounds like a pipe dream to me.

Person 1: That’s okay. Not everyone has to believe in it. But I have faith that the Metaverse is the future, and I’m excited to see where it takes us.

Person 2: I hope you’re right. But I’m still skeptical.

Person 1: That’s okay. Time will tell. In the meantime, I’ll continue to dream of the possibilities the Metaverse holds.

Yet more wonders I did behold when I asked…

“Describe the Metaverse in the style of William Shakespeare”

In the style of William Shakespeare, the Metaverse may be described as a boundless realm of virtual reality, where spirits roam free and digital avatars take on form and substance. It is a world beyond the flesh, where imagination is the only limit and all things are possible. Here, one may forge new identities and explore uncharted territories, unfettered by the constraints of the physical world. In this vast and wondrous domain, the boundaries of time and space are mere illusions, and the possibilities for adventure and discovery are endless. So come, brave travelers, and enter the Metaverse, where you may revel in the boundless wonders of a virtual world, and unleash the full power of your imagination.

Very clever, and this is accelerating very quickly. Certainly something to keep an eye on!

AI image generation – bots FTW

Firstly, yes, it’s been a while since I posted here; I could have done with an AI clone of me carrying on sharing interesting things. I was out mainly due to an annoying post-ailment “thing” (medical term) hitting my facial nerves, which meant that for months any screen time spent trying to read or write was visually and mentally uncomfortable. That was followed by a dose of Covid that knocked me out for a while, but which also seems to have had a potentially positive impact on whatever else was going on. Anyway, I am back. Anything happened the past few months, I mean half a year?

I suddenly felt compelled to blog about some of the marvellous work going on with AI image generation of late, with things such as DALL-E and Midjourney becoming available to more people. I have so far only used Midjourney, which is accessed through Discord. In the relevant channel you simply type the command /imagine followed by any descriptive text you fancy and it will generate four images based on that, often with surprising results. Any of those images can be selected as the root to generate more in that style, or you can upscale (get a big version of) one you favourite.

Handy hint: the mobile version of Discord really likes to scroll to new items, and Midjourney works by updating your post in the channel. As so many posts are being generated every few seconds, you might think you can leave the app on your post to see the results, but very quickly it will zoom up the screen. Constantly scrolling back to try and catch your images being generated is a bit of a pain. However, if you log into midjourney.com with your Discord ID it will present you with your collection and lots of links to get back and do things with them. The Discord search doesn’t seem to work on user ID etc., so it took me a while to figure this out.

After a few attempts, and with one eye on the political situation and crisis faced in the UK, I asked Midjourney for “an artist forced to stop painting in order to work in a bank” and got this wonderful set of images.

an artist forced to stop painting in order to work in a bank

I also asked it to generate a protest by the people

a crowd of down trodden people protesting and rising up against the billionaires and corrupt politicians

Rather than continue the doom and gloom it was time to go a bit more techie 🙂 I asked it to show me directing the metaverse from a control centre.

epredator directing the Metaverse evolution from a hi-tech digital operations center

An obvious course of action was to explore my Reconfigure and Cont3xt novels and this is where I was very impressed by whatever is going on with this AI. I asked to see “a cyberpunk girl manipulating the physical world using an augmented reality digital twin” and it seems that Roisin (@Axelweight) is in there somewhere 🙂

a cyberpunk girl manipulating the physical world using an augmented reality digital twin

This was worth exploring, so I picked a couple of the images to seed the generation of more similar to them, which produced these two.

a cyberpunk girl manipulating the physical world using an augmented reality digital twin (generation 2 from bottom left of gen 1)

And this one

a cyberpunk girl manipulating the physical world using an augmented reality digital twin (generation 2 from top right of gen 1)

These were good, but the top right image from generation 1 was the one I did a more detailed version of on its own. It could be the cover of the third book, couldn’t it?

Midjourney-generated potential book cover for the third in the Reconfigure trilogy. Of course, one of these more geometric results is more in line with my current cover designs, asking “A world of Fractal Iterations, Quantum Computing and strange side effects opened up. It appeared to offer a programming interface to everything around her.”
A world of Fractal Iterations, Quantum Computing and strange side effects opened up. It appeared to offer a programming interface to everything around her.

In case this all seems a bit too serious, I did ask a slightly weirder question of the AI and got this.

two ducks with a top hat on carrying a salmon

For a happy image, how about this: /imagine a bright green planet full of life

a bright green planet full of life

But back to the surreal and unusual. Predlet 1.0 and I share a joke we “invented” when she was quite young. We were passing the pub in Lock’s Heath, having been to the supermarket, when she looked at the pavement and saw a creature. She said “dad look there’s a worm coming out of the pub”, to which I replied “and he is legless”, and here it is

a worm legless outside a pub

As this kind of AI generation evolves from the static image to the moving one, and then on to virtual worlds and the metaverse, what a wonderfully weird set of worlds we will be able to dream/nightmare up 🙂

The Metaverse is calling again – think bigger

As many people who pop along here will know, since 2006 (and even before that) I have been somewhat passionate about explaining and sharing what can be done with things like game technology and virtual worlds to help with human communication. That includes entertainment (which is often regarded as not a serious business), office-type business, education, industrial digital twins; the list goes on. Metaverse is a handy word to bundle all this together, and whilst it is not tightly defined, I think many of us know what we mean. This month I wrote a piece for the BCS, which is available to read here, and they have nicely highlighted the phrase “Travelling and actual teleportation aside, do you think that we should be able to find better ways to feel we are with people and to have shared experiences?” Whatever you think the Metaverse is evolving to, just consider whether the way you are doing things now is the best it could possibly be given the tech available. Things evolve and move, not always replacing everything behind them but adding.

Having said that, it is worth reminding everyone that the original work in 2006 in Second Life was mostly about pulling data from other sources into the environment and doing things with it; this eventually led to building Wimbledon’s 2006 court with the ball trajectory from Hawk-Eye rendered there.

broker
All about data from elsewhere integrating into SL in 2006

There is an ongoing discussion about interoperability of Metaverse or virtual world platforms. It ranges from whether it is even desirable to have objects from world x appear in world y, and then often hits the barrier that platform x has a completely different rendering engine to platform y. The former is a style and even a business question: why does an environment of any type exist, real, virtual, augmented, projected, whatever? The latter leads to the technical wars around content being “in” a development or build environment.

Importing meshes, textures, animation rigs etc. is one way to look at Metaverse evolution, but I think it is also worth considering the potential for things like remote rendering. If the unifying standard for content sharing was actually containers, of potentially any size, shape and level of detail, then anything from environment x could appear in environment y and even interact with it to varying degrees, depending on permissions or needs. My Nvidia RTX ray-traced shiny car could appear in the most basic of 3D environments because I am rendering the car at my end, and the virtual world is just processing a container and a stream. This is, after all, what we do when we video chat: each of our cameras is doing some work and being hosted by a basic integration layer. Yes, I know 3D, collisions, physics and so on all make it more complex, but it is worth considering that other ways of working can exist alongside the build-and-compile we are used to from games. Also, from what I remember of Snow Crash, avatars are rendered by the person’s own device, so those with less computing power have more basic avatars in-world. And we already did some of this back in 06/07; it just takes a different way of thinking.
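To make the container idea slightly more concrete, here is a minimal sketch in C++ of what such a shared content descriptor might carry. It is entirely hypothetical, with my own illustrative names rather than anything from a real or proposed Metaverse standard: the point is that the host world would only need placement bounds, permissions and a handle to the remotely rendered stream, not the meshes, textures and rigs themselves.

// Hypothetical sketch only: a "container" a host world could accept without
// importing the guest content's meshes, textures or rigs. The guest renders
// its own content and the host just places it and composites the stream.
#include <array>
#include <cstdint>
#include <iostream>
#include <string>

struct BoundingBox {
    std::array<float, 3> min; // axis-aligned bounds in the host world's units
    std::array<float, 3> max;
};

struct ContentContainer {
    std::string id;             // stable identity across worlds
    std::string owner;          // who controls the content and its permissions
    BoundingBox bounds;         // all the host needs for placement / a collision proxy
    std::uint32_t maxDetail;    // level of detail the guest is willing to stream
    bool allowInteraction;      // may the host world send events back to the guest?
    std::string streamEndpoint; // where the remotely rendered frames or geometry arrive from
};

int main() {
    // The RTX ray-traced "shiny car" example: the host only ever sees this descriptor.
    ContentContainer car{
        "epredator/shiny-car",
        "epredator",
        {{-2.0f, 0.0f, -1.0f}, {2.0f, 1.5f, 1.0f}},
        4,
        true,
        "rtsp://example.invalid/car-stream"};

    std::cout << "Placing container " << car.id
              << ", streamed from " << car.streamEndpoint << "\n";
    return 0;
}

In that model the host’s job shrinks to placement, permissions and compositing a stream, which is essentially the video chat analogy above applied to 3D content.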

Talking of thinking, literally, I was invited to join the Games at Work dot Biz podcast again and I was more than happy to give a shout out to a book by Prof Richard Bartle on virtual worlds and our god-like powers over them. Check out the podcast: links are here and it is on any player you have; please give it a rating to help the guys too.

Meta, Metaverse, Me

Well, what an interesting day yesterday was, as Facebook went all out on being a Metaverse company and renamed itself Meta. It also marked a point that crosses my straight, unbiased industry analyst day job with my constancy of being a Metaverse Evangelist, a title I had in 2006 and maintain. These comments are mine alone and not those of the company I work for. I have lots more to share and discuss on all things Metaverse, but that’s something people have to pay for nowadays.

However, I will say that I was really pleased to see a slick presentation on all things Metaverse from a multi-billion-dollar company like Facebook. Last year the same happened to some degree with Microsoft and Mesh at Ignite. The conversation has moved, for many people, from “nah, we don’t want to interact like that ever” to “no one company should own all this”. The Metaverse the industry and society needs does, as was said in the Meta keynote, need to let people connect with both the physical and multiple virtual realities, for a multitude of reasons and with many available ways to be who we are. I have for many years talked about how we have had to adjust to the limitations of technology to allow our communication at distance. We still type on QWERTY keyboards, a layout designed in part to stop physical typewriter hammers jamming by spacing out the commonly used letters in words. Not that we need to get rid of this form of typing, but to hold it up as the only way to interact with technology and data seems a little bit odd, doesn’t it?

Back a decade or so ago I created a slide for a presentation on virtual worlds and the metaverse that was aimed at showing how far we might be able to go and why. It was in part a presentation about the fact it is not all avatars and islands. Those work, but there are many more things that come together. There are many forms of input, including neural interfaces (see below at the left); there are combinations of virtual to virtual, virtual to real (AR) and, as I said at the end, virtual to real to real, which is virtual objects becoming physical again (3D printing etc.). It is about the connecting of people, a wetware grid, but threaded by narrative and context. The Meta presentation often showed these as quick interactions, throwing a basketball, looking at a design etc., but there is of course a longer thread, a digital thread if you like. It’s all very fractal in its nature; the patterns repeat at any scale.

One thing to note though is that everything feeds back into the entire loop; the arrows are multi-directional. You might 3D print a “device” or prop you found in a virtual world interaction, but that itself might be an input device for another experience. Imagine if you 3D printed a modelling clay tool you bought in a virtual world, then made a clay sculpture that you took a photo of for Flickr, or even scanned the sculpture in and made it a one-off virtual object that you sold on an NFT marketplace, which then saw it used as a prop in a virtual world movie, and so on. I used to talk about printing off a penny whistle and joining in a virtual band at an Irish pub in Second Life. I hope you get the gist.

One key thing though, and it is slightly annoying that Facebook have nicked the word, but as you can see below the entire loop is the meta loop (c. 2009).

meta loop
Describing the Meta loop in 2009 – It’s not all avatars and worlds/islands

I know many of us old guard, or original gangsters, however OG you want to be, might be annoyed at the mega corp going Meta on us and riding high on our hard work persuading the world this is a good idea and the way we are all going. However, it is the way we are going, so we get to say “I told you so” rather than the multi-billionaires. Money clearly isn’t everything 🙂 We, and I include all my fellow Eightbar people and the surrounding community from back in 2006, were already riding on top of some double OG work. It’s the nature of waves of societal acceptance and tech development. The hope is that businesses and experiences that are born of the Metaverse will thrive and bring entertainment, education, excitement and benefit to the world. By Metaverse businesses I mean not the ones building the underlying tech platforms, but the ones thriving on top of them, and then going on to influence the wave after this, just as Facebook and co are today. It’s certainly not going away, so my son’s birth certificate from 14 years ago saying father’s occupation “Metaverse Evangelist” might turn out to be of use to him someday.

I mentioned above that the physical world is an important part of the metaverse, and in the Facebook/Meta pitch there was an instrumented house where the mugs and teapot that were carried in real life updated their position in the digital model. Well, that the other way around is the premise of my Reconfigure and Cont3xt novels. They are very Metaverse, IoT, Quantum, AR and VR related, but our hero Roisin gets to manipulate the physical world by altering the digital world. It works for an adventure, but it also keeps the “sci-fi set this afternoon” concept alive, and it is not yet all done and dusted 🙂 Go take a look; it’s very reasonably priced and might spark a few ideas on the back of the future that Meta are pushing. More here, or see the ads on the right.

What’s all this animated dancing in games (revisited) – BCS presentation

The recording of my BCS Animation and Games pitch is now live on the site and on YouTube. Unfortunately only the live audience got to see the larger avatar I was using to deliver this pitch, as the Zoom presentation feed is only of the shared screen, with a tiny top-left insert of the avatar and my dulcet and somewhat stuttering tones. It took a few minutes to get revved up, and to many I might be stating the obvious, but it is surprising how many people don’t get the importance of user-generated content and/or expression in virtual worlds and games.

The official site is here, but I have added the YouTube video below to save a click or two.

Digital humans evolve

Way back in the dim and distant era of 2009 I was exploring a lot of tools to help me build virtual environments with avatars and characters that could be animated, typically in Unity. 3D modelling is an art in its own right, and rigging those models for animation and applying animations to them is another set of skills too. At the time I was exploring Evolver, which in the end was bought by Autodesk in 2011. A decade on and there is a new kid on the block from Epic/Unreal called MetaHuman. This cloud-based application (where the user interface itself is cloud streamed) runs in a browser and produces incredibly detailed digital humans. The animation rigging of the body is joined by very detailed facial feature rigging, allowing these to be controlled with full motion capture live in the Unreal development environment. Having mainly used Unity, I found a lot of similarity in the high-level workflow of the two, as they are the leading approaches to assembling all sorts of game, film and XR content. However, there was a bit of a learning curve.

I decided to generate a couple of characters and ended up making what to me feel like Roisin and the Commander from my Reconfigure novels. Creating them in the tool plays instant animations and expressions on the faces, which is really quite a surreal experience. I installed Unreal on my gaming rig with its RTX graphics card and all the trimmings and set about seeing what I needed to do to get my characters working.

First there is an essential application called Quixel Bridge, which would have been really handy a decade ago, as it brokers the transfer of file formats between systems; despite standards being in place, there are some quirks when you move complex 3D rigged assets around. Quixel can log directly into the MetaHuman repository, and there is a specific plugin for the editor to import the assets into Unreal. Things in Unreal have a data and control structure called a Blueprint, which is a kind of configuration and flow model that can be used in a no-code (but still complex) way to get things working. You can still write C++ if needed, of course, as in the sketch below.
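For anyone wondering what “dropping to C++” looks like alongside Blueprints, here is a minimal, illustrative sketch of an Unreal Engine actor header. It is not taken from the MetaHuman demo, and the class and property names are just mine, but it shows the standard pattern of exposing a property and a function to the Blueprint system so a designer can configure and call them without touching code.

// Illustrative only: a minimal Unreal Engine actor exposing data and behaviour
// to Blueprints. UPROPERTY and UFUNCTION specifiers are what make C++ members
// visible in the editor and in Blueprint graphs.
#pragma once

#include "CoreMinimal.h"
#include "GameFramework/Actor.h"
#include "DemoCharacterProp.generated.h"

UCLASS()
class ADemoCharacterProp : public AActor
{
    GENERATED_BODY()

public:
    ADemoCharacterProp()
    {
        PrimaryActorTick.bCanEverTick = true; // allow Tick to run every frame
    }

    // Editable in the Details panel and readable/writable from Blueprint graphs.
    UPROPERTY(EditAnywhere, BlueprintReadWrite, Category = "Demo")
    float SpinSpeedDegreesPerSecond = 45.0f;

    // Callable from a Blueprint node if a designer wants to trigger it.
    UFUNCTION(BlueprintCallable, Category = "Demo")
    void ResetSpin() { SetActorRotation(FRotator::ZeroRotator); }

    virtual void Tick(float DeltaSeconds) override
    {
        Super::Tick(DeltaSeconds);
        // Slowly rotate the actor so there is something visible to configure.
        AddActorLocalRotation(FRotator(0.0f, SpinSpeedDegreesPerSecond * DeltaSeconds, 0.0f));
    }
};

For what I describe below I stayed on the Blueprint and editor side; the C++ route is there if a project grows into it.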

My first few attempts to get Roisin to download failed, as the beta was clearly very popular. I only took a photo of the screen, not a screen capture, so it is a bit low quality, but there is more to come.

MetaHumans

However, eventually I got a download and the asset was there and ready to go. Unreal has a demo application with two MetaHumans in it, showing animation and lip synching and some nice camera work. Running this on my machine was a sudden rush to the future compared with my previous efforts with decade-old tech, for things such as my virtual hospitals and the TV slot I had back then.

Roisin ?
Roisin from Reconfigure and Cont3xt in Unreal engine

The original demo video went like this

Dropping into the edit, and after a lot of shader compilation, I swapped Roisin with the first MetaHuman by matching the location coordinates and moving the original away. Then in the Sequencer, the film events list, I swapped the target actor from the original to mine and away we go.

This was recorded without the sound, as I was just getting to grips with how to render a movie rather than play or compile an application and then screen capture it. Short and sweet, but it proves it works. A very cool bit of tech.

I also ran the still image through the amusing AI face animation app Wombo.AI; this animates stills, rather than the above, which animates the actual 3D model. I am not posting that, as they are short audio clips of songs and the old DMCA takedown bots tend to get annoyed at such things.

Now I have a plan/pipe dream to see if I can animate some scenes from the books, if not make the whole damn movie/TV series 🙂 There are lots of assets to try and combine in this generation of power tooling. I also had a go at this tutorial, one of many, which shows live facial animation capture via an iPhone streamed to the MetaHuman model. I will spare Roisin the public humiliation of sounding like me and instead leave it to the tutorial video for anyone wishing to try such a thing.

Let’s see where this takes us all 🙂

Sansar-Lost Horizon party in VR

Last weekend, Friday 3rd July and Saturday 4th July 2020, the virtual world Sansar hosted the Glastonbury Shangri-La set of gigs called Lost Horizon. I decided I really should attend, as it looked such an epic line-up and an attempt to put on constant music from 3pm to 3am over several stages for two days, in VR.

Sansar was originally a Linden Lab (of Second Life fame) spin-off, focussed on using newer tech to get people into the environment. I was in there back in 2017, importing a Tilt Brush sculpture into my own space. I had been in the early beta, but it was all under NDA so I didn’t post photos until it went live.

Now #sansar is live I can post my google tiltbrush import #art
Early Sansar

Sansar was recently sold off to Wookie Ventures, but I was pleased to see that my Sansar account even from back then still worked, and I had the “look, I have been here since year dot” style level on my profile.

I had a quick look around the new Sansar in the days before the event, you know, to get the avatar in good shape and not look a total n00b moving around etc. It had changed a lot, very much in keeping with the style of VR apps, with pop-up menus you can place in space or float from your wrist. It also looked and sounded great.

New Sansar Nexus

Incidentally, for some reason all the photos I took from within the Oculus wrist band came out square; it all looked much better in VR and moving.

I visited the popular 2077 cyberpunk world too, again all as prep. All very much in keeping with the whole scene, and great to see Max Headroom (look him up, kids) on a screen for my generation of geeks.

Cyberpunk 2077

The real stars though were the gigs themselves, both the dance music, some urban house or something I don’t even know, and the live bands performing all manner of wonderful tunes.

Now this is where my experience may have got a little more freaky. I had my Oculus Rift S ready to go for the visuals, but I also wore my Woojer haptic vest, which hits your body with bass lines and clever sensory perception through sound, and that fed into my Nuraphone in-ear and over-ear headphones, which are tuned to my individual hearing profile. It’s a lot of clobber to be wearing, but it had to be tried. (Note this is all bought and paid for, no product placement here!)

Sansar lost horizon
All the kit

As I logged into Sansar and arrived at the Nexus I immediately got a blast through all the kit. As I teleported to the first stage I arrived at the landing zone, with the stage a little walk away, but already the dance music bass line was hitting me physically as well as sounding great. I had to turn the haptics down a little to start off with, to get used to it as I approached the arena.

Once in, and this applied to all the shows I went to, the DJ, in this case, was green-screened into the booth, so whilst we were all avatars, the DJ was themselves, doing their thing.

Sansar lost horizon

Everyone else was an avatar of many different forms, from the vanilla basic first avatar to some incredible creations. I couldn’t find my normal Predator kit so I went with my second option of green hair and some cyberpunk kit. I also picked up a set of animations and dance moves free from the shop, but went on to buy a whole lot more to add to the fun.

The music and the feeling were incredibly intense, and I tried all the stages in my first 3 hours at the event. I liked the DJ stuff, but the live bands were even better.

Sansar lost horizon

I dropped out after 3 hours, just to make sure I wasn’t shaking myself to pieces, and to grab some food. I also mentioned it to Predlet 2.0; he installed Sansar and got set up. He didn’t set his VR up, but it supports desktop too.

He was having a blast with silly avatars too. We were at the Fatboy Slim gig, as you can see below; Fatboy Slim is waving in the background. I am rocking a guitar that I went and bought, as I was enjoying the actual bands, and Predlet 2.0 was a dancing shark for this one.

Lost horizon festival Sansar 2020
Fatboy slim and us

In Sansar you have a choice about how much you want to interact. I love having lots of dance moves and gestures and stringing them together in time with the music; it was clear others were just listening, leaving the avatar to dance, but it’s all good. Predlet 2.0’s shark liked doing head spins.

Lost horizon festival Sansar 2020
More shark

You can mix between the first-person view or, as in these photos, orbiting the camera around. If you are dancing it’s best to see what you are doing with the avatar from the outside.

Occasional crashes or re-sharding sent me back to my own little room, but I stayed for another 3 or 4 hours on the first day. That’s a lot of VR, haptic feedback and banging tunes in your head, I have to say. I was worn out 🙂

Lost horizon festival Sansar 2020
Waiting to get back in

Some of the venues got us all flying around with low gravity or bouncing pads, so I got to be all rock and roll on a rooftop.

Lost horizon festival Sansar 2020
roof

I got an awful lot of moves and animations, many from names I know well from Second Life. These all strung together in a wonderfully esoteric way and the puppetry of dancing got more and more fun.

It was also clear that there were an awful lot of people with years in Sansar, intricate avatars, and lots of YouTubers streaming too. Equally, there were some people who apparently had never been in any virtual world or game before. These people were sort of amusing in that they would arrive and shout to their friend, “can you hear me?”, “Is that you?”, “Look what I can do!”, which was moderately funny, but then they would start pointing out weirdness in avatars, thinking no one could hear them. I had left my channel open, so I could. There were also some people, as in any crowd, looking for a fight, swearing or being abusive; they were politely reminded by Sansar helpers/bouncers to chill out and that this was in fact PG-13. I saw a few of those less capable of enjoying a space with others leave, or get booted for being griefers. It’s 2006 all over again; we used to have to do the same at the IBM Wimbledon Second Life event.

I saw a lot of bands, so many I can’t name them all, but on day 2, probably about 5 hours in, I caught UK metal/punk/street band PENGSHUi and they absolutely rocked. If nothing else it was the mix of styles messing with and thrashing my haptic Woojer. It was certainly time to break out the guitar. Yes, I even bought the album afterwards, the first music I have bought for years, as a memory of the event.

Lost horizon festival Sansar 2020
PENGSHUi

I spent about 10 to 12 hours in total out of the 24 available. The Woojer and Nuraphones added a great deal to my personal experience. I know a lot of other people were enjoying it, and the production was fantastic. It is the best virtual event I have been to; not the longest, as two weeks running Wimbledon in SL would probably take that record, and that was different. This was pure escapism, rocking music and a bunch of fellow mad virtual world enthusiasts and n00bs alike coming together in this weird locked-down pandemic.

If I have not already posted all the photos I managed to take above, they are in this album, but go check out the videos and other official content, not least PENGSHUi (yay, I see my avatar).

Or one of the DJs, like Fatboy Slim (there were multiple shared rooms; looks like I wasn’t in the filmed room this time 🙂 )