Communicating through avatar size

It always amazes me that there is always something to learn or be surprised about in tech; you think you have a handle on it and then you see, or somehow sense, something else. I recently got hit with a thought about the relative sizes of avatars in a shared virtual space/metaverse application and what that can be used for. I have often talked about the scale of virtual worlds being a mutable thing: we can be subatomic or at the scale of a universe in a virtual environment. I have also talked about being the room or place, i.e. a sort of dungeon master approach, controlling the room (as a version of an avatar) whilst others experience something about that place.

Avatars of all sizes

The first experience that got me thinking about relative avatar size was getting to play Beat Saber as a three-player VR game. I had done two player before; there you see your fellow player off to the side facing you but experiencing the same song and blocks. You are so busy playing and concentrating, but it is a good experience to share. However, once over two people, at three in this case, you are each on your own track but facing a common central point, like being on the points of a star. The player who is doing best has their avatar zoomed and projected up towards that central point as a reminder that you need to work harder. It's a great mechanic, using avatar size as a communication mechanism, i.e. they are better at Beat Saber than you.

The next experience was properly playing the VR dungeon-crawling, turn-based dice and card game Demeo as a multiplayer. This game has you and your friends gathered around a board decked with figurines. You are represented by your hands, to pick up the pieces and move them or to roll the dice, and by a set of glasses or a small mask. The avatar is not a whole figure, no need for legs etc. The board is the thing you all want to see. Each player can change not only the orientation of the board just for them, by dragging it around, but can also zoom in and out to see the lay of the land or get right in and look at how the characters have been digitally painted. The game is collaborative and turn based, and you get to see the other players' hand and mask avatars. Here though is the twist: you also get a sense of whether the other person is standing looking around the table top or has zoomed in close, because the avatar you see of them scales up and down in size according to their view of the world. Not only can you see the direction they are looking but how detailed their view might be. If you are zoomed in and you look up you see the giant avatar of your fellow player looking at you. This is all very fluid, gameplay is not messed up by it, and it shows a level of body language that only really exists in metaverse style applications. The VR element makes it feel even better, but the effect works on 2D screens too.
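
As a way of picturing that mechanic, here is a minimal sketch in TypeScript, purely my own assumption of how a Demeo-style system could work rather than anything from the actual game: each player has a zoom factor on the shared board, and the size you see another player's avatar at comes from the ratio of the two zooms.

```typescript
// Minimal sketch (my assumption of a Demeo-style mechanic, not the actual
// implementation): each player magnifies the shared board for themselves,
// and everyone else sees that player's hand/mask avatar scaled to reflect
// their current view of the board.

type Vec3 = [number, number, number];

interface PlayerView {
  id: string;
  boardZoom: number;      // 1 = standing at the tabletop, 4 = zoomed in 4x on the miniatures
  headInBoardSpace: Vec3; // head position expressed in shared board coordinates
}

// How big player `remote`'s avatar should appear when rendered in `me`'s view.
// A player at tabletop scale looks like a giant to someone who has zoomed in,
// and a zoomed-in player shrinks down towards the miniatures for everyone else.
function apparentAvatarScale(me: PlayerView, remote: PlayerView): number {
  return me.boardZoom / remote.boardZoom;
}

// Example: I have zoomed in 4x, my friend is looking at the whole table.
const me: PlayerView = { id: "p1", boardZoom: 4, headInBoardSpace: [0.1, 0.2, 0.1] };
const friend: PlayerView = { id: "p2", boardZoom: 1, headInBoardSpace: [0.0, 0.6, -0.4] };

console.log(apparentAvatarScale(me, friend)); // 4 -> their mask looms over me like a giant
console.log(apparentAvatarScale(friend, me)); // 0.25 -> I appear small, hunched over the minis
```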

We often talk about sense of presence, remembering you were next to someone in a virtual place, but this is another dynamic that is obvious, yet only obvious now having experienced it. Anyway, that's the thing I learned earlier this week. Also, the image above started as a single DALL-E sentence for the AI generation, but I then tinkered around with the new web editing that lets you add new frames of generation to existing images, creating AI generated composites! The stately home was an accident, but somehow looks very much like my old 2006 metaverse evangelist physical and virtual home of IBM Hursley. The ability to create an image and a feeling for this post, not just grab a screenshot from the games (which is always tricky to get right anyway when it involves other players), is also rather cool, and it is the first time I have created an image for a subject rather than the subject being image generation. The future is arriving fast!

Anyway, avatars, not always what you think they are, are they? 🙂

Talking games and tech on Games At Work dot Biz

I was really happy to be invited back to my favourite podcast GamesAtWork dot Biz to share some air time with Michael and Andy last week (one less Michael than usual, unfortunately). As Andy and I are both in the UK, this took the number of Brits on the show up to a majority, hence the title.

You can hear the latest episode on all the usual podcast places, or follow this link to ep386 Too many brits

Amongst other things I got to enthuse about the latest iteration of my favourite real guitar teaching and playing app/game Rocksmith, now Rocksmith+, which for me really shows the power of instrumentation of, erm… instruments 🙂 The original was back in 2012 and very quickly broke the guitar I had owned since 1988 🙂 I also learned that the action on that guitar was way too high, and my replacement electric was much easier to push the strings down on.

Rocksmith, amongst other things, lets you feel like you are a rock god at a concert :), as per the Midjourney-generated image below. The enjoyment of feeling improvement very quickly, through what is still intense practice, is wonderful.

More AI image generation

I have carried on experimenting with the wonderful Midjourney app on Discord. My previous post had some images and a degree of explanation, but as a reminder: this is one of the new generation of publicly available AI-based image generators that work from text. Just watching the flow of requests and the range of styles and images that appear constantly is fascinating. It generates four images for each text request, which you can then curate and push for further generation of more images similar to the one you pick. It can also take a web image not generated by it as inspiration.

This, however, made me chortle, then stop to ponder, then simply say wow. I said /imagine Lego John Wick – that is all I asked

Lego John Wick

Look at the variety of styles: from the top right and bottom left as actual minifigs with shadows, to a stylised top left looking as if he is paid in Lego pieces not gold, and bottom right an angular representation but a cool image.

Then of course that got me going onto the obvious /imagine Lego predator

Lego Predator

This just looks great, lots of Predator variations, which, as has happened with the films, fits the storyline of there being multiple tribes. (BTW, watch Prey, the latest Predator movie, on Disney+/Hulu, it is really good!)

Once again Midjourney hit me with the range of styles it delves into. I moved off the Lego thing for a while and asked to see Laurel and Hardy breakdancing.

Laurel and Hardy breakdancing

This is a second generation image from a set of already very good images. Laurel and Hardy are of course well known, but there are no breakdancing scenes to refer to. Also, it chose black and white, presumably because most of the source content was black and white. The quirky style of this, compared to the photo-realistic Lego minifigs and the stacks of other images, shows how amazingly versatile this approach is. You should be able to get a glimpse of the madness on my public facing page in Midjourney (if that works), so I can stop posting here all the time saying "look at this" 🙂 You may need to change the second-in-from-the-right navigation drop down to grids or all, as it seems to default to the large upscaled individual images.

Bring on virtual world/metaverse generation.

AI image generation – bots FTW

Firstly, yes, it's been a while since I posted here; I could have done with an AI clone of me carrying on sharing interesting things. I was out mainly due to an annoying post-ailment “thing” (medical term) hitting my facial nerves, which meant that for months any screen time trying to read or write was visually and mentally uncomfortable. That was followed by a dose of Covid that knocked me out for a while but also seems to have had a potentially positive impact on whatever else was going on. Anyway, I am back. Has anything happened in the past few months, I mean half a year?

I suddenly felt compelled to blog about some of the marvellous work going on with AI image generation of late, with things such as DALL-E and Midjourney becoming available to more people. I have so far only used Midjourney, which is accessed through Discord. In the relevant channel you simply type the command /imagine followed by any descriptive text you fancy and it will generate four images based on that, often with surprising results. Any of those images can be selected as the root to generate more in that style, or you can upscale, i.e. get a big version of, one you favour.

Handy hint: the mobile version of Discord really likes to scroll to new items, and Midjourney works by updating your post in the channel. As so many posts are being generated every few seconds, you might think you can leave the app on your post to see the results, but very quickly it will zoom up the screen. Constantly scrolling back to try and catch your images being generated is a bit of a pain. However, if you log into midjourney.com with your Discord ID it will present you with your collection and lots of links to get back and do things with them. The Discord search doesn't seem to work on user id etc., so it took me a while to figure this out.

After a few attempts, and with one eye on the political situation and crisis faced in the UK, I asked Midjourney for “an artist forced to stop painting in order to work in a bank” and got this wonderful set of images

an artist forced to stop painting in order to work in a bank

I also asked it to generate a protest by the people

a crowd of down trodden people protesting and rising up against the billionaires and corrupt politicians

Rather than continue the doom and gloom, it was time to go a bit more techie 🙂 I asked it to show me directing the metaverse from a control centre

epredator directing the Metaverse evolution from a hi-tech digital operations center

An obvious course of action was to explore my Reconfigure and Cont3xt novels and this is where I was very impressed by whatever is going on with this AI. I asked to see “a cyberpunk girl manipulating the physical world using an augmented reality digital twin” and it seems that Roisin (@Axelweight) is in there somewhere 🙂

a cyberpunk girl manipulating the physical world using an augmented reality digital twin

This was worth exploring, picking a couple of the images to then seed the generation of more like them, which produced these two

a cyberpunk girl manipulating the physical world using an augmented reality digital twin (generation 2 from bottom left of gen 1)

And this one

a cyberpunk girl manipulating the physical world using an augmented reality digital twin (generation 2 from top right of gen 1)

These were good, but the first version of the top right image in generation 1 was the one I did a more detailed version of on its own. It could be the cover of the third book, couldn't it?

Midjourney generated potential book cover for the third in the Reconfigure trilogy. Of course, one of these more geometric results is more in line with my current cover designs, from asking “A world of Fractal Iterations, Quantum Computing and strange side effects opened up. It appeared to offer a programming interface to everything around her.”
A world of Fractal Iterations, Quantum Computing and strange side effects opened up. It appeared to offer a programming interface to everything around her.

In case this all seems a bit too serious, I did ask a slightly weirder question of the AI and got this.

two ducks with a top hat on carrying a salmon

For a happy image, how about this: /imagine a bright green planet full of life

a bright green planet full of life

But back to the surreal and unusual. Predlet 1.0 and I share a joke we “invented” when she was quite young. We were passing the pub in Lock's Heath, having been to the supermarket, when she looked at the pavement and saw a creature. She said “dad look there's a worm coming out of the pub”, to which I replied “and he is legless”, and here it is

a worm legless outside a pub

As this kind of AI generation evolves from the static image to the moving one, and then on to virtual worlds and the metaverse, what a wonderfully weird set of worlds we will be able to dream/nightmare up 🙂

The Metaverse is calling again – think bigger

As many people who pop along here will know, back in 2006 (and even before that) I was somewhat passionate about explaining and sharing what can be done with things like game technology and virtual worlds to help with human communication; that includes entertainment (which is often regarded as not a serious business), office type business, education, industrial digital twins, the list goes on. Metaverse is a handy word to bundle all this together and, whilst not tightly defined, I think many of us know what we mean. This month I wrote a piece for the BCS which is available to read here, and they have nicely highlighted the phrase “Travelling and actual teleportation aside, do you think that we should be able to find better ways to feel we are with people and to have shared experiences?” Whatever you think the Metaverse is evolving to, just consider whether the way you are doing things now is the best it could possibly be given the tech available. Things evolve and move, not always replacing everything behind them but adding.

Having said that, it is worth reminding everyone that the original work in 2006 in Second Life was mostly about pulling data from other sources into the environment and doing things with it; this eventually led to building Wimbledon's 2006 court with the ball trajectory from Hawkeye rendered there.

All about data from elsewhere integrating into SL in 2006

There is an ongoing discussion about interoperability of Metaverse or virtual world platforms. It ranges from whether it is desirable to have objects from world x appear in world y, and then often hits the barrier that platform x has a completely different rendering engine to platform y. The former is a style and even a business question: why does an environment of any type exist, real, virtual, augmented, projected, whatever? The latter leads to the technical wars around content being “in” a development or build environment. Importing meshes, textures, animation rigs etc. is one way to look at Metaverse evolution, but I think it is also worth considering the potential for things like remote rendering. If the unifying standard for content sharing was actually containers, of potentially any size or shape and level of detail, then anything from environment x could appear in environment y and even interact with it to varying degrees depending on permissions or needs. My Nvidia RTX ray traced shiny car could appear in the most basic of 3D environments because I am rendering the car at my end, and the virtual world is just processing a container and a stream. This is after all what we do when we video chat; each of our cameras is doing some work and being hosted by a basic integration layer. Yes, I know 3D and collisions and physics etc. all make it more complex, but it is worth considering that other ways of working can exist alongside the build and compile we are used to from games. Also, from what I remember of Snow Crash, avatars are rendered by the person's own device, so those with worse computing power have more basic avatars in world. And we already did some of this back in 06/07, it just takes a different way of thinking.
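
To make the container idea a little more concrete, here is a rough sketch, entirely my own illustration rather than any existing standard: a descriptor a host world might accept, saying where remotely rendered content should appear, what stream supplies it, and what it is allowed to do. All of the names and fields are assumptions for illustration.

```typescript
// A sketch of the "container" idea above: instead of importing meshes and
// materials, a host world accepts an opaque container that describes the
// volume the remotely rendered content occupies, the stream that supplies
// its pixels/geometry, and its permissions in the host environment.

interface RemoteContainer {
  id: string;
  owner: string;                         // who is rendering the content at their end
  bounds: { center: [number, number, number]; size: [number, number, number] };
  levelOfDetail: "billboard" | "depth-sprite" | "full-geometry";
  streamUrl: string;                     // e.g. a pixel + depth stream from the owner's GPU
  permissions: { canCollide: boolean; canEmitSound: boolean; canReceiveInput: boolean };
}

// A very simple host-side check: admit a container only if the world allows
// what the container is asking to do.
function admitContainer(c: RemoteContainer, worldAllowsCollisions: boolean): boolean {
  if (c.permissions.canCollide && !worldAllowsCollisions) return false;
  return true;
}

// My ray-traced car arrives as a depth-sprite stream; the basic host world
// just composites it into the scene and ignores physics entirely.
const shinyCar: RemoteContainer = {
  id: "rtx-car-01",
  owner: "epredator",
  bounds: { center: [0, 0.7, 5], size: [4.5, 1.5, 2] },
  levelOfDetail: "depth-sprite",
  streamUrl: "https://example.invalid/streams/rtx-car-01",
  permissions: { canCollide: false, canEmitSound: true, canReceiveInput: false },
};

console.log(admitContainer(shinyCar, false)); // true -> visible, but not part of the physics
```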

Talking of thinking, literally, I was invited to join the Games at Work dot Biz podcast again and I was more than happy to give a shout out to a book by Prof Richard Bartle on virtual worlds and our god-like powers over them. Check out the podcast, links are here and on any player you have, and give it a rating to help the guys too please.

Stunning Matrix Unreal 5 Engine demo

There is a lot of hype around the new Matrix movie, but today a demo/experience rendered live in Unreal 5 on Xbox Series X and PS5 went live. The intro movie is fantastic; sometimes you can't tell real from virtual, other times you can, but that often feels deliberate, toying with the uncanny valley.

If you don't have a next-gen console then there is a video, and it is well worth watching, remembering that this is rendered live in Unreal 5, not a pre-canned cut scene.

Many of us in tech have an affinity with the Matrix. I remember the first one, sat in the offices in Hursley building early web sites and shops, engaged in the start of digital transformation. The free character-drop screen savers were everywhere, and even on the old clunky web we had clips to watch, “I Know Kung Fu” etc. Many of us were gamers too, so this blurring of physical and digital was really appealing. I even bought the soundtrack, not on old fashioned CD but on MiniDisc 🙂 The Matrix is always pushing the envelope and, as this video shows, they are still at it! Metaverse FTW 🙂

Matrix Unreal 5

Forza Horizon 5 – Mexico here we come

What a bumper time for games it is, and on 9th November the latest of my all-time favourite racing games, Forza Horizon, arrived on the Xbox. The games have always had a large free-roaming area to collect, drive, race and photograph some fantastic cars. Each generation gets more jaw dropping as it pushes the limits of what can be done with the graphics, sound and physics. Number 4 in the series was set in the UK, building a huge map that merged many key locations and cities into a manageable terrain. Number 5 has moved to Mexico and even more stunning scenes await there. The game is bundled into Microsoft Xbox Game Pass, but being a fan I bought the full version and all the expansions, knowing full well that I will be playing this on and off for years.

The beauty of Forza Horizon is that “where we are going we don't need roads”. There are roads and tracks and you can just stick to those, but you can also tear across the landscape in even the most unsuitable car possible. This may seem at odds with the precision feel of the driving simulation, but it always manages to walk that line between arcade lunacy and pure driving simulator. Why would you go tearing around the scenery? Well, it's for views like this.

Mexico desert and volcano

The scenery alters across many different biomes; the time of day and weather also make for great variations, and the game itself switches between spring, summer, autumn and winter every week.

Also of note are the radio stations, prerecorded playlists of rock, dance and classical music with wonderful DJ voice-overs that often refer back to things that are happening, such as events just raced or about to come up. It is enjoyable to just pick a car and casually (or otherwise) drive around the place, seeing the sights and soaking up the tunes.

A feature of Forza that I always share and enthuse about is the custom paint and decals for the cars. The game has managed to preserve artwork across each generation, so effort in creating these is not wasted. I use the decals as a demonstration of in-game advertising and support, with both my martial art and my Reconfigure books represented, along with a stylised predator face based on the Wii Mii version. These designs appear with my car in other people's games and can also be downloaded, though there are some much better painters doing some great work to check out.

Scooby, Choi, Reconfigure
Doughnuts

There are cars from 3 wheelers to high end supercars, over 500 different models and all hyper detailed in and out. There are drag races, giant leaps and even houses to buy.

Drag race
Leap
House

There are lots of online multiplayer angles to join in on, but I really enjoy driving against the Drivatars of friends and family. Here, a single player race is populated by names and cars of people you know driving to some degree in the way they would drive if they were there. This degree of personalisation is always great fun.

If you are looking at these shots and thinking it all looks too shiny to be real, well, I saw this vehicle parked outside a London train station on a recent trip (the first in a long time).

Shiny

I have already blasted through many of the races and levels, the last game was one of the few I have level capped and “prestiged” on. That really just comes with lots of time more than any particular skill, though doing well in races, performing superb stunts, drifting, speeding and crashing the scenery all ramp up the levels more quickly.

It is also great that I can play this on PC and on Xbox Series X, but it is now also available as a cloud game. It's just a pity that our communication infrastructure on the train lines is not up to using that, but it's still good if other members of the family are playing something else at home.

See you at the volcano

Mountain lake

Meta, Metaverse, Me

Well, what an interesting day yesterday was, as Facebook went all out on being a Metaverse company and renamed itself to Meta. It also marked a point that crosses my straight, unbiased industry analyst day job with my constant position as a metaverse evangelist, a title I had in 2006 and maintain. These comments are mine alone and not those of the company I work for. I have lots more to share and discuss on all things Metaverse, but that's something that people have to pay for nowadays.

However, I will say that I was really pleased to see a slick presentation on all things Metaverse from a multi-billion dollar company like Facebook. Last year the same happened to some degree with Microsoft and Mesh at Ignite. The conversation has moved, for many people, from “nah, we don't want to interact like that ever” to “no one company should own all this”. The Metaverse the industry and society needs does, as was said in the Meta keynote, need to let people connect with both the physical and multiple virtual realities, for a multitude of reasons and with many available ways to be who we are. I have, for many years, talked about how we have had to adjust to the limitations of technology to allow our communication at distance. We still type on QWERTY keyboards, a layout designed in part to stop physical typewriter hammers jamming by spacing out the commonly used letters in words. Not that we need to get rid of this form of typing, but to hold it up as the only way to interact with technology and data seems a little bit odd, doesn't it?

Back a decade or so ago I created a slide for a presentation on virtual worlds and the metaverse that was aimed at showing how far we might be able to go and why. It was, in part, a presentation about the fact that it is not all avatars and islands. Those work, but there are many more things that come together. There are many forms of input, including neural interfaces (see below at the left); there are combinations of virtual to virtual, virtual to real (AR) and, as I said at the end, virtual to real to real, which is virtual objects becoming physical again (3D printing etc.). It is about the connecting of people, a wetware grid, but threaded by narrative and context. The Meta presentation often showed these as quick interactions, throwing a basketball, looking at a design etc., but there is of course a longer thread, a digital thread if you like. It's all very fractal in its nature; the patterns repeat at any scale.

One thing to note though is that everything feeds back into the entire loop; the arrows are multi-directional. You might 3D print a “device” or prop you found in a virtual world interaction, but that itself might be an input device for another experience. Imagine if you 3D printed a modelling clay tool you bought in a virtual world, then made a clay sculpture that you took a photo of for Flickr, or even scanned the sculpture in and made it a virtual object, a one-off that you sold on an NFT marketplace, which then saw it used as a prop in a virtual world movie, and so on. I used to talk about printing off a penny whistle and joining in with a virtual band at an Irish pub in Second Life. I hope you get the gist.

One key thing though, and it is slightly annoying that Facebook have nicked the word, but as you can see below, the entire loop is the meta loop (c. 2009).

Describing the Meta loop in 2009 – It's not all avatars and worlds/islands

I know many of us old guard, or original gangsters, however OG you want to be, might be annoyed at the mega corp going Meta on us and riding high on our hard work persuading the world this is a good idea and the way we are all going. However, it is the way we are going, so we get to say “I told you so”, rather than the multi-billionaires. Money clearly isn't everything 🙂 We, and I include all my fellow Eightbar people and the surrounding community from back in 2006, were already riding on top of some Double OG work. It's the nature of waves of societal acceptance and tech development. The hope is that businesses and experiences that are born of the Metaverse will thrive and bring entertainment, education, excitement and benefit to the world. By metaverse business I mean not the ones building the underlying tech platforms, but the ones thriving on top of them, which then go on to influence the wave after this one, just as Facebook and co are today. It's certainly not going away, so my son's birth certificate from 14 years ago saying father's occupation “Metaverse Evangelist” might turn out to be of use to him someday.

I mentioned above that the physical world is an important part of the metaverse, and in the Facebook/Meta pitch there was an instrumented house where the mugs and teapot that were carried in real life updated their position in the digital model. Well, that the other way around is the premise of my Reconfigure and Cont3xt novels. They are very metaverse, IoT, Quantum, AR and VR related, but our hero Roisin gets to manipulate the physical world by altering the digital world. It works for an adventure, but it also keeps the “sci-fi set this afternoon” concept alive, and it is not yet all done and dusted 🙂 Go take a look, it's very reasonably priced and might spark a few ideas on the back of the future that Meta are pushing. More here, or see the ads on the right.

What’s all this animated dancing in games (revisited) – BCS presentation

The recording of my BCS animation and games pitch is now live on the site and on YouTube. Unfortunately only the live audience got to see the larger avatar I was using to deliver this pitch, as the Zoom presentation feed is only of the shared screen, with a tiny top-left insert of the avatar and my dulcet and somewhat stuttering tones. It took a few minutes to get revved up, and to many I might be stating the obvious, but it is surprising how many people don't get the importance of user generated content and/or expression in virtual worlds and games.

The official site is here, but I have added the YouTube video below to save a click or two.

Metaverse Mozart t-shirts for invisible people

As you may have noticed, I have been in and around virtual worlds and the metaverse for a long while. I have spent decades helping people understand the basics of virtual worlds and the benefits they bring to us all. The pandemic has highlighted the need to do some things differently, so it is not a surprise that Metaverse is trending in all sorts of places once again. You don't have to look far to find replays of the ebullient money man from US TV, Jim Cramer, “describing” the Metaverse in such an odd mixed-up set of keywords and company names that it had me crying with laughter and with sadness.

Just to follow the stream of blurb he went through: I would only need 10 minutes to explain it all to him, as would any of us, or even better he could just log on to a virtual world, any one of them, and talk to someone, experience something. I was reminded of the conversations about the internet, the web, ecommerce, social media and so many other emerging technologies. We always get to this stage of apparently influential people getting it so wrong yet bringing so much attention to something.

To be fair, we all have to learn things, but maybe we shouldn't talk about them to a large TV audience in the context of financial investments until we do understand the words. I would be one of many happy to explain what the vision of the metaverse is, what the practical implementation of that is today, and why his description contains some of the right words but not in the right order or context. Also… it's not all about NFTs either!

He said (slightly paraphrased):

“It’s Unity with an oculus looking at a t-shirt its Nvidia?”

“It’s a hologram”

“You go into a room to listen to Mozart but listen to Beethoven and these people don’t exist”

I hope my virtual world descriptions on TV (of which there are a few), and possibly my explanations of the relationship between virtual worlds that form a proper metaverse, are not quite on this level. As on my TV showreel page, the OpenSim conversation from 2011 is added here for completeness. Yes, a decade ago on Saturday morning kids' TV, a live virtual world. C'mon people, let's do this thing!

Virtual worlds in 2011

Meanwhile I am off to google something up on the inter webs.