A new martial arts journey – Tai Chi

I have been training in and teaching Choi Kwang Do as a defensive martial art and health-improving activity for many years, and was thrilled to achieve my first black belt in it at the end of 2014. For a long while that journey was shared with the whole family: we all trained, taught and graded together.

In 2019 I was getting ready for my 3rd Dan grading when, in an unrelated incident, I managed to get a concussion slipping over in the kitchen while making Easter dinner. That made me ease up on training for a good while as I recovered. Just as I got back to full-time training and classes at the end of 2019, the world took its own hit as we headed into the COVID lockdown years. I continued to train at home though; my various CKD routines and a friendly BOB for impact training became a regular part of life. I wasn't so keen on Zoom call training, as all my work was Zoom based and I wanted to get away from that.

A few other health issues also slowed the training down; a bout of COVID hit me with longer-term balance and focus issues, on and off, for a year or two. In trying to sort this out I saw a few doctors and ENT specialists, but it was all a bit off the normal diagnostic path, despite being utterly horrible and impactful at times. The result was that I thought I would try traditional Chinese acupuncture and sort my Qi (Chi) out. I started this a year ago now. In the first few months the impact was significantly positive, and it remains so. I was almost completely sorted out by September last year, but over New Year I ended up with flu/COVID/whatever and it kicked me back to where I had started. One trip to acupuncture and I was well on the road to normality again. During the course of the sessions other things, such as training muscle injuries, get rolled into the treatment and recovery is much quicker; my hay fever has also diminished to almost nothing. This got me considering what else I could do to maintain and improve this balance of energy. BTW, the place I get treatment is https://www.physicalbalance.com in Basingstoke with Carolyn, who is always fully booked up with patients from far and wide.

At the same time our local CKD school was winding down, as our school owner had a very exciting family opportunity to head to Australia. A few years ago I would have jumped at the chance to take over the school and continue, but I didn't feel in a position to do that. As everything happens for a reason, the acupuncture and my introduction to these energy flows got me looking around for Chinese martial arts in the area. Meanwhile, my daughter had been looking into something to try out as a Christmas present for me. She got the two of us trial lessons in Tai Chi at Shin-Gi-Tai martial arts in Basingstoke, and the serendipitous part is that it's just a few hundred metres from where I get my acupuncture, though otherwise unrelated.

We went for our Tai Chi lessons and it was really enjoyable, in a great martial arts facility with an incredibly experienced team of instructors across many forms of martial arts. I signed straight up in January and have been going a couple of times a week since. Tai Chi is the art of very slow, graceful movement, and it exists in many forms alongside QiGong. QiGong is the more health-focussed art, but Tai Chi is a martial art: whilst you learn slow, meditative and physically beneficial movements, they are also offensive and defensive forms. I had not originally taken up CKD to learn to fight, but to learn to defend if need be, and for the mental and physical challenge. Nothing in CKD was about ego and macho competition, and walking into Shin-Gi-Tai felt very similar. Obviously Tai Chi tends to attract us older people, so we are less likely to want to show who is boss. There is a never-ending set of tweaks and improvements to be made and patterns to be learned, more than enough to keep anyone going. I have also found that when I still train at home in CKD (for a bit of cardio and fast movement) I try to apply some of what I am starting to feel in Tai Chi.

In case you are not convinced about the potential impact of Tai Chi as a martial art (just as some are not convinced by other martial arts, in a tit-for-tat, mine-is-better-than-yours way; silly, I know), this video shows a demonstration which, starting at around 11:10, covers the basics of the form I have been learning.

Tai Chi Application

Our teachers, as you can see here, have an incredible history and depth of martial arts experience across many forms. Bryan often shares how the application of some of the Tai Chi might play out, just as in the video from the US above, which for me helps contextualise it all. Though the real benefits I am enjoying are building on the energy flows from acupuncture and feeling an improvement in mental and physical well-being. It's great to have a slow, flowing set of moves to complement the also flowing and rounded ones of CKD (we have no lockouts in that art).

So there we have it, a new martial arts journey, but a complementary one. There are many more things to try at Shin-Gi-Tai too. Let's see where this goes 🙂

Consistent characters in Midjourney GenAI

It's been a while since I posted anything, for lots of reasons; that's an offline conversation. However, this weekend something appeared on my feeds that was just too exciting not to do a quick test with and then share, which I already have on Twitter, LinkedIn, Mastodon and Facebook, plus a bit of Instagram. So, many channels to look at!

Back in August 2022 I dived into using Midjourney's then new GenAI for generating images from text. It was a magical moment in tech for me, of which there have been few over my 50+ years of being a tech geek (34 of those professionally). The constant updates to the GenAI and the potential for creating things digitally (2D images, movies and eventually 3D metaverse content) have been exciting and interesting, but there were a few gaps in the initial creation and control, one of which just got filled.

Midjourney released its consistent character reference feature, which lets you point a text prompt at a reference image, in particular of a person, and then use that person as a base for what is generated. Normally you ask it to generate an image and try to describe the person in text or via a well-known person, but accurately describing someone makes for very long prompts. Any gamers who have used avatar builders with millions of possible combinations will appreciate that a simple sentence is not going to get these GenAIs to produce the same person twice. This matters if you are trying to tell a narrative, such as, oh I don't know… a sci-fi novel like Reconfigure? I had previously written about the fun of trying to generate a version of Roisin from the book in one scene and passing that to Runway.ml, where she came alive. That was just one scene, and trying any others would not have given me the same representation of Roisin in situ.

Midjourney experiments
Initial render in Midjourney some months ago

The image above was the one I pushed to Runway.ml to see what would happen, and I was very surprised how well it all worked. However, on Friday I pointed the --cref tag in Midjourney at this image and asked for some very basic prompts related to the book. This is what I got:

Another hacking version this time in a computer room not a forest
A more active shot involving a car
A different style, running at the car but similar to the previous one
Looking to buy some Marmite (a key attribute in the story and life of Roisin)
Another marmite shopping version
A more illustrative style in the snow (sleeves are down)
Another snow stylistic approach (notice sleeve up and tattoos)
A more videogame/cell shaded look to the computer use in the bunker.

As you can see, it is very close to being the same person, in the same clothing, across different styles, and these were all very short prompts. With more care, attention and curation they could be made even closer to one another. Obviously a bit of uncanny valley may kick in, but as a way to get a storyboard, source material for a GenAI movie, or a graphic novel, this is a great innovation to assist my own creative process; there is no other way for me to do this without a huge budget from a Netflix series or a larger publisher. When I wrote the books I saw the scenes and pictures vividly in my head. These are not exactly the same, but they are in the spirit of those.
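For anyone curious what these prompts look like mechanically, here is a minimal sketch of assembling one. The `--cref` (character reference image) and `--cw` (character weight, 0-100) parameters are Midjourney's actual flags for this feature, and the finished string is what you would paste into Discord or the web app; the scene text and image URL below are made up for illustration.

```python
# Sketch: building a Midjourney character-reference prompt string.
# --cref points at a reference image URL; --cw controls how strictly
# the character (face, hair, clothing) is matched, 100 being strictest.
def build_prompt(scene: str, ref_url: str, weight: int = 100) -> str:
    if not 0 <= weight <= 100:
        raise ValueError("character weight must be between 0 and 100")
    return f"{scene} --cref {ref_url} --cw {weight}"

# A hypothetical scene from the book, with a placeholder reference image.
prompt = build_prompt(
    "woman hacking on a laptop in a dark computer room, cinematic",
    "https://example.com/roisin.png",
)
print(prompt)
```

The point is simply that the scene description stays short: the heavy lifting of keeping the character consistent is done by the reference image, not by paragraphs of physical description.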

Metaverse and GenAI webinar for BCS

This month was the AGM for the BCS Animation and Games specialist group that I have been chairing for a very long while now. I gave a presentation from a personal viewpoint (this is not a work presentation, and I make that clear in the disclaimers, though it is what I work in too of course) on the advances in Metaverse and GenAI content creation. The full YouTube version is below, and the link to the blurb and bio (and the video) at the BCS is here.

We are always looking for presenters to come and share ideas with our specialist group around all things games, animation, metaverse, esports etc., so if you are interested, ping me; there is a slot waiting for you. We sometimes get a big crowd, other times smaller ones, but with the videos published like this it can be a useful thing to do and share.

For those of you who don't know, BCS, The Chartered Institute for IT (formerly the British Computer Society), is a UK-based professional body (with worldwide membership) for anyone in the tech industry. It exists at all levels, from those just getting going in the business to Fellows with vast amounts of experience and a willingness to help. It was part of my professional certification whilst at IBM, where I then also became a certifier. Volunteering and sharing ideas, such as this presentation, is one of the many ways to get involved (you don't have to do this). It benefits you as an individual but also elevates tech roles within the enterprises and organizations you work in.

You can find more at BCS, The Chartered Institute for IT (bcs.org)

Meta Quest 3 – seems very good

Firstly, yes, it has been a while since I posted here, but that's the challenge when work life and social/hobby life are one and the same: I have to save all the really good stuff for my professional output. However, I bought myself what is my 16th VR or AR headset, the Meta Quest 3, and as a long-time user of such kit I wanted to share my personal (and only personal) thoughts about it.

Quest 3
Quest 3 box
Quest 3
Headset and controllers
Quest 3
Looking like the Aliens from Toy Story 3 with 3 eyes wearing a Quest 3

Our Quest 2 has been kept in our kitchen, ready for anyone in the family to use, since it arrived. The kitchen is the only place with just about enough free space to wave arms and make the odd step safely in VR. Though, as I may have explained before, there is a dent in the fridge from a side fist I threw playing Superhot VR when I was training for my 2nd Degree Black Belt in Choi Kwang Do. The Quest 2 went straight to VR, obscuring the world (that was its job after all), but the guardian that we manually drew around the kitchen (often repeating that process) tended to keep us in bounds. Picking up the controllers and getting them on the right hands the right way round was always a bit fiddly, like plugging in a USB A connector and getting it wrong the first two times despite there only being two ways to do it.

The Quest 3, on the other hand, starts in colour pass-through. You put it on (over glasses too, as they fit inside the headset) and just see a slightly blurry version of where you actually are; it's much quicker to get in and stand in the right place. (The Quest 2 has pass-through, but it's very disorientating and black and white.) The new hand controllers are much easier to see and to pick up the right way, as they don't have the massive halo loops of plastic around them for sensors. The headset also sets up the guardian and scans around for obstacles on its own; you don't have to (though you can) draw a line around yourself, which on the Quest 2 seemed like some sort of weird religious or mystical incantation involving a border of salt.

The Quest 3 is also lighter and smaller than the 2, or at least feels like it; I haven't weighed them. This makes just getting going much less faff and bother. It is a similar leap to the one from the days of having to place external sensors and line everything up before playing anything.

Going into a VR game or application is pretty much the same across headsets, though it is noticeably crisper and faster/flashier, as all new kit tends to be. Games in particular are now being upgraded to have richer visuals, which in turn will start slowing it all down again. I plumped for the lower-spec 128GB of storage, as mostly the family plays Beat Saber and Pistol Whip, but I now have some games that seem to be 13GB, so it will get eaten up more quickly now.

The Mixed Reality (MR) elements of the Quest 3, blending the pass-through view of the world with digital elements anchored in that view, are the clever bit. This is augmented reality, but calling it mixed reality is just as accurate. The view you see is virtual: it's a camera showing you the physical world, and the headset is able to scan the environment and build a digital mesh of it, knowing which parts of the view are walls, surfaces and objects. The First Steps example app asks you to look around, and you see it overlay what it considers the mesh of the world. It's very Matrix-like, and deliberately visual; there is not really any need to show the user other than to intrigue them. The demo app then causes a break in the ceiling, with bits falling on the floor, giving a view of a space environment and another planet in the distance. Through this hole drops a spaceship that lands (in my case on the kitchen floor). Being in pass-through makes it easy to walk around without bumping into physical things, as you can see them, so you can get a better look up into the hole in the ceiling. Over the course of the demo, tribble-like fluffy things break through the walls and run around your space; they drop behind things like the breakfast bar in our kitchen and you can't see them unless you go and look over. The creatures also run along the walls and other surfaces. It really is quite effective as a cartoon comedy demo of full AR. As this is not holding up a tablet or phone, the field of view and the turning of your own head, and hence view, is 100% natural.

The wonderful old painting application Vermillion has had some upgrades, so now, in full MR, the easel and paintbrushes can be in your own room, and even better, you can hang your paintings on your physical wall and they will stay there if you want them to. You can see family walking into the room and talk to them, though it's a little odd for them not to know whether you can see them or not, having been used to giving the VR headset a wide berth (as the fridge didn't, above) 🙂

This Quest 3 for the kitchen liberates my Quest 2 for my gaming PC, which in turn retires the tethered Rift S. It also means two of us can play multiplayer games in the house. The main account can share its apps (though only on one headset). I shared the Quest 3 apps and unshared them on the Quest 2, and we were able (with me logged into the Quest 2 and Predlet 2.0 using his account on the Quest 3) to play a good few rounds of Walkabout Mini Golf in VR and some multiplayer Beat Saber. I say a good few rounds, but the battery life is pretty short on both headsets. This can be sorted by attaching a very long USB cable to a power outlet, but that sort of removes the untethered nature of the headset.

The Quest 3 and its pass-through are obviously a much cheaper version of what the Apple Vision Pro is aiming to do, but it's out first, it works, and it has a library of interesting things to engage with. Though the really new stuff, like Assassin's Creed and Asgard's Wrath II (Remake), was not out at release, so for the moment it's more for playing VR things and the odd MR update. I say out first, but pass-through has been the mission of the commercially available Varjo headsets for a while.

One other thing pass-through allows is being able to see your phone (sort of, as it's a bit warped). This is very useful, as trying to buy things in the store in VR doesn't work: it needs the phone app to do all the verification. This used to mean taking the headset off; now it means double-tapping the headset for pass-through and finding your phone. That's a silly use case, as it should just work in the environment, or represent the phone and its OS in the environment, but that's payment rails for you.

In short, it's good, worth it, and I like it. I am looking forward to more proper AR/MR apps and experiences, whilst enjoying brand new ones like Samba De Amigo that just launched (again, not new, as we played that a lot on the Sega Dreamcast with a set of maraca controllers way back)… and pose… Happy memories reborn 🙂

Reconfigure – The Movie, nearly

Having written my books Reconfigure and Cont3xt, I am always using them and the story I created as a way to explore other mediums. I started drawing a few years ago to see if I could maybe go manga with it; that included buying some action-figure mini drawing mannequins that have some tech and weapon props. I created some MetaHumans and have a Unity-based mocap suit from a Kickstarter to see if I could produce anime. Audio voice generation has been another thing I tried. Each of these projects would take a significant amount of time, effort and probably cost too, but delving in as a mini proof of concept gives me an appreciation of what would be needed.

Midjourney and the AI image generators have offered another option to explore, and very quickly too. One of the challenges is getting the same character to appear in each generated image (i.e. a storyboard). I decided to sort that out later in this POC, though I did suggest that the actress to play Roisin should be Ruby Rose, to get a potentially consistent likeness.

Midjourney experiments
Roisin at the start of Reconfigure in a bit of a last minute situation getting off the grid

Having got this image, which was pretty good for a first attempt with very little text in the prompt, I popped over to Runway.ml and tried its new Gen-2 image-to-video generation, this time with NO PROMPT at all! It decided she should look up and stare into the distance, which in the book she is doing, as she is being chased and very aware of her surroundings while she tries to outwit those after the amazing tech she has discovered. They only generate short clips on the free accounts, but those few seconds I found quite astounding and bewitching.

GenAI is obviously on a major drive into all sorts of areas, but the ability to use tools like this to create a film version of my books, on my own time and budget, is emerging as a strong contender, unless Netflix or Prime decide they would like to do it instead 🙂

EyeBlend better than an Apple Vision Pro – still

In 2016 I wrote Cont3xt, the follow-up to Reconfigure, and part of that was an upgrade our lead character gets. She has access to an API that can manipulate the physical world like the virtual; in the first book she builds that up into a Unity application for her smartphone. In the second she gets an EyeBlend headset, state of the art (and still just a little ahead of Apple and co's attempts). So she ports the Unity app and upgrades the messaging system for this high-bandwidth device and to enhance the experience, rather like all the developers are doing right at the moment. Life imitates art 🙂 Read Reconfigure first, but this might be a topical time to check out Cont3xt whilst you wait for the Apple demo in a local store next year. (Just £0.99 each for these adventures 🙂 )

“An EyeBlend? They are not on general release yet? How did you get that?” Her eyes lit up wildly as Roisin took in the news.

“Oh you know, I have a few friends who hack for a living.” Alex replied with an air of comedy superiority.

“You sure do! The EyeBlend is an awesome piece of kit. It does full blended reality applications. It uses field effect projection with live object tracking!” She said enthusiastically. “They call it a hologram, though it really isn’t as such. You can forgive that though as the effect is the same, apparently.” Roisin was on her tech turf again.

Pixel clothing – it’s back (again)

I saw some links the other day to a new clothing and bag range at Loewe that made me think: hang on, I have seen that somewhere before 🙂 I'm not posting a picture because it would be their copyright image, so just go have a look instead.

This range is physical, but made to look pixelated and game-like, with chunky, stepped borders and gradient colouring. It makes something real look a bit like it's out of Minecraft. There have been a few things like it before, such as shades to match the Thug Life internet meme, but this is proper expensive clothing, Zoolander territory.

The thing is that in 2016, as I have tweeted and popped onto LinkedIn, I wrote a little piece of detail about Roisin in Reconfigure, to enhance her gamer geek chic status, that said the following:

“She slipped on her red Converse shoes and her grey SuperDry hoody. The phone just squeezed into her jeans front pocket. She picked up her odd cartoon looking bag, full of all the essentials she might need, lip gloss, money, cards, keys, moleskin notebook, hand wipes. The handbag was a regular size but looked slightly larger due to the cartoon cel shaded look that it had. The front panel looked like a 2d drawing of a 3d object in a video game. Thick black lines added around the primary coloured sections.”

Now I can't claim to have invented this fashion range, because as a writer with a story set in the present day I drew upon lots of things I had already seen or experienced, i.e. write what you know. However, I am really happy that Roisin's look from 2016 is popping up again in 2023. In the same way, my first ever public presentation when I started as an industry analyst with 451 Research back in 2016 showed the direction towards an autonomous enterprise, such as we are seeing some people create with AutoGPT/ChatGPT and the like right now.

Don't even get me started on the metaverse 🙂

Marvel Midnight Suns – hooked on it

I play a lot of games, of a lot of different types and genres, and have done for too many years (45 of them at least). I know that driving games always attract my attention; I am many hours into the current Forza Horizon and all the preceding Forzas of all types. I like a good free-roam RPG: obviously GTA5 rules the roost there, but No Man's Sky also scratches that itch. However, I also really like turn-based combat games, many of which have been Switch-based of late, though good old XCOM used to be the one to drop into. I also really like Fights in Tight Spaces for a quick blast of card-based gameplay with great visuals. These turn-based games suit us older, slower gamers. I do still enjoy run-and-gun FPS like COD, but in multiplayer those tiny milliseconds of thought, wading through years of experience to get to a trigger pull on a target, don't work.

At Christmas I got Marvel Midnight Suns on the Xbox Series X, thinking it would be an interesting, quick, enjoyable game to have a go at. It is most of those things, but it certainly has not been quick. In fact, since Christmas '22 I have not played anything else (10 minutes on Forza doesn't count). It has properly dragged me in with its turn-based card/resource battles, over-the-top cartoon visuals, stacks of side missions to play just to level up, a massive storyline and a mini world to explore on foot in between battles. All the superheroes are in it (well, not all, but an awful lot!), and the lead character is my very own superhero. There is also a stack of costume and room customisation, collectables, research of upgrades and weird little friendship-building conversations and gift-giving (I did say that was weird).

The heart of the game uses individual superhero abilities, expressed in a small deck of cards, to say what a move can do or the impact it has. Some cards generate heroism as a resource, others spend it. You get to play 3 cards each turn (or more with some levelled-up modifiers) out of a selection of cards depending on the 3 particular superheroes in the mission you are engaged with. There are also environment moves to throw things, blow up things and drop heavy objects on the bad guys. The camera work and the animations for all these battling effects get increasingly frenetic; e.g. Wolverine (once he joins through the story) is able to chain hits together on multiple enemies, often beginning with a frenzied battle cry and a flexing of his blades in traditional Wolverine fashion. It shares a lot with XCOM, given it's by Firaxis, but it doesn't have that infantry-style constant loss of the team; after all, in most cases superheroes live to fight another day. They do get hurt and damaged, and some attention is required to repair them though. Also in the deck there are special battle cards whose recipes are researched, learned, found etc., and various resources can be used to craft them. Battles often give an item that Tony Stark will analyse for you, a coil, which opens up a set of new cards, from which you pick a couple to add to the heroes you have taken on battles, or your personal deck. If you have more than one of the same card, they can be smashed together in the training yard to create more powerful attacks. It goes on and on 🙂
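For anyone who likes to see a mechanic written down, the heroism economy above boils down to a simple generate-then-spend loop. This is a toy sketch of that loop only: the card names, costs and the class structure are all my own invention for illustration, not the game's actual data or rules.

```python
# Toy model of the heroism economy: some cards build heroism when
# played, others spend it, and you cannot play a card you can't afford.
from dataclasses import dataclass

@dataclass
class Card:
    name: str
    heroism_cost: int  # negative means the card *generates* heroism

class Battle:
    def __init__(self) -> None:
        self.heroism = 0

    def can_play(self, card: Card) -> bool:
        return self.heroism >= card.heroism_cost

    def play(self, card: Card) -> None:
        if not self.can_play(card):
            raise ValueError(f"not enough heroism for {card.name}")
        self.heroism -= card.heroism_cost  # subtracting a negative adds

battle = Battle()
battle.play(Card("Quick Strike", -2))  # a basic attack: builds 2 heroism
battle.play(Card("Haymaker", 2))       # a heroic card: spends those 2
print(battle.heroism)  # 0
```

The interesting design consequence, which the real game exploits heavily, is that the cheap resource-building cards pace the battle: you have to earn the big flashy moves turn by turn rather than opening with them.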

At the start of most missions, once you pick the 3 heroes, there is a short slow-motion walk-to-camera scene, with Avengers-style music playing, that I have yet to skip, as it's very much a "hell yeah!" moment. Examples are below, with different versions of my hero's costume in the centre.

Wanda, Hunter and Spider Man
Iron Man, Hunter and Captain America

I am about 60 hours into this game, and much of that is enjoying the levelling up in side battles, or finding good player combinations; e.g. Cap is very much a defensive tank, digging in with block or taunting enemies, while Spider-Man is a bounce-around, hit-everyone character. There are also lots of magic- and environment-focussed characters: Ghost Rider does a lot of damage, but also to himself, and Blade grabs energy and makes bad guys bleed for constant damage.

The enemy roster gets quite complex too. Whilst it starts with soldiers and big guys with shields, I now face things that split in two when attacked at all, or spiritually linked demons that you have to kill all in one turn or they come back, resurrecting one another. There are also some huge bosses to deal with, not to mention (not really a spoiler) a corrupted Sabretooth and Hulk.

Different battles have different objectives, and sometimes you can get away with just focussing on the objective, e.g. hack all three terminals before the server gets beaten to a pulp; you don't always have to clear the bad guys, so an otherwise encounter-ending move might be a game-saving one. Other times it is a take-them-all-out situation.

There is some DLC heading to the game this weekend, I think, with Deadpool. So sorry, any other games vying for my attention at the moment; I am busy saving the world 🙂

What does it take to make the long running Games At Work dot Biz podcast or any other for that matter?

I had the honour of hosting my good friends from Gamesatwork.biz for a BCS Animation and Games event, to hear about how they have managed to create compelling content in a weekly podcast that is now approaching 400 episodes! The official BCS event page, with links to everything else about the BCS, is here, but I have embedded the YouTube video directly. It's very free-form, not scripted, and it was entertaining to be the interviewer, especially as I know Michael, Michael and Andy just know what to say and when to say it.

BCS Animation and Games hosts Games at work dot biz team for a free form chat about the show

As the guys explain, a lot of the content is based on us all sending them things we are spotting, or are busy doing and enjoying. All the podcast episodes have their own show notes and links, so after watching this go check those out and see the sort of thing they do, or vice versa, and then start sharing stuff with them; the more the merrier. Also, if you are in tech, check out the BCS as a professional organisation.

AI Blackadder sketch about Blackadder writing a Blackadder sketch

OK, so I tried something a little recursive in nature to see what the ChatGPT AI would do, and I think this is very much in the right style, with a surprising little rant in it about the process of writing a sitcom. I like this better than the first Blackadder sketch I had in the previous post.

write a Blackadder sketch of Blackadder writing a Blackadder sketch

Baldrick: My lord, what are you doing?

Blackadder: I am trying to write a sketch for our next Blackadder episode, Baldrick.

Baldrick: But my lord, you are Blackadder. Why are you writing a sketch about Blackadder writing a sketch?

Blackadder: Because, Baldrick, it is a clever and meta way to poke fun at the absurdity of the situation.

Baldrick: Ah, I see. But what is the sketch about, my lord?

Blackadder: It is about Blackadder sitting at his desk, trying to come up with a sketch for the next Blackadder episode. But every idea he comes up with is met with objections from the other characters.

Baldrick: Like what, my lord?

Blackadder: Well, for example, I might come up with an idea for a sketch where Blackadder and I go on a quest to find the Holy Grail. But then Prince George would object, saying that we already did that in a previous episode.

Baldrick: Oh, I see. And then what would happen, my lord?

Blackadder: Then I would come up with another idea, perhaps involving a time-traveling Blackadder going back to the Tudor era. But then Queenie would object, saying that we already did that in a previous episode as well.

Baldrick: So Blackadder can’t come up with any new ideas for sketches?

Blackadder: Exactly, Baldrick. It’s a satirical commentary on the creative challenges of writing for a long-running television show.

Baldrick: Oh, I get it now, my lord. That sounds like a very clever sketch.

Blackadder: Yes, Baldrick, I thought so too. But now I just need to figure out how to end it.

Baldrick: How about Blackadder giving up and going for a walk instead?

Blackadder: Hmm, not bad, Baldrick. Not bad at all. I think I’ll go with that. Thank you for your assistance.

Baldrick: Always happy to help, my lord.

Blackadder: Now let’s hope this sketch is a hit with the audience.

(Blackadder gets up from his desk and exits, with Baldrick following behind.)