Experiments with SL and green screen

I had not explored the options in iMovie much beyond the recent movie trailer additions and a little bit of picture in picture. Then I realized it has a green screen processor, which means it will composite one video over another, treating anything green as transparent so the second video shows through.
So I thought I would try something. This is just a render test rather than a finished article, but it shows the principle.
I built a green studio in Second Life, just some prims, full bright and green, placed my epredator avatar in the scene, pointed the camera at myself and recorded the result with SnapzPro.

Next I dropped the result in iMovie onto some of the Predlets movie trailer footage. I had to adjust the green screen crop but it actually worked!
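iMovie's own keyer is a black box, but the principle it applies is plain chroma keying. Here is a minimal sketch of the idea in Python with NumPy; the function name, key colour and threshold are my own illustrative choices, not anything from iMovie. Pixels close enough to pure green are swapped for the corresponding background pixels.

```python
import numpy as np

def chroma_key(foreground, background, key=(0, 255, 0), threshold=120):
    """Composite foreground over background, replacing pixels near the key colour.

    foreground, background: HxWx3 uint8 arrays of the same shape.
    """
    fg = foreground.astype(np.int16)
    # Euclidean distance of each pixel from the key colour (pure green here).
    dist = np.linalg.norm(fg - np.array(key, dtype=np.int16), axis=-1)
    mask = dist < threshold           # True where the green screen shows through
    out = foreground.copy()
    out[mask] = background[mask]      # keyed pixels take the background instead
    return out
```

A real keyer also softens the matte edges and suppresses green spill, which is roughly what the "adjust the green screen crop" control in iMovie is compensating for.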
I bought a new camera for the business today, an HD Kodak ZX3 waterproof minicam (like a Flip HD but more robust). The iPhone is handy for video, but I thought I needed something dedicated, if still homebrew. I popped a 16GB SD card in to give a good few hours of filming.
I will be giving it a test at the Gadget Show Live press day on Tuesday. As if there was not enough to see there already!

Immersive video live in 360 degrees

A long while back we had 360 degree panoramic webcams that let people direct what they see. Now of course we have Google Street View of the world, but I was still impressed with this 360 degree video of NYC that you can drag around.

According to this article, a live version of Immersive Media’s technology is being used for an on demand music service.
Of course, once you have live 360 degree video coverage you can still instrument that display and layer augmented reality over the top, which is an interesting twist I think.

It really is that simple, virtual world whiteboards and diagrams

As a first bite on technology, so you don't have to, I thought this subject would be a good place to start.
A common theme I often come across is the need to reproduce certain real world practices in a virtual world. As with all things related to mirror worlds, the assumption often starts from replicating exactly the way we currently do something. In some ways it is a challenge to virtual worlds to become as effortless as blackboard and chalk are in today's classrooms. A pencil and a piece of paper have done us proud over many years. There is a place for a simple pen outline or diagram: a chalk and talk to a sports team, a "hey guys, I have a great idea, look at this", and so on. Those work really well in real rooms, where people just grab whatever they have to hand, adjust to their surroundings, and draw and share. At the moment virtual worlds do make that quite tricky; intricate details take time. We have pens and touch tablets, screen sharing and so on, and it is of course possible to inject those into many virtual worlds in various ways.

However, I want people to go back and think this over. Even with the metaverse technologies we have today, we already have very effective and simple tools, often built in, that at almost no cost can be used to get a point across. More than one person can be present in the same place in a virtual environment, able to see the ideas form and mental placeholders move around them as the person explaining the idea manipulates the environment.

I produced this video as a live, one take event. The aim was to show that if I wanted to explain something, in this case it could even be a sports play of some sort, I could show my ideas with no frills and no fancy touches. The example uses Second Life or OpenSim (though this would apply anywhere you can rez things): creating objects, moving them around, assigning meaning through colour and position, and manipulating the environment live can all be done simply and to great effect.

The idea is to understand these basics and then be able to visualize the potential. With a little more effort these movements could be recorded and replayed, or interchanged with other services; the blocks could be replaced with more detailed, meaningful models; simpler interfaces could make moving the pieces even easier; weight, friction and physics could be brought in. Each of these enhances the basic feature: an environment that is adjustable and not constrained by the limits of a whiteboard, a piece of paper or a laptop presentation.
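To make the record-and-replay idea concrete, here is a hypothetical sketch in Python; none of these names are a real Second Life or OpenSim API. A recorder captures timestamped move events for named blocks, then replays them through any callback, preserving the original pacing.

```python
import time

class MoveRecorder:
    """Record (timestamp, block name, position) events and replay them in order."""

    def __init__(self):
        self.events = []

    def record(self, name, position):
        # Stamp each move with a monotonic clock so replay timing is reliable.
        self.events.append((time.monotonic(), name, position))

    def replay(self, apply_move, speed=1.0):
        """Call apply_move(name, position) for each event, spaced as recorded."""
        if not self.events:
            return
        previous = self.events[0][0]
        for stamp, name, position in self.events:
            time.sleep((stamp - previous) / speed)  # reproduce the original gap
            previous = stamp
            apply_move(name, position)
```

The callback is the interesting design choice: the same recording could drive an in-world object mover, a 2D diagram renderer, or an export to another service, which is exactly the kind of interchange suggested above.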

The takeaway is that there are already many ways to use a metaverse platform to exchange ideas, and it can be as simple as this. In a given situation this may be enough to sketch a solution. The basic experience can be made much more visually stimulating if it needs it, but it required no buildings, places, presences or branded experiences to get the point across in 2 minutes. Get some people together, just try it and see if it works.

****UPDATE To understand and actually visualize how these things can work when directed at a particular sport, and with gaming console technology, you can see what happens in this fan video for Madden 09, which uses the tools available to describe American Football offence tactics.