As we head into the Christmas holidays I have a little present for you: the latest edition of Flush magazine. Amongst all the other wonderful articles and information, on page 97 you can find my article Reel to Real. It is another journey through technology, starting with me playing with a reel-to-reel spooling tape machine, passing through Roland Rat – Superstar, Instant Music, and heading on to Melodyne (which has been on the radar a while) and the wonderful Rocksmith 2014, with a few other bits in between. Maybe leading to a self-composed Christmas number one tune next year 😉
@tweetthefashion have, as I always say because it’s true, done a wonderful piece of work in laying out and adding style and finesse with pictures, fonts and lots of hard graft to make my words look good. A huge thank you and bundle of Christmas cheer goes to them. It’s been a great year with a lot of articles.
So if you are sitting there, fed up watching the Italian Job again, give it a little look 🙂
Have a great Christmas and New Year. See you on the other side.
I was excited to see the new project from Hello Games get a lot of blog and press time the last few weeks. If you don’t know who Hello Games are, they are an indie development team who brought us the delightful cartoon fun of Joe Danger: a guy on a motorbike leaping and ducking around a number of tricky levels. It is hard not to bump into Joe Danger on every platform now, but I remember their stories of toil and trouble at Develop, where they had houses mortgaged to the hilt in a last-ditch attempt to get the game out. Whilst they have capitalised on and followed up Joe Danger, they have also been hard at work on an incredibly diverse project.
Joe Danger is a cute, side scrolling reaction game. It has a certain something, using an existing format but making it more fun, rather like Angry Birds has done.
The new work though is stunning in its claims. They have built a sandbox universe, not a sandbox city, desert or planet. No, a huge space of planets, with detail on those planets. The claim is that everything is procedurally generated. Just to clarify, that means that rather than sitting drawing everything by hand, it is code that generates the places in a defined style. Behaviours of creatures, races, vehicles etc can also be put into this. Way back, Elite had a procedurally generated universe of planet names. However you did not visit those planets and swim in their oceans. It was also not a shared space with others online. Terrain, and in particular trees, or anything of a more fractal nature, are often procedurally generated, like the famous Barnsley fern. This ramps that up a whole level. Of course in fractal maths complexity is the same at every level of abstraction, so the detail of a leaf is the same as the pattern of the universe. It, rather like probability maths, tends to go against our common sense approach to the world. However when you ponder it a bit longer it does start to make sense.
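Just to make the idea concrete, here is a tiny sketch in Python (mine, and obviously nothing to do with Hello Games’ actual code) of that famous Barnsley fern: four affine transforms, each picked at random with a fixed probability, applied over and over, and a plant-shaped cloud of points appears out of pure maths.

```python
import random

# The four affine transforms of the Barnsley fern as (a, b, c, d, e, f, p):
# new_x = a*x + b*y + e ; new_y = c*x + d*y + f ; chosen with probability p.
TRANSFORMS = [
    (0.00,  0.00,  0.00, 0.16, 0.00, 0.00, 0.01),  # the stem
    (0.85,  0.04, -0.04, 0.85, 0.00, 1.60, 0.85),  # ever-smaller leaflets
    (0.20, -0.26,  0.23, 0.22, 0.00, 1.60, 0.07),  # largest left-hand leaflet
    (-0.15, 0.28,  0.26, 0.24, 0.00, 0.44, 0.07),  # largest right-hand leaflet
]

def barnsley_fern(n_points, seed=42):
    """Return n_points (x, y) pairs of the fern, generated procedurally."""
    rng = random.Random(seed)
    x, y = 0.0, 0.0
    points = []
    for _ in range(n_points):
        r = rng.random()
        cumulative = 0.0
        for a, b, c, d, e, f, p in TRANSFORMS:
            cumulative += p
            if r <= cumulative:
                x, y = a * x + b * y + e, c * x + d * y + f
                break
        points.append((x, y))
    return points

points = barnsley_fern(10000)
```

Plot those points and you get the fern; the whole plant is implied by a handful of numbers, which is exactly the trick procedural generation plays at universe scale.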
With this sort of background information the video below gets even more interesting. Procedural generation at an atomic level. A world whose development is influenced by your presence in it. A place that is not technically infinite, but so big it would take more than a lifetime to explore. That is all very exciting stuff!
There is a good article with some more detail and a proper interview, rather than my speculation based on how I would do this, over at Destructoid.
I am looking forward to this one though. Well done Hello Games welcome to the metaverse industry 🙂
You may have noticed that whilst I am primarily a techie I do like to dabble and explore the potential of art with technology. I am well aware though, and often have to explain, that I am not a graphic designer or an artist. My creative toolkit is code and components. It may even be considered an unusual combination of applications, a sort of collage of code. Nearly everything needs some sort of interface though. I am constantly frustrated that my vision of what something should look like is often not quite what I manage to achieve with the tools I have to hand. However, I have often thought that with a limited number of pixels on a screen and a palette of colours, with enough time and effort it must be possible to make something close to perfect as an image. Of course the key there is time. It is really just a rewrite of the concept of an infinite number of monkeys with typewriters generating the complete works of Shakespeare!
I was struck, this week, by an iPad-painted image of Morgan Freeman that was doing the rounds of social media. I had only seen the finished image. Whilst everything was saying it was a painting, it seemed just too good. So I was amazed but skeptical. Then last night I saw the video below, showing the stages of the painting captured as it happened. This is truly stunning.
It is the work of Kyle Lambert and looking at his portfolio it is hard not to feel that he has reached a level that very few other people ever will. What is great though is a link to a lot of tutorials he has created and various articles so he is not keeping this talent to himself.
I would have thought a picture of the quality of the Morgan Freeman one would have put me off ever trying, but instead I am more inspired to learn more about artistic techniques now and try and level up my coder art just a little bit.
Even tools like Forza 5 let us express ourselves artistically through the medium of the automobile. The tools (as in previous Forzas) are a decal-based approach, similar to Second Life prims, that you have to shape and skew, layering up to create the look you want.
I have now upgraded my Feeding Edge logo and predator face a bit more, and my flagship car (which was the Ferrari 430) is now the LaFerrari. The predator mask has always worked well (IMHO) on cars with front scoops like the Subaru Impreza.
However the LaFerrari has a brilliant cut out front that is even better. The back of the car has hardly any surface so I popped on a very small Feeding Edge logo under the Ferrari logo.
Ok so it’s not a 200 hour Morgan Freeman iPad painting, but it’s a start 🙂
Friday 22nd November was the release day for the Xbox One. As an early adopter and Xbox fan (though I own all the consoles) I had my preorder in with Amazon since May. I sometimes think I should try the midnight queueing in the cold at a store for the ultimate release day excitement, but instead I opt for the pain of waiting all day for a delivery to arrive. To be fair to Amazon it did arrive on day one, but only at 4pm.
I was joking around on the day that it was a race between @elemming flying back from a work trip to Dubai and the Xbox One arriving. @ids pointed out on twitter that this sounded like a Top Gear race 🙂
The Xbox actually arrived 20 minutes before @elemming thanks to some flight delays out of Dubai caused by rain! However she had the last laugh in the race, as having unboxed it, plugged in the HDMI and audio cables and got it all going, I was in the mandatory day one patch downloading sequence.
I have to say that this day one experience, or any day for that matter, was incredibly annoying. It is of course part of the point of early adoption that you end up waiting on progress bars. It, like waiting for a delivery, is a good exercise in patience!
My internet is slow; BT have still yet to provide Infinity to me despite my being in an enabled area. I have had to leave work uploads to a code management system running for 9 hours. So seeing a brand new machine sat there demanding a mandatory 600MB patch before it would do anything was strange. It did work for me though; it took a while but then I was in the new dashboard. It plays a nice video to welcome you, and I wonder if that video was part of the 600MB “patch”; if it was then I could have done without it!
The new Kinect 2.0 worked really well straight away as I signed in and it picked up my Xbox Live account. I took the Day One giant QR code for Fifa 14 and the camera picked it up straight away and set about the digital download of the game. This of course was going to take ages too, as it was 10GB. Oh well, I thought I would leave it in the background anyway. I had the discs of the preordered games: Call of Duty Ghosts, Ryse Son of Rome and Forza 5.
So excitedly, now I had a brand new patched console, I put the disc in the drive. There it was, Forza 5. It then started to install; no more playing from the disc, you have to fill up your hard drive. However I was met with a message that this also needed a mandatory patch. It then told me the “patch” was 6GB! With my 4Mbps line that would be about 4 hours. So I did not expect to be able to play anything on day one, as it was already late in the day. It seemed each game required a huge update for day one. I tried COD and Ryse and got the same message.
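The back-of-an-envelope maths for that wait is easy enough to sketch (the 6GB and 4Mbps figures are just my own situation, and real downloads have protocol overhead on top):

```python
def download_hours(size_gb, line_mbps):
    """Rough download time: gigabytes -> gigabits -> hours at a given line speed."""
    gigabits = size_gb * 8           # bytes to bits
    megabits = gigabits * 1000       # treating 1 Gb as 1000 Mb for simplicity
    seconds = megabits / line_mbps
    return seconds / 3600

# A 6GB patch on a 4Mbps line: about three and a third hours, before overhead.
print(round(download_hours(6, 4), 1))
```

Which is why, once the servers are slammed and you are getting less than your line speed, “about 4 hours” turns out to be optimistic.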
I left it for an hour or so and cooked tea put the predlets to bed etc.
I checked Google and a few people suggested a fix: turn off your internet!
So I did! I cancelled the Forza installation and started again. The game then installed from the disc and started up. I was greeted with the awesome introduction/training of getting to drive the McLaren P1 around a street circuit, feeling the new features of the controller with independent rumbling on the triggers. This means the wheel spin of acceleration or sliding feels different to the ABS judder of the brakes. I did a few races and was pleased with the depth of integration of Top Gear, especially the 3rd race around the Top Gear style “simulation” of London. I won’t spoil it for you, but if you like Top Gear/Clarkson/cars it’s brilliant.
I was eager to try each game now I was disconnected. Each disc needed to do its hard disk install. Those install times seemed rather slow but they worked.
Ryse, as a new game, was instantly visually impressive. It is a hack and slash game with a relatively simple dynamic, a bit like a 3d version of Streets of Rage or Double Dragon. That is a good thing BTW! It is rated 18 for its violence, unlike Call Of Duty, which is now a 16 for swearing and violence. You engage as a Roman centurion in sword and shield combat. Most of that occurs in real time. It relies on timing blocks, shield breaks and slashing with the sword. Eventually you wear down the opponent and a skull appears over their head. Then it is time for an execution move. Here, when you trigger it, everything goes into close up slow motion, with lots of cinematic camera moves. The moves become QTEs (quick time events). A subtle colour flash around the victim indicates that the yellow or blue buttons need to be pressed. Each time you do it, it speeds up and performs a suitable move. Before long you are seeing limbs lopped off, swords driving up, through and out of the victim and various stomps onto a downed opponent. It sounds horrible, but it has a sort of boom, boom, bash, take that bad guy feel to it. The viciousness of this is very much in the style of 300 or Gladiator. I am not sure it needs to be an 18, though I have yet to complete the game. It is great fun though.
Finally I popped COD Ghosts in and was greeted with the usual things we see in the franchise. I think, knowing that the exact same game was out on the previous generation, I was not instantly blown away. However I have played all the COD single player stories all the way through since the original one. I know it will drag me in.
After getting to play a bit on Day One despite Microsoft’s best efforts to stop me due to the patch sizes and mandatory nature of those (which was clearly not the case as the games ran fine!) I thought I would reconnect to the net. I sparked it up and popped Forza 5 back in. I was expecting to trigger a background download, but instead it went back to saying I couldn’t play until 6Gb was downloaded. This was surprising. The previous Xbox 360 did do patches but they were always really small, and quick, 5 minutes max. It was the Sony PS3 that annoyed me with its mandatory patches. You wanted to have a quick go on something and it would stop you and demand you patch, for hours. I double checked my Xbox packaging that Sony had not bought out the Xbox and applied this draconian patching to Xbox!
I gave up and started all the downloads. It only does one at a time and queues the others. Obviously the pipe into the house is way too slow (come on BT, fix it please!!!!) but it seems to have a go at downloading one thing, then swaps to another in the queue. I am not sure if there is a setting (who can tell in the confusathon of the interface) to ask it to finish one thing first, but the end result after a night’s sleep was this.
That’s right, only Ryse Son of Rome had “completed”. Fifa was at 91% but was apparently able to start, Forza 93%, COD 97% and the demo of Kinect Sports only 28%. I checked the Xbox network settings and despite a 4Mbps line it was claiming 1Mbps. I guess the servers were somewhat slammed by a million Xboxes trying to patch!
I clicked on Fifa 14, as it was ready to start. It started up, but then said it needed to patch some more, and dropped back to 90%!
I was pleased then to just shout at the Kinect “Xbox turn off” (and was prompted with a yes/no) so I said yes, definitely! The downloads still happen when the box is off, so I thought I would let the poor thing concentrate!
So I am not overly impressed with all this extra delay, which of course is compounded by horrendous internet speeds. However, I do not understand how, after all the online/offline fuss and reversals of policy Microsoft managed to pick the worst of both worlds?
Anyway, back to Forza 5. It is all working really well now (I am just dreading any more game stopping patches).
Forza 5 has utterly stunning models of the cars with lots of information, voice overs from all the Top Gear team and much more. Back once again is the really enjoyable car painting. Each Forza generation I end up spending a bit of time adding to the cars. Forza 3 and 4 allowed an import of the vinyl sets, but this generation was a start-from-scratch moment.
I have once again created a predator style face, similar but different to previous versions. It still works really well on the front air scoop of the scooby. I have my Feeding Edge company logo, g33k script work too.
One of the interesting things the Xbox One provides is recording of game clips. This used to be a feature only in particular games. However now any game will record clips. If something interesting happens you can bark out to the Kinect “Xbox record that!” and it saves the last 30 seconds of gameplay. You can set up a recorder for longer, and you can still use Forza’s replay to go back and find the moment. Once you have the moment recorded you can go to Upload Studio (another app you have to download!) and do some basic editing. This includes being able to record a voice over or a picture-in-picture combination. It is going to be interesting to see what people do with this. I had a scary near miss at Spa (I have full damage turned on in Forza 5, so this would have been race ending if I had hit the wall). I also added the “night vision” intro template as a predator style reference.
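I have no idea how Microsoft actually implement it, but the obvious way to always have “the last 30 seconds” to hand is a ring buffer of recent frames: keep appending, silently drop the oldest, and snapshot the lot when the shout comes. A little sketch of the idea:

```python
from collections import deque

class GameplayRecorder:
    """Keep only the most recent `seconds` of frames in a ring buffer."""

    def __init__(self, seconds=30, fps=30):
        # deque with maxlen drops the oldest item automatically when full
        self.buffer = deque(maxlen=seconds * fps)

    def on_frame(self, frame):
        """Called once per rendered frame."""
        self.buffer.append(frame)

    def record_that(self):
        """ "Xbox record that!" - snapshot whatever the buffer holds right now."""
        return list(self.buffer)

rec = GameplayRecorder(seconds=30, fps=30)
for frame_number in range(2000):   # simulate roughly 66 seconds of play
    rec.on_frame(frame_number)

clip = rec.record_that()
print(len(clip), clip[0])          # 900 frames, starting at frame 1100
```

The nice property is that recording costs the same whether or not anything interesting ever happens; the clip is only materialised when you ask for it.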
All the cars in Forza 5 are supposed to not just be in-game AI but Drivatars. When you play it collects data on how you drive, your style and pace. It then sends “you” to other games when you are not racing online. It seems a lot of early Drivatars want to brake going up the hill at Eau Rouge. It has a blind crest at the top and the compression on the slight right hander leading up to it unsettles the car, but don’t brake so much 🙂
I think the Drivatars may take their paint jobs with them too. There is an option to turn off accepting paint work. That means, in theory, that when I am not driving, my Feeding Edge company car with my epred and company logo is busy turning up in people’s living rooms around the world. Up until now that had only happened if I took my car into an online race. I expect some more advercar Drivatars will start appearing very soon.
There is a great video here of a real car on the real track hitting that corner and nearly losing it about 40 seconds in. I think it shows how far simulations are getting, whilst still being playable.
Finally, Fifa 14. We are not really a football family, though Predlet 2.0 is growing to love footy. Fifa 14 (and I have had a lot of other Fifa games) is really slick. The player models seem a touch more physically complicated, as running into one another, rather than just slide tackles, can cause all sorts of fouls. We have had some great games as 2.0 has got the hang of the mechanics of Fifa, mostly from playing the iPad version. Predlet 1.0 is also getting to grips with it. It’s a great game, and was free with the Xbox One Day One edition. That was a good call; it is just a pity they didn’t pre-install it so that you could just plug and play in seconds.
There is a lot more to experience and figure out. Do I bother plugging the Sky box into the HDMI pass-through, so the Xbox is a pre-Sky filter? I am not sure that gives me much, especially as each box in my set up has an optical output to the 5.1 speaker amp. There is a “beta” setting on the Xbox to enable 3d surround to be passed through, but that means it is going to have to come down HDMI, get processed by the Xbox and be pumped out. It sounds like a lot of messing around, though it would remove the channel selection on the AV receiver. However it also means the Xbox has to be turned on all the time the TV is. Oh well, I best go and experiment. Feeding Edge – Taking a bite out of technology so you don’t have to 🙂
The title almost sounds like a football score but we are very close to the release of the next wave of consoles. I am particularly interested in the Xbox One for a number of reasons.
1. Most of my gaming has gravitated towards the 360 so I am geared up for the next gen franchises like the fantastic looking Forza 5.
2. The Kinect 2.0 has some features that are going to be really useful for any Choi Kwang Do applications.
3. Unity 3d has a tie in with Xbox One development.
4. Proper cloud processing; utility computing looks like it is part of Microsoft’s plan. Not just streaming games from elsewhere, but farming off processing to large servers and delivering results back. (Grid computing as we used to call it 🙂 )
As a Unity developer, whilst much of what I do is private rather than publicly available I am really interested in being able to deploy to the Xbox One. It opens up a lot of possibilities from a research point of view and may lead to some extra commercial work.
I have applied for the ID@Xbox scheme which is to help developers get onboard with the Xbox One. Eventually any Xbox One will be a potential piece of development kit, which is great news, but at the moment they are still in the old console model of needing a special Xbox One to develop on. Unity3d have announced free versions of Pro to go with those kits. As a Pro licence owner already I really just need access to the kit.
In particular when you see the difference in how the Kinect 2.0 can deal with the human form, as in this video.
***fixed the link as it was the wrong kinect video 🙂
It has a richer skeleton, complete with real shoulders, but also the tools already in place to look at weight distribution, muscle tension, limb acceleration and, interestingly, heart rate from the face.
You can see that this provides me with a whole lot more tech vocabulary to be able to analyse what we do in Choi Kwang Do and provide a training aid, or training mirror. This is compared to where I have got to with Kinect 1.0, as in this previous post (one of my virtual/gaming technology use case examples).
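To give a flavour of the sort of analysis I mean (this is my own sketch, not anything from the Kinect SDK), limb acceleration falls straight out of successive joint positions by differencing twice: positions to velocities, velocities to accelerations.

```python
def velocities(positions, dt):
    """Finite-difference velocity vectors from successive 3d joint positions."""
    return [tuple((b[i] - a[i]) / dt for i in range(3))
            for a, b in zip(positions, positions[1:])]

def accelerations(positions, dt):
    """Second difference gives acceleration - handy for telling the snap of a
    Choi Kwang Do strike apart from a slow push."""
    v = velocities(positions, dt)
    return [tuple((b[i] - a[i]) / dt for i in range(3))
            for a, b in zip(v, v[1:])]

# A hand joint speeding up along x, sampled at 30 frames a second
hand = [(0.0, 1.0, 0.0), (0.1, 1.0, 0.0), (0.3, 1.0, 0.0), (0.6, 1.0, 0.0)]
print(accelerations(hand, dt=1/30))
```

In practice you would smooth the raw joint data first, as sensor jitter gets amplified horribly by double differencing, but the principle is that simple.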
I am not sure if I meet Microsoft’s requirements to be called a developer, but then most of what I do never fits on any of these forms that I have to fill in 🙂 If I do and I get access to dev kit that is great. Either way this is much more useful than the alternative platforms, so I am hopeful.
As it is, my Xbox One has been on preorder since they were launched so I am hoping the post delivers it promptly on 22nd November, just 10 days away. That is the fun side of playing, but as ever it is also part of learning and understanding the world of technology.
A few days ago a new version of my favourite console application arrived: Rocksmith 2014. I had seen the press release video of some of the new things in Rocksmith and they did look interesting. For once my expectations have been exceeded though. Rocksmith 2014’s session mode is probably the best piece of software I have ever seen or used. That is a pretty big claim to make, but it makes playing the guitar just feel right.
They have improved a lot of other things about the regular tune learning too. Just in case you are not aware, this is a guitar “game” where you plug a real electric guitar into your console/PC and the device acts both as an amp and as a guide. It knows the notes you are playing.
So to learn a tune the notes stream at you indicating the fret and the string (colour coded as well as positional). If you hit the note, great, it will start adding more notes and chords in to level you up. Miss and it will start to make life simpler. This is with real music, the original versions in every style under the sun.
The new version provides challenges in each song and coaches you through, there is less focus on the score, though score attack mode still exists. It is about the playing and about it feeling and sounding right.
The previous version had a way to isolate a section or a riff and repeat it, first slowly then speeding up. The new version has this but with lots more options about the number of repeats, the initial speed the level jumps etc, all easy to adjust. It also lets you just hit a button whilst playing the whole song and jump to riff repeating where you are in that song. Something that was awkward to navigate to before.
All the old DLC from the previous version is ported to the new version, though…. they rather cynically charge you another £7 or so to do so. If you have bought a lot of tunes then I would have expected some loyalty bonus. The additional tunes are now also £1.99 which seems steep, but then they are not just the music but the structure of the song and how to learn it. If you think how much guitar lessons might cost it is minimal really.
The real extra star of the show though is session mode. However it has been built, it provides, to me at least, a fantastic backing band for any guitar noodling I feel like doing.
I love playing the blues scale. In the past I have tried playing along with famous tunes on CDs and tapes; obviously that never gets to the standard of the stars of the blues world that are playing those tunes. I have also had backing tracks with books etc, which are fine if you are trying a fixed thing to play. Session mode lets you load a band, with lots of styles and types. Blues has 5 or 6 specific types on its own, let alone Rock, Metal, Indie etc. In the set up you can pick a key, a scale and things around tempo and the relaxed or rigid nature of your backing band.
Then you start to play. Initially the band are doing nothing, but the first few notes on the scale you have chosen (which is highlighted on the fretboard on screen) start to set the band in motion. The drummer and bass kick in at the sort of pace and intensity you start with. Even just playing the scale slowly gets them going. Before long you find a little riff to play and maybe some chords, and before you know it the other instruments, keyboards and guitars, are joining in, filling your room, in this case, with the blues. If you have set it to a more formal structure the band will start making the changes as in a 12 bar blues, and the onscreen scale suggests the notes that work in those step changes. Of course you can play what you want. Speed up or slow down and the band goes with you; it doesn’t just feel like a load of loops being triggered. There seems to be a more complex analysis of the notes you play that gets the band doing their thing. You can almost nod to them when you want to step it up or hit a quiet solo. It is stunning.
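I can only guess at what session mode does under the hood, but even a naive version of “the band goes with you” could estimate your tempo from the gaps between note onsets, something like this sketch (entirely my own speculation, not Ubisoft’s code):

```python
def estimate_bpm(onset_times, default_bpm=90):
    """Guess a tempo from note onset timestamps (in seconds), naively
    assuming each gap between notes is roughly one beat."""
    if len(onset_times) < 2:
        return default_bpm           # nothing played yet: idle at a default
    gaps = [b - a for a, b in zip(onset_times, onset_times[1:])]
    avg_gap = sum(gaps) / len(gaps)
    return 60.0 / avg_gap

# Notes played half a second apart -> the "band" settles on 120 bpm
print(estimate_bpm([0.0, 0.5, 1.0, 1.5, 2.0]))
```

The real thing clearly does far more, responding to intensity, chords and phrasing, but tempo following alone already gives that eerie sense of the band listening to you.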
I am not a musician by any stretch of the imagination, but I have tried guitar many times. The previous Rocksmith taught me a lot, something that it continues to do in learning songs. However session mode is the delight. It is a very clever, very patient, non judgemental band. Experimenting with riffs and scales in lots of styles is truly enlightening and relaxing. It is not about trying to hit a note, keep up, make a score or copy a sound, but instead finding self expression through making music.
It is not without its downsides, but one does have to suffer for one’s art, darling.
My fingers have now hardened a bit again, you just have to soak blisters in lemon juice and keep playing on them.
There was something else very cool about 2014. It is even better for 2 guitars now. Predlet 2.0 plugged his little electric guitar in and we jammed in session mode. He explored what happened as he hit notes fast and slow and we even did a bit of riff turn taking. A right noise for everyone else in the house, but we had fun. We then moved on and made more racket playing Billy Idol’s White Wedding, which is a bit of a fave tune of ours 🙂
I am not on their payroll, just a very enthusiastic fan. If you like guitar you have to get this !
I took inspiration from @elemming’s recent fall ice skating, in which she broke her wrist, for this month’s article in Flush Magazine. It seemed a few things came to mind around safety, and in particular the use of air bags in all sorts of places.
You can read it and see the really nice layout; once again a great job has been done making my words and ideas look awesome on the page. Thank you again @tweetthefashion for all the hard work getting this issue out there.
The direct link is here
As usual it’s quite a mix, but it does share a common theme: Mars landings, motorbikes and ice 🙂
I hope you enjoy the article, and the rest of the excellent magazine.
As I am looking at a series of boiled down use cases of using virtual world and gaming technology I thought I should return to the exploration of body instrumentation and the potential for feedback in learning a martial art such as Choi Kwang Do.
I have of course written about this potential before, but I have built a few little extra things into the example using a new windows machine with a decent amount of power (HP Envy 17″) and the Kinect for Windows sensor with the Kinect SDK and Unity 3d package.
The package comes with a set of tools that let you generate a block man based on the joint positions. The controller piece of code has some options for turning on the user map and skeleton lines.
In this example I am also using Unity Pro, which allows me to position more than one camera and have each of those generate a texture on another surface.
You will see the main block man appear centrally “in world”. The three screens above him are showing a side view of the same block man, a rear view and interestingly a top down view.
In the bottom right is the “me” with lines drawn on. The Kinect does the job of cutting out the background. All of this was recorded live, running in Unity3d.
The registration of the block man and the joints isn’t quite accurate enough at the moment for precise Choi movements, but this is the old Kinect, the new Kinect 2.0 will no doubt be much much better as well as being able to register your heart rate.
The cut out “me” is a useful feature, but you can only have that projected onto the flat camera surface; it is not a thing that can be looked at from left/right etc. The block man though is made of actual 3d objects in space. The cubes are coloured so that you can see joint rotation.
I think I will reduce the size of the joints and try and draw objects between them to give him a similar definition to the cutout “me”.
The point here though is that game technology and virtual world technology is able to give a different perspective of a real world interaction. Seeing techniques from above may prove useful, and is not something that can easily be observed in class. If that applies to Choi Kwang Do then it applies to all other forms of real world data. Seeing from another angle, exploring and rendering in different ways can yield insights.
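That top-down view sounds fancier than it is. An orthographic view is really just a choice of which two of the three joint coordinates you keep when flattening onto a screen. A little sketch (the joint names and positions here are made up for illustration):

```python
# Each skeleton joint is an (x, y, z) position:
# x across the sensor, y up, z away from the sensor (depth).

def front_view(joint):
    x, y, z = joint
    return (x, y)      # drop depth

def side_view(joint):
    x, y, z = joint
    return (z, y)      # drop left/right

def top_down_view(joint):
    x, y, z = joint
    return (x, z)      # drop height

skeleton = {"head": (0.0, 1.7, 2.5), "right_hand": (0.4, 1.1, 2.1)}
top = {name: top_down_view(pos) for name, pos in skeleton.items()}
print(top)   # {'head': (0.0, 2.5), 'right_hand': (0.4, 2.1)}
```

In Unity the extra cameras are doing exactly this kind of projection for you, but it is worth seeing how cheap a “new perspective” on the same data really is.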
It also is data that can be captured and replayed, transmitted and experienced at distance by others. Capture, translate, enhance and share. It is something to think about? What different perspectives could you gain of data you have access to?
With my metaverse evangelist hat on I have for many years, in presentations and conversations, tried to help people understand the value of using game style technology in a virtual environment. The reasons have not changed, they have grown, but a basic use case is one of being able to experience something, to know where something is or how to get to it before you actually have to. The following is not to show off any 3d modelling expertise; I am a programmer who can use most of the tool sets. I put this “place” together mainly to figure out Blender, to help the predlets build in things other than Minecraft. With a new Windows laptop complementing the MBP, I thought I would document this use case by example.
Part 1 – Verbal Directions
Imagine you have to find something, in this case a statue of a monkey’s head. It is in a nice apartment. The lounge area has a couple of sofas leading to a work of art in the next room. Take a right from there and a large number of columns lead to an ante room containing the artefact.
What I have done there is describe a path to something. It is a reasonable description, and it is quite a simple navigation task.
Now let’s move from words, or a verbal description of placement, to a map view. This is the common one we have had for years. Top down.
Part 2 – The Map
A typical map; you will start from the bottom left. It is pretty obvious where to go: 2 rooms up, turn right and keep going and you are there. This augments the verbal description, or can work just on its own. Simple, and quite effective, but it filters a lot of the world out in simplification, mainly because maps are easy to draw. It requires a cognitive leap to translate to the actual place.
Part 3 – Photos
You may have often seen pictures of places to give you a feel for them. They work too. People can relate to the visuals, but it is a case of you get what you are given.
The columned corridor
Again in a short example this allows us to get quite a lot of place information into the description. “A picture paints a thousand words”. It is still passive.
A video of a walkthrough would of course be an extra step here; that is just more pictures, one after the other. Again though it is directed. You have no choice how to learn, how to take in the place.
Part 4 – The virtual
Models can very easily now be put into tools like Unity3d and published to the web to be walked around. If you click here, you should get a Unity3d page and, after a quick download (assuming you have the plugin 😉 if not, get it!), you will be placed at the entrance to the model, which is really a 3d sketch rather than a full-on high end photo realistic rendering. You may need to click to give it focus before walking around. It is not a shared networked place, it is not really a metaverse, but it has become easier than ever to network such models and places if sharing is an important part of the use case (such as in the hospital incident simulator I have been working on).
The mouse will look around, and ‘w’ will walk you the way you are facing (s is backwards a,d side to side). Take a stroll in and out down to the monkey and back.
I suggest that now you have a much better sense of the place: the size, the space, the odd lighting. The columns are close together so you may have bumped into a few things. You may have lingered on the work of art. All of these tiny differences are putting this place into your memory. Of course finding this monkey is not the most important task you will have today, but apply the principle to anything you have to remember, conceptual or physical. Choosing your own way through such a model or concept is simple but much more effective, isn’t it? You will remember it longer and maybe discover something else on the way. It is not directed by anyone; your speed, your choice. This allows self reflection in the learning process, which reinforces understanding of the place.
Now imagine this model, made properly, with nice textures and lighting, a photo realistic place, and pop on a VR headset like the Oculus Rift. Which in this case is very simple with Unity3d. Your sense of being there is even further enhanced, and it only takes a few minutes.
It is an obvious technology isn’t it? A virtual place to rehearse and explore.
Of course you may have spotted that this virtual place, whilst in Unity3d to walk around, provided the output for the map and for the photo navigation. Once you have a virtual place you can still do things the old way if that works for you. It’s a virtual virtuous circle!