CES 2017 – Bucket list, tick

For many years I have seen the CES show appear in magazines, then on TV and then of course all over social media. As a long-time tech geek and early adopter I have always wanted to attend, but never been able to. In my corporate days getting approval for a train to London was a chore. As a startup I never had the time or money either, preferring to invest in gadgets like the Oculus Rift or a Unity licence so I could build things. On the TV show we talked about CES, and if we had gone to a 4th series a visit was on the cards.
This year, with my industry analyst role in IoT, I was able to go. Of course, as a work trip it was a bit different to just being able to take the show in.
CES 17
I had briefing after briefing, with a bit of travel time in between, for the 2 main days I was there. One thing that is not always obvious is just how big the show is. Firstly there is the Las Vegas Convention Center, with North and South Halls, which is bigger than most airports. It was so big that I only really got to visit the South Halls; the North Hall of cars, a motor show in its own right, eluded me. All the first day's meetings were around the South Halls. Day 2 was at the other areas of the show: the hotels have their own convention centres, and floors and suites get rented out too. The Venetian Sands, Bellagio and Aria all had lots going on, each as big as any UK show it seemed.
Walking 9-10 miles a day at a trade show, and still not seeing everything, gives you an indication of the size.
#ces2017 steps
Again, I pretty much missed most of the expo floor because of meetings, but on the day I flew out I had an hour to pop back to the Sands main hall and see some things.
The split across the entire show, from giant corporate powerhouses to tiny startups with a single table, was amazing. I had assumed it was all the former, but the latter is heavily supported, and with Kickstarter and maker culture now mainstream it will continue to be really important.
One thing I was there to see was how much Augmented Reality was taking off, running parallel with the VR wave. There were a lot of glasses, and of course the HoloLens and the industrial-focussed DAQRI Smart Helmet. AR is still not really there as a consumer focus yet, though the Asus ZenFone AR, powered by Tango and Qualcomm, was announced. It will not be on sale until later in the year (no date given), but it may put true AR into people's hands.
#ces2017 day 1
HoloLens
#ces2017 day 1
DAQRI Smart Helmet

It was CES’s 50th anniversary, which was fitting given I turn 50 this year too. It may not be the most exciting bucket list tick, but I have already done lots of mine and need to refresh the list anyway. I am not sure that will include riding in this human-sized quadcopter that you fly with a smartphone though!
#ces2017 day 1

I guess we had best experience everything before these guys and their brethren take over.

#ces2017

Still, at least we can 3D print new parts for ourselves.

#ces2017

As you will see in this album, the whole place just becomes a blur of everything looking the same: lights, sound, people, attract loops etc. All very fitting for Vegas.

CES 17

@xianrenaud and I wrote a spotlight piece for 451 Research as a show roundup, CES 2017: connected, autonomous and virtual, which may end up outside the paywall; do take a look if you have access.

So that made a whirlwind start to this year. This time last year I had just published Cont3xt and was wondering what the next steps were going to be. This year I have stacks of IoT research and writing work to get on with, a 50th birthday not to get worried about, imminent wisdom tooth removal (yuk) and, all being well, a 2nd Degree/Il Dan black belt test in Choi Kwang Do. So onwards and upwards. Pil Seung!

Shady Maths for Shaders

I like a good tech challenge, and I decided to look a bit more into the world of shaders. I had not fully appreciated how many competing languages there were for these low-level compute systems. I had used shaders in Unity3D and I knew we could do fancy things with them. They are wonderfully complicated and full of maths. As with all code you can go native and just write the shader code directly, but there are also tools to help. Things like Shadertoy let you see some of the fun that can be had. It reminds me of hacking the Copper coprocessor in the old Amiga days: low-level graphics manipulation, direct to the pipeline.
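To make that model concrete: a fragment shader is essentially a pure function run once per pixel. This toy Python sketch is not any real shader language, just an illustration of the per-pixel idea (the ripple formula and image size are arbitrary):

```python
import math

def fragment(x, y, width, height):
    """Toy 'fragment shader': map a pixel coordinate to an RGB colour.
    Conceptually runs once per pixel, like a GPU fragment shader (only far slower)."""
    u, v = x / width, y / height           # normalised UV coordinates, 0..1
    r = 0.5 + 0.5 * math.sin(10.0 * u)     # ripple across the horizontal axis
    g = 0.5 + 0.5 * math.sin(10.0 * v)     # ripple across the vertical axis
    b = 0.25                               # constant blue channel
    return (r, g, b)

# "Render" a tiny image by evaluating the function at every pixel.
WIDTH, HEIGHT = 8, 8
image = [[fragment(x, y, WIDTH, HEIGHT) for x in range(WIDTH)] for y in range(HEIGHT)]
```

On a GPU the same function runs for millions of pixels in parallel, which is why all the interesting work ends up expressed as maths on the coordinates.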
In Unity3d there is a tool I bought a while back called ShaderForge. It allows for editing of shaders, but in a visual way. Each little node shows the result of the maths going on in its own area. It is common to have this sort of material editing in 3D applications. There is a lot of maths available and I am only just skimming the surface of what can be done.
I was trying to create a realistic wood ring shader. I wanted to do something like the effect demonstrated here in normal code (not any of the shader languages).
I ended up with something that looked like this.
ShaderForge Unity3d
It is nearly the concentric rings I want, but I can only get them to start from the bottom-left corner. I have yet to work out which number I can use as an offset to get full circles. I have worked out where to put in variance and noise to make the lines wobble a little. So I am a little stuck; if anyone has any suggestions I would be very grateful 🙂 I want a random shader texture, slightly different each time, which is why I am not using images as I normally do. I am not worried about the colour at the moment, BTW; that is a function of scaling the mapping in the sin sweeping function that creates the ripples. I stuck in a few extra value modifiers (some set to 0) to see if I could tweak the shader to do what I wanted, but no luck yet. ShaderForge has its own metadata, but the whole thing can be compiled into a full shader in native code.
Just look at what it generates; it is fairly full-on code.
So it looks like I have quite a lot of learning to do. With code there is always more, always another language or format for something 🙂

*** Update: Yay for the internet. Dickie replied to this post very quickly with an image of a much simpler way to generate the concentric rings. This has massively added to my understanding of the process and is very much appreciated.
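For anyone following along, the classic concentric-ring trick is to measure each point's distance from a chosen centre and feed that through a sine, rather than summing powered offsets from a corner. A minimal Python sketch of the maths (the centre and frequency values are just illustrative, not what ShaderForge produces):

```python
import math

def ring_value(u, v, cx=0.5, cy=0.5, frequency=20.0):
    """Concentric rings: radial distance from the centre fed through a sine.
    Offsetting by (cx, cy) is what moves the rings off the corner and makes
    them full circles around the centre of the UV space."""
    d = math.sqrt((u - cx) ** 2 + (v - cy) ** 2)   # radial distance
    return 0.5 + 0.5 * math.sin(d * frequency)     # ripple mapped to 0..1

# Sampling along any straight line through the centre gives the same
# repeating ripple, which is what makes the rings circular.
samples = [round(ring_value(u, 0.5), 3) for u in (0.5, 0.55, 0.6)]
```

Noise added to `d` before the sine is what would give the wobbly, wood-like variation.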


// Compiled shader for Web Player, uncompressed size: 14.0KB

// Skipping shader variants that would not be included into build of current scene.

Shader "Shader Forge/wood3" {
Properties {
_Color ("Color", Color) = (0.727941,0.116656,0.0856401,1)
_node_4617 ("node_4617", Color) = (0.433823,0.121051,0.0287089,1)
_node_1057 ("node_1057", Float) = 0.2
}
SubShader {
Tags { "RenderType"="Opaque" }

// Stats for Vertex shader:
// d3d11 : 4 math
// d3d9 : 5 math
// opengl : 10 math
// Stats for Fragment shader:
// d3d11 : 9 math
// d3d9 : 15 math
Pass {
Name "FORWARD"
Tags { "LIGHTMODE"="ForwardBase" "SHADOWSUPPORT"="true" "RenderType"="Opaque" }
GpuProgramID 47164
Program "vp" {
SubProgram "opengl " {
// Stats: 10 math
Keywords { "DIRECTIONAL" "SHADOWS_OFF" "LIGHTMAP_OFF" "DIRLIGHTMAP_OFF" "DYNAMICLIGHTMAP_OFF" }
"!!GLSL#version 120

#ifdef VERTEX

varying vec2 xlv_TEXCOORD0;
void main ()
{
gl_Position = (gl_ModelViewProjectionMatrix * gl_Vertex);
xlv_TEXCOORD0 = gl_MultiTexCoord0.xy;
}

#endif
#ifdef FRAGMENT
uniform vec4 _Color;
uniform vec4 _node_4617;
uniform float _node_1057;
varying vec2 xlv_TEXCOORD0;
void main ()
{
vec4 tmpvar_1;
tmpvar_1.w = 1.0;
tmpvar_1.xyz = mix (_node_4617.xyz, _Color.xyz, vec3(((
sqrt((pow ((xlv_TEXCOORD0.x + 0.2), _node_1057) + pow ((xlv_TEXCOORD0.y + 10.0), _node_1057)))
* 11.0) + -10.0)));
gl_FragData[0] = tmpvar_1;
}

#endif
"
}
SubProgram "d3d9 " {
// Stats: 5 math
Keywords { "DIRECTIONAL" "SHADOWS_OFF" "LIGHTMAP_OFF" "DIRLIGHTMAP_OFF" "DYNAMICLIGHTMAP_OFF" }
Bind "vertex" Vertex
Bind "texcoord" TexCoord0
Matrix 0 [glstate_matrix_mvp]
"vs_3_0
dcl_position v0
dcl_texcoord v1
dcl_position o0
dcl_texcoord o1.xy
dp4 o0.x, c0, v0
dp4 o0.y, c1, v0
dp4 o0.z, c2, v0
dp4 o0.w, c3, v0
mov o1.xy, v1

"
}
SubProgram "d3d11 " {
// Stats: 4 math
Keywords { "DIRECTIONAL" "SHADOWS_OFF" "LIGHTMAP_OFF" "DIRLIGHTMAP_OFF" "DYNAMICLIGHTMAP_OFF" }
Bind "vertex" Vertex
Bind "texcoord" TexCoord0
ConstBuffer "UnityPerDraw" 336
Matrix 0 [glstate_matrix_mvp]
BindCB "UnityPerDraw" 0
"vs_4_0
root12:aaabaaaa
eefiecedaffpdldohodkdgpagjklpapmmnbhcfmlabaaaaaaoeabaaaaadaaaaaa
cmaaaaaaiaaaaaaaniaaaaaaejfdeheoemaaaaaaacaaaaaaaiaaaaaadiaaaaaa
aaaaaaaaaaaaaaaaadaaaaaaaaaaaaaaapapaaaaebaaaaaaaaaaaaaaaaaaaaaa
adaaaaaaabaaaaaaadadaaaafaepfdejfeejepeoaafeeffiedepepfceeaaklkl
epfdeheofaaaaaaaacaaaaaaaiaaaaaadiaaaaaaaaaaaaaaabaaaaaaadaaaaaa
aaaaaaaaapaaaaaaeeaaaaaaaaaaaaaaaaaaaaaaadaaaaaaabaaaaaaadamaaaa
fdfgfpfaepfdejfeejepeoaafeeffiedepepfceeaaklklklfdeieefcaeabaaaa
eaaaabaaebaaaaaafjaaaaaeegiocaaaaaaaaaaaaeaaaaaafpaaaaadpcbabaaa
aaaaaaaafpaaaaaddcbabaaaabaaaaaaghaaaaaepccabaaaaaaaaaaaabaaaaaa
gfaaaaaddccabaaaabaaaaaagiaaaaacabaaaaaadiaaaaaipcaabaaaaaaaaaaa
fgbfbaaaaaaaaaaaegiocaaaaaaaaaaaabaaaaaadcaaaaakpcaabaaaaaaaaaaa
egiocaaaaaaaaaaaaaaaaaaaagbabaaaaaaaaaaaegaobaaaaaaaaaaadcaaaaak
pcaabaaaaaaaaaaaegiocaaaaaaaaaaaacaaaaaakgbkbaaaaaaaaaaaegaobaaa
aaaaaaaadcaaaaakpccabaaaaaaaaaaaegiocaaaaaaaaaaaadaaaaaapgbpbaaa
aaaaaaaaegaobaaaaaaaaaaadgaaaaafdccabaaaabaaaaaaegbabaaaabaaaaaa
doaaaaab"
}
SubProgram "opengl " {
// Stats: 10 math
Keywords { "DIRECTIONAL" "SHADOWS_SCREEN" "LIGHTMAP_OFF" "DIRLIGHTMAP_OFF" "DYNAMICLIGHTMAP_OFF" }
"!!GLSL#version 120

#ifdef VERTEX

varying vec2 xlv_TEXCOORD0;
void main ()
{
gl_Position = (gl_ModelViewProjectionMatrix * gl_Vertex);
xlv_TEXCOORD0 = gl_MultiTexCoord0.xy;
}

#endif
#ifdef FRAGMENT
uniform vec4 _Color;
uniform vec4 _node_4617;
uniform float _node_1057;
varying vec2 xlv_TEXCOORD0;
void main ()
{
vec4 tmpvar_1;
tmpvar_1.w = 1.0;
tmpvar_1.xyz = mix (_node_4617.xyz, _Color.xyz, vec3(((
sqrt((pow ((xlv_TEXCOORD0.x + 0.2), _node_1057) + pow ((xlv_TEXCOORD0.y + 10.0), _node_1057)))
* 11.0) + -10.0)));
gl_FragData[0] = tmpvar_1;
}

#endif
"
}
SubProgram "d3d9 " {
// Stats: 5 math
Keywords { "DIRECTIONAL" "SHADOWS_SCREEN" "LIGHTMAP_OFF" "DIRLIGHTMAP_OFF" "DYNAMICLIGHTMAP_OFF" }
Bind "vertex" Vertex
Bind "texcoord" TexCoord0
Matrix 0 [glstate_matrix_mvp]
"vs_3_0
dcl_position v0
dcl_texcoord v1
dcl_position o0
dcl_texcoord o1.xy
dp4 o0.x, c0, v0
dp4 o0.y, c1, v0
dp4 o0.z, c2, v0
dp4 o0.w, c3, v0
mov o1.xy, v1

"
}
SubProgram "d3d11 " {
// Stats: 4 math
Keywords { "DIRECTIONAL" "SHADOWS_SCREEN" "LIGHTMAP_OFF" "DIRLIGHTMAP_OFF" "DYNAMICLIGHTMAP_OFF" }
Bind "vertex" Vertex
Bind "texcoord" TexCoord0
ConstBuffer "UnityPerDraw" 336
Matrix 0 [glstate_matrix_mvp]
BindCB "UnityPerDraw" 0
"vs_4_0
root12:aaabaaaa
eefiecedaffpdldohodkdgpagjklpapmmnbhcfmlabaaaaaaoeabaaaaadaaaaaa
cmaaaaaaiaaaaaaaniaaaaaaejfdeheoemaaaaaaacaaaaaaaiaaaaaadiaaaaaa
aaaaaaaaaaaaaaaaadaaaaaaaaaaaaaaapapaaaaebaaaaaaaaaaaaaaaaaaaaaa
adaaaaaaabaaaaaaadadaaaafaepfdejfeejepeoaafeeffiedepepfceeaaklkl
epfdeheofaaaaaaaacaaaaaaaiaaaaaadiaaaaaaaaaaaaaaabaaaaaaadaaaaaa
aaaaaaaaapaaaaaaeeaaaaaaaaaaaaaaaaaaaaaaadaaaaaaabaaaaaaadamaaaa
fdfgfpfaepfdejfeejepeoaafeeffiedepepfceeaaklklklfdeieefcaeabaaaa
eaaaabaaebaaaaaafjaaaaaeegiocaaaaaaaaaaaaeaaaaaafpaaaaadpcbabaaa
aaaaaaaafpaaaaaddcbabaaaabaaaaaaghaaaaaepccabaaaaaaaaaaaabaaaaaa
gfaaaaaddccabaaaabaaaaaagiaaaaacabaaaaaadiaaaaaipcaabaaaaaaaaaaa
fgbfbaaaaaaaaaaaegiocaaaaaaaaaaaabaaaaaadcaaaaakpcaabaaaaaaaaaaa
egiocaaaaaaaaaaaaaaaaaaaagbabaaaaaaaaaaaegaobaaaaaaaaaaadcaaaaak
pcaabaaaaaaaaaaaegiocaaaaaaaaaaaacaaaaaakgbkbaaaaaaaaaaaegaobaaa
aaaaaaaadcaaaaakpccabaaaaaaaaaaaegiocaaaaaaaaaaaadaaaaaapgbpbaaa
aaaaaaaaegaobaaaaaaaaaaadgaaaaafdccabaaaabaaaaaaegbabaaaabaaaaaa
doaaaaab"
}
SubProgram "opengl " {
// Stats: 10 math
Keywords { "DIRECTIONAL" "SHADOWS_OFF" "LIGHTMAP_OFF" "DIRLIGHTMAP_OFF" "DYNAMICLIGHTMAP_OFF" "VERTEXLIGHT_ON" }
"!!GLSL#version 120

#ifdef VERTEX

varying vec2 xlv_TEXCOORD0;
void main ()
{
gl_Position = (gl_ModelViewProjectionMatrix * gl_Vertex);
xlv_TEXCOORD0 = gl_MultiTexCoord0.xy;
}

#endif
#ifdef FRAGMENT
uniform vec4 _Color;
uniform vec4 _node_4617;
uniform float _node_1057;
varying vec2 xlv_TEXCOORD0;
void main ()
{
vec4 tmpvar_1;
tmpvar_1.w = 1.0;
tmpvar_1.xyz = mix (_node_4617.xyz, _Color.xyz, vec3(((
sqrt((pow ((xlv_TEXCOORD0.x + 0.2), _node_1057) + pow ((xlv_TEXCOORD0.y + 10.0), _node_1057)))
* 11.0) + -10.0)));
gl_FragData[0] = tmpvar_1;
}

#endif
"
}
SubProgram "d3d9 " {
// Stats: 5 math
Keywords { "DIRECTIONAL" "SHADOWS_OFF" "LIGHTMAP_OFF" "DIRLIGHTMAP_OFF" "DYNAMICLIGHTMAP_OFF" "VERTEXLIGHT_ON" }
Bind "vertex" Vertex
Bind "texcoord" TexCoord0
Matrix 0 [glstate_matrix_mvp]
"vs_3_0
dcl_position v0
dcl_texcoord v1
dcl_position o0
dcl_texcoord o1.xy
dp4 o0.x, c0, v0
dp4 o0.y, c1, v0
dp4 o0.z, c2, v0
dp4 o0.w, c3, v0
mov o1.xy, v1

"
}
SubProgram "d3d11 " {
// Stats: 4 math
Keywords { "DIRECTIONAL" "SHADOWS_OFF" "LIGHTMAP_OFF" "DIRLIGHTMAP_OFF" "DYNAMICLIGHTMAP_OFF" "VERTEXLIGHT_ON" }
Bind "vertex" Vertex
Bind "texcoord" TexCoord0
ConstBuffer "UnityPerDraw" 336
Matrix 0 [glstate_matrix_mvp]
BindCB "UnityPerDraw" 0
"vs_4_0
root12:aaabaaaa
eefiecedaffpdldohodkdgpagjklpapmmnbhcfmlabaaaaaaoeabaaaaadaaaaaa
cmaaaaaaiaaaaaaaniaaaaaaejfdeheoemaaaaaaacaaaaaaaiaaaaaadiaaaaaa
aaaaaaaaaaaaaaaaadaaaaaaaaaaaaaaapapaaaaebaaaaaaaaaaaaaaaaaaaaaa
adaaaaaaabaaaaaaadadaaaafaepfdejfeejepeoaafeeffiedepepfceeaaklkl
epfdeheofaaaaaaaacaaaaaaaiaaaaaadiaaaaaaaaaaaaaaabaaaaaaadaaaaaa
aaaaaaaaapaaaaaaeeaaaaaaaaaaaaaaaaaaaaaaadaaaaaaabaaaaaaadamaaaa
fdfgfpfaepfdejfeejepeoaafeeffiedepepfceeaaklklklfdeieefcaeabaaaa
eaaaabaaebaaaaaafjaaaaaeegiocaaaaaaaaaaaaeaaaaaafpaaaaadpcbabaaa
aaaaaaaafpaaaaaddcbabaaaabaaaaaaghaaaaaepccabaaaaaaaaaaaabaaaaaa
gfaaaaaddccabaaaabaaaaaagiaaaaacabaaaaaadiaaaaaipcaabaaaaaaaaaaa
fgbfbaaaaaaaaaaaegiocaaaaaaaaaaaabaaaaaadcaaaaakpcaabaaaaaaaaaaa
egiocaaaaaaaaaaaaaaaaaaaagbabaaaaaaaaaaaegaobaaaaaaaaaaadcaaaaak
pcaabaaaaaaaaaaaegiocaaaaaaaaaaaacaaaaaakgbkbaaaaaaaaaaaegaobaaa
aaaaaaaadcaaaaakpccabaaaaaaaaaaaegiocaaaaaaaaaaaadaaaaaapgbpbaaa
aaaaaaaaegaobaaaaaaaaaaadgaaaaafdccabaaaabaaaaaaegbabaaaabaaaaaa
doaaaaab"
}
SubProgram "opengl " {
// Stats: 10 math
Keywords { "DIRECTIONAL" "SHADOWS_SCREEN" "LIGHTMAP_OFF" "DIRLIGHTMAP_OFF" "DYNAMICLIGHTMAP_OFF" "VERTEXLIGHT_ON" }
"!!GLSL#version 120

#ifdef VERTEX

varying vec2 xlv_TEXCOORD0;
void main ()
{
gl_Position = (gl_ModelViewProjectionMatrix * gl_Vertex);
xlv_TEXCOORD0 = gl_MultiTexCoord0.xy;
}

#endif
#ifdef FRAGMENT
uniform vec4 _Color;
uniform vec4 _node_4617;
uniform float _node_1057;
varying vec2 xlv_TEXCOORD0;
void main ()
{
vec4 tmpvar_1;
tmpvar_1.w = 1.0;
tmpvar_1.xyz = mix (_node_4617.xyz, _Color.xyz, vec3(((
sqrt((pow ((xlv_TEXCOORD0.x + 0.2), _node_1057) + pow ((xlv_TEXCOORD0.y + 10.0), _node_1057)))
* 11.0) + -10.0)));
gl_FragData[0] = tmpvar_1;
}

#endif
"
}
SubProgram "d3d9 " {
// Stats: 5 math
Keywords { "DIRECTIONAL" "SHADOWS_SCREEN" "LIGHTMAP_OFF" "DIRLIGHTMAP_OFF" "DYNAMICLIGHTMAP_OFF" "VERTEXLIGHT_ON" }
Bind "vertex" Vertex
Bind "texcoord" TexCoord0
Matrix 0 [glstate_matrix_mvp]
"vs_3_0
dcl_position v0
dcl_texcoord v1
dcl_position o0
dcl_texcoord o1.xy
dp4 o0.x, c0, v0
dp4 o0.y, c1, v0
dp4 o0.z, c2, v0
dp4 o0.w, c3, v0
mov o1.xy, v1

"
}
SubProgram "d3d11 " {
// Stats: 4 math
Keywords { "DIRECTIONAL" "SHADOWS_SCREEN" "LIGHTMAP_OFF" "DIRLIGHTMAP_OFF" "DYNAMICLIGHTMAP_OFF" "VERTEXLIGHT_ON" }
Bind "vertex" Vertex
Bind "texcoord" TexCoord0
ConstBuffer "UnityPerDraw" 336
Matrix 0 [glstate_matrix_mvp]
BindCB "UnityPerDraw" 0
"vs_4_0
root12:aaabaaaa
eefiecedaffpdldohodkdgpagjklpapmmnbhcfmlabaaaaaaoeabaaaaadaaaaaa
cmaaaaaaiaaaaaaaniaaaaaaejfdeheoemaaaaaaacaaaaaaaiaaaaaadiaaaaaa
aaaaaaaaaaaaaaaaadaaaaaaaaaaaaaaapapaaaaebaaaaaaaaaaaaaaaaaaaaaa
adaaaaaaabaaaaaaadadaaaafaepfdejfeejepeoaafeeffiedepepfceeaaklkl
epfdeheofaaaaaaaacaaaaaaaiaaaaaadiaaaaaaaaaaaaaaabaaaaaaadaaaaaa
aaaaaaaaapaaaaaaeeaaaaaaaaaaaaaaaaaaaaaaadaaaaaaabaaaaaaadamaaaa
fdfgfpfaepfdejfeejepeoaafeeffiedepepfceeaaklklklfdeieefcaeabaaaa
eaaaabaaebaaaaaafjaaaaaeegiocaaaaaaaaaaaaeaaaaaafpaaaaadpcbabaaa
aaaaaaaafpaaaaaddcbabaaaabaaaaaaghaaaaaepccabaaaaaaaaaaaabaaaaaa
gfaaaaaddccabaaaabaaaaaagiaaaaacabaaaaaadiaaaaaipcaabaaaaaaaaaaa
fgbfbaaaaaaaaaaaegiocaaaaaaaaaaaabaaaaaadcaaaaakpcaabaaaaaaaaaaa
egiocaaaaaaaaaaaaaaaaaaaagbabaaaaaaaaaaaegaobaaaaaaaaaaadcaaaaak
pcaabaaaaaaaaaaaegiocaaaaaaaaaaaacaaaaaakgbkbaaaaaaaaaaaegaobaaa
aaaaaaaadcaaaaakpccabaaaaaaaaaaaegiocaaaaaaaaaaaadaaaaaapgbpbaaa
aaaaaaaaegaobaaaaaaaaaaadgaaaaafdccabaaaabaaaaaaegbabaaaabaaaaaa
doaaaaab"
}
}
Program "fp" {
SubProgram "opengl " {
Keywords { "DIRECTIONAL" "SHADOWS_OFF" "LIGHTMAP_OFF" "DIRLIGHTMAP_OFF" "DYNAMICLIGHTMAP_OFF" }
"!!GLSL"
}
SubProgram "d3d9 " {
// Stats: 15 math
Keywords { "DIRECTIONAL" "SHADOWS_OFF" "LIGHTMAP_OFF" "DIRLIGHTMAP_OFF" "DYNAMICLIGHTMAP_OFF" }
Vector 0 [_Color]
Float 2 [_node_1057]
Vector 1 [_node_4617]
"ps_3_0
def c3, 0.200000003, 10, 11, -10
def c4, 1, 0, 0, 0
dcl_texcoord v0.xy
add r0.xy, c3, v0
pow r1.x, r0.x, c2.x
pow r1.y, r0.y, c2.x
add r0.x, r1.y, r1.x
rsq r0.x, r0.x
rcp r0.x, r0.x
mad r0.x, r0.x, c3.z, c3.w
mov r1.xyz, c1
add r0.yzw, -r1.xxyz, c0.xxyz
mad oC0.xyz, r0.x, r0.yzww, c1
mov oC0.w, c4.x

"
}
SubProgram "d3d11 " {
// Stats: 9 math
Keywords { "DIRECTIONAL" "SHADOWS_OFF" "LIGHTMAP_OFF" "DIRLIGHTMAP_OFF" "DYNAMICLIGHTMAP_OFF" }
ConstBuffer "$Globals" 144
Vector 96 [_Color]
Vector 112 [_node_4617]
Float 128 [_node_1057]
BindCB "$Globals" 0
"ps_4_0
root12:aaabaaaa
eefiecedbabakleflijnjmcmgkldbcbdippjnhehabaaaaaaceacaaaaadaaaaaa
cmaaaaaaieaaaaaaliaaaaaaejfdeheofaaaaaaaacaaaaaaaiaaaaaadiaaaaaa
aaaaaaaaabaaaaaaadaaaaaaaaaaaaaaapaaaaaaeeaaaaaaaaaaaaaaaaaaaaaa
adaaaaaaabaaaaaaadadaaaafdfgfpfaepfdejfeejepeoaafeeffiedepepfcee
aaklklklepfdeheocmaaaaaaabaaaaaaaiaaaaaacaaaaaaaaaaaaaaaaaaaaaaa
adaaaaaaaaaaaaaaapaaaaaafdfgfpfegbhcghgfheaaklklfdeieefcgeabaaaa
eaaaaaaafjaaaaaafjaaaaaeegiocaaaaaaaaaaaajaaaaaagcbaaaaddcbabaaa
abaaaaaagfaaaaadpccabaaaaaaaaaaagiaaaaacabaaaaaaaaaaaaakdcaabaaa
aaaaaaaaegbabaaaabaaaaaaaceaaaaamnmmemdoaaaacaebaaaaaaaaaaaaaaaa
cpaaaaafdcaabaaaaaaaaaaaegaabaaaaaaaaaaadiaaaaaidcaabaaaaaaaaaaa
egaabaaaaaaaaaaaagiacaaaaaaaaaaaaiaaaaaabjaaaaafdcaabaaaaaaaaaaa
egaabaaaaaaaaaaaaaaaaaahbcaabaaaaaaaaaaabkaabaaaaaaaaaaaakaabaaa
aaaaaaaaelaaaaafbcaabaaaaaaaaaaaakaabaaaaaaaaaaadcaaaaajbcaabaaa
aaaaaaaaakaabaaaaaaaaaaaabeaaaaaaaaadaebabeaaaaaaaaacambaaaaaaak
ocaabaaaaaaaaaaaagijcaaaaaaaaaaaagaaaaaaagijcaiaebaaaaaaaaaaaaaa
ahaaaaaadcaaaaakhccabaaaaaaaaaaaagaabaaaaaaaaaaajgahbaaaaaaaaaaa
egiccaaaaaaaaaaaahaaaaaadgaaaaaficcabaaaaaaaaaaaabeaaaaaaaaaiadp
doaaaaab"
}
SubProgram "opengl " {
Keywords { "DIRECTIONAL" "SHADOWS_SCREEN" "LIGHTMAP_OFF" "DIRLIGHTMAP_OFF" "DYNAMICLIGHTMAP_OFF" }
"!!GLSL"
}
SubProgram "d3d9 " {
// Stats: 15 math
Keywords { "DIRECTIONAL" "SHADOWS_SCREEN" "LIGHTMAP_OFF" "DIRLIGHTMAP_OFF" "DYNAMICLIGHTMAP_OFF" }
Vector 0 [_Color]
Float 2 [_node_1057]
Vector 1 [_node_4617]
"ps_3_0
def c3, 0.200000003, 10, 11, -10
def c4, 1, 0, 0, 0
dcl_texcoord v0.xy
add r0.xy, c3, v0
pow r1.x, r0.x, c2.x
pow r1.y, r0.y, c2.x
add r0.x, r1.y, r1.x
rsq r0.x, r0.x
rcp r0.x, r0.x
mad r0.x, r0.x, c3.z, c3.w
mov r1.xyz, c1
add r0.yzw, -r1.xxyz, c0.xxyz
mad oC0.xyz, r0.x, r0.yzww, c1
mov oC0.w, c4.x

"
}
SubProgram "d3d11 " {
// Stats: 9 math
Keywords { "DIRECTIONAL" "SHADOWS_SCREEN" "LIGHTMAP_OFF" "DIRLIGHTMAP_OFF" "DYNAMICLIGHTMAP_OFF" }
ConstBuffer "$Globals" 144
Vector 96 [_Color]
Vector 112 [_node_4617]
Float 128 [_node_1057]
BindCB "$Globals" 0
"ps_4_0
root12:aaabaaaa
eefiecedbabakleflijnjmcmgkldbcbdippjnhehabaaaaaaceacaaaaadaaaaaa
cmaaaaaaieaaaaaaliaaaaaaejfdeheofaaaaaaaacaaaaaaaiaaaaaadiaaaaaa
aaaaaaaaabaaaaaaadaaaaaaaaaaaaaaapaaaaaaeeaaaaaaaaaaaaaaaaaaaaaa
adaaaaaaabaaaaaaadadaaaafdfgfpfaepfdejfeejepeoaafeeffiedepepfcee
aaklklklepfdeheocmaaaaaaabaaaaaaaiaaaaaacaaaaaaaaaaaaaaaaaaaaaaa
adaaaaaaaaaaaaaaapaaaaaafdfgfpfegbhcghgfheaaklklfdeieefcgeabaaaa
eaaaaaaafjaaaaaafjaaaaaeegiocaaaaaaaaaaaajaaaaaagcbaaaaddcbabaaa
abaaaaaagfaaaaadpccabaaaaaaaaaaagiaaaaacabaaaaaaaaaaaaakdcaabaaa
aaaaaaaaegbabaaaabaaaaaaaceaaaaamnmmemdoaaaacaebaaaaaaaaaaaaaaaa
cpaaaaafdcaabaaaaaaaaaaaegaabaaaaaaaaaaadiaaaaaidcaabaaaaaaaaaaa
egaabaaaaaaaaaaaagiacaaaaaaaaaaaaiaaaaaabjaaaaafdcaabaaaaaaaaaaa
egaabaaaaaaaaaaaaaaaaaahbcaabaaaaaaaaaaabkaabaaaaaaaaaaaakaabaaa
aaaaaaaaelaaaaafbcaabaaaaaaaaaaaakaabaaaaaaaaaaadcaaaaajbcaabaaa
aaaaaaaaakaabaaaaaaaaaaaabeaaaaaaaaadaebabeaaaaaaaaacambaaaaaaak
ocaabaaaaaaaaaaaagijcaaaaaaaaaaaagaaaaaaagijcaiaebaaaaaaaaaaaaaa
ahaaaaaadcaaaaakhccabaaaaaaaaaaaagaabaaaaaaaaaaajgahbaaaaaaaaaaa
egiccaaaaaaaaaaaahaaaaaadgaaaaaficcabaaaaaaaaaaaabeaaaaaaaaaiadp
doaaaaab"
}
}
}
}
Fallback "Diffuse"
}

Silence isn’t golden – VOIP and Unity3d

Once again I am back on the trail of a good voice solution for multi-user situations in Unity3d. Having done a good few projects using a variety of solutions, mainly an adjusted version of the now defunct USpeak (removed from the Asset Store, but replaced by Daikon Forge voice) combined with a self-hosted Photon server, I think it must be time to find a more scalable solution. Very often the pressure of voice swamps any server that is also dealing with position and data model synchronisation across the network. That makes sense: something moving a long distance only needs a few bytes of message, yet an audio stream is a constant flow of data in and out of a server.
Where's your head at
Whilst there is huge focus on VR for the eyes, being able to communicate needs more than one channel, and voice is one of those.
A voice connection can be 100 kbps per person, so without different tuning or priorities the regular multiplayer data messages, even if they are marked reliable, will get lost in the noise of an open mic.
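A rough back-of-envelope comparison makes the imbalance clear. All the figures here are illustrative assumptions, not measurements:

```python
# Back-of-envelope comparison of one open mic versus movement updates.
# All figures are illustrative assumptions, not measurements.

VOICE_KBPS = 100                                  # one open mic, as above
voice_bytes_per_sec = VOICE_KBPS * 1000 / 8       # kilobits -> bytes per second

MOVE_MSG_BYTES = 24                               # say, position + rotation
MOVES_PER_SEC = 10                                # a typical sync rate
move_bytes_per_sec = MOVE_MSG_BYTES * MOVES_PER_SEC

ratio = voice_bytes_per_sec / move_bytes_per_sec  # voice dwarfs movement traffic
```

Under these assumptions a single open mic carries around fifty times the data of one avatar's movement stream, before you multiply by the number of people talking.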
My projects up to now have ended up with push to talk, or even my scarily complicated dial/ring/answer/engaged phone simulator, in order to cut down the amount of voice. Raising a hand to speak works in many formal situations, but not in social collaboration.
So whilst we can squeeze voice packets onto the same socket server as the rest of the “stuff”, it feels like voice should be a separate service and thread.
Oddly, there is not a solution just sitting there for Unity3d. It seems that many of the existing voice clients for games and related services just don’t play well in the Unity3d/C# environment.
Unfortunately this seems to be down to the generic, cross-platform nature of Unity3d (which is a major strength) compared with the specifics of PC architecture, and in particular how to deal with shared memory and pipes.
There are a lot of voice servers out there for games. Mumble/Murmur is a major one. It is open source and appears to bolt on top of many games, even supporting 3D positional audio.
Many games are of course closed code bases, so Mumble has to piggyback on top of the game, or have a little help from the game developers to drop some hooks in. The main problem seems to be the use of shared memory versus a more closed application boundary. Obviously a shared memory space for communication can bring problems and hacks. Strangely, it was shared memory and various uses of it that let us do the early client/server work back in the 90s, scraping from a terminal screen to make a legacy CICS application work with a fancy GUI!
Initially I figured that there would be a C# client for Mumble that any of us could drop into Unity3d and away we go. However, there are only a few unfinished implementations being attempted. Connecting to Mumble and dealing with text is fine, but voice, and the various implementations of encoding, seems to be the sticking point. Mumble also has a web interface implemented through something called ICE, which is another Remote Procedure Call set of interfaces. It seems to be focussed on building with C++ or PHP; whilst ICE for .NET exists, it does not work on all platforms. I am still looking at this though, as surely if a web interface works we can get it to work in Unity3d. Of course the open source world is wide and diverse, so there is always a chance I am missing something.
If a Mumble, or similar, client can exist in a Unity3d context then we have our solution. It is nice to have the entire interface you need inside Unity3d, not side loaded or overlaid. It may be that a solution is to implement the web interfaces next door to a Unity interface and use plugin-to-webpage communication as the control, but that is fraught with errors: browsers, DOMs, script fails etc. I would rather just have a nice drop-in Mumble client hooked up to the shiny new UI interfaces.
I looked at the others in the area. TeamSpeak is one of the biggest, but it incurs licensing charges so is not very open source minded. Ventrilo and Vivox seem to have fallen away.
Then of course there are the actual phone VoIP packages and standards; maybe they offer the answer. I used FreeSWITCH on OpenSim to provide a voice channel, so maybe I need to look at those again.
Or another solution could be persuading socket server applications like Photon that they need to talk to more than one server. At the moment the assumption is one server connection (though load balancing may occur further on, on the server itself). If the client knew to talk to a differently configured machine and network that was happy to deal with the firehose of audio, while the normal movements talked to a server configured to deal with normal movements, then we might have a solution.
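The shape of that idea is simple enough to sketch: route each outgoing message to a different endpoint based on its type. The hostnames and the `Message` type here are entirely hypothetical, just to show the split:

```python
from dataclasses import dataclass

@dataclass
class Message:
    kind: str      # "voice" or "state" (movement / data-model sync)
    payload: bytes

# Hypothetical endpoints: one server tuned for the audio firehose,
# one for small, frequent state updates.
ENDPOINTS = {
    "voice": "voice.example.net:5060",
    "state": "state.example.net:5055",
}

def route(msg: Message) -> str:
    """Pick the server a message should go to based on its kind,
    defaulting to the state server for anything unrecognised."""
    return ENDPOINTS.get(msg.kind, ENDPOINTS["state"])
```

The point is that the client, not just the server, knows there are two classes of traffic with very different characteristics.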
Either way, it’s not straightforward and hasn’t been totally solved yet; or if it has, everyone is keeping quiet (yes, that’s ironic when talking about VoIP!).
Let me know if you do have a great open source, or open-sourceable, solution.

Unreal Engine – an evolution

As you may have noticed, I am a big fan of Unity3d. It has been the main go-to tool for much of my development work over the years. A while back I started to look at Unreal Engine too; it would be remiss of me to say Unity is the way I have always done things so I will stick with that 😉 My initial look at Unreal Engine, though, left me a little cold. This was a few years ago, when it started to become a little more available. As a programmer with a background in C++ I was more than happy to take on the engineering challenge, but it was almost too much being thrown at me in one go. There was a great deal of focus on the graphics side of things. It felt, at the time, more like a visual designer’s tool with code hidden away. This was different from Unity3d, which seemed to cross that workflow boundary, offering simple code and complicated graphics, or vice versa.
The new version of Unreal Engine, now fully free unless you make a lot of money building something with it (in which case you pay), seems a much friendlier beast. It has clearly taken onboard the way Unity3d does things: that initial experience, and the packaging of various types of complexity, allowing you to unwrap and get down to the things you know while not getting in the way on the things you don’t.
The launcher gives access to the tool but also acts as a hub for other information and for the marketplace. I can’t remember this from the last time I looked at Unreal, but it is very obvious now.
Epic Games Launcher
The one document that leapt out is UE4 for Unity developers. This addresses the differences and similarities between the two environments. Some of it is obviously a bit of “this is why we are better”, and in some cases not strictly correct, particularly on object composition. However, it is there and it does help. It recognises how huge Unity3d is, rather than the slightly more arrogant “we know best” stance the toolset seemed to have before. That is just a personal opinion and a feeling; it may seem odd to say of a software dev tool, but when you work with these things you have a sense of who they are for and what they want to do. Unity3d has grown from humble beginnings as an indie tool into a AAA one. Unreal Engine obviously had to start somewhere too, but it was an in-house toolkit that grew and grew making AAA titles, then burst out into the world as a product. They both influence one another, but here it is the influence of Unity3d on Unreal Engine I am focussing on.
Also on the launcher are the quickstarts, showing the roles of artist, level designer and programmer as different entry points. Another good point: talking the right language at the start.
Unity has a lot of sample projects and assets, and some great tutorials. Unreal Engine now has a set of starter projects in a new project wizard. It is easy for more experienced developers to sniff at these, but as a learning tool, or for a prototyper, being able to pick from these styles of project is very handy.
Unreal Engine Wizard
I had a number of “oh, that’s how Unreal works!” moments via these. First person, puzzle, 2D side scroller, twin stick shooter etc. are all great. Unity3d does of course have 2D or 3D as a starting point for a project, though I have always found 2D a bit strange in that environment, as I have built 2D in a 3D environment anyway.
The other interesting thing here is the idea of a C++ version or a Blueprint version. Blueprint is Unreal Engine’s visual programming environment: behaviours and object composition are described through a programming facade. A Blueprint can mix and match with C++, and shares some similarity with a Unity3d prefab, though it has more interactions described in visual composition than just exposing variables and functions/events as a prefab does. Whilst Blueprints may help people who don’t want to type C++, like many visual programming environments it is still really programming: you have to know what is changing, what is looping, what is branching etc. It is a nice feature and option, and the fact that you are not locked exclusively into that mode makes it usable.
Unreal Engine also seems happy to work on a Mac; despite much of the documentation mentioning Visual Studio and Windows, it plays well with Xcode. It has to, really, to be able to target iOS platforms. So this was another plus in the experience.
The main development environment layout is, by default, similar to Unity3d too. All this helps anyone have a look at both and see what works for them.
Unreal Engine Dev Env
I am not a total convert yet. I still need to explore the multiplayer/server side of things, and the ability to interface with other systems (which all my work ends up needing to do). But I am not quite so turned off by it now; it seems a real contender in my field. So, just like all these things, you have to give it a go and see how it feels.

Unity 4.6 – A new UI

I have always really enjoyed developing in Unity3D, as you may have noticed from my various posts and projects. Watching it evolve into such a major platform has been really interesting too. Yesterday Unity moved from version 4.5 to 4.6. Now, in most systems a minor point release is no big deal. In fact the whole patch-and-push versioning, with its lack of backwards compatibility, that bedevils the tech industry is something I have to battle with on a regular basis. A post about a change in version numbers… no thanks. Except this is the exception that proves the rule.
Unity3d is great at allowing a developer to start placing objects in an environment, attaching behaviours and code to those objects. A lot can be done with point and click, drag and drop, but there is also room for real lines of code and proper software engineering; in fact that is essential. I can, with ease, make a 3D character animated and moving around, colliding with scenery etc. Up to now, though, if I wanted to use any other form of user input with a Graphical User Interface I had to really go back in time.
GUI creation in 4.5, and in all the earlier releases, was awful. In code, in the OnGUI loop, you had to try and describe layouts and positions where much of what you put was implied. So you had to describe the user interface and its behaviour with code that merges function with display, whilst not being able to see where things are until you actually run it. This is the opposite of most of Unity3d, which lets you see where things are in the development environment both when running and when not.
I have lost track of the number of times I have tried a “fancy” layout of buttons, one that starts nicely, with flowing positions based on screen resolution, only to get lost in the code and resort to fixed x,y positions, which are generally hit and miss for the first few attempts. In Unity3d you can expose parameters to allow compound prefab objects to be configured without code changes, so I have ended up with UI helper objects that change my spacing and positions at runtime, letting me tweak the numbers to try and shuffle buttons around. Unity3d is great at letting you alter runtime values, and also great at going back to the originals when you stop running. Unfortunately this works against you on UIs: you have to tweak the values, then remember to write them down on a piece of paper before hitting stop, so that you can apply them in the code for next time once you get it ‘right’.
That may or may not make sense, but take my word for it, it was a right pain. And that is before what you then had to do to try and alter the look of the buttons, sliders etc with GUI.Skin, which is like CSS but much, much worse. So all my research projects have had plain buttons. I am not a graphic designer, and I was not easily able to explain the UI process or how to work with designers to make it nicer.
All that is now gone though. The old way still works in 4.6 (which is a relief from a phased-improvement point of view), but a new shiny UI creation tool and framework is now at last public. It has long been promised; by long I mean literally years! I am sure when I get into the depths of it there will be a few things that cause grief, but that's programming, folks!
Now the UI exists as a real type of Unity3d visual object. If you create a layout area you can see it; a button is there all the time. Each of the objects is part of a proper event system, and many of the normal functions are exposed as parameters. If you want a button to do something when pushed, you tell it in its setup which function to call.
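You can do the same wiring from code too, rather than the Inspector. A rough sketch of what that looks like with the new event system (the class and handler names here are made up for illustration):

```csharp
using UnityEngine;
using UnityEngine.UI;

// Sketch only: hooking up a 4.6 UI button from code instead of the Inspector.
public class MenuWiring : MonoBehaviour
{
    public Button continueButton; // dragged onto this field in the Inspector

    void Start()
    {
        // Register a handler with the button's click event,
        // instead of polling for presses every frame in OnGUI.
        continueButton.onClick.AddListener(OnContinuePressed);
    }

    void OnContinuePressed()
    {
        Debug.Log("Continue pressed");
    }
}
```

Either way, the behaviour lives with the button object itself, not buried in a draw loop.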
Previous UI buttons were only ever placed on the view rectangle, like a Heads Up Display (HUD). I have often needed things to be interactive elements in a 3d environment but to act like buttons. Again I have worked around this, but now a UI canvas and all its actions, rollovers etc can be detached from the HUD and placed into the environment. In my hospital environment this means I can have a better AR-style UI at each bed. I have that at the moment, but it does not have all the nice feedback and customisation a UI element can have.
The other major change is how the UI elements can be typed and hence changed. Each element can be a regular button, slider etc but they can also be a graphic or even the result of another camera rendering a live texture.
Here is the game view (though not running) of the demo UI. The continue button is highlighted, so on the right are its properties, events etc. The mini menu is showing the choices, just for that button, on how to transition on normal events such as rollover and click. Here it is set to animation, hence it uses Mecanim and can potentially go flying around the screen twirling. Simpler transitions, such as just a colour tint or a sprite swap, can also be defined.

This was possible before, but again it was a lot of messing around. The old system of what to do on a rollover etc was buried in the skin code or overridden in OnGUI. Now it makes use of the regular Mecanim animation system. This state machine allows definition of all sorts of things, and it is how we make characters jump, dive into rolls, duck etc. It makes sense to have that for more complex UI transitions and states. In multiplayer environments I use a Mecanim state change value to send over the network, to make sure any mirrored/ghosted network object acts in the same way. So I guess now I will be able to use that for UIs too, to keep players in sync with certain modal activity.
Anyway, this 4.6 and its new UI is obviously a precursor to the much bigger Unity 5.0 coming very soon. However, it has dropped at the right time for the next phase of a project, so I get to use it in anger and re-engineer something that just about works into something much better.
As I tweeted, this is a great xmas gift, thank you Unity 🙂

Kinect 2.0 and Choi Kwang Do

My Kinect 2.0 arrived this afternoon so I got straight to trying it out. The previous Kinect was less able to cope with shoulders and some of the subtler extra joints.
The new Kinect 2.0 seems to be able to cope much better. Though maybe not with the speed of a martial art like Choi Kwang Do.
However, with the basics of form it is doing a very good job just in the Kinect Studio. This enables developers to turn features on and off. As I was using it out of the box it may well have been doing more than it needed to, e.g. just focusing on the skeleton might be smoother than dealing with all the point cloud data and the ghost image.
The studio has the same thing I tried in my previous example of being able to change the view from front to side to top. The video shows these in that order. The side view is about 40 seconds in and I think is the most useful in terms of technique. We often train with mirrors or looking at another person, but seldom see ourselves side-on unless it is recorded and played back. This is a live mirror from the side view 🙂
I seem to confuse it with a twisting kick too 🙂

Now to look at specific code and try to match movements to a reference move, spotting the weight transfer etc.
Still, it looks like this might be another step towards a helpful tool for training.
Let's see how this goes. I have not seen if there is a Unity3d plugin yet, but that's next on the list.

An interesting game tech workshop in Wales

Last week I took a day out from some rather intense Unity3d development to head off to Bangor in North Wales. My fellow BCS Animation and Games Dev colleague Dr Robert Gittins invited me to keynote at a New Computer Technologies Wales event on Animation and Games 🙂
It is becoming an annual trip to similar events and it was good to catch up with David Burden of Daden Ltd again as we always both seem to be there.
As I figured that many of the people there were going to be into lots of games tech already, I did not do my usual type of presentation, well not all the way through anyway. I decided to help people understand the difference between development in a hosted virtual world like Second Life and developing from scratch with Unity3d. This made sense as we had Unity3d on the agenda, and there were also projects from Wales that were SL related, so I thought it a good overall intro.
I have written about the difference before back here in 2010 but I thought I could add a bit extra in explaining it in person and drawing on the current project(s) without sharing too much of things that are customer confidential.

Why SL development is not Unity3d development from Ian Hughes

I did of course start with a bit about Cool Stuff Collective and how we got Unity3d on kids TV back on the Halloween 2010 edition. This was the show that moved us from CITV to ITV prime Saturday morning.
I added a big slide of things to consider in development that many non-game developers and IT architects will recognise. Game tech development differs in content from a standard application, but the infrastructure is very similar. The complication is in the “do something here” boxes of game play and the specifics of real-time network interaction between clients, which is different to many client-server type applications (like the web).

After that I flipped back from tech to things like Forza 5 and in-game creation of content, Kinect and Choi Kwang Do, Project Spark and of course the Oculus Rift. I was glad I popped that in as it became a theme throughout the pitches and most people mentioned it in some way, shape or form 🙂

It was great to see all the other presentations too. They covered a lot of diverse ground.

Panagiotis Ritsos from Bangor University gave some more updates on the challenges of teaching and rehearsing language interpretation in virtual environments with EVIVA/IVY, the Second Life projects and now the investigations into Unity3d.

Llyr ap Cenydd from Bangor University shared his research on procedural animation and definitely won the prize for the best visuals, as he showed his original procedural spider and then his amazing Oculus Rift deep-sea experience with procedurally generated animations of dolphins.
Just to help in case this seems like gobbledegook: very often animations have been “recorded”, either by someone or something being filmed in a special way that captures their movements and makes them available digitally as a whole. Procedural generation instead uses sense-and-respond to the environment and the construction of the thing being animated. Things are not recorded but happen in real time, because they have to. An object can be given a push or an impulse to do something; the rest is discovered by the collection of bits that make up the animated object. It is very cool stuff!

Just before the lunch break we had Joe Robins from Unity3d, the community evangelist and long-term member of the Unity team, show us some of the new things in Unity 5 and have a general chat about Unity. He also ran a Q&A session later that afternoon. It was very useful as there is always more to learn or figure out.
We all did a bit of a panel, with quite a lot of talk about educating kids in tech and how to just let them get on with it alongside the teachers, not wait for teachers to have to become experienced programmers.
After lunch it was Pikachu time, or Pecha Kucha, whatever it is called 🙂 (http://www.pechakucha.org): 20 slides, each of 20 seconds, in a fast-fire format. It is really good; it covers lots of ground and raises lots of questions.

David Burden of Daden Ltd went first with VR: the Second Coming of Virtual Worlds, exploring the sudden rise of VR and where it fits in the social and tech adoption curves. A big subject, and of course VR is getting a lot of press, as virtual worlds did. It is all the same, but with different affordances of how to interact. They co-exist.

Andy Fawkes of Bohemia Interactive talked about the Virtual Battlespace: From Computer Game to Simulation. His company has the Arma engine that was originally used for Operation Flashpoint, and now has a spin-off with the cult classic DayZ. He talked about the sorts of simulations in the military space that are already heavily used, and how that is only going to increase. An interesting question was raised about the impact of increasingly real simulations; his opinion was that, no matter what we do currently, we all still know the difference, and that the real effects of war are drastically different. The training is about the procedures to get you through that effectively. There has been concern that drone pilots, who are in effect doing real things via a simulation, are too detached from the impact they have. Head to the office, fly a drone, go home to dinner. A serious but interesting point.

Gaz Thomas of The Game HomePage then gave a sparky talk on how to entertain 100 million people from your home office. Gaz is a budding new game developer. He has made lots of quick-fire games; not trained as a programmer, he wanted to do something on the web, set up a website, but then started building games as ways to bring people to his site. This led to some very popular games, but he found he was cloned very quickly and now tries to get the mobile and web versions released at the same time. It was very inspirational and great to see such enthusiasm and get-up-and-go.

Ralph Ferneyhough of the newly formed Quantum Soup Studios talked about The New AAA of Development: Agile, Artistic, Autonomous. This was a talk about how being small and willing to try newer things is much more possible, and needed, than the constant churn in the games industry of the sequel to the sequel of the sequel. The sums of money involved and the sizes of projects lead to stagnation. It was great to hear from someone who has been in the industry for a while branching out from corporate life. A fellow escapee, though from a different industry vertical.

Chris Payne of Games Dev North Wales gave the final talk on Hollywood vs VR: The Challenge Ahead. Chris works in the games industry and for several years has been a virtual camera expert. If you have tried to make cameras work in games, or played one where it was not quite right, you will appreciate this is a very intricate skill. He also makes films and pop videos. It was interesting to hear about the challenges that attempting to do 360 VR films is going to pose for what is a framed 2d medium. Chris showed a multi-camera picture of a sphere with lenses poking out all around it, rather like the Star Wars training drone on the Millennium Falcon that Luke tries his lightsabre with. This new camera shoots in all directions. Chris explained, though, that it was not possible to build one that was stereoscopic. The type of parallax and offsets that are needed can only really be done post filming, so a lot has to be done to make this giant 360 thing able to be interacted with in a headset like the Rift. However, that is just the start of the problems. As he pointed out, the language of cinema, the tricks of the trade, just don’t work when you can look anywhere and see anything. Sets can’t have crew behind the camera, as there is no behind the camera. Storytellers have to consider if you are in the scene, and hence acknowledged, or a floating observer; focus pulls to gain attention don’t work. Instead, game techniques to attract you to the key story elements are needed. Chris proposed that as rendering gets better it is more likely that VR movies are going to be all realtime CGI, in order to get around the physical problems of filming. It is a fascinating subject!

So it was well worth the 4am start for the 600-mile round trip, back by 10pm 🙂

It’s got the lot – metaverse development

My current project has kept me pretty busy with good old fashioned hands on development. However, sometimes it is good to step back and see just how many things a project covers. I can’t go into too much detail about what it is for but can share the sort of development coverage.

(*update 11/6/14 Just adding this picture from a later post that describes this environment)
It is a Unity3d multi-user environment with point and click. It works on the web, so it needs a socket server to broker the network communications; hence it has a Photon Server. That Photon Server is not on their cloud but on a separately hosted box with a major provider, so it needs my attention sys-admin-wise, configuring it and keeping it up to date.
The Unity3d system needs to be logged into, and to record things that have happened in the environment. So I had to build a separate set of web pages and PHP to act as the login and the API for the Unity3d web plugin to talk to. This has to live on the server of course. As soon as you have login, data, users etc, you need a set of admin screens and code to support that too.
The Unity3d system also needs voice communication as well as text chat. So that's through Photon too.
The actual Unity3d environment has both regular users and an admin user in charge, so there are lots of things flowing back and forth to keep in sync across the session and to pass to the database. All my code is in C#, though sometimes a bit of JS will slip in. We have things like animations using the animation controller, and other Unity goodies like NavMesh, in place too.
I am working with a 3d designer, so this is a multi-person project. I have therefore set up Mercurial repositories, hosting the repo on Bitbucket. We sync code and builds using Atlassian SourceTree, which is really great. I also have an error-tracking system with Atlassian, so we have a JIRA. It means when I check code in and push the repository I can specify the JIRA reference number for the issue and it appears logged on the issue log, combined with all the handy notifications to all concerned.
As I have a separate server component running, I had to set up another repository to enable me to protect and synchronise any server changes; the server has its own repository ID so it can pull the Unity3d builds to the server too.
There are complications in doing database communication, as Unity will only talk to the server that it is served from using the WWW classes. This makes local testing of multiuser a little tricky. The Unity dev environment is able to emulate the server name, but the built versions can't, so there is a lot of testing bypass code needed.
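The sort of bypass I mean looks roughly like this; the host names and class here are entirely made up for illustration, not the real project's configuration:

```csharp
using UnityEngine;

// Illustrative only: choosing the API base URL depending on where we run,
// because a built web player can only call back to the host it was served from.
public static class ApiConfig
{
    const string LiveBase = "http://example-server.com/api"; // hypothetical live host
    const string LocalBase = "http://localhost/api";          // local test stub

    public static string BaseUrl
    {
        get
        {
            // The editor can emulate being served from the live host;
            // local test builds have to fall back to a localhost stub.
            return Application.isEditor ? LiveBase : LocalBase;
        }
    }
}
```

Every WWW call then goes through ApiConfig.BaseUrl, so the bypass lives in one place rather than scattered through the code.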
Oh, I forgot to mention, this is all in Arabic too. There is nothing wrong with that, except I don't know the language. Also Arabic is a right-to-left language, so things have to be put in place to ensure that text, chat etc all flow correctly.
A few little problems arose with this. Unity has an excellent Arabic component that allows you to apply right-to-left to any output text; however, it does not work on input fields. That is a bit tricky when you need text chat, typing in questions and responses etc. So I have ended up writing a sort of new input field: I use a text label but capture the keys, pass them to the Arabic fixer component, which then returns the right-to-left version that is displayed in the label. I do of course lose things like cursor and focus, as the label is an output device, but needs must.
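In outline, the workaround is something like the sketch below. The fixer call is a stand-in for the real Arabic component's API, and the rest of the names are illustrative, so treat this as the shape of the idea rather than working project code:

```csharp
using UnityEngine;

// Rough sketch of the input workaround: capture keystrokes ourselves,
// run the buffer through the right-to-left fixer, show it in a label.
// ArabicFixer.Fix stands in for the real component's method.
public class RtlInputLabel : MonoBehaviour
{
    string buffer = "";

    void OnGUI()
    {
        Event e = Event.current;
        if (e.type == EventType.KeyDown && e.character != '\0')
        {
            if (e.character == '\b')
            {
                // Handle backspace by trimming the buffer ourselves.
                if (buffer.Length > 0)
                    buffer = buffer.Substring(0, buffer.Length - 1);
            }
            else
            {
                buffer += e.character;
            }
        }

        // Display the right-to-left fixed version; the label is output-only,
        // which is why cursor and focus are lost.
        GUI.Label(new Rect(10, 10, 300, 30), ArabicFixer.Fix(buffer));
    }
}
```

It is clunky, but it gets typed Arabic flowing the right way until input fields are supported properly.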
In order to support Arabic in HTML and in the database I had to ensure that the text encoding of everything is UTF-8. There is also a directive attribute, dir="rtl", that helps browsers know what to do with things. However, I have found that this works with HTML input fields but seems not to work with password fields: my password field will not let me type Arabic into it, and the keyboard language chooser on the Mac reverts to UK with Arabic greyed out. This caused me a lot of confusion when logging in.
There is also the confusion of what to type. It is relatively easy to cut and paste translated Arabic labels into strings, but when testing a chat system or user names I needed to know which English keystrokes generated which Arabic phrase (that's not a translation, that's a case of "how do I type something meaningful in Arabic and see it come up on the screen").
Luckily my good friend Rob Smart came to my aid with “wfhp hgodn” which equates to صباح الخير which is a variant of good morning. It helped me see where and when I was getting the correct orientation. Again this is not obvious when you start this sort of thing 🙂
Anyway, it's back to layering and continuous improvement, fixing bugs and adding function. It is pretty simple on paper, but the number of components and systems, languages and platforms that this crosses is quite full on.
The project is a 3 person one, Project manager/producer, graphic designer and me. We all provide input to the project.
So if you need any help or work doing with Unity3d, C#, Photon, HTML, PHP, MySQL, RTL languages, cloud servers, Bitbucket, Mercurial, SourceTree or JIRA, then I am more than slightly levelled up, though there is always more to learn.
Phew!

Flush 12 – Secret Agent! on whose side?

It’s time to share another article in the excellent Flush magazine. As usual my collection of words and ideas has been massively enhanced by great layout and by being in the company of other fantastic articles and features, thankyou @tweetthefashion. This issue I decided to explore the world of hacking, surveillance and counter-measures, encryption and t-shirts. The title “Secret Agent, on whose side?” will be familiar to many people, but particularly a few old colleagues where it became a bit of a mantra at work. It applied to most situations where things went wrong, usually when some sales or management over-commitment made us wonder whose side they were on as we cleaned up the mess 🙂
The recent revelations about mass government data collection may seem a shock to many, but the battle for information, and the counter-measures around it, in recent modern technology show us the trajectory this was on. This includes fictional spies like James Bond of course 🙂
I was trying to strike a balance of information, historical background that may be of interest, and moderate outrage at the escalation.
The article is on pages 108 through 115.
The direct link is here which should take you right to the page.
The embedded issue on issuu is below.


Check out the entire magazine though including the fantastic front cover. I will let you discover that 🙂

BCS Animation and Games group AGM

Last night I popped down to Southampton Solent University for our BCS Animation and Games specialist group AGM (for which I am the chairperson). More on the BCS and the group is available online. We had the usual formalities and reports to deal with at the start, but then I had to switch into performance mode to give another talk. Obviously I can't keep doing the same one and the same subjects 🙂 so I had created a new one about all the upgrades and things that have come to fruition over the past year: Xbox One, Leap, Oculus Rift, Rocksmith 2014.
I was pleased to see a very large audience: some BCS members, but also a lot of students from the games design course at Southampton Solent. With an audience that is into games and tech, it is hard not to tell everyone things they already know. However, I took a lot of demo kit with me. In particular the Oculus Rift went down a storm as an experience.
I also like to share things that happen in life, so the anecdotes are useful as they are either a vehicle to help people hear about something new, or if they know about it to relate to shared experiences.
I had a brave audience member come up and use the Rift whilst I talked about it. Getting audience participation is always good fun for everyone.
Thankyou to everyone for coming, for all the great discussion and feedback during and after the session. I feel very lucky to be able to get such a buzz out of enthusing and sharing the tech that is part of my life.
As usual the pitch was more show and tell than slides but here are the slides anyway.
The videos are replaced with links to blog articles with videos embedded too.

Bcs Review 2013 tech in 2014 from Ian Hughes