

Shady Maths for Shaders

I like a good tech challenge, and I decided to look a bit more into the world of shaders. I had not fully appreciated how many competing languages there are for these low-level compute systems. I had used shaders in Unity3d and I knew we could do fancy things with them. They are wonderfully complicated and full of maths. As with all code you can go native and just write it directly, but there are also tools to help. Things like Shadertoy let you see some of the fun that can be had. It reminds me of hacking with the Copper coprocessor in the old Amiga days: low-level graphics manipulation, direct to the pipeline.
In Unity3d there is a tool I bought a while back called ShaderForge. It allows for editing shaders in a visual way: each little node shows the result of the maths that is going on in its own area. This sort of material editing is common in 3D applications. There is a lot of maths available and I am only just skimming the surface of what can be done.
I was trying to create a realistic wood-ring shader. I wanted to do something like the effect demonstrated here in normal code (not in any of the shader languages).
I ended up with something that looked like this.
ShaderForge Unity3d
It was nearly the concentric rings, but I can only get them to start from the bottom-left corner. I have yet to work out which number I can use as an offset to get full circles. I have worked out where to put in variance and noise to make the lines wobble a little. So I am a little stuck; if anyone has any suggestions I would be very grateful 🙂 I want a random shader texture, slightly different each time, which is why I am not using images as I normally do. I am not worried about the colour at the moment, BTW; that is a function of scaling the mapping in the sin sweeping function that creates the ripples. I stuck in a few extra value modifiers (some set to 0) to see if I could tweak the shader to do what I wanted, but no luck yet. ShaderForge has its own metadata, but the whole thing can be compiled into a full shader in native code.
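For reference, the usual maths for centred concentric rings is the distance from the middle of the UV square, rather than from a corner, fed through sin. A rough Python sketch of that idea (the centre and frequency values here are illustrative, not my actual node settings):

```python
import math

def ring_value(u, v, centre=(0.5, 0.5), frequency=40.0):
    """Concentric-ring intensity for a UV coordinate in [0,1]^2.

    Distance from the centre gives circles; sin turns that distance
    into repeating rings; the result is remapped into 0..1.
    """
    dist = math.hypot(u - centre[0], v - centre[1])
    return 0.5 + 0.5 * math.sin(frequency * dist)

# Points at equal distance from the centre get the same value,
# which is what makes the rings circular rather than corner-anchored.
a = ring_value(0.5, 0.7)   # 0.2 above the centre
b = ring_value(0.7, 0.5)   # 0.2 to the right of the centre
```

Adding a little noise to `dist` before the sin is where the wobble would go.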
Just look at what it generates; it's fairly full-on code.
So it looks like I have quite a lot of learning to do. With code there is always more: always another language or format for something 🙂

***Update*** Yay for the internet. Dickie replied to this post very quickly with an image of a much simpler way to generate the concentric rings. This has massively added to my understanding of the process and is very much appreciated.


// Compiled shader for Web Player, uncompressed size: 14.0KB

// Skipping shader variants that would not be included into build of current scene.

Shader "Shader Forge/wood3" {
Properties {
_Color ("Color", Color) = (0.727941,0.116656,0.0856401,1)
_node_4617 ("node_4617", Color) = (0.433823,0.121051,0.0287089,1)
_node_1057 ("node_1057", Float) = 0.2
}
SubShader {
Tags { "RenderType"="Opaque" }

// Stats for Vertex shader:
// d3d11 : 4 math
// d3d9 : 5 math
// opengl : 10 math
// Stats for Fragment shader:
// d3d11 : 9 math
// d3d9 : 15 math
Pass {
Name "FORWARD"
Tags { "LIGHTMODE"="ForwardBase" "SHADOWSUPPORT"="true" "RenderType"="Opaque" }
GpuProgramID 47164
Program "vp" {
SubProgram "opengl " {
// Stats: 10 math
Keywords { "DIRECTIONAL" "SHADOWS_OFF" "LIGHTMAP_OFF" "DIRLIGHTMAP_OFF" "DYNAMICLIGHTMAP_OFF" }
"!!GLSL#version 120

#ifdef VERTEX

varying vec2 xlv_TEXCOORD0;
void main ()
{
gl_Position = (gl_ModelViewProjectionMatrix * gl_Vertex);
xlv_TEXCOORD0 = gl_MultiTexCoord0.xy;
}

#endif
#ifdef FRAGMENT
uniform vec4 _Color;
uniform vec4 _node_4617;
uniform float _node_1057;
varying vec2 xlv_TEXCOORD0;
void main ()
{
vec4 tmpvar_1;
tmpvar_1.w = 1.0;
tmpvar_1.xyz = mix (_node_4617.xyz, _Color.xyz, vec3(((
sqrt((pow ((xlv_TEXCOORD0.x + 0.2), _node_1057) + pow ((xlv_TEXCOORD0.y + 10.0), _node_1057)))
* 11.0) + -10.0)));
gl_FragData[0] = tmpvar_1;
}

#endif
"
}
SubProgram "d3d9 " {
// Stats: 5 math
Keywords { "DIRECTIONAL" "SHADOWS_OFF" "LIGHTMAP_OFF" "DIRLIGHTMAP_OFF" "DYNAMICLIGHTMAP_OFF" }
Bind "vertex" Vertex
Bind "texcoord" TexCoord0
Matrix 0 [glstate_matrix_mvp]
"vs_3_0
dcl_position v0
dcl_texcoord v1
dcl_position o0
dcl_texcoord o1.xy
dp4 o0.x, c0, v0
dp4 o0.y, c1, v0
dp4 o0.z, c2, v0
dp4 o0.w, c3, v0
mov o1.xy, v1

"
}
SubProgram "d3d11 " {
// Stats: 4 math
Keywords { "DIRECTIONAL" "SHADOWS_OFF" "LIGHTMAP_OFF" "DIRLIGHTMAP_OFF" "DYNAMICLIGHTMAP_OFF" }
Bind "vertex" Vertex
Bind "texcoord" TexCoord0
ConstBuffer "UnityPerDraw" 336
Matrix 0 [glstate_matrix_mvp]
BindCB "UnityPerDraw" 0
"vs_4_0
root12:aaabaaaa
eefiecedaffpdldohodkdgpagjklpapmmnbhcfmlabaaaaaaoeabaaaaadaaaaaa
cmaaaaaaiaaaaaaaniaaaaaaejfdeheoemaaaaaaacaaaaaaaiaaaaaadiaaaaaa
aaaaaaaaaaaaaaaaadaaaaaaaaaaaaaaapapaaaaebaaaaaaaaaaaaaaaaaaaaaa
adaaaaaaabaaaaaaadadaaaafaepfdejfeejepeoaafeeffiedepepfceeaaklkl
epfdeheofaaaaaaaacaaaaaaaiaaaaaadiaaaaaaaaaaaaaaabaaaaaaadaaaaaa
aaaaaaaaapaaaaaaeeaaaaaaaaaaaaaaaaaaaaaaadaaaaaaabaaaaaaadamaaaa
fdfgfpfaepfdejfeejepeoaafeeffiedepepfceeaaklklklfdeieefcaeabaaaa
eaaaabaaebaaaaaafjaaaaaeegiocaaaaaaaaaaaaeaaaaaafpaaaaadpcbabaaa
aaaaaaaafpaaaaaddcbabaaaabaaaaaaghaaaaaepccabaaaaaaaaaaaabaaaaaa
gfaaaaaddccabaaaabaaaaaagiaaaaacabaaaaaadiaaaaaipcaabaaaaaaaaaaa
fgbfbaaaaaaaaaaaegiocaaaaaaaaaaaabaaaaaadcaaaaakpcaabaaaaaaaaaaa
egiocaaaaaaaaaaaaaaaaaaaagbabaaaaaaaaaaaegaobaaaaaaaaaaadcaaaaak
pcaabaaaaaaaaaaaegiocaaaaaaaaaaaacaaaaaakgbkbaaaaaaaaaaaegaobaaa
aaaaaaaadcaaaaakpccabaaaaaaaaaaaegiocaaaaaaaaaaaadaaaaaapgbpbaaa
aaaaaaaaegaobaaaaaaaaaaadgaaaaafdccabaaaabaaaaaaegbabaaaabaaaaaa
doaaaaab"
}
SubProgram "opengl " {
// Stats: 10 math
Keywords { "DIRECTIONAL" "SHADOWS_SCREEN" "LIGHTMAP_OFF" "DIRLIGHTMAP_OFF" "DYNAMICLIGHTMAP_OFF" }
"!!GLSL#version 120

#ifdef VERTEX

varying vec2 xlv_TEXCOORD0;
void main ()
{
gl_Position = (gl_ModelViewProjectionMatrix * gl_Vertex);
xlv_TEXCOORD0 = gl_MultiTexCoord0.xy;
}

#endif
#ifdef FRAGMENT
uniform vec4 _Color;
uniform vec4 _node_4617;
uniform float _node_1057;
varying vec2 xlv_TEXCOORD0;
void main ()
{
vec4 tmpvar_1;
tmpvar_1.w = 1.0;
tmpvar_1.xyz = mix (_node_4617.xyz, _Color.xyz, vec3(((
sqrt((pow ((xlv_TEXCOORD0.x + 0.2), _node_1057) + pow ((xlv_TEXCOORD0.y + 10.0), _node_1057)))
* 11.0) + -10.0)));
gl_FragData[0] = tmpvar_1;
}

#endif
"
}
SubProgram "d3d9 " {
// Stats: 5 math
Keywords { "DIRECTIONAL" "SHADOWS_SCREEN" "LIGHTMAP_OFF" "DIRLIGHTMAP_OFF" "DYNAMICLIGHTMAP_OFF" }
Bind "vertex" Vertex
Bind "texcoord" TexCoord0
Matrix 0 [glstate_matrix_mvp]
"vs_3_0
dcl_position v0
dcl_texcoord v1
dcl_position o0
dcl_texcoord o1.xy
dp4 o0.x, c0, v0
dp4 o0.y, c1, v0
dp4 o0.z, c2, v0
dp4 o0.w, c3, v0
mov o1.xy, v1

"
}
SubProgram "d3d11 " {
// Stats: 4 math
Keywords { "DIRECTIONAL" "SHADOWS_SCREEN" "LIGHTMAP_OFF" "DIRLIGHTMAP_OFF" "DYNAMICLIGHTMAP_OFF" }
Bind "vertex" Vertex
Bind "texcoord" TexCoord0
ConstBuffer "UnityPerDraw" 336
Matrix 0 [glstate_matrix_mvp]
BindCB "UnityPerDraw" 0
"vs_4_0
root12:aaabaaaa
eefiecedaffpdldohodkdgpagjklpapmmnbhcfmlabaaaaaaoeabaaaaadaaaaaa
cmaaaaaaiaaaaaaaniaaaaaaejfdeheoemaaaaaaacaaaaaaaiaaaaaadiaaaaaa
aaaaaaaaaaaaaaaaadaaaaaaaaaaaaaaapapaaaaebaaaaaaaaaaaaaaaaaaaaaa
adaaaaaaabaaaaaaadadaaaafaepfdejfeejepeoaafeeffiedepepfceeaaklkl
epfdeheofaaaaaaaacaaaaaaaiaaaaaadiaaaaaaaaaaaaaaabaaaaaaadaaaaaa
aaaaaaaaapaaaaaaeeaaaaaaaaaaaaaaaaaaaaaaadaaaaaaabaaaaaaadamaaaa
fdfgfpfaepfdejfeejepeoaafeeffiedepepfceeaaklklklfdeieefcaeabaaaa
eaaaabaaebaaaaaafjaaaaaeegiocaaaaaaaaaaaaeaaaaaafpaaaaadpcbabaaa
aaaaaaaafpaaaaaddcbabaaaabaaaaaaghaaaaaepccabaaaaaaaaaaaabaaaaaa
gfaaaaaddccabaaaabaaaaaagiaaaaacabaaaaaadiaaaaaipcaabaaaaaaaaaaa
fgbfbaaaaaaaaaaaegiocaaaaaaaaaaaabaaaaaadcaaaaakpcaabaaaaaaaaaaa
egiocaaaaaaaaaaaaaaaaaaaagbabaaaaaaaaaaaegaobaaaaaaaaaaadcaaaaak
pcaabaaaaaaaaaaaegiocaaaaaaaaaaaacaaaaaakgbkbaaaaaaaaaaaegaobaaa
aaaaaaaadcaaaaakpccabaaaaaaaaaaaegiocaaaaaaaaaaaadaaaaaapgbpbaaa
aaaaaaaaegaobaaaaaaaaaaadgaaaaafdccabaaaabaaaaaaegbabaaaabaaaaaa
doaaaaab"
}
SubProgram "opengl " {
// Stats: 10 math
Keywords { "DIRECTIONAL" "SHADOWS_OFF" "LIGHTMAP_OFF" "DIRLIGHTMAP_OFF" "DYNAMICLIGHTMAP_OFF" "VERTEXLIGHT_ON" }
"!!GLSL#version 120

#ifdef VERTEX

varying vec2 xlv_TEXCOORD0;
void main ()
{
gl_Position = (gl_ModelViewProjectionMatrix * gl_Vertex);
xlv_TEXCOORD0 = gl_MultiTexCoord0.xy;
}

#endif
#ifdef FRAGMENT
uniform vec4 _Color;
uniform vec4 _node_4617;
uniform float _node_1057;
varying vec2 xlv_TEXCOORD0;
void main ()
{
vec4 tmpvar_1;
tmpvar_1.w = 1.0;
tmpvar_1.xyz = mix (_node_4617.xyz, _Color.xyz, vec3(((
sqrt((pow ((xlv_TEXCOORD0.x + 0.2), _node_1057) + pow ((xlv_TEXCOORD0.y + 10.0), _node_1057)))
* 11.0) + -10.0)));
gl_FragData[0] = tmpvar_1;
}

#endif
"
}
SubProgram "d3d9 " {
// Stats: 5 math
Keywords { "DIRECTIONAL" "SHADOWS_OFF" "LIGHTMAP_OFF" "DIRLIGHTMAP_OFF" "DYNAMICLIGHTMAP_OFF" "VERTEXLIGHT_ON" }
Bind "vertex" Vertex
Bind "texcoord" TexCoord0
Matrix 0 [glstate_matrix_mvp]
"vs_3_0
dcl_position v0
dcl_texcoord v1
dcl_position o0
dcl_texcoord o1.xy
dp4 o0.x, c0, v0
dp4 o0.y, c1, v0
dp4 o0.z, c2, v0
dp4 o0.w, c3, v0
mov o1.xy, v1

"
}
SubProgram "d3d11 " {
// Stats: 4 math
Keywords { "DIRECTIONAL" "SHADOWS_OFF" "LIGHTMAP_OFF" "DIRLIGHTMAP_OFF" "DYNAMICLIGHTMAP_OFF" "VERTEXLIGHT_ON" }
Bind "vertex" Vertex
Bind "texcoord" TexCoord0
ConstBuffer "UnityPerDraw" 336
Matrix 0 [glstate_matrix_mvp]
BindCB "UnityPerDraw" 0
"vs_4_0
root12:aaabaaaa
eefiecedaffpdldohodkdgpagjklpapmmnbhcfmlabaaaaaaoeabaaaaadaaaaaa
cmaaaaaaiaaaaaaaniaaaaaaejfdeheoemaaaaaaacaaaaaaaiaaaaaadiaaaaaa
aaaaaaaaaaaaaaaaadaaaaaaaaaaaaaaapapaaaaebaaaaaaaaaaaaaaaaaaaaaa
adaaaaaaabaaaaaaadadaaaafaepfdejfeejepeoaafeeffiedepepfceeaaklkl
epfdeheofaaaaaaaacaaaaaaaiaaaaaadiaaaaaaaaaaaaaaabaaaaaaadaaaaaa
aaaaaaaaapaaaaaaeeaaaaaaaaaaaaaaaaaaaaaaadaaaaaaabaaaaaaadamaaaa
fdfgfpfaepfdejfeejepeoaafeeffiedepepfceeaaklklklfdeieefcaeabaaaa
eaaaabaaebaaaaaafjaaaaaeegiocaaaaaaaaaaaaeaaaaaafpaaaaadpcbabaaa
aaaaaaaafpaaaaaddcbabaaaabaaaaaaghaaaaaepccabaaaaaaaaaaaabaaaaaa
gfaaaaaddccabaaaabaaaaaagiaaaaacabaaaaaadiaaaaaipcaabaaaaaaaaaaa
fgbfbaaaaaaaaaaaegiocaaaaaaaaaaaabaaaaaadcaaaaakpcaabaaaaaaaaaaa
egiocaaaaaaaaaaaaaaaaaaaagbabaaaaaaaaaaaegaobaaaaaaaaaaadcaaaaak
pcaabaaaaaaaaaaaegiocaaaaaaaaaaaacaaaaaakgbkbaaaaaaaaaaaegaobaaa
aaaaaaaadcaaaaakpccabaaaaaaaaaaaegiocaaaaaaaaaaaadaaaaaapgbpbaaa
aaaaaaaaegaobaaaaaaaaaaadgaaaaafdccabaaaabaaaaaaegbabaaaabaaaaaa
doaaaaab"
}
SubProgram "opengl " {
// Stats: 10 math
Keywords { "DIRECTIONAL" "SHADOWS_SCREEN" "LIGHTMAP_OFF" "DIRLIGHTMAP_OFF" "DYNAMICLIGHTMAP_OFF" "VERTEXLIGHT_ON" }
"!!GLSL#version 120

#ifdef VERTEX

varying vec2 xlv_TEXCOORD0;
void main ()
{
gl_Position = (gl_ModelViewProjectionMatrix * gl_Vertex);
xlv_TEXCOORD0 = gl_MultiTexCoord0.xy;
}

#endif
#ifdef FRAGMENT
uniform vec4 _Color;
uniform vec4 _node_4617;
uniform float _node_1057;
varying vec2 xlv_TEXCOORD0;
void main ()
{
vec4 tmpvar_1;
tmpvar_1.w = 1.0;
tmpvar_1.xyz = mix (_node_4617.xyz, _Color.xyz, vec3(((
sqrt((pow ((xlv_TEXCOORD0.x + 0.2), _node_1057) + pow ((xlv_TEXCOORD0.y + 10.0), _node_1057)))
* 11.0) + -10.0)));
gl_FragData[0] = tmpvar_1;
}

#endif
"
}
SubProgram "d3d9 " {
// Stats: 5 math
Keywords { "DIRECTIONAL" "SHADOWS_SCREEN" "LIGHTMAP_OFF" "DIRLIGHTMAP_OFF" "DYNAMICLIGHTMAP_OFF" "VERTEXLIGHT_ON" }
Bind "vertex" Vertex
Bind "texcoord" TexCoord0
Matrix 0 [glstate_matrix_mvp]
"vs_3_0
dcl_position v0
dcl_texcoord v1
dcl_position o0
dcl_texcoord o1.xy
dp4 o0.x, c0, v0
dp4 o0.y, c1, v0
dp4 o0.z, c2, v0
dp4 o0.w, c3, v0
mov o1.xy, v1

"
}
SubProgram "d3d11 " {
// Stats: 4 math
Keywords { "DIRECTIONAL" "SHADOWS_SCREEN" "LIGHTMAP_OFF" "DIRLIGHTMAP_OFF" "DYNAMICLIGHTMAP_OFF" "VERTEXLIGHT_ON" }
Bind "vertex" Vertex
Bind "texcoord" TexCoord0
ConstBuffer "UnityPerDraw" 336
Matrix 0 [glstate_matrix_mvp]
BindCB "UnityPerDraw" 0
"vs_4_0
root12:aaabaaaa
eefiecedaffpdldohodkdgpagjklpapmmnbhcfmlabaaaaaaoeabaaaaadaaaaaa
cmaaaaaaiaaaaaaaniaaaaaaejfdeheoemaaaaaaacaaaaaaaiaaaaaadiaaaaaa
aaaaaaaaaaaaaaaaadaaaaaaaaaaaaaaapapaaaaebaaaaaaaaaaaaaaaaaaaaaa
adaaaaaaabaaaaaaadadaaaafaepfdejfeejepeoaafeeffiedepepfceeaaklkl
epfdeheofaaaaaaaacaaaaaaaiaaaaaadiaaaaaaaaaaaaaaabaaaaaaadaaaaaa
aaaaaaaaapaaaaaaeeaaaaaaaaaaaaaaaaaaaaaaadaaaaaaabaaaaaaadamaaaa
fdfgfpfaepfdejfeejepeoaafeeffiedepepfceeaaklklklfdeieefcaeabaaaa
eaaaabaaebaaaaaafjaaaaaeegiocaaaaaaaaaaaaeaaaaaafpaaaaadpcbabaaa
aaaaaaaafpaaaaaddcbabaaaabaaaaaaghaaaaaepccabaaaaaaaaaaaabaaaaaa
gfaaaaaddccabaaaabaaaaaagiaaaaacabaaaaaadiaaaaaipcaabaaaaaaaaaaa
fgbfbaaaaaaaaaaaegiocaaaaaaaaaaaabaaaaaadcaaaaakpcaabaaaaaaaaaaa
egiocaaaaaaaaaaaaaaaaaaaagbabaaaaaaaaaaaegaobaaaaaaaaaaadcaaaaak
pcaabaaaaaaaaaaaegiocaaaaaaaaaaaacaaaaaakgbkbaaaaaaaaaaaegaobaaa
aaaaaaaadcaaaaakpccabaaaaaaaaaaaegiocaaaaaaaaaaaadaaaaaapgbpbaaa
aaaaaaaaegaobaaaaaaaaaaadgaaaaafdccabaaaabaaaaaaegbabaaaabaaaaaa
doaaaaab"
}
}
Program "fp" {
SubProgram "opengl " {
Keywords { "DIRECTIONAL" "SHADOWS_OFF" "LIGHTMAP_OFF" "DIRLIGHTMAP_OFF" "DYNAMICLIGHTMAP_OFF" }
"!!GLSL"
}
SubProgram "d3d9 " {
// Stats: 15 math
Keywords { "DIRECTIONAL" "SHADOWS_OFF" "LIGHTMAP_OFF" "DIRLIGHTMAP_OFF" "DYNAMICLIGHTMAP_OFF" }
Vector 0 [_Color]
Float 2 [_node_1057]
Vector 1 [_node_4617]
"ps_3_0
def c3, 0.200000003, 10, 11, -10
def c4, 1, 0, 0, 0
dcl_texcoord v0.xy
add r0.xy, c3, v0
pow r1.x, r0.x, c2.x
pow r1.y, r0.y, c2.x
add r0.x, r1.y, r1.x
rsq r0.x, r0.x
rcp r0.x, r0.x
mad r0.x, r0.x, c3.z, c3.w
mov r1.xyz, c1
add r0.yzw, -r1.xxyz, c0.xxyz
mad oC0.xyz, r0.x, r0.yzww, c1
mov oC0.w, c4.x

"
}
SubProgram "d3d11 " {
// Stats: 9 math
Keywords { "DIRECTIONAL" "SHADOWS_OFF" "LIGHTMAP_OFF" "DIRLIGHTMAP_OFF" "DYNAMICLIGHTMAP_OFF" }
ConstBuffer "$Globals" 144
Vector 96 [_Color]
Vector 112 [_node_4617]
Float 128 [_node_1057]
BindCB "$Globals" 0
"ps_4_0
root12:aaabaaaa
eefiecedbabakleflijnjmcmgkldbcbdippjnhehabaaaaaaceacaaaaadaaaaaa
cmaaaaaaieaaaaaaliaaaaaaejfdeheofaaaaaaaacaaaaaaaiaaaaaadiaaaaaa
aaaaaaaaabaaaaaaadaaaaaaaaaaaaaaapaaaaaaeeaaaaaaaaaaaaaaaaaaaaaa
adaaaaaaabaaaaaaadadaaaafdfgfpfaepfdejfeejepeoaafeeffiedepepfcee
aaklklklepfdeheocmaaaaaaabaaaaaaaiaaaaaacaaaaaaaaaaaaaaaaaaaaaaa
adaaaaaaaaaaaaaaapaaaaaafdfgfpfegbhcghgfheaaklklfdeieefcgeabaaaa
eaaaaaaafjaaaaaafjaaaaaeegiocaaaaaaaaaaaajaaaaaagcbaaaaddcbabaaa
abaaaaaagfaaaaadpccabaaaaaaaaaaagiaaaaacabaaaaaaaaaaaaakdcaabaaa
aaaaaaaaegbabaaaabaaaaaaaceaaaaamnmmemdoaaaacaebaaaaaaaaaaaaaaaa
cpaaaaafdcaabaaaaaaaaaaaegaabaaaaaaaaaaadiaaaaaidcaabaaaaaaaaaaa
egaabaaaaaaaaaaaagiacaaaaaaaaaaaaiaaaaaabjaaaaafdcaabaaaaaaaaaaa
egaabaaaaaaaaaaaaaaaaaahbcaabaaaaaaaaaaabkaabaaaaaaaaaaaakaabaaa
aaaaaaaaelaaaaafbcaabaaaaaaaaaaaakaabaaaaaaaaaaadcaaaaajbcaabaaa
aaaaaaaaakaabaaaaaaaaaaaabeaaaaaaaaadaebabeaaaaaaaaacambaaaaaaak
ocaabaaaaaaaaaaaagijcaaaaaaaaaaaagaaaaaaagijcaiaebaaaaaaaaaaaaaa
ahaaaaaadcaaaaakhccabaaaaaaaaaaaagaabaaaaaaaaaaajgahbaaaaaaaaaaa
egiccaaaaaaaaaaaahaaaaaadgaaaaaficcabaaaaaaaaaaaabeaaaaaaaaaiadp
doaaaaab"
}
SubProgram "opengl " {
Keywords { "DIRECTIONAL" "SHADOWS_SCREEN" "LIGHTMAP_OFF" "DIRLIGHTMAP_OFF" "DYNAMICLIGHTMAP_OFF" }
"!!GLSL"
}
SubProgram "d3d9 " {
// Stats: 15 math
Keywords { "DIRECTIONAL" "SHADOWS_SCREEN" "LIGHTMAP_OFF" "DIRLIGHTMAP_OFF" "DYNAMICLIGHTMAP_OFF" }
Vector 0 [_Color]
Float 2 [_node_1057]
Vector 1 [_node_4617]
"ps_3_0
def c3, 0.200000003, 10, 11, -10
def c4, 1, 0, 0, 0
dcl_texcoord v0.xy
add r0.xy, c3, v0
pow r1.x, r0.x, c2.x
pow r1.y, r0.y, c2.x
add r0.x, r1.y, r1.x
rsq r0.x, r0.x
rcp r0.x, r0.x
mad r0.x, r0.x, c3.z, c3.w
mov r1.xyz, c1
add r0.yzw, -r1.xxyz, c0.xxyz
mad oC0.xyz, r0.x, r0.yzww, c1
mov oC0.w, c4.x

"
}
SubProgram "d3d11 " {
// Stats: 9 math
Keywords { "DIRECTIONAL" "SHADOWS_SCREEN" "LIGHTMAP_OFF" "DIRLIGHTMAP_OFF" "DYNAMICLIGHTMAP_OFF" }
ConstBuffer "$Globals" 144
Vector 96 [_Color]
Vector 112 [_node_4617]
Float 128 [_node_1057]
BindCB "$Globals" 0
"ps_4_0
root12:aaabaaaa
eefiecedbabakleflijnjmcmgkldbcbdippjnhehabaaaaaaceacaaaaadaaaaaa
cmaaaaaaieaaaaaaliaaaaaaejfdeheofaaaaaaaacaaaaaaaiaaaaaadiaaaaaa
aaaaaaaaabaaaaaaadaaaaaaaaaaaaaaapaaaaaaeeaaaaaaaaaaaaaaaaaaaaaa
adaaaaaaabaaaaaaadadaaaafdfgfpfaepfdejfeejepeoaafeeffiedepepfcee
aaklklklepfdeheocmaaaaaaabaaaaaaaiaaaaaacaaaaaaaaaaaaaaaaaaaaaaa
adaaaaaaaaaaaaaaapaaaaaafdfgfpfegbhcghgfheaaklklfdeieefcgeabaaaa
eaaaaaaafjaaaaaafjaaaaaeegiocaaaaaaaaaaaajaaaaaagcbaaaaddcbabaaa
abaaaaaagfaaaaadpccabaaaaaaaaaaagiaaaaacabaaaaaaaaaaaaakdcaabaaa
aaaaaaaaegbabaaaabaaaaaaaceaaaaamnmmemdoaaaacaebaaaaaaaaaaaaaaaa
cpaaaaafdcaabaaaaaaaaaaaegaabaaaaaaaaaaadiaaaaaidcaabaaaaaaaaaaa
egaabaaaaaaaaaaaagiacaaaaaaaaaaaaiaaaaaabjaaaaafdcaabaaaaaaaaaaa
egaabaaaaaaaaaaaaaaaaaahbcaabaaaaaaaaaaabkaabaaaaaaaaaaaakaabaaa
aaaaaaaaelaaaaafbcaabaaaaaaaaaaaakaabaaaaaaaaaaadcaaaaajbcaabaaa
aaaaaaaaakaabaaaaaaaaaaaabeaaaaaaaaadaebabeaaaaaaaaacambaaaaaaak
ocaabaaaaaaaaaaaagijcaaaaaaaaaaaagaaaaaaagijcaiaebaaaaaaaaaaaaaa
ahaaaaaadcaaaaakhccabaaaaaaaaaaaagaabaaaaaaaaaaajgahbaaaaaaaaaaa
egiccaaaaaaaaaaaahaaaaaadgaaaaaficcabaaaaaaaaaaaabeaaaaaaaaaiadp
doaaaaab"
}
}
}
}
Fallback "Diffuse"
}

It’s got the lot – metaverse development

My current project has kept me pretty busy with good old-fashioned hands-on development. However, sometimes it is good to step back and see just how many things a project covers. I can’t go into too much detail about what it is for, but I can share the sort of development coverage.

(*update 11/6/14 Just adding this picture from a later post that describes this environment)
It is a Unity3d multi-user environment with point and click. It works on the web, so it needs a socket server to broker the network communications; so it has a Photon Server. That Photon Server is not on their cloud but on a separately hosted box with a major provider, so it needs my attention sys-admin wise, configuring it and keeping it up to date.
The Unity3d system needs to be logged into and to record things that have happened in the environment, so I had to build a separate set of web pages and PHP to act as the login and the API for the Unity3d web plugin to talk to. This has to live on the server of course. As soon as you have login, data, users etc. you need a set of admin screens and code to support that too.
The Unity3d system also needs voice communication as well as text chat, so that goes through Photon too.
The actual Unity3d environment has both regular users and an admin user in charge, so there are lots of things flowing back and forth to keep in sync across the session and to pass to the database. All my code is in C#, though sometimes a bit of JS will slip in. We have things like animations using the animation controller and other Unity goodies like NavMesh in place too.
I am working with a 3D designer, so this is a multi-person project. I have had to set up Mercurial repositories, hosting the repo on Bitbucket. We sync code and builds using Atlassian SourceTree, which is really great. I also have an issue-tracking system with Atlassian, so we have a JIRA. It means that when I check code in and push the repository I can specify the JIRA reference number for the issue and it appears logged on the issue log, combined with all the handy notifications to everyone concerned.
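That convention just means the JIRA issue key appears somewhere in the commit message. A tiny sketch of the idea in Python (the key format check and the PROJ-42 example are illustrative, not from the actual project):

```python
import re

# Matches a JIRA-style issue key such as "PROJ-42" anywhere in the message.
JIRA_KEY = re.compile(r"\b[A-Z][A-Z0-9]+-\d+\b")

def has_issue_key(commit_message: str) -> bool:
    """True if the commit message references a JIRA issue key."""
    return JIRA_KEY.search(commit_message) is not None

has_issue_key("PROJ-42 fix sync of admin state")   # True: links to the issue
has_issue_key("fix sync of admin state")           # False: no key, no link
```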
As I have a separate server component running, I had to set up another repository to let me protect and synchronise any server changes; the server has its own repository ID so it can pull the Unity3d builds to the server too.
There are complications in doing database communication, as Unity (through the WWW classes) will only talk to the server it is served from, which makes local testing of multi-user a little tricky. The Unity dev environment is able to emulate the server name but the built versions can’t, so there is a lot of testing-bypass code needed.
Oh, I forgot to mention: this is all in Arabic too. There is nothing wrong with that, except I don’t know the language. Also, Arabic is a right-to-left language, so things have to be put in place to ensure that text, chat etc. all flow correctly.
A few little problems arose with this. Unity has an excellent Arabic component that allows you to apply right-to-left to any output text; however, it does not work on input fields. That is a bit tricky when you need text chat, typing in questions and responses etc. So I have ended up writing a sort of new input field: I use a text label but capture the keys and pass them to the Arabic fixer component, which then returns the right-to-left version that is displayed in the label. I do of course lose things like the cursor and focus, as the label is an output device, but needs must.
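The logic of that workaround can be sketched like this in Python (the real code is Unity C# and leans on the ArabicFixer component, which also handles the contextual letter joining this sketch ignores):

```python
def rtl_display(typed_chars):
    """Reverse logically-ordered keystrokes for visual RTL display.

    typed_chars: characters in the order the user pressed them.
    Returns the string to paint into the (output-only) label.
    Real Arabic rendering also needs contextual glyph shaping,
    which a component like ArabicFixer handles.
    """
    return "".join(reversed(typed_chars))

buffer = []
for ch in "abc":           # stand-in keystrokes, captured one by one
    buffer.append(ch)
label_text = rtl_display(buffer)   # "cba": last-typed character leftmost
```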
In order to support Arabic in HTML and in the database I had to ensure that the text encoding of everything is UTF-8. There is also a directive attribute, dir=rtl, that helps browsers know what to do with things. However, I have found that while this works with HTML input fields it seems not to work with password fields: my password field will not let me type Arabic into it, and the keyboard language chooser on the Mac reverts to UK with Arabic greyed out. This caused me a lot of confusion when logging in.
There is also the confusion of what to type. It is relatively easy to cut and paste translated Arabic labels into strings, but when testing a chat system or user names I needed to know which English keystrokes generated which Arabic phrase (that’s not a translation, that’s a “how do I type something meaningful in Arabic and see it come up on the screen”).
Luckily my good friend Rob Smart came to my aid with “wfhp hgodn”, which equates to صباح الخير, a variant of good morning. It helped me see where and when I was getting the correct orientation. Again, this is not obvious when you start this sort of thing 🙂
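The UTF-8 side is easy to sanity-check with the same phrase; in Python, for example:

```python
phrase = "صباح الخير"  # "good morning", as in the post

# Arabic letters take two bytes each in UTF-8, so the encoded form
# is longer than the character count; a round trip must restore it.
encoded = phrase.encode("utf-8")
decoded = encoded.decode("utf-8")
```

If any hop in the chain (page, PHP, database) is not UTF-8, that round trip is where the text gets mangled.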
Anyway, it’s back to layering and continuous improvement: fixing bugs, adding function. It is pretty simple on paper, but the number of components and systems, languages and platforms this crosses is quite full on.
The project is a three-person one: project manager/producer, graphic designer and me. We all provide input to the project.
So if you need any help or work doing with Unity3d, C#, Photon, HTML, PHP, MySQL, RTL languages, cloud servers, Bitbucket, Mercurial, SourceTree or JIRA, then I am more than slightly levelled up, though there is always more to learn.
Phew!

Dear BBC I am a programmer and a presenter let me help

I was very pleased to see that Tony Hall, the new DG of the BBC, wants to get the nation coding. He plans to “bring coding into every home, business and school in the UK”. http://www.bbc.co.uk/news/technology-24446046
So I thought, as I am lacking a full-time agent in the TV world, I should throw my virtual hat in the ring and offer to work on the new programme the BBC has planned for 2015.
It is not the first time I have offered assistance to such an endeavour, but this is the most public affirmation of it happening.
So why me? Well, I am a programmer and have been since the early days of the shows on TV, back in ZX81/C64/BBC Model A/B/Spectrum days. I was initially self-taught through listings in magazines and general tinkering, before studying to degree level and then pursuing what has been a very varied career, generally involving new tech each step of the way.
I was lucky enough to get a TV break with Archie Productions and the ITV/CITV show The Cool Stuff Collective, well documented on this blog 😉 In that I had an emerging-technology strand of my own. The producers and I worked together to craft the slot, but most of it was driven by the things I spend my time sharing with C-level executives and at conferences about the changing world and maker culture.
It was interesting getting the open source Arduino, along with some code, on screen in just a few short minutes. It became obvious there was a lot more that could be done to help people learn to code. Of course these days we have many more ways to interact too. We do not have to just stick to watching what is on screen; that acts as a hub for the experience. Code, graphics, artwork, running programs etc. can all be shared across the web and social media. User participation, live and in sync with on-demand, can be very influential. Collections of ever-improving assets can be made available, then examples of how to combine them put on TV.
We can do so much with open source virtual worlds, powerful accessible tools like Unity3d and of course platforms like the Raspberry Pi. We also have a chance to explore the creativity and technical challenges of user-generated content in games, and next-gen equipment like the Oculus Rift. Extensions to the physical world with 3D printers, augmented reality and increasingly blended reality offer scope for innovation and invention by the next generation of technical experts and artists. Coding and programming is just the start.
I would love to help; it is such an important and worthy cause for public engagement.
Here is a showreel of some of the work.

There is more here and some writing and conference lists here
So if you happen to read this and need some help on the show get in touch. If you are an agent and want to help get this sort of thing going then get in touch. If you know someone who knows someone then pass this on.
This blog is my CV, though I do have a traditional few pages if anyone needs it.
Feeding Edge, taking a bite out of technology so you don’t have to.
Yours Ian Hughes/epredator

Adventures with Photon and Unity3d

The Unity3d hospital I have been working on has, up to now, been running on the Photon Cloud. (Photon Server from Exit Games is a socket server that allows client applications, like those in Unity3d, to talk to one another. They run a simple-to-use hosted version called Photon Cloud which is great for testing things out.)
I decided, though, that some of the traffic we were pushing through might break the tiers for hosting on the cloud, so I thought I would run my own server. It was not the concurrent user count; we have a few users, but they do a lot, rather than a lot of users each doing a little, which is the general profile for gaming.
Unity dev
In part that is because one of the Unity clients acts as the master for the application. It holds a lot of simulation data, and changes to that have to be communicated (in various cached ways). If I had built it as server logic we might have cut down on traffic, but we would have had to stick to a single way of working. As it is, the application is also designed to fall back to disconnected mode and can be run as a non-networked demo (though that has its own challenges).
I did have a few difficulties to start off with, but many of those were actually very simple to solve, and if you read some of the annotations in the docs they all make sense.
I sparked up a Rackspace Windows 2008 R2 server first; Photon is Windows based. I had dabbled with the Azure cloud-hosted version of it, but much of that required a Windows development environment to deploy to, and I am a Mac user with occasional Windows use 🙂 So it was much simpler to have a Rackspace server and use Remote Desktop to attach to it.
Downloading the files via the remote desktop was a problem to start off with, due to all the various firewall restrictions, so there was a bit of clicking around Windows admin.
I followed the five-minute setup (kind of). Once downloaded, you just end up with a set of bin directories for the Photon control and directories with the various server applications and configurations.
Remote Desktop ConnectionScreenSnapz001
All I needed was a lobby host and then simple game rooms to broker the data flow and RPC calls. I had no extra server logic: if something moves in Unity3d on one client, it needs to move on the other.
I didn’t have much luck having asked Photon to start the Default application. I was not getting connected, so I added a few extra firewall rules just to be on the safe side. I was starting to wonder if I could get to the hosted machine at all, but I think there were some other network problems conspiring to confuse me too.
Then I read that if you are switching from Photon Cloud to your own server you should use the other application configuration, cunningly named Loadbalancing (MyCloud). I switched to that and ran the test client on the actual server, and things seemed better. Still no luck connecting from Unity3d though. Then I looked at the menu option that said Game Server IP config. It was set to a local address, so obviously the server was not going to let itself be known to the outside world. A simple click to autodetect the public IP and I was able to connect from Unity3d.
It all seemed good until after a few connects and disconnects it started to throw all sorts of errors.
I had to ask on the forum and on Twitter, but just by asking the question I started to think about what I was actually doing and what I was running. I was glad of the response from Exit Games though, as it meant I was going along the right path.
Again it is obvious, but… the MyCloud application config I was using had one master server and two game servers. It did say it was not for production, but as this is not a massive-scale game I thought I wouldn’t touch any configs. It looked like the master server was getting to the point of asking each game server which was the least busy (to direct traffic to) and getting the answer that they were both maxed out. I initially thought I needed to add more game servers, but it was actually the opposite: removing one of the game servers from the config (effectively removing any load-balancing logic) meant the same game server got the connections. The load balancer is really there for other machines to be brought into the mix.
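My reading of what the master was doing amounts to something like this (a hypothetical sketch, not Photon's actual code; the server names and load numbers are made up):

```python
def pick_game_server(servers, max_load):
    """Return the least-loaded game server, or None if all are maxed out.

    servers: dict of server name -> current reported load.
    With a single entry this degenerates to "always that server",
    which is why dropping down to one game server sidestepped the issue.
    """
    available = {name: load for name, load in servers.items()
                 if load < max_load}
    if not available:
        return None        # every game server reported itself full
    return min(available, key=available.get)

pick_game_server({"gs1": 100, "gs2": 100}, max_load=100)  # None: both maxed
pick_game_server({"gs1": 3}, max_load=100)                # "gs1"
```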
Having worked out that that was what was happening, I cut the config files but still found that after 30 minutes of running I got conflicts. I did say that in the forum post too. However, I had not fully rebooted the Windows box, only restarted the Photon server, and I think there is a lot of shared memory and low-level resource in play. A few reboots and restarts later and things seem to be behaving themselves.
The test will be today when the scenario is run with several groups of 5 users, but all running voice too.
Unity server buttons
I have put some server-fallback code in, though, to allow us to switch back to the Photon Cloud if my server fails. For a while I was publishing two versions: one for the cloud and one for my server. That was getting impractical, as each upload on my non-Infinity broadband was taking 45 minutes, so any change had a 90+ minute round trip, not including the fix.

Goto; Amsterdam part 1 of 2 – Software engineering is changing

I was really happy to be asked to both attend and speak at this year’s Goto; conference in Amsterdam. I just got back, and whilst I had been tweeting (probably a little too much) from the conference, I thought I would try to distil a few things that I noticed and felt about the whole thing. Firstly, thanks to Dan (@mintsource) for inviting me along. We were on the emerging-interface track, and so were the mad end of software engineering, but as with all emerging stuff, as we know, it’s the future.
The venue was the old corn exchange right in the heart of Amsterdam, a very impressive structure that has been modernised inside in some interesting ways that do actually fit.
Great venue for #gotoams
Our track was in the glass cube inside the brick frame 🙂 A cool space (though a little warm 🙂 )
Tomorrow's venue after lunch talking blended reality, learning, games and tv #gotoams
I knew what our track was going to be like but I have not been to a pure software engineering conference for a long time. Times have changed.
The first keynote was from the wonderful Linda Rising. @risinglinda talking about Incentives: why or why not?”. She is a very inspirational figure as she explains the path she has taken in tech and now even more so in dealing with people not process and into the realms of neuroscience. Not as a researcher but as a student. She also explains she may seem an odd person to see at a tech conference for various demographic reasons. This talk was the start of something I was surprised to see addressed quite so much. The importance of actual people, doing actual work and their motivations to do that. Linda pointed out the amount of real research that indicates certain well held corporate beliefs in what motivates people are pretty much wrong. Taylorism seems to have got hold and taken hold everywhere. Several other books were mentioned including The Progress Principle: Using Small Wins to Ignite Joy, Engagement, and Creativity at Work
She also mentioned the "Pygmalion in Management" HBR research, showing that managers form a first impression of someone and then manage to that impression, which is of course detrimental all around.
Obviously, with her slightly rebellious and provocative attitude to the ridiculous practices in corporate life that I have often challenged myself, she was speaking to the converted. We had a good chat at one of the breaks after the presentation, and I was very much looking forward to her next talk the following day. As I have been busy reading (yes, actually reading) Thinking, Fast and Slow, and its author was mentioned in the talk, it fitted really well as a kick-off to the conference for me.
We then split off into tracks: The Rise of Educational Startups, HTML5 Rocks JavaScript, Big Data NOSQL Search, Software Craftsmanship and Bring Your Own Language.
As usual you can't go to everything. I stuck with the rise of educational startups and with HTML5 and JavaScript. The former because I do a lot of that sort of thing, the latter because I wanted to see what the high-end world of software engineering was saying about the potentially anarchic new web tech.
The first pitch, by Matteo Manferdini, was a bit of a busman's holiday, as he was pointing out the flaws in educational games that try to have education as an end reward for some play. It dovetailed nicely with the keynote, as really this is about rewards or incentives and why anyone would want to do something, including playing a game. He ended up showing the videos that were played at IGBL 2013 too, with the never-ending bin and the musical staircase. It was also a place for him to tell the Jane McGonigal (@avantgame) story and to mention Raph Koster and his theory of fun. This all made sense, and I was glad to see it being presented, as it meant I wasn't going to be doing my talk to a load of people who had never heard this sort of madness 🙂
The next pitch was Nick Grantham of @fractuslearning asking "Are You Giving Teachers Blisters? – Finding the Right Fit for an EdTech Startup". Being an Aussie who lives in Ireland, he had a suitably different presentation, relating education to shoes: the wrong shoes in the wrong place give you blisters, so chucking in educational tech for the sake of it causes friction, and therefore fails. It was a very good talk, with some good war stories and consulting-style references.
After lunch it was time for HTML5 etc.
Sergi Mansilla started off talking about "HTML5 is the future of mobile". It was really a direct pitch about the new Firefox OS; not so much a sales pitch as a look at the politics of mobile: the walled-garden native apps causing all sorts of problems for developers, and the lack of open APIs to help use any device in any way. Also the fact that HTML5 is often thought of as a single thing, just a simple markup language. It is instead a mix-and-match ecosystem of so many bits and pieces that its flexibility is also its drawback.
Next it was tech royalty time. Douglas Crockford, the creator of JSON, talked about some code he was working on: "Managing Asynchronicity with RQ". Now this was real code, a set of helper functions to allow multiple asynchronous calls to go out to the world and be composed and returned as results without blocking. Definitions like: call these 3 weather APIs, I don't care which one comes back first, but if one does come back, use that and move on. It was a different model to event-driven systems, and despite being just code slides it made a nice counterpoint to the other presentations. Well worth checking out.
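The "use whichever comes back first" composition can be sketched in plain modern JavaScript. This is not RQ's actual interface (RQ predates promises); Promise.race is a stand-in for the same idea, and the weather source names and delays are invented for illustration:

```javascript
// A stand-in asynchronous "weather API" that answers after a delay.
// The names and delays below are invented for illustration.
function fakeWeatherSource(name, delayMs) {
  return new Promise((resolve) =>
    setTimeout(() => resolve(name + ": 21C"), delayMs));
}

// Race three sources: whichever resolves first wins and we move on,
// rather than blocking on, or caring about, the slower ones.
function firstWeather() {
  return Promise.race([
    fakeWeatherSource("sourceA", 120),
    fakeWeatherSource("sourceB", 40),
    fakeWeatherSource("sourceC", 200),
  ]);
}

firstWeather().then((report) => console.log(report)); // fastest source wins
```

The losing calls still complete in the background; the composition just stops waiting for them, which is the non-blocking style the talk described.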
Finally in the tracks for that day, and before the party, Brian Le Roux did a talk called "Best of WTFJS", based on a collection he has gathered of weird and wonderful JS bugs, features, workarounds and hacks. He did the entire thing in a terminal window, just typing them 🙂 They all made sense, but were all daft at the same time. It was a great live pitch, and one of my favourites.
Wtfjs #gotoams a live terminal showing mad js
e.g. 3 > 2 > 1 returns false 🙂
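That quirk falls out of left-to-right evaluation and type coercion rather than any chained-comparison rule; a minimal sketch of what the engine actually does:

```javascript
// 3 > 2 > 1 is evaluated left to right, not as a mathematical chain.
const step1 = 3 > 2;      // true
const step2 = step1 > 1;  // true coerces to the number 1, and 1 > 1 is false
console.log(step2);       // false
```

Most of the WTFJS examples unravel the same way once you apply the coercion rules one step at a time.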
Anyway, it was nice to end on some code, having started on people. Then it was back to people, beer, wine and food for the mid-conference party.
However, just before that we had another keynote. Erik Meijer gave a dynamic and crazy speech about "A Monadic Model for Big Data", basically pointing out the huge flaws in the relational database model. It was partly a joke, but not really. It did conclude with the point that if it works, use it, but that there are better and simpler ways than doing a select statement. In particular when you are dealing with live data, it is just there, not a summation of a report of stored information. The web is a huge repository of live data, distributed and now. His example of an earthquake app pointed out that the application was going to find an earthquake as it happened, not go and look up yesterday's reports in a relational table 🙂 Anyway, it was a buzzing and well-done pitch from Mr C#.
#gotoams party flyers
Much fun was had by all (and some great chips afterwards too 🙂 )
Part 2 to follow. (actually here it is :))

Run Makie Run

Firstly, this is not related to anything official to do with any major sports event, so no brand police actions, thank you very much.
The other day I blogged about using some homebrew kit for martial arts related monitoring. It is an ongoing side project, but as part of that I mentioned using the Blobo for some of it, as it has a motion sensor. Well, in the course of thinking that through, I wondered if I might be able to do something a little quirky to bring my Makie alive.
With the hollow head in the Makie designed for an Arduino LilyPad, I thought I would see what happened if I put the Blobo in as the brain. I mean, a 3-axis, Bluetooth-transmitting, fully functioning device seems like a good idea, and it saves a lot of soldering 🙂
IMG_3960
The Blobo nearly fitted in the head; without the eyes in, it is the right size as is.
However, much of the Blobo is designed to give it a decent size and feel, so with the guts ripped out it is a much better fit.
IMG_3962
So here he is going for a “run” with the Blobo sprinting app.

I have the SDK, kindly provided by Martin Gossling at Quarternion, so I can write some more Makie-specific Unity3d applications. So if you pick up the Makie or move him at all, he will be able to respond.
Trying this project out led Martin to get in touch about the Choi Kwang Do requirements, and in a wonderful bout of serendipity they have repackaged and upgraded some of the same principles of the Blobo into BPMPro, which looks like a great thing for CKD practitioners to give a go. Watch this space, as this looks very exciting 🙂 I think that, combined with Kinect, some brilliant insights and ways to fine-tune technique (not just power and speed) will evolve.
So this has merged the threads of 3D printing with Makie, homebrew use of technology for things other than what it was intended for (maker culture), the modern martial art of Choi Kwang Do, and some Unity3d. Funny little ideas need to be explored, as they are often the linchpin of a wider goal. "Feeding edge: taking a bite out of technology so you don't have to" 🙂

Imperial Treet – Hospitals, Patients and SL

This week Dave Taylor/Davee Commerce and Robin Winter had a special on Treet.tv about lots of the virtual world projects in Second Life that Imperial College London have been up to. It is a great show to watch to see the variety of ways Dave has got Second Life working, from public information to targeted patient experiments and doctor training.

In the doctor training and evaluation segment, which appears at around 32 minutes in, Dave says: "This is where we have our virtual patients, and these patients are controlled by software actually outside of Second Life. That software has a knowledge of the patient's physiology and condition." He also explains there are 3 wards with 3 patients in each, giving 9 levels of scenario difficulty.
“We are using this to research how we can assess trainee doctors at different levels of training”. “We have tested about 60 doctors so far on this”.
I am glad this is out in the public, as this has been part of the work I have been doing in SL. I can't explain exactly what does what, as it's a private project, but as Dave points out, the patients and the interactions are controlled from outside of Second Life; my part in SL is the broker talking to that external model. I also ended up building the dynamic menus and handlers in world. The menus are based on the data coming back, and align to the correct place in world so they are designer friendly. This was built before the web on a prim existed, and we aimed to do everything in world. As you know, handling text can be a problem in SL, and variants of Fasttext and xy text came to the rescue, though rezzing a dynamic button and making it know what it is supposed to do is a non-trivial task. This was also before in-world HTTP servers were stable, so SL is the controller, asking the external software what to do next.
It has been a fascinating project, as have its follow-ons, which have increased in complexity and interaction. Making SL a component in a system, not the sole piece of the project, makes for greater richness and flexibility. After all, SL is not a database/data-handling application.
What is great is that Robin, who is one of SL's foremost designers (along with his other half) and has been for years (he built the original Dublin sim), is able to craft animations and objects and then trigger them into existence using our message protocol, after the external software model tells my broker code that it has some changes to display.
There are a few of us pushing the boundaries of data interchange with SL, and also with Opensim and other virtual worlds. I hope this helps people understand that we can do very complex integrated tasks using the best of a virtual world and the best of a traditional server application. Integration is the key.

Fiducial Markers and Unity 3d

I was looking around for a quick way to use the wonderful reacTIVision camera-based marker tracking. Ideally I really wanted a fully working Reactable, but without a projector, the music and light software, and the physical elements like a glass table, I can't really do what I want to do in the time I need to do it.

I bumped into a project from a few years ago, the tangiblaptop, that was aimed at using the laptop screen rather than a projector to display what happens to the markers placed on that surface and picked up by the camera. It looked good, but I thought I needed something with a bit more variety.
Then I saw Uniducial, a small library and a couple of scripts to drop into Unity3d.
Within seconds I had objects appearing and disappearing based on the markers they could see in the camera view.
(I have done a few things with these before, as did Roo back in the day.)
However, I think the Unity3d gadget is going to be very useful indeed 🙂

Techie Post: Opensim and Freeswitch problems

I was just replying to a note asking about Opensim and Freeswitch, based on the fact I have it working (ish) on Ubuntu on a cloud server. I have been meaning to share where I am up to, though I have not found a complete solution to the problem I am having. However, this snippet may help someone, or they may be able to help me 🙂

My Opensim and Freeswitch are running on Ubuntu Karmic on a cloud service. Both Opensim and Freeswitch are built from source rather than installed as binary distributions. However, the ini files and config should be the same (as that's the beauty of this).

Freeswitch works by really just opening a conference call which any avatar will dial into. I am not sure how much digging you have done into the problem, but there are a few things to look out for.
I am assuming you have followed the main instructions for config etc http://opensimulator.org/wiki/Freeswitch_Module

I had this all running from 0.6.8 onwards but recently moved to 0.7. I noticed in bin/config-include that there is now a line for Freeswitch in StandaloneCommon.ini, not just in Opensim.ini. I think I had to uncomment that when I moved to 0.7.
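For reference, the Freeswitch section of the standalone config looks roughly like the fragment below. The section name, module string and key names here are an approximation based on the wiki page linked above; treat every value as illustrative and check the commented-out defaults in your own bin/config-include files rather than copying this verbatim:

```ini
; bin/config-include/StandaloneCommon.ini (approximate, version-dependent)
[FreeswitchService]
    ; wire the standalone up to the Freeswitch service module
    LocalServiceModule = "OpenSim.Services.FreeswitchService.dll:FreeswitchService"
    ; address the viewers will use to reach the Freeswitch server (placeholder)
    ServerAddress = 203.0.113.10
```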

The main thing is to make sure that Freeswitch is started first and ready, then spark up Opensim.

The default for a region in Opensim is to not have voice enabled at land parcel or region level. In Hippo I enter god mode and set both the estate options and the parcel options to allow voice. Having done that (only needed after reloading the terrain or something major), I usually log off and log on again with the viewer and voice gets enabled. That one has caught me out a few times.

The other thing to consider is that some of the viewers, in particular the more recent open source ones, are not allowed to package the SLVoice components with them. That one caught me out when I asked someone to use Hippo on Windows. The Mac Hippo is old enough to still have the voice DLL, but the newer Windows one did not have it! SL Viewer 1 should of course be OK, but I think it is still not advisable to use SL Viewer 2 on Opensim. Imprudence discusses a patch at http://imprudenceviewer.org/wiki/How_to_Re-enable_Voice_Chat, which is how I found out the root of the problem.

The server firewall is another thing to check, as is the client firewall. The ports are potentially different, as Freeswitch is a completely different application to Opensim, and the viewer can get blocked.

Should all of the above be OK, then it's console time for both Opensim and Freeswitch. There are some spurious errors, it seems, as things try to establish connections. However, if you get a couple of clients connected (assuming you get voice enabled in the client) you can go into Freeswitch and use the sofia commands. sofia status and sofia status profile internal both give a bit of information. I am no expert on the commands, but I have been able to see if clients have connected from Opensim.

This is where we get to a problem I now have: Freeswitch seems not to be working for everyone who comes to my sim. In my testing I used a Mac and a Windows machine on my own network at home, both talking to the remote cloud server. That has always worked (though technically it shows as the same IP address twice). I patched in someone else across the country who was using SL Viewer 1 and we had a conversation, so I thought it was all working. However, a few other people, when we have tried a larger meeting, have experienced problems.

It has been a mix of viewers with no SLVoice in them, firewalls, but also some strange timing behaviour. It is quite difficult to test, which is why I have been using the sofia commands.

The first person to connect will generally get hold music playing. The second person to connect will enter the call; there is a beep as that happens and the hold music ends. However, you cannot always hear anyone speaking.
I suspect this is the client firewall operating. Opensim talks to Freeswitch at a server-to-server level to patch the person in, and sofia status tends to show me all the users patched into the Freeswitch console. After that it's the SL clients doing the work. When I get no voice response, it tends to break the call for everyone, and I have not found out how to check for that or how to tell Freeswitch to ignore bad calls. Getting the ones that break it to log off, so there is only one person (i.e. me) in the call, sometimes takes a few minutes before the hold music kicks in again. There are some sofia reset commands for the profile that I have dabbled with.

It was only a few weeks ago that I hit this snag, and the people using my Freeswitch are not always in a position to mess with their company firewalls, so it has been hard to get a test rig that fails consistently to try and debug it.

I have been meaning to write this down somewhere to help others, but I wanted to try and fix the problem first; not many people are using or trying Freeswitch though.

Having said that, if you get the voice connected (probably via the god mode parcel audio solution at the start of this) you may find it just works. It could be my server having bandwidth or memory issues etc. It's a tricky one to spot.

Red Dead Redemption – it is what we miss that makes it so good

Good design of any sort feels effortless, even joyous, to those on the receiving end of it. Something that has had a whole load of great design go into it is the new gaming classic Red Dead Redemption. Non-gamers and "serious" types may just consider this as trivial as a hula hoop: a toy that a few grown-ups enjoy playing with. Well… it is. At one level of abstraction it is a football, a hoop and a stick. On another level, it, and the current generation of well-crafted gaming experiences, are a fantastic example of good design and talent.
Red Dead Redemption is a cowboy gaming experience: an exceptionally large free-roaming area interspersed with a plot that takes you from set piece to set piece. You can, if you want, just go for a ride and see what happens though.
The first thing that most people latch onto, with good reason, is the graphics and the animation in the game.
Red Dead Redemption
Given where we were only a few years ago, the graphics quality, the detail in things like the horse animations, the size and scale of the terrain, the flora and fauna and even the tumbleweeds are very good. Still pictures do not do it justice. Xbox 360 or PS3, it just works. Just think for a moment about the amount of graphic design time that has to go into both the size and scale and the intricate detail. The flowers on the plants, the mane on the horse, even the bullets in your bandolier are all created by someone. So as a graphic design task, even with tool and middleware support, this is a monumental undertaking. Of course tiling and cut and paste come into play, but just consider the person-hours of skilled design tool usage, then add the overlaying of the design of "where" all these things go and can go.
Under the covers there is of course code: programming and detail in allowing things to happen, chaining effects together, determining where and when a bullet has hit. As a programmer I know that most people do not see the code under any of this, but it takes as much design effort and talent as the visuals, with the system architecture and middleware combinations becoming the "where" all these things go.
However, I think that many people in most enterprise businesses and the like will understand a little about IT (from using it all the time) and maybe a little about visuals from having to create the odd PowerPoint. Clearly not the same, but at least in the general area. People probably have a moderate understanding of testing (though not of the mind-numbing repetition and test case coverage that goes into knowing something is right).
The things that people don't have to get involved with, and that have really evolved so much and been taken seriously in the production values of high-end games, are things around the sounds and voices that you hear.
Red dead redemption good deed
The sound design is generally so well done, creates so much atmosphere, and is in some senses more transient than the visuals, so that it almost melts away. Also, the acting quality is just miles beyond the early fumbling attempts to read badly written dialogue we used to see and hear. When the characters in the game talk to one another, in cut scenes or as part of the atmosphere in a town, it feels real. Of course the dialogue has to be layered with western-style, occasionally over-the-top elements, but the films that we passively sit and watch have all sorts of over-the-top characters too.
Threading all this together is the script, both a story arc and then the micro stories that form at key points in the game progression. Thrown in are also random events that happen around the place as you travel around, thefts and challenges which you choose to engage with.
It's all highly immersive, and very entertaining. That is without even engaging in what is in effect a completely different use for all these assets: the multiplayer games. These take place in the whole environment, fellow real people being cowboys and travelling the land living their own stories.
Now clearly I am a gamer, but I do not always feel compelled to complete a storyline. However, recent months have seen Modern Warfare 2, Uncharted 2, Heavy Rain and Red Dead Redemption all making me want to complete the storyline and not end up disappointed. There are lots of games, but these stick in my mind for being good stories, and for making me want to, and actually bother to, complete them. Many people will of course not spend 40+ hours on a single gaming experience of this sort, though they will happily spend 40+ hours grinding on Farmville, breaking jewels in Bejeweled and the like, or maybe watching EastEnders or Coronation Street repeat soap plots. It is though all good as far as I can see.
Games and gaming experiences, both ones we create for ourselves and ones we are directed through are as memorable as any traditional film or TV experience. The effort and design going into them warrants the time and attention to explore them. For me it is of course business and pleasure, research and a release, which makes it doubly valuable.
Well done, Rockstar Games (again)