HLSL variable woes

Whereas GLSL provides built-in global variables for many aspects of the OpenGL context, input and output variables must be explicitly declared in HLSL. Unity does provide some predefined variables for easier integration with the graphics engine, like _WorldSpaceLightPos0 for the light position. For a scalar variable I had correctly written Float in the ShaderLab properties but had declared it as float4 (a four-dimensional vector) in the CGPROGRAM section. Please pay attention!
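As a sketch of the pitfall (using a hypothetical _Shininess property): the type named in the ShaderLab Properties block must match the type of the variable declared in the CGPROGRAM section.

```hlsl
// ShaderLab Properties block – a scalar property:
//     _Shininess ("Shininess", Float) = 0.5

// Matching declaration inside CGPROGRAM:
uniform float _Shininess;     // correct: a scalar float for a Float property
// uniform float4 _Shininess; // wrong: float4 matches a Color or Vector property
```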

GLSL not a good fit for Unity on PC

I have decided to write my shader in HLSL/Cg rather than GLSL. On my Windows PC, Unity assumes that the GLSL I am writing is version 4 (which my graphics card supports), but my code would not compile: all the resources I can find are based on GLSL version 1.3 or lower, and as of version 1.4 the oft-used type qualifiers attribute and varying were removed. My Android device has OpenGL ES 2.0 hardware, which supports only GLSL version 1.1 and parts of 1.2. That means the same code will not run both on my PC for debugging purposes and on my Android device; I have only been able to write simple shaders without portability problems. Unity also does not recommend writing directly in GLSL. The Unity editor crashed and froze many times without producing helpful messages about the errors in my GLSL code.
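A minimal sketch of the incompatibility (the variable names are hypothetical): the qualifiers that GLSL 1.1/OpenGL ES 2.0 tutorials rely on no longer exist in desktop GLSL 1.4 and later, which instead use in and out.

```glsl
// GLSL 1.1 / OpenGL ES 2.0 style – what most tutorials use:
attribute vec4 vertexPosition; // per-vertex input from the mesh
varying vec4 vertexColour;     // interpolated from vertex to fragment shader

// Desktop GLSL 1.4 and later – attribute and varying were removed:
// in vec4 vertexPosition;
// out vec4 vertexColour;
```

The same source therefore cannot compile unchanged against both a GLSL 4 desktop driver and a GLSL ES 1.x mobile driver.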

Combining Cg/HLSL and GLSL in Unity

When you have never programmed shaders before, there is quite a lot to decide on. Documentation and tutorials focus on a specific shader language, so even though the concepts might transfer, one should make sure to pick the right one. My target is an Android device that supports OpenGL ES 2.0 and uses GLSL for shaders. Normally HLSL is not portable and only runs on DirectX platforms. While there are compilers from the deprecated language Cg to both Direct3D HLSL and GLSL, Unity 5 actually uses an adaptation of HLSL that runs as HLSL on Windows and Xbox One and compiles to GLSL for OpenGL platforms. One can also write GLSL in Unity, but then Unity will not be able to cross-compile it to the other platform-specific languages, and it will only run on GLSL-compatible platforms.

I have not yet decided whether I want to write my shader in GLSL or HLSL/Cg (Cg and HLSL are almost syntactically identical, with only subtle differences, as they were co-developed by Nvidia and Microsoft and released with different platform support). It will depend on which languages the learning resources I rely on use. Unity's tutorials and documentation favour HLSL for portability reasons, but writing directly in GLSL might make for faster shaders on Android.

Unity’s ShaderLab language is a shader specification format like CgFX. The actual code for running the shader program is written in Cg/HLSL/GLSL and nested in ShaderLab code. When you create a new shader file in Unity 5 – with for example the command Assets > Create > Shader > Unlit Shader – that file contains both ShaderLab code and Cg/HLSL/GLSL code. A shader file can contain different implementations of the shader, for example in both GLSL and HLSL, and ShaderLab can be used to set graphics device states and expose settable variables in Unity. All the file’s code that is not written in ShaderLab is nested within the keywords CGPROGRAM and ENDCG for Cg/HLSL, and GLSLPROGRAM and ENDGLSL for GLSL. Mixing the shader program languages can be done by making two SubShader blocks in ShaderLab – one for each. The first SubShader that can run on the platform will be used.

Normally Unity cannot run GLSL shaders on Windows – not even for testing – but if the Unity Editor exe is started with the argument -force-opengl, Unity will run in OpenGL mode. This can be accomplished by making a shortcut to the editor exe and changing the target from “[Path]\Unity.exe” to “[Path]\Unity.exe” -force-opengl.

We can replace all the code in the Unlit shader file we just created with this simple flat single-colour shader.

Shader "Custom/Test1Combo - Flat Color"
{
	Properties
	{
		_Color("Color", Color) = (1,1,1,1)
	}

	SubShader
	{
		Pass
		{
			GLSLPROGRAM

			// includes
			#include "UnityCG.glslinc"

			// user-defined variables
			uniform lowp vec4 _Color;

			// vertex program
			#ifdef VERTEX
			void main() {
				gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;
			}
			#endif

			// fragment program
			#ifdef FRAGMENT
			void main() {
				gl_FragColor = vec4(1.0, 0.6, 0.0, 1); // = _Color;
			}
			#endif

			ENDGLSL
		}
	}

	SubShader
	{
		Pass
		{
			CGPROGRAM
			// pragmas and includes
			#pragma vertex vert
			#pragma fragment frag

			#include "UnityCG.cginc"

			// user-defined variables
			uniform float4 _Color;

			// base input structs
			struct vertexInput
			{
				float4 vertex : POSITION;
			};

			struct vertexOutput
			{
				float4 pos : SV_POSITION;
			};

			// vertex program
			vertexOutput vert(vertexInput v)
			{
				vertexOutput o;
				o.pos = mul(UNITY_MATRIX_MVP, v.vertex);
				return o;
			}

			// fragment program
			float4 frag(vertexOutput i) : COLOR
			{
				return _Color;
			}

			ENDCG
		}
	}
	//TODO decomment fallback when testing is finished
	//Fallback "Diffuse"
}

It exposes the Color parameter to the Unity editor’s material properties via the ShaderLab Properties block. So if one were to create a 3D object, create a new material and attach it to the 3D object, and set that material to use the shader Custom > Test1Combo – Flat Color, then the 3D object should be coloured in the material’s selected colour. That is running the Cg SubShader, though. We can test the GLSL SubShader by placing it first and making it behave differently from the Cg/HLSL SubShader – just like in the code above. You can see that the colour is hard-coded to orange in the GLSL SubShader. Closing Unity and reopening it in forced OpenGL mode should change the colour of the object from the selected colour to orange. If that works, then Unity is running the GLSL shader. Of course, it would be best to then remove the hard-coding and set gl_FragColor = _Color; instead, so both the GLSL and HLSL SubShaders do the same thing.
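Once the test has confirmed which SubShader is running, the corrected GLSL fragment program would simply use the material colour (note that _Color must still be declared as a uniform in the GLSL block, as in the listing above):

```glsl
// fragment program
#ifdef FRAGMENT
void main() {
	gl_FragColor = _Color; // use the colour selected in the material properties
}
#endif
```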

You can read about how to write HLSL/Cg shaders in Unity’s documentation and this tutorial series, and GLSL shaders in this Cambridge lecture and here and here.

My normals were imported wrongly

When using a standard Unity shader, the faces of my icosphere were lit with hard edges rather than smoothly, so I suspected something was wrong with the model’s normals. I fixed them by changing from “import normals” to “calculate normals” in the Unity import settings for my OBJ file.