Shader Part 1

Motivation

If you have just recently become interested in shader programming, it can be hard to know where to start.  This tutorial series will take you through the basic steps of getting surface and fragment shaders working.  It will also introduce you to some of the functions and variables in Unity’s shader includes, which can differ from the documentation you find online.

You should read this article if:

  • You are new to shader programming in general
  • You want to build shaders to do cool things in your game but you can’t find one that fits your needs
  • Strumpy Shader Editor isn’t helping because you don’t get the basic principles
  • You want to manipulate textures inside your shader

This is the first of several articles that will build up to much more complex shaders.  This first article is very simple indeed.

About The Author

I’m new to shader programming too – so I decided to write this guide to help people past the starting point that I had so much trouble with.  I am decidedly not an expert shader programmer.

I read and re-read the documentation trying to work out what I wanted to know and it just wasn’t in the right order for me.  So I thought I’d build this tutorial and share the knowledge I used.  Now when I read the documentation it makes a lot more sense!

While all of the examples in this tutorial work, there may be better ways of doing some of these things.  If you know one – please add a comment!

My reason for getting into shader programming was to build something that I needed for a world populated with an endless array of different characters.  I needed to build a combined mesh out of multiple parts so I only have one draw call per character.

Having enabled and disabled different items of clothing, I modify the base meshes with Megafiers.  I then need a single texture for all of the new models, but I also need to be able to colour skin, clothing and everything else differently for each character.

I came up with a plan that uses 3 tiny 4×4 textures that are unique for every character and a shader to do the work of using these to colour the model.  I’m planning to fully explain this shader in this tutorial series.

Shaders & Materials

Ok so a shader’s job is to take your mesh and render it on the screen.  A shader can define a number of properties that you will use to affect what is displayed when your model is rendered – the stored settings of those properties are a material.

Shaders come in a number of flavours:

  • Surface shaders – these remove most of the hard work and are great for many circumstances
  • Fragment shaders – these make you do much more work and are harder to write, but they also allow you to do low level things like vertex lighting (useful on mobile devices) and multiple passes (necessary for more advanced effects)

In this article we will concentrate on surface shaders.

Resources for Writing Shaders

The most important resources you can use for information on writing shaders are:

  • Unity’s Shader Reference documentation
  • NVIDIA’s Cg documentation – Cg is the language the shader code itself is written in
  • The source code of Unity’s built-in shaders, which Unity makes available for download

The Shader Pipeline

You are going to find a lot of stuff about the pipeline of shaders – I’m going to dumb this down to my level.

The job of a shader is to take some 3D geometry and turn it into pixels on a 2D screen.  The good news is that you only get involved in a couple of bits of that process.  For a surface shader it looks like this:

[Diagram: the surface shader pipeline – geometry passes through an optional vertex function and your surface function before lighting produces the final pixel]

Note that you are not really setting the final color of the pixel – it will be lit after your surface function exits.  This means, for example, that you can output a modified normal to affect how the lighting is applied.

A fragment shader has a similar path, but it actually has to include the vertex function and do a lot of work in it to calculate the information that the pixel part needs.  Surface shaders hide all of that.

So that’s how your code is called; now let’s look at what your code needs to look like:

[Diagram: the overall structure of a shader – properties, one or more SubShaders and a Fallback]

You are going to write a shader: it’s probably going to have some properties, and it will have one or more SubShaders.  The SubShader that gets used depends on the platform you are running on.  You should also specify a Fallback shader that will be used if none of your SubShaders can run on the target device.

Each SubShader makes at least one pass over the data being brought in and the data going out.  You can use these passes to perform different operations: for example, in a Grab Pass you can capture the pixels that are already rendered on the display where your object will appear, which could be handy if you wanted some advanced distortion effect (when you are beginning shader programming you will probably not be doing that though!).  The other reason to have multiple passes is to change the render state between stages of your effect – for example writing to the depth buffer in one pass and ignoring it in another.
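For illustration, a minimal sketch of a grab pass (GrabPass and _GrabTexture are Unity’s built-in names; everything else is omitted for brevity):

SubShader {
    //Capture what is already on the screen behind the object
    //into the built-in _GrabTexture
    GrabPass { }
    //...subsequent passes in this SubShader can sample _GrabTexture
    //to distort or tint what was already rendered
}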

When you write a surface shader it is written directly inside the SubShader – the system compiles your code into multiple passes for you.

While a shader is writing to a 2D screen it is also keeping track of how far away from the camera every pixel it writes is – so that, should a subsequent piece of geometry actually be behind something that is already drawn, it won’t overwrite the pixel that is already there.  You can control whether this Z buffer has any effect on your shader code, and whether your shader writes to the buffer, with commands inside the Pass: e.g. ZWrite Off stops the Z buffer being updated for anything you output.

You can use this as a technique to cut holes in other objects: by writing to the Z buffer but never outputting any actual coloured pixels, the objects behind a model using this shader will not be drawn (because the Z buffer says something is there already).
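A minimal sketch of such a “depth mask” shader, assuming we render it just before normal geometry (the shader name is made up; ColorMask 0 suppresses all color output while ZWrite On still fills the Z buffer):

Shader "Custom/DepthMask" {
    SubShader {
        //Render before regular opaque geometry so the Z buffer
        //is filled in before the objects we want to hide are drawn
        Tags { "Queue" = "Geometry-10" }
        ColorMask 0   //Output no color at all
        ZWrite On     //But still write to the Z buffer
        Pass { }
    }
}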

So here’s some shader code:

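It is essentially Unity’s built-in diffuse shader – a minimal version (the shader name here is arbitrary) looks like this:

Shader "Custom/Diffuse Texture" {
    Properties {
        _MainTex ("Base (RGB)", 2D) = "white" {}
    }
    SubShader {
        Tags { "RenderType" = "Opaque" }

        CGPROGRAM
        #pragma surface surf Lambert

        sampler2D _MainTex;

        struct Input {
            float2 uv_MainTex;
        };

        void surf (Input IN, inout SurfaceOutput o)
        {
            o.Albedo = tex2D (_MainTex, IN.uv_MainTex).rgb;
        }
        ENDCG
    }
    Fallback "Diffuse"
}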

Hopefully you can recognise the Properties section, the SubShader and the Fallback.

Understanding Shader Code Magic

The rest of this article is going to focus on understanding what magic is happening inside that simple block of code – because the magic of coding by convention is coming into play and you really have to get it.

When shader programming, you are required to call things by the right name; in fact, in some cases simply naming a variable makes it assume a particular value.

An Introduction to Properties

You define the properties for your shader in the Properties {…} section just inside the Shader definition.  Properties are shared by all SubShaders.

Property definitions follow this format:

_Name ("Displayed Name", type) = default value [{options}]

  • _Name is the name that this property will be referred to in your program
  • Displayed Name will appear in the material editor
  • type is the type of the property, your choices are:
    • Color – the value will be a single color for the whole program
    • 2D – the value is a power of 2 sized texture that can be sampled by the program for a particular pixel based on the UVs of the model
    • Rect – the value is a texture that is not a power of 2 size
    • Cube – the value is a 3D cube map texture used for reflections, this can be sampled by the program for a particular pixel
    • Range(min, max) – the value is a floating point between a minimum and maximum value
    • Float – the value is a floating point number with any value
    • Vector – the value is a 4 dimensional vector
  • default value is the default for the property
    • Color – the color expressed as a floating point representation with four parts (r,g,b,a) – e.g. (1,1,1,1)
    • 2D/Rect/Cube – for the texture types the default value can be: an empty string or “white”, “black”, “gray”, “bump”
    • Float/Range – the value to adopt
    • Vector – the 4D vector expressed as: (x,y,z,w)
  • { options } only relates to the texture types 2D, Rect and Cube – where it must be specified at least as { }.  You can combine multiple options by separating them with spaces; the choices are:
    • TexGen texgenmode: automatic texture coordinate generation mode for this texture. Can be one of ObjectLinear, EyeLinear, SphereMap, CubeReflect or CubeNormal; these correspond directly to OpenGL texgen modes. Note that TexGen is ignored if you write a vertex function.

So a property might look like one of these:

//Define a color with a default value of semi-transparent red
_MainColor ("Main Color", Color) = (1,0,0,0.5)
//Define a texture with a default of white
_Texture ("Texture", 2D) = "white" {}
Note that there is no ; at the end of a property definition.
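For completeness, here are examples of the remaining types (all of these property names are invented for illustration):

//Define a float constrained to a slider between 0 and 10
_Shininess ("Shininess", Range (0, 10)) = 5
//Define an unconstrained float
_Scale ("Scale", Float) = 1.0
//Define a 4D vector
_Direction ("Direction", Vector) = (0, 1, 0, 0)
//Define a cube map with a gray default
_Reflections ("Reflection Map", Cube) = "gray" {}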

Tags

Your surface shader can be decorated with one or more tags.  These tags let the rendering engine decide when and how to call your shader.

In our example we have Tags { "RenderType" = "Opaque" }, which instructs the system to call us when it is rendering opaque geometry – Unity defines a number of these values.  The other obvious one is "RenderType" = "Transparent", which says that your shader is potentially going to output semi-transparent or transparent pixels.  The Unity standard shader also provides cutout and fade modes.

The other tags that are useful are "IgnoreProjector" = "True", which means your object will not be affected by projectors, and "Queue" = "xxxx".

The Queue tag has some very interesting effects and is used when the RenderType equals Transparent.  It basically says when your object will be rendered.

  • Background – this render queue is rendered before any others. It is used for skyboxes and the like.
  • Geometry (default) – this is used for most objects. Opaque geometry uses this queue.
  • AlphaTest – alpha tested geometry uses this queue. It’s a separate queue from Geometry one since it’s more efficient to render alpha-tested objects after all solid ones are drawn.
  • Transparent – this render queue is rendered after Geometry and AlphaTest, in back-to-front order. Anything alpha-blended (i.e. shaders that don’t write to depth buffer) should go here (glass, particle effects).
  • Overlay – this render queue is meant for overlay effects. Anything rendered last should go here (e.g. lens flares).

The interesting thing is that you can add or subtract values from these basic queues (each named queue is really a number: Background is 1000, Geometry 2000, AlphaTest 2450, Transparent 3000 and Overlay 4000).  This has significant effects with transparent objects: if you’ve ever had your water plane overlaying your billboarded trees, it’s this Queue that causes the effect – and it’s also the source of your solution.

For example, "Queue" = "Transparent-102" is what I use to ensure that my transparent water is behind my billboarded terrain trees.
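Putting those tags together, the SubShader for that water might begin with something like this (a hypothetical combination, not a complete shader):

Tags { "Queue" = "Transparent-102" "IgnoreProjector" = "True" "RenderType" = "Transparent" }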

Shader Structure

Ok so let’s just review the structure of the code inside our shader:

[Diagram: the structure of the shader code – the CGPROGRAM … ENDCG block containing the variables, Input structure and surf function sits inside the SubShader]

So our program is written in Cg – a language which is a lot like C with some interesting bells and whistles.  Read the NVIDIA documentation for lots of detail; I’ll try to cover the basics here.  We add CGPROGRAM and ENDCG to indicate that whatever comes between those two marks is Cg code.

The most important thing is that floating point and vector variables can be defined with a number between 2 and 4 after the type name (float2, float3, float4).  This lets you make a float that has up to 4 elements and then deal with those elements individually or together!

//Define a 2 element floating point variable
float2 coordinate;
//Define a 4 element color variable
float4 color;
//Multiply the color's rgb components by the coordinate's x component
float3 multipliedColor = color.rgb * coordinate.x;

You can address the individual elements, the whole thing, or just a collection of the elements.  You can use the .xyzw and .rgba notations interchangeably to reflect what you are storing inside the variable (color, position, normal etc.).  Obviously you can use a plain float on its own for a single value.  The use of .rgba etc. is known as swizzling.
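A few examples of swizzling in action:

float4 color = float4(0.5, 0.2, 0.1, 1.0);
//Read the components back in a different order
float3 reversed = color.bgr;
//Repeat a single component
float2 doubled = color.rr;
//Write to just two of the components, leaving the others alone
color.rg = float2(1.0, 0.0);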

You will also encounter the half and double types – these are floating point values with half and double (respectively) the precision of a normal float.  half is often used for performance reasons where the reduced precision is acceptable.  There is also fixed, which is a low precision fixed point value.

You may want to clamp values into the range 0..1 for colors; to do that you use the saturate function, which also works on swizzled versions:

return saturate(somecolor.rgb);

You can find the length of a vector using the length function:

float size = length(someVec4.xz);

Outputting Information from a Surface Shader

So our surface function is going to be called once per pixel – the system has worked out the current values of our Input structure for the single pixel we are working on.  It’s basically interpolating the values in the Input structure across every face of our mesh.

Let’s look at our surf function:

void surf (Input IN, inout SurfaceOutput o) 
{
     o.Albedo = tex2D (_MainTex, IN.uv_MainTex).rgb;
}

Clearly we are returning something in o.Albedo – which is in the SurfaceOutput structure that Unity has defined for us.  The Albedo is the base color of the pixel.  Let’s take a quick look at that structure.

struct SurfaceOutput 
{
    half3 Albedo;    //The color of the pixel
    half3 Normal;    //The normal of the pixel
    half3 Emission;  //The emissive color of the pixel
    half Specular;   //Specular power of the pixel
    half Gloss;      //Gloss intensity of the pixel
    half Alpha;      //Alpha value for the pixel
};

You just need to return values in this structure and Unity will work out what it needs to do when it generates the actual passes behind the scenes.
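For example, a hypothetical surf function that fills in a few more of these fields might look like this (the values are arbitrary):

void surf (Input IN, inout SurfaceOutput o) 
{
    float4 c = tex2D (_MainTex, IN.uv_MainTex);
    o.Albedo = c.rgb;
    //Make the surface glow faintly with its own color
    o.Emission = c.rgb * 0.1;
    o.Alpha = c.a;
}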

I Promised You Some Magic

Let’s first look at what comes into our surf function – we’ve defined an input structure that looks like this:

struct Input {
    float2 uv_MainTex;
};

Simply by creating that structure we have told the system to get us the texture coordinates of _MainTex for the current pixel each time the surf function is called.  If we had a second texture called _OtherTexture, we could get its uv coordinates simply by adding this:

struct Input {
    float2 uv_MainTex;
    float2 uv_OtherTexture;
};

If we had a second set of uvs for the other texture we could get those too:

struct Input 
{
    float2 uv_MainTex;
    float2 uv2_OtherTexture;
};

Our Input structure normally contains a whole set of uv or uv2 coordinates for all of the textures we are using.

If our shader was more complicated and needed to know other things about the pixel being shaded, then we can ask for these other variables just by including them in the Input struct – see the sketch after this list.

  • float3 viewDir – will contain view direction, for computing Parallax effects, rim lighting etc.
  • float4 with the COLOR semantic – will contain the interpolated per-vertex color.
  • float4 screenPos – will contain screen space position for reflection effects.
  • float3 worldPos – will contain world space position.
  • float3 worldRefl – will contain world reflection vector if surface shader does not write to o.Normal.
  • float3 worldNormal – will contain world normal vector if surface shader does not write to o.Normal.
  • INTERNAL_DATA – a structure used by some functions like WorldNormalVector to compute things when we write to o.Normal
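For example, a sketch of an Input structure that asks for a couple of these extras alongside the usual uvs:

struct Input 
{
    float2 uv_MainTex;
    float3 viewDir;    //Filled in automatically - useful for rim lighting
    float3 worldPos;   //World space position of the pixel
};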

You might be asking “What the heck’s a COLOR semantic?”  Well, when you write a normal fragment shader you tell the input structure what each variable really represents, so if you were mad you could say float2 MyUncleFred : TEXCOORD0; and MyUncleFred would be the UV of the model.  With surface shaders the only semantic you have to worry about is COLOR: float4 currentColor : COLOR; will be the interpolated color for this pixel.  You probably don’t need to worry much about this.
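For completeness, here’s a sketch of how that would look in our Input structure:

struct Input 
{
    float2 uv_MainTex;
    float4 currentColor : COLOR;   //Interpolated per-vertex color
};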

Actually Doing Something

Right, we are down to the last two lines that haven’t been fully discussed.

sampler2D _MainTex;

For every property you have defined in the Properties section you have to declare a variable with the same name in the CG program in order to access it.

[Diagram: each property defined in the Properties section is matched to a variable of the same name in the CG program]

When it’s a texture, the Input structure (or whatever you call it) must use the same name after either uv or uv2 to get the texture coordinates.

The _MainTex variable is a sampler2D linked to the main texture – it can read a pixel out of that texture given a uv coordinate.

If we had defined a _Color property then we would declare its matching variable as:

float4 _Color;
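A hypothetical use of that variable would be tinting the texture inside surf:

//Tint the sampled texture with the material's color property
o.Albedo = tex2D( _MainTex, IN.uv_MainTex).rgb * _Color.rgb;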

So now to the only functional line of our shader:

o.Albedo = tex2D( _MainTex, IN.uv_MainTex).rgb;

tex2D samples _MainTex at the uv coordinates we got from the system.  Because the sampled value is a float4 (including alpha) we just take its .rgb values for o.Albedo – if we wanted to set the alpha too (not much point with our Opaque render type) it might look like this:

float4 texColor = tex2D( _MainTex, IN.uv_MainTex );
o.Albedo = texColor.rgb;
o.Alpha = texColor.a;

Summary

You’ve read a lot of stuff and all we’ve done is build a pretty limited shader, but armed with this knowledge, part 2 will start to build shaders that use multiple textures, normals and the like – things that might actually be cool.

  • In part #2 we create an accumulative snow shader, where the snow level modifies the model
  • In part #3 we improve the snow shader to blend the snow at the margin
  • In part #4 we create a toon shader using black edges and ramp textures
  • In part #5 we create a vertex/fragment multipass bumped shader – learning all about the complexities of going beyond the surface shader paradigm.
  • In part #6 we create a vertex/fragment shader with a better toon effect than we managed with the surface shader in part #4
