Unity3D Particle Shaders – The simplest shader

In this series I will teach you how to write shaders for particle effects. It will be very VFX-centric, so it will probably be quite different from most tutorials, which focus on mesh shading.

This first tutorial will show you how to write the simplest particle shader possible. It will allow you to set a texture and use Shuriken's color modules to change the color or alpha. I am using Unity 5.6, but you can follow along even with older Unity 5 versions. I will intentionally leave some things unexplained, as there are a lot of boring things to cover. Instead, in this very first tutorial we will focus on actually making something. You may feel like you are fumbling in the dark at first, but in each tutorial we will unravel more and more secrets of shader writing. Let's get started!


Creating template shader

First, right-click in your Project window, select Create->Shader->Unlit Shader and name the file Particle_AdditiveSimple.

(Screenshot: Create->Shader->Unlit Shader in the Project window.)

Unity will create a very basic shader for you. Treat it as template code which we will modify to our needs. You should see the following code:

Shader "Unlit/Particle_AdditiveSimple"
{
  Properties
  {
    _MainTex ("Texture", 2D) = "white" {}
  }
  SubShader
  {
    Tags { "RenderType"="Opaque" }
    LOD 100

    Pass
    {
      CGPROGRAM
      #pragma vertex vert
      #pragma fragment frag
      // make fog work
      #pragma multi_compile_fog
      
      #include "UnityCG.cginc"

      struct appdata
      {
        float4 vertex : POSITION;
        float2 uv : TEXCOORD0;
      };

      struct v2f
      {
        float2 uv : TEXCOORD0;
        UNITY_FOG_COORDS(1)
        float4 vertex : SV_POSITION;
      };

      sampler2D _MainTex;
      float4 _MainTex_ST;
      
      v2f vert (appdata v)
      {
        v2f o;
        o.vertex = UnityObjectToClipPos(v.vertex);
        o.uv = TRANSFORM_TEX(v.uv, _MainTex);
        UNITY_TRANSFER_FOG(o,o.vertex);
        return o;
      }
      
      fixed4 frag (v2f i) : SV_Target
      {
        // sample the texture
        fixed4 col = tex2D(_MainTex, i.uv);
        // apply fog
        UNITY_APPLY_FOG(i.fogCoord, col);
        return col;
      }
      ENDCG
    }
  }
}

If you have never written a single line of code, this may seem overwhelming. But worry not! I will explain every single line of code, word by word, in another tutorial. For this one, let's just stick to the fun part of shader writing.

As I mentioned earlier, this shader will allow you to pick a texture and use Shuriken's color modules. Before adding the new functionality, let's check what this template shader actually does.

Create a material and try to select our shader. But hold on! Where is our shader? Well, let's go back to the shader file and look at the very first line.

Shader "Unlit/Particle_AdditiveSimple"

This line defines the path under which our shader appears in the material inspector. In this case, it will be located in the Unlit category and displayed as Particle_AdditiveSimple. See for yourself!

(Screenshot: the new shader listed under the Unlit category in the material's shader menu.)

Cool but not very convenient. Let’s change the path. Open your shader file again and change the first line to:

Shader "Tutorial/Particle Additive Simple"

After saving the file and going back to Unity, your shader should now be available in a new category called Tutorial. You can create as complicated a path as you wish; for example, "Tutorial/01/Particles/Particle_AdditiveSimple" will create a category inside a category inside a category, with your shader at the end. Try it.
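
If you want to try the nested example, the very first line of the shader would simply become:

Shader "Tutorial/01/Particles/Particle_AdditiveSimple"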

Now assign the shader to the material. In the Inspector you will see that it is not very advanced: we can only assign one texture and set the Tiling, Offset and Render Queue values. After assigning this material to a Shuriken particle system and applying the default particle texture, you will realize that the color modules are not supported and the blending is completely wrong.
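
As a side note, the Tiling and Offset values from the material inspector are what end up in the _MainTex_ST variable of our template, and the TRANSFORM_TEX macro applies them to the UVs in the vertex program. To my understanding it expands to roughly the following (just a sketch of what UnityCG.cginc does for us, not something you need to add):

// Tiling lives in _MainTex_ST.xy, Offset in _MainTex_ST.zw
o.uv = v.uv * _MainTex_ST.xy + _MainTex_ST.zw;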

 


Additive blending mode

These two lines tell us that the particles are currently treated as opaque geometry, without any blending or alpha support. LOD 100 is just a way of telling Unity how expensive the shader is; it only matters if you want to switch shader quality from your game's graphics options, which we won't be doing in this tutorial.

Tags { "RenderType"="Opaque" }
LOD 100

To tell Unity to render the geometry as transparent we need to set the appropriate Tags:

Tags { "RenderType"="Transparent" "Queue"="Transparent" }
LOD 100

Now the shader will be rendered in the Transparent queue and marked with the Transparent render type. All this means is that Unity will draw this geometry after the opaque objects, together with the other transparent objects, and knows it should be treated as transparent. It does not mean that there is any actual transparency in our shader yet! Visually you won't see any change.
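
For reference (you do not need this for the tutorial), the named queues map to numeric values: as far as I recall Geometry is 2000 and Transparent is 3000, and you can offset them if you ever need finer control over draw order. A hypothetical example:

Tags { "RenderType"="Transparent" "Queue"="Transparent+10" } // renders slightly later than regular transparent objects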

To add support for additive blending we need to set a few render state commands. These commands control many different things, such as backface culling, the blending mode (additive, alpha blend, no blending, etc.), ZWrite (whether the geometry writes into the depth buffer) and a few other options. Modify your code and add Blend One One below LOD 100:

Tags { "RenderType"="Transparent" "Queue"="Transparent" }
LOD 100
Blend One One

Blend One One is a way of telling Unity how to blend the geometry with whatever is already behind it. I will explain the nuts and bolts in another tutorial, but if you want a different blend mode, check Unity's documentation on Blend. After saving the file you will see that the particles are now additive! There is a bit of an issue, though: the geometry is incorrectly writing into the depth buffer, causing some weird artifacts.
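
For reference, here are a few other common blend combinations in standard ShaderLab syntax (in this tutorial we stick with Blend One One):

Blend One One                   // additive, what we use here
Blend SrcAlpha OneMinusSrcAlpha // traditional alpha blending
Blend OneMinusDstColor One      // soft additive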

I won't go into too much detail here, but writing transparent geometry into the depth buffer is extremely difficult in realtime rendering in general. For now, all you need to know is that most of the time you will want to disable depth buffer writes. Simply add this render state command below your Blend:

ZWrite Off

 

And that's it. You have successfully added an additive blending mode to your shader! Let's add one last thing in this section. Since particles almost always face the camera, we don't need to render their back sides. To render only the front side, add a Cull command. Back-face culling is actually Unity's default, so you won't see any visual difference, but stating it explicitly documents the intent and keeps the shader slightly cheaper than rendering both sides:

Cull Back
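
If you ever need double-sided particles (a stretched ribbon viewed from both sides, for example), switching this single command is enough:

Cull Off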

Let's wrap this up. We moved the shader into the transparent render queue, enabled additive blending, disabled depth writes and enabled backface culling. All of this was achieved in a few simple lines of code.

Tags { "RenderType"="Transparent" "Queue"="Transparent" }
LOD 100
Blend One One
ZWrite Off
Cull Back

 


Shuriken color modules

Unity's particle system uses vertex colors to transfer the color modules' data into shaders. Every particle is a flat plane constructed out of four vertices. Whenever you assign a color in either the Start Color or Color over Lifetime module, Unity sets that exact color on every vertex of the particle. A shader is then responsible for using this vertex color data in the final color blending.
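
Conceptually, if you set Start Color to a half-transparent orange, every one of the particle's four vertices will carry something like the value below (an illustrative number only, ignoring any color-space conversion):

// RGBA, each channel in the 0..1 range
float4 color = float4(1.0, 0.5, 0.0, 0.5);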

 

 

At this point we need to get slightly more technical. Let me explain a few important terms which we will be using:

  • Vertex program: also known as the vertex shader. Any shader executes the vertex program first. This is the section of code in which the graphics card does all of its per-vertex calculations. If a particle sprite has 4 vertices, the vertex program runs 4 times.
  • Fragment program: also known as the pixel shader. Very similar to the vertex program, but it does all of its calculations per fragment (pixel). If your particle covers the whole screen and your resolution is 640×480, the GPU has to run it 307,200 times. If the particle is far away and only 1 pixel big on screen, the fragment program runs everything just once.
  • Struct: a way of storing several pieces of data of different types in one place. Think of it as a box: you can put many things inside it to keep them organized.

When writing shaders you need to know that some things must be done in the vertex program, while other things can or should be done in the fragment (pixel) program. Also, the fragment program can only access certain data if you pass it from the vertex program first. This is a very simplified outline of how a basic shader works (the order of events matters!); a schematic mapping it onto our template code follows right after the list:

  1. Unity sends some data to the shader
  2. A struct defines what data the vertex program can receive (our appdata struct)
  3. This data is sent to the vertex program, which is responsible for all per-vertex work
  4. Some of this data can then be passed on to the fragment program, also known as the pixel shader
  5. The fragment program grabs this data and does all the per-pixel calculations
  6. When the fragment program is done, the final result is displayed on screen
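
Here is that flow mapped onto the template code we already have (just a schematic, nothing you need to type in):

// 1. Unity fills an appdata struct (position, uv, ...) for every vertex
// 2. the appdata struct declares what the vertex program can receive
// 3. vert() runs once per vertex and fills a v2f struct
// 4. the v2f values are interpolated and handed to the fragment program
// 5. frag() runs once per covered pixel and does the per-pixel math
// 6. the fixed4 returned from frag() is the color that ends up on screen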

 

 

With that scheme in mind, let's get to work. The struct below defines what data the vertex program receives. Currently it says that it uses the vertex position (geometry) and the first UV channel.

struct appdata
{
  float4 vertex : POSITION;
  float2 uv : TEXCOORD0;
};

To get vertex color data into the shader modify your struct just like this:

struct appdata
{
  float4 vertex : POSITION;
  float2 uv : TEXCOORD0;
  float4 color : COLOR;
};

Now we are receiving the vertex color data which Unity sends from Shuriken's color modules. But receiving this data does not mean that we are actually doing anything with it yet. Since we need to multiply the vertex color with our particle texture, we have to pass it into the fragment program, because texture-related calculations should be done per pixel. We could sample the texture in the vertex program instead, but it would only be evaluated at the four corner vertices, so the result would look extremely low resolution!

To pass data from the vertex program to the fragment program, you first need to declare what data you are passing. We do this in the so-called v2f struct. “v2f” is just a name which could be changed, but it describes exactly what the struct does: v2f stands for vertex to fragment. Modify your v2f struct by adding float4 color : COLOR;

struct v2f
{
  float2 uv : TEXCOORD0;
  UNITY_FOG_COORDS(1)
  float4 vertex : SV_POSITION;
  float4 color : COLOR;
};

Now that we have declared what data will be passed from the vertex to the fragment program, we need to actually pass it. This is fairly simple. Here is our vertex program code:

v2f vert (appdata v)
{
  v2f o;
  o.vertex = UnityObjectToClipPos(v.vertex);
  o.uv = TRANSFORM_TEX(v.uv, _MainTex);
  UNITY_TRANSFER_FOG(o,o.vertex);

  return o;
}

 We only need to add one line of code just above return o;

o.color = v.color;

With this new line we are telling our shader that v.color (the vertex color Unity sends us) should be passed on to the fragment program. Now we can do something useful with this color data!

 

We have not made any changes to our fragment program yet, so it should still look like this:

fixed4 frag (v2f i) : SV_Target
{
  // sample the texture
  fixed4 col = tex2D(_MainTex, i.uv);
  // apply fog
  UNITY_APPLY_FOG(i.fogCoord, col);
  return col;
}

Let's focus on the first line of code.

fixed4 col = tex2D(_MainTex, i.uv);

This means we are creating a variable called col which receives the RGBA data sampled from the texture _MainTex at the interpolated UV coordinates. To tint our texture with the Shuriken color modules we need to modify this line as follows:

fixed4 col = tex2D(_MainTex, i.uv) * i.color;
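
The multiplication is component-wise, so both the RGB channels and the alpha channel get tinted. Written out explicitly, the one-liner above is equivalent to something like this (purely an illustration, keep the one-liner in your shader):

fixed4 tex = tex2D(_MainTex, i.uv); // texture sample
fixed4 col;
col.rgb = tex.rgb * i.color.rgb;    // tint the color
col.a   = tex.a   * i.color.a;      // scale the alpha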

Save your shader and change the color in Shuriken. You should see that the particles are indeed changing color!

 


Summary

This is the final code:

Shader "Tutorial/Particle Additive Simple"
{
  Properties
  {
    _MainTex ("Texture", 2D) = "white" {}
  }
  SubShader
  {
    Tags { "RenderType"="Transparent" "Queue"="Transparent" }
    LOD 100
    Blend One One
    ZWrite Off
    Cull Back

    Pass
    {
      CGPROGRAM
      #pragma vertex vert
      #pragma fragment frag
      // make fog work
      #pragma multi_compile_fog
      
      #include "UnityCG.cginc"

      struct appdata
      {
        float4 vertex : POSITION;
        float2 uv : TEXCOORD0;
        float4 color : COLOR;
      };

      struct v2f
      {
        float2 uv : TEXCOORD0;
        UNITY_FOG_COORDS(1)
        float4 vertex : SV_POSITION;
        float4 color : COLOR;
      };

      sampler2D _MainTex;
      float4 _MainTex_ST;
      
      v2f vert (appdata v)
      {
        v2f o;
        o.vertex = UnityObjectToClipPos(v.vertex);
        o.uv = TRANSFORM_TEX(v.uv, _MainTex);
        UNITY_TRANSFER_FOG(o,o.vertex);

        o.color = v.color;
        return o;
      }
      
      fixed4 frag (v2f i) : SV_Target
      {
        // sample the texture
        fixed4 col = tex2D(_MainTex, i.uv) * i.color;
        // apply fog
        UNITY_APPLY_FOG(i.fogCoord, col);
        return col;
      }
      ENDCG
    }
  }
}

 

We added an additive blend mode, changed the render queue, received vertex color data and passed it from the vertex to the fragment program. We then used this data to multiply the texture colors with the Shuriken color data stored in the vertex colors. All of this was achieved with just a few lines of fairly simple code. If you are doing this for the first time you might be quite confused and probably have a lot of questions: how do you know what to add, where do you start, what do all those weird things mean, why are all the names so strange, and so on. I promise to answer all of those questions, but for now change things, add things, remove things. Play around and see what happens! Try to figure things out on your own, and let me know what you would like to be explained next.

There are plenty of resources you could start reading, but most importantly, try breaking your shader 🙂
