# User:Dirk Hünniger/gsl

## Contents

- 1 Introduction
- 2 Minimal Shader
- 3 RGB Cube
- 4 Debugging of Shaders
- 5 Shading in World Space
- 6 Cutaways
- 7 Transparency
- 8 Order-Independent Transparency
- 9 Silhouette Enhancement
- 10 Diffuse Reflection
- 11 Specular Highlights
- 12 Two-Sided Surfaces
- 13 Smooth Specular Highlights
- 14 Two-Sided Smooth Surfaces
- 15 Multiple Lights
- 16 Textured Spheres
- 17 Lighting Textured Surfaces
- 18 Glossy Textures
- 19 Transparent Textures
- 20 Layers of Textures
- 21 Lighting of Bumpy Surfaces
- 22 Projection of Bumpy Surfaces
- 23 Cookies
- 24 Light Attenuation
- 25 Projectors
- 26 Reflecting Surfaces
- 27 Curved Glass
- 28 Skyboxes
- 29 Many Light Sources
- 30 Brushed Metal
- 31 Specular Highlights at Silhouettes
- 32 Diffuse Reflection of Skylight
- 33 Translucent Surfaces
- 34 Translucent Bodies
- 35 Soft Shadows of Spheres
- 36 Toon Shading
- 37 Screen Overlays
- 38 Billboards
- 39 Nonlinear Deformations
- 40 Shadows on Planes
- 41 Mirrors
- 42 OpenGL ES 2.0 Pipeline
- 43 Vertex Transformations
- 44 Vector and Matrix Operations
- 45 Applying Matrix Transformations
- 45.1 Transforming Points
- 45.2 Transforming Directions
- 45.3 Transforming Normal Vectors
- 45.4 Transforming Normal Vectors with an Orthogonal Matrix
- 45.5 Transforming Points with the Inverse Matrix
- 45.6 Transforming Directions with the Inverse Matrix
- 45.7 Transforming Normal Vectors with the Inverse Transformation
- 45.8 Built-In Matrix Transformations
- 45.9 Further Reading
- 46 Rasterization
- 47 Per-Fragment Operations

# Introduction

This page is the introduction for the collection Wikibooks:Collections/GLSL Programming in Unity. It should not be referenced by other pages.

### About GLSL

GLSL (OpenGL Shading Language) is one of several commonly used shading languages for real-time rendering (other examples are Cg and HLSL). These shading languages are used to program shaders (i.e. more or less small programs) that are executed on a GPU (graphics processing unit), i.e. the processor of the graphics system of a computer – as opposed to the CPU (central processing unit) of a computer.

GPUs are massively parallel processors, which are extremely powerful. Most of today's real-time graphics in games and other interactive graphical applications would not be possible without GPUs. However, to take full advantage of the performance of GPUs, it is necessary to program them directly. This means that small programs (i.e. shaders) have to be written that can be executed by GPUs. The programming languages to write these shaders are shading languages. GLSL is one of them. In fact, it is the shading language of several 3D graphics APIs (application programming interfaces), namely OpenGL, OpenGL ES 2.x, and WebGL. Therefore, GLSL is commonly used in applications for desktop computers, mobile devices, and the web.

### About this Wikibook

This wikibook was written with students in mind who like neither programming nor mathematics. The basic motivation for this book is the observation that students are much more motivated to learn programming environments, programming languages, and APIs if they are working on specific projects. Such projects are usually developed on specific platforms, and therefore the approach of this book is to present GLSL within the game engine Unity.

Chapters 1 to 8 of the book consist of tutorials with working examples that produce certain effects. Note that these tutorials assume that you read them in the order in which they are presented, i.e. each tutorial will assume that you are familiar with the concepts and techniques introduced by previous tutorials. If you are new to GLSL or Unity you should at least read through the tutorials in Chapter 1, “Basics”.

More details about the OpenGL pipeline and GLSL syntax in general are included in an “Appendix on the OpenGL Pipeline and GLSL Syntax”. Readers who are not familiar with OpenGL or GLSL might want to at least skim this part since a basic understanding of the OpenGL pipeline and GLSL syntax is very useful for understanding the tutorials.

### About GLSL in Unity

GLSL programming in the game engine Unity is considerably easier than GLSL programming for an OpenGL, OpenGL ES, or WebGL application. Import of meshes and images (i.e. textures) is supported by a graphical user interface; mipmaps and normal maps can be computed automatically; the most common vertex attributes and uniforms are predefined; OpenGL states can be set by very simple commands; etc.

A free version of Unity can be downloaded for Windows and MacOS at Unity's download page. All of the included tutorials work with the free version. Three points should be noted:

- First, **Windows users** have to use the command-line argument `-force-opengl` [1] when starting Unity in order to be able to use GLSL shaders; for example, by changing the `Target` setting in the properties of the desktop icon to: `"C:\Program Files\Unity\Editor\Unity.exe" -force-opengl`. (On MacOS X, OpenGL and therefore GLSL is used by default.) Note that GLSL shaders cannot be used in Unity applications running in a web browser on Windows.
- Secondly, this book assumes that readers are somewhat familiar with Unity. If this is not the case, readers should consult the first three sections of Unity's User Guide [2] (Unity Basics, Building Scenes, Asset Import and Creation).
- Furthermore, as of version 3.5, Unity supports a version of GLSL similar to version 1.0.x for OpenGL ES 2.0 (the specification is available at the “Khronos OpenGL ES API Registry”); however, Unity's shader documentation [3] focuses on shaders written in Unity's own “surface shader” format and Cg/HLSL [4]. There are only very few details documented that are specific to GLSL shaders [5]. Thus, this wikibook might also help to close some gaps in Unity's documentation. However, optimizations (see, for example, this blog) are usually not discussed.

Martin Kraus, August 2012

# Minimal Shader

This tutorial covers the basic steps to create a minimal GLSL shader in Unity.

### Starting Unity and Creating a New Project

After downloading and starting Unity (Windows users have to use the command-line argument `-force-opengl`), you might see an empty project. If not, you should create a new project by choosing **File > New Project...** from the menu. For this tutorial, you don't need to import any packages, but some of the more advanced tutorials require the scripts and skyboxes packages. After creating a new project on Windows, Unity might start without OpenGL support; thus, Windows users should always quit Unity and restart it (with the command-line argument `-force-opengl`) after creating a new project. Then you can open the new project with **File > Open Project...** from the menu.

If you are not familiar with Unity's Scene View, Hierarchy View, Project View and Inspector View, now would be a good time to read the first two (or three) sections (“Unity Basics” and “Building Scenes”) of the Unity User Guide.

### Creating a Shader

Creating a GLSL shader is not complicated: In the **Project View**, click on **Create** and choose **Shader**. A new file named “NewShader” should appear in the Project View. Double-click it to open it (or right-click and choose **Open**). An editor with the default shader in Cg should appear. Delete all the text and copy & paste the following shader into this file:

```glsl
Shader "GLSL basic shader" { // defines the name of the shader
   SubShader { // Unity chooses the subshader that fits the GPU best
      Pass { // some shaders require multiple passes
         GLSLPROGRAM // here begins the part in Unity's GLSL

         #ifdef VERTEX // here begins the vertex shader

         void main() // all vertex shaders define a main() function
         {
            gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;
               // this line transforms the predefined attribute
               // gl_Vertex of type vec4 with the predefined
               // uniform gl_ModelViewProjectionMatrix of type mat4
               // and stores the result in the predefined output
               // variable gl_Position of type vec4.
         }

         #endif // here ends the definition of the vertex shader

         #ifdef FRAGMENT // here begins the fragment shader

         void main() // all fragment shaders define a main() function
         {
            gl_FragColor = vec4(1.0, 0.0, 0.0, 1.0);
               // this fragment shader just sets the output color
               // to opaque red (red = 1.0, green = 0.0, blue = 0.0,
               // alpha = 1.0)
         }

         #endif // here ends the definition of the fragment shader

         ENDGLSL // here ends the part in GLSL
      }
   }
}
```

Save the shader (by clicking the save icon or choosing **File > Save** from the editor's menu).

Congratulations, you have just created a shader in Unity. If you want, you can rename the shader file in the Project View by clicking the name, typing a new name, and pressing Return. (After renaming, reopen the shader in the editor to make sure that you are editing the correct file.)

Unfortunately, there isn't anything to see until the shader is attached to a material.

### Creating a Material and Attaching a Shader

To create a material, go back to Unity and create a new material by clicking **Create** in the **Project View** and choosing **Material**. A new material called “New Material” should appear in the Project View. (You can rename it just like the shader.) If it isn't selected, select it by clicking. Details about the material appear now in the Inspector View. In order to set the shader to this material, you can either

- drag & drop the shader in the **Project View** over the material or
- select the material in the **Project View** and then in the **Inspector View** choose the shader (in this case “GLSL basic shader” as specified in the shader code above) from the drop-down list labeled **Shader**.

In either case, the Preview in the Inspector View of the material should now show a red sphere. If it doesn't and an error message is displayed at the bottom of the Unity window, you should reopen the shader and check in the editor whether the text is the same as given above. Windows users should make sure that OpenGL is supported by restarting Unity with the command-line argument `-force-opengl`.

### Interactively Editing Shaders

This would be a good time to play with the shader; in particular, you can easily change the computed fragment color. Try neon green by opening the shader and replacing the fragment shader with this code:

```glsl
#ifdef FRAGMENT

void main()
{
   gl_FragColor = vec4(0.6, 1.0, 0.0, 1.0);
      // red = 0.6, green = 1.0, blue = 0.0, alpha = 1.0
}

#endif
```

You have to save the code in the editor and activate the Unity window again to apply the new shader. If you select the material in the Project View, the sphere in the Inspector View should now be green. You could also try to modify the red, green, and blue components to find the warmest orange or the darkest blue. (Actually, there is a movie about finding the warmest orange and another about dark blue that is almost black.)

You could also play with the vertex shader, e.g. try this vertex shader:

```glsl
#ifdef VERTEX

void main()
{
   gl_Position = gl_ModelViewProjectionMatrix
      * (vec4(1.0, 0.1, 1.0, 1.0) * gl_Vertex);
}

#endif
```

This flattens any input geometry by multiplying the vertex coordinates component-wise with the vector (1.0, 0.1, 1.0, 1.0), i.e. by scaling the y coordinate by 0.1. (This is a component-wise vector product; for more information on vectors and matrices in GLSL see the discussion in Section “Vector and Matrix Operations”.)

In case the shader does not compile, Unity displays an error message at the bottom of the Unity window and displays the material as bright magenta. In order to see all error messages and warnings, you should select the shader in the **Project View** and read the messages in the **Inspector View**. These messages include line numbers, which you can display in the text editor by choosing **View > Line Numbers** from the text editor's menu. You could also open the Console View by choosing **Window > Console** from the menu, but it does not display all error messages; therefore, the crucial error is often not reported there.

### Attaching a Material to a Game Object

We still have one important step to go: attaching the new material to a triangle mesh. To this end, create a sphere (which is one of the predefined game objects of Unity) by choosing **GameObject > Create Other > Sphere** from the menu. A sphere should appear in the Scene View and the label “Sphere” should appear in the Hierarchy View. (If it doesn't appear in the Scene View, click it in the Hierarchy View, move (without clicking) the mouse over the Scene View and press “f”. The sphere should now appear centered in the Scene View.)

To attach the material to the new sphere, you can:

- drag & drop the material from the **Project View** over the sphere in the **Hierarchy View** or
- drag & drop the material from the **Project View** over the sphere in the **Scene View** or
- select the sphere in the **Hierarchy View**, locate the **Mesh Renderer** component in the **Inspector View** (and open it by clicking the title if it isn't open), and open the **Materials** setting of the Mesh Renderer by clicking it. Change the “Default-Diffuse” material to the new material by clicking the dotted circle icon to the right of the material name and choosing the new material from the pop-up window.

In any case, the sphere in the Scene View should now have the same color as the preview in the Inspector View of the material. Changing the shader should (after saving and switching to Unity) change the appearance of the sphere in the Scene View.

### Saving Your Work in a Scene

There is one more thing: you should save your work in a “scene” (which often corresponds to a game level). Choose **File > Save Scene** (or **File > Save Scene As...**) and choose a file name in the “Assets” directory of your project. The scene file should then appear in the Project View and will be available the next time you open the project.

### One More Note about Terminology

It might be good to clarify the terminology. In GLSL, a “shader” is either a vertex shader or a fragment shader. The combination of both is called a “program”.

Unfortunately, Unity refers to this kind of program as a “shader”, while in Unity a vertex shader is called a “vertex program” and a fragment shader is called a “fragment program”.

To make the confusion perfect, I'm going to use Unity's word “shader” for a GLSL program, i.e. the combination of a vertex and a fragment shader. However, I will use the GLSL terms “vertex shader” and “fragment shader” instead of “vertex program” and “fragment program”.

### Summary

Congratulations, you have reached the end of this tutorial. A few of the things you have seen are:

- How to create a shader.
- How to define a GLSL vertex and fragment shader in Unity.
- How to create a material and attach a shader to the material.
- How to manipulate the output color `gl_FragColor` in the fragment shader.
- How to transform the input attribute `gl_Vertex` in the vertex shader.
- How to create a game object and attach a material to it.

Actually, this was quite a lot of stuff.

### Further Reading

If you still want to know more

- about vertex and fragment shaders in general, you should read the description in Section “OpenGL ES 2.0 Pipeline”.
- about vertex transformations such as `gl_ModelViewProjectionMatrix`, you should read Section “Vertex Transformations”.
- about handling vectors (e.g. the `vec4` type) and matrices in GLSL, you should read Section “Vector and Matrix Operations”.
- about how to apply vertex transformations such as `gl_ModelViewProjectionMatrix`, you should read Section “Applying Matrix Transformations”.
- about Unity's ShaderLab language for specifying shaders, you should read Unity's ShaderLab reference.

# RGB Cube

This tutorial introduces **varying variables**. It is based on Section “Minimal Shader”.

In this tutorial we will write a shader to render an RGB cube similar to the one shown to the left. The color of each point on the surface is determined by its coordinates; i.e., a point at position (x, y, z) has the color (x, y, z). For example, the point (0, 0, 1) is mapped to the color (0, 0, 1), i.e. pure blue. (This is the blue corner in the lower right of the figure to the left.)

### Preparations

Since we want to create an RGB cube, you first have to create a cube game object. As described in Section “Minimal Shader” for a sphere, you can create a cube game object by selecting **GameObject > Create Other > Cube** from the main menu. Continue with creating a material and a shader object and attaching the shader to the material and the material to the cube as described in Section “Minimal Shader”.

### The Shader Code

Here is the shader code, which you should copy & paste into your shader object:

```glsl
Shader "GLSL shader for RGB cube" {
   SubShader {
      Pass {
         GLSLPROGRAM

         #ifdef VERTEX // here begins the vertex shader

         varying vec4 position;
            // this is a varying variable in the vertex shader

         void main()
         {
            position = gl_Vertex + vec4(0.5, 0.5, 0.5, 0.0);
               // Here the vertex shader writes output data
               // to the varying variable. We add 0.5 to the
               // x, y, and z coordinates, because the
               // coordinates of the cube are between -0.5 and
               // 0.5 but we need them between 0.0 and 1.0.

            gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;
         }

         #endif // here ends the vertex shader

         #ifdef FRAGMENT // here begins the fragment shader

         varying vec4 position;
            // this is a varying variable in the fragment shader

         void main()
         {
            gl_FragColor = position;
               // Here the fragment shader reads input data
               // from the varying variable. The red, green, blue,
               // and alpha components of the fragment color are
               // set to the values in the varying variable.
         }

         #endif // here ends the fragment shader

         ENDGLSL
      }
   }
}
```

If your cube is not colored correctly, check the console for error messages (by selecting **Window > Console** from the main menu), make sure you have saved the shader code, and check whether you have attached the shader object to the material object and the material object to the game object.

### Varying Variables

The main task of our shader is to set the output fragment color (`gl_FragColor`) in the fragment shader to the position (`gl_Vertex`) that is available in the vertex shader. Actually, this is not quite true: the coordinates in `gl_Vertex` for Unity's default cube are between -0.5 and +0.5, while we would like to have color components between 0.0 and 1.0; thus, we need to add 0.5 to the x, y, and z components, which is done by this expression: `gl_Vertex + vec4(0.5, 0.5, 0.5, 0.0)`.

The main problem, however, is: how do we get any value from the vertex shader to the fragment shader? It turns out that the **only** way to do this is to use varying variables (or varyings for short). Output of the vertex shader can be written to a varying variable and then it can be read as input by the fragment shader. This is exactly what we need.

To specify a varying variable, it has to be defined with the modifier `varying` (before the type) in both the vertex and the fragment shader, outside of any function; in our example: `varying vec4 position;`. And here comes the most important rule about varying variables:

The type and name of a varying variable definition in the vertex shader has to match exactly the type and name of a varying variable definition in the fragment shader and vice versa.

This is required to avoid ambiguous cases where the GLSL compiler cannot figure out which varying variable of the vertex shader should be matched to which varying variable of the fragment shader.

### A Neat Trick for Varying Variables in Unity

The requirement that the definitions of varying variables in the vertex and fragment shader match each other often results in errors, for example if a programmer changes a type or name of a varying variable in the vertex shader but forgets to change it in the fragment shader. Fortunately, there is a nice trick in Unity that avoids the problem. Consider the following shader:

```glsl
Shader "GLSL shader for RGB cube" {
   SubShader {
      Pass {
         GLSLPROGRAM // here begin the vertex and the fragment shader

         varying vec4 position;
            // this line is part of the vertex and the fragment shader

         #ifdef VERTEX // here begins the part that is only in the vertex shader

         void main()
         {
            position = gl_Vertex + vec4(0.5, 0.5, 0.5, 0.0);
            gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;
         }

         #endif // here ends the part that is only in the vertex shader

         #ifdef FRAGMENT // here begins the part that is only in the fragment shader

         void main()
         {
            gl_FragColor = position;
         }

         #endif // here ends the part that is only in the fragment shader

         ENDGLSL // here end the vertex and the fragment shader
      }
   }
}
```

As the comments in this shader explain, the line `#ifdef VERTEX` doesn't actually mark the beginning of the vertex shader but the beginning of a part that is **only** in the vertex shader. Analogously, `#ifdef FRAGMENT` marks the beginning of a part that is only in the fragment shader. In fact, both shaders begin with the line `GLSLPROGRAM`. Therefore, any code between `GLSLPROGRAM` and the first `#ifdef` line will be shared by the vertex and the fragment shader. (If you are familiar with the C or C++ preprocessor, you might have guessed this already.)

This is perfect for definitions of varying variables because it means that we may type the definition only once and it will be put into both the vertex and the fragment shader; thus, matching definitions are guaranteed! I.e., we have to type less and there is no way to produce compiler errors because of mismatches between the definitions of varying variables. (Of course, the cost is that we have to type all these `#ifdef` and `#endif` lines.)

### Variations of this Shader

The RGB cube represents the set of available colors (i.e. the gamut of the display). Thus, it can also be used to show the effect of a color transformation. For example, a color to gray transformation would compute either the mean of the red, green, and blue color components, i.e. (R + G + B) / 3, and then put this value in all three color components of the fragment color to obtain a gray value of the same intensity. Instead of the mean, the relative luminance could also be used, which is 0.21 R + 0.72 G + 0.07 B. Of course, any other color transformation (changing saturation, contrast, hue, etc.) is also applicable.
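The gray-scale variation can be sketched by replacing only the fragment shader of the RGB cube shader; the luminance weights 0.21, 0.72, and 0.07 used here are an assumption (a common approximation for relative luminance), and the varying `position` is the one from the RGB cube shader above:

```glsl
#ifdef FRAGMENT

varying vec4 position; // must match the declaration in the vertex shader

void main()
{
   // weight the interpolated color components by an approximation of
   // the relative luminance and use the result for all three color
   // components to obtain a gray value of the same intensity
   float luminance = 0.21 * position.r + 0.72 * position.g
      + 0.07 * position.b;
   gl_FragColor = vec4(luminance, luminance, luminance, 1.0);
}

#endif
```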

Another variation of this shader could compute a CMY (cyan, magenta, yellow) cube: for position (x, y, z) you could subtract from a pure white an amount of red that is proportional to x in order to produce cyan. Furthermore, you would subtract an amount of green in proportion to the y component to produce magenta and also an amount of blue in proportion to z to produce yellow.

If you really want to get fancy, you could compute an HSV (hue, saturation, value) cylinder. For x and z coordinates between -0.5 and +0.5, you can get an angle between 0° and 360° with `180.0 + degrees(atan(z, x))` in GLSL and a distance between 0 and 1 from the y axis with `2.0 * sqrt(x * x + z * z)`. The y coordinate for Unity's built-in cylinder is between -1 and 1, which can be translated to a value between 0 and 1 by (y + 1) / 2. The computation of RGB colors from HSV coordinates is described in the article on HSV in Wikipedia.
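As a starting point, a vertex shader along these lines could compute the cylindrical coordinates and pass them to the fragment shader; this is only a sketch (the varying name `hsv` is my own choice, and the fragment shader would still have to convert HSV to RGB as described in the Wikipedia article):

```glsl
#ifdef VERTEX

varying vec4 hsv; // hue (scaled to [0, 1]), saturation, value

void main()
{
   float x = gl_Vertex.x;
   float y = gl_Vertex.y;
   float z = gl_Vertex.z;
   float hue = 180.0 + degrees(atan(z, x));
      // angle around the y axis, between 0 and 360
   float saturation = 2.0 * sqrt(x * x + z * z);
      // distance from the y axis, between 0 and 1
   float value = (y + 1.0) / 2.0;
      // y between -1 and 1, mapped to [0, 1]
   hsv = vec4(hue / 360.0, saturation, value, 1.0);
   gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;
}

#endif
```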

### Interpolation of Varying Variables

The story about varying variables is not quite over yet. If you select the cube game object, you will see in the Scene View that it consists of only 12 triangles and 8 vertices. Thus, the vertex shader might be called only eight times and only eight different outputs are written to the varying variable. However, there are many more colors on the cube. How did that happen?

The answer is implied by the name **varying** variables. They are called this way because they vary across a triangle. In fact, the vertex shader is only called for each vertex of each triangle. If the vertex shader writes different values to a varying variable for different vertices, the values are interpolated across the triangle. The fragment shader is then called for each pixel that is covered by the triangle and receives interpolated values of the varying variables. The details of this interpolation are described in Section “Rasterization”.

If you want to make sure that a fragment shader receives one exact, non-interpolated value by a vertex shader, you have to make sure that the vertex shader writes the same value to the varying variable for all vertices of a triangle.
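For example, a minimal sketch of such a vertex shader writes the same constant to the varying for every vertex, so the interpolated value that reaches the fragment shader is that exact constant everywhere:

```glsl
#ifdef VERTEX

varying vec4 color;

void main()
{
   // the same value for every vertex of every triangle;
   // interpolation across a triangle then yields exactly this value
   color = vec4(1.0, 0.0, 0.0, 1.0);
   gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;
}

#endif
```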

### Summary

And this is the end of this tutorial. Congratulations! Among other things, you have seen:

- What an RGB cube is.
- What varying variables are good for and how to define them.
- How to make sure that a varying variable has the same name and type in the vertex shader and the fragment shader.
- How the values written to a varying variable by the vertex shader are interpolated across a triangle before they are received by the fragment shader.

### Further Reading

If you want to know more

- about the data flow in and out of vertex and fragment shaders, you should read the description in Section “OpenGL ES 2.0 Pipeline”.
- about vector and matrix operations (e.g. the expression `gl_Vertex + vec4(0.5, 0.5, 0.5, 0.0)`), you should read Section “Vector and Matrix Operations”.
- about the interpolation of varying variables, you should read Section “Rasterization”.
- about Unity's official documentation of writing vertex shaders and fragment shaders in Unity's ShaderLab, you should read Unity's ShaderLab reference about “GLSL Shader Programs”.

# Debugging of Shaders

This tutorial introduces **attribute variables**. It is based on Section “Minimal Shader” and Section “RGB Cube”.

This tutorial also introduces the main technique to debug shaders in Unity: false-color images, i.e. a value is visualized by setting one of the components of the fragment color to it. Then the intensity of that color component in the resulting image allows you to make conclusions about the value in the shader. This might appear to be a very primitive debugging technique because it is a very primitive debugging technique. Unfortunately, there is no alternative in Unity.

### Where Does the Vertex Data Come from?

In Section “RGB Cube” you have seen how the fragment shader gets its data from the vertex shader by means of varying variables. The question here is: where does the vertex shader get its data from? Within Unity, the answer is that the Mesh Renderer component of a game object sends all the data of the mesh of the game object to OpenGL in each frame. (This is often called a “draw call”. Note that each draw call has some performance overhead; thus, it is much more efficient to send one large mesh with one draw call to OpenGL than to send several smaller meshes with multiple draw calls.) This data usually consists of a long list of triangles, where each triangle is defined by three vertices and each vertex has certain attributes, including position. These attributes are made available in the vertex shader by means of attribute variables.

### Built-in Attribute Variables and how to Visualize Them

In Unity, most of the standard attributes (position, color, surface normal, and texture coordinates) are built in, i.e. you need not (in fact must not) define them. The names of these built-in attributes are actually defined by the OpenGL “compatibility profile” because such built-in names are needed if you mix an OpenGL application that was written for the fixed-function pipeline with a (programmable) vertex shader. If you had to define them, the definitions (only in the vertex shader) would look like this:

```glsl
attribute vec4 gl_Vertex; // position (in object coordinates,
   // i.e. local or model coordinates)
attribute vec4 gl_Color; // color (usually constant)
attribute vec3 gl_Normal; // surface normal vector
   // (in object coordinates; usually normalized to unit length)
attribute vec4 gl_MultiTexCoord0; // 0th set of texture coordinates
   // (a.k.a. “UV”; between 0 and 1)
attribute vec4 gl_MultiTexCoord1; // 1st set of texture coordinates
   // (a.k.a. “UV”; between 0 and 1)
...
```

There is only one attribute variable that is provided by Unity but has no standard name in OpenGL, namely the tangent vector, i.e. a vector that is orthogonal to the surface normal. You should define this variable yourself as an attribute variable of type `vec4` with the specific name `Tangent`, as shown in the following shader:

```glsl
Shader "GLSL shader with all built-in attributes" {
   SubShader {
      Pass {
         GLSLPROGRAM

         varying vec4 color;

         #ifdef VERTEX

         attribute vec4 Tangent; // this attribute is specific to Unity

         void main()
         {
            color = gl_MultiTexCoord0; // set the varying variable

            // other possibilities to play with:
            // color = gl_Vertex;
            // color = gl_Color;
            // color = vec4(gl_Normal, 1.0);
            // color = gl_MultiTexCoord0;
            // color = gl_MultiTexCoord1;
            // color = Tangent;

            gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;
         }

         #endif

         #ifdef FRAGMENT

         void main()
         {
            gl_FragColor = color; // set the output fragment color
         }

         #endif

         ENDGLSL
      }
   }
}
```

In Section “RGB Cube” we have already seen how to visualize the `gl_Vertex` coordinates by setting the fragment color to those values. In this example, the fragment color is set to `gl_MultiTexCoord0` such that we can see what kind of texture coordinates Unity provides.

Note that only the first three components of `Tangent` represent the tangent direction. The scaling and the fourth component are set in a specific way, which is mainly useful for parallax mapping (see Section “Projection of Bumpy Surfaces”).

### How to Interpret False-Color Images

When trying to understand the information in a false-color image, it is important to focus on one color component only. For example, if the standard attribute `gl_MultiTexCoord0` for a sphere is written to the fragment color, then the red component of the fragment visualizes the `x` coordinate of `gl_MultiTexCoord0`; i.e., it doesn't matter whether the output color is maximum pure red or maximum yellow or maximum magenta, in all cases the red component is 1. On the other hand, it also doesn't matter for the red component whether the color is blue or green or cyan of any intensity because the red component is 0 in all cases. If you have never learned to focus solely on one color component, this is probably quite challenging; therefore, you might consider looking at only one color component at a time, for example by using this line to set the varying in the vertex shader:

color = vec4(gl_MultiTexCoord0.x, 0.0, 0.0, 1.0);

This sets the red component of the varying variable to the `x` component of `gl_MultiTexCoord0` but sets the green and blue components to 0 (and the alpha or opacity component to 1, but that doesn't matter in this shader).

If you focus on the red component or visualize only the red component you should see that it increases from 0 to 1 as you go around the sphere and after 360° drops to 0 again. It actually behaves similar to a longitude coordinate on the surface of a planet. (In terms of spherical coordinates, it corresponds to the azimuth.)

If the `x` component of `gl_MultiTexCoord0` corresponds to the longitude, one would expect that the `y` component would correspond to the latitude (or the inclination in spherical coordinates). However, note that texture coordinates are always between 0 and 1; therefore, the value is 0 at the bottom (south pole) and 1 at the top (north pole). You can visualize the `y` component as green on its own with:

color = vec4(0.0, gl_MultiTexCoord0.y, 0.0, 1.0);

Texture coordinates are particularly nice to visualize because they are between 0 and 1, just like color components. Almost as nice are coordinates of normalized vectors (i.e., vectors of length 1; for example, `gl_Normal` is usually normalized) because they are always between -1 and +1. To map this range to the range from 0 to 1, you add 1 to each component and divide all components by 2, e.g.:

color = vec4((gl_Normal + vec3(1.0, 1.0, 1.0)) / 2.0, 1.0);

Note that `gl_Normal` is a three-dimensional vector. Black then corresponds to the coordinate -1 and full intensity of one component to the coordinate +1.

If the value that you want to visualize is in a range other than 0 to 1 or -1 to +1, you have to map it to the range from 0 to 1, which is the range of color components. If you don't know which values to expect, you just have to experiment. What helps here is that if you specify color components outside of the range 0 to 1, they are automatically clamped to this range; i.e., values less than 0 are set to 0 and values greater than 1 are set to 1. Thus, when the color component is 0 or 1, you know at least that the value is less than or greater than what you assumed, and you can then adapt the mapping iteratively until the color component is between 0 and 1.
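The mapping and clamping just described are simple arithmetic; here is a plain-Python sketch (not GLSL) of both steps, where the example ranges are made up for illustration:

```python
# Illustration of mapping a value from an assumed range [lo, hi] to
# [0, 1] for visualization, plus the clamping that the GPU applies
# to color components automatically.

def clamp01(x):
    # values below 0 become 0, values above 1 become 1
    return max(0.0, min(1.0, x))

def map_to_01(value, lo, hi):
    # linear map from [lo, hi] to [0, 1]; e.g. lo=-1, hi=1 gives
    # (value + 1) / 2, the mapping used above for normal vectors
    return (value - lo) / (hi - lo)

print(map_to_01(0.0, -1.0, 1.0))          # 0.5: a zero component maps to mid-gray
print(clamp01(map_to_01(7.3, 0.0, 5.0)))  # 1.46 is clamped to 1.0
```

If the second call prints 1.0, you know your assumed upper bound of 5.0 was too small and can widen the range iteratively, as the text suggests.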

### Debugging Practice[edit]

In order to practice the debugging of shaders, this section includes some lines that produce black colors when the assignment to `color` in the vertex shader is replaced by each of them. Your task is to figure out for each line why the result is black. To this end, you should try to visualize any value that you are not absolutely sure about and map values less than 0 or greater than 1 to other ranges such that the values are visible and you have at least an idea of the range they are in. Note that most of the functions and operators are documented in Section “Vector and Matrix Operations”.

color = gl_MultiTexCoord0 - vec4(1.5, 2.3, 1.1, 0.0);

color = vec4(gl_MultiTexCoord0.z);

color = gl_MultiTexCoord0 / tan(0.0);

The following lines require some knowledge about the dot and cross product:

color = dot(gl_Normal, vec3(Tangent)) * gl_MultiTexCoord0;

color = dot(cross(gl_Normal, vec3(Tangent)), gl_Normal) * gl_MultiTexCoord0;

color = vec4(cross(gl_Normal, gl_Normal), 1.0);

color = vec4(cross(gl_Normal, vec3(gl_Vertex)), 1.0); // only for a sphere!
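The dot and cross products that these lines rely on can be explored numerically. The following plain-Python sketch (not GLSL) defines both products and checks two identities that are useful for the exercises, with made-up example vectors:

```python
# Two identities relevant here: cross(a, a) is the zero vector, and
# cross(a, b) is perpendicular to both a and b, so dot(cross(a, b), a) = 0.

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

n = (0.0, 0.0, 1.0)  # an example normal vector
t = (1.0, 0.0, 0.0)  # an example tangent vector

print(cross(n, n))          # (0.0, 0.0, 0.0)
print(dot(cross(n, t), n))  # 0.0
```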

Do the functions `radians()` and `noise4()` always return black? What are they good for?

color = radians(gl_MultiTexCoord0);

color = noise4(gl_MultiTexCoord0);

Consult the documentation in the “OpenGL ES Shading Language 1.0.17 Specification” available at the “Khronos OpenGL ES API Registry” to figure out what `radians()` is good for and what the problem with `noise4()` is.

### Special Variables in the Fragment Shader[edit]

Attributes are specific to vertices, i.e., they usually have different values for different vertices. There are similar variables for fragment shaders, i.e., variables that have different values for each fragment. However, they are different from attributes because they are not specified by a mesh (i.e. a list of triangles). They are also different from varyings because they are not set explicitly by the vertex shader.

Specifically, a four-dimensional vector `gl_FragCoord` is available that contains the screen (or window) coordinates of the fragment being processed; see Section “Vertex Transformations” for a description of the screen coordinate system.

Moreover, a boolean variable `gl_FrontFacing` is provided that specifies whether the front face or the back face of a triangle is being rendered. Front faces usually face the “outside” of a model and back faces the “inside”; however, there is no clear outside or inside if the model is not a closed surface. Usually, the surface normal vectors point in the direction of the front face, but this is not required. In fact, front faces and back faces are specified by the vertex order of the triangles: if the vertices appear in counter-clockwise order, the front face is visible; if they appear in clockwise order, the back face is visible. An application is shown in Section “Cutaways”.
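The winding test can be sketched in plain Python (not GLSL): the sign of the signed area of a triangle in 2D screen coordinates tells whether its vertices appear in counter-clockwise (front face) or clockwise (back face) order; the triangle coordinates are made up for illustration:

```python
def signed_area(a, b, c):
    # twice the signed area of triangle abc
    # (positive = counter-clockwise on screen)
    return (b[0] - a[0]) * (c[1] - a[1]) - (c[0] - a[0]) * (b[1] - a[1])

tri = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0)]

print(signed_area(*tri) > 0)                    # True: counter-clockwise, front face
print(signed_area(tri[0], tri[2], tri[1]) > 0)  # False: reversed order, back face
```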

### Summary[edit]

Congratulations, you have reached the end of this tutorial! We have seen:

- The list of built-in attributes in Unity: `gl_Vertex`, `gl_Color`, `gl_Normal`, `gl_MultiTexCoord0`, `gl_MultiTexCoord1`, and the special `Tangent`.
- How to visualize these attributes (or any other value) by setting components of the output fragment color.
- The two additional special variables that are available in fragment programs: `gl_FragCoord` and `gl_FrontFacing`.

### Further Reading[edit]

If you still want to know more

- about the data flow in vertex and fragment shaders, you should read the description in Section “OpenGL ES 2.0 Pipeline”.
- about operations and functions for vectors, you should read Section “Vector and Matrix Operations”.

# Shading in World Space[edit]

This tutorial introduces **uniform variables**. It is based on Section “Minimal Shader”, Section “RGB Cube”, and Section “Debugging of Shaders”.

In this tutorial we will look at a shader that changes the fragment color depending on its position in the world. The concept is not too complicated; however, there are extremely important applications, e.g. shading with lights and environment maps. We will also have a look at shaders in the real world; i.e., what is necessary to enable non-programmers to use your shaders?

### Transforming from Object to World Space[edit]

As mentioned in Section “Debugging of Shaders”, the attribute `gl_Vertex` specifies object coordinates, i.e. coordinates in the local object (or model) space of a mesh. The object space (or object coordinate system) is specific to each game object; however, all game objects are transformed into one common coordinate system — the world space.

If a game object is put directly into the world space, the object-to-world transformation is specified by the Transform component of the game object. To see it, select the object in the **Scene View** or the **Hierarchy View** and then find the Transform component in the **Inspector View**. There are parameters for “Position”, “Rotation” and “Scale” in the Transform component, which specify how vertices are transformed from object coordinates to world coordinates. (If a game object is part of a group of objects, which is shown in the Hierarchy View by means of indentation, then the Transform component only specifies the transformation from object coordinates of a game object to the object coordinates of the parent. In this case, the actual object-to-world transformation is given by the combination of the transformation of an object with the transformations of its parent, grandparent, etc.) The transformations of vertices by translations, rotations and scalings, as well as the combination of transformations and their representation as 4×4 matrices, are discussed in Section “Vertex Transformations”.

Back to our example: the transformation from object space to world space is put into a 4×4 matrix, which is also known as the “model matrix” (since this transformation is also known as the “model transformation”). This matrix is available in the uniform variable `_Object2World`, which is defined and used in the following shader:

```glsl
Shader "GLSL shading in world space" {
   SubShader {
      Pass {
         GLSLPROGRAM

         // definition of a Unity-specific uniform variable
         uniform mat4 _Object2World;

         varying vec4 position_in_world_space;

         #ifdef VERTEX

         void main()
         {
            position_in_world_space = _Object2World * gl_Vertex;
               // transformation of gl_Vertex from object coordinates
               // to world coordinates

            gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;
         }

         #endif

         #ifdef FRAGMENT

         void main()
         {
            float dist = distance(position_in_world_space,
               vec4(0.0, 0.0, 0.0, 1.0));
               // computes the distance between the fragment position
               // and the origin (the 4th coordinate should always be
               // 1 for points)

            if (dist < 5.0)
            {
               gl_FragColor = vec4(0.0, 1.0, 0.0, 1.0); // color near origin
            }
            else
            {
               gl_FragColor = vec4(0.3, 0.3, 0.3, 1.0); // color far from origin
            }
         }

         #endif

         ENDGLSL
      }
   }
}
```

Note that this shader makes sure that the definition of the uniform is included in both the vertex and the fragment shader (although this particular fragment shader doesn't need it). This is similar to the definition of varyings discussed in Section “RGB Cube”.

Usually, an OpenGL application has to set the values of uniform variables; however, Unity takes care of always setting the correct values of predefined uniform variables such as `_Object2World`; thus, we don't have to worry about it.

This shader transforms the vertex position to world space and passes it to the fragment shader in a varying. For the fragment shader, the varying variable contains the interpolated position of the fragment in world coordinates. Based on the distance of this position to the origin of the world coordinate system, one of two colors is set. Thus, if you move an object with this shader around in the editor, it will turn green near the origin of the world coordinate system and dark gray farther away from it.
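The fragment shader's decision is just a distance test; here is the same logic as a plain-Python sketch (not GLSL), using the colors and threshold from the shader above:

```python
# Pick one of two colors based on the distance of a world-space
# position (x, y, z, w with w = 1) to the origin.
import math

NEAR_COLOR = (0.0, 1.0, 0.0, 1.0)  # green near the origin
FAR_COLOR  = (0.3, 0.3, 0.3, 1.0)  # dark gray far away

def shade(position, threshold=5.0):
    dist = math.sqrt(sum(c * c for c in position[:3]))  # distance to origin
    return NEAR_COLOR if dist < threshold else FAR_COLOR

print(shade((1.0, 2.0, 2.0, 1.0)))  # distance 3, so green
print(shade((3.0, 4.0, 5.0, 1.0)))  # distance about 7.07, so dark gray
```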

### More Unity-Specific Uniforms[edit]

There are, in fact, several predefined uniform variables similar to `_Object2World`. Here is a short list (including `_Object2World`), which appears in the shader code of several tutorials:

```glsl
// The following built-in uniforms (except _LightColor0 and
// _LightMatrix0) are also defined in "UnityCG.glslinc",
// i.e. one could #include "UnityCG.glslinc"

uniform vec4 _Time, _SinTime, _CosTime; // time values from Unity

uniform vec4 _ProjectionParams;
   // x = 1 or -1 (-1 if projection is flipped)
   // y = near plane; z = far plane; w = 1/far plane

uniform vec4 _ScreenParams;
   // x = width; y = height; z = 1 + 1/width; w = 1 + 1/height

uniform vec4 unity_Scale; // w = 1/scale; see _World2Object

uniform vec3 _WorldSpaceCameraPos;

uniform mat4 _Object2World; // model matrix
uniform mat4 _World2Object; // inverse model matrix
   // (all but the bottom-right element have to be scaled
   // with unity_Scale.w if scaling is important)

uniform vec4 _LightPositionRange; // xyz = pos, w = 1/range
uniform vec4 _WorldSpaceLightPos0; // position or direction of light source
uniform vec4 _LightColor0; // color of light source
uniform mat4 _LightMatrix0; // matrix to light space
```

As the comments suggest, instead of defining all these uniforms (except `_LightColor0` and `_LightMatrix0`), you could also include the file `UnityCG.glslinc`. However, for some unknown reason `_LightColor0` and `_LightMatrix0` are not included in this file; thus, we have to define them separately:

```glsl
#include "UnityCG.glslinc"
uniform vec4 _LightColor0;
uniform mat4 _LightMatrix0;
```

Unity does not always update all of these uniforms. In particular, `_WorldSpaceLightPos0`, `_LightColor0`, and `_LightMatrix0` are only set correctly for shader passes that are tagged appropriately, e.g. with `Tags {"LightMode" = "ForwardBase"}` as the first line in the `Pass {...}` block; see also Section “Diffuse Reflection”.

### More OpenGL-Specific Uniforms[edit]

Another class of built-in uniforms is defined for the OpenGL compatibility profile, for example the `mat4` matrix `gl_ModelViewProjectionMatrix`, which is equivalent to the matrix product `gl_ProjectionMatrix * gl_ModelViewMatrix` of two other built-in uniforms. The corresponding transformations are described in detail in Section “Vertex Transformations”.

As you can see in the shader above, these uniforms don't have to be defined; they are always available in GLSL shaders in Unity. If you had to define them, the definitions would look like this:

```glsl
uniform mat4 gl_ModelViewMatrix;
uniform mat4 gl_ProjectionMatrix;
uniform mat4 gl_ModelViewProjectionMatrix;
uniform mat4 gl_TextureMatrix[gl_MaxTextureCoords];

uniform mat3 gl_NormalMatrix;
   // transpose of the inverse of gl_ModelViewMatrix

uniform mat4 gl_ModelViewMatrixInverse;
uniform mat4 gl_ProjectionMatrixInverse;
uniform mat4 gl_ModelViewProjectionMatrixInverse;
uniform mat4 gl_TextureMatrixInverse[gl_MaxTextureCoords];

uniform mat4 gl_ModelViewMatrixTranspose;
uniform mat4 gl_ProjectionMatrixTranspose;
uniform mat4 gl_ModelViewProjectionMatrixTranspose;
uniform mat4 gl_TextureMatrixTranspose[gl_MaxTextureCoords];

uniform mat4 gl_ModelViewMatrixInverseTranspose;
uniform mat4 gl_ProjectionMatrixInverseTranspose;
uniform mat4 gl_ModelViewProjectionMatrixInverseTranspose;
uniform mat4 gl_TextureMatrixInverseTranspose[gl_MaxTextureCoords];

struct gl_LightModelParameters {
   vec4 ambient;
};
uniform gl_LightModelParameters gl_LightModel;

// ...
```

In fact, the compatibility profile of OpenGL defines even more uniforms; see Chapter 7 of the “OpenGL Shading Language 4.10.6 Specification” available at Khronos' OpenGL page. Unity supports many of them, but not all.

Some of these uniforms are arrays, e.g. `gl_TextureMatrix`. In fact, an array of matrices `gl_TextureMatrix[0]`, `gl_TextureMatrix[1]`, ..., `gl_TextureMatrix[gl_MaxTextureCoords - 1]` is available, where `gl_MaxTextureCoords` is a built-in integer constant.

### Computing the View Matrix[edit]

Traditionally, many computations are performed in view space, which is just a rotated and translated version of world space (see Section “Vertex Transformations” for details). Therefore, OpenGL offers only the product of the model matrix and the view matrix, i.e. the model-view matrix, which is available in the uniform `gl_ModelViewMatrix`. The view matrix itself is not available, and Unity doesn't provide it either.

However, `_Object2World` is just the model matrix, and `_World2Object` is the inverse model matrix (except that all but the bottom-right element have to be scaled by `unity_Scale.w`). Thus, we can easily compute the view matrix. The mathematics looks like this: if M denotes the model matrix and V the view matrix, then the model-view matrix is V M, and multiplying it on the right by the inverse model matrix yields V M M⁻¹ = V.

In other words, the view matrix is the product of the model-view matrix and the inverse model matrix (which is `_World2Object * unity_Scale.w` except for the bottom-right element, which is 1). Assuming that we have defined the uniforms `_World2Object` and `unity_Scale`, we can compute the view matrix this way in GLSL:

```glsl
mat4 modelMatrixInverse = _World2Object * unity_Scale.w;
modelMatrixInverse[3][3] = 1.0;
mat4 viewMatrix = gl_ModelViewMatrix * modelMatrixInverse;
```
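The relation behind this computation (view matrix = model-view matrix times inverse model matrix) can be checked numerically. Here is a plain-Python sketch (not GLSL) with 4×4 translation matrices, chosen because their inverses are trivial; the particular translations are made up for illustration:

```python
# Check that multiplying the model-view matrix (V * M) by the inverse
# model matrix on the right recovers the view matrix V.

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def translate(x, y, z):
    return [[1, 0, 0, x], [0, 1, 0, y], [0, 0, 1, z], [0, 0, 0, 1]]

M     = translate(1, 2, 3)     # an example model matrix
V     = translate(-4, 0, 0)    # an example view matrix
M_inv = translate(-1, -2, -3)  # inverse of M

MV = matmul(V, M)              # what OpenGL provides as gl_ModelViewMatrix
print(matmul(MV, M_inv) == V)  # True: multiplying by M⁻¹ recovers V
```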

### User-Specified Uniforms: Shader Properties[edit]

There is one more important type of uniform variable: uniforms that can be set by the user. In Unity, these are called shader properties. You can think of them as parameters of the shader. A shader without parameters is usually used only by its programmer because even the smallest necessary change requires some programming. On the other hand, a shader using parameters with descriptive names can be used by other people, even non-programmers, e.g. CG artists. Imagine you are in a game development team and a CG artist asks you to adapt your shader for each of 100 design iterations. It should be obvious that a few parameters, which even a CG artist can play with, might save **you** a lot of time. Also, imagine you want to sell your shader: parameters will often dramatically increase its value.

Since the description of shader properties in Unity's ShaderLab reference is quite good, here is only an example of how to use shader properties in our shader. We first declare the properties and then define uniforms of the same names and corresponding types.

```glsl
Shader "GLSL shading in world space" {
   Properties {
      _Point ("a point in world space", Vector) = (0., 0., 0., 1.0)
      _DistanceNear ("threshold distance", Float) = 5.0
      _ColorNear ("color near to point", Color) = (0.0, 1.0, 0.0, 1.0)
      _ColorFar ("color far from point", Color) = (0.3, 0.3, 0.3, 1.0)
   }
   SubShader {
      Pass {
         GLSLPROGRAM

         // uniforms corresponding to properties
         uniform vec4 _Point;
         uniform float _DistanceNear;
         uniform vec4 _ColorNear;
         uniform vec4 _ColorFar;

         #include "UnityCG.glslinc"
            // defines _Object2World and _World2Object

         varying vec4 position_in_world_space;

         #ifdef VERTEX

         void main()
         {
            mat4 modelMatrix = _Object2World;
            position_in_world_space = modelMatrix * gl_Vertex;
            gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;
         }

         #endif

         #ifdef FRAGMENT

         void main()
         {
            float dist = distance(position_in_world_space, _Point);

            if (dist < _DistanceNear)
            {
               gl_FragColor = _ColorNear;
            }
            else
            {
               gl_FragColor = _ColorFar;
            }
         }

         #endif

         ENDGLSL
      }
   }
}
```

With these parameters, a non-programmer can modify the effect of our shader. This is nice; however, the properties of the shader (and in fact uniforms in general) can also be set by scripts! For example, a JavaScript attached to the game object that is using the shader can set the properties with these lines:

```javascript
renderer.sharedMaterial.SetVector("_Point", Vector4(1.0, 0.0, 0.0, 1.0));
renderer.sharedMaterial.SetFloat("_DistanceNear", 10.0);
renderer.sharedMaterial.SetColor("_ColorNear", Color(1.0, 0.0, 0.0));
renderer.sharedMaterial.SetColor("_ColorFar", Color(1.0, 1.0, 1.0));
```

Use `sharedMaterial` if you want to change the parameters for all objects that use this material, and just `material` if you want to change the parameters for only one object. With scripting you could, for example, set `_Point` to the position of another object (i.e. the `position` of its Transform component). In this way, you can specify a point just by moving another object around in the editor. In order to write such a script, select **Create > JavaScript** in the **Project View** and copy & paste this code:

```javascript
@script ExecuteInEditMode() // make sure to run in edit mode

var other : GameObject; // another user-specified object

function Update () // this function is called for every frame
{
   if (null != other) // has the user specified an object?
   {
      renderer.sharedMaterial.SetVector("_Point",
         other.transform.position);
         // set the shader property _Point to the position
         // of the other object
   }
}
```

Then, you should attach the script to the object with the shader and drag & drop another object onto the `other` variable of the script in the **Inspector View**.

### Summary[edit]

Congratulations, you made it! (In case you wonder: yes, I'm also talking to myself here. ;) We discussed:

- How to transform a vertex into world coordinates.
- The most important Unity-specific uniforms.
- The most important OpenGL-specific uniforms that are supported by Unity.
- How to make a shader more useful and valuable by adding shader properties.

### Further Reading[edit]

If you want to know more

- about vector and matrix operations (e.g. the `distance()` function), you should read Section “Vector and Matrix Operations”.
- about the standard vertex transformations, e.g. the model matrix and the view matrix, you should read Section “Vertex Transformations”.
- about the application of transformation matrices to points and directions, you should read Section “Applying Matrix Transformations”.
- about the specification of shader properties, you should read Unity's documentation about “ShaderLab syntax: Properties”.

# Cutaways[edit]

This tutorial covers **discarding fragments**, determining whether the front face or back face is rendered, and **front-face and back-face culling**. This tutorial assumes that you are familiar with varying variables as discussed in Section “RGB Cube”.

The main theme of this tutorial is to cut away triangles or fragments even though they are part of a mesh that is being rendered. The two main reasons are: we want to look through a triangle or fragment (as in the case of the roof in the drawing to the left, which is only partly cut away), or we know that a triangle isn't visible anyway and can save some performance by not processing it. OpenGL supports these situations in several ways; we will discuss two of them.

### Very Cheap Cutaways[edit]

The following shader is a very cheap way of cutting away parts of a mesh: all fragments that have a positive y coordinate in object coordinates (i.e. in the coordinate system in which the mesh was modeled; see Section “Vertex Transformations” for details about coordinate systems) are cut away. Here is the code:

```glsl
Shader "GLSL shader using discard" {
   SubShader {
      Pass {
         Cull Off // turn off triangle culling, alternatives are:
            // Cull Back (or nothing): cull only back faces
            // Cull Front : cull only front faces

         GLSLPROGRAM

         varying vec4 position_in_object_coordinates;

         #ifdef VERTEX

         void main()
         {
            position_in_object_coordinates = gl_Vertex;
            gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;
         }

         #endif

         #ifdef FRAGMENT

         void main()
         {
            if (position_in_object_coordinates.y > 0.0)
            {
               discard; // drop the fragment if y coordinate > 0
            }
            if (gl_FrontFacing) // are we looking at a front face?
            {
               gl_FragColor = vec4(0.0, 1.0, 0.0, 1.0); // yes: green
            }
            else
            {
               gl_FragColor = vec4(1.0, 0.0, 0.0, 1.0); // no: red
            }
         }

         #endif

         ENDGLSL
      }
   }
}
```

When you apply this shader to any of the default objects, the shader will cut away half of them. This is a very cheap way of producing hemispheres or open cylinders.

### Discarding Fragments[edit]

Let's first focus on the `discard` instruction in the fragment shader. This instruction basically just discards the processed fragment. (This was called a fragment “kill” in earlier shading languages; I can understand that the fragments prefer the term “discard”.) Depending on the hardware, this can be a quite expensive technique in the sense that rendering might perform considerably worse as soon as there is one shader that includes a `discard` instruction (regardless of how many fragments are actually discarded; just the presence of the instruction may result in the deactivation of some important optimizations). Therefore, you should avoid this instruction whenever possible, in particular when you run into performance problems.

One more note: the condition for the fragment `discard` includes only an object coordinate. The consequence is that you can rotate and move the object in any way and the cut-away part will always rotate and move with it. You might want to check what cutting in world space looks like: change the vertex and fragment shader such that a world coordinate is used in the condition for the fragment `discard`. Tip: see Section “Shading in World Space” for how to transform the vertex into world space.

### Better Cutaways[edit]

If you are not(!) familiar with scripting in Unity, you might try the following idea to improve the shader: change it such that fragments are discarded if the y coordinate is greater than some threshold variable. Then introduce a shader property to allow the user to control this threshold. Tip: see Section “Shading in World Space” for a discussion of shader properties.

If you are familiar with scripting in Unity, you could try this idea: write a script for an object that takes a reference to another sphere object and assigns (using `renderer.sharedMaterial.SetMatrix()`) the inverse model matrix (`renderer.worldToLocalMatrix`) of that sphere object to a `mat4` uniform variable of the shader. In the shader, compute the position of the fragment in world coordinates and apply the inverse model matrix of the other sphere object to the fragment position. Now you have the position of the fragment in the local coordinate system of the other sphere object; here, it is easy to test whether the fragment is inside the sphere because in this coordinate system all default spheres are centered around the origin with radius 0.5. Discard the fragment if it is inside the other sphere object. The resulting script and shader can cut away points from the surface of any object with the help of a cutting sphere that can be manipulated interactively in the editor like any other sphere.
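The inside-sphere test at the heart of this idea can be sketched in plain Python (not GLSL). For simplicity, the cutting sphere's inverse model transform is assumed to be a pure translation here, and its position is a made-up value:

```python
# Transform a world-space point into the cutting sphere's local
# coordinates and test whether it lies inside the default sphere
# of radius 0.5 centered at the local origin.

SPHERE_WORLD_POS = (2.0, 0.0, 0.0)  # assumed position of the cutting sphere

def in_cutting_sphere(world_point):
    # inverse model transform of the sphere (translation only in this sketch)
    local = tuple(p - s for p, s in zip(world_point, SPHERE_WORLD_POS))
    # inside if squared distance to the local origin is below 0.5²
    return sum(c * c for c in local) < 0.5 * 0.5

print(in_cutting_sphere((2.1, 0.0, 0.0)))  # True: fragment would be discarded
print(in_cutting_sphere((0.0, 0.0, 0.0)))  # False: fragment is kept
```

In the real shader, `local` would be computed by multiplying the fragment's world position with the `mat4` uniform holding the sphere's inverse model matrix.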

### Distinguishing between Front and Back Faces[edit]

A special boolean variable `gl_FrontFacing` is available in the fragment shader that specifies whether we are looking at the front face of a triangle. Usually, the front faces face the outside of a mesh and the back faces the inside (just as the surface normal vector usually points to the outside). However, the actual distinction between front and back faces is based on the order of the vertices in a triangle: if the camera sees the vertices of a triangle in counter-clockwise order, it sees the front face; if it sees the vertices in clockwise order, it sees the back face.

Our fragment shader checks the variable `gl_FrontFacing` and assigns green to the output fragment color if `gl_FrontFacing` is `true` (i.e. the fragment is part of a front-facing triangle and thus facing the outside), and red if `gl_FrontFacing` is `false` (i.e. the fragment is part of a back-facing triangle and thus facing the inside). In fact, `gl_FrontFacing` allows you to render the two faces of a surface not only with different colors but with completely different styles.

Note that basing the definition of front and back faces on the order of vertices in a triangle can cause problems when vertices are mirrored, i.e. scaled with a negative factor. Unity tries to take care of these problems; thus, just specifying a negative scaling in the Transform component of the game object will usually not cause them. However, since Unity has no control over what we are doing in the vertex shader, we can still turn the inside out by multiplying one (or all three) of the coordinates by -1, e.g. by assigning `gl_Position` this way in the vertex shader:

gl_Position = gl_ModelViewProjectionMatrix * vec4(-gl_Vertex.x, gl_Vertex.y, gl_Vertex.z, 1.0);

This just multiplies the x coordinate by -1. For a sphere, you might think that nothing happens, but this actually turns front faces into back faces and vice versa; thus, now the inside is green and the outside is red. (By the way, this problem also affects the surface normal vector.) Thus, be careful with mirrors!
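The effect of mirroring on the winding can be checked numerically. This plain-Python sketch (not GLSL) computes the signed area of a 2D triangle before and after negating the x coordinates; the coordinates are made up for illustration:

```python
# Mirroring vertices (negating the x coordinate) reverses the apparent
# vertex order on screen, turning a front face into a back face.
# The sign of the signed area flips accordingly.

def signed_area(a, b, c):
    # positive = counter-clockwise (front face), negative = clockwise
    return (b[0] - a[0]) * (c[1] - a[1]) - (c[0] - a[0]) * (b[1] - a[1])

tri = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0)]
mirrored = [(-x, y) for x, y in tri]

print(signed_area(*tri) > 0)       # True: counter-clockwise, front face
print(signed_area(*mirrored) > 0)  # False: mirroring reversed the winding
```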

### Culling of Front or Back Faces[edit]

Finally, the shader (more specifically, the shader pass) includes the line `Cull Off`. This line has to come before `GLSLPROGRAM` because it is not GLSL but a command of Unity's ShaderLab, which turns off any triangle culling. This is necessary because by default back faces are culled away, as if the line `Cull Back` were specified. You can also specify the culling of front faces with `Cull Front`. The reason why culling of back-facing triangles is active by default is that the inside of objects is usually invisible; thus, back-face culling can save quite some performance by avoiding the rasterization of these triangles, as explained next. Of course, we were able to see the inside with our shader because we discard some fragments; thus, we had to deactivate back-face culling.

How does culling work? Triangles and vertices are processed as usual. However, after the viewport transformation of the vertices to screen coordinates (see Section “Vertex Transformations”), the graphics processor determines whether the vertices of a triangle appear in counter-clockwise order or in clockwise order on the screen. Based on this test, each triangle is considered front-facing or back-facing. If it is front-facing and culling is activated for front-facing triangles, it is discarded, i.e., its processing stops and it is not rasterized; analogously, a back-facing triangle is discarded if culling is activated for back-facing triangles. Otherwise, the triangle is processed as usual.

### Summary[edit]

Congratulations, you have worked through another tutorial. (If you have tried one of the assignments: good job! I didn't yet.) We have looked at:

- How to discard fragments.
- How to render front-facing and back-facing triangles in different colors.
- How to deactivate the default culling of back faces.
- How to activate the culling of front faces.

### Further Reading[edit]

If you still want to know more

- about the vertex transformations such as the model transformation from object to world coordinates or the viewport transformation to screen coordinates, you should read Section “Vertex Transformations”.
- about how to define shader properties, you should read Section “Shading in World Space”.
- about Unity's ShaderLab syntax for specifying culling, you should read Culling & Depth Testing.

# Transparency[edit]

This tutorial covers **blending** of fragments (i.e. compositing them) using GLSL shaders in Unity. It assumes that you are familiar with the concept of front and back faces as discussed in Section “Cutaways”.

More specifically, this tutorial is about **rendering transparent objects**, e.g. transparent glass, plastic, fabrics, etc. (More strictly speaking, these are actually semitransparent objects because they don't need to be perfectly transparent.) Transparent objects allow us to see through them; thus, their color “blends” with the color of whatever is behind them.

### Blending[edit]

As mentioned in Section “OpenGL ES 2.0 Pipeline”, the fragment shader computes an RGBA color (i.e. red, green, blue, and alpha components in `gl_FragColor`) for each fragment (unless the fragment is discarded). The fragments are then processed as discussed in Section “Per-Fragment Operations”. One of these operations is the blending stage, which combines the color of the fragment (as specified in `gl_FragColor`), called the “source color”, with the color of the corresponding pixel that is already in the framebuffer, called the “destination color” (because the “destination” of the resulting blended color is the framebuffer).

Blending is a fixed-function stage, i.e. you can configure it but not program it. It is configured by specifying a **blend equation**. You can think of the blend equation as this definition of the resulting RGBA color:

`vec4 result = SrcFactor * gl_FragColor + DstFactor * pixel_color;`

where `pixel_color` is the RGBA color that is currently in the framebuffer and `result` is the blended result, i.e. the output of the blending stage. `SrcFactor` and `DstFactor` are configurable RGBA colors (of type `vec4`) that are multiplied component-wise with the fragment color and the pixel color, respectively. The values of `SrcFactor` and `DstFactor` are specified in Unity's ShaderLab syntax with this line:

`Blend` {code for `SrcFactor`} {code for `DstFactor`}

The most common codes for the two factors are summarized in the following table (more codes are mentioned in Unity's ShaderLab reference about blending):

| Code | Resulting Factor (`SrcFactor` or `DstFactor`) |
|---|---|
| `One` | `vec4(1.0)` |
| `Zero` | `vec4(0.0)` |
| `SrcColor` | `gl_FragColor` |
| `SrcAlpha` | `vec4(gl_FragColor.a)` |
| `DstColor` | `pixel_color` |
| `DstAlpha` | `vec4(pixel_color.a)` |
| `OneMinusSrcColor` | `vec4(1.0) - gl_FragColor` |
| `OneMinusSrcAlpha` | `vec4(1.0 - gl_FragColor.a)` |
| `OneMinusDstColor` | `vec4(1.0) - pixel_color` |
| `OneMinusDstAlpha` | `vec4(1.0 - pixel_color.a)` |

As discussed in Section “Vector and Matrix Operations”, `vec4(1.0)` is just a short way of writing `vec4(1.0, 1.0, 1.0, 1.0)`. Also note that all components of all colors and factors in the blend equation are clamped to the range between 0 and 1.
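The blend equation with its factors and clamping is straightforward to model. Here is a plain-Python sketch (not GLSL) of the equation above, where the factor codes are modeled as functions and the example colors are made up for illustration:

```python
# result = SrcFactor * src + DstFactor * dst, with all components
# clamped to [0, 1].

def clamp01(x):
    return max(0.0, min(1.0, x))

def blend(src, dst, src_factor, dst_factor):
    fs, fd = src_factor(src, dst), dst_factor(src, dst)
    return tuple(clamp01(clamp01(s) * f1 + clamp01(d) * f2)
                 for s, d, f1, f2 in zip(src, dst, fs, fd))

# a few of the factor codes from the table above
def one(s, d):                 return (1.0, 1.0, 1.0, 1.0)
def src_alpha(s, d):           return (s[3],) * 4
def one_minus_src_alpha(s, d): return (1.0 - s[3],) * 4

frag  = (0.0, 1.0, 0.0, 0.3)  # semitransparent green fragment color
pixel = (1.0, 1.0, 1.0, 1.0)  # white pixel already in the framebuffer

# Blend SrcAlpha OneMinusSrcAlpha (alpha blending)
print(blend(frag, pixel, src_alpha, one_minus_src_alpha))
# Blend One One (additive blending; note the clamping at 1)
print(blend(frag, pixel, one, one))
```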

### Alpha Blending[edit]

One specific example for a blend equation is called “alpha blending”. In Unity, it is specified this way:

`Blend SrcAlpha OneMinusSrcAlpha`

which corresponds to:

`vec4 result = vec4(gl_FragColor.a) * gl_FragColor + vec4(1.0 - gl_FragColor.a) * pixel_color;`

This uses the alpha component of `gl_FragColor` as an **opacity**: the more opaque the fragment color is, the larger its opacity and therefore its alpha component, and thus the more of the fragment color and the less of the pixel color in the framebuffer is mixed into the result. A perfectly opaque fragment color (i.e. with an alpha component of 1) completely replaces the pixel color.

This blend equation is sometimes referred to as an “over” operation, i.e. “`gl_FragColor` over `pixel_color`”, since it corresponds to putting a layer of the fragment color with a specific opacity on top of the pixel color. (Think of a layer of colored glass or colored semitransparent plastic on top of something of another color.)

Due to the popularity of alpha blending, the alpha component of a color is often called opacity even if alpha blending is not employed. Moreover, note that in computer graphics a common formal definition of **transparency** is **1 − opacity**.

### Premultiplied Alpha Blending[edit]

There is an important variant of alpha blending: sometimes the fragment color has its alpha component already premultiplied to the color components. (You might think of it as a price that has VAT already included.) In this case, alpha should not be multiplied again (VAT should not be added again) and the correct blending is:

`Blend One OneMinusSrcAlpha`

which corresponds to:

`vec4 result = vec4(1.0) * gl_FragColor + vec4(1.0 - gl_FragColor.a) * pixel_color;`
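The equivalence of the two variants can be checked numerically. This plain-Python sketch (illustrative names, not shader code) verifies that premultiplying the color by its alpha and then blending with the factors `One` and `OneMinusSrcAlpha` produces the same RGB result as standard alpha blending:

```python
def over_straight(src, dst):
    """Alpha blending: Blend SrcAlpha OneMinusSrcAlpha."""
    a = src[3]
    return tuple(a * s + (1.0 - a) * d for s, d in zip(src, dst))

def over_premultiplied(src, dst):
    """Premultiplied alpha blending: Blend One OneMinusSrcAlpha."""
    a = src[3]
    return tuple(s + (1.0 - a) * d for s, d in zip(src, dst))

def premultiply(color):
    """Multiply the RGB components by alpha ("VAT included")."""
    r, g, b, a = color
    return (a * r, a * g, a * b, a)

src = (0.0, 1.0, 0.0, 0.3)   # semitransparent green fragment
dst = (1.0, 0.0, 0.0, 1.0)   # opaque red framebuffer pixel

straight = over_straight(src, dst)
premult  = over_premultiplied(premultiply(src), dst)
# the RGB components of the two results agree:
assert all(abs(a - b) < 1e-9 for a, b in zip(straight[:3], premult[:3]))
```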

### Additive Blending[edit]

Another example for a blending equation is:

`Blend One One`

This corresponds to:

`vec4 result = vec4(1.0) * gl_FragColor + vec4(1.0) * pixel_color;`

which just adds the fragment color to the color in the framebuffer. Note that the alpha component is not used at all; nonetheless, this blending equation is very useful for many kinds of transparent effects; for example, it is often used for particle systems when they represent fire or something else that is transparent and emits light. Additive blending is discussed in more detail in Section “Order-Independent Transparency”.
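Because addition is commutative, the result of `Blend One One` does not depend on the order in which fragments are blended. A plain-Python sketch (illustrative names only, with clamping to 1) of this property:

```python
def additive_blend(src, dst):
    """Blend One One: add the fragment color to the framebuffer color,
    clamping each component to at most 1."""
    return tuple(min(1.0, s + d) for s, d in zip(src, dst))

flame1 = (0.5, 0.1, 0.0, 1.0)   # two light-emitting fragments
flame2 = (0.1, 0.4, 0.2, 1.0)
black  = (0.0, 0.0, 0.0, 0.0)

# addition is commutative, so the blending order doesn't matter:
one_order   = additive_blend(flame2, additive_blend(flame1, black))
other_order = additive_blend(flame1, additive_blend(flame2, black))
assert one_order == other_order
```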

More examples of blend equations are given in Unity's ShaderLab reference about blending.

### Shader Code[edit]

Here is a simple shader which uses alpha blending to render a green color with opacity 0.3:

```glsl
Shader "GLSL shader using blending" {
   SubShader {
      Tags { "Queue" = "Transparent" } 
         // draw after all opaque geometry has been drawn
      Pass {
         ZWrite Off // don't write to depth buffer 
            // in order not to occlude other objects
         Blend SrcAlpha OneMinusSrcAlpha // use alpha blending
 
         GLSLPROGRAM
 
         #ifdef VERTEX
 
         void main()
         {
            gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;
         }
 
         #endif
 
         #ifdef FRAGMENT
 
         void main()
         {
            gl_FragColor = vec4(0.0, 1.0, 0.0, 0.3); 
               // the fourth component (alpha) is important: 
               // this is semitransparent green
         }
 
         #endif
 
         ENDGLSL
      }
   }
}
```

Apart from the blend equation, which has been discussed above, there are only two lines that need more explanation: `Tags { "Queue" = "Transparent" }` and `ZWrite Off`.

`ZWrite Off` deactivates writing to the depth buffer. As explained in Section “Per-Fragment Operations”, the depth buffer keeps the depth of the nearest fragment and discards any fragments that have a larger depth. In the case of a transparent fragment, however, this is not what we want, since we can (at least potentially) see through a transparent fragment. Thus, transparent fragments should not occlude other fragments, and therefore writing to the depth buffer is deactivated. See also Unity's ShaderLab reference about culling and depth testing.

The line `Tags { "Queue" = "Transparent" }` specifies that the meshes using this subshader are rendered after all the opaque meshes have been rendered. The reason is partly that we deactivate writing to the depth buffer: one consequence is that transparent fragments can be occluded by opaque fragments even though the opaque fragments are farther away. In order to fix this problem, we first draw all opaque meshes (in Unity's “opaque queue”) before drawing all transparent meshes (in Unity's “transparent queue”). Whether a mesh is considered opaque or transparent depends on the tags of its subshader as specified with the line `Tags { "Queue" = "Transparent" }`. More details about subshader tags are described in Unity's ShaderLab reference about subshader tags.

It should be mentioned that this strategy of rendering transparent meshes with deactivated writing to the depth buffer does not always solve all problems. It works perfectly if the order in which fragments are blended does not matter; for example, if the fragment color is just added to the pixel color in the framebuffer, the order in which fragments are blended is not important; see Section “Order-Independent Transparency”. However, for other blending equations, e.g. alpha blending, the result will be different depending on the order in which fragments are blended. (If you look through almost opaque green glass at almost opaque red glass you will mainly see green, while you will mainly see red if you look through almost opaque red glass at almost opaque green glass. Similarly, blending almost opaque green color over almost opaque red color will be different from blending almost opaque red color over almost opaque green color.) In order to avoid artifacts, it is therefore advisable to use additive blending or (premultiplied) alpha blending with small opacities (in which case the destination factor `DstFactor` is close to 1 and therefore alpha blending is close to additive blending).
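The order dependence of alpha blending is easy to demonstrate numerically. In this plain-Python sketch (illustrative names only), blending almost opaque green over almost opaque red gives a mainly green result, while the reverse order gives a mainly red result:

```python
def over(src, dst):
    """Alpha blending: Blend SrcAlpha OneMinusSrcAlpha, component-wise."""
    a = src[3]
    return tuple(a * s + (1.0 - a) * d for s, d in zip(src, dst))

background = (0.0, 0.0, 1.0, 1.0)   # opaque blue framebuffer
green = (0.0, 1.0, 0.0, 0.9)        # almost opaque green "glass"
red   = (1.0, 0.0, 0.0, 0.9)        # almost opaque red "glass"

green_over_red = over(green, over(red, background))
red_over_green = over(red, over(green, background))

# the nearer layer dominates, so the two orders differ clearly:
assert green_over_red[1] > green_over_red[0]   # mainly green
assert red_over_green[0] > red_over_green[1]   # mainly red
```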

### Including Back Faces[edit]

The previous shader works well with other objects, but it actually doesn't render the “inside” of the object. However, since we can see through the outside of a transparent object, we should also render the inside. As discussed in Section “Cutaways”, the inside can be rendered by deactivating culling with `Cull Off`. However, if we just deactivate culling, we might get in trouble: as discussed above, it often matters in which order transparent fragments are rendered, but without any culling, overlapping triangles from the inside and the outside might be rendered in a random order, which can lead to annoying rendering artifacts. Thus, we would like to make sure that the inside (which is usually farther away) is rendered first, before the outside is rendered. In Unity's ShaderLab this is achieved by specifying two passes, which are executed for the same mesh in the order in which they are defined:

```glsl
Shader "GLSL shader using blending (including back faces)" {
   SubShader {
      Tags { "Queue" = "Transparent" } 
         // draw after all opaque geometry has been drawn
      Pass {
         Cull Front // first pass renders only back faces 
            // (the "inside")
         ZWrite Off // don't write to depth buffer 
            // in order not to occlude other objects
         Blend SrcAlpha OneMinusSrcAlpha // use alpha blending
 
         GLSLPROGRAM
 
         #ifdef VERTEX
 
         void main()
         {
            gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;
         }
 
         #endif
 
         #ifdef FRAGMENT
 
         void main()
         {
            gl_FragColor = vec4(1.0, 0.0, 0.0, 0.3); 
               // the fourth component (alpha) is important: 
               // this is semitransparent red
         }
 
         #endif
 
         ENDGLSL
      }
      Pass {
         Cull Back // second pass renders only front faces 
            // (the "outside")
         ZWrite Off // don't write to depth buffer 
            // in order not to occlude other objects
         Blend SrcAlpha OneMinusSrcAlpha 
            // standard blend equation "source over destination"
 
         GLSLPROGRAM
 
         #ifdef VERTEX
 
         void main()
         {
            gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;
         }
 
         #endif
 
         #ifdef FRAGMENT
 
         void main()
         {
            gl_FragColor = vec4(0.0, 1.0, 0.0, 0.3); 
               // the fourth component (alpha) is important: 
               // this is semitransparent green
         }
 
         #endif
 
         ENDGLSL
      }
   }
}
```

In this shader, the first pass uses front-face culling (with `Cull Front`) to render the back faces (the inside) first. After that, the second pass uses back-face culling (with `Cull Back`) to render the front faces (the outside). This works perfectly for convex meshes (closed meshes without dents, e.g. spheres or cubes) and is often a good approximation for other meshes.

### Summary[edit]

Congratulations, you made it through this tutorial! One interesting thing about rendering transparent objects is that it isn't just about blending but also requires knowledge about culling and the depth buffer. Specifically, we have looked at:

- What blending is and how it is specified in Unity.
- How a scene with transparent and opaque objects is rendered and how objects are classified as transparent or opaque in Unity.
- How to render the inside and outside of a transparent object, in particular how to specify two passes in Unity.

### Further Reading[edit]

If you still want to know more

- about the OpenGL pipeline, you should read Section “OpenGL ES 2.0 Pipeline”.
- about per-fragment operations in the OpenGL pipeline (e.g. blending and the depth test), you should read Section “Per-Fragment Operations”.
- about front-face and back-face culling, you should read Section “Cutaways”.
- about how to specify culling and the depth buffer functionality in Unity, you should read Unity's ShaderLab reference about culling and depth testing.
- about how to specify blending in Unity, you should read Unity's ShaderLab reference about blending.
- about the render queues in Unity, you should read Unity's ShaderLab reference about subshader tags.

# Order-Independent Transparency[edit]

This tutorial covers **order-independent blending**.

It continues the discussion in Section “Transparency” and solves some problems of standard transparency. If you haven't read that tutorial, you should read it first.

### Order-Independent Blending[edit]

As noted in Section “Transparency”, the result of blending often (in particular for standard alpha blending) depends on the order in which triangles are rendered and therefore results in rendering artifacts if the triangles are not sorted from back to front (which they usually aren't). The term “order-independent transparency” describes various techniques to avoid this problem. One of these techniques is order-independent blending, i.e. the use of a blend equation that does not depend on the order in which triangles are rasterized. There are two basic possibilities: additive blending and multiplicative blending.

#### Additive Blending[edit]

The standard example for additive blending are double exposures as in the images in this section: colors are added such that it is impossible (or at least very hard) to say in which order the photos were taken. Additive blending can be characterized in terms of the blend equation introduced in Section “Transparency”:

`vec4 result = SrcFactor * gl_FragColor + DstFactor * pixel_color;`

where `SrcFactor` and `DstFactor` are determined by a line in Unity's ShaderLab syntax:

`Blend` {code for `SrcFactor`} {code for `DstFactor`}

For additive blending, the code for `DstFactor` has to be `One` and the code for `SrcFactor` must not depend on the pixel color in the framebuffer; i.e., it can be `One`, `SrcColor`, `SrcAlpha`, `OneMinusSrcColor`, or `OneMinusSrcAlpha`.

An example is:

```glsl
Shader "GLSL shader using additive blending" {
   SubShader {
      Tags { "Queue" = "Transparent" } 
         // draw after all opaque geometry has been drawn
      Pass {
         Cull Off // draw front and back faces
         ZWrite Off // don't write to depth buffer 
            // in order not to occlude other objects
         Blend SrcAlpha One // additive blending
 
         GLSLPROGRAM
 
         #ifdef VERTEX
 
         void main()
         {
            gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;
         }
 
         #endif
 
         #ifdef FRAGMENT
 
         void main()
         {
            gl_FragColor = vec4(1.0, 0.0, 0.0, 0.3);
         }
 
         #endif
 
         ENDGLSL
      }
   }
}
```

#### Multiplicative Blending[edit]

An example for multiplicative blending in photography is the use of multiple uniform grey filters: the order in which the filters are put onto a camera doesn't matter for the resulting attenuation of the image. In terms of the rasterization of triangles, the image corresponds to the contents of the framebuffer before the triangles are rasterized, while the filters correspond to the triangles.

When specifying multiplicative blending in Unity with the line

`Blend` {code for `SrcFactor`} {code for `DstFactor`}

the code for `SrcFactor` has to be `Zero` and the code for `DstFactor` must depend on the fragment color; i.e., it can be `SrcColor`, `SrcAlpha`, `OneMinusSrcColor`, or `OneMinusSrcAlpha`. A typical example for attenuating the background with the opacity specified by the alpha component of fragments would use `OneMinusSrcAlpha` as the code for `DstFactor`:

```glsl
Shader "GLSL shader using multiplicative blending" {
   SubShader {
      Tags { "Queue" = "Transparent" } 
         // draw after all opaque geometry has been drawn
      Pass {
         Cull Off // draw front and back faces
         ZWrite Off // don't write to depth buffer 
            // in order not to occlude other objects
         Blend Zero OneMinusSrcAlpha // multiplicative blending 
            // for attenuation by the fragment's alpha
 
         GLSLPROGRAM
 
         #ifdef VERTEX
 
         void main()
         {
            gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;
         }
 
         #endif
 
         #ifdef FRAGMENT
 
         void main()
         {
            gl_FragColor = vec4(1.0, 0.0, 0.0, 0.3); 
               // only A (alpha) is used
         }
 
         #endif
 
         ENDGLSL
      }
   }
}
```

### Complete Shader Code[edit]

Finally, it makes good sense to combine multiplicative blending for the attenuation of the background and additive blending for the addition of colors of the triangles in one shader by combining the two passes that were presented above. This can be considered an approximation to alpha blending for **small opacities**, i.e. **small values of alpha**, if one ignores attenuation of colors of the triangle mesh by itself.

```glsl
Shader "GLSL shader using order-independent blending" {
   SubShader {
      Tags { "Queue" = "Transparent" } 
         // draw after all opaque geometry has been drawn
      Pass {
         Cull Off // draw front and back faces
         ZWrite Off // don't write to depth buffer 
            // in order not to occlude other objects
         Blend Zero OneMinusSrcAlpha // multiplicative blending 
            // for attenuation by the fragment's alpha
 
         GLSLPROGRAM
 
         #ifdef VERTEX
 
         void main()
         {
            gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;
         }
 
         #endif
 
         #ifdef FRAGMENT
 
         void main()
         {
            gl_FragColor = vec4(1.0, 0.0, 0.0, 0.3); 
               // only A (alpha) is used
         }
 
         #endif
 
         ENDGLSL
      }
      Pass {
         Cull Off // draw front and back faces
         ZWrite Off // don't write to depth buffer 
            // in order not to occlude other objects
         Blend SrcAlpha One // additive blending to add colors
 
         GLSLPROGRAM
 
         #ifdef VERTEX
 
         void main()
         {
            gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;
         }
 
         #endif
 
         #ifdef FRAGMENT
 
         void main()
         {
            gl_FragColor = vec4(1.0, 0.0, 0.0, 0.3);
         }
 
         #endif
 
         ENDGLSL
      }
   }
}
```

Note that the order of the two passes is important: first the background is attenuated and then colors are added.
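The sense in which the two passes approximate alpha blending for small opacities can be sketched numerically. In this plain-Python illustration (not shader code; all names are illustrative), `two_pass` models one multiplicative pass followed by one additive pass per fragment; for small alpha values the two rendering orders nearly agree, and both stay close to back-to-front alpha blending:

```python
def alpha_blend(src, dst):
    """Standard alpha blending (Blend SrcAlpha OneMinusSrcAlpha)."""
    a = src[3]
    return tuple(a * s + (1.0 - a) * d for s, d in zip(src, dst))

def two_pass(src, dst):
    """Pass 1: Blend Zero OneMinusSrcAlpha (attenuate the background),
    pass 2: Blend SrcAlpha One (add the fragment color, clamped)."""
    a = src[3]
    attenuated = tuple((1.0 - a) * d for d in dst)
    return tuple(min(1.0, a * s + d) for s, d in zip(src, attenuated))

bg = (0.2, 0.2, 0.8, 1.0)            # framebuffer contents
f1 = (1.0, 0.0, 0.0, 0.1)            # two fragments with small alpha
f2 = (0.0, 1.0, 0.0, 0.1)

r12 = two_pass(f2, two_pass(f1, bg)) # render f1 first, then f2
r21 = two_pass(f1, two_pass(f2, bg)) # render f2 first, then f1
ab  = alpha_blend(f2, alpha_blend(f1, bg))  # true back-to-front blending

# for alpha = 0.1, the two orders differ by at most 0.01 per component,
# and both are close to the back-to-front alpha-blended reference:
assert max(abs(a - b) for a, b in zip(r12, r21)) < 0.02
assert max(abs(a - b) for a, b in zip(r12, ab)) < 0.02
```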

### Summary[edit]

Congratulations, you have reached the end of this tutorial. We have looked at:

- What order-independent transparency and order-independent blending is.
- What the two most important kinds of order-independent blending are (additive and multiplicative).
- How to implement additive and multiplicative blending.
- How to combine two passes for additive and multiplicative blending for an order-independent approximation to alpha blending.

### Further Reading[edit]

If you still want to know more

- about the shader code, you should read Section “Transparency”.
- about another technique for order-independent transparency, namely depth peeling, you could read a technical report by Cass Everitt: “Interactive Order-Independent Transparency”, which is available online.

# Silhouette Enhancement[edit]

This tutorial covers the **transformation of surface normal vectors**. It assumes that you are familiar with alpha blending as discussed in Section “Transparency” and with shader properties as discussed in Section “Shading in World Space”.

The objective of this tutorial is to achieve an effect that is visible in the photo to the left: the silhouettes of semitransparent objects tend to be more opaque than the rest of the object. This adds to the impression of a three-dimensional shape even without lighting. It turns out that transformed normals are crucial to obtain this effect.

### Silhouettes of Smooth Surfaces[edit]

In the case of smooth surfaces, points on the surface at silhouettes are characterized by normal vectors that are parallel to the viewing plane and therefore orthogonal to the direction to the viewer. In the figure to the left, the blue normal vectors at the silhouette at the top of the figure are parallel to the viewing plane while the other normal vectors point more in the direction to the viewer (or camera). By calculating the direction to the viewer and the normal vector and testing whether they are (almost) orthogonal to each other, we can therefore test whether a point is (almost) on the silhouette.

More specifically, if **V** is the normalized (i.e. of length 1) direction to the viewer and **N** is the normalized surface normal vector, then the two vectors are orthogonal if the dot product is 0: **V**·**N** = 0. In practice, this will rarely be the case. However, if the dot product **V**·**N** is close to 0, we can assume that the point is close to a silhouette.

### Increasing the Opacity at Silhouettes[edit]

For our effect, we should therefore increase the opacity if the dot product **V**·**N** is close to 0. There are various ways to increase the opacity for small dot products between the direction to the viewer and the normal vector. Here is one of them (which actually has a physical model behind it, which is described in Section 5.1 of this publication) to compute the increased opacity α′ from the regular opacity α of the material:

α′ = min(1, α / |**V**·**N**|)

It always makes sense to check the extreme cases of an equation like this. Consider the case of a point close to the silhouette: **V**·**N** ≈ 0. In this case, the regular opacity α will be divided by a small, positive number |**V**·**N**|. (Note that GLSL guarantees to handle the case of division by zero gracefully; thus, we don't have to worry about it.) Therefore, whatever α is, the ratio of α and a small positive number will be large. The min function will take care that the resulting opacity α′ is never larger than 1.

On the other hand, for points far away from the silhouette we have **V**·**N** ≈ 1. In this case, α' ≈ min(1, α) ≈ α; i.e., the opacity of those points will not change much. This is exactly what we want. Thus, we have just checked that the equation is at least plausible.
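These two extreme cases can be checked with a plain-Python sketch of the equation α′ = min(1, α / |**V**·**N**|) (the function name is just illustrative; note that plain Python, unlike GLSL, would raise an error for **V**·**N** = 0):

```python
def silhouette_opacity(alpha, v_dot_n):
    """alpha' = min(1, alpha / |V.N|): more opaque near silhouettes."""
    return min(1.0, alpha / abs(v_dot_n))

# near the silhouette (V.N close to 0) the opacity saturates at 1:
assert silhouette_opacity(0.3, 0.01) == 1.0
# far from the silhouette (V.N close to 1) it stays close to alpha:
assert abs(silhouette_opacity(0.3, 1.0) - 0.3) < 1e-9
```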

### Implementing an Equation in a Shader[edit]

In order to implement an equation like the one for in a shader, the first question should be: Should it be implemented in the vertex shader or in the fragment shader? In some cases, the answer is clear because the implementation requires texture mapping, which is often only available in the fragment shader. In many cases, however, there is no general answer. Implementations in vertex shaders tend to be faster (because there are usually fewer vertices than fragments) but of lower image quality (because normal vectors and other vertex attributes can change abruptly between vertices). Thus, if you are most concerned about performance, an implementation in a vertex shader is probably a better choice. On the other hand, if you are most concerned about image quality, an implementation in a pixel shader might be a better choice. The same trade-off exists between per-vertex lighting (i.e. Gouraud shading, which is discussed in Section “Specular Highlights”) and per-fragment lighting (i.e. Phong shading, which is discussed in Section “Smooth Specular Highlights”).

The next question is: in which coordinate system should the equation be implemented? (See Section “Vertex Transformations” for a description of the standard coordinate systems.) Again, there is no general answer. However, an implementation in world coordinates is often a good choice in Unity because many uniform variables are specified in world coordinates. (In other environments implementations in view coordinates are very common.)

The final question before implementing an equation is: where do we get the parameters of the equation from? The regular opacity α is specified (within an RGBA color) by a shader property (see Section “Shading in World Space”). The normal vector `gl_Normal` is a standard vertex attribute (see Section “Debugging of Shaders”). The direction to the viewer can be computed in the vertex shader as the vector from the vertex position in world space to the camera position in world space `_WorldSpaceCameraPos`, which is provided by Unity.

Thus, we only have to transform the vertex position and the normal vector into world space before implementing the equation. The transformation matrix `_Object2World` from object space to world space and its inverse `_World2Object` are provided by Unity as discussed in Section “Shading in World Space”. The application of transformation matrices to points and normal vectors is discussed in detail in Section “Applying Matrix Transformations”. The basic result is that points and directions are transformed just by multiplying them with the transformation matrix, e.g.:

```glsl
uniform mat4 _Object2World;
...
vec4 positionInWorldSpace = _Object2World * gl_Vertex;
vec3 viewDirection = 
   _WorldSpaceCameraPos - vec3(positionInWorldSpace);
```

On the other hand, **normal vectors are transformed by multiplying them with the transposed inverse transformation matrix**. Since Unity provides us with the inverse transformation matrix (which is `_World2Object * unity_Scale.w` apart from the bottom-right element), a better alternative is to multiply the normal vector **from the left** to the inverse matrix, which is equivalent to multiplying it from the right to the transposed inverse matrix as discussed in Section “Applying Matrix Transformations”:

```glsl
uniform mat4 _World2Object; // the inverse of _Object2World 
   // (after multiplication with unity_Scale.w)
uniform vec4 unity_Scale;
...
vec3 normalInWorldSpace = vec3(vec4(gl_Normal, 0.0) 
   * _World2Object * unity_Scale.w); 
   // corresponds to a multiplication of the 
   // transposed inverse of _Object2World with gl_Normal
```

Note that the incorrect bottom-right matrix element is no problem because it is always multiplied with 0. Moreover, the multiplication with `unity_Scale.w` is unnecessary if the scaling doesn't matter; for example, if we normalize all transformed vectors.
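Why the transposed inverse matrix is needed can be illustrated with a non-uniform scale, for which the inverse transpose is easy to write down. In this plain-Python sketch (a diagonal 3×3 matrix represented by its diagonal; all names are illustrative), transforming a normal vector with the matrix itself destroys its orthogonality to a transformed tangent vector, while the transposed inverse preserves it:

```python
scale = (2.0, 1.0, 1.0)   # diagonal model matrix: non-uniform scale in x

def apply(diag, v):
    """Multiply a diagonal matrix (given by its diagonal) with a vector."""
    return tuple(m * c for m, c in zip(diag, v))

def inverse_transpose(diag):
    """A diagonal matrix is its own transpose; the inverse just
    inverts the diagonal entries."""
    return tuple(1.0 / m for m in diag)

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

tangent = (1.0, 1.0, 0.0)   # a direction within the surface
normal  = (1.0, -1.0, 0.0)  # orthogonal to the tangent
assert dot(tangent, normal) == 0.0

new_tangent  = apply(scale, tangent)   # directions use the matrix itself
naive_normal = apply(scale, normal)    # transforming the normal this way...
good_normal  = apply(inverse_transpose(scale), normal)

assert dot(new_tangent, naive_normal) != 0.0  # ...loses orthogonality
assert dot(new_tangent, good_normal) == 0.0   # inverse transpose keeps it
```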

Now we have all the pieces that we need to write the shader.

### Shader Code[edit]

```glsl
Shader "GLSL silhouette enhancement" {
   Properties {
      _Color ("Color", Color) = (1, 1, 1, 0.5) 
         // user-specified RGBA color including opacity
   }
   SubShader {
      Tags { "Queue" = "Transparent" } 
         // draw after all opaque geometry has been drawn
      Pass {
         ZWrite Off // don't occlude other objects
         Blend SrcAlpha OneMinusSrcAlpha // standard alpha blending
 
         GLSLPROGRAM
 
         uniform vec4 _Color; // define shader property for shaders
 
         // The following built-in uniforms are also defined in 
         // "UnityCG.glslinc", which could be #included
         uniform vec3 _WorldSpaceCameraPos; 
            // camera position in world space
         uniform mat4 _Object2World; // model matrix
         uniform mat4 _World2Object; // inverse model matrix 
            // (apart from the factor unity_Scale.w)
 
         varying vec3 varyingNormalDirection; 
            // normalized surface normal vector
         varying vec3 varyingViewDirection; 
            // normalized view direction
 
         #ifdef VERTEX
 
         void main()
         {
            mat4 modelMatrix = _Object2World;
            mat4 modelMatrixInverse = _World2Object; 
               // multiplication with unity_Scale.w is unnecessary 
               // because we normalize transformed vectors
 
            varyingNormalDirection = normalize(
               vec3(vec4(gl_Normal, 0.0) * modelMatrixInverse));
            varyingViewDirection = normalize(_WorldSpaceCameraPos 
               - vec3(modelMatrix * gl_Vertex));
 
            gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;
         }
 
         #endif
 
         #ifdef FRAGMENT
 
         void main()
         {
            vec3 normalDirection = normalize(varyingNormalDirection);
            vec3 viewDirection = normalize(varyingViewDirection);
 
            float newOpacity = min(1.0, _Color.a 
               / abs(dot(viewDirection, normalDirection)));
            gl_FragColor = vec4(vec3(_Color), newOpacity);
         }
 
         #endif
 
         ENDGLSL
      }
   }
}
```

The assignment to `newOpacity` is an almost literal translation of the equation α′ = min(1, α / |**V**·**N**|).

Note that we normalize the varyings `varyingNormalDirection` and `varyingViewDirection` in the vertex shader (because we want to interpolate between directions without putting more or less weight on any of them) and at the beginning of the fragment shader (because the interpolation can distort our normalization to a certain degree). However, in many cases the normalization of `varyingNormalDirection` in the vertex shader is not necessary. Similarly, the normalization of `varyingViewDirection` in the fragment shader is in most cases unnecessary.

### More Artistic Control[edit]

While the described silhouette enhancement is based on a physical model, it lacks artistic control; i.e., a CG artist cannot easily create a thinner or thicker silhouette than the physical model suggests. To allow for more artistic control, you could introduce another (positive) floating-point number property and take the dot product |**V**·**N**| to the power of this number (using the built-in GLSL function `pow(float x, float y)`) before using it in the equation above. This will allow CG artists to create thinner or thicker silhouettes independently of the opacity of the base color.
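A plain-Python sketch of this idea (the property name `sharpness` is purely hypothetical and not part of the shader above):

```python
def silhouette_opacity_pow(alpha, v_dot_n, sharpness):
    """alpha' = min(1, alpha / |V.N|**sharpness); 'sharpness' is a
    hypothetical extra shader property for artistic control."""
    return min(1.0, alpha / abs(v_dot_n) ** sharpness)

# for |V.N| < 1, a larger exponent yields a larger opacity, i.e. the
# enhanced silhouette reaches further into the interior of the object:
assert silhouette_opacity_pow(0.3, 0.5, 2.0) > silhouette_opacity_pow(0.3, 0.5, 1.0)
```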

### Summary[edit]

Congratulations, you have finished this tutorial. We have discussed:

- How to find silhouettes of smooth surfaces (using the dot product of the normal vector and the view direction).
- How to enhance the opacity at those silhouettes.
- How to implement equations in shaders.
- How to transform points and normal vectors from object space to world space (using the transposed inverse model matrix for normal vectors).
- How to compute the viewing direction (as the difference from the camera position to the vertex position).
- How to interpolate normalized directions (i.e. normalize twice: in the vertex shader and the fragment shader).
- How to provide more artistic control over the thickness of silhouettes.

### Further Reading[edit]

If you still want to know more

- about object space and world space, you should read the description in Section “Vertex Transformations”.
- about how to apply transformation matrices to points, directions and normal vectors, you should read Section “Applying Matrix Transformations”.
- about the basics of rendering transparent objects, you should read Section “Transparency”.
- about uniform variables provided by Unity and shader properties, you should read Section “Shading in World Space”.
- about the mathematics of silhouette enhancement, you could read Section 5.1 of the paper “Scale-Invariant Volume Rendering” by Martin Kraus, published at IEEE Visualization 2005, which is available online.

# Diffuse Reflection[edit]

This tutorial covers **per-vertex diffuse reflection**.

It's the first in a series of tutorials about basic lighting in Unity. In this tutorial, we start with diffuse reflection from a single directional light source and then include point light sources and multiple light sources (using multiple passes). Further tutorials cover extensions of this, in particular specular reflection, per-pixel lighting, and two-sided lighting.

### Diffuse Reflection[edit]

The moon exhibits almost exclusively diffuse reflection (also called Lambertian reflection), i.e. light is reflected into all directions without specular highlights. Other examples of such materials are chalk and matte paper; in fact, any surface that appears dull and matte.

In the case of perfect diffuse reflection, the intensity of the observed reflected light depends on the cosine of the angle between the surface normal vector and the ray of the incoming light. As illustrated in the figure to the left, it is common to consider normalized vectors starting in the point of a surface, where the lighting should be computed: the normalized surface normal vector **N** is orthogonal to the surface and the normalized light direction **L** points to the light source.

For the observed diffuse reflected light, we need the cosine of the angle between the normalized surface normal vector **N** and the normalized direction to the light source **L**, which is the dot product **N**·**L**, because the dot product **a**·**b** of any two vectors **a** and **b** is:

**a**·**b** = |**a**| |**b**| cos∠(**a**, **b**).

In the case of normalized vectors, the lengths |**a**| and |**b**| are both 1; thus, the dot product **N**·**L** is just the cosine of the angle between the two vectors.

If the dot product **N**·**L** is negative, the light source is on the “wrong” side of the surface and we should set the reflection to 0. This can be achieved by using max(0, **N**·**L**), which makes sure that the value of the dot product is clamped to 0 for negative dot products. Furthermore, the reflected light depends on the intensity I_incoming of the incoming light and a material constant k_diffuse for the diffuse reflection: for a black surface, the material constant is 0, for a white surface it is 1. The equation for the diffuse reflected intensity I_diffuse is then:

I_diffuse = I_incoming k_diffuse max(0, **N**·**L**)

For colored light, this equation applies to each color component (e.g. red, green, and blue). Thus, if the variables I_diffuse, I_incoming, and k_diffuse denote color vectors and the multiplications are performed component-wise (which they are for vectors in GLSL), this equation also applies to colored light. This is what we actually use in the shader code.
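The component-wise evaluation can be sketched outside of GLSL as well. This plain-Python illustration (names are illustrative only) computes the diffuse term for RGB color vectors and normalized direction vectors:

```python
def diffuse_reflection(light_color, k_diffuse, n, l):
    """I_diffuse = I_incoming * k_diffuse * max(0, N.L), applied
    component-wise to the RGB color vectors; n and l are normalized."""
    n_dot_l = max(0.0, sum(a * b for a, b in zip(n, l)))
    return tuple(i * k * n_dot_l for i, k in zip(light_color, k_diffuse))

white_light = (1.0, 1.0, 1.0)
green_paint = (0.2, 0.8, 0.2)
up = (0.0, 1.0, 0.0)

# light from directly above a horizontal surface: full material color
assert diffuse_reflection(white_light, green_paint, up, up) == green_paint
# light from below (the "wrong" side): clamped to black
assert diffuse_reflection(white_light, green_paint, up, (0.0, -1.0, 0.0)) == (0.0, 0.0, 0.0)
```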

### Shader Code for One Directional Light Source[edit]

If we have only one directional light source, the shader code for implementing the equation for is relatively small. In order to implement the equation, we follow the questions about implementing equations, which were discussed in Section “Silhouette Enhancement”:

- Should the equation be implemented in the vertex shader or the fragment shader? We try the vertex shader here. In Section “Smooth Specular Highlights”, we will look at an implementation in the fragment shader.
- In which coordinate system should the equation be implemented? We try world space by default in Unity. (Which turns out to be a good choice here because Unity provides the light direction in world space.)
- Where do we get the parameters from? The answer to this is a bit longer:

We use a shader property (see Section “Shading in World Space”) to let the user specify the diffuse material color k_diffuse. We can get the direction to the light source in world space from the Unity-specific uniform `_WorldSpaceLightPos0` and the light color I_incoming from the Unity-specific uniform `_LightColor0`. As mentioned in Section “Shading in World Space”, we have to tag the shader pass with `Tags {"LightMode" = "ForwardBase"}` to make sure that these uniforms have the correct values. (Below we will discuss what this tag actually means.) We get the surface normal vector in object coordinates from the attribute `gl_Normal`. Since we implement the equation in world space, we have to convert the surface normal vector from object space to world space as discussed in Section “Silhouette Enhancement”.

The shader code then looks like this:

```glsl
Shader "GLSL per-vertex diffuse lighting" {
   Properties {
      _Color ("Diffuse Material Color", Color) = (1,1,1,1) 
   }
   SubShader {
      Pass {
         Tags { "LightMode" = "ForwardBase" } 
            // make sure that all uniforms are correctly set
 
         GLSLPROGRAM
 
         uniform vec4 _Color; // shader property specified by users
 
         // The following built-in uniforms (except _LightColor0) 
         // are also defined in "UnityCG.glslinc", 
         // i.e. one could #include "UnityCG.glslinc"
         uniform mat4 _Object2World; // model matrix
         uniform mat4 _World2Object; // inverse model matrix
         uniform vec4 _WorldSpaceLightPos0; 
            // direction to or position of light source
         uniform vec4 _LightColor0; 
            // color of light source (from "Lighting.cginc")
 
         varying vec4 color; 
            // the diffuse lighting computed in the vertex shader
 
         #ifdef VERTEX
 
         void main()
         {
            mat4 modelMatrix = _Object2World;
            mat4 modelMatrixInverse = _World2Object; // unity_Scale.w 
               // is unnecessary because we normalize vectors
 
            vec3 normalDirection = normalize(
               vec3(vec4(gl_Normal, 0.0) * modelMatrixInverse));
            vec3 lightDirection = normalize(
               vec3(_WorldSpaceLightPos0));
 
            vec3 diffuseReflection = vec3(_LightColor0) 
               * vec3(_Color) 
               * max(0.0, dot(normalDirection, lightDirection));
 
            color = vec4(diffuseReflection, 1.0);
            gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;
         }
 
         #endif
 
         #ifdef FRAGMENT
 
         void main()
         {
            gl_FragColor = color;
         }
 
         #endif
 
         ENDGLSL
      }
   }
   // The definition of a fallback shader should be commented out 
   // during development:
   // Fallback "Diffuse"
}
```

When you use this shader, make sure that there is only one light source in the scene, which has to be directional. If there is no light source, you can create a directional light source by selecting **Game Object > Create Other > Directional Light** from the main menu. Also, make sure that the “Forward Rendering Path” is active by selecting **Edit > Project Settings > Player** and then in the **Inspector View** the **Per-Platform Settings > Other Settings > Rendering > Rendering Path** should be set to **Forward**. (See below for more details about the “Forward Rendering Path”.)

### Fallback Shaders[edit]

The line `Fallback "Diffuse"` in the shader code defines a built-in fallback shader in case Unity doesn't find an appropriate subshader. For our example, Unity would use the fallback shader if it doesn't use the “forward rendering path” (see below) or if it couldn't compile the shader code. By choosing the specific name “_Color” for our shader property, we make sure that this built-in fallback shader can also access it. The source code of the built-in shaders is available at Unity's website. Inspection of this source code appears to be the only way to determine a suitable fallback shader and the names of the properties that it is using.

As mentioned, Unity will also use the fallback shader if there is a compile error in the shader code. In this case, the error is only reported in the **Inspector View** of the shader; thus, it might be difficult to notice that the fallback shader is being used. Therefore, it is usually a good idea to comment out the fallback instruction during development of a shader but to include it in the final version for better compatibility.

### Shader Code for Multiple Directional (Pixel) Lights[edit]

So far, we have only considered a single light source. In order to handle multiple light sources, Unity chooses various techniques depending on the rendering and quality settings. In the tutorials here, we will only cover the “Forward Rendering Path”. In order to choose it, select **Edit > Project Settings > Player** and then in the Inspector View set **Per-Platform Settings > Other Settings > Rendering > Rendering Path** to **Forward**. (Moreover, all cameras should be configured to use the player settings, which they are by default.)

In this tutorial we consider only Unity's so-called **pixel lights**. For the first pixel light (which is always a directional light), Unity calls the shader pass tagged with `Tags { "LightMode" = "ForwardBase" }` (as in our code above). For each additional pixel light, Unity calls the shader pass tagged with `Tags { "LightMode" = "ForwardAdd" }`. In order to make sure that all lights are rendered as pixel lights, you have to make sure that the quality settings allow for enough pixel lights: select **Edit > Project Settings > Quality** and then increase the number labeled **Pixel Light Count** in any of the quality settings that you use. If there are more light sources in the scene than the pixel light count allows for, Unity renders only the most important lights as pixel lights. Alternatively, you can set the **Render Mode** of all light sources to **Important** in order to render them as pixel lights. (See Section “Multiple Lights” for a discussion of the less important **vertex lights**.)

Our shader code so far is fine for the `ForwardBase` pass. For the `ForwardAdd` pass, we need to add the reflected light to the light that is already stored in the framebuffer. To this end, we just have to configure blending to add the new fragment color (`gl_FragColor`) to the color in the framebuffer. As discussed in Section “Transparency”, this is achieved by an additive blend equation, which is specified by this line:

`Blend One One`

Blending automatically clamps all results between 0 and 1; thus, we don't have to worry about colors or alpha values greater than 1.

All in all, our new shader for multiple directional lights becomes:

```glsl
Shader "GLSL per-vertex diffuse lighting" {
   Properties {
      _Color ("Diffuse Material Color", Color) = (1,1,1,1)
   }
   SubShader {
      Pass {
         Tags { "LightMode" = "ForwardBase" }
            // pass for first light source

         GLSLPROGRAM

         uniform vec4 _Color; // shader property specified by users

         // The following built-in uniforms (except _LightColor0)
         // are also defined in "UnityCG.glslinc",
         // i.e. one could #include "UnityCG.glslinc"
         uniform mat4 _Object2World; // model matrix
         uniform mat4 _World2Object; // inverse model matrix
         uniform vec4 _WorldSpaceLightPos0;
            // direction to or position of light source
         uniform vec4 _LightColor0;
            // color of light source (from "Lighting.cginc")

         varying vec4 color;
            // the diffuse lighting computed in the vertex shader

         #ifdef VERTEX

         void main()
         {
            mat4 modelMatrix = _Object2World;
            mat4 modelMatrixInverse = _World2Object; // unity_Scale.w
               // is unnecessary because we normalize vectors

            vec3 normalDirection = normalize(vec3(
               vec4(gl_Normal, 0.0) * modelMatrixInverse));
            vec3 lightDirection = normalize(
               vec3(_WorldSpaceLightPos0));

            vec3 diffuseReflection = vec3(_LightColor0) * vec3(_Color)
               * max(0.0, dot(normalDirection, lightDirection));

            color = vec4(diffuseReflection, 1.0);
            gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;
         }

         #endif

         #ifdef FRAGMENT

         void main()
         {
            gl_FragColor = color;
         }

         #endif

         ENDGLSL
      }
      Pass {
         Tags { "LightMode" = "ForwardAdd" }
            // pass for additional light sources
         Blend One One // additive blending

         GLSLPROGRAM

         uniform vec4 _Color; // shader property specified by users

         // The following built-in uniforms (except _LightColor0)
         // are also defined in "UnityCG.glslinc",
         // i.e. one could #include "UnityCG.glslinc"
         uniform mat4 _Object2World; // model matrix
         uniform mat4 _World2Object; // inverse model matrix
         uniform vec4 _WorldSpaceLightPos0;
            // direction to or position of light source
         uniform vec4 _LightColor0;
            // color of light source (from "Lighting.cginc")

         varying vec4 color;
            // the diffuse lighting computed in the vertex shader

         #ifdef VERTEX

         void main()
         {
            mat4 modelMatrix = _Object2World;
            mat4 modelMatrixInverse = _World2Object; // unity_Scale.w
               // is unnecessary because we normalize vectors

            vec3 normalDirection = normalize(vec3(
               vec4(gl_Normal, 0.0) * modelMatrixInverse));
            vec3 lightDirection = normalize(
               vec3(_WorldSpaceLightPos0));

            vec3 diffuseReflection = vec3(_LightColor0) * vec3(_Color)
               * max(0.0, dot(normalDirection, lightDirection));

            color = vec4(diffuseReflection, 1.0);
            gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;
         }

         #endif

         #ifdef FRAGMENT

         void main()
         {
            gl_FragColor = color;
         }

         #endif

         ENDGLSL
      }
   }
   // The definition of a fallback shader should be commented out
   // during development:
   // Fallback "Diffuse"
}
```

This appears to be a rather long shader; however, both passes are identical apart from the tag and the `Blend` setting in the `ForwardAdd` pass.

### Changes for a Point Light Source[edit]

In the case of a directional light source, `_WorldSpaceLightPos0` specifies the direction from which light is coming. In the case of a point light source (or a spot light source), however, `_WorldSpaceLightPos0` specifies the position of the light source in world space, and we have to compute the direction to the light source as the difference vector from the position of the vertex in world space to the position of the light source. Since the 4th coordinate of a point is 1 and the 4th coordinate of a direction is 0, we can easily distinguish between the two cases:

```glsl
vec3 lightDirection;

if (0.0 == _WorldSpaceLightPos0.w) // directional light?
{
   lightDirection = normalize(vec3(_WorldSpaceLightPos0));
}
else // point or spot light
{
   lightDirection = normalize(vec3(
      _WorldSpaceLightPos0 - modelMatrix * gl_Vertex));
}
```

While there is no attenuation of light for directional light sources, we should add some attenuation with distance for point and spot light sources. As light spreads out from a point in three dimensions, it covers ever larger virtual spheres at larger distances. Since the surface area of these spheres increases quadratically with the radius and the total amount of light per sphere stays the same, the amount of light per area decreases quadratically with increasing distance from the point light source. Thus, we should divide the intensity of the light source by the squared distance to the vertex.
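A physically motivated, quadratic attenuation could be sketched like this (a hypothetical variation of the tutorial's code, not the version used below; it reuses the same variables as the shaders in this section):

```glsl
// Hypothetical variant: quadratic (inverse-square) attenuation.
// Using the squared length directly avoids the square root in length().
vec3 vertexToLightSource =
   vec3(_WorldSpaceLightPos0 - modelMatrix * gl_Vertex);
float squaredDistance =
   dot(vertexToLightSource, vertexToLightSource);
float attenuation = 1.0 / squaredDistance; // quadratic attenuation
vec3 lightDirection = normalize(vertexToLightSource);
```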

Since a quadratic attenuation is rather rapid, we use a linear attenuation with distance, i.e. we divide the intensity by the distance instead of the squared distance. The code could be:

```glsl
vec3 lightDirection;
float attenuation;

if (0.0 == _WorldSpaceLightPos0.w) // directional light?
{
   attenuation = 1.0; // no attenuation
   lightDirection = normalize(vec3(_WorldSpaceLightPos0));
}
else // point or spot light
{
   vec3 vertexToLightSource =
      vec3(_WorldSpaceLightPos0 - modelMatrix * gl_Vertex);
   float distance = length(vertexToLightSource);
   attenuation = 1.0 / distance; // linear attenuation
   lightDirection = normalize(vertexToLightSource);
}
```

The factor `attenuation` should then be multiplied with `_LightColor0` to compute the incoming light; see the shader code below. Note that spot light sources have additional features, which are beyond the scope of this tutorial.

Also note that this code is unlikely to give you the best performance because any `if` is usually quite costly. Since `_WorldSpaceLightPos0.w` is either 0 or 1, it is actually not too hard to rewrite the code to avoid the use of `if` and optimize a bit further:

```glsl
vec3 vertexToLightSource = vec3(_WorldSpaceLightPos0
   - modelMatrix * gl_Vertex * _WorldSpaceLightPos0.w);
float one_over_distance = 1.0 / length(vertexToLightSource);
float attenuation =
   mix(1.0, one_over_distance, _WorldSpaceLightPos0.w);
vec3 lightDirection = vertexToLightSource * one_over_distance;
```

However, we will use the version with `if` for clarity. (“Keep it simple, stupid!”)

The complete shader code for multiple directional and point lights is:

```glsl
Shader "GLSL per-vertex diffuse lighting" {
   Properties {
      _Color ("Diffuse Material Color", Color) = (1,1,1,1)
   }
   SubShader {
      Pass {
         Tags { "LightMode" = "ForwardBase" }
            // pass for first light source

         GLSLPROGRAM

         uniform vec4 _Color; // shader property specified by users

         // The following built-in uniforms (except _LightColor0)
         // are also defined in "UnityCG.glslinc",
         // i.e. one could #include "UnityCG.glslinc"
         uniform mat4 _Object2World; // model matrix
         uniform mat4 _World2Object; // inverse model matrix
         uniform vec4 _WorldSpaceLightPos0;
            // direction to or position of light source
         uniform vec4 _LightColor0;
            // color of light source (from "Lighting.cginc")

         varying vec4 color;
            // the diffuse lighting computed in the vertex shader

         #ifdef VERTEX

         void main()
         {
            mat4 modelMatrix = _Object2World;
            mat4 modelMatrixInverse = _World2Object; // unity_Scale.w
               // is unnecessary because we normalize vectors

            vec3 normalDirection = normalize(vec3(
               vec4(gl_Normal, 0.0) * modelMatrixInverse));

            vec3 lightDirection;
            float attenuation;

            if (0.0 == _WorldSpaceLightPos0.w) // directional light?
            {
               attenuation = 1.0; // no attenuation
               lightDirection = normalize(vec3(_WorldSpaceLightPos0));
            }
            else // point or spot light
            {
               vec3 vertexToLightSource = vec3(_WorldSpaceLightPos0
                  - modelMatrix * gl_Vertex);
               float distance = length(vertexToLightSource);
               attenuation = 1.0 / distance; // linear attenuation
               lightDirection = normalize(vertexToLightSource);
            }

            vec3 diffuseReflection = attenuation
               * vec3(_LightColor0) * vec3(_Color)
               * max(0.0, dot(normalDirection, lightDirection));

            color = vec4(diffuseReflection, 1.0);
            gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;
         }

         #endif

         #ifdef FRAGMENT

         void main()
         {
            gl_FragColor = color;
         }

         #endif

         ENDGLSL
      }
      Pass {
         Tags { "LightMode" = "ForwardAdd" }
            // pass for additional light sources
         Blend One One // additive blending

         GLSLPROGRAM

         uniform vec4 _Color; // shader property specified by users

         // The following built-in uniforms (except _LightColor0)
         // are also defined in "UnityCG.glslinc",
         // i.e. one could #include "UnityCG.glslinc"
         uniform mat4 _Object2World; // model matrix
         uniform mat4 _World2Object; // inverse model matrix
         uniform vec4 _WorldSpaceLightPos0;
            // direction to or position of light source
         uniform vec4 _LightColor0;
            // color of light source (from "Lighting.cginc")

         varying vec4 color;
            // the diffuse lighting computed in the vertex shader

         #ifdef VERTEX

         void main()
         {
            mat4 modelMatrix = _Object2World;
            mat4 modelMatrixInverse = _World2Object; // unity_Scale.w
               // is unnecessary because we normalize vectors

            vec3 normalDirection = normalize(vec3(
               vec4(gl_Normal, 0.0) * modelMatrixInverse));

            vec3 lightDirection;
            float attenuation;

            if (0.0 == _WorldSpaceLightPos0.w) // directional light?
            {
               attenuation = 1.0; // no attenuation
               lightDirection = normalize(vec3(_WorldSpaceLightPos0));
            }
            else // point or spot light
            {
               vec3 vertexToLightSource = vec3(_WorldSpaceLightPos0
                  - modelMatrix * gl_Vertex);
               float distance = length(vertexToLightSource);
               attenuation = 1.0 / distance; // linear attenuation
               lightDirection = normalize(vertexToLightSource);
            }

            vec3 diffuseReflection = attenuation
               * vec3(_LightColor0) * vec3(_Color)
               * max(0.0, dot(normalDirection, lightDirection));

            color = vec4(diffuseReflection, 1.0);
            gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;
         }

         #endif

         #ifdef FRAGMENT

         void main()
         {
            gl_FragColor = color;
         }

         #endif

         ENDGLSL
      }
   }
   // The definition of a fallback shader should be commented out
   // during development:
   // Fallback "Diffuse"
}
```

Note that the light source in the `ForwardBase` pass is always a directional light; thus, the code for the first pass could actually be simplified. On the other hand, using the same GLSL code for both passes makes it easier to copy & paste the code from one pass to the other in case we have to edit the shader code.

If there is a problem with the shader, remember to activate the “Forward Rendering Path” by selecting **Edit > Project Settings > Player** and then in the **Inspector View** set **Per-Platform Settings > Other Settings > Rendering > Rendering Path** to **Forward**.

### Changes for a Spotlight[edit]

Unity implements spotlights with the help of cookie textures as described in Section “Cookies”; however, this is somewhat advanced. Here, we treat spotlights as if they were point lights.

### Summary[edit]

Congratulations! You just learned how Unity's per-pixel lights work. This is essential for the following tutorials about more advanced lighting. We have also seen:

- What diffuse reflection is and how to describe it mathematically.
- How to implement diffuse reflection for a single directional light source in a shader.
- How to extend the shader for point light sources with linear attenuation.
- How to further extend the shader to handle multiple per-pixel lights.

### Further Reading[edit]

If you still want to know more

- about transforming normal vectors into world space, you should read Section “Shading in World Space”.
- about uniform variables provided by Unity and shader properties, you should read Section “Shading in World Space”.
- about (additive) blending, you should read Section “Transparency”.
- about pass tags in Unity (e.g. `ForwardBase` or `ForwardAdd`), you should read Unity's ShaderLab reference about pass tags.
- about how Unity processes light sources in general, you should read Unity's manual about rendering paths.

# Specular Highlights[edit]

This tutorial covers **per-vertex lighting** (also known as **Gouraud shading**) using the **Phong reflection model**.

It extends the shader code in Section “Diffuse Reflection” by two additional terms: ambient lighting and specular reflection. Together, the three terms constitute the Phong reflection model. If you haven't read Section “Diffuse Reflection”, this would be a very good opportunity to read it.

### Ambient Light[edit]

Consider the painting by Caravaggio to the left. While large parts of the white shirt are in shadows, no part of it is completely black. Apparently there is always some light being reflected from walls and other objects to illuminate everything in the scene, at least to a certain degree. In the Phong reflection model, this effect is taken into account by ambient lighting, which depends on a general ambient light intensity I_incoming and the material color k_diffuse for diffuse reflection. In an equation for the intensity of ambient lighting I_ambient:

I_ambient = I_incoming k_diffuse

Analogously to the equation for diffuse reflection in Section “Diffuse Reflection”, this equation can also be interpreted as a vector equation for the red, green, and blue components of light.

In Unity, the ambient light is specified by choosing **Edit > Render Settings** from the main menu. In a GLSL shader in Unity, this color is always available as `gl_LightModel.ambient`, which is one of the pre-defined uniforms of the OpenGL compatibility profile mentioned in Section “Shading in World Space”.

### Specular Highlights[edit]

If you have a closer look at Caravaggio's painting, you will see several specular highlights: on the nose, on the hair, on the lips, on the lute, on the violin, on the bow, on the fruits, etc. The Phong reflection model includes a specular reflection term that can simulate such highlights on shiny surfaces; it even includes a parameter to specify a shininess of the material. The shininess specifies how small the highlights are: the shinier, the smaller the highlights.

A perfectly shiny surface will reflect light from the light source only in the geometrically reflected direction **R**. For less than perfectly shiny surfaces, light is reflected to directions around **R**: the smaller the shininess, the wider the spreading. Mathematically, the normalized reflected direction **R** is defined by:

**R** = 2 **N** (**N** · **L**) - **L**

for a normalized surface normal vector **N** and a normalized direction to the light source **L**. In GLSL, the function `vec3 reflect(vec3 I, vec3 N)` (or `vec4 reflect(vec4 I, vec4 N)`) computes the same reflected vector but for the direction `I` from the light source to the point on the surface. Thus, we have to negate our direction **L** to use this function.
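As a quick sanity check (not part of the tutorial's shader), the call to `reflect` with the negated light direction agrees with the equation for **R** above:

```glsl
// Two equivalent ways to compute the reflected direction R
// for a normalized normal N and a normalized light direction L:
vec3 R1 = reflect(-L, N);          // GLSL built-in, called with I = -L
vec3 R2 = 2.0 * N * dot(N, L) - L; // explicit Phong formula
// R1 and R2 are equal up to floating-point rounding, because
// reflect(I, N) = I - 2 * dot(N, I) * N, and substituting I = -L
// yields -L + 2 * dot(N, L) * N.
```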

The specular reflection term computes the specular reflection in the direction of the viewer **V**. As discussed above, the intensity should be large if **V** is close to **R**, where “closeness” is parametrized by the shininess n_shininess. In the Phong reflection model, the cosine of the angle between **R** and **V** raised to the n_shininess-th power is used to generate highlights of different shininess. Similarly to the case of the diffuse reflection, we should clamp negative cosines to 0. Furthermore, the specular term requires a material color k_specular for the specular reflection, which is usually just white such that all highlights have the color of the incoming light I_incoming. For example, all highlights in Caravaggio's painting are white. The specular term of the Phong reflection model is then:

I_specular = I_incoming k_specular max(0, **R** · **V**)^n_shininess

Analogously to the case of the diffuse reflection, the specular term should be ignored if the light source is on the “wrong” side of the surface; i.e., if the dot product **N**·**L** is negative.

### Shader Code[edit]

The shader code for the ambient lighting is straightforward with a component-wise vector-vector product:

```glsl
vec3 ambientLighting =
   vec3(gl_LightModel.ambient) * vec3(_Color);
```

For the implementation of the specular reflection, we require the direction to the viewer in world space, which we can compute as the difference between the camera position and the vertex position (both in world space). The camera position in world space is provided by Unity in the uniform `_WorldSpaceCameraPos`; the vertex position can be transformed to world space as discussed in Section “Diffuse Reflection”. The equation of the specular term in world space could then be implemented like this:

```glsl
vec3 viewDirection = normalize(vec3(
   vec4(_WorldSpaceCameraPos, 1.0) - modelMatrix * gl_Vertex));

vec3 specularReflection;
if (dot(normalDirection, lightDirection) < 0.0)
   // light source on the wrong side?
{
   specularReflection = vec3(0.0, 0.0, 0.0);
      // no specular reflection
}
else // light source on the right side
{
   specularReflection = attenuation
      * vec3(_LightColor0) * vec3(_SpecColor)
      * pow(max(0.0, dot(
      reflect(-lightDirection, normalDirection),
      viewDirection)), _Shininess);
}
```
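As with the attenuation earlier, the wrong-side test could also be written without an `if`. One possible (hypothetical) variation uses the built-in `step` function; `step(0.0, x)` is 0.0 for negative `x` and 1.0 otherwise, so the specular term is simply zeroed out when the light is behind the surface:

```glsl
// Hypothetical branchless variant of the wrong-side test;
// uses the same variables as the snippet above.
vec3 specularReflection =
   step(0.0, dot(normalDirection, lightDirection))
   * attenuation * vec3(_LightColor0) * vec3(_SpecColor)
   * pow(max(0.0, dot(
   reflect(-lightDirection, normalDirection),
   viewDirection)), _Shininess);
```

The tutorial keeps the `if` version for clarity.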

This code snippet uses the same variables as the shader code in Section “Diffuse Reflection” and additionally the user-specified properties `_SpecColor` and `_Shininess`. (The names were specifically chosen such that the fallback shader can access them; see the discussion in Section “Diffuse Reflection”.) `pow(a, b)` computes a raised to the power b.

If the ambient lighting is added to the first pass (we only need it once) and the specular reflection is added to both passes of the full shader of Section “Diffuse Reflection”, it looks like this:

```glsl
Shader "GLSL per-vertex lighting" {
   Properties {
      _Color ("Diffuse Material Color", Color) = (1,1,1,1)
      _SpecColor ("Specular Material Color", Color) = (1,1,1,1)
      _Shininess ("Shininess", Float) = 10
   }
   SubShader {
      Pass {
         Tags { "LightMode" = "ForwardBase" }
            // pass for ambient light and first light source

         GLSLPROGRAM

         // User-specified properties
         uniform vec4 _Color;
         uniform vec4 _SpecColor;
         uniform float _Shininess;

         // The following built-in uniforms (except _LightColor0)
         // are also defined in "UnityCG.glslinc",
         // i.e. one could #include "UnityCG.glslinc"
         uniform vec3 _WorldSpaceCameraPos;
            // camera position in world space
         uniform mat4 _Object2World; // model matrix
         uniform mat4 _World2Object; // inverse model matrix
         uniform vec4 _WorldSpaceLightPos0;
            // direction to or position of light source
         uniform vec4 _LightColor0;
            // color of light source (from "Lighting.cginc")

         varying vec4 color;
            // the Phong lighting computed in the vertex shader

         #ifdef VERTEX

         void main()
         {
            mat4 modelMatrix = _Object2World;
            mat4 modelMatrixInverse = _World2Object; // unity_Scale.w
               // is unnecessary because we normalize vectors

            vec3 normalDirection = normalize(vec3(
               vec4(gl_Normal, 0.0) * modelMatrixInverse));
            vec3 viewDirection = normalize(vec3(
               vec4(_WorldSpaceCameraPos, 1.0)
               - modelMatrix * gl_Vertex));
            vec3 lightDirection;
            float attenuation;

            if (0.0 == _WorldSpaceLightPos0.w) // directional light?
            {
               attenuation = 1.0; // no attenuation
               lightDirection = normalize(vec3(_WorldSpaceLightPos0));
            }
            else // point or spot light
            {
               vec3 vertexToLightSource = vec3(_WorldSpaceLightPos0
                  - modelMatrix * gl_Vertex);
               float distance = length(vertexToLightSource);
               attenuation = 1.0 / distance; // linear attenuation
               lightDirection = normalize(vertexToLightSource);
            }

            vec3 ambientLighting =
               vec3(gl_LightModel.ambient) * vec3(_Color);

            vec3 diffuseReflection = attenuation
               * vec3(_LightColor0) * vec3(_Color)
               * max(0.0, dot(normalDirection, lightDirection));

            vec3 specularReflection;
            if (dot(normalDirection, lightDirection) < 0.0)
               // light source on the wrong side?
            {
               specularReflection = vec3(0.0, 0.0, 0.0);
                  // no specular reflection
            }
            else // light source on the right side
            {
               specularReflection = attenuation
                  * vec3(_LightColor0) * vec3(_SpecColor)
                  * pow(max(0.0, dot(
                  reflect(-lightDirection, normalDirection),
                  viewDirection)), _Shininess);
            }

            color = vec4(ambientLighting + diffuseReflection
               + specularReflection, 1.0);
            gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;
         }

         #endif

         #ifdef FRAGMENT

         void main()
         {
            gl_FragColor = color;
         }

         #endif

         ENDGLSL
      }
      Pass {
         Tags { "LightMode" = "ForwardAdd" }
            // pass for additional light sources
         Blend One One // additive blending

         GLSLPROGRAM

         // User-specified properties
         uniform vec4 _Color;
         uniform vec4 _SpecColor;
         uniform float _Shininess;

         // The following built-in uniforms (except _LightColor0)
         // are also defined in "UnityCG.glslinc",
         // i.e. one could #include "UnityCG.glslinc"
         uniform vec3 _WorldSpaceCameraPos;
            // camera position in world space
         uniform mat4 _Object2World; // model matrix
         uniform mat4 _World2Object; // inverse model matrix
         uniform vec4 _WorldSpaceLightPos0;
            // direction to or position of light source
         uniform vec4 _LightColor0;
            // color of light source (from "Lighting.cginc")

         varying vec4 color;
            // the diffuse lighting computed in the vertex shader

         #ifdef VERTEX

         void main()
         {
            mat4 modelMatrix = _Object2World;
            mat4 modelMatrixInverse = _World2Object; // unity_Scale.w
               // is unnecessary because we normalize vectors

            vec3 normalDirection = normalize(vec3(
               vec4(gl_Normal, 0.0) * modelMatrixInverse));
            vec3 viewDirection = normalize(vec3(
               vec4(_WorldSpaceCameraPos, 1.0)
               - modelMatrix * gl_Vertex));
            vec3 lightDirection;
            float attenuation;

            if (0.0 == _WorldSpaceLightPos0.w) // directional light?
            {
               attenuation = 1.0; // no attenuation
               lightDirection = normalize(vec3(_WorldSpaceLightPos0));
            }
            else // point or spot light
            {
               vec3 vertexToLightSource = vec3(_WorldSpaceLightPos0
                  - modelMatrix * gl_Vertex);
               float distance = length(vertexToLightSource);
               attenuation = 1.0 / distance; // linear attenuation
               lightDirection = normalize(vertexToLightSource);
            }

            // vec3 ambientLighting =
            //    vec3(gl_LightModel.ambient) * vec3(_Color);

            vec3 diffuseReflection = attenuation
               * vec3(_LightColor0) * vec3(_Color)
               * max(0.0, dot(normalDirection, lightDirection));

            vec3 specularReflection;
            if (dot(normalDirection, lightDirection) < 0.0)
               // light source on the wrong side?
            {
               specularReflection = vec3(0.0, 0.0, 0.0);
                  // no specular reflection
            }
            else // light source on the right side
            {
               specularReflection = attenuation
                  * vec3(_LightColor0) * vec3(_SpecColor)
                  * pow(max(0.0, dot(
                  reflect(-lightDirection, normalDirection),
                  viewDirection)), _Shininess);
            }

            color = vec4(diffuseReflection + specularReflection, 1.0);
               // no ambient lighting in this pass
            gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;
         }

         #endif

         #ifdef FRAGMENT

         void main()
         {
            gl_FragColor = color;
         }

         #endif

         ENDGLSL
      }
   }
   // The definition of a fallback shader should be commented out
   // during development:
   // Fallback "Specular"
}
```

### Summary[edit]

Congratulations, you just learned how to implement the Phong reflection model. In particular, we have seen:

- What the ambient lighting in the Phong reflection model is.
- What the specular reflection term in the Phong reflection model is.
- How these terms can be implemented in GLSL in Unity.

### Further Reading[edit]

If you still want to know more

- about the shader code, you should read Section “Diffuse Reflection”.

# Two-Sided Surfaces[edit]

This tutorial covers **two-sided per-vertex lighting**.

It's part of a series of tutorials about basic lighting in Unity. In this tutorial, we extend Section “Specular Highlights” to render two-sided surfaces. If you haven't read Section “Specular Highlights”, this would be a very good time to read it.

### Two-Sided Lighting[edit]

As shown by the figure of the algebraic surface, it's sometimes useful to apply different colors to the two sides of a surface. In Section “Cutaways”, we have seen how a fragment shader can use the built-in variable `gl_FrontFacing` to determine whether a fragment is part of a front-facing or a back-facing triangle. Can a vertex shader also determine whether it is part of a front-facing or a back-facing triangle? The answer is a clear no! One reason is that the same vertex can be part of a front-facing **and** a back-facing triangle at the same time; thus, whatever decision is made in the vertex shader is potentially wrong for some triangles. If you want a simple rule to remember: “Fragments are either front-facing or back-facing. Vertices are bi.”

Thus, two-sided per-vertex lighting has to let the fragment shader determine whether the front or the back material color should be applied, for example with this fragment shader:

```glsl
#ifdef FRAGMENT

varying vec4 frontColor; // color for front face
varying vec4 backColor; // color for back face

void main()
{
   if (gl_FrontFacing) // is the fragment part of a front face?
   {
      gl_FragColor = frontColor;
   }
   else // fragment is part of a back face
   {
      gl_FragColor = backColor;
   }
}

#endif
```

On the other hand, this means that the vertex shader has to compute the surface lighting twice: for a front face and for a back face. Fortunately, this is usually still less work than computing the surface lighting for each fragment.

### Shader Code[edit]

The shader code for two-sided per-vertex lighting is a straightforward extension of the code in Section “Specular Highlights”. It requires two sets of material parameters (front and back) and deactivates culling. The vertex shader computes two colors, one for front faces and one for back faces using the negated normal vector and the second set of material parameters. Then the fragment shader decides which one to apply.

```glsl
Shader "GLSL two-sided per-vertex lighting" {
   Properties {
      _Color ("Front Material Diffuse Color", Color) = (1,1,1,1)
      _SpecColor ("Front Material Specular Color", Color) = (1,1,1,1)
      _Shininess ("Front Material Shininess", Float) = 10
      _BackColor ("Back Material Diffuse Color", Color) = (1,1,1,1)
      _BackSpecColor ("Back Material Specular Color", Color) = (1,1,1,1)
      _BackShininess ("Back Material Shininess", Float) = 10
   }
   SubShader {
      Pass {
         Tags { "LightMode" = "ForwardBase" }
            // pass for ambient light and first light source
         Cull Off // render front faces and back faces

         GLSLPROGRAM

         // User-specified properties
         uniform vec4 _Color;
         uniform vec4 _SpecColor;
         uniform float _Shininess;
         uniform vec4 _BackColor;
         uniform vec4 _BackSpecColor;
         uniform float _BackShininess;

         // The following built-in uniforms (except _LightColor0)
         // are also defined in "UnityCG.glslinc",
         // i.e. one could #include "UnityCG.glslinc"
         uniform vec3 _WorldSpaceCameraPos;
            // camera position in world space
         uniform mat4 _Object2World; // model matrix
         uniform mat4 _World2Object; // inverse model matrix
         uniform vec4 _WorldSpaceLightPos0;
            // direction to or position of light source
         uniform vec4 _LightColor0;
            // color of light source (from "Lighting.cginc")

         varying vec4 frontColor;
            // lighting of front faces computed in the vertex shader
         varying vec4 backColor;
            // lighting of back faces computed in the vertex shader

         #ifdef VERTEX

         void main()
         {
            mat4 modelMatrix = _Object2World;
            mat4 modelMatrixInverse = _World2Object; // unity_Scale.w
               // is unnecessary because we normalize vectors

            vec3 normalDirection = normalize(vec3(
               vec4(gl_Normal, 0.0) * modelMatrixInverse));
            vec3 viewDirection = normalize(vec3(
               vec4(_WorldSpaceCameraPos, 1.0)
               - modelMatrix * gl_Vertex));
            vec3 lightDirection;
            float attenuation;

            if (0.0 == _WorldSpaceLightPos0.w) // directional light?
            {
               attenuation = 1.0; // no attenuation
               lightDirection = normalize(vec3(_WorldSpaceLightPos0));
            }
            else // point or spot light
            {
               vec3 vertexToLightSource = vec3(_WorldSpaceLightPos0
                  - modelMatrix * gl_Vertex);
               float distance = length(vertexToLightSource);
               attenuation = 1.0 / distance; // linear attenuation
               lightDirection = normalize(vertexToLightSource);
            }

            // Computation of lighting for front faces

            vec3 ambientLighting =
               vec3(gl_LightModel.ambient) * vec3(_Color);

            vec3 diffuseReflection = attenuation
               * vec3(_LightColor0) * vec3(_Color)
               * max(0.0, dot(normalDirection, lightDirection));

            vec3 specularReflection;
            if (dot(normalDirection, lightDirection) < 0.0)
               // light source on the wrong side?
            {
               specularReflection = vec3(0.0, 0.0, 0.0);
                  // no specular reflection
            }
            else // light source on the right side
            {
               specularReflection = attenuation
                  * vec3(_LightColor0) * vec3(_SpecColor)
                  * pow(max(0.0, dot(
                  reflect(-lightDirection, normalDirection),
                  viewDirection)), _Shininess);
            }

            frontColor = vec4(ambientLighting + diffuseReflection
               + specularReflection, 1.0);

            // Computation of lighting for back faces
            // (uses negative normalDirection and back material colors)

            vec3 backAmbientLighting =
               vec3(gl_LightModel.ambient) * vec3(_BackColor);

            vec3 backDiffuseReflection = attenuation
               * vec3(_LightColor0) * vec3(_BackColor)
               * max(0.0, dot(-normalDirection, lightDirection));

            vec3 backSpecularReflection;
            if (dot(-normalDirection, lightDirection) < 0.0)
               // light source on the wrong side?
            {
               backSpecularReflection = vec3(0.0, 0.0, 0.0);
                  // no specular reflection
            }
            else // light source on the right side
            {
               backSpecularReflection = attenuation
                  * vec3(_LightColor0) * vec3(_BackSpecColor)
                  * pow(max(0.0, dot(
                  reflect(-lightDirection, -normalDirection),
                  viewDirection)), _BackShininess);
            }

            backColor = vec4(backAmbientLighting + backDiffuseReflection
               + backSpecularReflection, 1.0);

            gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;
         }

         #endif

         #ifdef FRAGMENT

         void main()
         {
            if (gl_FrontFacing) // is the fragment part of a front face?
            {
               gl_FragColor = frontColor;
            }
            else // fragment is part of a back face
            {
               gl_FragColor = backColor;
            }
         }

         #endif

         ENDGLSL
      }
      Pass {
         Tags { "LightMode" = "ForwardAdd" }
            // pass for additional light sources
         Blend One One // additive blending
         Cull Off // render front faces and back faces

         GLSLPROGRAM

         // User-specified properties
         uniform vec4 _Color;
         uniform vec4 _SpecColor;
         uniform float _Shininess;
         uniform vec4 _BackColor;
         uniform vec4 _BackSpecColor;
         uniform float _BackShininess;

         // The following built-in uniforms (except _LightColor0)
         // are also defined in "UnityCG.glslinc",
         // i.e. one could #include "UnityCG.glslinc"
         uniform vec3 _WorldSpaceCameraPos;
            // camera position in world space
         uniform mat4 _Object2World; // model matrix
         uniform mat4 _World2Object; // inverse model matrix
         uniform vec4 _WorldSpaceLightPos0;
            // direction to or position of light source
         uniform vec4 _LightColor0;
            // color of light source (from "Lighting.cginc")

         varying vec4 frontColor;
            // lighting of front faces computed in the vertex shader
         varying vec4 backColor;
            // lighting of back faces computed in the vertex shader

         #ifdef VERTEX

         void main()
         {
            mat4 modelMatrix = _Object2World;
            mat4 modelMatrixInverse = _World2Object; // unity_Scale.w
               // is unnecessary because we normalize vectors

            vec3 normalDirection = normalize(vec3(
               vec4(gl_Normal, 0.0) * modelMatrixInverse));
            vec3 viewDirection = normalize(vec3(
               vec4(_WorldSpaceCameraPos, 1.0)
               - modelMatrix * gl_Vertex));
            vec3 lightDirection;
            float attenuation;

            if (0.0 == _WorldSpaceLightPos0.w) // directional light?
            {
               attenuation = 1.0; // no attenuation
               lightDirection = normalize(vec3(_WorldSpaceLightPos0));
            }
            else // point or spot light
            {
               vec3 vertexToLightSource = vec3(_WorldSpaceLightPos0
                  - modelMatrix * gl_Vertex);
               float distance = length(vertexToLightSource);
               attenuation = 1.0 / distance; // linear attenuation
               lightDirection = normalize(vertexToLightSource);
            }

            // Computation of lighting for front faces

            vec3 diffuseReflection = attenuation
               * vec3(_LightColor0) * vec3(_Color)
               * max(0.0, dot(normalDirection, lightDirection));

            vec3 specularReflection;
            if (dot(normalDirection, lightDirection) < 0.0)
               // light source on the wrong side?
            {
               specularReflection = vec3(0.0, 0.0, 0.0);
                  // no specular reflection
            }
            else // light source on the right side
            {
               specularReflection = attenuation
                  * vec3(_LightColor0) * vec3(_SpecColor)
                  * pow(max(0.0, dot(
                  reflect(-lightDirection, normalDirection),
                  viewDirection)), _Shininess);
            }

            frontColor = vec4(diffuseReflection
               + specularReflection, 1.0);

            // Computation of lighting for back faces
            // (uses negative normalDirection and back material colors)

            vec3 backDiffuseReflection = attenuation
               * vec3(_LightColor0) * vec3(_BackColor)
               * max(0.0, dot(-normalDirection, lightDirection));

            vec3 backSpecularReflection;
            if (dot(-normalDirection, lightDirection) < 0.0)
               // light source on the wrong side?
            {
               backSpecularReflection = vec3(0.0, 0.0, 0.0);
                  // no specular reflection
            }
            else // light source on the right side
            {
               backSpecularReflection = attenuation
                  * vec3(_LightColor0) * vec3(_BackSpecColor)
                  * pow(max(0.0, dot(
                  reflect(-lightDirection, -normalDirection),
                  viewDirection)), _BackShininess);
            }

            backColor = vec4(backDiffuseReflection
               + backSpecularReflection, 1.0);

            gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;
         }

         #endif

         #ifdef FRAGMENT

         void main()
         {
            if (gl_FrontFacing) // is the fragment part of a front face?
            {
               gl_FragColor = frontColor;
            }
            else // fragment is part of a back face
            {
               gl_FragColor = backColor;
            }
         }

         #endif

         ENDGLSL
      }
   }
   // The definition of a fallback shader should be commented out
   // during development:
   // Fallback "Specular"
}
```

Again, this code consists of two passes, where the second pass is the same as the first apart from the additive blending and the missing ambient color.

### Summary[edit]

Congratulations, you made it to the end of this short tutorial with a long shader. We have seen:

- Why a vertex shader cannot distinguish between front-facing and back-facing vertices (because the same vertex might be part of both front-facing and back-facing triangles).
- How to compute lighting for front faces and for back faces in the vertex shader.
- How to let the fragment shader decide which color to apply.

### Further Reading[edit]

If you still want to know more

- about the shader version for single-sided surfaces, you should read Section “Specular Highlights”.
- about front-facing and back-facing triangles in GLSL, you should read Section “Cutaways”.

# Smooth Specular Highlights[edit]

This tutorial covers **per-pixel lighting** (also known as **Phong shading**).

It is based on Section “Specular Highlights”. If you haven't read that tutorial yet, you should read it first.

The main disadvantage of per-vertex lighting (i.e. of computing the surface lighting for each vertex and then interpolating the vertex colors) is its limited quality, in particular for specular highlights, as demonstrated by the figure to the left. The remedy is per-pixel lighting, which computes the lighting for each fragment based on an interpolated normal vector. While the resulting image quality is considerably higher, the performance costs are also significant.

### Per-Pixel Lighting (Phong Shading)[edit]

Per-pixel lighting is also known as Phong shading (in contrast to per-vertex lighting, which is also known as Gouraud shading). This should not be confused with the Phong reflection model (also called Phong lighting), which computes the surface lighting by an ambient, a diffuse, and a specular term as discussed in Section “Specular Highlights”.

The key idea of per-pixel lighting is easy to understand: normal vectors and positions are interpolated for each fragment and the lighting is computed in the fragment shader.
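To see why interpolating vertex colors can wash out highlights, one can evaluate the specular term of the Phong reflection model at the midpoint of a mesh edge, once per-vertex and once per-pixel. The following plain-Python sketch (an illustration with made-up normals and helper functions mimicking the GLSL built-ins `normalize`, `dot`, `reflect`, and `pow`; it is not part of any shader) compares the two:

```python
import math

def normalize(v):
    l = math.sqrt(sum(c * c for c in v))
    return tuple(c / l for c in v)

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def reflect(i, n):
    # GLSL's reflect(I, N) = I - 2 * dot(N, I) * N
    d = dot(n, i)
    return tuple(ic - 2.0 * d * nc for ic, nc in zip(i, n))

def specular(n, light_dir, view_dir, shininess):
    # Phong specular term as used in the shaders of this tutorial
    i = tuple(-c for c in light_dir)
    return max(0.0, dot(reflect(i, n), view_dir)) ** shininess

light = view = (0.0, 1.0, 0.0)    # light and viewer straight above
n0 = normalize((-0.4, 1.0, 0.0))  # vertex normals tilted apart
n1 = normalize((0.4, 1.0, 0.0))
shininess = 50.0

# per-vertex lighting: compute at the vertices, interpolate the colors
gouraud_mid = 0.5 * (specular(n0, light, view, shininess)
                     + specular(n1, light, view, shininess))

# per-pixel lighting: interpolate the normal, then compute the lighting
n_mid = normalize(tuple(0.5 * (a + b) for a, b in zip(n0, n1)))
phong_mid = specular(n_mid, light, view, shininess)

print(gouraud_mid, phong_mid)
```

Neither vertex normal produces a noticeable highlight, so interpolating the two vertex colors yields an almost black midpoint; the interpolated normal, however, points straight up and produces the full highlight.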

### Shader Code[edit]

Apart from optimizations, implementing per-pixel lighting based on shader code for per-vertex lighting is straightforward: the lighting computation is moved from the vertex shader to the fragment shader and the vertex shader has to write the attributes required for the lighting computation to varyings. The fragment shader then uses these varyings to compute the lighting (instead of the attributes that the vertex shader used). That's about it.

In this tutorial, we adapt the shader code from Section “Specular Highlights” to per-pixel lighting. The result looks like this:

```glsl
Shader "GLSL per-pixel lighting" {
   Properties {
      _Color ("Diffuse Material Color", Color) = (1,1,1,1)
      _SpecColor ("Specular Material Color", Color) = (1,1,1,1)
      _Shininess ("Shininess", Float) = 10
   }
   SubShader {
      Pass {
         Tags { "LightMode" = "ForwardBase" } 
            // pass for ambient light and first light source

         GLSLPROGRAM

         // User-specified properties
         uniform vec4 _Color;
         uniform vec4 _SpecColor;
         uniform float _Shininess;

         // The following built-in uniforms (except _LightColor0) 
         // are also defined in "UnityCG.glslinc", 
         // i.e. one could #include "UnityCG.glslinc"
         uniform vec3 _WorldSpaceCameraPos; 
            // camera position in world space
         uniform mat4 _Object2World; // model matrix
         uniform mat4 _World2Object; // inverse model matrix
         uniform vec4 _WorldSpaceLightPos0; 
            // direction to or position of light source
         uniform vec4 _LightColor0; 
            // color of light source (from "Lighting.cginc")

         varying vec4 position; 
            // position of the vertex in world space
         varying vec3 varyingNormalDirection; 
            // surface normal vector in world space

         #ifdef VERTEX

         void main()
         {
            mat4 modelMatrix = _Object2World;
            mat4 modelMatrixInverse = _World2Object; // unity_Scale.w 
               // is unnecessary because we normalize vectors

            position = modelMatrix * gl_Vertex;
            varyingNormalDirection = normalize(vec3(
               vec4(gl_Normal, 0.0) * modelMatrixInverse));

            gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;
         }

         #endif

         #ifdef FRAGMENT

         void main()
         {
            vec3 normalDirection = normalize(varyingNormalDirection);

            vec3 viewDirection = 
               normalize(_WorldSpaceCameraPos - vec3(position));
            vec3 lightDirection;
            float attenuation;

            if (0.0 == _WorldSpaceLightPos0.w) // directional light?
            {
               attenuation = 1.0; // no attenuation
               lightDirection = normalize(vec3(_WorldSpaceLightPos0));
            }
            else // point or spot light
            {
               vec3 vertexToLightSource = 
                  vec3(_WorldSpaceLightPos0 - position);
               float distance = length(vertexToLightSource);
               attenuation = 1.0 / distance; // linear attenuation
               lightDirection = normalize(vertexToLightSource);
            }

            vec3 ambientLighting = 
               vec3(gl_LightModel.ambient) * vec3(_Color);

            vec3 diffuseReflection = 
               attenuation * vec3(_LightColor0) * vec3(_Color) 
               * max(0.0, dot(normalDirection, lightDirection));

            vec3 specularReflection;
            if (dot(normalDirection, lightDirection) < 0.0) 
               // light source on the wrong side?
            {
               specularReflection = vec3(0.0, 0.0, 0.0); 
                  // no specular reflection
            }
            else // light source on the right side
            {
               specularReflection = attenuation * vec3(_LightColor0) 
                  * vec3(_SpecColor) * pow(max(0.0, dot(
                  reflect(-lightDirection, normalDirection), 
                  viewDirection)), _Shininess);
            }

            gl_FragColor = vec4(ambientLighting 
               + diffuseReflection + specularReflection, 1.0);
         }

         #endif

         ENDGLSL
      }

      Pass {
         Tags { "LightMode" = "ForwardAdd" } 
            // pass for additional light sources
         Blend One One // additive blending

         GLSLPROGRAM

         // User-specified properties
         uniform vec4 _Color;
         uniform vec4 _SpecColor;
         uniform float _Shininess;

         // The following built-in uniforms (except _LightColor0) 
         // are also defined in "UnityCG.glslinc", 
         // i.e. one could #include "UnityCG.glslinc"
         uniform vec3 _WorldSpaceCameraPos; 
            // camera position in world space
         uniform mat4 _Object2World; // model matrix
         uniform mat4 _World2Object; // inverse model matrix
         uniform vec4 _WorldSpaceLightPos0; 
            // direction to or position of light source
         uniform vec4 _LightColor0; 
            // color of light source (from "Lighting.cginc")

         varying vec4 position; 
            // position of the vertex in world space
         varying vec3 varyingNormalDirection; 
            // surface normal vector in world space

         #ifdef VERTEX

         void main()
         {
            mat4 modelMatrix = _Object2World;
            mat4 modelMatrixInverse = _World2Object; // unity_Scale.w 
               // is unnecessary because we normalize vectors

            position = modelMatrix * gl_Vertex;
            varyingNormalDirection = normalize(vec3(
               vec4(gl_Normal, 0.0) * modelMatrixInverse));

            gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;
         }

         #endif

         #ifdef FRAGMENT

         void main()
         {
            vec3 normalDirection = normalize(varyingNormalDirection);

            vec3 viewDirection = 
               normalize(_WorldSpaceCameraPos - vec3(position));
            vec3 lightDirection;
            float attenuation;

            if (0.0 == _WorldSpaceLightPos0.w) // directional light?
            {
               attenuation = 1.0; // no attenuation
               lightDirection = normalize(vec3(_WorldSpaceLightPos0));
            }
            else // point or spot light
            {
               vec3 vertexToLightSource = 
                  vec3(_WorldSpaceLightPos0 - position);
               float distance = length(vertexToLightSource);
               attenuation = 1.0 / distance; // linear attenuation
               lightDirection = normalize(vertexToLightSource);
            }

            vec3 diffuseReflection = 
               attenuation * vec3(_LightColor0) * vec3(_Color) 
               * max(0.0, dot(normalDirection, lightDirection));

            vec3 specularReflection;
            if (dot(normalDirection, lightDirection) < 0.0) 
               // light source on the wrong side?
            {
               specularReflection = vec3(0.0, 0.0, 0.0); 
                  // no specular reflection
            }
            else // light source on the right side
            {
               specularReflection = attenuation * vec3(_LightColor0) 
                  * vec3(_SpecColor) * pow(max(0.0, dot(
                  reflect(-lightDirection, normalDirection), 
                  viewDirection)), _Shininess);
            }

            gl_FragColor = 
               vec4(diffuseReflection + specularReflection, 1.0);
         }

         #endif

         ENDGLSL
      }
   }
   // The definition of a fallback shader should be commented out 
   // during development:
   // Fallback "Specular"
}
```

Note that the vertex shader writes a normalized vector to `varyingNormalDirection` in order to make sure that all directions are weighted equally in the interpolation. The fragment shader normalizes it again because the interpolated directions are no longer normalized.
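Why this renormalization is needed can be checked with a small plain-Python calculation (made-up vectors, not shader code): the linear interpolation of two unit vectors is, in general, shorter than one unit.

```python
import math

def length(v):
    return math.sqrt(sum(c * c for c in v))

n0 = (1.0, 0.0, 0.0)  # two unit-length vertex normals
n1 = (0.0, 1.0, 0.0)

# linear interpolation at the midpoint, as the rasterizer
# does for varyings across a triangle
mid = tuple(0.5 * (a + b) for a, b in zip(n0, n1))
print(length(mid))  # about 0.707, i.e. no longer unit length

renormalized = tuple(c / length(mid) for c in mid)
print(length(renormalized))  # 1.0 again
```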

### Summary[edit]

Congratulations, now you know how per-pixel Phong lighting works. We have seen:

- Why the quality provided by per-vertex lighting is sometimes insufficient (in particular because of specular highlights).
- How per-pixel lighting works and how to implement it based on a shader for per-vertex lighting.

### Further Reading[edit]

If you still want to know more

- about the shader version for per-vertex lighting, you should read Section “Specular Highlights”.

# Two-Sided Smooth Surfaces[edit]

This tutorial covers **two-sided per-pixel lighting** (i.e. **two-sided Phong shading**).

Here we combine the per-pixel lighting discussed in Section “Smooth Specular Highlights” with the two-sided lighting discussed in Section “Two-Sided Surfaces”.

### Shader Code[edit]

The required changes to the code of Section “Smooth Specular Highlights” are: new properties for the back material, deactivation of culling, and new local variables for the material parameters in the fragment shader, which are set either to the front material parameters or to the back material parameters according to `gl_FrontFacing`. Also, the surface normal vector is negated in case a back face is rendered. It's actually quite straightforward. The code looks like this:

```glsl
Shader "GLSL two-sided per-pixel lighting" {
   Properties {
      _Color ("Front Material Diffuse Color", Color) = (1,1,1,1)
      _SpecColor ("Front Material Specular Color", Color) = (1,1,1,1)
      _Shininess ("Front Material Shininess", Float) = 10
      _BackColor ("Back Material Diffuse Color", Color) = (1,1,1,1)
      _BackSpecColor ("Back Material Specular Color", Color) 
         = (1,1,1,1)
      _BackShininess ("Back Material Shininess", Float) = 10
   }
   SubShader {
      Pass {
         Tags { "LightMode" = "ForwardBase" } 
            // pass for ambient light and first light source
         Cull Off

         GLSLPROGRAM

         // User-specified properties
         uniform vec4 _Color;
         uniform vec4 _SpecColor;
         uniform float _Shininess;
         uniform vec4 _BackColor;
         uniform vec4 _BackSpecColor;
         uniform float _BackShininess;

         // The following built-in uniforms (except _LightColor0) 
         // are also defined in "UnityCG.glslinc", 
         // i.e. one could #include "UnityCG.glslinc"
         uniform vec3 _WorldSpaceCameraPos; 
            // camera position in world space
         uniform mat4 _Object2World; // model matrix
         uniform mat4 _World2Object; // inverse model matrix
         uniform vec4 _WorldSpaceLightPos0; 
            // direction to or position of light source
         uniform vec4 _LightColor0; 
            // color of light source (from "Lighting.cginc")

         varying vec4 position; 
            // position of the vertex in world space
         varying vec3 varyingNormalDirection; 
            // surface normal vector in world space

         #ifdef VERTEX

         void main()
         {
            mat4 modelMatrix = _Object2World;
            mat4 modelMatrixInverse = _World2Object; // unity_Scale.w 
               // is unnecessary because we normalize vectors

            position = modelMatrix * gl_Vertex;
            varyingNormalDirection = normalize(vec3(
               vec4(gl_Normal, 0.0) * modelMatrixInverse));

            gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;
         }

         #endif

         #ifdef FRAGMENT

         void main()
         {
            vec3 normalDirection = normalize(varyingNormalDirection);

            vec4 diffuseColor;
            vec4 specularColor;
            float shininess;
            if (gl_FrontFacing)
            {
               diffuseColor = _Color;
               specularColor = _SpecColor;
               shininess = _Shininess;
            }
            else
            {
               diffuseColor = _BackColor;
               specularColor = _BackSpecColor;
               shininess = _BackShininess;
               normalDirection = -normalDirection;
            }

            vec3 viewDirection = normalize(
               _WorldSpaceCameraPos - vec3(position));
            vec3 lightDirection;
            float attenuation;

            if (0.0 == _WorldSpaceLightPos0.w) // directional light?
            {
               attenuation = 1.0; // no attenuation
               lightDirection = normalize(vec3(_WorldSpaceLightPos0));
            }
            else // point or spot light
            {
               vec3 vertexToLightSource = 
                  vec3(_WorldSpaceLightPos0 - position);
               float distance = length(vertexToLightSource);
               attenuation = 1.0 / distance; // linear attenuation
               lightDirection = normalize(vertexToLightSource);
            }

            vec3 ambientLighting = 
               vec3(gl_LightModel.ambient) * vec3(diffuseColor);

            vec3 diffuseReflection = 
               attenuation * vec3(_LightColor0) * vec3(diffuseColor) 
               * max(0.0, dot(normalDirection, lightDirection));

            vec3 specularReflection;
            if (dot(normalDirection, lightDirection) < 0.0) 
               // light source on the wrong side?
            {
               specularReflection = vec3(0.0, 0.0, 0.0); 
                  // no specular reflection
            }
            else // light source on the right side
            {
               specularReflection = attenuation * vec3(_LightColor0) 
                  * vec3(specularColor) * pow(max(0.0, dot(
                  reflect(-lightDirection, normalDirection), 
                  viewDirection)), shininess);
            }

            gl_FragColor = vec4(ambientLighting 
               + diffuseReflection + specularReflection, 1.0);
         }

         #endif

         ENDGLSL
      }

      Pass {
         Tags { "LightMode" = "ForwardAdd" } 
            // pass for additional light sources
         Blend One One // additive blending
         Cull Off

         GLSLPROGRAM

         // User-specified properties
         uniform vec4 _Color;
         uniform vec4 _SpecColor;
         uniform float _Shininess;
         uniform vec4 _BackColor;
         uniform vec4 _BackSpecColor;
         uniform float _BackShininess;

         // The following built-in uniforms (except _LightColor0) 
         // are also defined in "UnityCG.glslinc", 
         // i.e. one could #include "UnityCG.glslinc"
         uniform vec3 _WorldSpaceCameraPos; 
            // camera position in world space
         uniform mat4 _Object2World; // model matrix
         uniform mat4 _World2Object; // inverse model matrix
         uniform vec4 _WorldSpaceLightPos0; 
            // direction to or position of light source
         uniform vec4 _LightColor0; 
            // color of light source (from "Lighting.cginc")

         varying vec4 position; 
            // position of the vertex in world space
         varying vec3 varyingNormalDirection; 
            // surface normal vector in world space

         #ifdef VERTEX

         void main()
         {
            mat4 modelMatrix = _Object2World;
            mat4 modelMatrixInverse = _World2Object; // unity_Scale.w 
               // is unnecessary because we normalize vectors

            position = modelMatrix * gl_Vertex;
            varyingNormalDirection = normalize(vec3(
               vec4(gl_Normal, 0.0) * modelMatrixInverse));

            gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;
         }

         #endif

         #ifdef FRAGMENT

         void main()
         {
            vec3 normalDirection = normalize(varyingNormalDirection);

            vec4 diffuseColor;
            vec4 specularColor;
            float shininess;
            if (gl_FrontFacing)
            {
               diffuseColor = _Color;
               specularColor = _SpecColor;
               shininess = _Shininess;
            }
            else
            {
               diffuseColor = _BackColor;
               specularColor = _BackSpecColor;
               shininess = _BackShininess;
               normalDirection = -normalDirection;
            }

            vec3 viewDirection = normalize(
               _WorldSpaceCameraPos - vec3(position));
            vec3 lightDirection;
            float attenuation;

            if (0.0 == _WorldSpaceLightPos0.w) // directional light?
            {
               attenuation = 1.0; // no attenuation
               lightDirection = normalize(vec3(_WorldSpaceLightPos0));
            }
            else // point or spot light
            {
               vec3 vertexToLightSource = 
                  vec3(_WorldSpaceLightPos0 - position);
               float distance = length(vertexToLightSource);
               attenuation = 1.0 / distance; // linear attenuation
               lightDirection = normalize(vertexToLightSource);
            }

            vec3 diffuseReflection = 
               attenuation * vec3(_LightColor0) * vec3(diffuseColor) 
               * max(0.0, dot(normalDirection, lightDirection));

            vec3 specularReflection;
            if (dot(normalDirection, lightDirection) < 0.0) 
               // light source on the wrong side?
            {
               specularReflection = vec3(0.0, 0.0, 0.0); 
                  // no specular reflection
            }
            else // light source on the right side
            {
               specularReflection = attenuation * vec3(_LightColor0) 
                  * vec3(specularColor) * pow(max(0.0, dot(
                  reflect(-lightDirection, normalDirection), 
                  viewDirection)), shininess);
            }

            gl_FragColor = 
               vec4(diffuseReflection + specularReflection, 1.0);
         }

         #endif

         ENDGLSL
      }
   }
   // The definition of a fallback shader should be commented out 
   // during development:
   // Fallback "Specular"
}
```

As always, the only difference between the two passes is the lack of ambient lighting and the additive blending in the second pass.

### Summary[edit]

Congratulations, you have reached the end of this short tutorial. We have seen:

- How two-sided surfaces can be rendered with per-pixel lighting.

### Further Reading[edit]

If you still want to know more

- about the shader version for single-sided per-pixel lighting, you should read Section “Smooth Specular Highlights”.
- about the shader version for two-sided per-vertex lighting, you should read Section “Two-Sided Surfaces”.

# Multiple Lights[edit]

This tutorial covers **lighting by multiple light sources in one pass**. In particular, it covers Unity's so-called “vertex lights” in the `ForwardBase` pass.

This tutorial is an extension of Section “Smooth Specular Highlights”. If you haven't read that tutorial, you should read it first.

### Multiple Lights in One Pass[edit]

As discussed in Section “Diffuse Reflection”, Unity's forward rendering path uses separate passes for the most important light sources. These are called “pixel lights” because the built-in shaders render them with per-pixel lighting. All light sources with the **Render Mode** set to **Important** are rendered as pixel lights. If the **Pixel Light Count** of the **Quality** project settings allows for more pixel lights, then some of the light sources with **Render Mode** set to **Auto** are also rendered as pixel lights. What happens to the other light sources? The built-in shaders of Unity render four additional lights as **vertex lights** in the `ForwardBase` pass. As the name indicates, the built-in shaders render these lights with per-vertex lighting. This is what this tutorial is about. (Further lights are approximated by spherical harmonic lighting, which is not covered here.)

Unfortunately, it is somewhat unclear how to access the four vertex lights (i.e. their positions and colors). Here is what appears to work in Unity 3.4 on Windows and Mac OS X:

```glsl
// Built-in uniforms for "vertex lights"
uniform vec4 unity_LightColor[4]; 
   // array of the colors of the 4 light sources
uniform vec4 unity_4LightPosX0; 
   // x coordinates of the 4 light sources in world space
uniform vec4 unity_4LightPosY0; 
   // y coordinates of the 4 light sources in world space
uniform vec4 unity_4LightPosZ0; 
   // z coordinates of the 4 light sources in world space
uniform vec4 unity_4LightAtten0; 
   // scale factors for attenuation with squared distance

// uniform vec4 unity_LightPosition[4] is apparently not 
// always correctly set in Unity 3.4
// uniform vec4 unity_LightAtten[4] is apparently not 
// always correctly set in Unity 3.4
```

Depending on your platform and version of Unity, you might have to use `unity_LightPosition[4]` instead of `unity_4LightPosX0`, `unity_4LightPosY0`, and `unity_4LightPosZ0`. Similarly, you might have to use `unity_LightAtten[4]` instead of `unity_4LightAtten0`. Note what's not available: neither a cookie texture nor the transformation to light space (and therefore not the direction of spotlights either). Also, no fourth component of the light positions is available; thus, it is unclear whether a vertex light is a directional light, a point light, or a spotlight.

Here, we follow Unity's built-in shaders and only compute the diffuse reflection by vertex lights using per-vertex lighting. This can be computed with the following for-loop inside the vertex shader:

```glsl
vertexLighting = vec3(0.0, 0.0, 0.0);
for (int index = 0; index < 4; index++)
{
   vec4 lightPosition = vec4(unity_4LightPosX0[index], 
      unity_4LightPosY0[index], 
      unity_4LightPosZ0[index], 1.0);

   vec3 vertexToLightSource = vec3(lightPosition - position);
   vec3 lightDirection = normalize(vertexToLightSource);
   float squaredDistance = 
      dot(vertexToLightSource, vertexToLightSource);
   float attenuation = 1.0 / (1.0 
      + unity_4LightAtten0[index] * squaredDistance);
   vec3 diffuseReflection = attenuation 
      * vec3(unity_LightColor[index]) * vec3(_Color) 
      * max(0.0, dot(varyingNormalDirection, lightDirection));

   vertexLighting = vertexLighting + diffuseReflection;
}
```

The total diffuse lighting by all vertex lights is accumulated in `vertexLighting` by initializing it to black and then adding the diffuse reflection of each vertex light to it at the end of the loop body. A for-loop should be familiar to any C/C++/Java/JavaScript programmer. Note that for-loops in GLSL are sometimes severely limited; in particular, the limits (here: 0 and 4) have to be constants in Unity, i.e. you cannot even use uniforms to determine the limits. (The technical reason is that the limits have to be known at compile time in order to “unroll” the loop.)

This is more or less how vertex lights are computed in Unity's built-in shaders. However, remember that nothing would stop you from computing specular reflection or per-pixel lighting with these “vertex lights”.

### Complete Shader Code[edit]

In the context of the shader code from Section “Smooth Specular Highlights”, the complete shader code is:

```glsl
Shader "GLSL per-pixel lighting with vertex lights" {
   Properties {
      _Color ("Diffuse Material Color", Color) = (1,1,1,1)
      _SpecColor ("Specular Material Color", Color) = (1,1,1,1)
      _Shininess ("Shininess", Float) = 10
   }
   SubShader {
      Pass {
         Tags { "LightMode" = "ForwardBase" } // pass for 
            // 4 vertex lights, ambient light & first pixel light

         GLSLPROGRAM
         #pragma multi_compile_fwdbase

         // User-specified properties
         uniform vec4 _Color;
         uniform vec4 _SpecColor;
         uniform float _Shininess;

         // The following built-in uniforms (except _LightColor0) 
         // are also defined in "UnityCG.glslinc", 
         // i.e. one could #include "UnityCG.glslinc"
         uniform vec3 _WorldSpaceCameraPos; 
            // camera position in world space
         uniform mat4 _Object2World; // model matrix
         uniform mat4 _World2Object; // inverse model matrix
         uniform vec4 _WorldSpaceLightPos0; 
            // direction to or position of light source
         uniform vec4 _LightColor0; 
            // color of light source (from "Lighting.cginc")

         // Built-in uniforms for "vertex lights"
         uniform vec4 unity_LightColor[4];
         uniform vec4 unity_4LightPosX0; 
            // x coordinates of the 4 light sources in world space
         uniform vec4 unity_4LightPosY0; 
            // y coordinates of the 4 light sources in world space
         uniform vec4 unity_4LightPosZ0; 
            // z coordinates of the 4 light sources in world space
         uniform vec4 unity_4LightAtten0; 
            // scale factors for attenuation with squared distance

         // Varyings
         varying vec4 position; 
            // position of the vertex (and fragment) in world space
         varying vec3 varyingNormalDirection; 
            // surface normal vector in world space
         varying vec3 vertexLighting;

         #ifdef VERTEX

         void main()
         {
            mat4 modelMatrix = _Object2World;
            mat4 modelMatrixInverse = _World2Object; // unity_Scale.w 
               // is unnecessary because we normalize vectors

            position = modelMatrix * gl_Vertex;
            varyingNormalDirection = normalize(vec3(
               vec4(gl_Normal, 0.0) * modelMatrixInverse));

            gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;

            // Diffuse reflection by four "vertex lights"
            vertexLighting = vec3(0.0, 0.0, 0.0);
            #ifdef VERTEXLIGHT_ON
            for (int index = 0; index < 4; index++)
            {
               vec4 lightPosition = vec4(unity_4LightPosX0[index], 
                  unity_4LightPosY0[index], 
                  unity_4LightPosZ0[index], 1.0);

               vec3 vertexToLightSource = 
                  vec3(lightPosition - position);
               vec3 lightDirection = normalize(vertexToLightSource);
               float squaredDistance = 
                  dot(vertexToLightSource, vertexToLightSource);
               float attenuation = 1.0 / (1.0 
                  + unity_4LightAtten0[index] * squaredDistance);
               vec3 diffuseReflection = attenuation 
                  * vec3(unity_LightColor[index]) * vec3(_Color) 
                  * max(0.0, 
                  dot(varyingNormalDirection, lightDirection));

               vertexLighting = vertexLighting + diffuseReflection;
            }
            #endif
         }

         #endif

         #ifdef FRAGMENT

         void main()
         {
            vec3 normalDirection = normalize(varyingNormalDirection);

            vec3 viewDirection = 
               normalize(_WorldSpaceCameraPos - vec3(position));
            vec3 lightDirection;
            float attenuation;

            if (0.0 == _WorldSpaceLightPos0.w) // directional light?
            {
               attenuation = 1.0; // no attenuation
               lightDirection = normalize(vec3(_WorldSpaceLightPos0));
            }
            else // point or spot light
            {
               vec3 vertexToLightSource = 
                  vec3(_WorldSpaceLightPos0 - position);
               float distance = length(vertexToLightSource);
               attenuation = 1.0 / distance; // linear attenuation
               lightDirection = normalize(vertexToLightSource);
            }

            vec3 ambientLighting = 
               vec3(gl_LightModel.ambient) * vec3(_Color);

            vec3 diffuseReflection = 
               attenuation * vec3(_LightColor0) * vec3(_Color) 
               * max(0.0, dot(normalDirection, lightDirection));

            vec3 specularReflection;
            if (dot(normalDirection, lightDirection) < 0.0) 
               // light source on the wrong side?
            {
               specularReflection = vec3(0.0, 0.0, 0.0); 
                  // no specular reflection
            }
            else // light source on the right side
            {
               specularReflection = attenuation * vec3(_LightColor0) 
                  * vec3(_SpecColor) * pow(max(0.0, dot(
                  reflect(-lightDirection, normalDirection), 
                  viewDirection)), _Shininess);
            }

            gl_FragColor = vec4(vertexLighting + ambientLighting 
               + diffuseReflection + specularReflection, 1.0);
         }

         #endif

         ENDGLSL
      }

      Pass {
         Tags { "LightMode" = "ForwardAdd" } 
            // pass for additional "pixel lights"
         Blend One One // additive blending

         GLSLPROGRAM

         // User-specified properties
         uniform vec4 _Color;
         uniform vec4 _SpecColor;
         uniform float _Shininess;

         // The following built-in uniforms (except _LightColor0) 
         // are also defined in "UnityCG.glslinc", 
         // i.e. one could #include "UnityCG.glslinc"
         uniform vec3 _WorldSpaceCameraPos; 
            // camera position in world space
         uniform mat4 _Object2World; // model matrix
         uniform mat4 _World2Object; // inverse model matrix
         uniform vec4 _WorldSpaceLightPos0; 
            // direction to or position of light source
         uniform vec4 _LightColor0; 
            // color of light source (from "Lighting.cginc")

         // Varyings
         varying vec4 position; 
            // position of the vertex (and fragment) in world space
         varying vec3 varyingNormalDirection; 
            // surface normal vector in world space

         #ifdef VERTEX

         void main()
         {
            mat4 modelMatrix = _Object2World;
            mat4 modelMatrixInverse = _World2Object; // unity_Scale.w 
               // is unnecessary because we normalize vectors

            position = modelMatrix * gl_Vertex;
            varyingNormalDirection = normalize(vec3(
               vec4(gl_Normal, 0.0) * modelMatrixInverse));

            gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;
         }

         #endif

         #ifdef FRAGMENT

         void main()
         {
            vec3 normalDirection = normalize(varyingNormalDirection);

            vec3 viewDirection = 
               normalize(_WorldSpaceCameraPos - vec3(position));
            vec3 lightDirection;
            float attenuation;

            if (0.0 == _WorldSpaceLightPos0.w) // directional light?
            {
               attenuation = 1.0; // no attenuation
               lightDirection = normalize(vec3(_WorldSpaceLightPos0));
            }
            else // point or spot light
            {
               vec3 vertexToLightSource = 
                  vec3(_WorldSpaceLightPos0 - position);
               float distance = length(vertexToLightSource);
               attenuation = 1.0 / distance; // linear attenuation
               lightDirection = normalize(vertexToLightSource);
            }

            vec3 diffuseReflection = 
               attenuation * vec3(_LightColor0) * vec3(_Color) 
               * max(0.0, dot(normalDirection, lightDirection));

            vec3 specularReflection;
            if (dot(normalDirection, lightDirection) < 0.0) 
               // light source on the wrong side?
            {
               specularReflection = vec3(0.0, 0.0, 0.0); 
                  // no specular reflection
            }
            else // light source on the right side
            {
               specularReflection = attenuation * vec3(_LightColor0) 
                  * vec3(_SpecColor) * pow(max(0.0, dot(
                  reflect(-lightDirection, normalDirection), 
                  viewDirection)), _Shininess);
            }

            gl_FragColor = 
               vec4(diffuseReflection + specularReflection, 1.0);
         }

         #endif

         ENDGLSL
      }
   }
   // The definition of a fallback shader should be commented out 
   // during development:
   // Fallback "Specular"
}
```

The use of `#pragma multi_compile_fwdbase` and `#ifdef VERTEXLIGHT_ON ... #endif` appears to be necessary to make sure that no vertex lighting is computed when Unity doesn't provide the data.

### Summary[edit]

Congratulations, you have reached the end of this tutorial. We have seen:

- How Unity's vertex lights are specified.
- How a for-loop can be used in GLSL to compute the lighting of multiple lights in one pass.

### Further Reading[edit]

If you still want to know more

- about other parts of the shader code, you should read Section “Smooth Specular Highlights”.
- about Unity's forward rendering path and what is computed in the `ForwardBase` pass, you should read Unity's reference about forward rendering.

# Textured Spheres[edit]

This tutorial introduces **texture mapping**.

It's the first in a series of tutorials about texturing in GLSL shaders in Unity. In this tutorial, we start with a single texture map on a sphere. More specifically, we map an image of the Earth's surface onto a sphere. Based on this, further tutorials cover topics such as lighting of textured surfaces, transparent textures, multitexturing, gloss mapping, etc.

### Texture Mapping[edit]

The basic idea of “texture mapping” (or “texturing”) is to map an image (i.e. a “texture” or a “texture map”) onto a triangle mesh; in other words, to put a flat image onto the surface of a three-dimensional shape.

To this end, “texture coordinates” are defined, which simply specify a position in the texture (i.e. the image). The horizontal coordinate is officially called `S` and the vertical coordinate `T`. However, it is very common to refer to them as `x` and `y`. In animation and modeling tools, texture coordinates are usually called `U` and `V`.

In order to map the texture image to a mesh, every vertex of the mesh is given a pair of texture coordinates. (This process (and the result) is sometimes called “UV mapping” since each vertex is mapped to a point in the UV-space.) Thus, every vertex is mapped to a point in the texture image. The texture coordinates of the vertices can then be interpolated for each point of any triangle between three vertices and thus every point of all triangles of the mesh can have a pair of (interpolated) texture coordinates. These texture coordinates map each point of the mesh to a specific position in the texture map and therefore to the color at this position. Thus, rendering a texture-mapped mesh consists of two steps for all visible points: interpolation of texture coordinates and a look-up of the color of the texture image at the position specified by the interpolated texture coordinates.
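The interpolation step can be written down explicitly: a point inside a triangle has three barycentric weights that sum to one, and its texture coordinates are the correspondingly weighted sum of the three vertex texture coordinates. A plain-Python sketch (with made-up vertex coordinates; real GPUs additionally apply perspective correction, which is ignored here) looks like this:

```python
# texture coordinates assigned to the three vertices of a triangle
uv0, uv1, uv2 = (0.0, 0.0), (1.0, 0.0), (0.0, 1.0)

def interpolate_uv(w0, w1, w2):
    # the barycentric weights of a point inside the triangle sum to one
    assert abs(w0 + w1 + w2 - 1.0) < 1e-9
    return (w0 * uv0[0] + w1 * uv1[0] + w2 * uv2[0],
            w0 * uv0[1] + w1 * uv1[1] + w2 * uv2[1])

corner = interpolate_uv(1.0, 0.0, 0.0)  # at the first vertex
print(corner)                           # (0.0, 0.0)

center = interpolate_uv(1/3, 1/3, 1/3)  # at the centroid
print(center)                           # (1/3, 1/3)
```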

In OpenGL, any valid floating-point number is a valid texture coordinate. However, when the GPU is asked to look up a pixel (or “texel”) of a texture image (e.g. with the “texture2D” instruction described below), it will internally map the texture coordinates to the range between 0 and 1, depending on the “Wrap Mode” that is specified when importing the texture: wrap mode “repeat” basically uses the fractional part of the texture coordinates to determine texture coordinates in the range between 0 and 1. On the other hand, wrap mode “clamp” clamps the texture coordinates to this range. These internal texture coordinates in the range between 0 and 1 are then used to determine the position in the texture image: (0, 0) specifies the lower left corner of the texture image; (1, 0) the lower right corner; (0, 1) the upper left corner; etc.
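The two wrap modes can be modeled in a few lines of plain Python (ignoring filtering and other details of real texture samplers):

```python
import math

def wrap_repeat(t):
    # "repeat": keep only the fractional part of the coordinate
    return t - math.floor(t)

def wrap_clamp(t):
    # "clamp": force the coordinate into the range [0, 1]
    return min(max(t, 0.0), 1.0)

print(wrap_repeat(1.25), wrap_clamp(1.25))    # 0.25 1.0
print(wrap_repeat(-0.25), wrap_clamp(-0.25))  # 0.75 0.0
```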

### Texturing a Sphere in Unity[edit]

To map the image of the Earth's surface onto a sphere in Unity, you first have to import the image into Unity. Click the image until you get to a larger version and save it (usually with a right-click) to your computer (remember where you saved it). Then switch to Unity and choose **Assets > Import New Asset...** from the main menu. Choose the image file and click on **Import** in the file selector box. The imported texture image should appear in the **Project View**. By selecting it there, details about the way it is imported appear (and can be changed) in the **Inspector View**.

Now create a sphere, a material, and a shader, and attach the shader to the material and the material to the sphere as described in Section “Minimal Shader”. The shader code should be:

Shader "GLSL shader with single texture" { Properties { _MainTex ("Texture Image", 2D) = "white" {} // a 2D texture property that we call "_MainTex", which should // be labeled "Texture Image" in Unity's user interface. // By default we use the built-in texture "white" // (alternatives: "black", "gray" and "bump"). } SubShader { Pass { GLSLPROGRAM uniform sampler2D _MainTex; // a uniform variable refering to the property above // (in fact, this is just a small integer specifying a // "texture unit", which has the texture image "bound" // to it; Unity takes care of this). varying vec4 textureCoordinates; // the texture coordinates at the vertices, // which are interpolated for each fragment #ifdef VERTEX void main() { textureCoordinates = gl_MultiTexCoord0; // Unity provides default longitude-latitude-like // texture coordinates at all vertices of a // sphere mesh as the attribute "gl_MultiTexCoord0". gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex; } #endif #ifdef FRAGMENT void main() { gl_FragColor = texture2D(_MainTex, vec2(textureCoordinates)); // look up the color of the texture image specified by // the uniform "_MainTex" at the position specified by // "textureCoordinates.x" and "textureCoordinates.y" // and return it in "gl_FragColor" } #endif ENDGLSL } } // The definition of a fallback shader should be commented out // during development: // Fallback "Unlit/Texture" }

Note that the name `_MainTex` was chosen to make sure that the fallback shader `Unlit/Texture` can access it (see the discussion of fallback shaders in Section “Diffuse Reflection”).

The sphere should now be white. If it is grey, you should check whether the shader is attached to the material and the material is attached to the sphere. If the sphere is magenta, you should check the shader code. In particular, you should select the shader in the **Project View** and read the error message in the **Inspector View**.

If the sphere is white, select the sphere in the **Hierarchy View** or the **Scene View** and look at the information in the **Inspector View**. Your material should appear under **Mesh Renderer** and under it should be a label **Texture Image**. (Otherwise click on the material bar to make it appear.) The label “Texture Image” is the same that we specified for our shader property `_MainTex` in the shader code. There is an empty box to the right of this label. Either click on the small **Select** button in the box and select the imported texture image or drag & drop the texture image from the **Project View** to this empty box.

If everything went right, the texture image should now appear on the sphere. Congratulations!

### How It Works[edit]

Since many techniques use texture mapping, it pays off very well to understand what is happening here. Therefore, let's review the shader code:

The vertices of Unity's sphere object come with attribute data in `gl_MultiTexCoord0` for each vertex, which specifies texture coordinates that are similar to longitude and latitude (but range from 0 to 1). This is analogous to the attribute `gl_Vertex`, which specifies a position in object space, except that `gl_MultiTexCoord0` specifies texture coordinates in the space of the texture image.

The vertex shader then writes the texture coordinates of each vertex to the varying variable `textureCoordinates`. For each fragment of a triangle (i.e. each covered pixel), the values of this varying at the three triangle vertices are interpolated (see the description in Section “Rasterization”) and the interpolated texture coordinates are given to the fragment shader. The fragment shader then uses them to look up a color in the texture image specified by the uniform `_MainTex` at the interpolated position in texture space and returns this color in `gl_FragColor`, which is then written to the framebuffer and displayed on the screen.

It is crucial that you gain a good idea of these steps in order to understand the more complicated texture mapping techniques presented in other tutorials.

### Repeating and Moving Textures[edit]

In Unity's interface for the shader above, you might have noticed the parameters **Tiling** and **Offset**, each with an **x** and a **y** component. In built-in shaders, these parameters allow you to repeat the texture (by shrinking the texture image in texture coordinate space) and move the texture image on the surface (by offsetting it in texture coordinate space). In order to be consistent with this behavior, another uniform has to be defined:

uniform vec4 _MainTex_ST; // tiling and offset parameters of property "_MainTex"

For each texture property, Unity offers such a `vec4` uniform with the suffix “_ST”. (Remember: “S” and “T” are the official names of the texture coordinates, which are usually called “U” and “V”, or “x” and “y”.) This uniform holds the **x** and **y** components of the **Tiling** parameter in `_MainTex_ST.x` and `_MainTex_ST.y`, while the **x** and **y** components of the **Offset** parameter are stored in `_MainTex_ST.z` and `_MainTex_ST.w`. The uniform should be used like this:

gl_FragColor = texture2D(_MainTex, _MainTex_ST.xy * textureCoordinates.xy + _MainTex_ST.zw);

This makes the shader behave like the built-in shaders. In the other tutorials, this feature is usually not implemented in order to keep the shader code a bit cleaner.
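The arithmetic of this look-up is easy to verify outside of GLSL. Here is a small Python sketch (the names are made up for this illustration) mirroring the expression `_MainTex_ST.xy * textureCoordinates.xy + _MainTex_ST.zw`:

```python
# Sketch: Unity's Tiling (scale) and Offset (translation) parameters
# applied to texture coordinates, as in the fragment shader above.

def transform_uv(uv, tiling, offset):
    return (tiling[0] * uv[0] + offset[0],
            tiling[1] * uv[1] + offset[1])

# Tiling (2, 2) repeats the texture twice per direction (with wrap
# mode "repeat"); Offset (0.5, 0) shifts it by half a texture width.
print(transform_uv((0.25, 0.5), (2.0, 2.0), (0.5, 0.0)))  # (1.0, 1.0)
```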

And just for completeness, here is the complete shader code with this feature:

Shader "GLSL shader with single texture" { Properties { _MainTex ("Texture Image", 2D) = "white" {} } SubShader { Pass { GLSLPROGRAM uniform sampler2D _MainTex; uniform vec4 _MainTex_ST; // tiling and offset parameters of property varying vec4 textureCoordinates; #ifdef VERTEX void main() { textureCoordinates = gl_MultiTexCoord0; gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex; } #endif #ifdef FRAGMENT void main() { gl_FragColor = texture2D(_MainTex, _MainTex_ST.xy * textureCoordinates.xy + _MainTex_ST.zw); // textureCoordinates are multiplied with the tiling // parameters and the offset parameters are added } #endif ENDGLSL } } // The definition of a fallback shader should be commented out // during development: // Fallback "Unlit/Texture" }

### Summary[edit]

You have reached the end of one of the most important tutorials. We have looked at:

- How to import a texture image and how to attach it to a texture property of a shader.
- How a vertex shader and a fragment shader work together to map a texture image onto a mesh.
- How Unity's tiling and offset parameters for textures work and how to implement them.

### Further Reading[edit]

If you want to know more

- about the data flow in and out of vertex shaders and fragment shaders (i.e. vertex attributes, varyings, etc.), you should read the description in Section “OpenGL ES 2.0 Pipeline”.
- about the interpolation of varying variables for the fragment shader, you should read the discussion in Section “Rasterization”.

# Lighting Textured Surfaces[edit]

This tutorial covers **per-vertex lighting of textured surfaces**.

It combines the shader code of Section “Textured Spheres” and Section “Specular Highlights” to compute lighting with a diffuse material color determined by a texture. If you haven't read those sections, this would be a very good opportunity to read them.

### Texturing and Diffuse Per-Vertex Lighting[edit]

In Section “Textured Spheres”, the texture color was used as the output of the fragment shader. However, it is also possible to use the texture color as any of the parameters in lighting computations, in particular the material constant for diffuse reflection k_diffuse, which was introduced in Section “Diffuse Reflection”. It appears in the diffuse part of the Phong reflection model:

`I_diffuse = I_incoming k_diffuse max(0, N·L)`

where this equation is used with different material constants k_diffuse for the three color components red, green, and blue. By using a texture to determine these material constants, they can vary over the surface.

### Shader Code[edit]

In comparison to the per-vertex lighting in Section “Specular Highlights”, the vertex shader here computes two varying colors: `diffuseColor` is multiplied with the texture color in the fragment shader, while `specularColor` is just the specular term, which shouldn't be multiplied with the texture color. This makes perfect sense, but for historical reasons (i.e. older graphics hardware that was less capable) this is sometimes referred to as a “separate specular color”; in fact, Unity's ShaderLab has an option called “SeparateSpecular” to activate or deactivate it.

Note that a property `_Color` is included, which is multiplied (component-wise) with all parts of the `diffuseColor`; thus, it acts as a useful color filter to tint or shade the texture color. Moreover, a property with this name is required to make the fallback shader work (see also the discussion of fallback shaders in Section “Diffuse Reflection”).

Shader "GLSL per-vertex lighting with texture" { Properties { _MainTex ("Texture For Diffuse Material Color", 2D) = "white" {} _Color ("Overall Diffuse Color Filter", Color) = (1,1,1,1) _SpecColor ("Specular Material Color", Color) = (1,1,1,1) _Shininess ("Shininess", Float) = 10 } SubShader { Pass { Tags { "LightMode" = "ForwardBase" } // pass for ambient light and first light source GLSLPROGRAM // User-specified properties uniform sampler2D _MainTex; uniform vec4 _Color; uniform vec4 _SpecColor; uniform float _Shininess; // The following built-in uniforms (except _LightColor0) // are also defined in "UnityCG.glslinc", // i.e. one could #include "UnityCG.glslinc" uniform vec3 _WorldSpaceCameraPos; // camera position in world space uniform mat4 _Object2World; // model matrix uniform mat4 _World2Object; // inverse model matrix uniform vec4 _WorldSpaceLightPos0; // direction to or position of light source uniform vec4 _LightColor0; // color of light source (from "Lighting.cginc") varying vec3 diffuseColor; // diffuse Phong lighting computed in the vertex shader varying vec3 specularColor; // specular Phong lighting computed in the vertex shader varying vec4 textureCoordinates; #ifdef VERTEX void main() { mat4 modelMatrix = _Object2World; mat4 modelMatrixInverse = _World2Object; // unity_Scale.w // is unnecessary because we normalize vectors vec3 normalDirection = normalize(vec3( vec4(gl_Normal, 0.0) * modelMatrixInverse)); vec3 viewDirection = normalize(vec3( vec4(_WorldSpaceCameraPos, 1.0) - modelMatrix * gl_Vertex)); vec3 lightDirection; float attenuation; if (0.0 == _WorldSpaceLightPos0.w) // directional light? 
{ attenuation = 1.0; // no attenuation lightDirection = normalize(vec3(_WorldSpaceLightPos0)); } else // point or spot light { vec3 vertexToLightSource = vec3(_WorldSpaceLightPos0 - modelMatrix * gl_Vertex); float distance = length(vertexToLightSource); attenuation = 1.0 / distance; // linear attenuation lightDirection = normalize(vertexToLightSource); } vec3 ambientLighting = vec3(gl_LightModel.ambient) * vec3(_Color); vec3 diffuseReflection = attenuation * vec3(_LightColor0) * vec3(_Color) * max(0.0, dot(normalDirection, lightDirection)); vec3 specularReflection; if (dot(normalDirection, lightDirection) < 0.0) // light source on the wrong side? { specularReflection = vec3(0.0, 0.0, 0.0); // no specular reflection } else // light source on the right side { specularReflection = attenuation * vec3(_LightColor0) * vec3(_SpecColor) * pow(max(0.0, dot( reflect(-lightDirection, normalDirection), viewDirection)), _Shininess); } diffuseColor = ambientLighting + diffuseReflection; specularColor = specularReflection; textureCoordinates = gl_MultiTexCoord0; gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex; } #endif #ifdef FRAGMENT void main() { gl_FragColor = vec4(diffuseColor * vec3(texture2D(_MainTex, vec2(textureCoordinates))) + specularColor, 1.0); } #endif ENDGLSL } Pass { Tags { "LightMode" = "ForwardAdd" } // pass for additional light sources Blend One One // additive blending GLSLPROGRAM // User-specified properties uniform sampler2D _MainTex; uniform vec4 _Color; uniform vec4 _SpecColor; uniform float _Shininess; // The following built-in uniforms (except _LightColor0) // are also defined in "UnityCG.glslinc", // i.e. 
one could #include "UnityCG.glslinc" uniform vec3 _WorldSpaceCameraPos; // camera position in world space uniform mat4 _Object2World; // model matrix uniform mat4 _World2Object; // inverse model matrix uniform vec4 _WorldSpaceLightPos0; // direction to or position of light source uniform vec4 _LightColor0; // color of light source (from "Lighting.cginc") varying vec3 diffuseColor; // diffuse Phong lighting computed in the vertex shader varying vec3 specularColor; // specular Phong lighting computed in the vertex shader varying vec4 textureCoordinates; #ifdef VERTEX void main() { mat4 modelMatrix = _Object2World; mat4 modelMatrixInverse = _World2Object; // unity_Scale.w // is unnecessary because we normalize vectors vec3 normalDirection = normalize(vec3( vec4(gl_Normal, 0.0) * modelMatrixInverse)); vec3 viewDirection = normalize(vec3( vec4(_WorldSpaceCameraPos, 1.0) - modelMatrix * gl_Vertex)); vec3 lightDirection; float attenuation; if (0.0 == _WorldSpaceLightPos0.w) // directional light? { attenuation = 1.0; // no attenuation lightDirection = normalize(vec3(_WorldSpaceLightPos0)); } else // point or spot light { vec3 vertexToLightSource = vec3(_WorldSpaceLightPos0 - modelMatrix * gl_Vertex); float distance = length(vertexToLightSource); attenuation = 1.0 / distance; // linear attenuation lightDirection = normalize(vertexToLightSource); } vec3 diffuseReflection = attenuation * vec3(_LightColor0) * vec3(_Color) * max(0.0, dot(normalDirection, lightDirection)); vec3 specularReflection; if (dot(normalDirection, lightDirection) < 0.0) // light source on the wrong side? 
{ specularReflection = vec3(0.0, 0.0, 0.0); // no specular reflection } else // light source on the right side { specularReflection = attenuation * vec3(_LightColor0) * vec3(_SpecColor) * pow(max(0.0, dot( reflect(-lightDirection, normalDirection), viewDirection)), _Shininess); } diffuseColor = diffuseReflection; specularColor = specularReflection; textureCoordinates = gl_MultiTexCoord0; gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex; } #endif #ifdef FRAGMENT void main() { gl_FragColor = vec4(diffuseColor * vec3(texture2D(_MainTex, vec2(textureCoordinates))) + specularColor, 1.0); } #endif ENDGLSL } } // The definition of a fallback shader should be commented out // during development: // Fallback "Specular" }

In order to assign a texture image to this shader, you should follow the steps discussed in Section “Textured Spheres”.

### Summary[edit]

Congratulations, you have reached the end. We have looked at:

- How texturing and per-vertex lighting are usually combined.
- What a “separate specular color” is.

### Further Reading[edit]

If you still want to know more

- about fallback shaders or the diffuse reflection term of the Phong reflection model, you should read Section “Diffuse Reflection”.
- about per-vertex lighting or the rest of the Phong reflection model, i.e. the ambient and the specular term, you should read Section “Specular Highlights”.
- about the basics of texturing, you should read Section “Textured Spheres”.

# Glossy Textures[edit]

This tutorial covers **per-pixel lighting of partially glossy, textured surfaces**.

It combines the shader code of Section “Textured Spheres” and Section “Smooth Specular Highlights” to compute per-pixel lighting with a material color for diffuse reflection that is determined by the RGB components of a texture and an intensity of the specular reflection that is determined by the A component of the same texture. If you haven't read those sections, this would be a very good opportunity to read them.

### Gloss Mapping[edit]

In Section “Lighting Textured Surfaces”, the material constant for the diffuse reflection was determined by the RGB components of a texture image. Here we extend this technique and determine the strength of the specular reflection by the A (alpha) component of the same texture image. Using only one texture offers a significant performance advantage, in particular because an RGBA texture lookup is under certain circumstances just as expensive as an RGB texture lookup.

If the “gloss” of a texture image (i.e. the strength of the specular reflection) is encoded in the A (alpha) component of an RGBA texture image, we can simply multiply the material constant for the specular reflection k_specular with the alpha component of the texture image. k_specular was introduced in Section “Specular Highlights” and appears in the specular reflection term of the Phong reflection model:

`I_specular = I_incoming k_specular max(0, R·V)^shininess`

If k_specular is multiplied with the alpha component of the texture image, this term reaches its maximum (i.e. the surface is glossy) where alpha is 1, and it is 0 (i.e. the surface is not glossy at all) where alpha is 0.
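The effect of the alpha component on the specular term can be sketched with a small Python illustration (the function and parameter names are made up for this sketch and simplified compared to the shader code below, which also includes attenuation and the light color):

```python
# Simplified sketch of gloss mapping: the Phong specular term is
# scaled by the texture's alpha component.

def specular_term(k_specular, n_dot_l, r_dot_v, shininess, alpha):
    if n_dot_l < 0.0:  # light source on the wrong side of the surface
        return 0.0
    return alpha * k_specular * max(0.0, r_dot_v) ** shininess

print(specular_term(1.0, 0.5, 0.9, 10.0, 1.0))  # glossy: about 0.349
print(specular_term(1.0, 0.5, 0.9, 10.0, 0.0))  # alpha 0: 0.0
```

For the Earth texture used in this tutorial, `alpha` would be replaced by `1.0 - alpha`, because here it is the water (alpha 0) that should be glossy.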

### Shader Code for Per-Pixel Lighting[edit]

The shader code is a combination of the per-pixel lighting from Section “Smooth Specular Highlights” and the texturing from Section “Textured Spheres”. Similarly to Section “Lighting Textured Surfaces”, the RGB components of the texture color in `textureColor` are multiplied with the diffuse material color `_Color`.

In the particular texture image to the left, the alpha component is 0 for water and 1 for land. However, it should be the water that is glossy and the land that isn't. Thus, with this particular image, we should multiply the specular material color with `(1.0 - textureColor.a)`. On the other hand, usual gloss maps would require a multiplication with `textureColor.a`. (Note how easy it is to make this kind of change to a shader program.)

Shader "GLSL per-pixel lighting with texture" { Properties { _MainTex ("RGBA Texture For Material Color", 2D) = "white" {} _Color ("Diffuse Material Color", Color) = (1,1,1,1) _SpecColor ("Specular Material Color", Color) = (1,1,1,1) _Shininess ("Shininess", Float) = 10 } SubShader { Pass { Tags { "LightMode" = "ForwardBase" } // pass for ambient light and first light source GLSLPROGRAM // User-specified properties uniform sampler2D _MainTex; uniform vec4 _Color; uniform vec4 _SpecColor; uniform float _Shininess; // The following built-in uniforms (except _LightColor0) // are also defined in "UnityCG.glslinc", // i.e. one could #include "UnityCG.glslinc" uniform vec3 _WorldSpaceCameraPos; // camera position in world space uniform mat4 _Object2World; // model matrix uniform mat4 _World2Object; // inverse model matrix uniform vec4 _WorldSpaceLightPos0; // direction to or position of light source uniform vec4 _LightColor0; // color of light source (from "Lighting.cginc") varying vec4 position; // position of the vertex (and fragment) in world space varying vec3 varyingNormalDirection; // surface normal vector in world space varying vec4 textureCoordinates; #ifdef VERTEX void main() { mat4 modelMatrix = _Object2World; mat4 modelMatrixInverse = _World2Object; // unity_Scale.w // is unnecessary because we normalize vectors position = modelMatrix * gl_Vertex; varyingNormalDirection = normalize(vec3( vec4(gl_Normal, 0.0) * modelMatrixInverse)); gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex; textureCoordinates = gl_MultiTexCoord0; } #endif #ifdef FRAGMENT void main() { vec3 normalDirection = normalize(varyingNormalDirection); vec3 viewDirection = normalize(_WorldSpaceCameraPos - vec3(position)); vec3 lightDirection; float attenuation; vec4 textureColor = texture2D(_MainTex, vec2(textureCoordinates)); if (0.0 == _WorldSpaceLightPos0.w) // directional light? 
{ attenuation = 1.0; // no attenuation lightDirection = normalize(vec3(_WorldSpaceLightPos0)); } else // point or spot light { vec3 vertexToLightSource = vec3(_WorldSpaceLightPos0 - position); float distance = length(vertexToLightSource); attenuation = 1.0 / distance; // linear attenuation lightDirection = normalize(vertexToLightSource); } vec3 ambientLighting = vec3(gl_LightModel.ambient) * vec3(_Color) * vec3(textureColor); vec3 diffuseReflection = attenuation * vec3(_LightColor0) * vec3(_Color) * vec3(textureColor) * max(0.0, dot(normalDirection, lightDirection)); vec3 specularReflection; if (dot(normalDirection, lightDirection) < 0.0) // light source on the wrong side? { specularReflection = vec3(0.0, 0.0, 0.0); // no specular reflection } else // light source on the right side { specularReflection = attenuation * vec3(_LightColor0) * vec3(_SpecColor) * (1.0 - textureColor.a) // for usual gloss maps: "... * textureColor.a" * pow(max(0.0, dot( reflect(-lightDirection, normalDirection), viewDirection)), _Shininess); } gl_FragColor = vec4(ambientLighting + diffuseReflection + specularReflection, 1.0); } #endif ENDGLSL } Pass { Tags { "LightMode" = "ForwardAdd" } // pass for additional light sources Blend One One // additive blending GLSLPROGRAM // User-specified properties uniform sampler2D _MainTex; uniform vec4 _Color; uniform vec4 _SpecColor; uniform float _Shininess; // The following built-in uniforms (except _LightColor0) // are also defined in "UnityCG.glslinc", // i.e. 
one could #include "UnityCG.glslinc" uniform vec3 _WorldSpaceCameraPos; // camera position in world space uniform mat4 _Object2World; // model matrix uniform mat4 _World2Object; // inverse model matrix uniform vec4 _WorldSpaceLightPos0; // direction to or position of light source uniform vec4 _LightColor0; // color of light source (from "Lighting.cginc") varying vec4 position; // position of the vertex (and fragment) in world space varying vec3 varyingNormalDirection; // surface normal vector in world space varying vec4 textureCoordinates; #ifdef VERTEX void main() { mat4 modelMatrix = _Object2World; mat4 modelMatrixInverse = _World2Object; // unity_Scale.w // is unnecessary because we normalize vectors position = modelMatrix * gl_Vertex; varyingNormalDirection = normalize(vec3( vec4(gl_Normal, 0.0) * modelMatrixInverse)); gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex; textureCoordinates = gl_MultiTexCoord0; } #endif #ifdef FRAGMENT void main() { vec3 normalDirection = normalize(varyingNormalDirection); vec3 viewDirection = normalize(_WorldSpaceCameraPos - vec3(position)); vec3 lightDirection; float attenuation; vec4 textureColor = texture2D(_MainTex, vec2(textureCoordinates)); if (0.0 == _WorldSpaceLightPos0.w) // directional light? { attenuation = 1.0; // no attenuation lightDirection = normalize(vec3(_WorldSpaceLightPos0)); } else // point or spot light { vec3 vertexToLightSource = vec3(_WorldSpaceLightPos0 - position); float distance = length(vertexToLightSource); attenuation = 1.0 / distance; // linear attenuation lightDirection = normalize(vertexToLightSource); } vec3 diffuseReflection = attenuation * vec3(_LightColor0) * vec3(_Color) * vec3(textureColor) * max(0.0, dot(normalDirection, lightDirection)); vec3 specularReflection; if (dot(normalDirection, lightDirection) < 0.0) // light source on the wrong side? 
{ specularReflection = vec3(0.0, 0.0, 0.0); // no specular reflection } else // light source on the right side { specularReflection = attenuation * vec3(_LightColor0) * vec3(_SpecColor) * (1.0 - textureColor.a) // for usual gloss maps: "... * textureColor.a" * pow(max(0.0, dot( reflect(-lightDirection, normalDirection), viewDirection)), _Shininess); } gl_FragColor = vec4(diffuseReflection + specularReflection, 1.0); } #endif ENDGLSL } } // The definition of a fallback shader should be commented out // during development: // Fallback "Specular" }

A useful modification of this shader for the particular texture image above would be to set the diffuse material color to a dark blue where the alpha component is 0.

### Shader Code for Per-Vertex Lighting[edit]

As discussed in Section “Smooth Specular Highlights”, specular highlights are usually not rendered very well with per-vertex lighting. Sometimes, however, there is no choice because of performance limitations. In order to include gloss mapping in the shader code of Section “Lighting Textured Surfaces”, the fragment shaders of both passes should be replaced with this code:

#ifdef FRAGMENT void main() { vec4 textureColor = texture2D(_MainTex, vec2(textureCoordinates)); gl_FragColor = vec4(diffuseColor * vec3(textureColor) + specularColor * (1.0 - textureColor.a), 1.0); } #endif

Note that a usual gloss map would require a multiplication with `textureColor.a` instead of `(1.0 - textureColor.a)`.

### Summary[edit]

Congratulations! You finished an important tutorial about gloss mapping. We have looked at:

- What gloss mapping is.
- How to implement it for per-pixel lighting.
- How to implement it for per-vertex lighting.

### Further Reading[edit]

If you still want to learn more

- about per-pixel lighting (without texturing), you should read Section “Smooth Specular Highlights”.
- about texturing, you should read Section “Textured Spheres”.
- about per-vertex lighting with texturing, you should read Section “Lighting Textured Surfaces”.

# Transparent Textures[edit]

This tutorial covers various common uses of **alpha texture maps**, i.e. RGBA texture images with an A (alpha) component that specifies the opacity of texels.

It combines the shader code of Section “Textured Spheres” with concepts that were introduced in Section “Cutaways” and Section “Transparency”.

If you haven't read these tutorials, this would be a very good opportunity to read them.

### Discarding Transparent Fragments[edit]

Let's start with discarding fragments as explained in Section “Cutaways”. Follow the steps described in Section “Textured Spheres” and assign the image to the left to the material of a sphere with the following shader:

Shader "GLSL texturing with alpha discard" { Properties { _MainTex ("RGBA Texture Image", 2D) = "white" {} _Cutoff ("Alpha Cutoff", Float) = 0.5 } SubShader { Pass { Cull Off // since the front is partially transparent, // we shouldn't cull the back GLSLPROGRAM uniform sampler2D _MainTex; uniform float _Cutoff; varying vec4 textureCoordinates; #ifdef VERTEX void main() { textureCoordinates = gl_MultiTexCoord0; gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex; } #endif #ifdef FRAGMENT void main() { gl_FragColor = texture2D(_MainTex, vec2(textureCoordinates)); if (gl_FragColor.a < _Cutoff) // alpha value less than user-specified threshold? { discard; // yes: discard this fragment } } #endif ENDGLSL } } // The definition of a fallback shader should be commented out // during development: // Fallback "Unlit/Transparent Cutout" }

The fragment shader reads the RGBA texture and compares the alpha value against a user-specified threshold. If the alpha value is less than the threshold, the fragment is discarded and the surface appears transparent.

### Alpha Testing[edit]

The same effect as described above can be implemented with an alpha test. The advantage of the alpha test is that it also runs on older hardware that doesn't support GLSL. Here is the code, which produces more or less the same result as the shader above:

Shader "GLSL texturing with alpha test" { Properties { _MainTex ("RGBA Texture Image", 2D) = "white" {} _Cutoff ("Alpha Cutoff", Float) = 0.5 } SubShader { Pass { Cull Off // since the front is partially transparent, // we shouldn't cull the back AlphaTest Greater [_Cutoff] // specify alpha test: // fragment passes if alpha is greater than _Cutoff GLSLPROGRAM uniform sampler2D _MainTex; uniform float _Cutoff; varying vec4 textureCoordinates; #ifdef VERTEX void main() { textureCoordinates = gl_MultiTexCoord0; gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex; } #endif #ifdef FRAGMENT void main() { gl_FragColor = texture2D(_MainTex, vec2(textureCoordinates)); } #endif ENDGLSL } } // The definition of a fallback shader should be commented out // during development: // Fallback "Unlit/Transparent Cutout" }

Here, no explicit `discard` instruction is necessary, but the alpha test has to be configured to pass only those fragments with an alpha value greater than the `_Cutoff` property; all other fragments are discarded. More details about the alpha test are available in Unity's ShaderLab documentation.

Note that the alpha test and the `discard` instruction are rather slow on some platforms, in particular on mobile devices. Thus, blending is often a more efficient alternative.

### Blending[edit]

The Section “Transparency” described how to render semitransparent objects with alpha blending. Combining this with an RGBA texture results in this code:

Shader "GLSL texturing with alpha blending" { Properties { _MainTex ("RGBA Texture Image", 2D) = "white" {} } SubShader { Tags {"Queue" = "Transparent"} Pass { Cull Front // first render the back faces ZWrite Off // don't write to depth buffer // in order not to occlude other objects Blend SrcAlpha OneMinusSrcAlpha // blend based on the fragment's alpha value GLSLPROGRAM uniform sampler2D _MainTex; varying vec4 textureCoordinates; #ifdef VERTEX void main() { textureCoordinates = gl_MultiTexCoord0; gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex; } #endif #ifdef FRAGMENT void main() { gl_FragColor = texture2D(_MainTex, vec2(textureCoordinates)); } #endif ENDGLSL } Pass { Cull Back // now render the front faces ZWrite Off // don't write to depth buffer // in order not to occlude other objects Blend SrcAlpha OneMinusSrcAlpha // blend based on the fragment's alpha value GLSLPROGRAM uniform sampler2D _MainTex; varying vec4 textureCoordinates; #ifdef VERTEX void main() { textureCoordinates = gl_MultiTexCoord0; gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex; } #endif #ifdef FRAGMENT void main() { gl_FragColor = texture2D(_MainTex, vec2(textureCoordinates)); } #endif ENDGLSL } } // The definition of a fallback shader should be commented out // during development: // Fallback "Unlit/Transparent" }

Note that all texels with an alpha value of 0 are black in this particular texture image. In fact, the colors in this texture image are “premultiplied” with their alpha value. (Such colors are also called “opacity-weighted.”) Thus, for this particular image, we should actually specify the blend equation for premultiplied colors in order to avoid another multiplication of the colors with their alpha value in the blend equation. Therefore, an improvement of the shader (for this particular texture image) is to employ the following blend specification in both passes:

`Blend One OneMinusSrcAlpha`
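Why premultiplied colors call for this blend equation can be checked with a small Python sketch (illustrative, not Unity code): blending an ordinary color with source factor `SrcAlpha` gives the same result as blending the corresponding premultiplied color with source factor `One`.

```python
# Sketch: blending with ordinary vs. premultiplied (opacity-weighted)
# colors. A premultiplied source color is already scaled by alpha, so
# its source blend factor must be One; otherwise the color would be
# multiplied with alpha twice.

def blend_src_alpha(src_rgb, alpha, dst_rgb):
    # Blend SrcAlpha OneMinusSrcAlpha
    return tuple(alpha * s + (1.0 - alpha) * d
                 for s, d in zip(src_rgb, dst_rgb))

def blend_premultiplied(src_rgb, alpha, dst_rgb):
    # Blend One OneMinusSrcAlpha
    return tuple(s + (1.0 - alpha) * d
                 for s, d in zip(src_rgb, dst_rgb))

color, alpha = (0.8, 0.4, 0.0), 0.5
premultiplied = tuple(alpha * c for c in color)
dst = (0.0, 0.0, 1.0)

print(blend_src_alpha(color, alpha, dst))              # (0.4, 0.2, 0.5)
print(blend_premultiplied(premultiplied, alpha, dst))  # (0.4, 0.2, 0.5)
```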

### Blending with Customized Colors[edit]

We should not end this tutorial without a somewhat more practical application of the presented techniques. To the left is an image of a globe with semitransparent blue oceans, which I found on Wikimedia Commons. There is some lighting (or silhouette enhancement) going on, which I didn't try to reproduce. Instead, I only tried to reproduce the basic idea of semitransparent oceans with the following shader, which ignores the RGB colors of the texture map and replaces them by specific colors based on the alpha value:

Shader "GLSL semitransparent colors based on alpha" { Properties { _MainTex ("RGBA Texture Image", 2D) = "white" {} } SubShader { Tags {"Queue" = "Transparent"} Pass { Cull Front // first render the back faces ZWrite Off // don't write to depth buffer // in order not to occlude other objects Blend SrcAlpha OneMinusSrcAlpha // blend based on the fragment's alpha value GLSLPROGRAM uniform sampler2D _MainTex; varying vec4 textureCoordinates; #ifdef VERTEX void main() { textureCoordinates = gl_MultiTexCoord0; gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex; } #endif #ifdef FRAGMENT void main() { gl_FragColor = texture2D(_MainTex, vec2(textureCoordinates)); if (gl_FragColor.a > 0.5) // opaque back face? { gl_FragColor = vec4(0.0, 0.0, 0.2, 1.0); // opaque dark blue } else // transparent back face? { gl_FragColor = vec4(0.0, 0.0, 1.0, 0.3); // semitransparent dark blue } } #endif ENDGLSL } Pass { Cull Back // now render the front faces ZWrite Off // don't write to depth buffer // in order not to occlude other objects Blend SrcAlpha OneMinusSrcAlpha // blend based on the fragment's alpha value GLSLPROGRAM uniform sampler2D _MainTex; varying vec4 textureCoordinates; #ifdef VERTEX void main() { textureCoordinates = gl_MultiTexCoord0; gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex; } #endif #ifdef FRAGMENT void main() { gl_FragColor = texture2D(_MainTex, vec2(textureCoordinates)); if (gl_FragColor.a > 0.5) // opaque front face? { gl_FragColor = vec4(0.0, 1.0, 0.0, 1.0); // opaque green } else // transparent front face { gl_FragColor = vec4(0.0, 0.0, 1.0, 0.3); // semitransparent dark blue } } #endif ENDGLSL } } // The definition of a fallback shader should be commented out // during development: // Fallback "Unlit/Transparent" }

Of course, it would be interesting to add lighting and silhouette enhancement to this shader. One could also change the opaque, green color in order to take the texture color into account, e.g. with:

`gl_FragColor = vec4(0.5 * gl_FragColor.r, 2.0 * gl_FragColor.g, 0.5 * gl_FragColor.b, 1.0);`

which emphasizes the green component by multiplying it by 2 and dims the red and blue components by multiplying them by 0.5. However, this results in an oversaturated green that is clamped to the maximum intensity. This can be avoided by halving the difference of the green component from the maximum intensity 1. This difference is `1.0 - gl_FragColor.g`; half of it is `0.5 * (1.0 - gl_FragColor.g)`; and the value corresponding to this reduced distance from the maximum intensity is `1.0 - 0.5 * (1.0 - gl_FragColor.g)`. Thus, in order to avoid oversaturation of green, we could use (instead of the opaque green color):

`gl_FragColor = vec4(0.5 * gl_FragColor.r, 1.0 - 0.5 * (1.0 - gl_FragColor.g), 0.5 * gl_FragColor.b, 1.0);`
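The arithmetic of the two transformations can be checked outside the shader; the following sketch in plain Python (illustrative, not shader code; the function names are made up) applies both variants to a fairly green texel:

```python
# Illustrative check of the two color transformations discussed above.
def emphasize_green_naive(r, g, b):
    # naive version: green may exceed the maximum intensity 1.0
    return (0.5 * r, 2.0 * g, 0.5 * b)

def emphasize_green_safe(r, g, b):
    # halve the distance of green from the maximum intensity instead
    return (0.5 * r, 1.0 - 0.5 * (1.0 - g), 0.5 * b)

print(emphasize_green_naive(0.2, 0.8, 0.2))  # green becomes 1.6 (oversaturated)
print(emphasize_green_safe(0.2, 0.8, 0.2))   # green becomes 0.9 (stays below 1.0)
```

Note that the safe version maps a green component of 1.0 to 1.0, so a fully saturated green stays unchanged instead of being clamped.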

In practice, one has to try various possibilities for such color transformations. To this end, the use of numeric shader properties (e.g. for the factors 0.5 in the line above) is particularly useful to interactively explore the possibilities.

### Summary[edit]

Congratulations! You have reached the end of this rather long tutorial. We have looked at:

- How discarding fragments can be combined with alpha texture maps.
- How the alpha test can be used to achieve the same effect.
- How alpha texture maps can be used for blending.
- How alpha texture maps can be used to determine colors.

### Further Reading[edit]

If you still want to know more

- about texturing, you should read Section “Textured Spheres”.
- about discarding fragments, you should read Section “Cutaways”.
- about the alpha test, you should read Unity's ShaderLab documentation: Alpha testing.
- about blending, you should read Section “Transparency”.

# Layers of Textures[edit]

This tutorial introduces **multitexturing**, i.e. the use of multiple texture images in a shader.

It extends the shader code of Section “Textured Spheres” to multiple textures and shows a way of combining them. If you haven't read that tutorial, this would be a very good opportunity to read it.

### Layers of Surfaces[edit]

Many real surfaces (e.g. the human skin illustrated in the image to the left) consist of several layers of different colors, transparencies, reflectivities, etc. If the topmost layer is opaque and doesn't transmit any light, this doesn't really matter for rendering the surface. However, in many cases the topmost layer is (semi)transparent and therefore an accurate rendering of the surface has to take multiple layers into account.

In fact, the specular reflection that is included in the Phong reflection model (see Section “Specular Highlights”) often corresponds to a transparent layer that reflects light: sweat on human skin, wax on fruits, transparent plastics with embedded pigment particles, etc. On the other hand, the diffuse reflection corresponds to the layer(s) below the topmost transparent layer.

Lighting such layered surfaces doesn't require a geometric model of the layers: they can be represented by a single, infinitely thin polygon mesh. However, the lighting computation has to compute different reflections for different layers and has to take the transmission of light between layers into account (both when light enters the layer and when it exits the layer). Examples of this approach are included in the “Dawn” demo by Nvidia (see Chapter 3 of the book “GPU Gems”, which is available online) and the “Human Head” demo by Nvidia (see Chapter 14 of the book “GPU Gems 3”, which is also available online).

A full description of these processes is beyond the scope of this tutorial. Suffice it to say that layers are often associated with texture images to specify their characteristics. Here we just show how to use two textures and one particular way of combining them. The example is in fact not related to layers and therefore illustrates that multitexturing has more applications than layers of surfaces.

### Lit and Unlit Earth[edit]

Due to human activities, the unlit side of the Earth is not completely dark. Instead, artificial lights mark the position and extension of cities as shown in the image to the left. Therefore, diffuse lighting of the Earth should not just dim the texture image for the sunlit surface but actually blend it to the unlit texture image. Note that the sunlit Earth is far brighter than human-made lights on the unlit side; however, we reduce this contrast in order to show off the nighttime texture.

The shader code extends the code from Section “Textured Spheres” to two texture images and uses the computation described in Section “Diffuse Reflection” for a single, directional light source:

I_diffuse = I_incoming · k_diffuse · max(0, **N**·**L**)

According to this equation, the level of diffuse lighting `levelOfLighting` is max(0, **N**·**L**). We then blend the colors of the daytime texture and the nighttime texture based on `levelOfLighting`. This could be achieved by multiplying the daytime color with `levelOfLighting` and multiplying the nighttime color with `1.0 - levelOfLighting` before adding them to determine the fragment's color. Alternatively, the built-in GLSL function `mix` can be used (`mix(a, b, w) = b*w + a*(1.0-w)`), which is likely to be more efficient. Thus, the fragment shader could be:

#ifdef FRAGMENT

void main()
{
   vec4 nighttimeColor = _Color *
      texture2D(_MainTex, vec2(textureCoordinates));
   vec4 daytimeColor = _LightColor0 *
      texture2D(_DecalTex, vec2(textureCoordinates));
   gl_FragColor = mix(nighttimeColor, daytimeColor, levelOfLighting);
      // = daytimeColor * levelOfLighting
      // + nighttimeColor * (1.0 - levelOfLighting)
}

#endif
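The equivalence of `mix(a, b, w)` and the sum of the two weighted colors can be verified with a quick sketch in plain Python (illustrative, not shader code; the color values are made up):

```python
# Check that mix(a, b, w) equals b*w + a*(1.0 - w), component-wise
# for 4-component colors, as the GLSL built-in mix does.
def mix(a, b, w):
    return tuple(ai * (1.0 - w) + bi * w for ai, bi in zip(a, b))

night = (0.0, 0.0, 0.1, 1.0)   # hypothetical nighttime color
day = (0.8, 0.7, 0.5, 1.0)     # hypothetical daytime color

# full lighting selects the daytime color, zero lighting the nighttime color
assert mix(night, day, 1.0) == day
assert mix(night, day, 0.0) == night
print(mix(night, day, 0.5))    # halfway blend of the two colors
```

This also makes explicit why the two weights sum to one: the blend never brightens or darkens beyond the two input colors.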

Note that this blending is very similar to the alpha blending that was discussed in Section “Transparency” except that we perform the blending inside a fragment shader and use `levelOfLighting` instead of the alpha component (i.e. the opacity) of the texture that should be blended “over” the other texture. In fact, if `_DecalTex` specified an alpha component (see Section “Transparent Textures”), we could use this alpha component to blend `_DecalTex` over `_MainTex`. This is actually what Unity's standard `Decal` shader does, and it corresponds to a partially transparent layer on top of an opaque layer that is visible where the topmost layer is transparent.

### Complete Shader Code[edit]

The names of the properties of the shader were chosen to agree with the property names of the fallback shader — in this case the `Decal` shader (note that the fallback `Decal` shader and the standard `Decal` shader appear to use the two textures in opposite ways). Also, an additional property `_Color` is introduced and multiplied (component-wise) with the texture color of the nighttime texture in order to control its overall brightness. Furthermore, the color of the light source `_LightColor0` is multiplied (also component-wise) with the color of the daytime texture in order to take colored light sources into account.

Shader "GLSL multitexturing of Earth" {
   Properties {
      _DecalTex ("Daytime Earth", 2D) = "white" {}
      _MainTex ("Nighttime Earth", 2D) = "white" {}
      _Color ("Nighttime Color Filter", Color) = (1,1,1,1)
   }
   SubShader {
      Pass {
         Tags { "LightMode" = "ForwardBase" }
            // pass for the first, directional light
         
         GLSLPROGRAM
         
         uniform sampler2D _MainTex;
         uniform sampler2D _DecalTex;
         uniform vec4 _Color;
         
         // The following built-in uniforms (except _LightColor0)
         // are also defined in "UnityCG.glslinc",
         // i.e. one could #include "UnityCG.glslinc"
         uniform mat4 _Object2World; // model matrix
         uniform mat4 _World2Object; // inverse model matrix
         uniform vec4 _WorldSpaceLightPos0;
            // direction to or position of light source
         uniform vec4 _LightColor0;
            // color of light source (from "Lighting.cginc")
         
         varying float levelOfLighting;
            // level of diffuse lighting computed in vertex shader
         varying vec4 textureCoordinates;
         
         #ifdef VERTEX
         
         void main()
         {
            mat4 modelMatrix = _Object2World;
            mat4 modelMatrixInverse = _World2Object; // unity_Scale.w
               // is unnecessary because we normalize vectors
            
            vec3 normalDirection = normalize(vec3(
               vec4(gl_Normal, 0.0) * modelMatrixInverse));
            vec3 lightDirection;
            
            if (0.0 == _WorldSpaceLightPos0.w) // directional light?
            {
               lightDirection = normalize(vec3(_WorldSpaceLightPos0));
            }
            else // point or spot light
            {
               lightDirection = vec3(0.0, 0.0, 0.0);
                  // ignore other light sources
            }
            
            levelOfLighting =
               max(0.0, dot(normalDirection, lightDirection));
            textureCoordinates = gl_MultiTexCoord0;
            gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;
         }
         
         #endif
         
         #ifdef FRAGMENT
         
         void main()
         {
            vec4 nighttimeColor = _Color *
               texture2D(_MainTex, vec2(textureCoordinates));
            vec4 daytimeColor = _LightColor0 *
               texture2D(_DecalTex, vec2(textureCoordinates));
            gl_FragColor =
               mix(nighttimeColor, daytimeColor, levelOfLighting);
               // = daytimeColor * levelOfLighting
               // + nighttimeColor * (1.0 - levelOfLighting)
         }
         
         #endif
         
         ENDGLSL
      }
   }
   // The definition of a fallback shader should be commented out
   // during development:
   // Fallback "Decal"
}

When you run this shader, make sure that you have an activated directional light source in your scene.

### Summary[edit]

Congratulations! You have reached the end of the last tutorial on basic texturing. We have looked at:

- How layers of surfaces can influence the appearance of materials (e.g. human skin, waxed fruits, plastics, etc.)
- How artificial lights on the unlit side can be taken into account when texturing a sphere representing the Earth.
- How to implement this technique in a shader.
- How this is related to blending an alpha texture over a second opaque texture.

### Further Reading[edit]

If you still want to know more

- about basic texturing, you should read Section “Textured Spheres”.
- about diffuse reflection, you should read Section “Diffuse Reflection”.
- about alpha textures, you should read Section “Transparent Textures”.
- about advanced skin rendering, you could read Chapter 3 “Skin in the ‘Dawn’ Demo” by Curtis Beeson and Kevin Bjorke of the book “GPU Gems” by Randima Fernando (editor) published 2004 by Addison-Wesley, which is available online, and Chapter 14 “Advanced Techniques for Realistic Real-Time Skin Rendering” by Eugene d’Eon and David Luebke of the book “GPU Gems 3” by Hubert Nguyen (editor) published 2007 by Addison-Wesley, which is also available online.

# Lighting of Bumpy Surfaces[edit]

This tutorial covers **normal mapping**.

It's the first in a series of tutorials about texturing techniques that go beyond two-dimensional surfaces (or layers of surfaces). In this tutorial, we start with normal mapping, which is a very well established technique to fake the lighting of small bumps and dents — even on coarse polygon meshes. The code of this tutorial is based on Section “Smooth Specular Highlights” and Section “Textured Spheres”.

### Perceiving Shapes Based on Lighting[edit]

The painting by Caravaggio depicted to the left is about the incredulity of Saint Thomas, who did not believe in Christ's resurrection until he put his finger in Christ's side. The furrowed brows of the apostles not only symbolize this incredulity but clearly convey it by means of a common facial expression. However, why do we know that their foreheads are actually furrowed instead of being painted with some light and dark lines? After all, this is just a flat painting. In fact, viewers intuitively make the assumption that these are furrowed instead of painted brows — even though the painting itself allows for both interpretations. The lesson is: bumps on smooth surfaces can often be convincingly conveyed by the lighting alone without any other cues (shadows, occlusions, parallax effects, stereo, etc.).

### Normal Mapping[edit]

Normal mapping tries to convey bumps on smooth surfaces (i.e. coarse triangle meshes with interpolated normals) by changing the surface normal vectors according to some virtual bumps. When the lighting is computed with these modified normal vectors, viewers will often perceive the virtual bumps — even though a perfectly flat triangle has been rendered. The illusion can certainly break down (in particular at silhouettes) but in many cases it is very convincing.

More specifically, the normal vectors that represent the virtual bumps are first **encoded** in a texture image (i.e. a normal map). A fragment shader then looks up these vectors in the texture image and computes the lighting based on them. That's about it. The problem, of course, is the encoding of the normal vectors in a texture image. There are different possibilities and the fragment shader has to be adapted to the specific encoding that was used to generate the normal map.

### Normal Mapping in Unity[edit]

The very good news is that you can easily create normal maps from gray-scale images with Unity: create a gray-scale image in your favorite paint program and use a specific gray for the regular height of the surface, lighter grays for bumps, and darker grays for dents. Make sure that the transitions between different grays are smooth, e.g. by blurring the image. When you import the image with **Assets > Import New Asset** change the **Texture Type** in the **Inspector View** to **Normal map** and check **Generate from greyscale**. After clicking **Apply**, the preview should show a bluish image with reddish and greenish edges. Alternatively to generating a normal map, the encoded normal map to the left can be imported (don't forget to uncheck the **Generate from greyscale** box).

The not so good news is that the fragment shader has to do some computations to decode the normals. First of all, the texture color is stored in a two-component texture image, i.e. there is only an alpha component A and one color component available. The color component can be accessed as the red, green, or blue component — in all cases the same value is returned. Here, we use the green component G since Unity also uses it. The two components, A and G, are stored as numbers between 0 and 1; however, they represent coordinates n_x and n_y between -1 and 1. The mapping is:

n_x = 2 A − 1 and n_y = 2 G − 1

From these two components, the third component n_z of the three-dimensional normal vector **n** = (n_x, n_y, n_z) can be calculated because of the normalization to unit length:

n_z = ±√(1 − n_x² − n_y²)

Only the “+” solution is necessary if we choose the z axis along the axis of the smooth normal vector (interpolated from the normal vectors that were set in the vertex shader) since we aren't able to render surfaces with an inwards pointing normal vector anyways. The code snippet from the fragment shader could look like this:

vec4 encodedNormal = texture2D(_BumpMap,
   _BumpMap_ST.xy * textureCoordinates.xy + _BumpMap_ST.zw);
vec3 localCoords = vec3(2.0 * encodedNormal.ag - vec2(1.0), 0.0);
localCoords.z = sqrt(1.0 - dot(localCoords, localCoords));
   // approximation without sqrt: localCoords.z =
   // 1.0 - 0.5 * dot(localCoords, localCoords);
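The decoding arithmetic can be sketched outside the shader; the following plain-Python check (illustrative, with made-up sample values) mirrors both the exact reconstruction of the third component and the approximation without the square root:

```python
import math

# Sketch of the two-component normal decoding: A and G store n_x and
# n_y mapped to [0, 1]; n_z is reconstructed from unit length.
def decode_normal(a, g):
    x = 2.0 * a - 1.0
    y = 2.0 * g - 1.0
    z = math.sqrt(max(0.0, 1.0 - (x * x + y * y)))  # "+" solution only
    return (x, y, z)

def decode_normal_approx(a, g):
    # cheaper approximation: sqrt(1 - d) is close to 1 - d/2 for small d
    x = 2.0 * a - 1.0
    y = 2.0 * g - 1.0
    z = 1.0 - 0.5 * (x * x + y * y)
    return (x, y, z)

n = decode_normal(0.5, 0.5)  # flat area of the normal map
print(n)                     # points straight along the z axis
print(decode_normal(0.6, 0.7), decode_normal_approx(0.6, 0.7))
```

For a flat region (A = G = 0.5) the decoded vector is (0, 0, 1), i.e. the unperturbed normal, and the approximation stays close to the exact value for small bumps.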

The decoding for devices that use OpenGL ES is actually simpler since Unity doesn't use a two-component texture in this case. Thus, for mobile platforms the decoding becomes:

vec4 encodedNormal = texture2D(_BumpMap,
   _BumpMap_ST.xy * textureCoordinates.xy + _BumpMap_ST.zw);
vec3 localCoords = 2.0 * encodedNormal.rgb - vec3(1.0);

However, the rest of this tutorial (and also Section “Projection of Bumpy Surfaces”) will cover only (desktop) OpenGL.

Unity uses a local surface coordinate system for each point of the surface to specify normal vectors in the normal map. The z axis of this local coordinate system is given by the smooth, interpolated normal vector **N** in world space, and the x-y plane is a tangent plane to the surface, as illustrated in the image to the left. Specifically, the x axis is specified by the tangent attribute **T** that Unity provides to vertices (see the discussion of attributes in Section “Debugging of Shaders”). Given the x and z axes, the y axis can be computed by a cross product in the vertex shader, e.g. **B** = **N** × **T**. (The letter **B** refers to the traditional name “binormal” for this vector.)

Note that the normal vector **N** is transformed with the transpose of the inverse model matrix from object space to world space (because it is orthogonal to a surface; see Section “Applying Matrix Transformations”) while the tangent vector **T** specifies a direction between points on a surface and is therefore transformed with the model matrix. The binormal vector **B** represents a third class of vectors which are transformed differently. (If you really want to know: the skew-symmetric matrix B corresponding to “**B**×” is transformed like a quadratic form.) Thus, the best choice is to first transform **N** and **T** to world space, and then to compute **B** in world space using the cross product of the transformed vectors.

With the normalized directions **T**, **B**, and **N** in world space, we can easily form a matrix that maps any normal vector **n** of the normal map from the local surface coordinate system to world space because the columns of such a matrix are just the vectors of the axes; thus, the 3×3 matrix for the mapping of **n** to world space is:

M = ( **T** | **B** | **N** )

with **T**, **B**, and **N** as its columns.
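The column construction of this matrix amounts to a weighted sum of the axis vectors; a small plain-Python sketch (illustrative, with made-up axes) makes this explicit:

```python
# Mapping a normal-map vector n from the local surface coordinate
# system to world space: with columns T, B, N, the mapped vector is
# n_x*T + n_y*B + n_z*N.
def local_to_world(T, B, N, n):
    return tuple(n[0] * T[i] + n[1] * B[i] + n[2] * N[i]
                 for i in range(3))

# with the world-space axes chosen as the standard basis,
# the mapping is the identity
T, B, N = (1.0, 0.0, 0.0), (0.0, 1.0, 0.0), (0.0, 0.0, 1.0)
print(local_to_world(T, B, N, (0.2, 0.4, 0.8)))
```

In particular, the local vector (1, 0, 0) is always mapped exactly onto **T**, (0, 1, 0) onto **B**, and (0, 0, 1) onto **N**, which is what "columns are the axes" means.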

These calculations are performed by the vertex shader, for example this way:

varying vec4 position;
   // position of the vertex (and fragment) in world space
varying vec4 textureCoordinates;
varying mat3 localSurface2World; // mapping from
   // local surface coordinates to world coordinates

#ifdef VERTEX

attribute vec4 Tangent;

void main()
{
   mat4 modelMatrix = _Object2World;
   mat4 modelMatrixInverse = _World2Object; // unity_Scale.w
      // is unnecessary because we normalize vectors
   
   localSurface2World[0] = normalize(vec3(
      modelMatrix * vec4(vec3(Tangent), 0.0)));
   localSurface2World[2] = normalize(vec3(
      vec4(gl_Normal, 0.0) * modelMatrixInverse));
   localSurface2World[1] = normalize(
      cross(localSurface2World[2], localSurface2World[0])
      * Tangent.w); // factor Tangent.w is specific to Unity
   
   position = modelMatrix * gl_Vertex;
   textureCoordinates = gl_MultiTexCoord0;
   gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;
}

#endif

The factor `Tangent.w` in the computation of `binormal` is specific to Unity, i.e. Unity provides tangent vectors and normal maps such that we have to do this multiplication.

In the fragment shader, we multiply the matrix in `localSurface2World` with **n**. For example, with this line:

vec3 normalDirection = normalize(localSurface2World * localCoords);

With the new normal vector in world space, we can compute the lighting as in Section “Smooth Specular Highlights”.

### Complete Shader Code[edit]

This shader code simply integrates all the snippets and uses our standard two-pass approach for pixel lights.

Shader "GLSL normal mapping" {
   Properties {
      _BumpMap ("Normal Map", 2D) = "bump" {}
      _Color ("Diffuse Material Color", Color) = (1,1,1,1)
      _SpecColor ("Specular Material Color", Color) = (1,1,1,1)
      _Shininess ("Shininess", Float) = 10
   }
   SubShader {
      Pass {
         Tags { "LightMode" = "ForwardBase" }
            // pass for ambient light and first light source
         
         GLSLPROGRAM
         
         // User-specified properties
         uniform sampler2D _BumpMap;
         uniform vec4 _BumpMap_ST;
         uniform vec4 _Color;
         uniform vec4 _SpecColor;
         uniform float _Shininess;
         
         // The following built-in uniforms (except _LightColor0)
         // are also defined in "UnityCG.glslinc",
         // i.e. one could #include "UnityCG.glslinc"
         uniform vec3 _WorldSpaceCameraPos;
            // camera position in world space
         uniform mat4 _Object2World; // model matrix
         uniform mat4 _World2Object; // inverse model matrix
         uniform vec4 _WorldSpaceLightPos0;
            // direction to or position of light source
         uniform vec4 _LightColor0;
            // color of light source (from "Lighting.cginc")
         
         varying vec4 position;
            // position of the vertex (and fragment) in world space
         varying vec4 textureCoordinates;
         varying mat3 localSurface2World; // mapping from local
            // surface coordinates to world coordinates
         
         #ifdef VERTEX
         
         attribute vec4 Tangent;
         
         void main()
         {
            mat4 modelMatrix = _Object2World;
            mat4 modelMatrixInverse = _World2Object; // unity_Scale.w
               // is unnecessary because we normalize vectors
            
            localSurface2World[0] = normalize(vec3(
               modelMatrix * vec4(vec3(Tangent), 0.0)));
            localSurface2World[2] = normalize(vec3(
               vec4(gl_Normal, 0.0) * modelMatrixInverse));
            localSurface2World[1] = normalize(
               cross(localSurface2World[2], localSurface2World[0])
               * Tangent.w); // factor Tangent.w is specific to Unity
            
            position = modelMatrix * gl_Vertex;
            textureCoordinates = gl_MultiTexCoord0;
            gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;
         }
         
         #endif
         
         #ifdef FRAGMENT
         
         void main()
         {
            // in principle we have to normalize the columns of
            // "localSurface2World" again; however, the potential
            // problems are small since we use this matrix only to
            // compute "normalDirection", which we normalize anyways
            
            vec4 encodedNormal = texture2D(_BumpMap,
               _BumpMap_ST.xy * textureCoordinates.xy
               + _BumpMap_ST.zw);
            vec3 localCoords =
               vec3(2.0 * encodedNormal.ag - vec2(1.0), 0.0);
            localCoords.z = sqrt(1.0 - dot(localCoords, localCoords));
               // approximation without sqrt: localCoords.z =
               // 1.0 - 0.5 * dot(localCoords, localCoords);
            vec3 normalDirection =
               normalize(localSurface2World * localCoords);
            
            vec3 viewDirection = normalize(
               _WorldSpaceCameraPos - vec3(position));
            vec3 lightDirection;
            float attenuation;
            
            if (0.0 == _WorldSpaceLightPos0.w) // directional light?
            {
               attenuation = 1.0; // no attenuation
               lightDirection = normalize(vec3(_WorldSpaceLightPos0));
            }
            else // point or spot light
            {
               vec3 vertexToLightSource =
                  vec3(_WorldSpaceLightPos0 - position);
               float distance = length(vertexToLightSource);
               attenuation = 1.0 / distance; // linear attenuation
               lightDirection = normalize(vertexToLightSource);
            }
            
            vec3 ambientLighting =
               vec3(gl_LightModel.ambient) * vec3(_Color);
            
            vec3 diffuseReflection =
               attenuation * vec3(_LightColor0) * vec3(_Color)
               * max(0.0, dot(normalDirection, lightDirection));
            
            vec3 specularReflection;
            if (dot(normalDirection, lightDirection) < 0.0)
               // light source on the wrong side?
            {
               specularReflection = vec3(0.0, 0.0, 0.0);
                  // no specular reflection
            }
            else // light source on the right side
            {
               specularReflection = attenuation
                  * vec3(_LightColor0) * vec3(_SpecColor)
                  * pow(max(0.0, dot(
                  reflect(-lightDirection, normalDirection),
                  viewDirection)), _Shininess);
            }
            
            gl_FragColor = vec4(ambientLighting
               + diffuseReflection + specularReflection, 1.0);
         }
         
         #endif
         
         ENDGLSL
      }
      Pass {
         Tags { "LightMode" = "ForwardAdd" }
            // pass for additional light sources
         Blend One One // additive blending
         
         GLSLPROGRAM
         
         // User-specified properties
         uniform sampler2D _BumpMap;
         uniform vec4 _BumpMap_ST;
         uniform vec4 _Color;
         uniform vec4 _SpecColor;
         uniform float _Shininess;
         
         // The following built-in uniforms (except _LightColor0)
         // are also defined in "UnityCG.glslinc",
         // i.e. one could #include "UnityCG.glslinc"
         uniform vec3 _WorldSpaceCameraPos;
            // camera position in world space
         uniform mat4 _Object2World; // model matrix
         uniform mat4 _World2Object; // inverse model matrix
         uniform vec4 _WorldSpaceLightPos0;
            // direction to or position of light source
         uniform vec4 _LightColor0;
            // color of light source (from "Lighting.cginc")
         
         varying vec4 position;
            // position of the vertex (and fragment) in world space
         varying vec4 textureCoordinates;
         varying mat3 localSurface2World; // mapping from
            // local surface coordinates to world coordinates
         
         #ifdef VERTEX
         
         attribute vec4 Tangent;
         
         void main()
         {
            mat4 modelMatrix = _Object2World;
            mat4 modelMatrixInverse = _World2Object; // unity_Scale.w
               // is unnecessary because we normalize vectors
            
            localSurface2World[0] = normalize(vec3(
               modelMatrix * vec4(vec3(Tangent), 0.0)));
            localSurface2World[2] = normalize(vec3(
               vec4(gl_Normal, 0.0) * modelMatrixInverse));
            localSurface2World[1] = normalize(
               cross(localSurface2World[2], localSurface2World[0])
               * Tangent.w); // factor Tangent.w is specific to Unity
            
            position = modelMatrix * gl_Vertex;
            textureCoordinates = gl_MultiTexCoord0;
            gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;
         }
         
         #endif
         
         #ifdef FRAGMENT
         
         void main()
         {
            // in principle we have to normalize the columns of
            // "localSurface2World" again; however, the potential
            // problems are small since we use this matrix only to
            // compute "normalDirection", which we normalize anyways
            
            vec4 encodedNormal = texture2D(_BumpMap,
               _BumpMap_ST.xy * textureCoordinates.xy
               + _BumpMap_ST.zw);
            vec3 localCoords =
               vec3(2.0 * encodedNormal.ag - vec2(1.0), 0.0);
            localCoords.z = sqrt(1.0 - dot(localCoords, localCoords));
               // approximation without sqrt: localCoords.z =
               // 1.0 - 0.5 * dot(localCoords, localCoords);
            vec3 normalDirection =
               normalize(localSurface2World * localCoords);
            
            vec3 viewDirection = normalize(
               _WorldSpaceCameraPos - vec3(position));
            vec3 lightDirection;
            float attenuation;
            
            if (0.0 == _WorldSpaceLightPos0.w) // directional light?
            {
               attenuation = 1.0; // no attenuation
               lightDirection = normalize(vec3(_WorldSpaceLightPos0));
            }
            else // point or spot light
            {
               vec3 vertexToLightSource =
                  vec3(_WorldSpaceLightPos0 - position);
               float distance = length(vertexToLightSource);
               attenuation = 1.0 / distance; // linear attenuation
               lightDirection = normalize(vertexToLightSource);
            }
            
            vec3 diffuseReflection =
               attenuation * vec3(_LightColor0) * vec3(_Color)
               * max(0.0, dot(normalDirection, lightDirection));
            
            vec3 specularReflection;
            if (dot(normalDirection, lightDirection) < 0.0)
               // light source on the wrong side?
            {
               specularReflection = vec3(0.0, 0.0, 0.0);
                  // no specular reflection
            }
            else // light source on the right side
            {
               specularReflection = attenuation
                  * vec3(_LightColor0) * vec3(_SpecColor)
                  * pow(max(0.0, dot(
                  reflect(-lightDirection, normalDirection),
                  viewDirection)), _Shininess);
            }
            
            gl_FragColor = vec4(diffuseReflection
               + specularReflection, 1.0);
         }
         
         #endif
         
         ENDGLSL
      }
   }
   // The definition of a fallback shader should be commented out
   // during development:
   // Fallback "Bumped Specular"
}

Note that we have used the tiling and offset uniform `_BumpMap_ST` as explained in Section “Textured Spheres” since this option is often particularly useful for bump maps.

### Summary[edit]

Congratulations! You finished this tutorial! We have looked at:

- How human perception of shapes often relies on lighting.
- What normal mapping is.
- How Unity encodes normal maps.
- How a fragment shader can decode Unity's normal maps and use them for per-pixel lighting.

### Further Reading[edit]

If you still want to know more

- about texture mapping (including tiling and offsetting), you should read Section “Textured Spheres”.
- about per-pixel lighting with the Phong reflection model, you should read Section “Smooth Specular Highlights”.
- about transforming normal vectors, you should read Section “Applying Matrix Transformations”.
- about normal mapping, you could read Mark J. Kilgard: “A Practical and Robust Bump-mapping Technique for Today’s GPUs”, GDC 2000: Advanced OpenGL Game Development, which is available online.

# Projection of Bumpy Surfaces[edit]

This tutorial covers (single-step) **parallax mapping**.

It extends and is based on Section “Lighting of Bumpy Surfaces”.

### Improving Normal Mapping[edit]

The normal mapping technique presented in Section “Lighting of Bumpy Surfaces” only changes the lighting of a flat surface to create the illusion of bumps and dents. If one looks straight onto a surface (i.e. in the direction of the surface normal vector), this works very well. However, if one looks onto a surface from some other angle (as in the image to the left), the bumps should also stick out of the surface while the dents should recede into the surface. Of course, this could be achieved by geometrically modeling bumps and dents; however, this would require processing many more vertices. Single-step parallax mapping, on the other hand, is a very efficient technique similar to normal mapping, which doesn't require additional triangles but can still move virtual bumps by several pixels to make them stick out of a flat surface. However, the technique is limited to bumps and dents of small heights and requires some fine-tuning for best results.

### Parallax Mapping Explained[edit]

Parallax mapping was proposed in 2001 by Tomomichi Kaneko et al. in their paper “Detailed shape representation with parallax mapping” (ICAT 2001). The basic idea is to offset the texture coordinates that are used for the texturing of the surface (in particular normal mapping). If this offset of texture coordinates is computed appropriately, it is possible to move parts of the texture (e.g. bumps) as if they were sticking out of the surface.

The illustration to the left shows the view vector **V** in the direction to the viewer and the surface normal vector **N** at the point of the surface that is rasterized in a fragment shader. Parallax mapping proceeds in 3 steps:

- Lookup of the height h at the rasterized point in a height map, which is depicted by the wavy line on top of the straight line at the bottom in the illustration.
- Computation of the intersection of the viewing ray in direction of **V** with a surface at height h parallel to the rendered surface. The offset o is the distance between the rasterized surface point moved by h in the direction of **N** and this intersection point. If these two points are projected onto the rendered surface, o is also the distance between the rasterized point and a new point on the surface (marked by a cross in the illustration). This new surface point is a better approximation to the point that is actually visible for the view ray in direction **V** if the surface was displaced by the height map.
- Transformation of the offset o into texture coordinate space in order to compute an offset of texture coordinates for all following texture lookups.

For the computation of the offset o we require the height h of the height map at the rasterized point, which is implemented in the example by a texture lookup in the A component of the texture property `_ParallaxMap`, which should be a gray-scale image representing heights as discussed in Section “Lighting of Bumpy Surfaces”. We also require the view direction **V** in the local surface coordinate system formed by the normal vector (z axis), the tangent vector (x axis), and the binormal vector (y axis), which was also introduced in Section “Lighting of Bumpy Surfaces”. To this end we compute a transformation from local surface coordinates to object space with:

M = ( **T** | **B** | **N** )

where **T**, **B** and **N** are given in object coordinates as the columns of M. (In Section “Lighting of Bumpy Surfaces” we had a similar matrix but with vectors in world coordinates.)

We compute the view direction **V** in object space (as the difference between the rasterized position and the camera position transformed from world space to object space) and then we transform it to the local surface space with the inverse matrix M⁻¹, which can be computed as:

M⁻¹ = Mᵀ

This is possible because **T**, **B** and **N** are orthogonal and normalized. (Actually, the situation is a bit more complicated because we won't normalize these vectors but use their length for another transformation; see below.) Thus, in order to transform **V** from object space to the local surface space, we have to multiply it with the transposed matrix Mᵀ. In GLSL, this is achieved by multiplying the vector from the left to the matrix M.
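For a matrix with orthogonal, normalized columns the transpose really does undo the mapping; this can be checked numerically with a plain-Python sketch (illustrative, using a simple rotation as the orthonormal matrix):

```python
import math

# A rotation about the z axis: its columns are orthonormal, so its
# transpose equals its inverse.
c, s = math.cos(0.3), math.sin(0.3)
M = [[c, -s, 0.0],
     [s,  c, 0.0],
     [0.0, 0.0, 1.0]]

def mat_vec(M, v):
    # M * v: maps local coordinates to object coordinates
    return [sum(M[i][j] * v[j] for j in range(3)) for i in range(3)]

def vec_mat(v, M):
    # v * M, i.e. transpose(M) * v: the inverse mapping
    return [sum(v[i] * M[i][j] for i in range(3)) for j in range(3)]

v = [0.5, -1.0, 2.0]
w = mat_vec(M, v)     # map to "object space"
back = vec_mat(w, M)  # multiplying from the left undoes the mapping
print(back)           # recovers v up to rounding
```

This is exactly the trick used in the shader: multiplying the vector from the left (`v * M` in GLSL) applies Mᵀ, i.e. the inverse of the orthonormal part of the mapping.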

Once we have **V** in the local surface coordinate system with the z axis in the direction of the normal vector **N**, we can compute the offsets o_x (in x direction) and o_y (in y direction) by using similar triangles (compare with the illustration):

o_x / h = V_x / V_z and o_y / h = V_y / V_z.

Thus:

o_x = h V_x / V_z and o_y = h V_y / V_z.

Note that it is not necessary to normalize **V** because we use only ratios of its components, which are not affected by the normalization.
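The similar-triangles computation and its invariance under scaling of **V** can be sketched in plain Python (illustrative; the height and view vector values are made up):

```python
# Similar-triangles offsets of parallax mapping:
# o_x = h * V_x / V_z and o_y = h * V_y / V_z.
def parallax_offset(h, V):
    vx, vy, vz = V
    return (h * vx / vz, h * vy / vz)

V = (0.3, -0.1, 0.9)   # hypothetical view vector in surface coordinates
h = 0.04               # hypothetical height from the height map
print(parallax_offset(h, V))

# scaling V (i.e. not normalizing it) leaves the offsets unchanged,
# since only ratios of its components are used
scaled = tuple(5.0 * c for c in V)
print(parallax_offset(h, scaled))
```

Both calls print the same offsets, which is why the shader can skip the normalization of **V**.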

Finally, we have to transform o_x and o_y into texture space. This would be quite difficult if Unity didn't help us: the tangent attribute `Tangent` is actually appropriately scaled and has a fourth component `Tangent.w` for scaling the binormal vector such that the transformation of the view direction **V** scales V_x and V_y appropriately to have o_x and o_y in texture coordinate space without further computations.

### Implementation[edit]

The implementation shares most of the code with Section “Lighting of Bumpy Surfaces”. In particular, the same scaling of the binormal vector with the fourth component of the `Tangent` attribute is used in order to take the mapping of the offsets from local surface space to texture space into account:

vec3 binormal = cross(gl_Normal, vec3(Tangent)) * Tangent.w;

In the vertex shader, we have to add a varying for the view vector **V** in the local surface coordinate system (with the scaling of axes to take the mapping to texture space into account). This varying is called `viewDirInScaledSurfaceCoords`. It is computed by multiplying the view vector in object coordinates (`viewDirInObjectCoords`) from the left to the matrix `localSurface2ScaledObject` as explained above:

vec3 viewDirInObjectCoords = vec3( modelMatrixInverse * vec4(_WorldSpaceCameraPos, 1.0) - gl_Vertex); mat3 localSurface2ScaledObject = mat3(vec3(Tangent), binormal, gl_Normal); // vectors are orthogonal viewDirInScaledSurfaceCoords = viewDirInObjectCoords * localSurface2ScaledObject; // we multiply with the transpose to multiply with // the "inverse" (apart from the scaling)

The rest of the vertex shader is the same as for normal mapping, see Section “Lighting of Bumpy Surfaces”.

In the fragment shader, we first query the height map for the height of the rasterized point. This height is specified by the A component of the texture `_ParallaxMap`. The values between 0 and 1 are transformed to the range -`_Parallax`/2 to +`_Parallax`/2 with a shader property `_Parallax` in order to offer some user control over the strength of the effect (and to be compatible with the fallback shader):

float height = _Parallax * (-0.5 + texture2D(_ParallaxMap, _ParallaxMap_ST.xy * textureCoordinates.xy + _ParallaxMap_ST.zw).a);

The offsets o_x and o_y are then computed as described above. However, we also clamp each offset to a user-specified interval between -`_MaxTexCoordOffset` and +`_MaxTexCoordOffset` in order to make sure that the offset stays within reasonable bounds. (If the height map consists of more or less flat plateaus of constant height with smooth transitions between these plateaus, `_MaxTexCoordOffset` should be smaller than the thickness of these transition regions; otherwise the sample point might be in a different plateau with a different height, which would mean that the approximation of the intersection point is arbitrarily bad.) The code is:

vec2 texCoordOffsets = clamp(height * viewDirInScaledSurfaceCoords.xy / viewDirInScaledSurfaceCoords.z, -_MaxTexCoordOffset, +_MaxTexCoordOffset);
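A small Python sketch (with hypothetical values; not shader code) shows why the clamping matters: for grazing view directions the z component of the view vector approaches zero, so the raw offset becomes arbitrarily large:

```python
def clamp(x, lo, hi):
    return max(lo, min(hi, x))

h = 0.05           # height at the rasterized point
max_offset = 0.01  # corresponds to _MaxTexCoordOffset

# grazing view direction: V_z close to zero
Vx, Vz = 0.9, 0.01
raw_offset = h * Vx / Vz  # explodes far beyond sensible texture coordinates
offset = clamp(raw_offset, -max_offset, +max_offset)
assert offset == max_offset
```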

In the following code, we have to apply the offsets to the texture coordinates in all texture lookups; i.e., we have to replace `vec2(textureCoordinates)` (or equivalently `textureCoordinates.xy`) by `(textureCoordinates.xy + texCoordOffsets)`, e.g.:

vec4 encodedNormal = texture2D(_BumpMap, _BumpMap_ST.xy * (textureCoordinates.xy + texCoordOffsets) + _BumpMap_ST.zw);

The rest of the fragment shader code is just as it was for Section “Lighting of Bumpy Surfaces”.

### Complete Shader Code[edit]

As discussed in the previous section, most of this code is taken from Section “Lighting of Bumpy Surfaces”. Note that if you want to use the code on a mobile device with OpenGL ES, make sure to change the decoding of the normal map as described in that tutorial.

The part about parallax mapping is actually only a few lines. Most of the names of the shader properties were chosen according to the fallback shader; the user interface labels are much more descriptive.

Shader "GLSL parallax mapping" { Properties { _BumpMap ("Normal Map", 2D) = "bump" {} _ParallaxMap ("Heightmap (in A)", 2D) = "black" {} _Parallax ("Max Height", Float) = 0.01 _MaxTexCoordOffset ("Max Texture Coordinate Offset", Float) = 0.01 _Color ("Diffuse Material Color", Color) = (1,1,1,1) _SpecColor ("Specular Material Color", Color) = (1,1,1,1) _Shininess ("Shininess", Float) = 10 } SubShader { Pass { Tags { "LightMode" = "ForwardBase" } // pass for ambient light and first light source GLSLPROGRAM // User-specified properties uniform sampler2D _BumpMap; uniform vec4 _BumpMap_ST; uniform sampler2D _ParallaxMap; uniform vec4 _ParallaxMap_ST; uniform float _Parallax; uniform float _MaxTexCoordOffset; uniform vec4 _Color; uniform vec4 _SpecColor; uniform float _Shininess; // The following built-in uniforms (except _LightColor0) // are also defined in "UnityCG.glslinc", // i.e. one could #include "UnityCG.glslinc" uniform vec3 _WorldSpaceCameraPos; // camera position in world space uniform mat4 _Object2World; // model matrix uniform mat4 _World2Object; // inverse model matrix uniform vec4 unity_Scale; // w = 1/uniform scale; // should be multiplied to _World2Object uniform vec4 _WorldSpaceLightPos0; // direction to or position of light source uniform vec4 _LightColor0; // color of light source (from "Lighting.cginc") varying vec4 position; // position of the vertex (and fragment) in world space varying vec4 textureCoordinates; varying mat3 localSurface2World; // mapping from // local surface coordinates to world coordinates varying vec3 viewDirInScaledSurfaceCoords; #ifdef VERTEX attribute vec4 Tangent; void main() { mat4 modelMatrix = _Object2World; mat4 modelMatrixInverse = _World2Object * unity_Scale.w; localSurface2World[0] = normalize(vec3( modelMatrix * vec4(vec3(Tangent), 0.0))); localSurface2World[2] = normalize(vec3( vec4(gl_Normal, 0.0) * modelMatrixInverse)); localSurface2World[1] = normalize( cross(localSurface2World[2], localSurface2World[0]) * 
Tangent.w); vec3 binormal = cross(gl_Normal, vec3(Tangent)) * Tangent.w; // appropriately scaled tangent and binormal // to map distances from object space to texture space vec3 viewDirInObjectCoords = vec3(modelMatrixInverse * vec4(_WorldSpaceCameraPos, 1.0) - gl_Vertex); mat3 localSurface2ScaledObject = mat3(vec3(Tangent), binormal, gl_Normal); // vectors are orthogonal viewDirInScaledSurfaceCoords = viewDirInObjectCoords * localSurface2ScaledObject; // we multiply with the transpose to multiply // with the "inverse" (apart from the scaling) position = modelMatrix * gl_Vertex; textureCoordinates = gl_MultiTexCoord0; gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex; } #endif #ifdef FRAGMENT void main() { // parallax mapping: compute height and // find offset in texture coordinates // for the intersection of the view ray // with the surface at this height float height = _Parallax * (-0.5 + texture2D(_ParallaxMap, _ParallaxMap_ST.xy * textureCoordinates.xy + _ParallaxMap_ST.zw)); vec2 texCoordOffsets = clamp(height * viewDirInScaledSurfaceCoords.xy / viewDirInScaledSurfaceCoords.z, -_MaxTexCoordOffset, +_MaxTexCoordOffset); // normal mapping: lookup and decode normal from bump map // in principle we have to normalize the columns // of "localSurface2World" again; however, the potential // problems are small since we use this matrix only // to compute "normalDirection", which we normalize anyways vec4 encodedNormal = texture2D(_BumpMap, _BumpMap_ST.xy * (textureCoordinates.xy + texCoordOffsets) + _BumpMap_ST.zw); vec3 localCoords = vec3(2.0 * encodedNormal.ag - vec2(1.0), 0.0); localCoords.z = sqrt(1.0 - dot(localCoords, localCoords)); // approximation without sqrt: localCoords.z = // 1.0 - 0.5 * dot(localCoords, localCoords); vec3 normalDirection = normalize(localSurface2World * localCoords); // per-pixel lighting using the Phong reflection model // (with linear attenuation for point and spot lights) vec3 viewDirection = normalize(_WorldSpaceCameraPos - 
vec3(position)); vec3 lightDirection; float attenuation; if (0.0 == _WorldSpaceLightPos0.w) // directional light? { attenuation = 1.0; // no attenuation lightDirection = normalize(vec3(_WorldSpaceLightPos0)); } else // point or spot light { vec3 vertexToLightSource = vec3(_WorldSpaceLightPos0 - position); float distance = length(vertexToLightSource); attenuation = 1.0 / distance; // linear attenuation lightDirection = normalize(vertexToLightSource); } vec3 ambientLighting = vec3(gl_LightModel.ambient) * vec3(_Color); vec3 diffuseReflection = attenuation * vec3(_LightColor0) * vec3(_Color) * max(0.0, dot(normalDirection, lightDirection)); vec3 specularReflection; if (dot(normalDirection, lightDirection) < 0.0) // light source on the wrong side? { specularReflection = vec3(0.0, 0.0, 0.0); // no specular reflection } else // light source on the right side { specularReflection = attenuation * vec3(_LightColor0) * vec3(_SpecColor) * pow(max(0.0, dot( reflect(-lightDirection, normalDirection), viewDirection)), _Shininess); } gl_FragColor = vec4(ambientLighting + diffuseReflection + specularReflection, 1.0); } #endif ENDGLSL } Pass { Tags { "LightMode" = "ForwardAdd" } // pass for additional light sources Blend One One // additive blending GLSLPROGRAM // User-specified properties uniform sampler2D _BumpMap; uniform vec4 _BumpMap_ST; uniform sampler2D _ParallaxMap; uniform vec4 _ParallaxMap_ST; uniform float _Parallax; uniform float _MaxTexCoordOffset; uniform vec4 _Color; uniform vec4 _SpecColor; uniform float _Shininess; // The following built-in uniforms (except _LightColor0) // are also defined in "UnityCG.glslinc", // i.e. 
one could #include "UnityCG.glslinc" uniform vec3 _WorldSpaceCameraPos; // camera position in world space uniform mat4 _Object2World; // model matrix uniform mat4 _World2Object; // inverse model matrix uniform vec4 unity_Scale; // w = 1/uniform scale; // should be multiplied to _World2Object uniform vec4 _WorldSpaceLightPos0; // direction to or position of light source uniform vec4 _LightColor0; // color of light source (from "Lighting.cginc") varying vec4 position; // position of the vertex (and fragment) in world space varying vec4 textureCoordinates; varying mat3 localSurface2World; // mapping // from local surface coordinates to world coordinates varying vec3 viewDirInScaledSurfaceCoords; #ifdef VERTEX attribute vec4 Tangent; void main() { mat4 modelMatrix = _Object2World; mat4 modelMatrixInverse = _World2Object * unity_Scale.w; localSurface2World[0] = normalize(vec3( modelMatrix * vec4(vec3(Tangent), 0.0))); localSurface2World[2] = normalize(vec3( vec4(gl_Normal, 0.0) * modelMatrixInverse)); localSurface2World[1] = normalize( cross(localSurface2World[2], localSurface2World[0]) * Tangent.w); vec3 binormal = cross(gl_Normal, vec3(Tangent)) * Tangent.w; // appropriately scaled tangent and binormal // to map distances from object space to texture space vec3 viewDirInObjectCoords = vec3(modelMatrixInverse * vec4(_WorldSpaceCameraPos, 1.0) - gl_Vertex); mat3 localSurface2ScaledObject = mat3(vec3(Tangent), binormal, gl_Normal); // vectors are orthogonal viewDirInScaledSurfaceCoords = viewDirInObjectCoords * localSurface2ScaledObject; // we multiply with the transpose to multiply // with the "inverse" (apart from the scaling) position = modelMatrix * gl_Vertex; textureCoordinates = gl_MultiTexCoord0; gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex; } #endif #ifdef FRAGMENT void main() { // parallax mapping: compute height and // find offset in texture coordinates // for the intersection of the view ray // with the surface at this height float height = _Parallax 
* (-0.5 + texture2D(_ParallaxMap, _ParallaxMap_ST.xy * textureCoordinates.xy + _ParallaxMap_ST.zw)); vec2 texCoordOffsets = clamp(height * viewDirInScaledSurfaceCoords.xy / viewDirInScaledSurfaceCoords.z, -_MaxTexCoordOffset, +_MaxTexCoordOffset); // normal mapping: lookup and decode normal from bump map // in principle we have to normalize the columns // of "localSurface2World" again; however, the potential // problems are small since we use this matrix only to // compute "normalDirection", which we normalize anyways vec4 encodedNormal = texture2D(_BumpMap, _BumpMap_ST.xy * (textureCoordinates.xy + texCoordOffsets) + _BumpMap_ST.zw); vec3 localCoords = vec3(2.0 * encodedNormal.ag - vec2(1.0), 0.0); localCoords.z = sqrt(1.0 - dot(localCoords, localCoords)); // approximation without sqrt: localCoords.z = // 1.0 - 0.5 * dot(localCoords, localCoords); vec3 normalDirection = normalize(localSurface2World * localCoords); // per-pixel lighting using the Phong reflection model // (with linear attenuation for point and spot lights) vec3 viewDirection = normalize(_WorldSpaceCameraPos - vec3(position)); vec3 lightDirection; float attenuation; if (0.0 == _WorldSpaceLightPos0.w) // directional light? { attenuation = 1.0; // no attenuation lightDirection = normalize(vec3(_WorldSpaceLightPos0)); } else // point or spot light { vec3 vertexToLightSource = vec3(_WorldSpaceLightPos0 - position); float distance = length(vertexToLightSource); attenuation = 1.0 / distance; // linear attenuation lightDirection = normalize(vertexToLightSource); } vec3 diffuseReflection = attenuation * vec3(_LightColor0) * vec3(_Color) * max(0.0, dot(normalDirection, lightDirection)); vec3 specularReflection; if (dot(normalDirection, lightDirection) < 0.0) // light source on the wrong side? 
{ specularReflection = vec3(0.0, 0.0, 0.0); // no specular reflection } else // light source on the right side { specularReflection = attenuation * vec3(_LightColor0) * vec3(_SpecColor) * pow(max(0.0, dot( reflect(-lightDirection, normalDirection), viewDirection)), _Shininess); } gl_FragColor = vec4(diffuseReflection + specularReflection, 1.0); } #endif ENDGLSL } } // The definition of a fallback shader should be commented out // during development: // Fallback "Parallax Specular" }

### Summary[edit]

Congratulations! If you actually understand the whole shader, you have come a long way. In fact, the shader includes lots of concepts (transformations between coordinate systems, application of the inverse of an orthogonal matrix by multiplying a vector from the left to it, the Phong reflection model, normal mapping, parallax mapping, ...). More specifically, we have seen:

- How parallax mapping improves upon normal mapping.
- How parallax mapping is described mathematically.
- How parallax mapping is implemented.

### Further Reading[edit]

If you still want to know more

- about details of the shader code, you should read Section “Lighting of Bumpy Surfaces”.
- about parallax mapping, you could read the original publication by Tomomichi Kaneko et al.: “Detailed shape representation with parallax mapping”, ICAT 2001, pages 205–208, which is available online.

# Cookies[edit]

This tutorial covers **projective texture mapping in light space**, which is useful for implementing cookies for spotlights and directional light sources. (In fact, Unity uses a built-in cookie for any spotlight.)

The tutorial is based on the code of Section “Smooth Specular Highlights” and Section “Transparent Textures”. If you haven't read those tutorials yet, you should read them first.

### Gobos and Cookies in Real Life[edit]

In real life, gobos are pieces of material (often metal) with holes that are placed in front of light sources to manipulate the shape of light beams or shadows. Cookies (or “cuculoris”) serve a similar purpose but are placed at a larger distance from the light source as shown in the image to the left.

### Unity's Cookies[edit]

In Unity, a **cookie** can be specified for each light source in the **Inspector View** when the light source is selected. This cookie is basically an alpha texture map (see Section “Transparent Textures”) that is placed in front of the light source and moves with it (therefore it is actually similar to a gobo). It lets light pass through where the alpha component of the texture image is 1 and blocks light where the alpha component is 0. Unity's cookies for spotlights and directional lights are just square, two-dimensional alpha texture maps. On the other hand, cookies for point lights are cube maps, which we will not cover here.

In order to implement a cookie, we have to extend the shader of any surface that should be affected by the cookie. (This is very different from how Unity's projectors work; see Section “Projectors”.) Specifically, we have to attenuate the light of each light source according to its cookie in the lighting computation of a shader. Here, we use the per-pixel lighting described in Section “Smooth Specular Highlights”; however, the technique can be applied to any lighting computation.

In order to find the relevant position in the cookie texture, the position of the rasterized point of a surface is transformed into the coordinate system of the light source. This coordinate system is very similar to the clip coordinate system of a camera, which is described in Section “Vertex Transformations”. In fact, the best way to think of the coordinate system of a light source is probably to think of the light source as a camera. The *x* and *y* light coordinates are then related to the screen coordinates of this hypothetical camera. Transforming a point from world coordinates to light coordinates is actually very easy because Unity provides the required 4×4 matrix as the uniform variable `_LightMatrix0`. (Otherwise we would have to set up the matrix similarly to the matrices for the viewing transformation and the projection, which are discussed in Section “Vertex Transformations”.)

For best efficiency, the transformation of the surface point from world space to light space should be performed in the vertex shader by multiplying `_LightMatrix0` with the position in world space, for example this way:

uniform mat4 _Object2World; // model matrix uniform mat4 _World2Object; // inverse model matrix uniform vec4 _WorldSpaceLightPos0; // direction to or position of light source uniform vec4 _LightColor0; // color of light source (from Lighting.cginc) uniform mat4 _LightMatrix0; // transformation // from world to light space (from Autolight.cginc) varying vec4 position; // position of the vertex (and fragment) in world space varying vec4 positionInLightSpace; // position of the vertex (and fragment) in light space varying vec3 varyingNormalDirection; // surface normal vector in world space #ifdef VERTEX void main() { mat4 modelMatrix = _Object2World; mat4 modelMatrixInverse = _World2Object; // unity_Scale.w // is unnecessary because we normalize vectors position = modelMatrix * gl_Vertex; positionInLightSpace = _LightMatrix0 * position; varyingNormalDirection = normalize(vec3( vec4(gl_Normal, 0.0) * modelMatrixInverse)); gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex; } #endif

Apart from the definitions of the uniform `_LightMatrix0` and the varying `positionInLightSpace`, and the instruction to compute `positionInLightSpace`, this is the same vertex shader as in Section “Smooth Specular Highlights”.

### Cookies for Directional Light Sources[edit]

For the cookie of a directional light source, we can just use the *x* and *y* light coordinates in `positionInLightSpace` as texture coordinates for a lookup in the cookie texture `_LightTexture0`. This texture lookup should be performed in the fragment shader. The resulting alpha component should then be multiplied with the computed lighting; for example:

// compute diffuseReflection and specularReflection float cookieAttenuation = 1.0; if (0.0 == _WorldSpaceLightPos0.w) // directional light? { cookieAttenuation = texture2D(_LightTexture0, vec2(positionInLightSpace)).a; } // compute cookieAttenuation for spotlights here gl_FragColor = vec4(cookieAttenuation * (diffuseReflection + specularReflection), 1.0);

Instead of `vec2(positionInLightSpace)` we could also use `positionInLightSpace.xy` to get a two-dimensional vector with the *x* and *y* coordinates in light space.

### Cookies for Spotlights[edit]

For spotlights, the *x* and *y* light coordinates in `positionInLightSpace` have to be divided by the *w* light coordinate. This division is characteristic of projective texture mapping and corresponds to the perspective division for a camera, which is described in Section “Vertex Transformations”. Unity defines the matrix `_LightMatrix0` such that we have to add 0.5 to both coordinates after the division:

cookieAttenuation = texture2D(_LightTexture0, vec2(positionInLightSpace) / positionInLightSpace.w + vec2(0.5)).a;

For some GPUs it might be more efficient to use the built-in function `texture2DProj`, which takes three texture coordinates in a `vec3` and divides the first two coordinates by the third coordinate before the texture lookup. A problem with this approach is that we have to add 0.5 **after** the division by `positionInLightSpace.w`; however, `texture2DProj` doesn't allow us to add anything after the internal division by the third texture coordinate. The solution is to add `0.5 * positionInLightSpace.w` **before** the division by `positionInLightSpace.w`, which corresponds to adding 0.5 after the division:

vec3 textureCoords = vec3(vec2(positionInLightSpace) + vec2(0.5 * positionInLightSpace.w), positionInLightSpace.w); cookieAttenuation = texture2DProj(_LightTexture0, textureCoords).a;
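A quick Python sketch (with hypothetical light-space coordinates; not shader code) confirms that adding half of the *w* coordinate before the division is equivalent to adding 0.5 after it:

```python
# hypothetical light-space coordinates of a surface point
x, y, w = 0.2, -0.4, 2.0

# texture2D variant: divide first, then add 0.5
u1, v1 = x / w + 0.5, y / w + 0.5

# texture2DProj variant: add 0.5 * w first; the division by w
# then happens internally before the texture lookup
u2, v2 = (x + 0.5 * w) / w, (y + 0.5 * w) / w

assert abs(u1 - u2) < 1e-12 and abs(v1 - v2) < 1e-12
```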

Note that the texture lookup for directional lights can also be implemented with `texture2DProj` by setting `textureCoords` to `vec3(vec2(positionInLightSpace), 1.0)`. This would allow us to use a single texture lookup for both directional lights and spotlights, which is more efficient on some GPUs.

### Complete Shader Code[edit]

For the complete shader code we use a simplified version of the `ForwardBase` pass of Section “Smooth Specular Highlights”, since Unity only uses a directional light without a cookie in the `ForwardBase` pass. All light sources with cookies are handled by the `ForwardAdd` pass. We ignore cookies for point lights, for which `_LightMatrix0[3][3]` is `1.0` (but we include them in the next section). Spotlights always have a cookie texture: if the user didn't specify one, Unity supplies a cookie texture to generate the shape of a spotlight; thus, it is OK to always apply the cookie. Directional lights don't always have a cookie; however, if there is only one directional light source without a cookie, then it has been processed in the `ForwardBase` pass. Thus, unless there is more than one directional light source without a cookie, we can assume that all directional light sources in the `ForwardAdd` pass have cookies. In this case, the complete shader code could be:

Shader "GLSL per-pixel lighting with cookies" { Properties { _Color ("Diffuse Material Color", Color) = (1,1,1,1) _SpecColor ("Specular Material Color", Color) = (1,1,1,1) _Shininess ("Shininess", Float) = 10 } SubShader { Pass { Tags { "LightMode" = "ForwardBase" } // pass for ambient light // and first directional light source without cookie GLSLPROGRAM // User-specified properties uniform vec4 _Color; uniform vec4 _SpecColor; uniform float _Shininess; // The following built-in uniforms (except _LightColor0) // are also defined in "UnityCG.glslinc", // i.e. one could #include "UnityCG.glslinc" uniform vec3 _WorldSpaceCameraPos; // camera position in world space uniform mat4 _Object2World; // model matrix uniform mat4 _World2Object; // inverse model matrix uniform vec4 _WorldSpaceLightPos0; // direction to or position of light source uniform vec4 _LightColor0; // color of light source (from Lighting.cginc) varying vec4 position; // position of the vertex (and fragment) in world space varying vec3 varyingNormalDirection; // surface normal vector in world space #ifdef VERTEX void main() { mat4 modelMatrix = _Object2World; mat4 modelMatrixInverse = _World2Object; // unity_Scale.w // is unnecessary because we normalize vectors position = modelMatrix * gl_Vertex; varyingNormalDirection = normalize(vec3( vec4(gl_Normal, 0.0) * modelMatrixInverse)); gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex; } #endif #ifdef FRAGMENT void main() { vec3 normalDirection = normalize(varyingNormalDirection); vec3 viewDirection = normalize(_WorldSpaceCameraPos - vec3(position)); vec3 lightDirection = normalize(vec3(_WorldSpaceLightPos0)); vec3 ambientLighting = vec3(gl_LightModel.ambient) * vec3(_Color); vec3 diffuseReflection = vec3(_LightColor0) * vec3(_Color) * max(0.0, dot(normalDirection, lightDirection)); vec3 specularReflection; if (dot(normalDirection, lightDirection) < 0.0) // light source on the wrong side? 
{ specularReflection = vec3(0.0, 0.0, 0.0); // no specular reflection } else // light source on the right side { specularReflection = vec3(_LightColor0) * vec3(_SpecColor) * pow(max(0.0, dot( reflect(-lightDirection, normalDirection), viewDirection)), _Shininess); } gl_FragColor = vec4(ambientLighting + diffuseReflection + specularReflection, 1.0); } #endif ENDGLSL } Pass { Tags { "LightMode" = "ForwardAdd" } // pass for additional light sources Blend One One // additive blending GLSLPROGRAM // User-specified properties uniform vec4 _Color; uniform vec4 _SpecColor; uniform float _Shininess; // The following built-in uniforms (except _LightColor0) // are also defined in "UnityCG.glslinc", // i.e. one could #include "UnityCG.glslinc" uniform vec3 _WorldSpaceCameraPos; // camera position in world space uniform mat4 _Object2World; // model matrix uniform mat4 _World2Object; // inverse model matrix uniform vec4 _WorldSpaceLightPos0; // direction to or position of light source uniform vec4 _LightColor0; // color of light source (from Lighting.cginc) uniform mat4 _LightMatrix0; // transformation // from world to light space (from Autolight.cginc) uniform sampler2D _LightTexture0; // cookie alpha texture map (from Autolight.cginc) varying vec4 position; // position of the vertex (and fragment) in world space varying vec4 positionInLightSpace; // position of the vertex (and fragment) in light space varying vec3 varyingNormalDirection; // surface normal vector in world space #ifdef VERTEX void main() { mat4 modelMatrix = _Object2World; mat4 modelMatrixInverse = _World2Object; // unity_Scale.w // is unnecessary because we normalize vectors position = modelMatrix * gl_Vertex; positionInLightSpace = _LightMatrix0 * position; varyingNormalDirection = normalize(vec3( vec4(gl_Normal, 0.0) * modelMatrixInverse)); gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex; } #endif #ifdef FRAGMENT void main() { vec3 normalDirection = normalize(varyingNormalDirection); vec3 viewDirection 
= normalize(_WorldSpaceCameraPos - vec3(position)); vec3 lightDirection; float attenuation; if (0.0 == _WorldSpaceLightPos0.w) // directional light? { attenuation = 1.0; // no attenuation lightDirection = normalize(vec3(_WorldSpaceLightPos0)); } else // point or spot light { vec3 vertexToLightSource = vec3(_WorldSpaceLightPos0 - position); float distance = length(vertexToLightSource); attenuation = 1.0 / distance; // linear attenuation lightDirection = normalize(vertexToLightSource); } vec3 diffuseReflection = attenuation * vec3(_LightColor0) * vec3(_Color) * max(0.0, dot(normalDirection, lightDirection)); vec3 specularReflection; if (dot(normalDirection, lightDirection) < 0.0) // light source on the wrong side? { specularReflection = vec3(0.0, 0.0, 0.0); // no specular reflection } else // light source on the right side { specularReflection = attenuation * vec3(_LightColor0) * vec3(_SpecColor) * pow(max(0.0, dot( reflect(-lightDirection, normalDirection), viewDirection)), _Shininess); } float cookieAttenuation = 1.0; if (0.0 == _WorldSpaceLightPos0.w) // directional light? { cookieAttenuation = texture2D(_LightTexture0, vec2(positionInLightSpace)).a; } else if (1.0 != _LightMatrix0[3][3]) // spotlight (i.e. not a point light)? { cookieAttenuation = texture2D(_LightTexture0, vec2(positionInLightSpace) / positionInLightSpace.w + vec2(0.5)).a; } gl_FragColor = vec4(cookieAttenuation * (diffuseReflection + specularReflection), 1.0); } #endif ENDGLSL } } // The definition of a fallback shader should be commented out // during development: // Fallback "Specular" }

### Shader Programs for Specific Light Sources[edit]

The previous shader code is limited to scenes with at most one directional light source without a cookie. Also, it doesn't take cookies of point light sources into account. Writing more general shader code requires different `ForwardAdd` passes for different light sources. (Remember that the light source in the `ForwardBase` pass is always a directional light source without a cookie.) Fortunately, Unity offers a way to generate multiple shaders by using the following Unity-specific directive (right after `GLSLPROGRAM` in the `ForwardAdd` pass):

`#pragma multi_compile_lightpass`

With this instruction, Unity will compile the shader code for the `ForwardAdd` pass multiple times for different kinds of light sources. Each compilation is distinguished by the definition of one of the following symbols: `DIRECTIONAL`, `DIRECTIONAL_COOKIE`, `POINT`, `POINT_NOATT`, `POINT_COOKIE`, `SPOT`. The shader code should check which symbol is defined (using the directives `#if defined ... #elif defined ... #endif`) and include appropriate instructions. For example:

Shader "GLSL per-pixel lighting with cookies" { Properties { _Color ("Diffuse Material Color", Color) = (1,1,1,1) _SpecColor ("Specular Material Color", Color) = (1,1,1,1) _Shininess ("Shininess", Float) = 10 } SubShader { Pass { Tags { "LightMode" = "ForwardBase" } // pass for ambient light // and first directional light source without cookie GLSLPROGRAM // User-specified properties uniform vec4 _Color; uniform vec4 _SpecColor; uniform float _Shininess; // The following built-in uniforms (except _LightColor0) // are also defined in "UnityCG.glslinc", // i.e. one could #include "UnityCG.glslinc" uniform vec3 _WorldSpaceCameraPos; // camera position in world space uniform mat4 _Object2World; // model matrix uniform mat4 _World2Object; // inverse model matrix uniform vec4 _WorldSpaceLightPos0; // direction to or position of light source uniform vec4 _LightColor0; // color of light source (from Lighting.cginc) varying vec4 position; // position of the vertex (and fragment) in world space varying vec3 varyingNormalDirection; // surface normal vector in world space #ifdef VERTEX void main() { mat4 modelMatrix = _Object2World; mat4 modelMatrixInverse = _World2Object; // unity_Scale.w // is unnecessary because we normalize vectors position = modelMatrix * gl_Vertex; varyingNormalDirection = normalize(vec3( vec4(gl_Normal, 0.0) * modelMatrixInverse)); gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex; } #endif #ifdef FRAGMENT void main() { vec3 normalDirection = normalize(varyingNormalDirection); vec3 viewDirection = normalize(_WorldSpaceCameraPos - vec3(position)); vec3 lightDirection = normalize(vec3(_WorldSpaceLightPos0)); vec3 ambientLighting = vec3(gl_LightModel.ambient) * vec3(_Color); vec3 diffuseReflection = vec3(_LightColor0) * vec3(_Color) * max(0.0, dot(normalDirection, lightDirection)); vec3 specularReflection; if (dot(normalDirection, lightDirection) < 0.0) // light source on the wrong side? 
```
         {
            specularReflection = vec3(0.0, 0.0, 0.0); // no specular reflection
         }
         else // light source on the right side
         {
            specularReflection = vec3(_LightColor0) * vec3(_SpecColor)
               * pow(max(0.0, dot(reflect(-lightDirection, normalDirection),
               viewDirection)), _Shininess);
         }

         gl_FragColor = vec4(ambientLighting
            + diffuseReflection + specularReflection, 1.0);
      }

      #endif

      ENDGLSL
   }

   Pass {
      Tags { "LightMode" = "ForwardAdd" }
         // pass for additional light sources
      Blend One One // additive blending

      GLSLPROGRAM
      #pragma multi_compile_lightpass

      // User-specified properties
      uniform vec4 _Color;
      uniform vec4 _SpecColor;
      uniform float _Shininess;

      // The following built-in uniforms (except _LightColor0)
      // are also defined in "UnityCG.glslinc",
      // i.e. one could #include "UnityCG.glslinc"
      uniform vec3 _WorldSpaceCameraPos; // camera position in world space
      uniform mat4 _Object2World; // model matrix
      uniform mat4 _World2Object; // inverse model matrix
      uniform vec4 _WorldSpaceLightPos0;
         // direction to or position of light source
      uniform vec4 _LightColor0;
         // color of light source (from Lighting.cginc)
      uniform mat4 _LightMatrix0; // transformation
         // from world to light space (from Autolight.cginc)
      #if defined DIRECTIONAL_COOKIE || defined SPOT
         uniform sampler2D _LightTexture0;
            // cookie alpha texture map (from Autolight.cginc)
      #elif defined POINT_COOKIE
         uniform samplerCube _LightTexture0;
            // cookie alpha texture map (from Autolight.cginc)
      #endif

      varying vec4 position;
         // position of the vertex (and fragment) in world space
      varying vec4 positionInLightSpace;
         // position of the vertex (and fragment) in light space
      varying vec3 varyingNormalDirection;
         // surface normal vector in world space

      #ifdef VERTEX

      void main()
      {
         mat4 modelMatrix = _Object2World;
         mat4 modelMatrixInverse = _World2Object; // unity_Scale.w
            // is unnecessary because we normalize vectors

         position = modelMatrix * gl_Vertex;
         positionInLightSpace = _LightMatrix0 * position;
         varyingNormalDirection = normalize(vec3(
            vec4(gl_Normal, 0.0) * modelMatrixInverse));
         gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;
      }

      #endif

      #ifdef FRAGMENT

      void main()
      {
         vec3 normalDirection = normalize(varyingNormalDirection);
         vec3 viewDirection =
            normalize(_WorldSpaceCameraPos - vec3(position));
         vec3 lightDirection;
         float attenuation = 1.0; // by default no attenuation with distance

         #if defined DIRECTIONAL || defined DIRECTIONAL_COOKIE
            lightDirection = normalize(vec3(_WorldSpaceLightPos0));
         #elif defined POINT_NOATT
            lightDirection =
               normalize(vec3(_WorldSpaceLightPos0 - position));
         #elif defined POINT || defined POINT_COOKIE || defined SPOT
            vec3 vertexToLightSource =
               vec3(_WorldSpaceLightPos0 - position);
            float distance = length(vertexToLightSource);
            attenuation = 1.0 / distance; // linear attenuation
            lightDirection = normalize(vertexToLightSource);
         #endif

         vec3 diffuseReflection =
            attenuation * vec3(_LightColor0) * vec3(_Color)
            * max(0.0, dot(normalDirection, lightDirection));

         vec3 specularReflection;
         if (dot(normalDirection, lightDirection) < 0.0)
            // light source on the wrong side?
         {
            specularReflection = vec3(0.0, 0.0, 0.0);
               // no specular reflection
         }
         else // light source on the right side
         {
            specularReflection = attenuation * vec3(_LightColor0)
               * vec3(_SpecColor) * pow(max(0.0, dot(
               reflect(-lightDirection, normalDirection),
               viewDirection)), _Shininess);
         }

         float cookieAttenuation = 1.0; // by default no cookie attenuation
         #if defined DIRECTIONAL_COOKIE
            cookieAttenuation = texture2D(_LightTexture0,
               vec2(positionInLightSpace)).a;
         #elif defined POINT_COOKIE
            cookieAttenuation = textureCube(_LightTexture0,
               vec3(positionInLightSpace)).a;
         #elif defined SPOT
            cookieAttenuation = texture2D(_LightTexture0,
               vec2(positionInLightSpace) / positionInLightSpace.w
               + vec2(0.5)).a;
         #endif

         gl_FragColor = vec4(cookieAttenuation
            * (diffuseReflection + specularReflection), 1.0);
      }

      #endif

      ENDGLSL
   }
}
// The definition of a fallback shader should be commented out
// during development:
// Fallback "Specular"
}
```

Note that the cookie for a point light source uses a cube texture map. This kind of texture map is discussed in Section “Reflecting Surfaces”.

### Summary[edit]

Congratulations, you have learned the most important aspects of projective texture mapping. We have seen:

- How to implement cookies for directional light sources.
- How to implement spotlights (with and without user-specified cookies).
- How to implement different shaders for different light sources.

### Further Reading[edit]

If you still want to know more

- about the shader version for lights without cookies, you should read Section “Smooth Specular Highlights”.
- about texture mapping and in particular alpha texture maps, you should read Section “Transparent Textures”.
- about projective texture mapping in fixed-function OpenGL, you could read NVIDIA's white paper “Projective Texture Mapping” by Cass Everitt (which is available online).

# Light Attenuation[edit]

This tutorial covers **textures for light attenuation** or, more generally speaking, textures as lookup tables.

It is based on Section “Cookies”. If you haven't read that tutorial yet, you should read it first.

### Texture Maps as Lookup Tables[edit]

One can think of a texture map as an approximation to a two-dimensional function that maps the texture coordinates to an RGBA color. If one of the two texture coordinates is kept fixed, the texture map can also represent a one-dimensional function. Thus, it is often possible to replace mathematical expressions that depend only on one or two variables by lookup tables in the form of texture maps. (The limitation is that the resolution of the texture map is limited by the size of the texture image and therefore the accuracy of a texture lookup might be insufficient.)

The main advantage of using such a texture lookup is a potential gain of performance: a texture lookup doesn't depend on the complexity of the mathematical expression but only on the size of the texture image (to a certain degree: the smaller the texture image the more efficient the caching up to the point where the whole texture fits into the cache). However, there is an overhead of using a texture lookup; thus, replacing simple mathematical expressions — including built-in functions — is usually pointless.

Which mathematical expressions should be replaced by texture lookups? Unfortunately, there is no general answer because it depends on the specific GPU whether a specific lookup is faster than evaluating a specific mathematical expression. However, one should keep in mind that a texture map is less simple (since it requires code to compute the lookup table), less explicit (since the mathematical function is encoded in a lookup table), less consistent with other mathematical expressions, and has a wider scope (since the texture is available in the whole fragment shader). These are good reasons to avoid lookup tables. However, the gains in performance might outweigh these reasons. In that case, it is a good idea to include comments that document how to achieve the same effect without the lookup table.
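The trade-offs above can be illustrated outside of a shader. The following standalone JavaScript sketch (plain JavaScript, not a Unity script; the function and table size are arbitrary choices for illustration) builds a one-dimensional lookup table by sampling a function at texel centers and then approximates the function with a nearest-texel lookup, mimicking what an unfiltered texture lookup does:

```javascript
// Build a 1D lookup table for f by sampling at "texel centers",
// analogous to filling a one-dimensional texture image.
function buildLookupTable(f, size) {
  var table = new Array(size);
  for (var i = 0; i < size; i++) {
    table[i] = f((i + 0.5) / size); // sample at the texel center
  }
  return table;
}

// Nearest-texel lookup with clamping, analogous to an
// unfiltered texture lookup with wrap mode set to clamp.
function lookup(table, x) {
  var i = Math.floor(x * table.length);
  i = Math.min(table.length - 1, Math.max(0, i));
  return table[i];
}

// Example: approximate the attenuation-like function (1 - x)^2.
var f = function (x) { return (1.0 - x) * (1.0 - x); };
var table = buildLookupTable(f, 256);
var approx = lookup(table, 0.5); // close to f(0.5) = 0.25
```

The accuracy of the approximation is limited by the table size, which corresponds to the limited resolution of a texture image mentioned above.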

### Unity's Texture Lookup for Light Attenuation[edit]

Unity actually uses a lookup texture `_LightTextureB0` internally for the light attenuation of point lights and spotlights. (Note that in some cases, e.g. point lights without cookie textures, this lookup texture is set to `_LightTexture0` without `B`. This case is ignored here.) In Section “Diffuse Reflection”, it was described how to implement linear attenuation: we compute an attenuation factor that includes one over the distance between the position of the light source in world space and the position of the rendered fragment in world space. In order to represent this distance, Unity uses the z coordinate in light space. Light space coordinates have been discussed in Section “Cookies”; here, it is only important that we can use the Unity-specific uniform matrix `_LightMatrix0` to transform a position from world space to light space. Analogously to the code in Section “Cookies”, we store the position in light space in the varying variable `positionInLightSpace`. We can then use the z coordinate of this varying variable to look up the attenuation factor in the alpha component of the texture `_LightTextureB0` in the fragment shader:

```
float distance = positionInLightSpace.z;
   // use z coordinate in light space as signed distance
attenuation = texture2D(_LightTextureB0, vec2(distance)).a;
   // texture lookup for attenuation
// alternative with linear attenuation:
// float distance = length(vertexToLightSource);
// attenuation = 1.0 / distance;
```

Using the texture lookup, we don't have to compute the length of a vector (which involves three squares and one square root) and we don't have to divide by this length. In fact, the actual attenuation function that is implemented in the lookup table is more complicated in order to avoid saturated colors at short distances. Thus, compared to a computation of this actual attenuation function, we save even more operations.

### Complete Shader Code[edit]

The shader code is based on the code of Section “Cookies”. The `ForwardBase` pass was slightly simplified by assuming that the light source is always directional without attenuation. The vertex shader of the `ForwardAdd` pass is identical to the code in Section “Cookies”, but the fragment shader includes the texture lookup for light attenuation, which is described above. However, the fragment shader lacks the cookie attenuation in order to focus on the attenuation with distance. It is straightforward (and a good exercise) to include the code for the cookie again.

```
Shader "GLSL light attenuation with texture lookup" {
   Properties {
      _Color ("Diffuse Material Color", Color) = (1,1,1,1)
      _SpecColor ("Specular Material Color", Color) = (1,1,1,1)
      _Shininess ("Shininess", Float) = 10
   }
   SubShader {
      Pass {
         Tags { "LightMode" = "ForwardBase" }
            // pass for ambient light and
            // first directional light source without attenuation

         GLSLPROGRAM

         // User-specified properties
         uniform vec4 _Color;
         uniform vec4 _SpecColor;
         uniform float _Shininess;

         // The following built-in uniforms (except _LightColor0)
         // are also defined in "UnityCG.glslinc",
         // i.e. one could #include "UnityCG.glslinc"
         uniform vec3 _WorldSpaceCameraPos; // camera position in world space
         uniform mat4 _Object2World; // model matrix
         uniform mat4 _World2Object; // inverse model matrix
         uniform vec4 _WorldSpaceLightPos0;
            // direction to or position of light source
         uniform vec4 _LightColor0;
            // color of light source (from Lighting.cginc)

         varying vec4 position;
            // position of the vertex (and fragment) in world space
         varying vec3 varyingNormalDirection;
            // surface normal vector in world space

         #ifdef VERTEX

         void main()
         {
            mat4 modelMatrix = _Object2World;
            mat4 modelMatrixInverse = _World2Object; // unity_Scale.w
               // is unnecessary because we normalize vectors

            position = modelMatrix * gl_Vertex;
            varyingNormalDirection = normalize(vec3(
               vec4(gl_Normal, 0.0) * modelMatrixInverse));
            gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;
         }

         #endif

         #ifdef FRAGMENT

         void main()
         {
            vec3 normalDirection = normalize(varyingNormalDirection);
            vec3 viewDirection =
               normalize(_WorldSpaceCameraPos - vec3(position));
            vec3 lightDirection = normalize(vec3(_WorldSpaceLightPos0));
               // we assume that the light source in ForwardBase pass
               // is a directional light source without attenuation

            vec3 ambientLighting =
               vec3(gl_LightModel.ambient) * vec3(_Color);

            vec3 diffuseReflection = vec3(_LightColor0) * vec3(_Color)
               * max(0.0, dot(normalDirection, lightDirection));

            vec3 specularReflection;
            if (dot(normalDirection, lightDirection) < 0.0)
               // light source on the wrong side?
            {
               specularReflection = vec3(0.0, 0.0, 0.0);
                  // no specular reflection
            }
            else // light source on the right side
            {
               specularReflection = vec3(_LightColor0) * vec3(_SpecColor)
                  * pow(max(0.0, dot(reflect(-lightDirection,
                  normalDirection), viewDirection)), _Shininess);
            }

            gl_FragColor = vec4(ambientLighting
               + diffuseReflection + specularReflection, 1.0);
         }

         #endif

         ENDGLSL
      }

      Pass {
         Tags { "LightMode" = "ForwardAdd" }
            // pass for additional light sources
         Blend One One // additive blending

         GLSLPROGRAM

         // User-specified properties
         uniform vec4 _Color;
         uniform vec4 _SpecColor;
         uniform float _Shininess;

         // The following built-in uniforms (except _LightColor0)
         // are also defined in "UnityCG.glslinc",
         // i.e. one could #include "UnityCG.glslinc"
         uniform vec3 _WorldSpaceCameraPos; // camera position in world space
         uniform mat4 _Object2World; // model matrix
         uniform mat4 _World2Object; // inverse model matrix
         uniform vec4 _WorldSpaceLightPos0;
            // direction to or position of light source
         uniform vec4 _LightColor0;
            // color of light source (from Lighting.cginc)
         uniform mat4 _LightMatrix0; // transformation
            // from world to light space (from Autolight.cginc)
         uniform sampler2D _LightTextureB0;
            // texture lookup (from Autolight.cginc)

         varying vec4 position;
            // position of the vertex (and fragment) in world space
         varying vec4 positionInLightSpace;
            // position of the vertex (and fragment) in light space
         varying vec3 varyingNormalDirection;
            // surface normal vector in world space

         #ifdef VERTEX

         void main()
         {
            mat4 modelMatrix = _Object2World;
            mat4 modelMatrixInverse = _World2Object; // unity_Scale.w
               // is unnecessary because we normalize vectors

            position = modelMatrix * gl_Vertex;
            positionInLightSpace = _LightMatrix0 * position;
            varyingNormalDirection = normalize(vec3(
               vec4(gl_Normal, 0.0) * modelMatrixInverse));
            gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;
         }

         #endif

         #ifdef FRAGMENT

         void main()
         {
            vec3 normalDirection = normalize(varyingNormalDirection);
            vec3 viewDirection =
               normalize(_WorldSpaceCameraPos - vec3(position));
            vec3 lightDirection;
            float attenuation;

            if (0.0 == _WorldSpaceLightPos0.w) // directional light?
            {
               attenuation = 1.0; // no attenuation
               lightDirection = normalize(vec3(_WorldSpaceLightPos0));
            }
            else // point or spot light
            {
               vec3 vertexToLightSource =
                  vec3(_WorldSpaceLightPos0 - position);
               lightDirection = normalize(vertexToLightSource);

               float distance = positionInLightSpace.z;
                  // use z coordinate in light space as signed distance
               attenuation =
                  texture2D(_LightTextureB0, vec2(distance)).a;
                  // texture lookup for attenuation
               // alternative with linear attenuation:
               // float distance = length(vertexToLightSource);
               // attenuation = 1.0 / distance;
            }

            vec3 diffuseReflection =
               attenuation * vec3(_LightColor0) * vec3(_Color)
               * max(0.0, dot(normalDirection, lightDirection));

            vec3 specularReflection;
            if (dot(normalDirection, lightDirection) < 0.0)
               // light source on the wrong side?
            {
               specularReflection = vec3(0.0, 0.0, 0.0);
                  // no specular reflection
            }
            else // light source on the right side
            {
               specularReflection = attenuation * vec3(_LightColor0)
                  * vec3(_SpecColor) * pow(max(0.0, dot(
                  reflect(-lightDirection, normalDirection),
                  viewDirection)), _Shininess);
            }

            gl_FragColor =
               vec4(diffuseReflection + specularReflection, 1.0);
         }

         #endif

         ENDGLSL
      }
   }
   // The definition of a fallback shader should be commented out
   // during development:
   // Fallback "Specular"
}
```

If you compare the lighting computed by this shader with the lighting of a built-in shader, you will notice a difference in intensity by a factor of about 2 to 4. However, this is mainly due to additional constant factors in the built-in shaders. It is straightforward to introduce similar constant factors in the code above.

It should be noted that the z coordinate in light space is not equal to the distance from the light source; it's not even proportional to that distance. In fact, the meaning of the coordinate depends on the matrix `_LightMatrix0`, which is an undocumented feature of Unity and can therefore change anytime. However, it is rather safe to assume that a value of 0 corresponds to very close positions and a value of 1 corresponds to farther positions.

Also note that point lights without cookie textures specify the attenuation lookup texture in `_LightTexture0` instead of `_LightTextureB0`; thus, the code above doesn't work for them. Moreover, the code doesn't check the sign of the z coordinate, which is fine for spotlights but results in a lack of attenuation on one side of point light sources.

### Computing Lookup Textures[edit]

So far, we have used a lookup texture that is provided by Unity. If Unity didn't provide us with the texture in `_LightTextureB0`, we would have to compute such a texture ourselves. Here is some JavaScript code to compute a similar lookup texture. In order to use it, you have to change the name `_LightTextureB0` to `_LookupTexture` in the shader code and attach the following JavaScript to any game object with the corresponding material:

```javascript
@script ExecuteInEditMode()

public var upToDate : boolean = false;

function Start()
{
   upToDate = false;
}

function Update()
{
   if (!upToDate) // is lookup texture not up to date?
   {
      upToDate = true;
      var texture = new Texture2D(16, 16);
         // width = 16 texels, height = 16 texels
      texture.filterMode = FilterMode.Bilinear;
      texture.wrapMode = TextureWrapMode.Clamp;
      renderer.sharedMaterial.SetTexture("_LookupTexture", texture);
         // "_LookupTexture" has to correspond to the name
         // of the uniform sampler2D variable in the shader

      for (var j : int = 0; j < texture.height; j++)
      {
         for (var i : int = 0; i < texture.width; i++)
         {
            var x : float = (i + 0.5) / texture.width;
               // first texture coordinate
            var y : float = (j + 0.5) / texture.height;
               // second texture coordinate
            var color = Color(0.0, 0.0, 0.0, (1.0 - x) * (1.0 - x));
               // set RGBA of texels
            texture.SetPixel(i, j, color);
         }
      }
      texture.Apply(); // apply all the texture.SetPixel(...) commands
   }
}
```

In this code, `i` and `j` enumerate the texels of the texture image, while `x` and `y` represent the corresponding texture coordinates. The function `(1.0-x)*(1.0-x)` for the alpha component of the texture image happens to produce results similar to Unity's lookup texture.

Note that the lookup texture should not be computed in every frame. Rather it should be computed only when necessary. If a lookup texture depends on additional parameters, then the texture should only be recomputed if any parameter has been changed. This can be achieved by storing the parameter values for which a lookup texture has been computed and continuously checking whether any of the new parameters are different from these stored values. If this is the case, the lookup texture has to be recomputed.
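The recompute-only-when-needed logic can be sketched in plain JavaScript, independent of Unity (all names here are illustrative): the stored parameter value acts as a cache key, and the table is rebuilt only when a parameter differs from the stored one:

```javascript
// Rebuild an expensive lookup table only when its parameter changes.
// "computeTable" stands in for the texture computation above.
var cachedParam = null;  // parameter value the current table was built for
var cachedTable = null;
var rebuildCount = 0;    // only for demonstration

function computeTable(param, size) {
  rebuildCount++;
  var table = new Array(size);
  for (var i = 0; i < size; i++) {
    var x = (i + 0.5) / size;         // texel-center coordinate
    table[i] = Math.pow(1.0 - x, param); // parameterized falloff
  }
  return table;
}

function getTable(param) {
  // recompute only if there is no table yet or the parameter changed
  if (cachedTable === null || param !== cachedParam) {
    cachedParam = param;
    cachedTable = computeTable(param, 16);
  }
  return cachedTable;
}
```

With several parameters, one would store and compare all of them; the structure of the check stays the same.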

### Summary[edit]

Congratulations, you have reached the end of this tutorial. We have seen:

- How to use the built-in texture `_LightTextureB0` as a lookup table for light attenuation.
- How to compute your own lookup textures in JavaScript.

### Further Reading[edit]

If you still want to know more

- about light attenuation for light sources, you should read Section “Diffuse Reflection”.
- about basic texture mapping, you should read Section “Textured Spheres”.
- about coordinates in light space, you should read Section “Cookies”.
- about the SECS principles (simple, explicit, consistent, minimal scope), you could read Chapter 3 of David Straker's book “C Style: Standards and Guidelines”, published by Prentice-Hall in 1991, which is available online.

# Projectors[edit]

This tutorial covers **projective texture mapping for projectors**, which are particular rendering components of Unity.

It is based on Section “Cookies”. If you haven't read that tutorial yet, you should read it first.

### Unity's Projectors[edit]

Unity's projectors are somewhat similar to spotlights. In fact, they can be used for similar applications. There is, however, an important technical difference: For spotlights, the shaders of all lit objects have to compute the lighting by the spotlight as discussed in Section “Cookies”. If the shader of an object ignores the spotlight, it just won't be lit by the spotlight. This is different for projectors: Each projector is associated with a material with a shader that is applied to any object in the projector's range. Thus, an object's shader doesn't need to deal with the projector; instead, the projector applies its shader to all objects in its range as an additional render pass in order to achieve certain effects, e.g. adding the light of a projected image or attenuating the color of an object to fake a shadow. In fact, various effects can be achieved by using different blend equations of the projector's shader. (Blend equations are discussed in Section “Transparency”.)

One might even consider projectors as the more “natural” way of implementing lights. However, the interaction between light and materials is usually specific to each material while the single shader of a projector cannot deal with all these differences. This limits the possibilities of projectors to three basic behaviors: adding light to an object, modulating an object's color, or both, adding light and modulating the object's color. We will look at adding light to an object and attenuating an object's colors as an example of modulating them.

### Projectors for Adding Light[edit]

In order to create a projector, choose **GameObject > Create Empty** from the main menu and then (with the new object still selected) **Component > Effects > Projector** from the main menu. You now have a projector that can be manipulated similarly to a spotlight. The settings of the projector in the **Inspector View** are discussed in Unity's reference manual. Here, the only important setting is the projector's **Material**, which will be applied to all objects in its range. Thus, we have to create another material and assign a suitable shader to it. This shader usually doesn't have access to the materials of the game objects to which it is applied; therefore, it doesn't have access to their textures etc. Neither does it have access to any information about light sources. However, it has access to the attributes of the vertices of the game objects and its own shader properties.

A shader to add light to objects could be used to project any image onto other objects, similarly to an overhead projector or a movie projector. Thus, it should use a texture image similar to a cookie for spotlights (see Section “Cookies”) except that the RGB colors of the texture image should be added to allow for colored projections. We achieve this by setting the fragment color to the RGBA color of the texture image and using the blend equation

`Blend One One`

which just adds the fragment color to the color in the framebuffer. (Depending on the texture image, it might be better to use `Blend SrcAlpha One` in order to remove any colors with zero opacity.)

Another difference to the cookies of spotlights is that we should use the Unity-specific uniform matrix `_Projector` to transform positions from object space to projector space instead of the matrix `_LightMatrix0`. However, coordinates in projector space work very similarly to coordinates in light space, except that the resulting x and y coordinates are already in the correct range; thus, we don't have to bother with adding 0.5. Nonetheless, we have to perform the division by the w coordinate (as always for projective texture mapping), either by explicitly dividing x and y by w or by using `texture2DProj`:

```
Shader "GLSL projector shader for adding light" {
   Properties {
      _ShadowTex ("Projected Image", 2D) = "white" {}
   }
   SubShader {
      Pass {
         Blend One One
            // add color of _ShadowTex to the color in the framebuffer

         GLSLPROGRAM

         // User-specified properties
         uniform sampler2D _ShadowTex;

         // Projector-specific uniforms
         uniform mat4 _Projector; // transformation matrix
            // from object space to projector space

         varying vec4 positionInProjSpace;
            // position of the vertex (and fragment) in projector space

         #ifdef VERTEX

         void main()
         {
            positionInProjSpace = _Projector * gl_Vertex;
            gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;
         }

         #endif

         #ifdef FRAGMENT

         void main()
         {
            if (positionInProjSpace.w > 0.0) // in front of projector?
            {
               gl_FragColor = texture2D(_ShadowTex,
                  vec2(positionInProjSpace) / positionInProjSpace.w);
               // alternatively: gl_FragColor = texture2DProj(
               //    _ShadowTex, vec3(positionInProjSpace));
            }
            else // behind projector
            {
               gl_FragColor = vec4(0.0);
            }
         }

         #endif

         ENDGLSL
      }
   }
   // The definition of a fallback shader should be commented out
   // during development:
   // Fallback "Projector/Light"
}
```

Notice that we have to test whether the w coordinate is positive (i.e. the fragment is in front of the projector, not behind it). Without this test, the projector would also add light to objects behind it. Furthermore, the texture image has to be square, and it is usually a good idea to use textures with the wrap mode set to clamp.
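The perspective divide and the w test can be checked with plain numbers. The following JavaScript sketch (standalone, not Unity code; the input values are made up for illustration) mirrors the fragment shader's logic:

```javascript
// Projective texture lookup coordinates: divide x and y by w,
// but only for fragments in front of the projector (w > 0).
function projectorTexCoords(posInProjSpace) {
  var x = posInProjSpace[0];
  var y = posInProjSpace[1];
  var w = posInProjSpace[3];
  if (w > 0.0) {
    return [x / w, y / w]; // texture coordinates inside the frustum
  }
  return null; // behind the projector: no lookup
}

// Example: a point at [1, 2, 0, 4] in projector space
// maps to texture coordinates [0.25, 0.5].
var uv = projectorTexCoords([1.0, 2.0, 0.0, 4.0]);
```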

Just in case you wondered: the shader property for the texture is called `_ShadowTex` in order to be compatible with the built-in shaders for projectors.

### Projectors for Modulating Colors[edit]

The basic steps of creating a projector for modulating colors are the same as above. The only difference is the shader code. The following example adds a drop shadow by attenuating colors, in particular the floor's color. Note that in an actual application, the color of the shadow caster should not be attenuated. This can be achieved by assigning the shadow caster to a particular **Layer** (in the **Inspector View** of the game object) and specifying this layer under **Ignore Layers** in the **Inspector View** of the projector.

In order to give the shadow a certain shape, we use the alpha component of a texture image to determine how dark the shadow is. (Thus, we can use the cookie textures for lights in the standard assets.) In order to attenuate the color in the framebuffer, we should multiply it with 1 minus alpha (i.e. factor 0 for alpha equals 1). Therefore, the appropriate blend equation is:

`Blend Zero OneMinusSrcAlpha`

The `Zero` indicates that we don't add any light. Even if the shadow is too dark, no light should be added; instead, the alpha component should be reduced in the fragment shader, e.g. by multiplying it with a factor less than 1. For an independent modulation of the color components in the framebuffer, we would require `Blend Zero SrcColor` or `Blend Zero OneMinusSrcColor`.
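The effect of these blend settings can be verified with a little arithmetic. This plain JavaScript sketch (illustrative values only) evaluates the generic blend equation, result = source × sourceFactor + destination × destinationFactor, for the two configurations used in this tutorial:

```javascript
// Blend one color channel: result = src * srcFactor + dst * dstFactor.
function blendChannel(src, dst, srcFactor, dstFactor) {
  return src * srcFactor + dst * dstFactor;
}

// "Blend Zero OneMinusSrcAlpha": attenuate the framebuffer color
// by 1 minus the alpha of the fragment; no light is added.
function blendShadow(dstColor, srcAlpha) {
  return dstColor.map(function (dst) {
    return blendChannel(0.0, dst, 0.0, 1.0 - srcAlpha);
  });
}

// "Blend One One": add the fragment color to the framebuffer color.
function blendAdditive(srcColor, dstColor) {
  return srcColor.map(function (src, i) {
    return blendChannel(src, dstColor[i], 1.0, 1.0);
  });
}
```

For example, a shadow fragment with alpha 0.75 scales a framebuffer color of [0.8, 0.6, 0.4] down to [0.2, 0.15, 0.1], i.e. a rather dark shadow.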

The different blend equation is actually about the only change in the shader code compared to the version for adding light:

```
Shader "GLSL projector shader for drop shadows" {
   Properties {
      _ShadowTex ("Shadow Shape", 2D) = "white" {}
   }
   SubShader {
      Pass {
         Blend Zero OneMinusSrcAlpha // attenuate color in framebuffer
            // by 1 minus alpha of _ShadowTex

         GLSLPROGRAM

         // User-specified properties
         uniform sampler2D _ShadowTex;

         // Projector-specific uniforms
         uniform mat4 _Projector; // transformation matrix
            // from object space to projector space

         varying vec4 positionInProjSpace;
            // position of the vertex (and fragment) in projector space

         #ifdef VERTEX

         void main()
         {
            positionInProjSpace = _Projector * gl_Vertex;
            gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;
         }

         #endif

         #ifdef FRAGMENT

         void main()
         {
            if (positionInProjSpace.w > 0.0) // in front of projector?
            {
               gl_FragColor = texture2D(_ShadowTex,
                  vec2(positionInProjSpace) / positionInProjSpace.w);
               // alternatively: gl_FragColor = texture2DProj(
               //    _ShadowTex, vec3(positionInProjSpace));
            }
            else // behind projector
            {
               gl_FragColor = vec4(0.0);
            }
         }

         #endif

         ENDGLSL
      }
   }
   // The definition of a fallback shader should be commented out
   // during development:
   // Fallback "Projector/Light"
}
```

### Summary[edit]

Congratulations, this is the end of this tutorial. We have seen:

- How Unity's projectors work.
- How to implement a shader for a projector to add light to objects.
- How to implement a shader for a projector to attenuate objects' colors.

### Further Reading[edit]

If you still want to know more

- about the light space (which is very similar to projector space), you should read Section “Cookies”.
- about texture mapping and in particular alpha texture maps, you should read Section “Transparent Textures”.
- about projective texture mapping in fixed-function OpenGL, you could read NVIDIA's white paper “Projective Texture Mapping” by Cass Everitt (which is available online).
- about Unity's projectors, you should read Unity's documentation about projectors.

# Reflecting Surfaces[edit]

This tutorial introduces **reflection mapping** (and **cube maps** to implement it).

It's the first in a small series of tutorials about environment mapping using cube maps in Unity. The tutorial is based on the per-pixel lighting described in Section “Smooth Specular Highlights” and on the concept of texture mapping, which was introduced in Section “Textured Spheres”.

### Reflection Mapping with a Skybox[edit]

The illustration to the left depicts the concept of reflection mapping with a static skybox: a view ray is reflected at a point on the surface of an object and the reflected ray is intersected with the skybox to determine the color of the corresponding pixel. The skybox is just a large cube with textured faces surrounding the whole scene. It should be noted that skyboxes are usually static and don't include any dynamic objects of the scene. However, “skyboxes” for reflection mapping are often rendered to include the scene from a certain point of view. This is, however, beyond the scope of this tutorial.

Moreover, this tutorial covers only the computation of the reflection, it doesn't cover the rendering of the skybox, which is discussed in Section “Skyboxes”. For the reflection of a skybox in an object, we have to render the object and reflect the rays from the camera to the surface points at the surface normal vectors. The mathematics of this reflection is the same as for the reflection of a light ray at a surface normal vector, which was discussed in Section “Specular Highlights”.

Once we have the reflected ray, its intersection with a large skybox has to be computed. This computation actually becomes easier if the skybox is infinitely large: in that case the position of the surface point doesn't matter at all, since its distance from the origin of the coordinate system is infinitely small compared to the size of the skybox; thus, only the direction of the reflected ray matters but not its position. Therefore, we can actually also think of a ray that starts in the center of a small skybox instead of a ray that starts somewhere in an infinitely large skybox. (If you are not familiar with this idea, you probably need a bit of time to accept it.) Depending on the direction of the reflected ray, it will intersect one of the six faces of the textured skybox. We could compute which face is intersected and where it is intersected, and then do a texture lookup (see Section “Textured Spheres”) in the texture image for the specific face. However, GLSL offers cube maps, which support exactly this kind of texture lookup in the six faces of a cube using a direction vector. Thus, all we need to do is provide a cube map for the environment as a shader property and use the `textureCube` instruction with the reflected direction to get the color at the corresponding position in the cube map.
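The face selection that `textureCube` performs internally can be mimicked in plain JavaScript. This is a sketch of the principle (the component of the direction with the largest magnitude selects the face), not of any actual hardware implementation:

```javascript
// Determine which cube-map face a direction vector points to:
// the coordinate with the largest magnitude selects the axis,
// and its sign selects the positive or negative face.
function cubeMapFace(dir) {
  var ax = Math.abs(dir[0]);
  var ay = Math.abs(dir[1]);
  var az = Math.abs(dir[2]);
  if (ax >= ay && ax >= az) return dir[0] >= 0.0 ? "+X" : "-X";
  if (ay >= az)             return dir[1] >= 0.0 ? "+Y" : "-Y";
  return dir[2] >= 0.0 ? "+Z" : "-Z";
}

// Example: a ray reflected mostly downwards hits the -Y face.
var face = cubeMapFace([0.2, -0.9, 0.1]);
```

Within the selected face, the remaining two coordinates (divided by the magnitude of the largest one) determine where the texture image of that face is sampled.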

### Cube Maps[edit]

A cube map shader property called `_Cube` can be defined this way in a Unity shader:

```
Properties {
   _Cube ("Reflection Map", Cube) = "" {}
}
```

The corresponding uniform variable is defined this way in a GLSL shader:

```
uniform samplerCube _Cube;
```

To create a cube map, select **Create > Cubemap** in the **Project View**. Then you have to specify six texture images for the faces of the cube in the **Inspector View**. Examples for such textures can be found in **Standard Assets > Skyboxes > Textures**. Furthermore, you should check **MipMaps** in the **Inspector View** for the cube map; this should produce considerably better visual results for reflection mapping.

The vertex shader has to compute the view direction `viewDirection` and the normal direction `normalDirection` as discussed in Section “Specular Highlights”. To reflect the view direction in the fragment shader, we can use the GLSL function `reflect`, as also discussed in Section “Specular Highlights”:

```
vec3 reflectedDirection =
   reflect(viewDirection, normalize(normalDirection));
```

And to perform the texture lookup in the cube map and store the resulting color in the fragment color, we simply use:

```
gl_FragColor = textureCube(_Cube, reflectedDirection);
```

That's about it.
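The reflection itself is ordinary vector arithmetic. This plain JavaScript sketch implements the same formula as GLSL's `reflect`, namely R = I − 2 (N · I) N with a normalized normal vector N:

```javascript
function dot(a, b) {
  return a[0] * b[0] + a[1] * b[1] + a[2] * b[2];
}

// Same formula as GLSL's reflect(I, N): R = I - 2 * dot(N, I) * N.
// N must be normalized; I points from the camera towards the surface.
function reflect(I, N) {
  var d = 2.0 * dot(N, I);
  return [I[0] - d * N[0], I[1] - d * N[1], I[2] - d * N[2]];
}

// Example: a ray going down-right, reflected at an upward-facing
// surface (N = [0, 1, 0]), continues up-right: [1, 1, 0].
var R = reflect([1.0, -1.0, 0.0], [0.0, 1.0, 0.0]);
```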

### Complete Shader Code[edit]

The shader code then becomes:

```
Shader "GLSL shader with reflection map" {
   Properties {
      _Cube ("Reflection Map", Cube) = "" {}
   }
   SubShader {
      Pass {
         GLSLPROGRAM

         // User-specified uniforms
         uniform samplerCube _Cube;

         // The following built-in uniforms
         // are also defined in "UnityCG.glslinc",
         // i.e. one could #include "UnityCG.glslinc"
         uniform vec3 _WorldSpaceCameraPos; // camera position in world space
         uniform mat4 _Object2World; // model matrix
         uniform mat4 _World2Object; // inverse model matrix

         // Varyings
         varying vec3 normalDirection;
         varying vec3 viewDirection;

         #ifdef VERTEX

         void main()
         {
            mat4 modelMatrix = _Object2World;
            mat4 modelMatrixInverse = _World2Object; // unity_Scale.w
               // is unnecessary because we normalize vectors

            normalDirection = normalize(vec3(
               vec4(gl_Normal, 0.0) * modelMatrixInverse));
            viewDirection = vec3(modelMatrix * gl_Vertex
               - vec4(_WorldSpaceCameraPos, 1.0));
            gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;
         }

         #endif

         #ifdef FRAGMENT

         void main()
         {
            vec3 reflectedDirection = reflect(viewDirection,
               normalize(normalDirection));
            gl_FragColor = textureCube(_Cube, reflectedDirection);
         }

         #endif

         ENDGLSL
      }
   }
}
```

### Summary[edit]

Congratulations! You have reached the end of the first tutorial on environment maps. We have seen:

- How to compute the reflection of a skybox in an object.
- How to generate cube maps in Unity and how to use them for reflection mapping.

### Further Reading[edit]

If you still want to know more

- about the reflection of vectors, you should read Section “Specular Highlights”.
- about cube maps in Unity, you should read Unity's documentation about cube maps.

# Curved Glass[edit]

This tutorial covers **refraction mapping** and its implementation with cube maps.

It is a variation of Section “Reflecting Surfaces”, which should be read first.

### Refraction Mapping[edit]

In Section “Reflecting Surfaces”, we reflected view rays and then performed texture lookups in a cube map in the reflected direction. Here, we refract view rays at a curved, transparent surface and then perform the lookups with the refracted direction. The effect ignores the second refraction that occurs when a ray leaves the transparent object again; however, hardly anyone will notice the difference, since we rarely pay close attention to the exact appearance of refractions in daily life.

Instead of the `reflect` function, we use the `refract` function; thus, the fragment shader could be:

#ifdef FRAGMENT

void main()
{
   float refractiveIndex = 1.5;
   vec3 refractedDirection = refract(normalize(viewDirection),
      normalize(normalDirection), 1.0 / refractiveIndex);
   gl_FragColor = textureCube(_Cube, refractedDirection);
}

#endif

Note that `refract` takes a third argument, which is the ratio of refractive indices: the refractive index of the outside medium (e.g. 1.0 for air) divided by the refractive index of the object (e.g. 1.5 for some kinds of glass). Also note that the first argument has to be normalized, which isn't necessary for `reflect`.
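If you want to verify this behavior outside a shader, the `refract` formula from the GLSL specification is easy to reproduce. A minimal Python sketch (the function name mirrors the GLSL built-in; this is not Unity code):

```python
import math

def refract(incident, normal, eta):
    # GLSL-style refraction for normalized 'incident' and 'normal';
    # 'eta' is the ratio of refractive indices (outside / inside).
    d = sum(n * i for n, i in zip(normal, incident))
    k = 1.0 - eta * eta * (1.0 - d * d)
    if k < 0.0:
        return (0.0, 0.0, 0.0)  # total internal reflection
    scale = eta * d + math.sqrt(k)
    return tuple(eta * i - scale * n for i, n in zip(incident, normal))

# Head-on rays pass through without bending:
print(refract((0.0, 0.0, -1.0), (0.0, 0.0, 1.0), 1.0 / 1.5))
```

Note the total internal reflection case: when the term under the square root becomes negative, `refract` returns a zero vector.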

### Complete Shader Code[edit]

With the adapted fragment shader, the complete shader code becomes:

Shader "GLSL shader with refraction mapping" {
   Properties {
      _Cube ("Environment Map", Cube) = "" {}
   }
   SubShader {
      Pass {
         GLSLPROGRAM

         // User-specified uniforms
         uniform samplerCube _Cube;

         // The following built-in uniforms
         // are also defined in "UnityCG.glslinc",
         // i.e. one could #include "UnityCG.glslinc"
         uniform vec3 _WorldSpaceCameraPos;
            // camera position in world space
         uniform mat4 _Object2World; // model matrix
         uniform mat4 _World2Object; // inverse model matrix

         // Varyings
         varying vec3 normalDirection;
         varying vec3 viewDirection;

         #ifdef VERTEX

         void main()
         {
            mat4 modelMatrix = _Object2World;
            mat4 modelMatrixInverse = _World2Object; // unity_Scale.w
               // is unnecessary because we normalize vectors

            normalDirection = normalize(vec3(
               vec4(gl_Normal, 0.0) * modelMatrixInverse));
            viewDirection = vec3(modelMatrix * gl_Vertex
               - vec4(_WorldSpaceCameraPos, 1.0));

            gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;
         }

         #endif

         #ifdef FRAGMENT

         void main()
         {
            float refractiveIndex = 1.5;
            vec3 refractedDirection = refract(normalize(viewDirection),
               normalize(normalDirection), 1.0 / refractiveIndex);
            gl_FragColor = textureCube(_Cube, refractedDirection);
         }

         #endif

         ENDGLSL
      }
   }
}

### Summary[edit]

Congratulations. This is the end of another tutorial. We have seen:

- How to adapt reflection mapping to refraction mapping using the `refract` instruction.

### Further Reading[edit]

If you still want to know more

- about reflection mapping and cube maps, you should read Section “Reflecting Surfaces”.
- about the `refract` instruction, you could look it up in the “OpenGL ES Shading Language 1.0.17 Specification” available at the “Khronos OpenGL ES API Registry”.

# Skyboxes[edit]

This tutorial covers the rendering of **environment maps as backgrounds** with the help of cube maps.

It is based on Section “Reflecting Surfaces”. If you haven't read that tutorial, this would be a very good time to read it.

### Rendering a Skybox in the Background[edit]

As explained in Section “Reflecting Surfaces”, a skybox can be thought of as an infinitely large, textured box that surrounds a scene. Sometimes, skyboxes (or skydomes) are implemented by sufficiently large textured models, which approximate an infinitely large box (or dome). However, Section “Reflecting Surfaces” introduced the concept of a cube map, which actually represents an infinitely large box; thus, we don't need the approximation of a box or a dome of limited size. Instead, we can render any screen-filling model (it doesn't matter whether it is a box, a dome, or an apple tree as long as it covers the whole background), compute the view vector from the camera to the rasterized surface point in the vertex shader (as we did in Section “Reflecting Surfaces”) and then perform a lookup in the cube map with this view vector (instead of the reflected view vector in Section “Reflecting Surfaces”) in the fragment shader:

#ifdef FRAGMENT

void main()
{
   gl_FragColor = textureCube(_Cube, viewDirection);
}

#endif

For best performance we should, of course, render a model with only a few vertices and each pixel should be rasterized only once. Thus, rendering the inside of a cube that surrounds the camera (or the whole scene) is fine.

### Complete Shader Code[edit]

The shader should be attached to a material, which should be attached to a cube that surrounds the camera. In the shader code, we deactivate writing to the depth buffer with `ZWrite Off` such that no objects are occluded by the skybox. (See the description of the depth test in Section “Per-Fragment Operations”.) Front-face culling is activated with `Cull Front` such that only the “inside” of the cube is rasterized. (See Section “Cutaways”.) The line `Tags { "Queue" = "Background" }` instructs Unity to render this pass before other objects are rendered.

Shader "GLSL shader for skybox" {
   Properties {
      _Cube ("Environment Map", Cube) = "" {}
   }
   SubShader {
      Tags { "Queue" = "Background" }
      Pass {
         ZWrite Off
         Cull Front
         GLSLPROGRAM

         // User-specified uniform
         uniform samplerCube _Cube;

         // The following built-in uniforms
         // are also defined in "UnityCG.glslinc",
         // i.e. one could #include "UnityCG.glslinc"
         uniform vec3 _WorldSpaceCameraPos;
            // camera position in world space
         uniform mat4 _Object2World; // model matrix

         // Varying
         varying vec3 viewDirection;

         #ifdef VERTEX

         void main()
         {
            mat4 modelMatrix = _Object2World;

            viewDirection = vec3(modelMatrix * gl_Vertex
               - vec4(_WorldSpaceCameraPos, 1.0));

            gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;
         }

         #endif

         #ifdef FRAGMENT

         void main()
         {
            gl_FragColor = textureCube(_Cube, viewDirection);
         }

         #endif

         ENDGLSL
      }
   }
}

### Shader Code for Unity's Skybox System[edit]

The shader above illustrates how to render a skybox by rendering a cube around the camera with a specific shader. This is a very general approach. Unity, however, has its own skybox system that doesn't require any game object: you just specify the material with the skybox shader in the main menu **Edit > Render Settings > Skybox Material** and Unity takes care of the rest. Unfortunately, we cannot use our shader as it is for this system: the lookup in the cube map has to use the direction that is specified by the vertex texture coordinates instead of the computed view direction. This is actually even easier than computing the view direction. Here is the code:

Shader "GLSL shader for skybox" {
   Properties {
      _Cube ("Environment Map", Cube) = "" {}
   }
   SubShader {
      Tags { "Queue" = "Background" }
      Pass {
         ZWrite Off
         Cull Off
         GLSLPROGRAM

         // User-specified uniform
         uniform samplerCube _Cube;

         // Varying
         varying vec3 texCoords;

         #ifdef VERTEX

         void main()
         {
            texCoords = vec3(gl_MultiTexCoord0);
               // note: GLSL uses constructor syntax,
               // not C-style casts like (vec3)
            gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;
         }

         #endif

         #ifdef FRAGMENT

         void main()
         {
            gl_FragColor = textureCube(_Cube, texCoords);
         }

         #endif

         ENDGLSL
      }
   }
}

As mentioned above, you should create a material with this shader and drag the material to **Edit > Render Settings > Skybox Material**. There is no need to attach the material to any game object.

### Summary[edit]

Congratulations, you have reached the end of another tutorial! We have seen:

- How to render skyboxes.
- How to render skyboxes in Unity without game objects.

### Further Reading[edit]

If you still want to know more

- about cube maps and reflections of skyboxes in objects, you should read Section “Reflecting Surfaces”.
- about lighting that is consistent with a skybox, you should read Section “Many Light Sources”.

# Many Light Sources[edit]

This tutorial introduces **image-based lighting**, in particular **diffuse (irradiance) environment mapping** and its implementation with cube maps.

This tutorial is based on Section “Reflecting Surfaces”. If you haven't read that tutorial, this would be a very good time to read it.

### Diffuse Lighting by Many Lights[edit]

Consider the lighting of the sculpture in the image to the left. There is natural light coming through the windows. Some of this light bounces off the floor, walls, and visitors before reaching the sculpture. Additionally, there are artificial light sources, and their light also shines directly and indirectly onto the sculpture. How many directional lights and point lights would be needed to simulate this kind of complex lighting environment convincingly? At least a handful, probably more than a dozen; computing the lighting for that many light sources in real time is therefore challenging.

This problem is addressed by image-based lighting. For static lighting environments that are described by an environment map, e.g. a cube map, image-based lighting allows us to compute the lighting by an arbitrary number of light sources with a single texture lookup in a cube map (see Section “Reflecting Surfaces” for a description of cube maps). How does it work?

In this section we focus on diffuse lighting. Assume that every texel (i.e. pixel) of a cube map acts as a directional light source. (Remember that cube maps are usually assumed to be infinitely large such that only directions matter, but positions don't.) The resulting lighting for a given surface normal direction can be computed as described in Section “Diffuse Reflection”. It's basically the cosine between the surface normal vector **N** and the vector to the light source **L**, clamped to non-negative values:

max(0, **N** · **L**)

Since the texels are the light sources, **L** is just the direction from the center of the cube to the center of the texel in the cube map. A small cube map with 32×32 texels per face already has 32×32×6 = 6144 texels. Adding the illumination of thousands of light sources is not going to work in real time. However, for a static cube map we can compute the diffuse illumination for all possible surface normal vectors **N** in advance and store it in a lookup table. When lighting a point on a surface with a specific surface normal vector, we can then just look up the diffuse illumination for that surface normal vector **N** in the precomputed lookup table.
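The idea of this precomputation can be sketched independently of Unity. The following Python snippet (the helper name `diffuse_irradiance` and the cube-face layout are illustrative assumptions, not Unity's exact convention) integrates the cosine-weighted illumination from a grid of directions — standing in for cube-map texels — and normalizes by the total weight:

```python
import itertools, math

def diffuse_irradiance(normal, environment, n=16):
    # Integrate cosine-weighted illumination over a coarse grid of
    # directions on a cube (a stand-in for cube-map texels) and
    # normalize by the total weight; 'environment' maps a direction
    # to an intensity, standing in for a cube-map lookup.
    total, color = 0.0, 0.0
    for face in range(6):
        axis, sign = face // 2, (1.0 if face % 2 == 0 else -1.0)
        for i, j in itertools.product(range(n), repeat=2):
            d = [(i + 0.5) / n - 0.5, (j + 0.5) / n - 0.5]
            d.insert(axis, sign * 0.5)  # point on the unit cube's face
            length = math.sqrt(sum(c * c for c in d))
            # weight by texel distance and tilt relative to its face
            w = (1.0 / (length * length)) * (0.5 / length)
            d = [c / length for c in d]
            # directional filter for diffuse illumination
            w *= max(0.0, sum(a * b for a, b in zip(normal, d)))
            total += w
            color += w * environment(d)
    return color / total

# A uniformly white environment yields white diffuse lighting
# for every surface normal:
print(diffuse_irradiance((0.0, 0.0, 1.0), lambda d: 1.0))  # 1.0
```

Because of the normalization by the total weight, the result is scaled to the maximum possible illumination, just as in the editor script presented in the next section.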

Thus, for a specific surface normal vector **N** we add (i.e. integrate) the diffuse illumination by all texels of the cube map. We store the resulting diffuse illumination for this surface normal vector in a second cube map (the “diffuse irradiance environment map” or “diffuse environment map” for short). This second cube map will act as a lookup table, where each direction (i.e. surface normal vector) is mapped to a color (i.e. diffuse illumination by potentially thousands of light sources). The fragment shader is therefore really simple (this one could use the vertex shader from Section “Reflecting Surfaces”):

#ifdef FRAGMENT

void main()
{
   gl_FragColor = textureCube(_Cube, normalDirection);
}

#endif

It is just a lookup of the precomputed diffuse illumination using the surface normal vector of the rasterized surface point. However, the precomputation of the diffuse environment map is somewhat more complicated as described in the next section.

### Computation of Diffuse Environment Maps[edit]

This section presents some JavaScript code to illustrate the computation of cube maps for diffuse (irradiance) environment maps. In order to use it in Unity, choose **Create > JavaScript** in the **Project View**. Then open the script in Unity's text editor, copy the JavaScript code into it, and attach the script to the game object that has a material with the shader presented below. When a new cube map of sufficiently small dimensions is specified for the shader property `_OriginalCube` (which is labeled **Environment Map** in the shader user interface), the script will update the shader property `_Cube` (i.e. **Diffuse Environment Map** in the user interface) with a corresponding diffuse environment map. Note that the script only accepts cube maps of face dimensions 32×32 or smaller because the computation time tends to be very long for larger cube maps. Thus, when creating a cube map in Unity, make sure to choose a sufficiently small size.

The script includes only a handful of functions: `Awake()` initializes the variables; `Update()` takes care of communicating with the user and the material (i.e. reading and writing shader properties); `computeFilteredCubemap()` does the actual work of computing the diffuse environment map; and `getDirection()` is a small utility function for `computeFilteredCubemap()` to compute the direction associated with each texel of a cube map. Note that `computeFilteredCubemap()` not only integrates the diffuse illumination but also avoids discontinuous seams between faces of the cube map by setting neighboring texels along the seams to the same averaged color.

@script ExecuteInEditMode() private var originalCubemap : Cubemap; // a reference to the // environment map specified in the shader by the user private var filteredCubemap : Cubemap; // the diffuse irradiance // environment map computed by this script function Update() { var originalTexture : Texture = renderer.sharedMaterial.GetTexture("_OriginalCube"); // get the user-specified environment map if (originalTexture == null) // did the user specify "None" for the environment map? { if (originalCubemap != null) { originalCubemap = null; filteredCubemap = null; renderer.sharedMaterial.SetTexture("_Cube", null); } return; } else if (originalTexture == originalCubemap && filteredCubemap != null && null == renderer.sharedMaterial.GetTexture("_Cube")) { renderer.sharedMaterial.SetTexture("_Cube", filteredCubemap); // set the computed diffuse environment map in the shader } else if (originalTexture != originalCubemap || filteredCubemap != renderer.sharedMaterial.GetTexture("_Cube")) // has the user specified a cube map that is different of // what we had processed previously? { if (EditorUtility.DisplayDialog("Processing of Environment Map", "Do you want to process the cube map of face size " + originalTexture.width + "x" + originalTexture.width + "? (This will take some time.)", "OK", "Cancel")) // does the user really want to process this cube map? 
{ originalCubemap = originalTexture; if (filteredCubemap != renderer.sharedMaterial.GetTexture("_Cube")) { if (null != renderer.sharedMaterial.GetTexture("_Cube")) { DestroyImmediate(renderer.sharedMaterial.GetTexture( "_Cube")); // clean up } } if (null != filteredCubemap) { DestroyImmediate(filteredCubemap); // clean up } computeFilteredCubemap(); // compute the diffuse environment map renderer.sharedMaterial.SetTexture("_Cube", filteredCubemap); // set the computed diffuse environment map in the shader } else // no cancel the processing and reset everything { originalCubemap = null; filteredCubemap = null; renderer.sharedMaterial.SetTexture("_OriginalCube", null); renderer.sharedMaterial.SetTexture("_Cube", null); } } return; } function computeFilteredCubemap() // This function computes a diffuse environment map in // "filteredCubemap" of the same dimensions as "originalCubemap" // by integrating -- for each texel of "filteredCubemap" -- // the diffuse illumination from all texels of "originalCubemap" // for the surface normal vector corresponding to the direction // of each texel of "filteredCubemap". { filteredCubemap = Cubemap(originalCubemap.width, originalCubemap.format, true); // create the diffuse environment cube map var filteredSize : int = filteredCubemap.width; var originalSize : int = originalCubemap.width; // compute all texels of the diffuse environment // cube map by iterating over all of them for (var filteredFace : int = 0; filteredFace < 6; filteredFace++) { for (var filteredI : int = 0; filteredI < filteredSize; filteredI++) { for (var filteredJ : int = 0; filteredJ < filteredSize; filteredJ++) { var filteredDirection : Vector3 = getDirection(filteredFace, filteredI, filteredJ, filteredSize).normalized; var totalWeight : float = 0.0; var originalDirection : Vector3; var originalFaceDirection : Vector3; var weight : float; var filteredColor : Color = Color(0.0, 0.0, 0.0); // sum (i.e. 
integrate) the diffuse illumination // by all texels in the original environment map for (var originalFace : int = 0; originalFace < 6; originalFace++) { originalFaceDirection = getDirection(originalFace, 1, 1, 3).normalized; // the normal vector of the face for (var originalI : int = 0; originalI < originalSize; originalI++) { for (var originalJ : int = 0; originalJ < originalSize; originalJ++) { originalDirection = getDirection(originalFace, originalI, originalJ, originalSize); // direction to the texel, i.e. light source weight = 1.0 / originalDirection.sqrMagnitude; // take smaller size of more distant texels // into account originalDirection = originalDirection.normalized; weight = weight * Vector3.Dot(originalFaceDirection, originalDirection); // take tilt of texels // compared to face into account weight = weight * Mathf.Max(0.0, Vector3.Dot(filteredDirection, originalDirection)); // directional filter for diffuse illumination totalWeight = totalWeight + weight; // instead of analytically normalization, // we just normalize to the potentially // maximum illumination filteredColor = filteredColor + weight * originalCubemap.GetPixel(originalFace, originalI, originalJ); // add the illumination by this texel } } } filteredCubemap.SetPixel(filteredFace, filteredI, filteredJ, filteredColor / totalWeight); // store the diffuse illumination of this texel } } } // Avoid seams between cube faces: // average edge texels to the same color on both sides of the seam // (except corner texels, see below) var maxI : int = filteredCubemap.width - 1; var average : Color; for (var i : int = 1; i < maxI; i++) { average = (filteredCubemap.GetPixel(0, i, 0) + filteredCubemap.GetPixel(2, maxI, maxI - i)) / 2.0; filteredCubemap.SetPixel(0, i, 0, average); filteredCubemap.SetPixel(2, maxI, maxI - i, average); average = (filteredCubemap.GetPixel(0, 0, i) + filteredCubemap.GetPixel(4, maxI, i)) / 2.0; filteredCubemap.SetPixel(0, 0, i, average); filteredCubemap.SetPixel(4, maxI, i, 
average); average = (filteredCubemap.GetPixel(0, i, maxI) + filteredCubemap.GetPixel(3, maxI, i)) / 2.0; filteredCubemap.SetPixel(0, i, maxI, average); filteredCubemap.SetPixel(3, maxI, i, average); average = (filteredCubemap.GetPixel(0, maxI, i) + filteredCubemap.GetPixel(5, 0, i)) / 2.0; filteredCubemap.SetPixel(0, maxI, i, average); filteredCubemap.SetPixel(5, 0, i, average); average = (filteredCubemap.GetPixel(1, i, 0) + filteredCubemap.GetPixel(2, 0, i)) / 2.0; filteredCubemap.SetPixel(1, i, 0, average); filteredCubemap.SetPixel(2, 0, i, average); average = (filteredCubemap.GetPixel(1, 0, i) + filteredCubemap.GetPixel(5, maxI, i)) / 2.0; filteredCubemap.SetPixel(1, 0, i, average); filteredCubemap.SetPixel(5, maxI, i, average); average = (filteredCubemap.GetPixel(1, i, maxI) + filteredCubemap.GetPixel(3, 0, maxI - i)) / 2.0; filteredCubemap.SetPixel(1, i, maxI, average); filteredCubemap.SetPixel(3, 0, maxI - i, average); average = (filteredCubemap.GetPixel(1, maxI, i) + filteredCubemap.GetPixel(4, 0, i)) / 2.0; filteredCubemap.SetPixel(1, maxI, i, average); filteredCubemap.SetPixel(4, 0, i, average); average = (filteredCubemap.GetPixel(2, i, 0) + filteredCubemap.GetPixel(5, maxI - i, 0)) / 2.0; filteredCubemap.SetPixel(2, i, 0, average); filteredCubemap.SetPixel(5, maxI - i, 0, average); average = (filteredCubemap.GetPixel(2, i, maxI) + filteredCubemap.GetPixel(4, i, 0)) / 2.0; filteredCubemap.SetPixel(2, i, maxI, average); filteredCubemap.SetPixel(4, i, 0, average); average = (filteredCubemap.GetPixel(3, i, 0) + filteredCubemap.GetPixel(4, i, maxI)) / 2.0; filteredCubemap.SetPixel(3, i, 0, average); filteredCubemap.SetPixel(4, i, maxI, average); average = (filteredCubemap.GetPixel(3, i, maxI) + filteredCubemap.GetPixel(5, maxI - i, maxI)) / 2.0; filteredCubemap.SetPixel(3, i, maxI, average); filteredCubemap.SetPixel(5, maxI - i, maxI, average); } // Avoid seams between cube faces: average corner texels // to the same color on all three faces meeting in one 
corner average = (filteredCubemap.GetPixel(0, 0, 0) + filteredCubemap.GetPixel(2, maxI, maxI) + filteredCubemap.GetPixel(4, maxI, 0)) / 3.0; filteredCubemap.SetPixel(0, 0, 0, average); filteredCubemap.SetPixel(2, maxI, maxI, average); filteredCubemap.SetPixel(4, maxI, 0, average); average = (filteredCubemap.GetPixel(0, maxI, 0) + filteredCubemap.GetPixel(2, maxI, 0) + filteredCubemap.GetPixel(5, 0, 0)) / 3.0; filteredCubemap.SetPixel(0, maxI, 0, average); filteredCubemap.SetPixel(2, maxI, 0, average); filteredCubemap.SetPixel(5, 0, 0, average); average = (filteredCubemap.GetPixel(0, 0, maxI) + filteredCubemap.GetPixel(3, maxI, 0) + filteredCubemap.GetPixel(4, maxI, maxI)) / 3.0; filteredCubemap.SetPixel(0, 0, maxI, average); filteredCubemap.SetPixel(3, maxI, 0, average); filteredCubemap.SetPixel(4, maxI, maxI, average); average = (filteredCubemap.GetPixel(0, maxI, maxI) + filteredCubemap.GetPixel(3, maxI, maxI) + filteredCubemap.GetPixel(5, 0, maxI)) / 3.0; filteredCubemap.SetPixel(0, maxI, maxI, average); filteredCubemap.SetPixel(3, maxI, maxI, average); filteredCubemap.SetPixel(5, 0, maxI, average); average = (filteredCubemap.GetPixel(1, 0, 0) + filteredCubemap.GetPixel(2, 0, 0) + filteredCubemap.GetPixel(5, maxI, 0)) / 3.0; filteredCubemap.SetPixel(1, 0, 0, average); filteredCubemap.SetPixel(2, 0, 0, average); filteredCubemap.SetPixel(5, maxI, 0, average); average = (filteredCubemap.GetPixel(1, maxI, 0) + filteredCubemap.GetPixel(2, 0, maxI) + filteredCubemap.GetPixel(4, 0, 0)) / 3.0; filteredCubemap.SetPixel(1, maxI, 0, average); filteredCubemap.SetPixel(2, 0, maxI, average); filteredCubemap.SetPixel(4, 0, 0, average); average = (filteredCubemap.GetPixel(1, 0, maxI) + filteredCubemap.GetPixel(3, 0, maxI) + filteredCubemap.GetPixel(5, maxI, maxI)) / 3.0; filteredCubemap.SetPixel(1, 0, maxI, average); filteredCubemap.SetPixel(3, 0, maxI, average); filteredCubemap.SetPixel(5, maxI, maxI, average); average = (filteredCubemap.GetPixel(1, maxI, maxI) + 
filteredCubemap.GetPixel(3, 0, 0) + filteredCubemap.GetPixel(4, 0, maxI)) / 3.0; filteredCubemap.SetPixel(1, maxI, maxI, average); filteredCubemap.SetPixel(3, 0, 0, average); filteredCubemap.SetPixel(4, 0, maxI, average); filteredCubemap.Apply(); // apply all the texture.SetPixel(...) commands } function getDirection(face : int, i : int, j : int, size : int) : Vector3 // This function computes the direction that is // associated with a texel of a cube map { var direction : Vector3; if (face == 0) { direction = Vector3(0.5, -((j + 0.5) / size - 0.5), -((i + 0.5) / size - 0.5)); } else if (face == 1) { direction = Vector3(-0.5, -((j + 0.5) / size - 0.5), ((i + 0.5) / size - 0.5)); } else if (face == 2) { direction = Vector3(((i + 0.5) / size - 0.5), 0.5, ((j + 0.5) / size - 0.5)); } else if (face == 3) { direction = Vector3(((i + 0.5) / size - 0.5), -0.5, -((j + 0.5) / size - 0.5)); } else if (face == 4) { direction = Vector3(((i + 0.5) / size - 0.5), -((j + 0.5) / size - 0.5), 0.5); } else if (face == 5) { direction = Vector3(-((i + 0.5) / size - 0.5), -((j + 0.5) / size - 0.5), -0.5); } return direction; }

As an alternative to the JavaScript code above, you can also use the following C# code.

using UnityEngine; using UnityEditor; using System.Collections; [ExecuteInEditMode] public class ComputeDiffuseEnvironmentMap : MonoBehaviour { public Cubemap originalCubeMap; // environment map specified in the shader by the user //[System.Serializable] // avoid being deleted by the garbage collector, // and thus leaking private Cubemap filteredCubeMap; // the computed diffuse irradience environment map private void Update() { Cubemap originalTexture = null; try { originalTexture = renderer.sharedMaterial.GetTexture( "_OriginalCube") as Cubemap; } catch (System.Exception) { Debug.LogError("'_OriginalCube' not found on shader. " + "Are you using the wrong shader?"); return; } if (originalTexture == null) // did the user set "none" for the map? { if (originalCubeMap != null) { renderer.sharedMaterial.SetTexture("_Cube", null); originalCubeMap = null; filteredCubeMap = null; return; } } else if (originalTexture == originalCubeMap && filteredCubeMap != null && renderer.sharedMaterial.GetTexture("_Cube") == null) { renderer.sharedMaterial.SetTexture("_Cube", filteredCubeMap); // set the computed // diffuse environment map in the shader } else if (originalTexture != originalCubeMap || filteredCubeMap != renderer.sharedMaterial.GetTexture("_Cube")) { if (EditorUtility.DisplayDialog( "Processing of Environment Map", "Do you want to process the cube map of face size " + originalTexture.width + "x" + originalTexture.width + "? 
(This will take some time.)", "OK", "Cancel")) { if (filteredCubeMap != renderer.sharedMaterial.GetTexture("_Cube")) { if (renderer.sharedMaterial.GetTexture("_Cube") != null) { DestroyImmediate( renderer.sharedMaterial.GetTexture( "_Cube")); // clean up } } if (filteredCubeMap != null) { DestroyImmediate(filteredCubeMap); // clean up } originalCubeMap = originalTexture; filteredCubeMap = computeFilteredCubeMap(); //computes the diffuse environment map renderer.sharedMaterial.SetTexture("_Cube", filteredCubeMap); // set the computed // diffuse environment map in the shader return; } else { originalCubeMap = null; filteredCubeMap = null; renderer.sharedMaterial.SetTexture("_Cube", null); renderer.sharedMaterial.SetTexture( "_OriginalCube", null); } } } // This function computes a diffuse environment map in // "filteredCubemap" of the same dimensions as "originalCubemap" // by integrating -- for each texel of "filteredCubemap" -- // the diffuse illumination from all texels of "originalCubemap" // for the surface normal vector corresponding to the direction // of each texel of "filteredCubemap". private Cubemap computeFilteredCubeMap() { Cubemap filteredCubeMap = new Cubemap(originalCubeMap.width, originalCubeMap.format, true); int filteredSize = filteredCubeMap.width; int originalSize = originalCubeMap.width; // Compute all texels of the diffuse environment cube map // by itterating over all of them for (int filteredFace = 0; filteredFace < 6; filteredFace++) // the six sides of the cube { for (int filteredI = 0; filteredI < filteredSize; filteredI++) { for (int filteredJ = 0; filteredJ < filteredSize; filteredJ++) { Vector3 filteredDirection = getDirection(filteredFace, filteredI, filteredJ, filteredSize).normalized; float totalWeight = 0.0f; Vector3 originalDirection; Vector3 originalFaceDirection; float weight; Color filteredColor = new Color(0.0f, 0.0f, 0.0f); // sum (i.e. 
integrate) the diffuse illumination // by all texels in the original environment map for (int originalFace = 0; originalFace < 6; originalFace++) { originalFaceDirection = getDirection( originalFace, 1, 1, 3).normalized; //the normal vector of the face for (int originalI = 0; originalI < originalSize; originalI++) { for (int originalJ = 0; originalJ < originalSize; originalJ++) { originalDirection = getDirection( originalFace, originalI, originalJ, originalSize); // direction to the texel // (i.e. light source) weight = 1.0f / originalDirection.sqrMagnitude; // take smaller size of more // distant texels into account originalDirection = originalDirection.normalized; weight = weight * Vector3.Dot( originalFaceDirection, originalDirection); // take tilt of texel compared // to face into account weight = weight * Mathf.Max(0.0f, Vector3.Dot(filteredDirection, originalDirection)); // directional filter // for diffuse illumination totalWeight = totalWeight + weight; // instead of analytically // normalization, we just normalize // to the potential max illumination filteredColor = filteredColor + weight * originalCubeMap.GetPixel( (CubemapFace)originalFace, originalI, originalJ); // add the // illumination by this texel } } } filteredCubeMap.SetPixel( (CubemapFace)filteredFace, filteredI, filteredJ, filteredColor / totalWeight); // store the diffuse illumination of this texel } } } // Avoid seams between cube faces: average edge texels // to the same color on each side of the seam int maxI = filteredCubeMap.width - 1; for (int i = 0; i < maxI; i++) { setFaceAverage(ref filteredCubeMap, 0, i, 0, 2, maxI, maxI - i); setFaceAverage(ref filteredCubeMap, 0, 0, i, 4, maxI, i); setFaceAverage(ref filteredCubeMap, 0, i, maxI, 3, maxI, i); setFaceAverage(ref filteredCubeMap, 0, maxI, i, 5, 0, i); setFaceAverage(ref filteredCubeMap, 1, i, 0, 2, 0, i); setFaceAverage(ref filteredCubeMap, 1, 0, i, 5, maxI, i); setFaceAverage(ref filteredCubeMap, 1, i, maxI, 3, 0, maxI - i); 
setFaceAverage(ref filteredCubeMap, 1, maxI, i, 4, 0, i); setFaceAverage(ref filteredCubeMap, 2, i, 0, 5, maxI - i, 0); setFaceAverage(ref filteredCubeMap, 2, i, maxI, 4, i, 0)