GLSL Programming/Blender/Debugging of Shaders

A false-color satellite image.

This tutorial introduces attribute variables. It builds on the tutorial about a minimal shader and the RGB-cube tutorial about varying variables.

This tutorial also introduces the main technique to debug shaders in Blender: false-color images, i.e. a value is visualized by setting one of the components of the fragment color to it. The intensity of that color component in the resulting image then allows you to draw conclusions about the value in the shader. This might appear to be a very primitive debugging technique because it is a very primitive debugging technique. Unfortunately, there is no alternative in Blender.

Where Does the Vertex Data Come from?

In the RGB-cube tutorial you have seen how the fragment shader gets its data from the vertex shader by means of varying variables. The question here is: where does the vertex shader get its data from? Within Blender this data is specified for each selected object by the settings in the Properties window, in particular the settings in the Object Data tab, Material tab, and Textures tab. All the data of the mesh of the object is sent to OpenGL in each frame. (This is often called a “draw call”. Note that each draw call has some performance overhead; thus, it is much more efficient to send one large mesh with one draw call to OpenGL than to send several smaller meshes with multiple draw calls.) This data usually consists of a long list of triangles, where each triangle is defined by three vertices and each vertex has certain attributes, including position. These attributes are made available in the vertex shader by means of attribute variables.

Built-in Attribute Variables and how to Visualize Them

In Blender, most of the standard attributes (position, color, surface normal, and texture coordinates) are built in, i.e. you need not (in fact must not) define them. The names of these built-in attributes are actually defined by the OpenGL “compatibility profile” because such built-in names are needed if you mix an OpenGL application that was written for the fixed-function pipeline with a (programmable) vertex shader. If you had to define them, the definitions (only in the vertex shader) would look like this:

   attribute vec4 gl_Vertex; // position (in object coordinates, 
      // i.e. local or model coordinates)
   attribute vec4 gl_Color; // color (usually constant)
   attribute vec3 gl_Normal; // surface normal vector 
      // (usually normalized; also in object coordinates)
   attribute vec4 gl_MultiTexCoord0; //0th set of texture coordinates 
      // (a.k.a. “UV”; between 0 and 1) 
   attribute vec4 gl_MultiTexCoord1; //1st set of texture coordinates 
      // (a.k.a. “UV”; between 0 and 1)
   ...

There is only one attribute variable that is provided by Blender but has no standard name in OpenGL, namely the tangent vector, i.e. a vector that is orthogonal to the surface normal. You should define this variable yourself as an attribute variable of type vec4 with the specific name tangent as shown in the following shader:

import bge

cont = bge.logic.getCurrentController()

VertexShader = """
   varying vec4 color;
   attribute vec4 tangent; // this attribute is specific to Blender 
      // and has to be defined explicitly

   void main()
   {
       color = gl_MultiTexCoord0; // set the varying to this attribute
    
       // other possibilities to play with:
    
       // color = gl_Vertex;
       // color = gl_Color;
       // color = vec4(gl_Normal, 1.0);
       // color = gl_MultiTexCoord0;
       // color = gl_MultiTexCoord1;
       // color = tangent;

       gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;
   }
"""

FragmentShader = """
   varying vec4 color;

   void main()
   {   
      gl_FragColor = color;
   }
"""


mesh = cont.owner.meshes[0]
for mat in mesh.materials:
    shader = mat.getShader()
    if shader is not None:
        if not shader.isValid():
            shader.setSource(VertexShader, FragmentShader, 1)
            shader.setAttrib(bge.logic.SHD_TANGENT)

Note the line

shader.setAttrib(bge.logic.SHD_TANGENT)

in the Python script, which tells Blender to provide the shader with tangent attributes. However, Blender will provide several of these attributes only if certain settings are made in the Properties window: in particular, a UV map should be specified under UV Maps in the Object Data tab (just click on the “+” button), a material should be defined in the Material tab, and a texture (e.g. any image) should be defined in the Textures tab.

In the RGB-cube tutorial we have already seen how to visualize the gl_Vertex coordinates by setting the fragment color to those values. In this example, the fragment color is set to gl_MultiTexCoord0 so that we can see what kind of texture coordinates Blender provides for certain settings in the Properties window.

How to Interpret False-Color Images

When trying to understand the information in a false-color image, it is important to focus on one color component only. For example, if the attribute gl_MultiTexCoord0 is written to the fragment color, then the red component of the fragment visualizes the x coordinate of gl_MultiTexCoord0. It doesn't matter whether the output color is maximum pure red or maximum yellow or maximum magenta; in all of these cases the red component is 1. On the other hand, it also doesn't matter for the red component whether the color is blue or green or cyan of any intensity, because the red component is 0 in all of these cases. If you have never learned to focus solely on one color component, this is probably quite challenging; therefore, you might consider looking at only one color component at a time, for example by using this line to set the varying in the vertex shader:

            color = vec4(gl_MultiTexCoord0.x, 0.0, 0.0, 1.0);

This sets the red component of the varying variable to the x component of gl_MultiTexCoord0 but sets the green and blue components to 0 (and the alpha or opacity component to 1 but that doesn't matter in this shader).
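
Analogously, you could inspect the y coordinate of gl_MultiTexCoord0 in the green component only; this is just a variation of the line above:

            color = vec4(0.0, gl_MultiTexCoord0.y, 0.0, 1.0);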

The specific texture coordinates that Blender sends to the vertex shader depend on the UV map that is specified under UV Maps in the Object Data tab and on the Mapping that is specified in the Textures tab.

Texture coordinates are particularly nice to visualize because they are between 0 and 1 just like color components are. Almost as nice are coordinates of normalized vectors (i.e., vectors of length 1; for example, gl_Normal is usually normalized) because they are always between -1 and +1. To map this range to the range from 0 to 1, you add 1 to each component and divide all components by 2, e.g.:

            color = vec4((gl_Normal + vec3(1.0, 1.0, 1.0)) / 2.0, 1.0);

Note that gl_Normal is a three-dimensional vector. Black then corresponds to the coordinate -1 and full intensity of one component to the coordinate +1.

If the value that you want to visualize is in a range other than 0 to 1 or -1 to +1, you have to map it to the range from 0 to 1, which is the range of color components. If you don't know which values to expect, you just have to experiment. What helps here is that color components outside of the range 0 to 1 are automatically clamped to this range, i.e. values less than 0 are set to 0 and values greater than 1 are set to 1. Thus, when the color component is 0 or 1, you know at least that the value is less than or greater than what you assumed, and you can then adapt the mapping iteratively until the color component is between 0 and 1.
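
For example, if you suspected that a value lies roughly between -10 and +10 (an arbitrary guess, purely for illustration), you could map it to the range from 0 to 1 like this:

            color = vec4((gl_Vertex.x + 10.0) / 20.0, 0.0, 0.0, 1.0);

If the red component then stays at 0 or at 1 everywhere, the guessed range was off, and you would adjust the offset and the divisor accordingly.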

Debugging Practice

In order to practice the debugging of shaders, this section includes some lines that produce black colors when the assignment to color in the vertex shader is replaced by each of them. Your task is to figure out for each line why the result is black. To this end, you should try to visualize any value that you are not absolutely sure about, and map values less than 0 or greater than 1 to other ranges such that the values become visible and you have at least an idea of which range they are in. Note that most of the functions and operators are documented in “Vector and Matrix Operations”.
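
As a sketch of this approach (using the first of the lines below as an example; the assumed range of -4 to +4 is only an initial guess that you would refine iteratively), you could inspect the intermediate value before assigning it to color:

            // hypothetical remapping for inspection; -4 to +4 is only a guess
            vec4 value = gl_MultiTexCoord0 - vec4(1.5, 2.3, 1.1, 0.0);
            color = (value + vec4(4.0)) / 8.0;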

            color = gl_MultiTexCoord0 - vec4(1.5, 2.3, 1.1, 0.0);


            color = vec4(1.0 - gl_MultiTexCoord0.w);


            color = gl_MultiTexCoord0 / tan(0.0);

The following lines work only with spheres and require some knowledge about the dot and cross product:

            color = dot(gl_Normal, vec3(tangent)) * gl_MultiTexCoord0;


            color = dot(cross(gl_Normal, vec3(tangent)), gl_Normal) * 
               gl_MultiTexCoord0;


            color = vec4(cross(gl_Normal, gl_Normal), 1.0);


            color = vec4(cross(gl_Normal, gl_Vertex), 1.0);

Does the function radians() always return black? What's that good for?

            color = radians(gl_MultiTexCoord0);

Consult the documentation in the “OpenGL ES Shading Language 1.0.17 Specification” available at the “Khronos OpenGL ES API Registry” to figure out what radians() is good for.

Special Variables in the Fragment Shader

Attributes are specific to vertices, i.e., they usually have different values for different vertices. There are similar variables for fragment shaders, i.e., variables that have different values for each fragment. However, they are different from attributes because they are not specified by a mesh (i.e. a list of triangles). They are also different from varyings because they are not set explicitly by the vertex shader.

Specifically, a four-dimensional vector gl_FragCoord is available containing the screen (or: window) coordinates of the fragment that is processed; see “Vertex Transformations” for the description of the screen coordinate system.
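
For example, a fragment shader along the following lines would visualize the horizontal window coordinate of each fragment; the divisor 800.0 is just an assumed viewport width that you would have to adapt to your window:

   void main()
   {
      // map the horizontal window coordinate (in pixels) to the range 0 to 1;
      // 800.0 is an assumed viewport width
      gl_FragColor = vec4(gl_FragCoord.x / 800.0, 0.0, 0.0, 1.0);
   }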

Moreover, a boolean variable gl_FrontFacing is provided that specifies whether the front face or the back face of a triangle is being rendered. Front faces usually face the “outside” of a model and back faces face the “inside”; however, there is no clear outside or inside if the model is not a closed surface. Usually, the surface normal vectors point in the direction of the front face, but this is not required. In fact, front faces and back faces are determined by the order of a triangle's vertices: if the vertices appear in counter-clockwise order on the screen, the front face is visible; if they appear in clockwise order, the back face is visible. An application is shown in the tutorial on cutaways.
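
A small fragment shader sketch could use this variable to color front faces green and back faces red (whether back faces are rendered at all depends on the backface culling settings):

   void main()
   {
      if (gl_FrontFacing) // true if the front face is being rendered
         gl_FragColor = vec4(0.0, 1.0, 0.0, 1.0); // front faces: green
      else
         gl_FragColor = vec4(1.0, 0.0, 0.0, 1.0); // back faces: red
   }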

Summary

Congratulations, you have reached the end of this tutorial! We have seen:

  • The list of built-in attributes in Blender: gl_Vertex, gl_Color, gl_Normal, gl_MultiTexCoord0, gl_MultiTexCoord1, and the special tangent.
  • How to visualize these attributes (or any other value) by setting components of the output fragment color.
  • The two additional special variables that are available in fragment shaders: gl_FragCoord and gl_FrontFacing.

Further Reading

If you still want to know more, you could consult the “OpenGL ES Shading Language 1.0.17 Specification” available at the “Khronos OpenGL ES API Registry”, which was already mentioned above.



Unless stated otherwise, all example source code on this page is granted to the public domain.