Note: this website has moved to http://codrt.fr/allardj/, where the most recent version of this page is available.
Software - Shaders

Simple diffuse shader

Most objects are rendered with a diffuse lighting model. The vertices of the model must contain 3 attributes:

  • A 3D position
  • A vector normal to the surface of the object
  • An RGBA color value (optional; can be uniform for the whole object).
These attributes must be transformed using the model, view, and projection matrices. This is accomplished in the following vertex shader:
// Default Vertex Shader
// Color + Normal

void main(float4 position     : POSITION,
          half3 normal        : NORMAL,
          half4 color         : COLOR0,

  uniform float4x4 ModelViewProj,
  uniform float4x4 ModelView    ,
  uniform float4x4 ModelViewIT  ,

      out float4 ppos         : POSITION,
      out half3 pnormal       : TEXCOORD0,
      out half4 pcolor        : COLOR0)
{
  ppos = mul(ModelViewProj, position);
  pcolor = color;
  pnormal = mul((float3x3)ModelViewIT, normal);
}

Note that vertex attributes are passed as standard arguments, while object parameters are tagged with uniform and results are tagged with out. Once these computations are done for each vertex, we can compute the lighting itself for each pixel. A pixel shader takes the results of the vertex shader as inputs and produces the final color of the object.
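To make the transform concrete, here is a small CPU-side C sketch (an illustration only, not part of the shader) of what mul(ModelViewProj, position) computes: a 4x4 matrix applied to a homogeneous column vector, with the matrix stored row-major as Cg does by default.

```c
#include <assert.h>

/* CPU-side sketch of mul(ModelViewProj, position): a 4x4 matrix
 * (row-major, as in Cg) applied to a homogeneous column vector. */
typedef struct { float v[4]; } vec4;

vec4 mat4_mul_vec4(const float m[4][4], vec4 p)
{
    vec4 r;
    for (int i = 0; i < 4; i++)
        r.v[i] = m[i][0]*p.v[0] + m[i][1]*p.v[1]
               + m[i][2]*p.v[2] + m[i][3]*p.v[3];
    return r;
}
```

With the identity matrix the position comes back unchanged; a real ModelViewProj combines the model, view, and projection transforms into one such matrix.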

For shading the object we need a description of the light. The simplest approach is to use a directional white light, described only by its direction, which we'll call *lightdir*. To compute diffuse lighting we simply use the dot product between the surface normal and this direction vector. Note that the interpolated normal must be renormalized first; this is important to obtain a smooth shading without visible polygon boundaries. As some older graphics cards don't support this operation, we can use hardware-profile-specific function overloading to disable it when necessary. To remove the shadowed part we need to clamp the result to the [0, 1] range, which is what the saturate function does. We can then add an ambient term to the final color.

This leads to the following pixel shader:

sphere1.png
// Default Pixel Shader
// Color + Per-pixel diffuse shading

half3 unit(half3 v) { return normalize(v); }
// GeForce <= 4 (fp20 profile) doesn't support normalize
fp20 half3 unit(half3 v) { return v; }

half4 main(
          half3 pnormal      : TEXCOORD0,
          half4 pcolor       : COLOR,
          uniform float3 lightdir
  ) : COLOR
{
  half4 C = pcolor;
  half3 N = unit(pnormal);
  half diffuse = saturate(dot(N,lightdir));
  C.xyz = C.xyz*(0.5*diffuse+0.5);
  return C;
}

Visualizing intermediate values

When developing shaders or visualizing scientific data, it is often necessary to render information other than colors. A very simple approach is to output the values directly as RGB colors. For example, we can see what the interpolated normal vector looks like:

normals1.png
// Pixel shader rendering normal as color

half4 main( half3 pnormal : TEXCOORD0 ) : COLOR
{
  return half4(pnormal,1);
}

This however is often quite limited. Color values use the range [0, 1], while the data might not: normal vectors, for example, use [-1, 1]. As a consequence, negative values (the left and bottom parts of the sphere) are not visible. In this case we need to apply a different computation to obtain colors from the original values.
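One common alternative (not the one used by the shader below, which keeps the sign information as separate colors) is to remap each component from [-1, 1] to the displayable [0, 1] range, sketched here in CPU-side C:

```c
typedef struct { float x, y, z; } vec3;

/* Remap each [-1,1] normal component to the displayable [0,1] range. */
vec3 normal_to_color(vec3 n)
{
    vec3 c = { n.x*0.5f + 0.5f, n.y*0.5f + 0.5f, n.z*0.5f + 0.5f };
    return c;
}
```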

To visualize the negative values we can, for example, use the complementary colors of RGB, i.e. Cyan, Magenta, and Yellow. This is what the following shader does:

normals2.png
// Pixel shader rendering normal as color

half4 main( half3 pnormal : TEXCOORD0 ) : COLOR
{
  half3 npos = max(pnormal,0.0);
  half3 nneg = max(-pnormal,0.0);
  half3 c = npos.xyz + (nneg.yzx + nneg.zxy);
  return half4(c,1);
}

First the normal is decomposed into its positive values (npos) and negative values (nneg) using the max function. Then the positive values are used as RGB coefficients, while the negative values are transformed to correspond to CMY colors. To do this we use the fact that Cyan is in fact Green+Blue, Magenta is Blue+Red, and Yellow is Red+Green. This means that a negative X value will be applied to the G and B color coefficients. This operation can easily be implemented using the swizzle operator to reorder the values in a vector.
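The swizzle trick can be written out component by component in a CPU-side C sketch (illustration only):

```c
typedef struct { float x, y, z; } vec3;

static float max0(float a) { return a > 0 ? a : 0; }

/* npos maps to RGB directly; nneg.yzx + nneg.zxy spreads each negative
 * component over the two complementary channels (Cyan = G+B,
 * Magenta = B+R, Yellow = R+G). */
vec3 normal_to_cmy_rgb(vec3 n)
{
    vec3 npos = { max0(n.x),  max0(n.y),  max0(n.z)  };
    vec3 nneg = { max0(-n.x), max0(-n.y), max0(-n.z) };
    vec3 c = { npos.x + nneg.y + nneg.z,   /* R receives -Y and -Z */
               npos.y + nneg.z + nneg.x,   /* G receives -Z and -X */
               npos.z + nneg.x + nneg.y }; /* B receives -X and -Y */
    return c;
}
```

A normal of (-1, 0, 0), for example, produces (0, 1, 1): pure cyan, as described above.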


Advanced shaders

IN PROGRESS...

normals.png
vtexture2.png
// Textured Mesh Vertex Shader

#define NBT 6

uniform float3x4 texmatrix[NBT];
uniform float3 camdir[NBT];

void main(
  float4 position    : POSITION,
  float3 normal      : NORMAL,
  uniform float4x4 ModelViewProj,
  uniform float4x4 ModelView    ,
  uniform float4x4 ModelViewIT  ,

      out float4 ppos        : POSITION,
      out float4 ptexcoord[NBT] : TEXCOORD1,
      out float3 pnormal     : TEXCOORD0)
{
  ppos = mul(ModelViewProj, position);
  pnormal = normalize(mul((float3x3)ModelViewIT, normal));
  for (int t=0;t<NBT;t++)
  {
    ptexcoord[t].xyz = mul(texmatrix[t],position);
    ptexcoord[t].w = -dot(camdir[t],normal);
  }
}
// Textured Mesh Pixel Shader

#define NBT 6

uniform samplerRECT texture[NBT];
uniform half2 texsize;

half4 main(
              float3 pnormal : TEXCOORD0,
              float4 ptexcoord[NBT] : TEXCOORD1,
      uniform half4 color = half4(1,0,0,1)
  ) : COLOR
{
  float4 c = float4(0,0,0,0); // must be initialized before accumulating
  for (int t=0;t<NBT;t++)
  {
    float ft = max(0,ptexcoord[t].w);
    c += texRECTproj(texture[t], ptexcoord[t].xyz)*ft;
  }
  return c/c.w;
}
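The accumulation loop of this pixel shader amounts to a normalized weighted blend. Here is a CPU-side C sketch of that blend (illustration only; it assumes each sample's alpha is 1, so the final division by the accumulated w renormalizes the weights exactly as the shader's return c/c.w does):

```c
#define NBT 6

typedef struct { float r, g, b, a; } color4;

/* Weighted blend of NBT projective texture samples; ft[t] plays the
 * role of max(0, ptexcoord[t].w) in the shader, and dividing by the
 * accumulated alpha matches the shader's return c/c.w. */
color4 blend(const color4 samples[NBT], const float ft[NBT])
{
    color4 c = { 0, 0, 0, 0 };
    for (int t = 0; t < NBT; t++) {
        float w = ft[t] > 0 ? ft[t] : 0;
        c.r += samples[t].r * w;
        c.g += samples[t].g * w;
        c.b += samples[t].b * w;
        c.a += samples[t].a * w;
    }
    float total = c.a;
    c.r /= total; c.g /= total; c.b /= total; c.a /= total;
    return c;
}
```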
volren-cube.png
volren-fire.png

Last modification: 2005-Jun-08 01:02