During this break I implemented normal mapping for fun. Normal mapping is a computer graphics technique that simulates bumps and dents on a model without adding more triangles to it, saving processing time while still yielding a result that looks fairly close to what a high-poly model would give.

Imperfections on a surface are stored in a normal map texture that is used alongside the diffuse texture. In the diffuse and specular lighting calculations, the vertex normals are replaced by normals sampled from the normal map.

diffuse texture

This is what a diffuse texture looks like. Each pixel is read in the fragment shader, and its color is multiplied into the diffuse contribution to the fragment color.

normal map texture

And this is a normal map texture. Each pixel is read in the fragment shader the same way as the diffuse texture, but its color is interpreted as normal coordinates when calculating the diffuse contribution.

// Fragment shader //

    // Texture sampling
vec3 diffuseColor   = texture(diffuseTexture,fragTexcoord).xyz;
vec3 fragNormal     = texture(normalMapTexture,fragTexcoord).xyz;

    // Remap the sampled color from [0,1] to [-1,1] and normalize
fragNormal = normalize(fragNormal*2.0-1.0);

    // Diffuse contribution
float intensity = dot(lightDirection,fragNormal);
vec3 diffuse    = lightDiffuse*matDiffuse*max(0.0,intensity);

So it is just a matter of passing a new texture to the fragment shader, right? If we do that, we get this:

It looks cool, but as the cube moves we can see that something must be wrong: all faces seem to be lit from the same angle. The normal orientation needs to match the light source, so we have to rotate these normals to match where each triangle is pointing, and we also need to apply the model transformations to them.

But applying matrix transformations to each normal in the fragment shader, to match the light direction in world space, would mean a lot of computing. Instead, we need to be smart and take the path of least effort (computation-wise, obviously, because for us there is now plenty of work to do): bringing the light direction from world space into the texture's space, or, as it is called, tangent space.

This tutorial goes step by step through implementing normal mapping, so I won't go into the details here. Suffice it to say, you need to build a transformation matrix from the vertex normal and two other vectors perpendicular to it (the tangent and bitangent), and use it in the vertex shader to transform the light direction into tangent space.

This is how I implemented it in my vertex shader:

    // Vertex shader //

    // rotate everything to world space 
viewDirectionIn = -camera;
fragNormalIn    = model3x3*normalize(vertNormal);
fragTangentIn   = model3x3*normalize(vertTangent);
fragBitangentIn = model3x3*normalize(vertBitangent);

    // TBN matrix: the tangent, bitangent and normal go in as columns,
    // so the transpose maps world space into tangent space
mat3 tangent = transpose(mat3(
    fragTangentIn,
    fragBitangentIn,
    fragNormalIn
));

    // rotate everything to tangent space
viewDirectionIn     = tangent*viewDirectionIn;
lightDirectionIn    = tangent*lightSource;
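
With the light and view directions transformed once per vertex, the fragment shader from before keeps working unchanged: the normal sampled from the normal map is already expressed in the same tangent space that lightDirection now lives in.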

I needed to compute the tangent and bitangent that go with each vertex normal, but for that I needed to read the UV coordinates from the model, and the model parsing function I was using at the time was tricky to work with.
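
For reference, this is the kind of computation the UVs are needed for. Here is a minimal sketch of the standard per-triangle tangent and bitangent formula, not my engine's actual code (the Vec2/Vec3 types and the function name are made-up stand-ins): it solves for the model-space directions that follow the U and V axes of the texture.

struct Vec2 { float x, y; };
struct Vec3 { float x, y, z; };

Vec3 sub(Vec3 a, Vec3 b)    { return {a.x-b.x, a.y-b.y, a.z-b.z}; }
Vec3 scale(Vec3 a, float s) { return {a.x*s, a.y*s, a.z*s}; }

// For one triangle, solve for the model-space directions that
// follow the U and V axes of the texture
void computeTangentSpace(const Vec3 p[3], const Vec2 uv[3],
                         Vec3 &tangent, Vec3 &bitangent)
{
    // Triangle edges and their UV deltas
    Vec3 e1 = sub(p[1], p[0]);
    Vec3 e2 = sub(p[2], p[0]);
    float du1 = uv[1].x - uv[0].x, dv1 = uv[1].y - uv[0].y;
    float du2 = uv[2].x - uv[0].x, dv2 = uv[2].y - uv[0].y;

    // Inverse determinant of the UV delta matrix
    // (degenerate UVs would make this division blow up)
    float r = 1.0f / (du1*dv2 - du2*dv1);

    tangent   = scale(sub(scale(e1, dv2), scale(e2, dv1)), r);
    bitangent = scale(sub(scale(e2, du1), scale(e1, du2)), r);
}

Per-vertex tangents and bitangents then come from averaging these results over the triangles that share each vertex.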

So I picked up assimp and started learning its ins and outs while writing a wrapper around it. This took me some time: on day one I could load the vertex info, get the tangent calculation I was after, and do triangulation; on day two I worked on applying materials; on day three I did some optimizations, such as vertex indexing.
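
To give an idea, a loader along these lines covers those three days of features. This sketch is a reconstruction, not my actual wrapper (the Vertex layout and the loadMesh name are made up), but the post-processing flags are real assimp ones: aiProcess_Triangulate handles the triangulation, aiProcess_CalcTangentSpace the tangent calculation, and aiProcess_JoinIdenticalVertices the vertex indexing.

#include <assimp/Importer.hpp>
#include <assimp/scene.h>
#include <assimp/postprocess.h>
#include <cstdio>
#include <vector>

struct Vertex {
    float px, py, pz;   // position
    float nx, ny, nz;   // normal
    float tx, ty, tz;   // tangent
    float bx, by, bz;   // bitangent
    float u, v;         // texture coordinates
};

bool loadMesh(const char *path,
              std::vector<Vertex> &vertices,
              std::vector<unsigned> &indices)
{
    Assimp::Importer importer;
    const aiScene *scene = importer.ReadFile(path,
        aiProcess_Triangulate |           // split polygons into triangles
        aiProcess_CalcTangentSpace |      // compute tangents/bitangents from UVs
        aiProcess_JoinIdenticalVertices); // deduplicate vertices for indexing
    if (!scene || !scene->mNumMeshes) {
        std::fprintf(stderr, "assimp: %s\n", importer.GetErrorString());
        return false;
    }

    const aiMesh *mesh = scene->mMeshes[0]; // first mesh only, for brevity
    if (!mesh->HasNormals() || !mesh->HasTangentsAndBitangents() ||
        !mesh->HasTextureCoords(0))
        return false;

    for (unsigned i = 0; i < mesh->mNumVertices; ++i) {
        const aiVector3D &p  = mesh->mVertices[i];
        const aiVector3D &n  = mesh->mNormals[i];
        const aiVector3D &t  = mesh->mTangents[i];
        const aiVector3D &b  = mesh->mBitangents[i];
        const aiVector3D &uv = mesh->mTextureCoords[0][i];
        vertices.push_back({p.x, p.y, p.z, n.x, n.y, n.z,
                            t.x, t.y, t.z, b.x, b.y, b.z,
                            uv.x, uv.y});
    }
    for (unsigned f = 0; f < mesh->mNumFaces; ++f)
        for (unsigned j = 0; j < mesh->mFaces[f].mNumIndices; ++j)
            indices.push_back(mesh->mFaces[f].mIndices[j]);
    return true;
}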

model with materials

But I got carried away, and before I knew it I was having fun with MMD models.

blender

I regained focus and worked on applying some textures to the models. For that I needed to learn how to use Blender, and I have to say the experience was far from pleasant.

Finally, I finished the shader coding and did some tests. The results look like this:


You can download the source code here.