## Rendering a die

By SoftMotionLabs on Monday 16 February 2015, 23:02 - Game Programming - Permalink

Rendering a die may seem like a very basic task. In fact, it's not that simple. In this blog post, we will go through all the steps that took us from the initial render of the Yatzy dice in Family's Game Travel Pack to the final one, covering subjects like shadow mapping, per-pixel lighting, outlines and such.

When I started, I first ran Blender, searched for a few tutorials, and quickly came up with a simple setup. After a few renders, you get nice face images like this one.

It quickly becomes obvious that rolling dice is more fun with real 3D dice and physics. So you simplify the model and render it as a naive 3D model in OpenGL. With the usual texture size, the die looks blurry. The simple way to go is a high-resolution texture, but this comes with a price: the app size grows (both when downloaded and when run). Since we have a variety of dice (4 faces, 6 faces, 20 faces, ...), that did not look like an acceptable solution. Another, less usual way of rendering two-color elements is distance fields. The idea is to render the element at a much higher resolution, then encode it as a distance function. The color is applied at render time, which fits our situation well, since it lets us color our dice any way we want.
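The distance-field coloring boils down to a smoothstep between the ink and body colors around the 0.5 isoline, as the fragment shader below does. Here is a minimal Java sketch of that math (class and method names are ours, for illustration only):

```java
public class DistanceFieldColor {

    // GLSL-style smoothstep: 0 below edge0, 1 above edge1, smooth in between.
    static float smoothstep(float edge0, float edge1, float x) {
        float t = Math.max(0f, Math.min(1f, (x - edge0) / (edge1 - edge0)));
        return t * t * (3f - 2f * t);
    }

    // Mix ink and body colors (RGB triples) by the smoothed distance-field value.
    static float[] shade(float distance, float[] ink, float[] body, float smoothing) {
        float alpha = smoothstep(0.5f - smoothing, 0.5f + smoothing, distance);
        return new float[] {
            ink[0] + (body[0] - ink[0]) * alpha,
            ink[1] + (body[1] - ink[1]) * alpha,
            ink[2] + (body[2] - ink[2]) * alpha
        };
    }

    public static void main(String[] args) {
        float smoothing = 1f / 16f;
        float[] ink = {0f, 0f, 0f}, body = {1f, 1f, 1f};
        // Well inside a dot (distance near 0) -> ink; well outside -> body.
        System.out.println(shade(0.1f, ink, body, smoothing)[0]); // 0.0
        System.out.println(shade(0.9f, ink, body, smoothing)[0]); // 1.0
        System.out.println(shade(0.5f, ink, body, smoothing)[0]); // 0.5 (on the edge)
    }
}
```

Because the transition band is only `2 * smoothing` wide in distance-field units, the edge stays crisp at any zoom level, which is exactly why a small distance-field texture beats a large bitmap here.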

The dice looked better but still flat. This is when you start thinking about improving the lighting. The obvious solution is per-pixel lighting. Since dice are usually small on screen, they do not use that much processing power. Still, the edges looked too "edgy" and the dots were really flat. So how do we solve that? Let's go for a normal map! Now things look much better, as you can see in the render:
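The per-pixel lighting is standard Blinn-Phong math evaluated per fragment, with the normal read from the normal map instead of interpolated from the vertices. A Java sketch of the same computation the fragment shader performs (vectors assumed normalized; names are illustrative):

```java
public class BlinnPhong {

    static float dot(float[] a, float[] b) { return a[0]*b[0] + a[1]*b[1] + a[2]*b[2]; }

    static float[] normalize(float[] v) {
        float len = (float) Math.sqrt(dot(v, v));
        return new float[] { v[0]/len, v[1]/len, v[2]/len };
    }

    // Returns {diffuseFactor, specularFactor} for normal N, light dir L, eye dir E.
    static float[] lighting(float[] N, float[] L, float[] E, float shininess) {
        // Blinn-Phong uses the half vector between light and eye directions.
        float[] H = normalize(new float[] { L[0]+E[0], L[1]+E[1], L[2]+E[2] });
        float diffuse = Math.max(0f, dot(N, L));
        float specular = (float) Math.pow(Math.max(0f, dot(N, H)), shininess);
        return new float[] { diffuse, specular };
    }

    public static void main(String[] args) {
        float[] N = {0f, 0f, 1f};
        // Light and eye both along the normal: full diffuse, full specular.
        float[] r = lighting(N, new float[]{0f, 0f, 1f}, new float[]{0f, 0f, 1f}, 32f);
        System.out.println(r[0] + " " + r[1]); // 1.0 1.0
    }
}
```

The trick that makes it cheap is doing this math in tangent space: the light and eye vectors are rotated into the normal map's own coordinate system in the vertex shader, so the fragment shader can use the texture normal directly.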

Then we add a dice board, and again things do not feel right. Let's add shadow maps to better tie the objects together. And we get something like this render.
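On OpenGL ES 2.0, where depth textures are not guaranteed, shadow mapping usually means packing the light-space depth into the RGBA channels of an ordinary color buffer; the fragment shader below unpacks it with bit shifts. Here is a Java sketch of the pack/unpack pair we assume (the `unpack` mirrors the shader; the `pack` side is the standard counterpart, not quoted from the game code):

```java
public class DepthPack {

    static float fract(float x) { return x - (float) Math.floor(x); }

    // Pack a depth in [0, 1) into four 8-bit channels (stored as floats in [0, 1]).
    static float[] pack(float depth) {
        float r = fract(depth);
        float g = fract(depth * 255f);
        float b = fract(depth * 255f * 255f);
        float a = fract(depth * 255f * 255f * 255f);
        // Remove the part each channel carries into the next, finer channel.
        float[] c = { r - g / 255f, g - b / 255f, b - a / 255f, a };
        return c;
    }

    // Reassemble the depth: dot product with the inverse bit shifts.
    static float unpack(float[] c) {
        return c[0] + c[1] / 255f + c[2] / (255f * 255f) + c[3] / (255f * 255f * 255f);
    }

    public static void main(String[] args) {
        System.out.println(unpack(pack(0.3f))); // approximately 0.3
    }
}
```

The depth pass renders the scene from the light's point of view into this packed buffer; the main pass then compares each fragment's light-space depth against the unpacked value to decide whether it is in shadow.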

Obviously the shadow map resolution is not satisfying. It turns out that the shadow map is not fitted to the rendered view. To fix that, we need to find the right light view frustum for our scene: the smaller the view, the better the rendering quality. We could go for a static light view, but that would be largely suboptimal, since it would need to contain not only the board but also the space above it where the dice are rolled. The solution here is dynamic view hull fitting: when the dice are rolled, the light view hull is enlarged to encompass all the objects (but since they are moving, this is hardly noticeable); when they lie on the board, the light view hull is shrunk as tightly as possible. In the end we get the following render:
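Fitting the light view hull dynamically amounts to computing, each frame, the tightest axis-aligned bounds (in light space) around everything that casts a shadow, and sizing the light's orthographic frustum to those bounds. A simplified Java sketch of the idea, treating each die as a point with a bounding radius (names are illustrative, not the game's code):

```java
public class LightFrustumFit {

    // Tightest axis-aligned bounds enclosing every die (center + bounding radius).
    // Returns {minX, minY, minZ, maxX, maxY, maxZ}.
    static float[] fitBounds(float[][] centers, float radius) {
        float minX = Float.MAX_VALUE, minY = Float.MAX_VALUE, minZ = Float.MAX_VALUE;
        float maxX = -Float.MAX_VALUE, maxY = -Float.MAX_VALUE, maxZ = -Float.MAX_VALUE;
        for (float[] c : centers) {
            minX = Math.min(minX, c[0] - radius); maxX = Math.max(maxX, c[0] + radius);
            minY = Math.min(minY, c[1] - radius); maxY = Math.max(maxY, c[1] + radius);
            minZ = Math.min(minZ, c[2] - radius); maxZ = Math.max(maxZ, c[2] + radius);
        }
        return new float[] { minX, minY, minZ, maxX, maxY, maxZ };
    }

    public static void main(String[] args) {
        // Dice at rest on the board: the hull hugs them tightly.
        float[][] atRest = { {0f, 0f, 0f}, {2f, 0f, 1f} };
        float[] b = fitBounds(atRest, 0.5f);
        System.out.println(b[3] - b[0]); // width = 3.0
    }
}
```

Recomputed every frame, these bounds shrink automatically once the dice settle, which is what concentrates the shadow map texels on the small resting area and sharpens the shadows.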

At this stage we can ask: what can we add to polish it a bit more? Outlines? OK, right, here they are:

On a side note, we may worry about how these 3D dice render in 2D compared to our original Blender render. The answer: quite well, as you can see in the render below:

And that's how, starting from a very simple, boring need (render a die), we end up with a complete, exciting 3D setup: 3-pass rendering (shadow map + outline + 3D objects) + per-pixel lighting + normal mapping + shadow mapping + distance field coloring!

On the technical side, everything here is standard LibGdx code. You can find lots of resources regarding the 3D pipeline of LibGdx here; the shaders used in the game are below:

Vertex shader:

```glsl
uniform mat4 u_projTrans;
uniform mat4 u_worldTrans;
uniform mat4 u_worldViewTrans;
uniform mat4 u_lightProjTrans;
uniform vec3 u_cameraSpaceLightDir;

attribute vec4 a_position;
attribute vec3 a_normal;
attribute vec3 a_tangent;
attribute vec3 a_binormal;
attribute vec2 a_texCoord0;

varying vec2 v_texCoord0;
varying vec4 v_shadowCoord;
varying vec3 v_tangentSpaceLight;
varying vec3 v_tangentSpaceEye;

void main() {
    v_texCoord0 = a_texCoord0;
    vec4 cameraSpaceVertex = u_worldViewTrans * a_position;

    // Move light vector to tangent space
    vec3 cameraSpaceNormal = normalize((u_worldViewTrans * vec4(a_normal, 0.0)).xyz);
    vec3 cameraSpaceTangent = normalize((u_worldViewTrans * vec4(a_tangent, 0.0)).xyz);
    vec3 cameraSpaceBinormal = normalize((u_worldViewTrans * vec4(a_binormal, 0.0)).xyz);
    // Exactly the same as the previous line. Don't know which is best (parallelism vs simplicity)
    // vec3 cameraSpaceBinormal = cross(cameraSpaceNormal, cameraSpaceTangent);
    mat3 cameraTotangentMatrix = mat3(cameraSpaceTangent, cameraSpaceBinormal, cameraSpaceNormal);
    v_tangentSpaceLight = u_cameraSpaceLightDir * cameraTotangentMatrix;
    v_tangentSpaceEye = -normalize(cameraSpaceVertex.xyz) * cameraTotangentMatrix;

    // Coordinates for the shadow mapping
    v_shadowCoord = u_lightProjTrans * u_worldTrans * a_position;
    const mat4 biasMat = mat4(0.5, 0.0, 0.0, 0.0,
                              0.0, 0.5, 0.0, 0.0,
                              0.0, 0.0, 1.0, 0.0,
                              0.5, 0.5, 0.0, 1.0);
    v_shadowCoord = biasMat * v_shadowCoord;

    gl_Position = u_projTrans * cameraSpaceVertex;
}
```
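The `biasMat` at the end of the vertex shader remaps the x and y clip coordinates from [-1, 1] into the [0, 1] range used to sample the shadow map (GLSL matrix constructors are column-major, so the translation sits in the last four values). The same transform in scalar Java form, for a point already divided by w (illustrative sketch):

```java
public class ShadowBias {

    // Map normalized device coordinates in [-1, 1] to shadow-map texture space [0, 1].
    // Equivalent to the shader's biasMat applied after the perspective divide;
    // z is left untouched here, as the fragment shader rescales it separately.
    static float[] toTexSpace(float ndcX, float ndcY, float ndcZ) {
        return new float[] { 0.5f * ndcX + 0.5f, 0.5f * ndcY + 0.5f, ndcZ };
    }

    public static void main(String[] args) {
        // Bottom-left corner of the light's view maps to texture coordinate (0, 0).
        float[] t = toTexSpace(-1f, -1f, 0.25f);
        System.out.println(t[0] + " " + t[1]); // 0.0 0.0
    }
}
```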

Fragment shader:

```glsl
#ifdef GL_ES
precision mediump float;
#endif

uniform float u_alpha;
uniform vec3 u_body;
uniform vec3 u_ink;
uniform float u_ambient;
uniform float u_diffuse;
uniform float u_specular;
uniform float u_shininess;
uniform sampler2D u_normalTexture;
uniform sampler2D u_depthMap;

varying vec2 v_texCoord0;
varying vec4 v_shadowCoord;
varying vec3 v_tangentSpaceLight;
varying vec3 v_tangentSpaceEye;

const float smoothing = 1.0 / 16.0;

float unpack(vec4 colour) {
    const vec4 bitShifts = vec4(1.0, 1.0 / 255.0, 1.0 / (255.0 * 255.0), 1.0 / (255.0 * 255.0 * 255.0));
    return dot(colour, bitShifts);
}

void main() {
    // Compute base color (monochrome distance field)
    vec4 tex = texture2D(u_normalTexture, v_texCoord0);
    float distance = tex.a;
    float alpha = smoothstep(0.5 - smoothing, 0.5 + smoothing, distance);
    vec3 color = mix(u_ink, u_body, alpha);

    // Compute distance from light
    float normalizedDistance = v_shadowCoord.z / v_shadowCoord.w;
    normalizedDistance = 0.5 * normalizedDistance + 0.5;

    // Gather max visibility from light
    float shadowDepth = unpack(texture2DProj(u_depthMap, v_shadowCoord));

    // Perform shadow mapping
    if (normalizedDistance > shadowDepth) {
        gl_FragColor = vec4(u_ambient * color, u_alpha);
    } else {
        // Extract the perturbed normal in tangent-space coordinates from the texture,
        // and perform standard lighting math
        vec3 N = tex.xyz * 2.0 - 1.0;
        vec3 L = normalize(v_tangentSpaceLight);
        vec3 E = normalize(v_tangentSpaceEye);
        vec3 H = normalize(L + E);
        float diffuseFactor = max(0.0, dot(N, L));
        float specular = pow(max(0.0, dot(N, H)), u_shininess);
        gl_FragColor = vec4((u_ambient + u_diffuse * diffuseFactor) * color + vec3(u_specular * specular), u_alpha);
    }
}
```