Many modern games render complex geometry, and one of the trickiest things to render is translucent meshes. Opaque meshes can be rendered in any order because the graphics card's depth test ensures that the correct fragments are visible. The standard depth test works well for opaque meshes, but for translucent meshes it can produce incorrect results. When rendering translucent meshes, the graphics card performs per-fragment alpha blending to produce the final pixel color. To get the right result, the blending needs to happen in the order in which the fragments are seen by the camera, so if we render translucent meshes in an arbitrary order, the alpha blending can be incorrect. To ensure correct translucency, the translucent meshes need to be drawn last and need to be drawn back to front as seen from the camera.

Getting into the implementation details: I created a new submit function that is used specifically to submit translucent meshes. The translucent meshes use a different fragment shader in which the output color's alpha is set to 0.5, and the render state that the shader uses has alpha transparency turned on. When rendering a frame, I draw all the opaque meshes first, followed by the translucent ones. Before drawing the translucent meshes, I sort them back to front as seen from the camera. This is achieved by transforming each mesh's local position into camera space and comparing the z values to determine the order.

Once drawn, the output looks like this: there are two screenshots, one from the front and one from the back. As you can see, the translucency behaves as expected. When looking from the front, the meshes are sorted such that the green cube is drawn last; when looking from the back, the green cube is drawn first.

You can move the camera using the following controls: