Current Situation

I’m doing my thesis at Fatshark, in the Stingray engine, so the subject will to a large extent be determined by their current situation and needs. Let’s look at what Fatshark, and Stingray, are currently using and what problems they have.

Global Distance Fog

They currently use a global exponential height/distance fog quite extensively, and it does give a better impression of depth in the scene.

dfog    nodfog

The implementation is simple, following the exponential height fog formula found in the “better fog” article by Iñigo Quilez:

f = a\,e^{-b\,o_y}\,\frac{1 - e^{-b\,T\,v_y}}{b\,v_y}

where o_y is the camera origin height, T is the distance to the point, b is the density falloff and v_y is the vertical component of the view direction. Since the global distance fog is very simple, it is straightforward to use the distance formula to apply fog on objects that are in mist.
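As a rough illustration, the formula above can be sketched in Python (a hedged sketch, not the Stingray shader code; parameter names simply follow the symbols in the formula):

```python
import math

def height_fog_factor(a, b, origin_y, view_y, distance):
    """Exponential height fog amount along a view ray, following the
    "better fog" formula: f = a * e^(-b*o_y) * (1 - e^(-b*T*v_y)) / (b*v_y)."""
    # Guard the v_y -> 0 limit, where the factor tends to a * T * e^(-b*o_y).
    if abs(view_y) < 1e-6:
        return a * distance * math.exp(-b * origin_y)
    return (a * math.exp(-b * origin_y)
            * (1.0 - math.exp(-b * distance * view_y)) / (b * view_y))
```

Looking horizontally (v_y near zero), the fog amount grows linearly with distance at a rate set by the camera height, which is what gives the layered, height-dependent look.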



=> No local lights interaction.


The implementation does have sun blending, which can give an indication of where the sun is located with respect to the viewing angle, but there are no interactions with other lighting in the scene.


=> Fog blending of transparent objects is not correct. In addition to the background fog, transparent objects also get fog applied, but incorrectly.

transpdist   transpdist2   transpdist3

Say the background, x, is at distance d_1 behind the transparent surface, x_t, which is at distance d_0 from the camera. x_t should get a radiance according to L_o' = f(L'_r + f(L_o, d_1), d_0), but since the transparency is calculated after the fog is applied to the background it instead becomes L_o' = f(L'_r + f(L_o, d_0+d_1), d_0). The distance d_0 is accounted for twice, which is why the flask gets brighter as we get further from it.
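The double counting can be illustrated numerically. This is a hedged sketch with made-up values and a toy scalar fog model f that blends radiance toward a bright fog colour with transmittance e^(-density·d); none of the constants come from the engine:

```python
import math

def f(radiance, d, fog_color=0.7, density=0.1):
    """Toy scalar fog: blend radiance toward fog_color by e^(-density*d)."""
    t = math.exp(-density * d)
    return radiance * t + fog_color * (1.0 - t)

d0, d1 = 5.0, 10.0    # camera->surface and surface->background distances
L_o, L_r = 0.1, 0.05  # background radiance and the surface's own radiance

# Correct: fog the background over d1, composite, then fog the result over d0.
correct = f(L_r + f(L_o, d1), d0)

# Current behaviour: the background was already fogged over its full distance
# d0 + d1 before compositing, so d0 is applied to it a second time.
incorrect = f(L_r + f(L_o, d0 + d1), d0)

# With a bright fog colour, the extra fogging makes the composite brighter,
# matching how the flask brightens with distance in the screenshots.
assert incorrect > correct
```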




Billboard Particles

Stingray has a particle system which is used for many local fog/smoke/dust/fire/haze effects.

The implementation uses “Soft Particles”, as described in the Wolfire blog article, which can yield quite good-looking billboard particle effects. The main advantage is that artists are very used to working with particle systems such as these, and they are versatile in that they can be used for many different visual effects.
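The core of the soft-particles trick can be sketched as follows (a minimal sketch; the fade_distance parameter and linear-depth inputs are illustrative assumptions, not engine values):

```python
def soft_particle_alpha(base_alpha, scene_depth, particle_depth, fade_distance=1.0):
    """Fade a billboard fragment's alpha by the (linear) depth difference
    between the opaque scene and the particle fragment, hiding the hard
    line where the quad intersects geometry."""
    fade = (scene_depth - particle_depth) / fade_distance
    return base_alpha * max(0.0, min(1.0, fade))
```

A fragment right at an intersection (scene_depth equal to particle_depth) becomes fully transparent, and the particle fades back in over fade_distance.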

“Faked volumetrics” from local lights is also achieved to some extent by iterating over culled light lists in the particle vertex shaders; per-pixel local lights were too expensive.




=> The billboard fog is not physically based and is often used in very “tricky” ways. An example is a level with swamp fog, where the desired effect is achieved by combining a global distance fog with billboard particles of blood stains to simulate how the fog is less dense in places.

visiblebillboards2    bloodpoolsmoke

In the right picture you can see the blood-stain billboard particles that are used to make the swamp fog effect.


=> The local lights code is per-vertex and yields very visible seams.



=> Fill-rate problems are easy to run into when many overlapping billboards cause overdraw.


Light Shafts

The light shafts code is based on a GPU Gems article called “Volumetric Light Scattering as a Post-Process”, where crepuscular rays are calculated in screen space from the screen-space sun position.
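The technique can be sketched in Python roughly as follows (a hedged paraphrase of the GPU Gems shader on a plain 2D grid; the occlusion image, sample count and decay constants are all illustrative):

```python
def light_shaft_intensity(occlusion_image, width, height, sun_x, sun_y, px, py,
                          num_samples=16, density=1.0, decay=0.95, weight=0.1):
    """Screen-space crepuscular rays: march from the pixel toward the sun's
    screen position, accumulating decayed samples of an occlusion image
    (bright where the light is visible, dark where blockers occlude it)."""
    dx = (px - sun_x) * density / num_samples
    dy = (py - sun_y) * density / num_samples
    x, y = float(px), float(py)
    illumination_decay = 1.0
    total = 0.0
    for _ in range(num_samples):
        # Step one sample toward the sun and fetch the occlusion value.
        x -= dx
        y -= dy
        xi = min(max(int(x), 0), width - 1)
        yi = min(max(int(y), 0), height - 1)
        total += occlusion_image[yi][xi] * illumination_decay * weight
        illumination_decay *= decay
    return total
```

Because everything is sampled in screen space, the accumulation has nothing to march toward once the sun leaves the frame, which is exactly the limitation noted below.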




=> Effects disappear as soon as the light source is out of view:




Local Fog Volumes

Stingray also has support for local fog volumes, which are used to some extent to create local fog effects. Currently, local fog volumes are primarily used to create flowing fog on the ground in some areas. The system is parameterized with a density height-map texture and static wind parameters.

magnusplazafog    startfog



=> Hard to work with. Currently, the local fog volume system is not very artist-friendly, and to achieve the effects above, very large volumes are placed beneath the level.

magnusplazafogvolume    magnusstartfog

=> The volumes are not really volumetric.

The same exponential height and distance formula as for the global fog is used, and then weighted with a density sampled from the density height-map texture. Important effects such as glow around light sources or volumetric shadows are therefore not included.
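In sketch form, the weighting amounts to the following (a hedged sketch reusing the global fog factor; density_sample stands in for the height-map texture fetch, which the sketch does not model):

```python
import math

def local_fog_factor(a, b, origin_y, view_y, distance, density_sample):
    """Local fog volume factor: the same exponential height/distance fog
    amount as the global fog, scaled by a density value sampled from the
    volume's density height-map texture."""
    if abs(view_y) < 1e-6:
        base = a * distance * math.exp(-b * origin_y)  # v_y -> 0 limit
    else:
        base = (a * math.exp(-b * origin_y)
                * (1.0 - math.exp(-b * distance * view_y)) / (b * view_y))
    return base * density_sample
```

Since the density only scales a distance-based factor, the result is still a surface-fog blend per pixel rather than true in-scattering, which is why light glow and volumetric shadows cannot fall out of it.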



=> Texture is visible. The current sampling method also produces visible patterns when looking at higher-density fog (a bit hard to show in a screenshot, but in the wind the texture can be perceived as very repetitive).




=> View dependent(?)

=> Temporal aliasing(?)


Fog Planes

“Fog planes” are used quite extensively to occlude vision in things such as pipes and wells, and are mostly black. They are simple quads rendered with a linear distance fog formula, which gives pretty good results.

fogplane1 fogplane2 fogplane3
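A linear distance fog of this kind can be sketched as follows (hedged; the fog_start/fog_end names are assumptions, and in the engine the resulting factor would blend the quad toward its mostly-black fog colour):

```python
def linear_fog_factor(distance, fog_start, fog_end):
    """Classic linear fog: 0 before fog_start, 1 beyond fog_end, linear in
    between. On a fog plane this factor controls how much of the (mostly
    black) fog colour covers what lies behind the quad."""
    if fog_end <= fog_start:
        return 1.0  # degenerate range: fully fogged
    return max(0.0, min(1.0, (distance - fog_start) / (fog_end - fog_start)))
```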

General Problems

=> Can’t handle Multiple Lights.

The different fog systems don’t account for many lights in a good way. The closest attempt is the per-vertex calculations in the billboard system, but they are not always convincing.


=> Can’t handle Transparency well.

Transparent objects simply don’t render correctly when placed inside fog, dust, etc. At the moment, transparency is used very sparingly since it is very troublesome.




=> Artists place a lot of lights

=> Different scenarios require different fog systems

=> The systems don’t come together into one coherent whole