Weathering and aging effects in the hands of artists

Version: 1.1 – Living blog – First version was 04 May 2014

With permission of Dontnod Entertainment

It is now common in games to have dynamic weathering and aging effects: rain, snow, dirt, rust, dust, pollution… Most of the time these effects are driven by programmers because they require specific code, like access to a custom shadow map, or are performed by a full-screen post process. Before leaving Dontnod I was working on a feature to allow artists to handle these kinds of effects themselves. At the time we required a lot of effects and I wanted to allow artists to prototype as many ideas as they could. This post explains the feature and gives implementation details under Unreal Engine 4. It is aimed at deferred rendering engines. This weathering and aging feature has been used at Dontnod, but there are some pitfalls which may prevent using it effectively in a shipping game. It all depends on the scale of the game and the performance expected. I hope that by exposing this idea, others can improve the system 🙂

I thank Dontnod for allowing me to talk about it.

The weathering and aging effects

We know from the graphics literature that a lot of weathering and aging effects can be achieved through surface property modification [1]. Of course, using a complex multi-layered lighting model is the more correct way to handle it, but it is far less flexible, requires a programmer to code every effect, and is difficult to get right for all supported light types (like image-based lighting or area lights).

As an example, the wet surfaces appearing when it is raining can be simulated with material property modification: roughness, diffuse albedo, normal… I talk a lot about this subject in other posts on this blog (see the water drops series). However, these modifications should be done only where they matter. When it is raining, you don't want your interior surfaces to be wet. In Remember Me, we handled this by adding some extra code in the shaders to modify the surface properties; artists then vertex painted the parts of the surfaces that needed to stay dry. But this was not sufficient to handle all cases of wet surfaces. For example, a player walking in a wet street would, once he gets into a dry interior, leave wet footprints on the ground. This could have been simulated with decals.
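To make this concrete, the kind of surface property modification used for wet surfaces can be sketched in a few lines of shader code. This is only an illustrative HLSL sketch with assumed remapping constants, not production code; `wetLevel` could come from vertex paint, a decal, or an effect weight:

```hlsl
// Illustrative sketch: modify material properties to fake a wet surface.
// wetLevel in [0,1]; the remapping constants are assumptions.
void ApplyWetness(inout float3 diffuseAlbedo, inout float roughness,
                  float porosity, float wetLevel)
{
    // Water filling the pores darkens the diffuse albedo of porous materials.
    diffuseAlbedo *= lerp(1.0, 0.6, porosity * wetLevel);

    // A thin water film makes the surface much smoother (lower roughness).
    roughness = lerp(roughness, 0.05, wetLevel);
}
```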

Taking another example, in Killzone: Shadow Fall [2] they perform a full-screen pass to modify the material attributes where normals point up, to add dust where it matters (after an explosion, for example).

GBuffer modification

Thinking with a deferred renderer in mind, it is possible to identify a set of desired controls for artists allowing them to modify material properties and simulate a weathering or aging effect. These controls perform GBuffer modifications through:
– Deferred decals
– A full-screen quad. I call it a material postprocess.
– Deferred effect lights: these are the same as lights, with or without a shadow map, except they behave like deferred decals. The amount of light is used as the opacity of the blending operation. Soft shadow maps also allow smooth transitions between effects.
– Object shaders: properties are modified directly at GBuffer generation time in the shader of the object, so this requires specific code in each shader. Artists can use vertex painting to provide information for an effect.

Each of these tools performs some material property modification to simulate an effect, like darkening the diffuse and boosting the smoothness for wet surfaces. All the GBuffer modifications must be done before the lighting pass to be taken into account.

Applying the material property modification for these GBuffer controls can be done in two ways. The most common is hardware blending, but it can be too restrictive. The other is to read and write the same textures (a.k.a. programmable blending). Sadly, this ability is not widely supported: the PS4 and Mantle support it, for example, but DX11 doesn't (I am not talking about Intel's pixel synchronization, but simply reading and writing the same pixel).
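To illustrate the difference, here is what such a read/modify/write would look like in a pixel shader. This is a sketch of the pattern that DX11 forbids, with illustrative names: a texture cannot be read while it is bound as the current render target, which is exactly why the modification otherwise has to go through hardware blending:

```hlsl
// Sketch of "programmable blending": read the destination pixel, combine
// arbitrarily, write it back. PS4 and Mantle allow this; DX11 does not
// (the texture cannot be read while bound as the render target).
Texture2D<float4> GBufferC; // also bound as render target - illegal on DX11

float4 WetDecalPS(float4 svPos : SV_Position) : SV_Target
{
    float4 dst = GBufferC.Load(int3(svPos.xy, 0)); // read the destination
    dst.rgb *= 0.6;                                // arbitrary combine operation
    return dst;                                    // write it back
}
```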

Delaying the GBuffer modification

To support every platform and to be able to do any customization of material properties, I perform the modification of material properties in an extra full-screen pass, effectively delaying the GBuffer modification. Rather than modifying the GBuffer, the different tools simply output an effect weight into the GBuffer. This effect weight is read later in the extra pass to apply the effect. There are multiple benefits to doing this:
– Applying the effect only once can save performance for heavy effects
– Accumulated effect weights can be clamped in the delayed pass to limit the strength of an effect
– There is a centralized place to deal with the effect, which is easier to author.
Sadly, it requires storing the effect weight in the GBuffer.
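The two stages can be sketched as follows (names, channels and values are illustrative, not the actual implementation):

```hlsl
// Stage 1 - tagging: a decal / effect light / object shader outputs only an
// effect weight; additive blending accumulates weights from several sources.
float4 TagPS() : SV_Target
{
    float rainWeight = 0.7; // e.g. light intensity * shadow term
    return float4(rainWeight, 0.0, 0.0, 0.0);
}

// Stage 2 - delayed full-screen pass: read the accumulated weight, clamp it,
// and apply the actual (possibly heavy) material modification only once.
void ApplyPass(float2 uv, inout FGBufferData Data)
{
    float wetLevel = saturate(SampleEffectWeights(uv).r); // clamp accumulation
    Data.BaseColor *= lerp(1.0, 0.6, wetLevel);
    Data.Roughness  = lerp(Data.Roughness, 0.05, wetLevel);
}
```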

Outputting only one weight allows supporting only one effect at a time. To support multiple effects, more weights have to be stored. I call this process of outputting effect weights "the tag system". The extra pass is the weathering and aging pass and is artist controllable. I will simply call it the "postprocess" pass from here on. Here is an example result of such a feature done by an artist without requiring a programmer (the puddle is a deferred effect light):


In practice, artists are not limited to weathering and aging effects; they can do whatever they want (but I have kept the original name I gave to this feature). Here is another example where deferred effect lights are used to paint on objects.


Implementation in Unreal Engine 4

I want to provide details on how this feature works under UE4 so that the reader gets a better understanding of the artist controls. I encourage people talking about new systems/features to also talk about their usability, which is too often forgotten.

The tag system

I have added support for 3 effect weights via an extra buffer (meaning there are now 6 render targets for the GBuffer; UE4 defaults to 5). The alpha channel of a render target can't be blended by the screen space decal, the material postprocess or the effect light, so I use it to store the porosity. Porosity is an unusual parameter that reflects a characteristic of a surface, like roughness does. It is used in some weathering and aging effects. More details can be found in DONTNOD Physically based rendering chart for Unreal Engine 4 and in Water drop 3a – Physically based wet surfaces.

Every tool of the tag system is a customization of a UE4 feature. UE4 already supports lights, screen space decals and material postprocesses, and has a node-based material editor.

The object shader tagging is done through the material editor by adding the 3 effect weights to the output node:


The effect weights are called DNEParameterX. The name is voluntarily generic, as artists decide which effect each weight is linked to.

The screen space decal uses a new decal blend mode to only affect the render target holding the effect weights. Any blending mode (opaque, additive, translucent, modulate) can be used to affect the effect weights. It is important that only the connected attributes are affected: if you blend one attribute, you must take care not to overwrite the other attributes with a default value. We use a write mask generated from the current list of connected outputs to avoid that.


The effect lights are a new type of light derived from the UE4 ones. We have directional, spot and point lights. They have the same properties as the default ones, with shadow support (though we disable some features like IES profiles). However, we have written a specific rendering path for these lights to be more lightweight. All the effect lights are rendered as deferred lights. A light can affect any combination of the 3 effect weights; the light intensity represents the value to blend into the GBuffer. Effect lights use additive blending. A missing option that I didn't get time to add is the ability to render only static geometry into the shadow map. This is useful for effects like rain, where moving objects should not "disable" the wet surfaces.
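Conceptually, an effect light shades like a regular deferred light but accumulates a weight instead of lighting. A rough sketch, where the helper names are assumptions:

```hlsl
// Sketch of a deferred effect light: the light intensity, attenuated and
// masked by the (soft) shadow map, becomes the effect weight to accumulate.
float4 EffectLightPS(float3 worldPos) : SV_Target
{
    float atten  = DistanceAttenuation(worldPos) * SpotAttenuation(worldPos);
    float shadow = SampleShadowMap(worldPos);  // soft shadow -> smooth borders
    float weight = LightIntensity * atten * shadow;

    // Additive blending accumulates the weight into the selected channels,
    // e.g. EffectChannelMask = float4(1,0,0,0) to affect only effect weight 0.
    return weight * EffectChannelMask;
}
```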


I didn't get the time to add the material postprocess support. It would require doing the postprocess pass before the lighting, but the principle is the same as for screen space decals.

Order matters, and the tools of the tag system are applied in the following order:
– Object shaders
– Effect lights
– Material postprocess
– Screen space decals

The only reason behind this order is that we wanted to be able to remove effect weights with screen space decals (this can be achieved with translucent blending or negative additive blending). Something similar is present in the Killzone: Shadow Fall approach [2]: their screen space decals store a mask in the GBuffer which is reused by their postprocess pass to not apply the effect. Here it is the reverse: the decal neutralizes the effect weights, and this automatically undoes the application of the effect.
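As a sketch, a "dry zone" decal using translucent blending (dst = src * a + dst * (1 - a)) can force the accumulated weights back to zero inside its volume (values and names illustrative):

```hlsl
// Sketch: a decal that neutralizes previously accumulated effect weights,
// e.g. to keep a covered area dry while an effect light tags rain everywhere.
float4 DryDecalPS() : SV_Target
{
    float opacity = 1.0;                    // fully neutralize inside the decal
    return float4(0.0, 0.0, 0.0, opacity);  // translucent blend towards weight 0
}
```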

The post process

The postprocess is where everything happens and where artists express themselves the most. We use the material postprocess editor for it, but created a new material domain to identify it as the weathering and aging postprocess. Editing the postprocess is really similar to any material postprocess in UE4. We have added a new DNESceneTexture node to access the effect weights (DNEParameter0) and the other GBuffer parameters. Here is an example of a simple wet surface postprocess created with this system, where DNEParameter0 stores the water amount:


Note that this kind of effect can't be done with classic blending. We use the metallic attribute to avoid darkening the diffuse in this effect, and this would not be possible if we weren't using the delayed GBuffer modification (unless programmable blending is supported). However, this postprocess pass requires a resolve of the whole GBuffer, as we read and write into it (this also depends on the platform; the PS4 does not require it). We need to initialize all the GBuffer data, override some of the values based on the shader nodes, then write the new result. To know whether we need to overwrite a value or not, we detect if an attribute is connected to the output. Here is the resulting template shader used by the postprocess:

void Main(
    float4 UVAndScreenPos : TEXCOORD0
    ,out float4 OutGBufferA : SV_Target0
    ,out float4 OutGBufferB : SV_Target1
    ,out float4 OutGBufferC : SV_Target2
    ,out float4 OutGBufferD : SV_Target3
    )
{
    float2 UV = UVAndScreenPos.xy;
    FMaterialPixelParameters MaterialParameters = MakeInitializedMaterialPixelParameters();
    // Retrieve the values of the GBuffer
    FGBufferData Data = GetGBufferData(UV, false);
    MaterialParameters.DNEGBufferData = Data;
    // The code below should follow what is done in BasePassPixelShader.usf
    Data.WorldNormal = normalize( GetMaterialNormal(MaterialParameters) );
    Data.Metallic = GetMaterialMetallic( MaterialParameters );
    Data.BaseColor = GetMaterialBaseColor( MaterialParameters );

    float4 OutGBufferE = 0.0f;
    float4 OutDNEGBuffer = 0.0f;
    EncodeGBuffer(Data, OutGBufferA, OutGBufferB, OutGBufferC, OutGBufferD, OutGBufferE, OutDNEGBuffer);
}

So the postprocess is artist controlled, but can become quite complex. Only one postprocess is allowed per frame, meaning that when you want to support multiple effects at the same time, like having a dirt amount and a water amount, this must be managed in the same shader node graph. Actually, this is a good thing, because you can now mix different weathering and aging effects, like dirt and snow. Artists are able to improve the quality of this mixing; separate blending passes could give bad results.
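For example, an artist-defined mixing rule between two effects might look like this in the graph, shown here as equivalent shader code (the rule and constants are purely illustrative):

```hlsl
// Sketch: two effects handled in the single postprocess graph, with an
// explicit artist-defined mixing rule (here, snow covers dirt).
void ApplyEffects(inout FGBufferData Data, float dirtWeight, float snowWeight)
{
    // Dirt: darken and roughen the surface.
    Data.BaseColor = lerp(Data.BaseColor, float3(0.3, 0.25, 0.2), dirtWeight);
    Data.Roughness = lerp(Data.Roughness, 0.9, dirtWeight);

    // Snow on top: attenuated where there is already a lot of dirt.
    float snowOnTop = saturate(snowWeight - 0.5 * dirtWeight);
    Data.BaseColor = lerp(Data.BaseColor, float3(0.95, 0.95, 0.95), snowOnTop);
    Data.Roughness = lerp(Data.Roughness, 0.4, snowOnTop);
}
```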

To give more power to the artists and to deal a bit with the limited number of effect weights available, we allow this postprocess to be changed by postprocess volumes:


This means that different parts of a level can use the DNEParameterX differently. It even allows swapping the shader graph currently in use. For example, the water amount used for wet surfaces can suddenly be used as a frozen amount and produce an icy look. Here is a proof of concept video done by Laurent Harduin (lead lighting artist) showing fun stuff that can be achieved with this feature (so don't expect a sexy demo). All the effects in this video were done without requiring a programmer.

Fun fact: the video takes place in the Dontnod office :).
Note how the shadow of the character in the effect light removes the lava effect at the end of the video. This is why I would need to implement the option to render only static geometry in the shadow map.

UE4 screen space decals advice

UE4 screen space decals don't work with indirect lighting. This is a pity, as most games can't do without it. The trouble is due to the fact that UE4 stores EmissiveColor and IndirectDiffuseLighting x diffuse albedo inside the main buffer. As the diffuse albedo is part of the equation, further modification of it by a decal won't be taken into account, and the decal will not be visible in indirect lighting.
Recently, Epic added a "solution" with a DBuffer, where decals are rendered into a specific buffer prior to the GBuffer generation, and the GBuffer generation uses this DBuffer. Sadly, this requires a full Z-prepass, meaning a large performance hit, which is bad.

A possible solution for this problem is to store only the IndirectDiffuseLighting plus an emissive multiplier, and use the baseColor as the emissive color (so emissive is divided into color and intensity, exactly like a light, which is fine as emissive is lighting anyway). Then perform the multiplication by the diffuse albedo later. This is what we have done at Dontnod: we perform the multiplication with the albedo inside the postprocess pass of the weathering and aging effect. The downside is that your diffuse albedo is now coupled to the emissive color: if a decal modifies the baseColor, it will affect the emissive color. In practice this is not a big deal, and it only affects opaque objects. Transparent objects can still use a classic emissive color.
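The idea can be sketched as follows (illustrative names; the exact packing is engine specific):

```hlsl
// Base pass: store indirect diffuse lighting plus an emissive intensity,
// WITHOUT multiplying by the diffuse albedo yet.
float3 StoreIndirect(float3 indirectDiffuse, float emissiveIntensity)
{
    return indirectDiffuse + emissiveIntensity;
}

// Weathering/aging postprocess, after decals have modified the GBuffer:
// perform the albedo multiply now, so the decal changes are visible in
// indirect lighting. BaseColor thus also acts as the emissive color.
float3 ResolveIndirect(float3 storedLighting, float3 baseColor)
{
    return storedLighting * baseColor;
}
```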


The devil is in the details: there is plenty of stuff to deal with in practice, and there are also a lot of possibilities. Deferred effect lights and deferred decals can be spawned at runtime, which allows many dynamic events. For example, there can be a directional explosion somewhere, and a deferred effect light can be spawned to draw the dust on the wall (in this case the shadow must not take dynamic geometry into account). I don't give performance numbers because they mainly depend on the effects being done and on how many deferred effect lights / deferred decals / material postprocesses are used, but the costs are a bit lower than those of the original non-tag tools (deferred lights, etc.). The feature is not perfect and I didn't get the time to turn it into a robust system, but it allows doing amusing stuff. By sharing the concept I hope others will have nice ideas. Dontnod has made crazy things with it :). It has been implemented for PC DX11 and for PS4. I am not sure this can be part of a AAA game due to the cost of the extra GBuffer required, but it could work if you are PS4 only. It all depends on your context and your performance budgets. Any discussion is welcome. Let me sum up the pros and cons of the method.


Pros:
– Fully customizable operations on material properties
– Customizable by artists
– Proper handling of cumulative effects (snow + dirt, for example)
– Better control of the accumulated effect weights (they can be clamped)
– Deferred decals can inhibit the effect


Cons:
– Like many deferred techniques, it doesn't work on transparents
– As the GBuffer modification is delayed, it requires space in the GBuffer to save the effect weights, which hurts performance. 6 GBuffer targets can be really bad on a platform like the Xbox One (because of the ESRAM), but it is affordable on PS4 (Infamous: Second Son and Killzone use 6 GBuffer targets [4])
– Artists always want more effect weights! 3 were not enough for their usage
– No access to the geometry information (like heightmaps or vertex normals) in the postprocess
– No access to the UV parameterization in the postprocess, meaning artists must use world space UVs deduced from the world position of the pixel. It can also be tricky to handle the texturing of vertical surfaces (normals not pointing up)
– Extra resolve of the GBuffer (depends on the platform)
– A programmer can still be required for some effects even with this system; it can't handle deformable snow, for example.

Some of these cons apply with or without a delayed GBuffer modification pass. With the delayed pass we lose the possibility of doing the modification with UV/geometry knowledge in the object shaders. However, as the system is supposed to be compatible with the other tools, this could not be done anyway.

A final note: often this system is not enough and needs to be complemented with other FX. For example, rain still requires a postprocess or GPU particles to produce the raindrops. Moreover, the handling of dynamic objects, like characters that can go back and forth between areas with weather effects, still requires game design scripts / gameplay programmers. In this case, depending on the weather type or event, you may or may not require persistent effects on characters. For example, a character walking under the rain should still be wet when arriving in a dry area, but this is not true for a character coming back from a dusty area. A paint bomb could explode and require characters to be permanently marked with paint. All these cases need to be handled in the object shaders, but it requires extra work to deal with the multiple contexts. One of the troubles here is that object shaders can't access the deferred effect lights' shadows. In Remember Me, we were simply raycasting up above the character's head to know if it was affected by the rain.


[1] Gu, Tu, Ramamoorthi, Belhumeur, Matusik, Nayar, “Time-Varying Surface Appearance: Acquisition, Modeling and Rendering”
[2] Valient, “Taking Killzone Shadow Fall Image Quality into the Next Generation”
[3] Lobanchikov, Gruen, “GSC Game World’s S.T.A.L.K.E.R: Clear Sky – a showcase for Direct3D 10.0/1”
[4] Bentley, “Engine Postmortem of inFAMOUS: Second Son”

4 Responses to Weathering and aging effects in the hands of artists

  1. seblagarde says:

    v1.1: added miscellaneous precisions throughout.

  2. Brian Karis says:

    Cool stuff! I know Martin will be very interested in this.

    The issue we are trying to solve with the Dbuffer is that the normals have already been used when the directional lightmap is evaluated. To be able to modify the normal after the base pass, the lightmaps with directionality would need to be stored to the Gbuffer or the lightmaps would need to be applied deferred. Neither is very attractive for us. Thus the Dbuffer and z prepass for any mesh with decals applied to it.

    An added bonus is blending doesn’t need to be supported in the Gbuffer, which is normally a pain in the ass.

    Another solution is clustered forward application of decals which we may implement but that wouldn’t support the full programability that our artists use a ton of. Decals would have to be fixed function, atlased and so on.

    Our artists are using decals more and more so solutions to these problems are of great interest to us. I have some even crazier ideas for decals but I’ll hold off for now till they are better formulated.

    • seblagarde says:

      Thanks for the comment Brian.

      Same analysis as yours here. Also, I hadn't thought about the DBuffer for solving this, good point.

      Any storage of non-directional lighting information inside the GBuffer will suffer from decals being applied later (not only for normals but also for albedo and roughness, depending on your lighting technique), and storing directional information is too much data. In Frostbite, Enlighten causes us the same trouble, whether using lightmaps or lightprobes.
      Like you, we were thinking of applying this directional lighting deferred, restricting the lighting to lightprobes, as lightmaps would not be practical. And this is exactly what Guerrilla have done (partially, in Killzone: Shadow Fall) as explained in their GDC 2014 presentation. Still, there is a loss in quality.

      And like you, I was thinking about clustered decals, which are highly desirable if you are doing clustered forward shading. But same conclusion as you: it would imply a lot of constraints.

      So I just restated what you already said, useless comment :). Thanks for sharing your thoughts here.
