Siggraph 2017: Physically-Based Materials: Where Are We?

The slides of my talk “Physically-Based Materials: Where Are We?” from the Open Problems in Real-Time Rendering course at Siggraph 2017 are available here:

http://openproblems.realtimerendering.com/s2017/index.html

This talk is about the current state of the art of physically based materials in real-time rendering and what could be done in the future.
People often say that material rendering is a solved problem, but we are very far from having solved it. The main reason is that we don’t even know what a true/correct model for a physically based material is.

Note: I forgot to mention, in slides 56-57 where I compare the anisotropic GGX with the reference, that I use the Disney remapping for the anisotropy parameter.

// Ref: http://blog.selfshadow.com/publications/s2012-shading-course/burley/s2012_pbs_disney_brdf_notes_v3.pdf (in addenda)
// Convert anisotropy ratio (0 -> isotropic; 1 -> full anisotropy along the tangent direction) to roughness
void ConvertAnisotropyToRoughness(float roughness, float anisotropy, out float roughnessT, out float roughnessB)
{
    // (0 <= anisotropy <= 1), therefore (sqrt(0.1) <= anisoAspect <= 1)
    // The 0.9 factor limits the aspect ratio to 10:1.
    float anisoAspect = sqrt(1.0 - 0.9 * anisotropy);

    roughnessT = roughness / anisoAspect; // Distort along tangent (rougher)
    roughnessB = roughness * anisoAspect; // Straighten along bitangent (smoother)
}
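
For context, here is a minimal sketch of how such remapped roughnesses typically feed the anisotropic GGX NDF. The function name and the convention that the caller precomputes the half-vector dot products are assumptions, not the exact code behind the slides:

#ifndef PI
#define PI 3.14159265358979323846
#endif

// Anisotropic GGX NDF. TdotH, BdotH and NdotH are the half vector H projected
// onto the tangent, bitangent and normal of the shading frame.
float D_GGXAniso(float TdotH, float BdotH, float NdotH, float roughnessT, float roughnessB)
{
    float f = TdotH * TdotH / (roughnessT * roughnessT)
            + BdotH * BdotH / (roughnessB * roughnessB)
            + NdotH * NdotH;
    return 1.0 / (PI * roughnessT * roughnessB * f * f);
}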

The conclusion of my talk is that a future BRDF could be:

Layered BRDF: 2 specular BRDF + Diffuse BRDF
– All derive from the same anisotropic NDF
– Energy conserving: MultiScattering, Fresnel interfaces
– Option to switch to Airy reflectance Fresnel
– Shape-invariant “matching measure” NDF
– Multiscale Diffuse and Specular representation

Other than the last two points, I think we can approximate such a BRDF in real time within two years (the next console generation?).
To expand a bit: I think a good BRDF could be an anisotropic GGX diffuse lobe + two anisotropic GGX specular lobes (one hazy and one sharp) + one isotropic GGX coat lobe, plus an option to replace the Fresnel term of the base specular layer with Airy reflectance. Specular/diffuse multiple scattering should be possible in real time with a precomputed table. An approximation of the Fresnel interfaces should be possible given some constraints and a well-chosen physical representation. Diffuse and specular roughness should be separated. Complex IORs for metals should use the artist-friendly two-color model from Framestore.
The true challenge is to stay lighting-coherent, i.e. to have a good approximation of the interaction of this model with area lights, image-based lighting and GI.
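
To illustrate the “multiple scattering with a precomputed table” point, here is a rough sketch in the spirit of Kulla and Conty’s energy compensation, assuming a precomputed directional albedo table for the single-scattering GGX lobe. The LUT name, its parameterization and the sampler are assumptions; the talk does not prescribe this exact form:

// Hypothetical precomputed table storing the directional albedo Ess of the
// single-scattering GGX specular lobe, indexed by (NdotV, roughness).
Texture2D<float> _GGXEnergyLUT;
SamplerState sampler_LinearClamp;

float3 ApplySpecularEnergyCompensation(float3 singleScatteringSpec, float3 fresnel0, float NdotV, float roughness)
{
    float Ess = _GGXEnergyLUT.SampleLevel(sampler_LinearClamp, float2(NdotV, roughness), 0);
    // Boost the lobe by the energy lost to multiple bounces, so a rough metal
    // no longer darkens in a white furnace test.
    float3 energyCompensation = 1.0 + fresnel0 * (1.0 / Ess - 1.0);
    return singleScatteringSpec * energyCompensation;
}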

But what is interesting is the consequence of such a choice. With 2 parameters for diffuse roughness, 5 for specular roughness, an RGB diffuse color, 2x RGB specular colors and 1 for the coat specular, that is 17 parameters (2 + 5 + 3 + 6 + 1) solely for the BRDF, without the normal map, AO, SO… And with the growing adoption of VR and 4K, I predict that forward engines will become the norm in the future. Not necessarily for games, but maybe for the growing demand for real-time movies.
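
For concreteness, one possible way to group those 17 scalar values; the field names and the exact split are my assumptions, not something the talk specifies:

// Hypothetical parameter block for such a layered BRDF, before normal map, AO, SO...
struct LayeredBRDFData
{
    float3 diffuseColor;        // 3: RGB diffuse color
    float2 diffuseRoughness;    // 2: anisotropic diffuse roughness (tangent/bitangent)
    float3 specularColor0;      // 3: RGB specular color, sharp lobe
    float3 specularColor1;      // 3: RGB specular color, hazy lobe
    float2 specularRoughness0;  // 2: anisotropic roughness, sharp lobe
    float2 specularRoughness1;  // 2: anisotropic roughness, hazy lobe
    float  coatRoughness;       // 1: isotropic coat roughness (5 specular roughness values in total)
    float  coatSpecular;        // 1: coat specular intensity
};                              // 17 scalar parameters in total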

Originally I wanted to cover the current state of physically based rendering (materials, lighting and camera). Due to lack of time I switched to materials only (volumes, surfaces, character BRDFs, etc.), and, again due to lack of time, I restricted myself to “common” materials. Then I reduced it to only opaque reflective materials (no transparency)… Too many things to cover!

PBR is a huge unsolved topic, and when I hear people saying “yes, we are PBR”, I just hear “I have no clue what PBR means”. And I am not confident myself that I understand what PBR really means 🙂

6 Responses to Siggraph 2017: Physically-Based Materials: Where Are We?

  1. Sam says:

    I do not know what goals you have. I’m just a designer and want things to be simple. In real life, I can construct my desired scenes in real-time CG without any additional specialist, just as I construct the scenes I want using lighting and backgrounds. But… it seems to be getting harder and harder.

  2. Yves Poissant says:

    Hi Sebastien,

    At the beginning of this blog post you mention that the course slides and notes are at the given URL. I found only the slides there. Are there any additional notes?

  3. Yves Poissant says:

    That was a very good read. Thanks for putting your thoughts in this document.

    There is something that’s been bothering me for a while concerning the BRDF formulation as
    (F(Wo,Wh)G2(Wo,Wi,Wh)D(Wh))/(4|Wg.Wo||Wg.Wi|)
    and even the distribution of visible normals
    (G1(Wo,Wh)D(Wh))/cos(thetao)

    It is the fact that the normal distribution does not change with Wo. Whatever the grazing angle of Wo, D is always the same, only scaled by G.

    It seems to me that D should appear as a smoother distribution as Wo approaches grazing angles. Drawing a schematic surface profile and hiding the portions of the surface masked at grazing Wo angles intuitively shows that.

  4. Paul Houx says:

    Hi,

    I’m sure you’ve heard of this paper by now?

    Seems like your prediction of “2 years” was too pessimistic.
