SIGGRAPH 2016: An Artist-Friendly Workflow for Panoramic HDRI
August 28, 2016
The slides and course notes for “An Artist-Friendly Workflow for Panoramic HDRI”, by my co-workers Sébastien Lachambre and Cyril Jover and myself, are available here:
On the official PBR course website:
http://blog.selfshadow.com/publications/s2016-shading-course/
and on the Unity Labs website:
https://labs.unity.com/
On the Unity Asset Store there is a pack of HDRIs: 7 latlong, 8192×4096 HDR images shot in different locations around the world. They are accurate, unclamped cubemaps of interior and exterior environments; HDRIs that include the sun are provided with an alternate version with the sun already removed. https://www.assetstore.unity3d.com/en/#!/content/72511
Note: the HDRIs can be downloaded in Unity and then used in another context; they are just .exr files. There is no restriction on either commercial or non-commercial usage.
A few notes:
As a programmer, I often asked myself how artists capture HDRIs. After some research I noticed that the HDRIs available on the Internet very often lack range, or lack the metadata needed to reconstruct an absolute HDRI. Worse, many HDRIs are tweaked to look good instead of being accurate enough to be used as light sources. With my co-workers we decided to write an extensive “tutorial” that explains how to capture an accurate HDRI. We deliberately provide a lot of detail and our equipment/software recommendation list, hoping to save readers time when they try to reproduce our workflow. We approached the workflow from an artist’s point of view rather than a programmer’s, trying to use commonly known artist software, and we limited ourselves to an average budget for this kind of capture.
The course notes (~80 pages) are the interesting part of the talk; the slides are just there to give an overview. I have included a section about what I call the “path of light”. It explains what happens inside a camera, following an emitted photon through the optics, onto the sensor, and then into the software processing. Understanding it is not required to apply our method, but it is a really interesting piece of knowledge and it helps to put the correct words on things :).
Lastly, in the course notes I decided to include a section called “Look Development”. Look development is a pretty common term in the VFX industry, but I was surprised to see how few people know about it in the game industry. At Unity we have developed a new tool called “Look Dev” that will ship as experimental in the upcoming 5.5 beta. It is a viewer that aims to help look development with HDRIs.
Of course, this kind of tool already exists (Substance, Marmoset, Mari, etc.). The benefit of having one integrated into Unity is that it is WYSIWYG (what you see is what you get) with respect to your Unity game.
Finally, I chose to add a digression about how to remove the sun and replace it with an analytic directional light of similar intensity, since this is often either not discussed or reduced to a single sentence in other documents. It is a genuinely complex topic and I would be happy to see more technical discussion around it.
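To give a flavour of the idea, here is a minimal sketch (Python, with an illustrative luminance threshold and function name) of estimating the irradiance contributed by the sun pixels of a latlong HDRI; the replacement directional light roughly has to inject this amount once the sun has been painted out. This is only a rough illustration; the course notes discuss the actual process and its pitfalls.

```python
import numpy as np

def sun_irradiance_from_latlong(hdri, threshold=5000.0):
    # hdri: (H, W, 3) float array of linear radiance values in latlong layout.
    # threshold: luminance above which a pixel is treated as part of the sun
    #            (illustrative value; it has to be tuned per capture).
    h, w, _ = hdri.shape
    # Polar angle (from the zenith) at the center of each pixel row.
    theta = (np.arange(h) + 0.5) / h * np.pi
    # Solid angle covered by one pixel of each row in a latlong parameterization.
    d_omega = (2.0 * np.pi / w) * (np.pi / h) * np.sin(theta)
    # Relative luminance, only used to select the sun pixels.
    lum = hdri @ np.array([0.2126, 0.7152, 0.0722])
    sun_mask = lum > threshold
    # The sun subtends a tiny solid angle, so for a surface facing it the
    # irradiance is close to the sum of radiance * solid angle over its pixels.
    weights = sun_mask * d_omega[:, None]
    return (hdri * weights[..., None]).sum(axis=(0, 1))  # per-channel irradiance
```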
We hope that our document will help people take their first steps into accurate HDRI capture, and that we will see more absolute HDRIs appearing. The document comes with some material: various Photoshop Actions that we use and a set of HDRIs. One is available on the publication website, and we expect to soon deliver, for free via the Unity Asset Store, a package of HDRIs that includes all versions (white balanced, absolute, unprocessed…) of an HDRI together with its metadata. At first we wanted to distribute all the CR2 files too, but the download size became insanely large, so we are thinking of doing it for a single HDRI only, so that programmers willing to run some tests will be able to.
This is a fantastic resource, Sebastian. Thank you so much.
behram
I was just looking into how to create HDR images for Unity. The HDR images I make are rendered skies from Vue, and I save out multiple exposures to be assembled for use as a sky texture. Is there a similar process?
When dealing with a CG image you don’t need to assemble the HDRI, unless the software you use can only output LDR, which is unlikely in this modern age :). So you can directly export an HDR latlong in .exr or .hdr to be loaded by Unity. Hope that helps.
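If you want to double-check that the exported latlong really is linear and unclamped before loading it in Unity, a quick sanity check can look like the sketch below (assuming an OpenCV build with OpenEXR support; the file name is illustrative):

```python
import os
os.environ["OPENCV_IO_ENABLE_OPENEXR"] = "1"  # recent OpenCV builds require this before import

import cv2

# Load the CG-rendered latlong as-is (float32, BGR channel order).
img = cv2.imread("sky_latlong.exr", cv2.IMREAD_UNCHANGED)
assert img is not None, "Could not read the EXR (check OpenEXR support)"

print("resolution:", img.shape[1], "x", img.shape[0])
print("max value :", float(img.max()))       # should be well above 1.0 for a sky with a visible sun
print("negatives :", bool((img < 0).any()))  # negative values usually point to a processing issue
```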
Thank you for your replies Sebastian.
I hope in 2017 one will be able to shoot “and” generate HDRIs right within Unity.
That way I can use devices such as HoloLens, Lenovo Tango, or Magic Leap to capture my environment and generate a real-time HDR probe to better integrate my assets into the scene.
I know you will give us this gift 😉
Cheers,
behram
It’s good, but why is it experimental?
I’ll ask you another question.
I am making an open world game.
Granite is used for texture streaming.
But I am worried because there is no cluster LOD; instance LOD is too heavy.
Is there a development plan for something corresponding to cluster LOD in Unity?
Hi. Please redirect your questions to the Unity forum. This blog is not the right place to ask questions about, or request features for, Unity. Sorry.
Really appreciate you publishing this information. It has been really informative. I am in the process of buying additional camera gear and had a question about the Sigma lens and neutral density filters you used.
How did you manage to mount the neutral density filters on the Sigma 8mm lens? Does the front adapter ring that comes with the lens accept 72mm threaded filters? Sigma states on their site that only rear-insertion filters are supported, but you clearly managed to use filters affixed to the front of the lens, which is awesome [http://blog.selfshadow.com/publications/s2016-shading-course/unity/s2016_pbs_unity_hdri_notes.pdf, page 37]. Did this require any hacks or adapters to make it work?
cheers,
Jens
Hey, yes, we simply use the adapter provided with the lens. The ND filter can be mounted on it.
Thanks! That keeps things simple and elegant.
Have you tried using ND gels mounted to the back of the fisheye? What issues did you run into using gels?
Hi, no, we haven’t tried it. The problem with gels is that they are not simple to remove.
Hi, I just tried to follow your steps to create HDRIs, but I can’t get the same result as you in the “Absolute HDRIs” step. When I divide the resized image (whose pixel values are the illuminance values) by the original HDRI, I always get a brighter result, while the figure in your paper shows a darker result, and my final result is just pure white. Am I doing something wrong?
Does the HDRI support modifying the exposure?
Yes, it does, but I didn’t adjust any exposure of the HDRI. By the way, I use a 2 EV step between shots; will this affect the result?
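For other readers hitting the same step: the absolute calibration essentially compares the illuminance measured on site with a luxmeter against the illuminance the HDRI itself predicts, then scales the HDRI by that ratio. A minimal sketch of that comparison (illustrative only; the luxmeter value and names are placeholders, see the course notes for the real procedure):

```python
import numpy as np

def latlong_upper_hemisphere_illuminance(hdri):
    # Illuminance predicted by a latlong HDRI for a horizontal, upward-facing
    # surface: integral of luminance * cos(theta) over the upper hemisphere.
    # The result is in whatever (relative) scale the HDRI values currently use.
    h, w, _ = hdri.shape
    theta = (np.arange(h) + 0.5) / h * np.pi          # polar angle from the zenith
    d_omega = (2.0 * np.pi / w) * (np.pi / h) * np.sin(theta)
    lum = hdri @ np.array([0.2126, 0.7152, 0.0722])   # relative luminance
    weight = np.where(theta < np.pi / 2, np.cos(theta), 0.0) * d_omega
    return float((lum * weight[:, None]).sum())

# Illustrative usage: scale the HDRI so that it predicts the luxmeter reading.
# hdri = ...            # (H, W, 3) float array loaded from the stitched EXR
# measured_lux = 52000  # hypothetical on-site luxmeter measurement
# scale = measured_lux / latlong_upper_hemisphere_illuminance(hdri)
# absolute_hdri = hdri * scale
```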
Hey Sébastien, I’ve just discovered the whitepaper and your presentation these days. Really interesting stuff! I am producing high-res HDRIs with resolutions up to 600 MP, so if I converted all the RAWs to TIFFs, I’d use about 40-50 GB per HDRI. That’s why I have a question about the RAW to TIFF conversion. The paper says “dcraw is used to convert RAW files extracted from the camera into a linear 16 bit TIFF. PTGui can load a RAW file directly (PTGui uses dcraw internally, like most HDRI tools) but our team decided to perform some extra manipulation (like chromatic aberration and vignetting corrections) before sending the shots to PTGui.” Can you give me more information on the “negative” points of using native RAW files directly in PTGui? Or what are the benefits of using TIFFs in PTGui? I would be glad to hear from you! 🙂 Regards, Chris
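For context on the step quoted above, a batch RAW to linear 16-bit TIFF conversion with dcraw can be as simple as the sketch below (the exact flags and the extra aberration/vignetting corrections are not shown; treat the flags and paths as illustrative):

```python
import subprocess
from pathlib import Path

# Convert every Canon RAW file in a folder to a linear 16-bit TIFF with dcraw.
# -4   : linear 16-bit output (no gamma, no auto-brightening)
# -T   : write a TIFF instead of a PPM
# -w   : use the camera white balance
# -o 1 : output in sRGB primaries (use -o 0 to stay in the camera color space)
for raw in sorted(Path("shots").glob("*.CR2")):
    subprocess.run(["dcraw", "-4", "-T", "-w", "-o", "1", str(raw)], check=True)
    # dcraw writes shots/<name>.tiff next to the source file.
```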