You might have been using HDR images already, to put the last touch on your lighting. Or maybe you used to wrap a library HDRI around your all-new model. Because HDR images carry real-world illumination, right?
So, wouldn't it be nice to put your model right in the same place, the very place that illumination is taken from?
Let's get started!
This will be your first task. Download whatever is necessary to import and set up the scene in your 3D software. The real challenge will be to create convincing glass, chrome, and pottery. Use the real objects in the background plate as a visual guideline.
This image will allow a direct comparison.
It will also put your 3D software to the test.
And, most importantly, it will make you familiar with the technique.
So this one is going to be fun.
Once you manage to recreate this scene, you end up with a perfect virtual stage. Now just clean up the table, and put any model of your own on it. You can use your materials from stage 1 as a template, or create new ones with the experience you gained.
If everything goes right, this will be a matter of plug-n-play.
Ready? Steady. GO!
The images come in different flavours and sizes. Bigger isn't always better. You will most likely find that the low-resolution environments give better GI results in less time. But hey - see for yourself!
Medium - 800 x 600 [1.3 Mb]
High - 1600 x 1200 [5.5 Mb]
Spherical Map a.k.a. Lat/Long
Low - 512 x 256 [0.3 Mb]
Medium - 1024 x 512 [1.3 Mb]
High - 2048 x 1024 [4.9 Mb]
Angular Map a.k.a. Probe
Low - 256 x 256 [0.1 Mb]
Medium - 512 x 512 [0.5 Mb]
High - 1024 x 1024 [1.7 Mb]
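If your renderer chokes on the larger maps, you can also build a lower-resolution copy of an environment yourself. A minimal sketch (assuming the HDR environment is already loaded as a float NumPy array; the function name is just illustrative):

```python
import numpy as np

def box_downsample(env, factor=2):
    """Average square blocks of pixels to build a lower-resolution copy
    of an HDR environment (float array, H x W x 3). Box filtering keeps
    the average light energy intact, which is what matters for GI."""
    h, w, c = env.shape
    h, w = h - h % factor, w - w % factor          # crop to a multiple of factor
    blocks = env[:h, :w].reshape(h // factor, factor, w // factor, factor, c)
    return blocks.mean(axis=(1, 3))                # average each block
```

For example, `box_downsample(env, 2)` turns the 2048 x 1024 spherical map into the 1024 x 512 one.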
The screenshot shows the original LightWave scene.
All conversions were made using Deep Exploration.
Make sure to switch off the light source, since we need it for the OpenGL preview only. The HDR environment is supposed to take care of all your lighting needs. In case the format of your choice lost the camera information, use this data:
X:  0.81524509 m    Heading (Y): -13.30°    Zoom Factor:    3.06
Y:  1.9344934 m     Pitch (X):    32.40°    Horizontal FOV: 47.09°
Z: -2.9125748 m     Bank (Z):      1.10°    Vertical FOV:   36.19°
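If your software wants only one field of view, the other follows from the image aspect ratio via the standard pinhole-camera relation. A quick sketch to check the numbers:

```python
import math

def vertical_fov(horizontal_fov_deg, width, height):
    """Derive the vertical FOV from the horizontal FOV and the image
    aspect ratio: tan(vfov/2) = tan(hfov/2) * height / width."""
    half_h = math.radians(horizontal_fov_deg) / 2.0
    return math.degrees(2.0 * math.atan(math.tan(half_h) * height / width))

vertical_fov(47.09, 800, 600)  # ≈ 36.19°, matching the camera data above
```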
Now, the trick is to project the backdrop image onto the table. In LightWave this is called a Front Projection; your software might use a different name. Either way, this will allow you to render the above image in one pass, with the table acting as a shadow catcher. You might need to adjust the table brightness to get a smooth transition to the actual backdrop. Do some quick test renders of the region where the laptop overlaps the table.
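Conceptually, a front projection just looks up the backdrop pixel that sits behind each surface point, as seen from the render camera. A minimal sketch (the matrices and function name are hypothetical stand-ins for whatever your package uses internally):

```python
import numpy as np

def front_projection_uv(point_world, view_matrix, proj_matrix):
    """Map a world-space point to backdrop image UVs by projecting it
    through the render camera - the table then 'wears' the exact piece
    of backdrop that lies behind it."""
    p = proj_matrix @ view_matrix @ np.append(point_world, 1.0)
    ndc = p[:2] / p[3]           # normalized device coords in [-1, 1]
    return (ndc + 1.0) / 2.0     # backdrop image UVs in [0, 1]
```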
Specular hits are in fact reflections of surrounding light sources - in effect, the upper dynamic range of the environment. Think about it for a second! We don't have traditional light sources here; we merely have the actual environment available. But hey - that's even better, isn't it?
So when you set up your material, use the reflection property instead of specular, and use blurred reflections instead of a glossiness/specular spread factor.
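The 0%-40% reflection ramp on the glass below follows the same idea as a Fresnel curve: weak reflection head-on, strong at glancing angles. Schlick's approximation (a standard formula, not something from the original scene files) captures this in one line:

```python
def schlick(cos_theta, f0=0.04):
    """Schlick's approximation of Fresnel reflectance: reflection equals
    f0 when looking straight at the surface (cos_theta = 1) and climbs
    toward 1.0 at glancing angles (cos_theta -> 0)."""
    return f0 + (1.0 - f0) * (1.0 - cos_theta) ** 5
```

An f0 of about 0.04 is typical for glass; shaders like LightWave's Fast Fresnel expose the same ramp through a glancing-angle parameter.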
A short description of the material settings above:
Mirror Ball: 70% Reflection
Glass_Outer: plain white color, Refraction Index 1.033, Fast Fresnel shader with 20° glancing angle, Reflection 0%-40%, Transparency 100%-80%, Interference shader (color bandings) at 37% blend
Glass_Inner: same as Glass_Outer, but Refraction Index 0.93. This takes care of the rays passing from glass into air - no need for double-sided surfacing!
Plate, Saucer, Cup: Color 66/75/90, 5% Bump FBM (3mm/3mm/3mm), 14% Reflection, 60% Reflection Blurring
Table: Color Map: backdrop in Front Projection, 10% Luminosity, 100% Diffuse
These settings are far from being perfect, but they might be a good starting point. Of course, you will do better.
Use GI, make sure all 3D lights are off, and turn on reflection and refraction.
And let everybody know at the HDRI Challenge HQ!
Now you are ready to enter stage 2: Freestyle.
Clean up the table, and put your own monster on it. Animate it! Do something cool. Come on, anything is cooler than this boring plate...
Join the discussion!
The primary goal of this challenge is to examine the usability of HDRI. Most 3D packages claim to support HDRI and image-based lighting, but to what extent? How many hoops do you have to jump through to make it work? Is the extended range taken into account by the raytracing algorithms of your render engine - i.e., does it show up in reflections and refractions?
Do you encounter any special workflow issues that complicate the whole setup process?
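One quick way to answer the extended-range question: save a render as a float image and check whether any values survive above 1.0. A tiny sketch (assuming the render is already loaded as a float NumPy array):

```python
import numpy as np

def has_extended_range(image):
    """True if a float image contains values above 1.0 - a quick check
    that the renderer kept the HDR range instead of clipping it to
    display white."""
    return float(image.max()) > 1.0
```

If a mirror-ball reflection of the window caps out at exactly 1.0, your engine clipped the environment somewhere along the way.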
In a combined effort we will evaluate these issues, and share our solutions. As a result, this is supposed to become a knowledge base, providing an optimized HDRI workflow for every 3D software around.
This project is actually part of a bigger one: my diploma thesis, entitled "Practical Application of High Dynamic Range Imaging in a Postproduction Pipeline". No, it's not about showing fancy pictures of which big studio used HDRI in which movie - in their own proprietary pipeline, about which we unfortunately get no further information.
Instead, I am following a hands-on approach. I compared different ways to photograph and generate HDRI environments, how to edit them, and now how to apply them in 3D - all using nothing but non-proprietary tools, so that this technology is open to everybody.
And this is where this challenge comes into play. I surely hope you don't feel like you're being taken advantage of. It's just not fair to make such a workflow evaluation and take only the big 3-5 packages into account. And testing every package around myself is way too much work.
In the end, your efforts are highly appreciated.
See, in exchange you get a nice template scene for your own use. And the best part is - this one will only be the beginning. Expect more of these HDRI sets in the future...
quite soon, actually.
Special thanks go to Christian Bauer for hosting this event.
For comments or questions join the discussion on HDRI Challenge HQ, or email me privately to Blochi@Blochi.com.