In a span of approximately four months (November 2021 to February 2022), together with Pablico Fournier and Hjalti Hjalmarsson, under the supervision of Andy Goralczyk and with help from Francesco Siddi, we created a short promotional animation for the Snow character. (Rigging: Demeter Dzadik, pipeline support: Paul Golter, additional modeling: Julien Kaspar, FX and geometry nodes: Simon Thommes, layout/animation feedback: Rik Schutte.)
The idea was to place Snow in a dystopian, Aztec-inspired mega-city, where he is chased by two drones. After being trapped by a third drone, he uses his secret power, and the animation ends.
We wanted to create a short animation to test Eevee in a production environment. What are the advantages and obstacles when working with Eevee vs Cycles? What are the render times? Which factors increase render times? These are some of the things we wanted to find out.
Keep in mind that some of the aspects mentioned here will change due to the Eevee Rewrite.
Textures (made in Krita) and shaders are designed to be re-used in a semi-procedural way. The “hotspotting” technique was used for texturing: faces of an object are assigned to regions of a shared texture based on their proportions and scale. Vertex paint was then used to add effects like puddles, dirt, and rust, making objects look more distinct while keeping the materials streamlined.
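As a rough sketch of the vertex paint blending (the material, node, and attribute names here are hypothetical, not the actual production setup), a painted color attribute can drive the mix between a clean and a dirty color in a material:

```python
import bpy

# Sketch: blend a clean and a dirty color using a painted color attribute
# (here called "Dirt"). Material and attribute names are hypothetical.
mat = bpy.data.materials["example_material"]
nodes = mat.node_tree.nodes
links = mat.node_tree.links

attr = nodes.new("ShaderNodeAttribute")
attr.attribute_name = "Dirt"                 # the painted vertex color layer

mix = nodes.new("ShaderNodeMixRGB")
mix.inputs["Color1"].default_value = (0.8, 0.8, 0.8, 1.0)   # clean base
mix.inputs["Color2"].default_value = (0.3, 0.2, 0.15, 1.0)  # dirt / rust

bsdf = nodes["Principled BSDF"]
links.new(attr.outputs["Fac"], mix.inputs["Fac"])
links.new(mix.outputs["Color"], bsdf.inputs["Base Color"])
```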
The scene was built as if it would be rendered in Cycles, meaning none of the model optimizations you would see in modern games: no LODs, baked normal maps, or retopology were part of this production. With that in mind, modifiers like Subdivision Surface or Bevel were limited to the more important objects. Decimation, both destructive and non-destructive, was used to optimize the scene geometry.
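Below is a minimal sketch of how non-destructive decimation on background props and hero-only subdivision could look through the Python API; the collection and object names are illustrative, not the actual production naming:

```python
import bpy

# Sketch: decimate background props non-destructively, while keeping the
# Subdivision Surface modifier only on a hero object.
for obj in bpy.data.collections["background_props"].objects:
    if obj.type == 'MESH':
        dec = obj.modifiers.new(name="Decimate", type='DECIMATE')
        dec.ratio = 0.5           # keep roughly half of the faces

hero = bpy.data.objects["snow"]
sub = hero.modifiers.new(name="Subdivision", type='SUBSURF')
sub.levels = 1                    # viewport level
sub.render_levels = 2             # final render level
```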
The production structure was simplified by adding the lighting to the set file. This creates more consistency but is less performance-friendly. The lighting.blend contains the drone Spot Lights, the set lighting, and the Snow lighting rig. The latter is an empty object, constrained to the character rig, that carries key, fill, rim, and other lights.
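For illustration, a lighting rig like this could follow the character with a Child Of constraint on its root empty; the object and bone names below are hypothetical:

```python
import bpy

# Sketch: constrain the light rig empty (parent of key/fill/rim lights)
# to the character armature. Names are made up for the example.
rig_empty = bpy.data.objects["snow_lighting_rig"]
armature = bpy.data.objects["RIG-snow"]

con = rig_empty.constraints.new(type='CHILD_OF')
con.target = armature
con.subtarget = "root"            # follow the rig's root bone
```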
Beau's machine:
Paul's machine:
Animation:
Eevee and Cycles have their own strengths and weaknesses when rendering animations. Eevee has the potential to render faster and with better image clarity; however, during the compositing stage, the render passes were missing the refinement of the sample-based “Combined” pass. Specifically, Depth of Field and Motion Blur revert to a lower-fidelity version in the passes, and any materials using Alpha Blend are missing from the image entirely.
This ultimately caused the reconstructed image to be noisier and incomplete, making compositing less effective. That is why a lot of smaller adjustments and iterations were made before creating the final renders.
When rendering (F12), the CPU is used for object evaluation, and this process runs on a single thread. After spending time logging performance, it became apparent that single-thread CPU performance plays a large role in determining render times, no matter the GPU power. This is context sensitive and applies mostly to Snow and the drones, which have Subdivision Surface modifiers (when downloading the production files, a good way to limit render times is to enable Simplify).
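For reference, Simplify lives in the render settings and can also be enabled with a small snippet like this; the subdivision caps are just an example:

```python
import bpy

# Sketch: enable Simplify to cap Subdivision Surface levels scene-wide,
# which cuts down the single-threaded object evaluation time.
scene = bpy.context.scene
scene.render.use_simplify = True
scene.render.simplify_subdivision = 1          # viewport cap
scene.render.simplify_subdivision_render = 1   # final render cap
```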
As you can see in the image below, Paul's machine has faster single-thread performance (see Paul's machine above, under 'Quick Stats'). Even though my machine has a better GPU, its server processor has much lower single-thread performance. This could use more testing where the GPU is the same but the CPU differs.
Eevee requires shaders to be compiled and loaded on the GPU in order to render. This can take some time at the beginning of a production; fortunately, it becomes faster once most of the shaders have been written.
Rendering in the viewport while adjusting the lighting with eight or more samples slows down responsiveness a lot. That is why the viewport samples were set to two, with a scaled-down viewport. To improve performance further, we used Simplify to lower the object evaluation time.
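A tiny sketch of that viewport setting through the Python API (the final render sample count is left untouched):

```python
import bpy

# Sketch: keep Eevee viewport samples low for responsive lighting work.
eevee = bpy.context.scene.eevee
eevee.taa_samples = 2            # viewport samples
# eevee.taa_render_samples keeps its production value for F12 renders
```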
Lights in Cycles can only have their influence on the various channels, like Diffuse and Glossy, turned on or off. Lights in Eevee allow more control by treating the Diffuse, Glossy, and Volumetric contributions as slider parameters. This is one of my favorite aspects of working in Eevee, especially increasing the values above 1.0 to boost a channel's influence.
To avoid uniformity in the drone spot lights and give them more variation - like an old flashlight - small "shadow caster" objects are placed to block the light. There are two spot lights: one lights the scene, and the other affects the volumetrics only (see image below).
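To illustrate both points, here is a hedged sketch of the light settings involved; the light names and values are made up for the example:

```python
import bpy

# Sketch: Eevee exposes a light's Diffuse, Specular and Volume influence
# as factors rather than on/off toggles.
key = bpy.data.lights["snow_key"]
key.diffuse_factor = 1.4          # boost diffuse influence above 1.0
key.specular_factor = 0.8

# A second spot light that only contributes to the volumetrics:
volume_spot = bpy.data.lights["drone_spot_volume"]
volume_spot.diffuse_factor = 0.0
volume_spot.specular_factor = 0.0
volume_spot.volume_factor = 1.0
```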
The textures were created so that they can be re-used in various ways: either by making a copy of an existing material and assigning different colors using a ColorRamp node, or by using vertex blending or textures to drive unique changes.
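A minimal sketch of that re-use pattern through the Python API, with hypothetical material names and colors:

```python
import bpy

# Sketch: re-use a material by duplicating it and re-tinting the colors
# of its ColorRamp node.
base = bpy.data.materials["metal_base"]
variant = base.copy()
variant.name = "metal_rusty"

# Find the ColorRamp node by type and assign new colors.
ramp = next(n for n in variant.node_tree.nodes if n.type == 'VALTORGB')
ramp.color_ramp.elements[0].color = (0.25, 0.10, 0.05, 1.0)
ramp.color_ramp.elements[1].color = (0.60, 0.35, 0.20, 1.0)
```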
Using the Asset Browser - for the first time during a project - to drag and assign materials was a good way to iterate and experiment. I recommend using the Asset Browser to keep an overview of which materials you still need to create.
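For example, materials can be marked as assets in bulk with a snippet like this (a sketch; in practice you may only want to mark a curated subset):

```python
import bpy

# Sketch: mark the file's materials as assets so they show up in the
# Asset Browser, ready to be dragged onto objects.
for mat in bpy.data.materials:
    if not mat.asset_data:            # skip materials already marked
        mat.asset_mark()
        mat.asset_generate_preview()
```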
My general suggestion is to build your scene like you would for an external game engine, which means optimizing your models and baking to image textures when rendering animations in Eevee.
Geometry
Shading
Lighting
Rendering
Compositing
Creating the Snow Parkour in Eevee was incredibly fun, but also required a more thoughtful approach: baking Irradiance Volumes and Cubemaps, deciding which lights use contact shadows, avoiding bump or displacement nodes, and working around the limitations that come with screen-space effects. On the other hand, Eevee gives a lot of creative freedom when using lights, offers great volumetric effects, and renders with little or no noise.
As mentioned in the ‘Recommendations’, the best way to approach rendering in Eevee is to build your scene as efficiently as possible, as if you were exporting models to a game engine. This includes optimizing your models and baking all of the channels to texture files, and, when possible, avoiding Subdivision Surface modifiers, which increase object evaluation time.
There is an interesting balance between flexibility and performance when it comes to the shader nodes. More flexibility, such as using vertex paint to blend textures, costs additional shader graph processing time in Eevee, while the more performance-friendly baked textures may use more VRAM but are faster to evaluate in the shader graph. Which way to go comes down to the art direction and goals of the project.
If you want to learn more about Eevee, check out the documentation.
Thanks to Clément Foucault for the more in-depth explanations.