
Shading and Rendering of 'Wing It!'

An in-depth look at how we achieved the 2D cartoon appeal with a 3D pipeline in Blender
Article · 5 Oct 2023 · 14 min read · 4 min watch time
Simon Thommes, Shading and FX

Introduction

With every open movie project that we start, we aim to give it its own unique identity and to explore art styles that are both appealing and a challenge, to us as artists and to Blender's capabilities as the tool we use to realize the project.
In the case of our previous two films, the visual styles could hardly be more different. 'Charge', with its heightened realism and game-cinematic aesthetic, stands in stark contrast to the 2D charm of 'Wing It!', inspired by hand-crafted Saturday morning cartoons.

Of course, while 3D animation is the common choice when it comes to CG realism, 2D animation is traditionally made by drawing every frame for a given view angle. Trying to emulate some of the appeal you gain from that in a 3D production is its own can of worms.
There are some viable ways of going the hybrid route and creating essentially two-dimensional rigs. But we decided to push the solutions we had come up with for previous productions to their limits and go the route of the full 3D pipeline, with extras.

Clearly, there is more to it than just slapping cartoony shaders onto any 3D mesh and animation to get the sweet, sweet 2D look. It all has to come together: models that capture a strong artistic vision of the concepts and support the technical implementation, and flexible rigs that give a lot of control and allow the animators to go wild, deforming the characters into extreme cartoony poses. And, of course, everything can still fall apart in motion, where too much shift in perspective or not enough exaggeration can destroy the carefully crafted illusion.

In this article I want to untangle how we implemented the shading and rendering part of achieving that 2D appeal and give you a deep dive into how we made it work for us. To gain some insight into the rigging and animation side of things, head over to our production lessons on those topics in the content gallery.

Video breakdown of the main visual elements that make up the rendering of an asset in 'Wing It!'

Overall Breakdown

A lot of elements and layers in the shading and rendering contribute to the look of 'Wing It!'.
Let me give you an overview of the techniques we used and the decisions that went into them.

Render Engine

Quite early into the process of finding the look for the film, we decided that we would try to render it with Eevee. In the past we had good experiences using Eevee for non-photorealistic rendering styles. Eevee would allow us, more easily than Cycles, to cheat around the fact that everything is 3D geometry by manipulating the way that light interacts with the surface. And its fast render times, especially for previews, allow for quick turnaround during production.

One of the key things we did to make use of Eevee is something we had previously done on 'Coffee Run': just use a subsurface scattering shader on everything. Yes, everything! That is a great and simple technique to wash out the light information over the render and get rid of some of the dimensional detail in the surfaces that is necessary in the silhouettes, but can otherwise make the model look too three-dimensional.

Difference between Diffuse and SSS shading with Eevee on the dog character of 'Wing It!'
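As a minimal sketch of that idea (assuming Eevee and Python scripting via bpy; the color and scale values below are placeholders, not the production settings), every material can simply get a Subsurface Scattering BSDF as its surface shader:

```python
import bpy

# Assign a Subsurface Scattering BSDF as the surface shader of every material.
for mat in bpy.data.materials:
    if not mat.use_nodes:
        mat.use_nodes = True
    nodes = mat.node_tree.nodes
    links = mat.node_tree.links

    output = nodes.get("Material Output") or nodes.new("ShaderNodeOutputMaterial")
    sss = nodes.new("ShaderNodeSubsurfaceScattering")
    sss.inputs["Color"].default_value = (0.8, 0.6, 0.5, 1.0)  # placeholder base color
    sss.inputs["Scale"].default_value = 1.0                   # cranked up to wash out detail
    links.new(sss.outputs["BSSRDF"], output.inputs["Surface"])
```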

At this point we're already cutting ties with being able to render this film in Cycles, as subsurface scattering has a drastically different effect there when cranked to the maximum.

Another aspect that makes Eevee attractive for NPR rendering is the very explicit control it gives us over the normals that the shader uses for rendering. In Eevee, the normals dictate much more of the light interaction of the surface than in Cycles, which allows for more fakery.

I just briefly want to point out that we decided against using the popular 'Shader to RGB' workflow for multiple reasons, like the inability to use it together with SSS and its heavy reliance on the actual 3D shape. Instead, we opted for entirely custom solutions to emulate 2D-looking shading on the characters.
I will talk more about this in another article.

Limitation of Shadows

Shadows have the property that they can easily give away the spatial relationship between objects, plus they also add a level of overall noise to the image, especially in motion. So something we decided to do was to largely get rid of shadows and reserve them only for important elements with simple shapes (like the overall hull of the rocket). For elements that needed a shadow to give a base level of visual fidelity and support the lighting, we simplified the shape of the shadow and faded it out over distance with a gradient. The way we did this was to remove the shadows from the actual visible geometry and, instead, use a simplified proxy geometry that casts a transparent shadow, fading out the opacity from the ground up.

The use of proxy meshes for shadow simplification.
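A rough illustration of that setup in Python follows; the object names, the fade range and the idea of driving the fade from object-space height are assumptions for the sketch, not taken from the production files:

```python
import bpy

visible_obj = bpy.data.objects["Prop"]        # the rendered geometry (assumed name)
proxy_obj = bpy.data.objects["Prop_shadow"]   # simplified shadow caster (assumed name)

# 1. Stop the visible geometry from casting shadows
#    (Eevee 'Shadow Mode' material setting, available up to Blender 4.1).
for slot in visible_obj.material_slots:
    if slot.material:
        slot.material.shadow_method = 'NONE'

# 2. Give the proxy a material whose alpha fades out with height,
#    so its hashed shadow gets lighter the further it is from the ground.
mat = bpy.data.materials.new("ShadowProxy")
mat.use_nodes = True
mat.blend_method = 'HASHED'
mat.shadow_method = 'HASHED'
nodes, links = mat.node_tree.nodes, mat.node_tree.links

coord = nodes.new("ShaderNodeTexCoord")
sep = nodes.new("ShaderNodeSeparateXYZ")
fade = nodes.new("ShaderNodeMapRange")   # alpha 1 at the object origin, 0 two units up
fade.inputs["From Min"].default_value = 0.0
fade.inputs["From Max"].default_value = 2.0
fade.inputs["To Min"].default_value = 1.0
fade.inputs["To Max"].default_value = 0.0

principled = nodes["Principled BSDF"]
links.new(coord.outputs["Object"], sep.inputs["Vector"])
links.new(sep.outputs["Z"], fade.inputs["Value"])
links.new(fade.outputs["Result"], principled.inputs["Alpha"])

proxy_obj.data.materials.append(mat)
# Keeping the proxy itself invisible to the camera is left out of this sketch.
```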

Surface Colors

To keep the overall look flat (and the workload small), the base colors of props and environments are usually uniform colors with some slight breakup from a pattern texture. This pattern varies mainly the chroma and not so much the value.

Setup for generic subtle texture variation pattern.

On top of these uniform colors we had a simple node group that would darken or brighten the color of a face based on an attribute that we could easily set in the model. That way we could emulate some generic lighting as part of the props/environments themselves, to separate areas that sit at an angle to each other.

Using a face attribute on environment assets to control shifting the value of the color for shading.
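A small sketch of how such an attribute-driven shift could be wired up with Python; the material name 'PropMaterial', the attribute name 'shade_value' and the base color are made up for illustration:

```python
import bpy

mat = bpy.data.materials["PropMaterial"]          # assumed node-based material
nodes, links = mat.node_tree.nodes, mat.node_tree.links

attr = nodes.new("ShaderNodeAttribute")
attr.attribute_name = "shade_value"               # per-face attribute set in the model

hsv = nodes.new("ShaderNodeHueSaturation")
hsv.inputs["Color"].default_value = (0.35, 0.5, 0.8, 1.0)  # uniform base color

principled = nodes["Principled BSDF"]
links.new(attr.outputs["Fac"], hsv.inputs["Value"])   # 1 = unchanged, <1 darker, >1 brighter
links.new(hsv.outputs["Color"], principled.inputs["Base Color"])
```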

Normal Faceting

One technique that turned out to be really valuable throughout the course of the production is faceting of the normal vectors. Restricting the direction of the normals used for shading to discrete average directions for certain areas of the surface leads to flat patches of color and breaks up gradients created by round surfaces.

Example of normal faceting to break up gradients into flat patches of color.

To achieve this we had two different methods that we used case by case, depending on the shape of the geometry. One simple way is to do this as part of the shader, by replacing the actual mesh normals with ones that have been snapped to discrete vectors using a Voronoi texture.
This method is great in that it is fully automatic and gives great results in a lot of cases.

Procedural shader-based normal faceting used to break up gradients.
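One common way to wire up this snapping, shown here as a hedged Python sketch (the material name and scale are assumptions): feed the mesh normal into a Voronoi texture and use the returned cell position, normalized, as the new shading normal.

```python
import bpy

mat = bpy.data.materials["FacetedMaterial"]   # assumed node-based material
nodes, links = mat.node_tree.nodes, mat.node_tree.links

geo = nodes.new("ShaderNodeNewGeometry")      # provides the true mesh normal
voronoi = nodes.new("ShaderNodeTexVoronoi")
voronoi.voronoi_dimensions = '3D'
voronoi.inputs["Scale"].default_value = 6.0   # higher scale = smaller facets

snap = nodes.new("ShaderNodeVectorMath")
snap.operation = 'NORMALIZE'

principled = nodes["Principled BSDF"]
links.new(geo.outputs["Normal"], voronoi.inputs["Vector"])
links.new(voronoi.outputs["Position"], snap.inputs["Vector"])  # nearest cell = snapped direction
links.new(snap.outputs["Vector"], principled.inputs["Normal"])
```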

Whenever we needed more manual control over how these faceted areas are separated and merged, we had a Geometry Nodes setup to calculate the faceted normals based either on automatically generated boundaries using the edge angle (similar to auto smoothing) or on manually set boundary edges using an attribute.

Initial demonstration of setting the facet boundary edges in combination with Geometry Nodes.
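For reference, here is a rough Python/bmesh analogue of the edge-angle variant, not the Geometry Nodes setup itself: faces are flood-filled into patches separated by sharp edges, and the averaged normal of each patch is written into a face attribute that a shader could read. The attribute name, object name and angle threshold are assumptions.

```python
import bpy
import bmesh
from mathutils import Vector

def facet_normals(obj, angle_limit=0.785):  # ~45 degrees, an assumed threshold
    me = obj.data
    bm = bmesh.new()
    bm.from_mesh(me)
    bm.faces.ensure_lookup_table()

    def is_boundary(edge):
        # Edges with anything other than two faces, or with a steep angle,
        # separate the facet patches (similar to auto smoothing).
        if len(edge.link_faces) != 2:
            return True
        return edge.calc_face_angle() > angle_limit

    # Flood-fill faces into patches delimited by boundary edges.
    visited = set()
    patches = []
    for start in bm.faces:
        if start.index in visited:
            continue
        stack, patch = [start], []
        visited.add(start.index)
        while stack:
            face = stack.pop()
            patch.append(face)
            for edge in face.edges:
                if is_boundary(edge):
                    continue
                for other in edge.link_faces:
                    if other.index not in visited:
                        visited.add(other.index)
                        stack.append(other)
        patches.append(patch)

    # Average the normal per patch and store it in a face attribute
    # that the shader can read via an Attribute node.
    normals = [Vector((0.0, 0.0, 0.0))] * len(bm.faces)
    for patch in patches:
        avg = sum((f.normal for f in patch), Vector())
        avg.normalize()
        for f in patch:
            normals[f.index] = avg
    bm.free()

    attr = me.attributes.get("facet_normal") or me.attributes.new(
        "facet_normal", 'FLOAT_VECTOR', 'FACE')
    for i, n in enumerate(normals):
        attr.data[i].vector = n

# Example: facet_normals(bpy.data.objects["Rocket"])   # assumed object name
```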

Outlines

It became clear at some point that our visual style would really benefit from some kind of outlines, for both clarity and the 2D appeal. There are some established ways to do outlines in Blender, for example shell meshes with inverted normals and backface culling, or the LineArt modifier for Grease Pencil, but they come with their own issues.
We would have loved to be able to use Grease Pencil for the outlines, especially since that would mean we could easily integrate it into our workflow alongside other elements drawn with Grease Pencil. But since Grease Pencil data is not compatible with Geometry Nodes (yet), we knew that if we committed to this we wouldn't be very flexible in adjusting the behavior and look of LineArt.

The main things that we wanted to achieve, but that LineArt couldn't give us, were the ability to draw the outline inside of the silhouette, rather than on top of it, and to control the color based on the surface that spawns the line. So I decided to try implementing functionality similar to what the LineArt modifier does in Geometry Nodes, and it worked surprisingly well from the start.

Early turntable test of the custom lineart nodes modifier using the character model of the dog.
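The core test behind such view-dependent outlines can be sketched in a few lines of Python; this is a conceptual stand-in, not the production Geometry Nodes implementation. An edge lies on the silhouette when one of its two faces points towards the camera and the other points away:

```python
import bpy
import bmesh

def silhouette_edges(obj, camera):
    """Return (vertex index, vertex index) pairs for silhouette edges."""
    cam_pos = camera.matrix_world.translation
    mw = obj.matrix_world
    bm = bmesh.new()
    bm.from_mesh(obj.data)

    result = []
    for edge in bm.edges:
        faces = edge.link_faces
        if len(faces) != 2:
            # Open boundaries and non-manifold edges count as outlines too.
            result.append((edge.verts[0].index, edge.verts[1].index))
            continue
        # Facing test for both faces, done in world space.
        dots = []
        for f in faces:
            center = mw @ f.calc_center_median()
            normal = (mw.to_3x3() @ f.normal).normalized()
            dots.append(normal.dot(cam_pos - center))
        if dots[0] * dots[1] < 0.0:   # opposite signs: silhouette edge
            result.append((edge.verts[0].index, edge.verts[1].index))
    bm.free()
    return result

# Example: edges = silhouette_edges(bpy.data.objects["Dog"], bpy.context.scene.camera)
```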

The way that we used this technique was different for the environment and the characters, as there were slightly different technical and visual requirements.
I will be going into more detail on the character side of things and related topics in a separate dedicated blog post.

For the environment, the outlines were generated and rendered as strips of mesh geometry on top of the actual meshes based on the camera view. Subscribers can download an example file with the setup here.

Example setup of the mesh outlines generated for the camera using Geometry Nodes.

The fact that we're using Geometry Nodes to generate the outlines means that we can use attributes to store information on each object individually to influence how the outlines are generated. That makes it easy to pick the color of an outline for each object of an asset as part of the asset creation.
For that reason we had a 'Line Properties' modifier as part of our asset library, which would simply write to the mesh a few named attributes that the outline generation uses, and make the settings nicely available in the modifier UI.
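Conceptually, such a modifier boils down to writing a few named attributes onto the mesh. A hedged stand-in in Python, with made-up attribute, object and parameter names:

```python
import bpy

def set_line_properties(obj, color=(0.1, 0.05, 0.02, 1.0), thickness=0.01):
    """Write per-object line settings as named attributes for a downstream outline setup."""
    me = obj.data
    col = me.attributes.get("line_color") or me.attributes.new(
        "line_color", 'FLOAT_COLOR', 'POINT')
    thick = me.attributes.get("line_thickness") or me.attributes.new(
        "line_thickness", 'FLOAT', 'POINT')
    for item in col.data:
        item.color = color
    for item in thick.data:
        item.value = thickness

set_line_properties(bpy.data.objects["RocketHull"])   # assumed object name
```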

Splitting the outlines into their own render layer also allowed us to light them separately, which is useful to create stylized rim lights.

Surface Lines

On top of the outlines generated automatically from the camera angle, we also made a setup that allowed us to freely draw lines in the same style as the outlines on top of surfaces in the shot file. This was great to add some additional polish for close-up shots and make everything look a bit more expensive, without much extra work.
The lines adjust to the view angle so they render in the same style as the environment outlines. This makes sure that they look like they are drawn from the view perspective, rather than lying on the surface like a texture, which would result in distortion when you look at the surface at a steep angle.

Setup to easily draw lines on surfaces

We used this technique both on the shot file level, as part of the lighting process, and on some of the assets directly, so the lines would show up in all shots. For example, the seams on the improvised wings and the screws on the rocket hull were done this way.
The Geometry Nodes setup that generates the line geometry from drawn curves also allows deforming the lines with the surface mesh; they just need to be drawn in the rest pose.

Drawing lines on the rest pose of the character resulting in the lines deforming with the mesh like a texture.

Paper Texture

To marry everything together and nudge the image a bit further away from that clean CG look, in a way that is more reminiscent of 2D animation, we added a subtle paper texture overlay on top of everything. Unfortunately YouTube compression eats up most of this effect, but it does add to the look.

Comparison of frame without (top) and with (bottom) paper texture overlay on everything.

The reason we didn't do this as a compositing effect at the very end, but as part of the shader instead, is that this way we could more easily anchor the texture to moving objects, or to the environment in the case of a moving camera.
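As a small sketch of the anchoring idea (file path, material name and mix factor are placeholders), the paper image can be overlaid in the shader using object-space coordinates so it sticks to a moving prop instead of swimming across the screen:

```python
import bpy

mat = bpy.data.materials["PropMaterial"]     # assumed node-based material
nodes, links = mat.node_tree.nodes, mat.node_tree.links

coord = nodes.new("ShaderNodeTexCoord")
paper = nodes.new("ShaderNodeTexImage")
paper.image = bpy.data.images.load("/path/to/paper_texture.png")  # placeholder path

mix = nodes.new("ShaderNodeMixRGB")
mix.blend_type = 'OVERLAY'
mix.inputs["Fac"].default_value = 0.15        # keep the effect subtle
mix.inputs["Color1"].default_value = (0.8, 0.8, 0.8, 1.0)  # base color stand-in

principled = nodes["Principled BSDF"]
links.new(coord.outputs["Object"], paper.inputs["Vector"])  # anchor to the object
links.new(paper.outputs["Color"], mix.inputs["Color2"])
links.new(mix.outputs["Color"], principled.inputs["Base Color"])
```

For the environment under a moving camera, the same texture would instead be mapped in world space.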

Compositing

Besides the shot-by-shot compositing to adjust the composition of the shot, guide the eye and beautify the frames, we had a node group applied to everything to help things look just a bit more painterly.

Demonstration of the simple painterly compositing filter applied on all channels individually and then on the diffuse light pass only.

We mainly did it this way because there wasn't an implementation of the Kuwahara filter yet, which there now is in Blender 4.0. It helps a lot to simplify high-frequency detail and round off sharp corners in shapes, which makes the image look just a bit more hand-made.
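For anyone on Blender 4.0 or newer, a minimal sketch of dropping the new Kuwahara node into a scene's compositor could look like this; the 'ANISOTROPIC' variation is just one of the available options:

```python
import bpy

scene = bpy.context.scene
scene.use_nodes = True
tree = scene.node_tree
nodes, links = tree.nodes, tree.links

render = nodes.new("CompositorNodeRLayers")
kuwahara = nodes.new("CompositorNodeKuwahara")
kuwahara.variation = 'ANISOTROPIC'            # or 'CLASSIC'
composite = nodes.new("CompositorNodeComposite")

links.new(render.outputs["Image"], kuwahara.inputs["Image"])
links.new(kuwahara.outputs["Image"], composite.inputs["Image"])
```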

In the end we didn't rely on this effect that much, as the outlines ended up being much more prevalent than we initially planned. So the frames stand mainly on their own, without too much heavy lifting in the compositing. That is generally very nice for the lighting process, since what you see is what you get in the viewport camera. This will, of course, become much less important as the viewport compositor gets more and more features.

Character Shading

The topic of shading and rendering of 'Wing It!' is not complete without talking specifically about how we did the character shading. Since this topic seems like a great opportunity to go into a little more technical detail of how it works and this article is already very long, I'd like to do exactly that in a follow-up article.

But let me give you an overall breakdown of the different elements that we had for the character shading:

  • On top of the SSS shader, to reduce the dimensional detail, we blurred the normals over the mesh using Geometry Nodes (see the sketch after this list).
  • The outlines were done with a custom Geometry Nodes solution, as I mentioned before. But instead of generating the outlines as geometry, as we did for environment elements, they were part of the shader.
  • Using a similar technique we had fake rim lights and toon shading directly as part of the shader. This way we achieved sufficiently two-dimensional looking effects and a lot of control over the look of the characters to match different lighting scenarios throughout the film.
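As a rough Python analogue of the normal blurring mentioned in the first point (the production setup used Geometry Nodes; the attribute name, object name and iteration count here are assumptions), the vertex normals can be repeatedly averaged with their connected neighbours and stored in an attribute for the shader to read:

```python
import bpy
import bmesh

def blur_normals(obj, iterations=10):
    """Smooth vertex normals across the mesh and store them in a custom attribute."""
    me = obj.data
    bm = bmesh.new()
    bm.from_mesh(me)
    bm.verts.ensure_lookup_table()

    normals = [v.normal.copy() for v in bm.verts]
    for _ in range(iterations):
        smoothed = []
        for v in bm.verts:
            acc = normals[v.index].copy()
            neighbours = [e.other_vert(v) for e in v.link_edges]
            for n in neighbours:
                acc += normals[n.index]
            smoothed.append(acc / (len(neighbours) + 1))
        normals = smoothed
    bm.free()

    attr = me.attributes.get("blurred_normal") or me.attributes.new(
        "blurred_normal", 'FLOAT_VECTOR', 'POINT')
    for i, n in enumerate(normals):
        attr.data[i].vector = n.normalized()

# Example: blur_normals(bpy.data.objects["Dog"])   # assumed object name
```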

Too Many Topics

There is a list of other things I could have gone into detail on here,

  • like how we managed to pipe all the different properties that make up the shader settings, and that are highly relevant for the lighting of a shot, into each shader, while making them nicely accessible to the lighting artist in the shot file
  • how we stylized reflections by projecting a flat texture from the view direction and locking it to the center of the mesh island that we get via Geometry Nodes
  • or how we used the same outline technique from the characters to create a fuzzy silhouette on the hay stacks

but there are just so many things that come up during a production and of course this is only the shading and rendering part of it...

To get an idea of the whole process, of how this short film came to be and the iterations we went through, check out our weekly production logs or watch the video series that Haru has been filming alongside the production, giving a look behind the scenes at our daily work on the film.

Conclusion

All in all, with the tech that we built throughout the production we were able to successfully pull off the look we were aiming for, for most of the runtime of the film. That said, there are definitely some unfortunate, obvious hiccups with the outline/silhouette tech that needed to be fixed frame by frame for the poses in a handful of shots.

Putting things into perspective, the project was a really interesting experiment in the realm of NPR rendering for us. Now we are moving on to our next project, 'Gold', which aims to strike a halfway point: highly stylized, painterly visuals, but also quite cinematic, realistic lighting with refraction and bounces and everything, in Cycles. It will be exciting to see how some of the techniques that we used in Eevee for 'Wing It!' can be translated and implemented in Cycles.

