
Stylized Rendering in Unreal

by mrtapa on 1 Jun 2020 for Rookie Awards 2020

How a game can be completely transformed with heavy post-processing, while maintaining gameplay clarity


One of the teams at university was tasked with developing a game with a painterly art style, and they put out a request for a shader to help them achieve it.

In this article, I will cover the stylized rendering tool I created for them. It consists of several post-process shaders, offers per-object control, and, most importantly, prioritizes sharp and clear gameplay without screen-space distortion.

There are 4 distinct shaders at play here, each contributing one after another (Unreal's post-process priority is used to order them) to create the final image.

We take the pre-tonemapped scene Unreal gives us, reduce its color information, blur it, apply a paint filter, and then figure out where the shadows are so we can give the image that hard-shadowed look with flat colors.

Posterization

This shader is one of the most common ones among stylized environments.

We prep the scene with this pass, as some of the later shaders benefit from the information it generates. Posterization usually introduces bands with wrong color information in them; in the screenshot above it does not, because a simple lerp in the shader dictates how much of that wrong color information is added to the scene.
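As a rough illustration, here is what that posterize-plus-lerp looks like as standalone HLSL. This is a sketch of the math that would sit in a post-process material, not the project's actual node; Steps and BandingAmount are assumed parameter names.

```hlsl
// Minimal posterization sketch. SceneColor would come from a SceneTexture input
// in an Unreal post-process material; Steps and BandingAmount are assumed parameters.
float3 Posterize(float3 SceneColor, float Steps, float BandingAmount)
{
    // Quantize each channel into a fixed number of bands.
    float3 Banded = floor(SceneColor * Steps) / Steps;

    // Blend between the original and the banded color, so the artist decides
    // how much "wrong" color information the bands are allowed to introduce.
    return lerp(SceneColor, Banded, BandingAmount);
}
```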

Kuwahara

In image processing, a Kuwahara filter is typically used to denoise an image with minimal loss of detail.

We apply the filter to our scene with a slight modification: we factor in the pixel normal direction. The blur still happens in screen space, but taking the normal into account prevents that flat, screen-space-shader look, as if an image had been slapped in front of the camera and couldn't be removed.
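For reference, a basic 4-region Kuwahara filter looks roughly like the HLSL below. This is a generic sketch rather than the project's shader: the scene color is passed in as a texture, and the normal-based modification described above is left out (it would bias or rotate which region wins).

```hlsl
// Basic 5x5 Kuwahara sketch: four overlapping 3x3 quadrants around the pixel,
// keep the mean of the quadrant with the lowest variance.
float3 Kuwahara(Texture2D SceneTex, SamplerState SceneSampler, float2 UV, float2 TexelSize)
{
    float3 BestMean = 0;
    float  BestVariance = 1e9;

    // Diagonal directions of the four quadrants.
    const float2 Quadrants[4] = { float2(-1, -1), float2(1, -1), float2(-1, 1), float2(1, 1) };

    [unroll]
    for (int q = 0; q < 4; q++)
    {
        float3 Sum = 0;
        float3 SumSq = 0;

        for (int y = 0; y <= 2; y++)
        {
            for (int x = 0; x <= 2; x++)
            {
                float2 Offset = float2(x, y) * Quadrants[q] * TexelSize;
                float3 C = SceneTex.SampleLevel(SceneSampler, UV + Offset, 0).rgb;
                Sum   += C;
                SumSq += C * C;
            }
        }

        float3 Mean     = Sum / 9.0;
        float  Variance = dot(SumSq / 9.0 - Mean * Mean, float3(1, 1, 1));

        // Smooth regions win, so edges are preserved while noise is averaged away.
        if (Variance < BestVariance)
        {
            BestVariance = Variance;
            BestMean = Mean;
        }
    }
    return BestMean;
}
```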

The paint effect

This was the most challenging effect to nail down, but the final solution is quite elegant. How does one generate brush strokes? Displacing objects relies on a high vertex count; a masked secondary mesh would still need to be displaced and would double the existing polycount. Even generating the strokes procedurally in a DCC like Houdini would be a significant addition to the pipeline.

So the answer was, once again, post process. We project a normal map in world space, so the normals don't change once the camera is in motion, and then use that information to distort the viewport UVs. This gets us 90% of the way there, but the technique has one inherent flaw, whose solution I will cover a bit later.
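A minimal sketch of that idea in standalone HLSL, assuming a tiling normal texture and a simple top-down world projection (the actual shader likely projects more carefully), might look like this:

```hlsl
// Paint-distortion sketch. PaintNormalTex, Tiling and Strength are assumed inputs;
// the returned UV would then be used to sample the scene color.
float2 PaintDistortUV(Texture2D PaintNormalTex, SamplerState PaintSampler,
                      float3 WorldPos, float2 ScreenUV, float Tiling, float Strength)
{
    // Project the normal map using world position, so the pattern is anchored
    // to the world rather than to the screen.
    float2 ProjUV = WorldPos.xy * Tiling;

    // Unpack the tangent-space normal from [0,1] to [-1,1].
    float3 PaintNormal = PaintNormalTex.SampleLevel(PaintSampler, ProjUV, 0).xyz * 2.0 - 1.0;

    // Nudge the viewport UV along the normal's XY to get the brush-stroke wobble.
    return ScreenUV + PaintNormal.xy * Strength;
}
```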

Flat Lighting

Now we have modified the scene towards a stylized direction, but we have not altered the lighting yet. The last shader in the stack is only about lighting. It compares the raw scene albedo with the final image to determine where the shadows are and generates a mask. This lets us give the artists control over what gets applied where, so shadows and lit areas can each have a separate tint. The final touch is applying the albedo back onto the final image, resulting in a flatter and more stylized overall look.
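A hedged sketch of that comparison in standalone HLSL: BaseColor stands in for the GBuffer albedo, LitColor for the current post-process input, and the tint, threshold and blend parameters are assumed names rather than the project's actual ones.

```hlsl
// Flat-lighting sketch: derive a shadow mask from how much darker the lit image
// is than the raw albedo, tint shadowed and lit areas separately, then blend the
// albedo back in for the flat look.
float3 FlatLighting(float3 BaseColor, float3 LitColor,
                    float3 ShadowTint, float3 LitTint,
                    float Threshold, float AlbedoBlend)
{
    // Roughly how much light the pixel received: lit luminance vs. albedo luminance.
    float LumAlbedo = max(dot(BaseColor, float3(0.3, 0.59, 0.11)), 1e-4);
    float LumLit    = dot(LitColor, float3(0.3, 0.59, 0.11));
    float ShadowMask = (LumLit / LumAlbedo < Threshold) ? 1.0 : 0.0;

    // Separate tints for shadowed and lit areas.
    float3 Tint = lerp(LitTint, ShadowTint, ShadowMask);

    // Tint the lit image, then blend the raw albedo back for the flat, stylized result.
    return lerp(LitColor * Tint, BaseColor * Tint, AlbedoBlend);
}
```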

The tools

Iteration speed is always of the utmost importance, and while juggling 4 shaders through material instances is workable, it limits the amount of control and ease of use we can offer the artists.

Stylization Volume

When you also factor in multiple levels and the need for manual control over the look of the stylization at runtime (e.g. a day and night cycle), a tool that controls every aspect of the stylized rendering becomes the obvious solution.

And so, a custom blueprint actor was built with a single post-process volume component. It takes care of assigning the materials to the post-process stack and exposes controls for each shader. There is no need to keep track of material instances for each level; we use this actor the same way we would use a normal post-process volume, with the added benefit that everything is exposed and available for other blueprints to control at runtime.

Per Object Control

Having manual control over individual objects is important, especially for gameplay. Unreal already gives us the tools to support this: Custom Depth and Custom Stencil.

We can build a few material functions, which are then used in our global stylization volume, effectively letting us control how each effect behaves: which objects in the scene should receive a particular effect or whether it should be global, whether it should start at a certain distance from the camera and/or end at another, how hard that fade edge should be, and so on.
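As an illustration of how such a mask could be assembled, here is a standalone HLSL sketch. It assumes SceneDepth, CustomDepth and CustomStencil inputs (which would come from SceneTexture nodes) and made-up fade parameters, and is not the project's exact material function.

```hlsl
// Per-object / distance mask sketch. The returned value would multiply a
// shader's strength. FadeStart, FadeEnd and FadeHardness are assumed parameters.
float EffectMask(float SceneDepth, float CustomDepth, float CustomStencil,
                 float FadeStart, float FadeEnd, float FadeHardness, bool bGlobal)
{
    // Object mask: the pixel belongs to a tagged object if its custom depth
    // matches the scene depth (i.e. the tagged surface is what we actually see).
    float ObjectMask = (CustomStencil > 0 && CustomDepth <= SceneDepth + 1.0) ? 1.0 : 0.0;

    // Distance fade from the camera, with a controllable edge hardness.
    float Fade = saturate((FadeEnd - SceneDepth) / max(FadeEnd - FadeStart, 1.0));
    Fade = pow(Fade, FadeHardness);

    return (bGlobal ? 1.0 : ObjectMask) * Fade;
}
```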

The Custom Stencil buffer in Unreal holds values from 0 to 255, but we can remap that range to achieve finer control. With a simple material function, we can use the stencil as a multiplier on any parameter we want in our shaders.

For example, the paint effect uses the stencil value to multiply both the angle and the strength of the normal, and our stylization actor defines the remap range as 10. This means that an object with a stencil value of 255 multiplies the global post-process values by 10.
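That remap is just a small piece of math; a hedged HLSL sketch of it, using the range of 10 from the example above (RemapStencil and RemapRange are made-up names), could look like this:

```hlsl
// Stencil-remap sketch: stencil 255 -> multiplier of RemapRange, stencil 0 -> leave
// the global post-process value untouched.
float RemapStencil(float CustomStencil, float RemapRange)
{
    float Multiplier = (CustomStencil / 255.0) * RemapRange;

    // Untagged pixels (stencil 0) keep the global value.
    return (CustomStencil > 0) ? Multiplier : 1.0;
}

// Usage: PaintStrength *= RemapStencil(CustomStencil, 10.0);
```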

Object Control Tool

Every primitive component in Unreal has Custom Depth and Custom Stencil controls in its details panel, but when you have a lot of objects in your scene, and especially when they are organized in blueprints rather than simply placed around the level, tweaking and controlling these values becomes extremely tedious.

With Editor Utility Widgets we can build a little tool that does this for us for any objects the artist has selected, regardless of whether they are placed in the level or are components inside a blueprint.

Earlier I mentioned that the paint effect shader has one significant drawback: objects in motion. Because it projects normals in world space, as soon as an object's position changes, so does the effect applied to it. We can isolate a specific Custom Stencil value and disable the paint post-process effect entirely for that mesh; I assigned 1 as that value. One problem solved, but now any movable object loses the painterly look.

The Object Control Tool also solves this: we essentially convert the paint shader into a function that is added to the UVs of the master materials, and then it is just a matter of exposing the parameters through the Utility Widget.
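To make the idea concrete, here is one plausible shape of that in-material function, sketched in standalone HLSL with assumed inputs. Anchoring the projection to object-local position so the pattern travels with the mesh is my reading of the fix, not the exact implementation.

```hlsl
// In-material paint-distortion sketch: the same normal-map distortion, but anchored
// to the object's local position so the pattern follows the mesh instead of swimming
// through it when it moves. LocalPos would be WorldPosition - ObjectPosition.
float2 PaintDistortMeshUV(Texture2D PaintNormalTex, SamplerState PaintSampler,
                          float3 LocalPos, float2 MeshUV, float Tiling, float Strength)
{
    float2 ProjUV = LocalPos.xy * Tiling;
    float3 PaintNormal = PaintNormalTex.SampleLevel(PaintSampler, ProjUV, 0).xyz * 2.0 - 1.0;

    // Offset the mesh UVs that feed the master material's texture samples.
    return MeshUV + PaintNormal.xy * Strength;
}
```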

Another benefit of an editor tool like this is asset management and folder structure. Artists no longer need to create material instances to tweak the properties of objects, because a dynamic material instance is created for every object the tool edits and is saved alongside the object in whatever sub-level it lives in.

Performance

One of the benefits of using post-process shaders is that they have a fixed cost. At 1440p, on an RTX 2070, the 4 shaders combined cost 1.25 ms.

Unreal's post-processing stack works alongside temporal upsampling, and because two of our shaders are more or less blur filters, we can run the game at a much lower native resolution without losing any significant detail or clarity. In this comparison, you can see the difference between 50%, 100% and 200% resolution scaling.

The end result

Alright, enough about how it works, here are some On/Off screenshots, and a bit of gameplay, because in the end, that is what we care about :)

Wrapping Up

I developed these shaders and tools specifically for this project and its requirements, but the end result is a fairly general and flexible stylization system, so it is highly likely that in the coming weeks I will release the source as a standalone package on GitHub. If that happens, I will probably post about it on my Twitter.

If you want to check out the Rookies entry for the actual game, you can do so here.

