Texture Generation Part 2: Normal Swizzling

Last time we learned how to generate new image files from shader output. This tutorial is a direct continuation of that one, so go back and do that one if you haven't already!

We just looked at generating an unlit texture by mixing two textures together, but what if we have a shader that mixes textures on scene objects with a physically-based lighting model?  What if we want to preview the final result before we generate the images? We can do that! Let's get started!

Setup

Open the same scene from last time or open a new scene. Put some 3D models into your scene, either by right clicking in the hierarchy and adding basic shapes or by dragging in models from your assets.

It doesn't matter how you arrange your scene, we just want some 3D stuff that we can light up. Here's what I've got:

I'm also using the skybox we made in an earlier lesson but you can use whatever skybox you want.

Making a texture generation library

We want to generate textures that are unaffected by light and shadow, so we put our texture combining code into an unlit shader. But we also want to preview what the result will look like on a lit object. We'll do that by making a separate shader we can slap on our scene objects. We can copy our texture combining code into a surface shader and be up and running right away!

In your Shaders folder in the Project tab in Unity, make a new folder called TexGen or something along those lines. Move the TexGen.shader we made last time into this new folder. After Unity finishes thinking about that, right click inside the folder and select Create > Shader > Standard Surface Shader.

Call the new shader TexGenLit and open it in Visual Studio.

Change the first line to 

Shader "Xibanya/TexGen/TexGenLit"

Open up the TexGen shader we made last time in a different Visual Studio tab. Change its first line to 

Shader "Xibanya/TexGen/TexGenUnlit"

Then copy all its properties and paste them into the properties of TexGenLit.

Down in the subshader where the template has sampler2D _MainTex;, delete that line and instead declare a sampler2D for all the texture properties and a half for the _BlendPower property, like this:
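(Assuming the property names from last time were _Tex, _BlendTex, and _Mask:)

sampler2D _Tex;
sampler2D _BlendTex;
sampler2D _Mask;
half _BlendPower;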

Unlike in the unlit shader, we won't be declaring any float4 _TextureName_ST values. That's because in a surface shader, those are declared for us in the invisible code we don't see. Update the Input struct so that we can get the UV coordinates of all three of our textures separately, just like in our unlit shader.
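Each field must be named uv followed by the exact texture property name. A sketch, assuming the same property names as above:

struct Input
{
    float2 uv_Tex;
    float2 uv_BlendTex;
    float2 uv_Mask;
};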

You have to type in the values here exactly like this, as this is the format the surface shader is looking for behind the scenes to know whether to get the separate UV coordinates for a texture or not.

Then delete the _Glossiness, _Metallic, _Color lines as we got rid of those properties. Now we can get to copying over our texture blend code. Skip on down to the surface function and delete everything inside of it.

Then paste the lines from inside the frag function of the unlit shader into it.

But we can't use it as-is! Let's tweak this.
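Something like this (a sketch, assuming the blend from last time was a plain lerp by the mask's red channel scaled by _BlendPower):

void surf (Input IN, inout SurfaceOutputStandard o)
{
    half mask = tex2D(_Mask, IN.uv_Mask).r * _BlendPower;
    half4 col = lerp(tex2D(_Tex, IN.uv_Tex), tex2D(_BlendTex, IN.uv_BlendTex), mask);
    o.Albedo = col.rgb;
    o.Alpha = col.a;
}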

We replaced all the old UVs with the ones we get from the surface Input struct, and instead of returning the final color, we're writing it to the SurfaceOutputStandard values. Save both shader files and return to Unity.

Make a new material in your project called TexGenLit and put the TexGenLit shader onto it. Put this material on the stuff in your scene.

Find the material we were using last time in your project files and click on it so that you can see it in the Inspector. (You can find it easily by searching your assets for t:material TexGen!)

In the inspector, click the gear icon in the upper right corner and select Copy Material Properties.

Then click the TexGenLit material so that you can see it in the Inspector. Click the gear icon and this time select Paste Material Properties.

Because the properties have the same names, they'll paste in perfectly. Saved ya a few clicks.

For some reason the scale/offset values don't copy over, so you can tweak the TexGenUnlit blend texture to tile at 4x4 for a better match.

 So now we can preview what our generated textures will look like out in the wild. Still, seems a bit of a pain to maintain two shaders. If we make a tweak in one, we have to remember to tweak the other. That is, unless they both reference the same code. Let's make a texture generation shader include! That way we only have to tweak anything once!

Open the Shaders/TexGen folder in your operating system's file explorer and create a new text file. Call it TexGen.cginc. (Be sure to change the file extension from .txt to .cginc.) In Visual Studio in the Solution Explorer tab, right click the Shaders/TexGen folder and select Add > Existing Item and pick the TexGen.cginc file you just made.

Once it's in your VS project, open TexGen.cginc

Just like with our toon lighting library, we'll start by adding some code that will keep us from accidentally defining this library twice.

#ifndef TEXTURE_GENERATION_LIBRARY_INCLUDED
#define TEXTURE_GENERATION_LIBRARY_INCLUDED

// everything we add to the library goes here, between the #define and the #endif

#endif

Since we're only going to be using this library for texture generating utility shaders, let's try something different from our toon lighting library - let's define the properties we use in both shaders in the library this time! This means we'll have to make sure we don't define them twice in our shaders, but it will also let us share more code.

Let's put this into our library. 
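A sketch of the library contents (they go between the #define and the #endif), assuming the same stand-in names and the lerp blend from last time:

sampler2D _Tex;
sampler2D _BlendTex;
sampler2D _Mask;
half _BlendPower;

half4 Mix2(float2 uv, float2 blendUV, float2 maskUV)
{
    half mask = tex2D(_Mask, maskUV).r * _BlendPower;
    return lerp(tex2D(_Tex, uv), tex2D(_BlendTex, blendUV), mask);
}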

Now let's tweak our TexGen shaders to use it!

In the unlit TexGen shader, add the line

#include "TexGen.cginc"

right under the include for UnityCG.cginc

Then remove the sampler2D declarations and the _BlendPower declaration, but leave the _ST ones. The vert function can be left the same, and the entire frag function can be replaced with the single line

return Mix2(i.packedUV.xy, i.packedUV.zw, i.maskUV);

Save and have a look back at Unity to make sure nothing's broken. If you did everything right, your unlit TexGen material should look exactly the same!

Now for the surface shader. Put #include "TexGen.cginc" under the other pragmas, then delete all the property declarations. The surface function can be simplified to three lines.

half4 mix = Mix2(IN.uv_Tex, IN.uv_BlendTex, IN.uv_Mask);
o.Albedo = mix.rgb;
o.Alpha = mix.a;

Save and check back on Unity again. Everything looking the same? Cool cool cool.

Mixing Normals

Since we're going to be previewing this mix with lit objects, let's get the full benefit by throwing on some normal maps! If you grabbed different textures from me, now's a good time to import the normal maps from the same set into your project. If you are using the same textures as me, you can find the normal maps attached to this post. They're both free textures from Textures.com.

Update the properties of TexGenLit to add slots for two normal maps.
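Something like this (the names here are stand-ins; just keep them consistent with the declarations we're about to add):

_Normal ("Normal Map", 2D) = "bump" {}
_BlendNormal ("Blend Normal Map", 2D) = "bump" {}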

Then declare these in the cginc.
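(Same stand-in names:)

sampler2D _Normal;
sampler2D _BlendNormal;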

Then we'll add a function called MixNormal under our Mix2 function. We can get started by getting our mask value and unpacking the normal maps so we can blend them.
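A sketch of the starting point, blending with lerp for now:

half3 MixNormal(float2 uv, float2 blendUV, float2 maskUV)
{
    half mask = tex2D(_Mask, maskUV).r * _BlendPower;
    half3 normal = UnpackNormal(tex2D(_Normal, uv));
    half3 blendNormal = UnpackNormal(tex2D(_BlendNormal, blendUV));
    return lerp(normal, blendNormal, mask);
}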

So approaching the issue of how to blend the normals...using lerp is okay enough, but it's really not great for blending normals because normals aren't colors, they're directions. If one normal map is flat and we blend it with a normal map that's bumpy, we don't want the end result to be something half as bumpy as the second normal map. That'll just make everything look flatter all around. We don't want flatness to cancel out bumpiness, we just want the opposite bumpiness to cancel out bumpiness. Flatness shouldn't affect the blend at all.

Fortunately Unity has a built-in function specifically for blending normals that we can use called BlendNormals! Let's include UnityCG.cginc at the top of our own cginc so we can use it.
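(It goes right under the #define line:)

#include "UnityCG.cginc"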

(Ignore that it's outlined in red in my screenshot, I'm using an HLSL plugin in VS that doesn't play nice with Unity's ShaderLab.)

There's a problem with BlendNormals though! It only takes two arguments! How are we going to say how strong the blend should be in one direction or another? Well, we could unpack our normals using the mask as the scale. Update your includes so we can use the UnpackScaleNormal function!
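UnpackScaleNormal lives in UnityStandardUtils.cginc, so the includes become:

#include "UnityCG.cginc"
#include "UnityStandardUtils.cginc"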

MixNormal can be updated like this. We'll use the inverse of the mask for the strength of the first normal map, just like how in a lerp the first value is the strongest when the mix value is closest to 0.
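Something like this:

half3 MixNormal(float2 uv, float2 blendUV, float2 maskUV)
{
    half mask = tex2D(_Mask, maskUV).r * _BlendPower;
    // the mask scales the second normal map and its inverse scales the first,
    // so flatness on one side doesn't fight bumpiness on the other
    half3 normal = UnpackScaleNormal(tex2D(_Normal, uv), 1 - mask);
    half3 blendNormal = UnpackScaleNormal(tex2D(_BlendNormal, blendUV), mask);
    return BlendNormals(normal, blendNormal);
}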

Now we can update our surface function to look like this:
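(A sketch; note the normal mix reuses the albedo UVs, which matters in a moment:)

void surf (Input IN, inout SurfaceOutputStandard o)
{
    half4 mix = Mix2(IN.uv_Tex, IN.uv_BlendTex, IN.uv_Mask);
    o.Albedo = mix.rgb;
    o.Alpha = mix.a;
    o.Normal = MixNormal(IN.uv_Tex, IN.uv_BlendTex, IN.uv_Mask);
}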

Save and go back to Unity. Drag the normal maps into the new slots and everything will look properly bumpy!

Note the scale/offset of the tile normal map isn't the same as the scale/offset of the tile albedo texture, but it still aligns properly. That's because we're unpacking the normal map with the same UV coordinates that we're using with the albedo texture. If that's confusing, you can hide the scale/offset fields for the normal maps by adding [NoScaleOffset] in front of the properties like this:
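(Using the same stand-in property names:)

[NoScaleOffset] _Normal ("Normal Map", 2D) = "bump" {}
[NoScaleOffset] _BlendNormal ("Blend Normal Map", 2D) = "bump" {}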

But it'll work fine either way.

Next problem: how are we going to save a separate blended together normal map?

You Shall Multi-Pass

In the past we've looked at multi-pass shaders as a way to layer on an effect, like when we use one pass to create an outline and another pass for toon shading. When we use a shader on an object in our scene, generally all the passes get applied one after another unless they have a special tag or directive, but that's not the only way to use multiple passes within the same shader. 

If we are using a blit operation to "stamp" a picture through a shader, we actually do that one pass at a time, and we can tell the blit operation to use a specific pass. This is really handy for post processing effects with major variations, as it lets you swap out which version you're using without further polluting your project's shader keywords. We can also use multiple passes with our texture generation tool to generate different kinds of textures.
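For reference, this is the Graphics.Blit overload in question; passing -1 (the default) means "use all passes":

public static void Blit(Texture source, RenderTexture dest, Material mat, int pass = -1);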

If we use one pass for the albedo mix and a different pass for the normal mix, then we can use them to output each combined image separately. Let's tweak our unlit TexGen shader to let us do that! Copy the properties from TexGenLit into the properties section of TexGenUnlit.

Now we'll need to rearrange the shader a little to let us use multiple passes more easily. After your tags but before the Pass block, put

CGINCLUDE
ENDCG

As we've seen when making vert/frag shaders before, we can use CGINCLUDE blocks to hold code that every pass in the shader can use. So let's start by moving EVERYTHING into the CGINCLUDE block! Then we're left with this sad little Pass block.

We can use this pass for the albedo output. Let's rename our frag function FragAlbedo. Delete the #pragma fragment frag line and put #pragma fragment FragAlbedo inside the pass.
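The pass ends up looking something like this (I'm writing the vertex pragma inside the pass too, to be explicit):

Pass
{
    CGPROGRAM
    #pragma vertex vert
    #pragma fragment FragAlbedo
    ENDCG
}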

Let's add a different frag function that we can use for our Normal Map output.

We'll use the float4 data type instead of half4 because we want to preserve as much accuracy in our normal maps as possible, since they're directions and not colors.
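A sketch, reusing the v2f UV fields from our unlit frag function:

float4 FragNormal(v2f i) : SV_Target
{
    return float4(MixNormal(i.packedUV.xy, i.packedUV.zw, i.maskUV), 1);
}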

Now we can add another pass that looks like this:
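Pass
{
    CGPROGRAM
    #pragma vertex vert
    #pragma fragment FragNormal
    ENDCG
}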

If you save and go back to Unity, you'll notice something weird about the material preview, though. It's just a blue square!

If this were a shader we were using on an object in our scene, the normal mix pass comes last, so that's what the final result would look like, and that's why the preview looks like this. Let's try dragging in the normal map textures to see what we get.

Okay but that doesn't look like any normal map I've ever seen! What gives?

Well, think about it: a direction can be positive or negative, right? But it doesn't make sense to have a positive or negative color, so the data we get from a normal map has to come to us as RGB values between 0 and 1. That means the actual normal value we use covers a range twice as wide as the one we get from the texture. We don't want to mix our normal maps as if they were albedo textures, though; that'll just get us flat and blah results, because the flat parts of one normal map will cancel out the bumpy parts of the other. So we really do have to unpack the normal maps before blending them. But if we saved this unpacked blend to an image, there's no way it would be read correctly: it doesn't resemble the input normal maps at all! So how do we fix this?

In TexGen.cginc add this
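(A sketch; we'll dig into why it looks like this below.)

inline float4 RepackNormal(float3 normal)
{
    #if defined(UNITY_NO_DXT5nm)
        // plain RGB platforms: squish -1..1 back into 0..1
        return float4(normal * 0.5 + 0.5, 1);
    #else
        // DXT5nm layout (1, y, 1, x)
        return float4(1, normal.y * 0.5 + 0.5, 1, normal.x * 0.5 + 0.5);
    #endif
}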

Then change the FragNormal function to this
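float4 FragNormal(v2f i) : SV_Target
{
    return RepackNormal(MixNormal(i.packedUV.xy, i.packedUV.zw, i.maskUV));
}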

Save and peek back at Unity. Beautiful.

I know what you're thinking! What in the name of Sam Hill is going on here? It all has to do with the way normal maps are used to keep information. Remember, normal maps aren't really pictures for RGB color, they're just convenient places to keep XYZ directions.

If you look at an albedo texture in the inspector, you'll see something like this in the preview pane.

If you look at a normal map you'll see this

Did you spot the difference? One is marked DXT1 and the other is marked DXTnm. That indicates the format that should be used for interpreting the data in the texture. That's why when you drop a texture that isn't imported as a normal map into a normal map slot, Unity notices and prompts you to change it. It actually changes the colors that Unity thinks are there in the texture!

Let's look at the RepackNormal function again.

I wrote this by looking at the UnpackNormal function in UnityCG.cginc and going in reverse. Here it is:
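(Trimmed to the relevant functions:)

// Unpack normal as DXT5nm (1, y, 1, x) or BC5 (x, y, 0, 1)
// Note neutral texture like "bump" is (0, 0, 1, 1) to work with both plain RGB normal and DXT5nm/BC5
fixed3 UnpackNormalmapRGorAG(fixed4 packednormal)
{
    // This do the trick
    packednormal.x *= packednormal.w;

    fixed3 normal;
    normal.xy = packednormal.xy * 2 - 1;
    normal.z = sqrt(1 - saturate(dot(normal.xy, normal.xy)));
    return normal;
}

inline fixed3 UnpackNormal(fixed4 packednormal)
{
#if defined(UNITY_NO_DXT5nm)
    return packednormal.xyz * 2 - 1;
#else
    return UnpackNormalmapRGorAG(packednormal);
#endif
}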

What are they doing? Why is it like that? The key is in the comment! It's that there are two different major formats for encoding directional data into a texture we call a normal map, and Unity's shader code is trying to cover all their bases in a way that is seamless to the end user. And some platforms don't even read normal maps like others do, so that's being accounted for as well. We can address that first.

Per Unity docs, "UNITY_NO_DXT5nm is set when compiling shader for platform that do not support DXT5nm, meaning that normal maps will be encoded in RGB instead." So that's why that define is there. In that case, when unpacking the normal map, the values we get from the texture are just doubled so that we can have negative values. Therefore when repacking the normals, we squish them back into half the range while maintaining the same proportions. Simple enough; we even did that recently with sine waves!

Okay but if we're on a PC or Mac, we CAN read in DXT5nm format, so what the heck is going on there?

Swizzle My Normals

There are two major normal map encoding formats, DXT5nm and BC5. In DXT5nm, the alpha and green channels are used to keep the X and Y directions, and red and blue aren't even used, so 1s are put there. In BC5 format, the red and green channels keep the X and Y directions, and blue and alpha aren't used, so a 0 is put in the blue channel and a 1 in the alpha channel. In both cases, green is used because it can hold more precise numbers. Red or alpha are used next because they're the next most precise. Apparently blue has the least precision. Keeping the directional data like this is called swizzling! I swear I am not making that up!

You might notice something interesting about this little scheme. In both formats we're accounting for X and Y, but what about Z? Where's the Z? There is no Z! Z is invented when the normal maps are unpacked. See for yourself!

At no point is the B channel of the normal map texture used at all! The alpha channel is multiplied by the Red channel because regardless of the format, one of those will be 1, so multiplying them together will get whichever one actually has data in it. But past that we're only seeing RG/XY being used! 

When I found that out, it was a little disillusioning. And here I thought Z mattered. Is our life a lie? No! There is a good reason for this, and we can use it to our advantage!

If you want to follow along with me, go back to our old friend the Frame Debugger (via Window > Analysis > Frame Debugger) 

Click Enable, then click RenderDeferred.GBuffer in the left pane and select Render Target 2 in the right pane.

Render Target 2, or GBuffer2, contains the scene's normals. You'll see this:

You know what's really interesting, though: this looks absolutely nothing like the normals it's coming from!

Also if you move any of the objects, you'll notice that the colors are mostly on the same sides, even if you rotate it.

That's because there are actually two kinds of normals: local normals and world normals! GBuffer2 is showing us world normals, in other words, the direction of everything relative to the scene. That's used to create the scene lighting, because we want to use the direction of everything in a consistent way so that it all looks like it's together in the same space being lit by the same light. (We take that for granted when we talk about how all this is in the same scene, but do remember that none of this is actually real!) The other kind of normals are local normals, the direction of everything relative to the object itself. Most of the time when we use normal maps, we use them for local normals because we want them to stay in the same place relative to the object if the object is rotated. It would be weird if the square impressions on the concrete tile slid away from the squares on the texture!

For that reason, in most normal maps we use XY direction as a way to come up with the perceived depth because we're working in local space. That said, you CAN use normal maps that provide world normal directional data! We'll probably look at that at some point. Those are really handy for environmental scenery that is never ever going to move! But since that depends on how everything is going to be set up in your scene, it's not likely you'll find many of those in texture packs lying around on the internet -- you're more likely to encounter one if you're working with a team and you're given one by your environment artist. 

So anyway, back to business -- let's generate a new image from our combined normals!

Multi-Pass Blitting 

Let's open our Texture Generation tool again. You can find it at Tools > Xibanya > Texture Generator

If you drag in the unlit TexGen material, click Generate, and pick a save location, you get the combined normal map, hooray!

But what if we want to tweak the albedo textures? We need to update our tool to let us pick what kind of output we want! Open the TextureGenerator script in Visual Studio!

Just like with the resolution, we'll make a class-local enum for our passes and assign the actual pass index to each enum value so we can easily cast from the enum to the pass index.
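Something like this (Normal is my stand-in name for the second option):

private enum OutputType
{
    Albedo = 0,
    Normal = 1,
}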

In fact these indices are assigned implicitly, but it's nice to see them written out. Next add an outputType field under your other fields.
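private OutputType outputType;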

Since we're not assigning a default value to it, it will default to the first option, Albedo, which seems good to me. Add this line under the other enum popup line in OnGUI

outputType = (OutputType)EditorGUILayout.EnumPopup("Output Type", outputType);

At this point you should have something like this:

Now scroll down to Generate() and find the line with Graphics.Blit. We're going to tweak that so that we're blitting through the pass that goes with our selected output type.
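Assuming stand-in names for the material and destination texture (match whatever your Generate() already uses), the tweaked call looks something like this:

// blit through whichever pass matches the chosen output type
Graphics.Blit(null, buffer, material, (int)outputType);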

Save and have a peek back at Unity. Now you can switch between the output types with the dropdown! 

But more importantly you learned the word "swizzle"!

The code we worked on in this tutorial is attached to this post. If you have any questions or want to share what you come up with, let me know in the comments here, on Twitter, or in Discord. And if this tutorial helped you out, please consider becoming a patron!  

This tutorial is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License. The code shared with this tutorial is licensed under a Creative Commons Attribution 4.0 International License.

