Blender to Nuke Workflow - Render Passes

Tutorial / 26 November 2023

Recently, I have been working on a short film that has involved multiple pieces of software in the pipeline. A sequence of shots I am working on is being rendered out in Blender with the Cycles render engine. This is my first time using Blender in a professional pipeline, however, so I had to learn how to render the shots out correctly with all the proper render passes. I had previously worked in a similar pipeline using Maya and RenderMan, so I had prior knowledge of what the end result was supposed to look like when I brought everything into Nuke.

In the process of figuring it all out, though, I came to find that it was going to be a bit less straightforward than I had originally expected. There were a few obstacles along the way, but it was a good learning experience. There is still more I can learn, but for now I thought I'd share my findings by going through the same process I used, this time re-rendering a model I made during my early studies at the Academy of Interactive Entertainment to post on Instagram.


What will we need?

The first step is to figure out all the components we want to bring into Nuke.

For my purposes, these are the types of passes I would like:

  • RGBA
  • Direct diffuse and specular
  • Indirect diffuse and specular
  • Emission
  • Ambient occlusion
  • Cryptomatte

So, the first place I looked was the View Layer Properties tab, which has all the options for render passes, and it already looked promising enough.
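If you prefer to set this up with a script rather than clicking through the UI, the same passes can be enabled through Blender's Python API. This is just a minimal sketch assuming a recent Blender version (the cryptomatte toggles moved onto the view layer around 2.93):

    import bpy

    view_layer = bpy.context.view_layer

    # Beauty, emission and ambient occlusion
    view_layer.use_pass_combined = True
    view_layer.use_pass_emit = True
    view_layer.use_pass_ambient_occlusion = True

    # Direct, indirect and color for diffuse and glossy (specular)
    view_layer.use_pass_diffuse_direct = True
    view_layer.use_pass_diffuse_indirect = True
    view_layer.use_pass_diffuse_color = True
    view_layer.use_pass_glossy_direct = True
    view_layer.use_pass_glossy_indirect = True
    view_layer.use_pass_glossy_color = True

    # Cryptomatte
    view_layer.use_pass_cryptomatte_object = True
    view_layer.use_pass_cryptomatte_material = True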


One thing I noticed was different, though, was the "Color" pass underneath Diffuse and Glossy. In RenderMan, the color information is baked into the direct and indirect passes of diffuse and specular, and in post they are simply added together. From what I discovered in Blender, however, the direct and indirect passes are mostly just the lighting values, and the color pass needs to be multiplied by the sum of those two passes.
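In other words, the beauty roughly rebuilds as something like this (using the pass names Blender writes into the EXR):

    beauty ≈ (DiffDir + DiffInd) * DiffCol
           + (GlossDir + GlossInd) * GlossCol
           + Emit  (+ transmission/volume terms if your scene uses them)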


How do we render the passes?

To render out all the passes, make sure to change your file format to OpenEXR MultiLayer so that all your passes end up in the images. I'm also going to use a lossy codec, since I am basically at the end of the pipeline and the file sizes are roughly 5x smaller. If you're in a bigger production pipeline and want to be completely safe from losing any quality, though, it is probably better to use a lossless codec.
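For reference, this is roughly what those output settings look like if you set them from Python; the codec choice is the trade-off mentioned above, and the file path here is just a placeholder:

    import bpy

    scene = bpy.context.scene
    scene.render.image_settings.file_format = 'OPEN_EXR_MULTILAYER'
    scene.render.image_settings.color_depth = '16'    # half float is plenty for most passes
    scene.render.image_settings.exr_codec = 'DWAA'    # lossy, much smaller files
    # scene.render.image_settings.exr_codec = 'ZIP'   # lossless, if you can't afford any quality loss
    scene.render.filepath = '//renders/YOURPROJECT.'  # placeholder path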


Now comes the first hurdle I had to overcome: when viewing the passes in Nuke, it appeared that a lot of them had not been denoised, which would become an issue when I started to combine them to rebuild the beauty pass. After some googling and searching through various forums, I found a workaround that involves Blender's compositor, which, while I enjoyed messing around with it when I was first learning 3D, does not have the same capabilities as a more in-depth compositor such as Nuke.

To get denoised passes out of Blender, we have to set up a node tree in the compositor.

Outputting passes

In the compositor, enable "Use Nodes", and if you want to view the passes, enable "Backdrop"; both are at the top of the window by default. By default, Blender will only export the image that is plugged into the "Composite" node, with the output settings from the Output tab determining the location and format of that export. That doesn't work if we want to make changes to the passes and still export them all separately, so to work around it, we can create a "File Output" node to specify exactly what we want to export.

In the Node tab of the sidebar (press "N" if there is no sidebar), with the File Output node selected, expand "Properties" and you can choose your export settings just like in the Output tab. Here, we can start adding inputs for all of our passes and naming them accordingly. If you are using a lossy codec, it is a good idea to create a separate File Output node just for your cryptomatte passes, as they need to be 32-bit and lossless, otherwise they won't work in Nuke. Also make sure to name your paths so they don't clash, e.g. YOURPROJECT.####.exr and YOURPROJECT_cryptomatte.####.exr, so that the cryptomatte doesn't overwrite your render passes (the #### formats the frame numbers like 0001, 0002, etc.).
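The same setup can also be built in Python if you'd rather script it. This is only a sketch: the slot names are my own, and the exact slot API (file_slots vs layer_slots) can vary a little between Blender versions and output formats:

    import bpy

    scene = bpy.context.scene
    scene.use_nodes = True
    tree = scene.node_tree
    rl = tree.nodes.new('CompositorNodeRLayers')           # Render Layers source

    # Main File Output node for the render passes
    out = tree.nodes.new('CompositorNodeOutputFile')
    out.base_path = '//renders/YOURPROJECT.'               # Blender appends the frame number
    out.format.file_format = 'OPEN_EXR_MULTILAYER'
    out.format.exr_codec = 'DWAA'                          # lossy is fine here
    # The node starts with one slot named 'Image'; add the rest of the passes
    for name in ['DiffDir', 'DiffInd', 'DiffCol',
                 'GlossDir', 'GlossInd', 'GlossCol', 'Emit', 'AO']:
        out.file_slots.new(name)

    # Separate 32-bit, lossless output just for the cryptomatte layers
    crypto_out = tree.nodes.new('CompositorNodeOutputFile')
    crypto_out.base_path = '//renders/YOURPROJECT_cryptomatte.'
    crypto_out.format.file_format = 'OPEN_EXR_MULTILAYER'
    crypto_out.format.color_depth = '32'
    crypto_out.format.exr_codec = 'ZIP'                    # lossless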

Finally, to denoise the passes that need it, add "Denoise" nodes and place them between the Render Layers outputs and the File Output inputs. You can press "H" to collapse a node and tidy up the tree, and Shift + Right-Click drag to add reroutes.
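Continuing the sketch above (same hypothetical variable names), wiring a Denoise node between a Render Layers output and the matching File Output input looks something like this:

    # The lighting passes get a Denoise node in between; the color/albedo-style
    # passes (DiffCol, GlossCol) are usually clean already and connect straight through.
    for pass_name in ['DiffDir', 'DiffInd', 'GlossDir', 'GlossInd']:
        dn = tree.nodes.new('CompositorNodeDenoise')
        tree.links.new(rl.outputs[pass_name], dn.inputs['Image'])
        tree.links.new(dn.outputs['Image'], out.inputs[pass_name])

    for pass_name in ['Image', 'DiffCol', 'GlossCol', 'Emit', 'AO']:
        tree.links.new(rl.outputs[pass_name], out.inputs[pass_name])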


Unfortunately, as far as I am aware, there is no way of turning off Blender's default output and exclusively exporting through your File Output nodes. Because of this, you'll just have to point the default output somewhere you'll remember to delete so that it doesn't take up too much of your storage space.

Now just fire off your render with Ctrl/Cmd + F12.

Bringing it all into Nuke

Open up a new Nuke comp and drag in the folder with your EXRs; it will automatically read in the image sequences. If you can't see anything when viewing the main sequence, it might be because the rgb channel isn't named what Nuke expects: by default, Blender outputs its combined image to ViewLayer_Combined, so you can just add a Shuffle node to direct that layer into rgba.
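If you like building these things from Nuke's Script Editor, the shuffle looks something like this (the Read node name is hypothetical, and this uses the classic Shuffle node's in/out knobs):

    import nuke

    read = nuke.toNode('Read1')                 # hypothetical name of the EXR Read node
    beauty_shuffle = nuke.nodes.Shuffle(inputs=[read])
    beauty_shuffle['in'].setValue('ViewLayer_Combined')
    beauty_shuffle['out'].setValue('rgba')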

At this point, in most other pipelines, we would just add together all the passes we need to build back up to the beauty pass, but as I mentioned before, it's different for Blender's render passes. First, shuffle out all your passes and add together your indirects and directs, then multiply the result by the associated color pass, and finally add those multiplied passes together along with anything else you need to match the beauty pass (if it doesn't match, check whether you also need to render out transmission or volume passes).
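As a rough sketch of that graph in Nuke Python (layer names assume the File Output slots were named as in the Blender script earlier; only diffuse is shown, glossy follows the same pattern):

    def pull_layer(src, layer):
        # Shuffle a single EXR layer into rgba
        node = nuke.nodes.Shuffle(inputs=[src])
        node['in'].setValue(layer)
        return node

    diff_dir = pull_layer(read, 'DiffDir')
    diff_ind = pull_layer(read, 'DiffInd')
    diff_col = pull_layer(read, 'DiffCol')

    diff_light = nuke.nodes.Merge2(inputs=[diff_ind, diff_dir], operation='plus')
    diffuse    = nuke.nodes.Merge2(inputs=[diff_col, diff_light], operation='multiply')
    # Do the same for GlossDir/GlossInd/GlossCol, then 'plus' diffuse, glossy
    # and emission together to rebuild the beauty.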

At the end of the chain, add a Copy node to get your alpha back, and you should now have the same image as your beauty pass. The only other thing to add is a Cryptomatte node attached to your cryptomatte image sequence.
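Those last two nodes might look something like this (the variable and Read node names are hypothetical, and the Cryptomatte node assumes the native/plugin Cryptomatte gizmo is available in your Nuke version):

    # Copy the original alpha from the Read (A input) onto the rebuilt beauty (B input)
    beauty_with_alpha = nuke.nodes.Copy(inputs=[rebuilt_beauty, read])
    beauty_with_alpha['from0'].setValue('rgba.alpha')
    beauty_with_alpha['to0'].setValue('rgba.alpha')

    # Cryptomatte node pointed at the separate cryptomatte sequence
    crypto = nuke.nodes.Cryptomatte(inputs=[nuke.toNode('Read2')])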


Now comes another hurdle: when zooming in on some parts of the render, I noticed details that were lost between the original beauty pass and the version comped together from the denoised passes. I tried some interesting methods, like taking the difference between the two versions and adding it back to the new one, but I couldn't quite get the same result. In the end, I decided to use supersampling: I rendered the images again at double the resolution with fewer samples, then scaled them back down with a Reformat node at the end in Nuke. This gave me much better results and didn't add a significant amount of render time, but always experiment with what render settings work best for you, as it really depends on the scene.
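On the Blender side, the supersampled render is just a couple of setting changes (the sample count here is only an example value, not a recommendation); the Reformat node in Nuke then scales it back to the delivery resolution:

    import bpy

    scene = bpy.context.scene
    scene.render.resolution_percentage = 200   # render at double the output resolution
    scene.cycles.samples = 64                  # fewer samples than the full-quality render (example value)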


And that's it! I was now able to work with all the render passes I wanted from Blender in Nuke.

I tweaked the glossy passes, used the cryptomatte to selectively add glow to different areas, did some color correction, added a background, and got some nice looking results!


Also, as a quick bonus tip: if you're exporting EXRs from your Nuke comp to go into video editing software such as DaVinci Resolve, the color space will be off, but you can fix it by grabbing the OCIO config from your Blender installation, typically at Blender->VERSION->datafiles->colormanagement->config.ocio

In Resolve, this is adjusted in the Fusion tab by adding an OCIO Colorspace node, selecting the config.ocio file, and choosing Linear for your source and sRGB for your output.


These were all just my findings from trying to get a working pipeline from Blender to Nuke for my own projects. It was fun applying this to an old asset I had made to get some fancier renders for my Instagram, since I somehow forgot to post it when I originally made it.

Hope that some of this has helped!