Convert non-color managed Unreal content

To be able to reproduce consistent colors across different displays, devices and platforms, Pixotope makes certain assumptions about how the content should be created. For the most part this is consistent with how you would normally work within Unreal, but Unreal offers many different ways of achieving a specific look and some of these could make it more difficult to have consistent colors.

At a high level, Pixotope works under these principles:

  1. Exposure should be static, meaning we strongly discourage using dynamic (auto) exposure.

  2. Exposure compensation should be 0. Be aware that raising or lowering exposure compensation changes the appearance of predefined colors. For example, if your scene contains a logo with a specified color of pure white, raising the exposure compensation will turn it into a super-bright color (see the sketch after this list).

  3. All rendering and compositing happen in scene-referred linear space, and all textures (both on 3D objects and video feeds) are converted from their native color space to linear on ingest.

  4. All color manipulations happen in 32-bit floating point space and are non-destructive, meaning that blacks are never crushed and whites never clip.

  5. Pixotope expects lighting to be set up in a "physically plausible" way, so that exposure levels in the 3D scene match those in the real world.

  6. Creative tone mapping and color space conversions are separated and can be applied independently. This is different from how Unreal works.

  7. Unreal has multiple explicit and implicit ways of altering colors. Pixotope offers ways to bypass all of these for a selected surface, so that its textures are passed through untouched. This is typically used for video feeds.

  8. Pixotope’s video processing always happens in ACEScg. Unreal is by default set to Linear sRGB, but can be switched to ACEScg. This will cause a small shift in colors, but is necessary if you want the graphics to use wide-gamut colors, which is required for HDR.
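
To make points 2 and 3 concrete, here is a minimal sketch in Python. It is purely illustrative, not Pixotope or Unreal code, and the function names are hypothetical: exposure compensation acts as a simple 2^EC gain on linear values, and non-linear (sRGB-encoded) textures are converted to linear light on ingest.

```python
# Illustrative sketch only (not Pixotope's or Unreal's actual code).

def srgb_to_linear(c: float) -> float:
    """Standard sRGB decode (IEC 61966-2-1) from encoded value to linear light."""
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

def apply_exposure_compensation(linear_value: float, ec_stops: float) -> float:
    """Exposure compensation in stops is a 2^EC gain on linear values."""
    return linear_value * (2.0 ** ec_stops)

# A "pure white" logo value of 1.0 stays 1.0 at EC = 0 ...
print(apply_exposure_compensation(1.0, 0.0))   # 1.0
# ... but becomes a super-bright 2.0 at EC = +1, no longer "pure white".
print(apply_exposure_compensation(1.0, 1.0))   # 2.0

# Textures and video feeds are linearized on ingest, e.g. sRGB mid-gray 0.5:
print(srgb_to_linear(0.5))                     # ~0.214
```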

Tone mapping

Because our eyes and cameras have a built-in "tone mapping" that compresses and "rolls off" highlights and lifts parts of the shadows, we also need to simulate this effect when we output our graphics; otherwise the result would look very harsh and contrasty. We call this "tone mapping".

Tone mapping is a creative tool for making the image more pleasing, and in Unreal it is implemented as the "Film" controls in the post-process settings. This tone mapper simulates how film stock reacts to light and creates the "standard" look in Unreal.
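
To illustrate what such a curve does, the sketch below uses John Hable's well-known "Uncharted 2" filmic operator as a stand-in. It is neither Unreal's Film tonemapper nor Pixotope's implementation; it simply shows how a filmic curve compresses and rolls off bright scene-linear values into a 0..1 display-relative range.

```python
# Generic filmic tone curve (Hable's "Uncharted 2" operator), shown only to
# illustrate the shape of a filmic roll-off; it is not Unreal's Film
# tonemapper and not Pixotope's implementation.

def hable_partial(x: float) -> float:
    A, B, C, D, E, F = 0.15, 0.50, 0.10, 0.20, 0.02, 0.30
    return ((x * (A * x + C * B) + D * E) / (x * (A * x + B) + D * F)) - E / F

def filmic_tonemap(x: float, white_point: float = 11.2) -> float:
    """Map a scene-linear value to a 0..1 display-relative value."""
    return hable_partial(x) / hable_partial(white_point)

for scene_value in (0.05, 0.18, 1.0, 4.0, 8.0):
    # Bright values (4.0, 8.0) are compressed far below where a straight
    # linear mapping would place them.
    print(scene_value, round(filmic_tonemap(scene_value), 3))
```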

Unreal’s tone mapper (Film)

Issues

  • In addition to tone mapping, it bakes in a transfer curve from linear to Gamma 2.2/sRGB/Rec.709 space. This makes it unusable for any other type of output, especially HDR output.

  • Because it is parametric, it can be changed, altering the look of your output and making it harder to achieve a consistent look across displays, devices, etc.

  • Because it is on by default, artists will tweak their lights, colors, and exposure to create a pleasing image without necessarily taking into account how the graphics will be used downstream, and they may have to redo creative work to reuse the content, for example for HDR output.

Pixotope’s tone mapper

To address these challenges, Pixotope comes with a custom version of the Filmic tone mapper, which gives creatives more freedom to achieve their desired look while maintaining color consistency.

It separates tone mapping from color space conversion, allowing the artist to achieve the same look and feel while being able to use the tone mapper in more scenarios.
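
A minimal sketch of this separation, using hypothetical function names rather than Pixotope's actual API: the creative tone curve operates on scene-linear values, and the display encode is a separate step that can be swapped per output.

```python
# Hypothetical sketch (not Pixotope's actual API) of keeping the creative tone
# curve and the display encode as separate, independently swappable steps.

def creative_tonemap(x: float) -> float:
    """Placeholder creative roll-off on scene-linear values
    (a simple Reinhard curve stands in for the real filmic curve)."""
    return x / (1.0 + x)

def encode_srgb(x: float) -> float:
    """Standard sRGB encode (linear -> display code value) for SDR output."""
    x = min(max(x, 0.0), 1.0)
    return 12.92 * x if x <= 0.0031308 else 1.055 * (x ** (1.0 / 2.4)) - 0.055

def render(scene_linear: float, display_encode=encode_srgb) -> float:
    # The creative look is defined once, on scene-linear data; the display
    # encode is a separate step that can be swapped (e.g. for a PQ/HLG encode
    # when targeting HDR) without touching the creative work.
    return display_encode(creative_tonemap(scene_linear))

print(render(4.0))  # sRGB code value for an SDR display, ~0.91
```

With Unreal's default Film tonemapper, both steps are effectively fused into one pass, which is why its output is tied to a Gamma 2.2/sRGB/Rec.709 display as described above.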
