Buffer Name | Description | Channels 1) | Sample Image |
---|---|---|---|
Final Render | This buffer contains the final, rendered image. It is identical to the image saved using the LightWave 3D render globals. | RGBA | |
Raw Colour | This is the basic surface colour without any shading applied to it. | RGB | |
Reflection Colour | This colour buffer stores the reflections of surfaces. It could also be referred to as reflection shading. | RGB | |
Diffuse Colour | This buffer contains the shading due to the surface's colour and the diffuse shading. | RGB | |
Specular Colour | This buffer stores the shading due to the specularity settings. | RGB | |
Refraction Colour | This buffer stores the refractions of surfaces – essentially the shading due to transparency. | RGBA | |
Backdrop Colour | This buffer just stores the backdrop colour. | RGB | |
Pre-effect Colour | This is the final rendered image before any pixel filter or image filter plugins are applied to it. | RGB | |
Special | The special buffer is quite … “special” indeed 2) . In the Surface Editor, under the Advanced tab, there is a button to edit the Special Buffers. LightWave 3D provides four of them, applied on a surface by surface basis. A value of 0.0 in a special buffer is equivalent to black, 1.0 to white (values can go beyond those limits as they are stored in float/HDR buffers). The tricky bit is that an image filter can only read one of these special buffers. Which one depends on the position of the image filter plugin in the list of applied image filters in the Processing tab of the Effects panel in LightWave 3D. The first image filter in the list can only read special buffer #1, the second only special buffer #2, the third only buffer #3 and the fourth only buffer #4. All image filters beyond that only have access to special buffer #4. In practice this means that exrTrader needs to be added to the list four times as an image filter to be able to access all four special buffers. In this case we'd recommend using the first instance of exrTrader in the list to save all the other buffers as well, and only using slots 2-4 to save the special buffers – to separate image files. Please note that, given the AOVs in LightWave 2017, this buffer is obsolete nowadays. | Y | |
Luminous | This greyscale buffer stores the Luminosity value of a surface. | Y | |
Diffuse | This greyscale buffer stores the diffuse value of the surfaces, as defined in the surface editor. | Y | |
Specular | This greyscale buffer stores the specularity value of the surfaces, as defined in the surface editor. | Y | |
Mirror | This greyscale buffer stores the reflectivity value of the surfaces, as defined in the surface editor. | Y | |
Transparency | This greyscale buffer stores the transparency value of the surfaces, as defined in the surface editor. | Y | |
Shading | This buffer contains the shading due to the diffuse and specularity values as a greyscale image. | Y | |
Shadow | This buffer highlights the areas of the image that receive shadows from lights. | Y | |
Geometry | This buffer stores the orientation of surfaces toward the camera. Facing surfaces are white while surfaces that face away from the camera are black. | Y | |
Depth | This buffer stores the distance of a surface from the camera. Because the depth is stored as a proper float image and the pixels represent the distance to the camera in metres, there are a few facts to remember. The first image to the right shows the normalized depth buffer for the sample render. There isn't much to see because the top of the image is the backdrop, and in LightWave the backdrop is always rendered at an infinite distance 3). A normalized view looks at the highest and the lowest value in an image and then tries to display everything between them in the visible range from 0.0 to 1.0. In this case the backdrop is compressed to be at 1.0, and all other values are compressed accordingly. This means that the actual items are relatively close to the camera (at least in relation to the backdrop) and are thus displayed as black (which corresponds to a distance of 0, or something very close to it). If you intend to composite in an application that handles float images (and the tools used can actually work with a float depth buffer) then there should be no need to change anything. If you intend to save the depth buffer as a low dynamic range image, you can use the Minimum and Maximum settings to define a range which is then normalized into the visible range. The second image illustrates this: it is the same depth buffer as in the first image, but the Maximum has been set to 10.0. | Y | |
Motion | This buffer stores the motion of a pixel during the current frame in screen space. The motion is encoded in the R and B channels. Since LightWave 3D creates float channels, the values represent the movement in pixels. This implies that the values may be negative as well, depending on the direction of movement. If the compositing package can deal with floating point motion buffers, this is the best way to export them. Some plugins require the motion vectors to be normalized into values from 0.0 to 1.0, where 0.5 is equivalent to no motion at all. The Offset and Scale processing options allow you to change the motion buffer to be acceptable to such plugins 4) – as a downside, one needs to estimate how far any of the pixels actually travels (in pixels), or use an arbitrary value such as the largest dimension of the image in pixels. As an example, using 1920 as the largest distance a pixel could travel: the Offset would need to be 1920 – to push the negative values into the positive range – and the Scale would need to be 1 / (1920 * 2) (roughly 0.00026) – to compress the values into a range from 0.0 to 1.0. | XY | |
Normals | This buffer stores the normals in world space coordinates of the rendered surfaces. | XYZ | |
Surface ID | | | |
Object ID | | | |
Radiosity | | RGB | |
Ambient Occlusion | This buffer is rendered as a separate render pass in LightWave and is only available if radiosity is used to render the current frame. Important: since this buffer requires a separate render pass, exrTrader will only request it for a VIPER preview if it has actually been activated by the user. Please check the section on the AO Options for further options affecting this buffer. | RGB | |
UV Tangent Space Tangent UV Tangent Space Bitangent UV Tangent Space Normal | These three buffers only show data if a UV mapped image is applied to a surface. The images on the right were created by baking a UV mapped sphere (with the UV created as a spherical map). They basically represent the orientation of the UVs in object coordinates. Tangent Space Normal is the normal of the surface in object coordinates. The Tangent and Bitangent are the directions of the U and V coordinates (or slopes) in object space. | XYZ | |
Camera Tangent Space | | XYZ | |
Edges | | RGBA | |
Final Render (CC) | | RGBA | |
SSS Direct | | RGB | |
SSS Indirect | | RGB | |
Volumetric Direct | | RGB | |
Volumetric Indirect | | RGB | |
Volumetric Emission | | RGB | |
Translucency | | RGB | |
Transparency RGB | | RGB | |
Luminosity | | RGB | |
Luminosity | |||
Coverage | |||
World Position | XYZ | ||
Object Position | XYZ | ||
Texture Position | XYZ | ||
UV | UV | ||
dPdU | XYZ | ||
dPdV | XYZ | ||
Lens Flare | RGB | ||
Diffuse Direct | RGB | ||
Diffuse Indirect | RGB | ||
Specular Direct | RGB | ||
Specular Indirect | RGB | ||
Translucency Direct | RGB | ||
Translucency Indirect | RGB | ||
Fog | RGB | ||
Legacy Volumetric | RGB |
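The Special buffer routing described in the table (first filter sees buffer #1, second #2, and so on, capped at #4) can be sketched as a tiny helper. The function name and the 1-based position argument are our own illustration, not part of exrTrader:

```python
def readable_special_buffer(filter_position):
    """Return which special buffer (1-4) an image filter can read,
    given its 1-based position in the image filter list.
    Filters beyond position 4 only ever see special buffer #4."""
    if filter_position < 1:
        raise ValueError("image filter positions are 1-based")
    return min(filter_position, 4)

# Four instances of exrTrader are needed to reach all four buffers:
print([readable_special_buffer(p) for p in range(1, 7)])  # → [1, 2, 3, 4, 4, 4]
```

This also makes clear why a fifth or later instance gains nothing: it would read buffer #4 again.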
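The Minimum/Maximum remapping described for the Depth buffer amounts to a simple scale-and-clamp. A minimal sketch, with the function name assumed for illustration:

```python
def normalize_depth(depth, minimum=0.0, maximum=10.0):
    """Map distances (in metres) into 0..1 for LDR export;
    values outside the Minimum..Maximum range are clamped."""
    span = maximum - minimum
    return [min(max((d - minimum) / span, 0.0), 1.0) for d in depth]

# A backdrop pixel at a huge distance clamps to 1.0, while nearby
# geometry spreads over the visible range instead of collapsing to black.
print(normalize_depth([0.5, 5.0, 1e30]))  # → [0.05, 0.5, 1.0]
```

With the defaults matching the second sample image (Maximum set to 10.0), everything beyond ten metres simply saturates to white.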
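The Offset/Scale arithmetic for the Motion buffer can be checked with a few lines, using the 1920-pixel travel estimate from the description above (the helper itself is a sketch, not part of exrTrader):

```python
def encode_motion(value, max_travel=1920.0):
    """Remap one raw motion component (in pixels, possibly negative)
    into the 0..1 range, where 0.5 means no motion.
    Offset = max_travel, Scale = 1 / (max_travel * 2)."""
    offset = max_travel
    scale = 1.0 / (max_travel * 2)
    return (value + offset) * scale

print(encode_motion(0.0))      # → 0.5 (no motion at all)
print(encode_motion(-1920.0))  # → 0.0 (fastest leftward/upward motion)
print(encode_motion(1920.0))   # → 1.0 (fastest rightward/downward motion)
```

Note that the Scale of roughly 0.00026 quoted in the table is simply 1 / 3840 rounded.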
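The relationship between the UVs and the Tangent/Bitangent directions follows the standard per-triangle derivation: the tangent points along increasing U and the bitangent along increasing V, both in object space. A sketch under the assumption of a single triangle with per-vertex UVs (all names are ours, for illustration only):

```python
def tangent_bitangent(p0, p1, p2, uv0, uv1, uv2):
    """Derive object-space tangent (U direction) and bitangent
    (V direction) for one triangle from its positions and UVs."""
    # Edge vectors in object space and the corresponding UV deltas.
    e1 = [p1[i] - p0[i] for i in range(3)]
    e2 = [p2[i] - p0[i] for i in range(3)]
    du1, dv1 = uv1[0] - uv0[0], uv1[1] - uv0[1]
    du2, dv2 = uv2[0] - uv0[0], uv2[1] - uv0[1]
    # Inverse of the UV-edge determinant (degenerate UVs would divide by 0).
    r = 1.0 / (du1 * dv2 - du2 * dv1)
    tangent = [(e1[i] * dv2 - e2[i] * dv1) * r for i in range(3)]
    bitangent = [(e2[i] * du1 - e1[i] * du2) * r for i in range(3)]
    return tangent, bitangent

# A triangle whose UVs line up with X and Y yields axis-aligned vectors:
t, b = tangent_bitangent((0, 0, 0), (1, 0, 0), (0, 1, 0),
                         (0, 0), (1, 0), (0, 1))
print(t, b)  # → [1.0, 0.0, 0.0] [0.0, 1.0, 0.0]
```

The same deltas are what the dPdU and dPdV buffers express: the change of surface position per unit of U and V respectively.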