Graphs & Rendering
The Loki3D renderer works by gathering all of the cameras, lights, and geometry produced by the Graph. A Graph is a collection of nodes and the connections between those nodes.
At a minimum, you need a Camera to render anything. The renderer sorts the Cameras by their Priority and, for each camera, renders what it sees.
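The actual Loki3D API is not shown here, but conceptually a Graph is just nodes plus the connections between them. The sketch below is a minimal, hypothetical C++ illustration of that idea; the `Node`, `Camera`, `Connection`, and `Graph` types and their members are assumptions for illustration, not the real Loki3D classes.

```cpp
#include <memory>
#include <string>
#include <vector>

// Hypothetical node base class; the real Loki3D node types will differ.
struct Node {
    std::string name;
    virtual ~Node() = default;
};

// A camera node with the Priority property described in this section.
struct Camera : Node {
    int priority = 0;              // higher priority renders later
    bool rendersToTexture = false; // render to a texture or to the display
};

// A connection joins an output socket on one node to an input socket on another.
struct Connection {
    Node* from = nullptr;
    std::string fromSocket;
    Node* to = nullptr;
    std::string toSocket;
};

// The graph: a collection of nodes and the connections between those nodes.
struct Graph {
    std::vector<std::unique_ptr<Node>> nodes;
    std::vector<Connection> connections;

    template <typename T>
    T* add(std::string name) {
        auto node = std::make_unique<T>();
        node->name = std::move(name);
        T* raw = node.get();
        nodes.push_back(std::move(node));
        return raw;
    }
};

int main() {
    Graph graph;
    // At a minimum, the graph needs one Camera for the renderer to produce output.
    Camera* cam = graph.add<Camera>("MainCamera");
    cam->priority = 0;
    return 0;
}
```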
Loki3D Renderer
The Loki3D renderer is a Forward Renderer that, for each frame, gathers renderable nodes from the graph and renders them to the display. It does this in multiple stages:
- Gather all of the Cameras from the graph and sort them into two categories: cameras that render to a texture, and cameras that render to the display. Within each category, sort the cameras by their Priority property, where higher-priority cameras render after lower-priority cameras. Cameras with the same priority are sorted by the order they were added to the graph (see the camera-sorting sketch after this list).
- Gather all of the renderable Geometry from the graph and sort it into rendering queues: background, opaque, and transparent. Background geometry comes from Cameras that have connections to their Background property. Opaque geometry has materials with no transparency; transparent geometry has materials with some transparency. Each queue is sorted by distance according to its type: opaque geometry is sorted so that the geometry closest to the camera is drawn first, while transparent geometry is sorted so that the geometry farthest from the camera is drawn first. Geometry is then sorted by material, so that all geometry sharing a material is drawn together (see the queue-sorting sketch after this list).
- Gather all of the lights from the graph. These are used when rendering geometry whose material has lighting.
- Render each camera. Cameras that render to a texture are rendered first, sorted by camera Priority; cameras that render to the display are rendered next, also sorted by camera Priority.
- For each camera that renders, filter the geometry that is renderable by that camera, including frustum culling. Bind the camera's render target: either the camera's render texture target or the display render texture, which is an HDR render texture with MSAA anti-aliasing. Render each geometry object in the order established by the queue and material sorting.
- For each object rendered by the camera, find the lights closest to the camera, up to the maximum number of lights supported by the material. Bind the material if it is different from the last material used, apply its per-object material properties, then bind and draw the geometry, including all instances of that geometry (see the per-camera rendering sketch after this list).
- Resolve the camera render texture's MSAA. If the camera has a Filter connection, compute the image nodes connected to the Filter socket. The resolved render texture is used as the input image for any image nodes that do image processing but don't have an input (see the resolve-and-filter sketch after this list).
- If it's a display camera, composite the camera's render texture onto the display output.
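As a concrete illustration of the camera-sorting stage, here is a small, hypothetical C++ sketch. The `CameraNode` type and its fields are assumptions, not the actual Loki3D camera class. A stable sort keeps graph insertion order for cameras with equal priority, matching the behaviour described above.

```cpp
#include <algorithm>
#include <vector>

// Hypothetical camera record; not the actual Loki3D type.
struct CameraNode {
    int priority = 0;              // higher priority renders later
    bool rendersToTexture = false; // render-to-texture vs display camera
};

// Split cameras into the two categories and sort each by Priority.
// `all` is assumed to be in graph insertion order, so std::stable_sort
// preserves that order for cameras with equal priority.
void sortCameras(const std::vector<CameraNode>& all,
                 std::vector<CameraNode>& textureCameras,
                 std::vector<CameraNode>& displayCameras) {
    for (const CameraNode& cam : all) {
        (cam.rendersToTexture ? textureCameras : displayCameras).push_back(cam);
    }
    auto byPriority = [](const CameraNode& a, const CameraNode& b) {
        return a.priority < b.priority;   // lower priority renders first
    };
    std::stable_sort(textureCameras.begin(), textureCameras.end(), byPriority);
    std::stable_sort(displayCameras.begin(), displayCameras.end(), byPriority);
}
```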
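The queue-sorting stage could look roughly like the sketch below. The `Material` and `Renderable` structures, and the use of a precomputed distance to the camera, are assumptions for illustration only. The distance sort is applied first, then a stable sort by material, so geometry sharing a material is drawn together while the per-material distance ordering is preserved. The background queue is omitted for brevity.

```cpp
#include <algorithm>
#include <vector>

// Hypothetical structures; not the actual Loki3D types.
struct Material {
    int id = 0;
    bool hasTransparency = false;
};

struct Renderable {
    const Material* material = nullptr;
    float distanceToCamera = 0.0f;   // precomputed per camera
};

struct RenderQueues {
    std::vector<Renderable> opaque;
    std::vector<Renderable> transparent;
};

RenderQueues buildQueues(const std::vector<Renderable>& geometry) {
    RenderQueues q;
    for (const Renderable& r : geometry) {
        (r.material->hasTransparency ? q.transparent : q.opaque).push_back(r);
    }

    // Opaque: closest to the camera drawn first (front to back).
    std::sort(q.opaque.begin(), q.opaque.end(),
              [](const Renderable& a, const Renderable& b) {
                  return a.distanceToCamera < b.distanceToCamera;
              });
    // Transparent: farthest from the camera drawn first (back to front).
    std::sort(q.transparent.begin(), q.transparent.end(),
              [](const Renderable& a, const Renderable& b) {
                  return a.distanceToCamera > b.distanceToCamera;
              });

    // Then group by material; the stable sort keeps the distance ordering
    // within each material group.
    auto byMaterial = [](const Renderable& a, const Renderable& b) {
        return a.material->id < b.material->id;
    };
    std::stable_sort(q.opaque.begin(), q.opaque.end(), byMaterial);
    std::stable_sort(q.transparent.begin(), q.transparent.end(), byMaterial);
    return q;
}
```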
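The per-camera loop, covering culling, light selection, material binding, and drawing, might be structured like the following sketch. Every type and helper here (`isVisible`, `bindRenderTarget`, `bindMaterial`, `drawGeometry`, and so on) is a hypothetical stand-in, not the Loki3D API; the point is the control flow: cull, pick the closest lights up to the material's limit, bind the material only when it changes, then draw all instances.

```cpp
#include <algorithm>
#include <vector>

// Hypothetical types used only to show the control flow.
struct Light    { float distanceToCamera = 0.0f; };
struct Material {
    int id = 0;
    int maxLights = 4;        // maximum number of lights the material supports
    bool hasLighting = true;
};
struct Geometry { int instanceCount = 1; };
struct Renderable {
    const Material* material = nullptr;
    const Geometry* geometry = nullptr;
};
struct Camera { /* view, projection, render target, ... */ };

// Hypothetical helpers standing in for the real renderer calls.
bool isVisible(const Camera&, const Renderable&)  { return true; }
void bindRenderTarget(const Camera&)              {}
void bindMaterial(const Material&)                {}
void applyPerObjectProperties(const Renderable&)  {}
void drawGeometry(const Geometry&)                {}

// Pick the lights closest to the camera, up to the material's limit.
std::vector<Light> closestLights(std::vector<Light> lights, int maxLights) {
    std::sort(lights.begin(), lights.end(),
              [](const Light& a, const Light& b) {
                  return a.distanceToCamera < b.distanceToCamera;
              });
    if ((int)lights.size() > maxLights) lights.resize(maxLights);
    return lights;
}

void renderCamera(const Camera& camera,
                  const std::vector<Renderable>& sortedQueue,
                  const std::vector<Light>& allLights) {
    bindRenderTarget(camera);   // render texture target or the display target
    const Material* lastMaterial = nullptr;

    for (const Renderable& object : sortedQueue) {
        if (!isVisible(camera, object)) continue;   // frustum culling, etc.

        if (object.material->hasLighting) {
            std::vector<Light> lights =
                closestLights(allLights, object.material->maxLights);
            (void)lights;   // would be bound as per-object light data
        }
        if (object.material != lastMaterial) {      // bind only on change
            bindMaterial(*object.material);
            lastMaterial = object.material;
        }
        applyPerObjectProperties(object);
        drawGeometry(*object.geometry);             // draws all instances
    }
}
```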
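Finally, the resolve, filter, and composite steps could be expressed along these lines. Every name here (`resolveMSAA`, `ImageNode`, `compositeToDisplay`) is a hypothetical placeholder rather than the Loki3D API, and the filter nodes are simply chained for brevity: the resolved render texture feeds any image node that has no input of its own, and display cameras composite the result onto the display output.

```cpp
#include <vector>

// Hypothetical placeholders for render textures and image-processing nodes.
struct RenderTexture { bool msaa = true; };

struct ImageNode {
    // Image nodes without their own input receive the resolved camera image.
    RenderTexture process(const RenderTexture& in) { return in; }
};

RenderTexture resolveMSAA(const RenderTexture& rt) {
    RenderTexture resolved = rt;
    resolved.msaa = false;      // single-sample copy of the camera target
    return resolved;
}

void compositeToDisplay(const RenderTexture&) { /* blit onto the display */ }

void finishCamera(const RenderTexture& cameraTarget,
                  std::vector<ImageNode>& filterNodes,
                  bool isDisplayCamera) {
    // Resolve the camera render texture's MSAA.
    RenderTexture image = resolveMSAA(cameraTarget);

    // Compute the image nodes connected to the camera's Filter socket,
    // feeding the resolved image through the chain.
    for (ImageNode& node : filterNodes) {
        image = node.process(image);
    }

    // Display cameras composite their result onto the display output.
    if (isDisplayCamera) {
        compositeToDisplay(image);
    }
}
```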