Graphics

Visibility

A game never needs to draw every object in a scene at once. Whole parts of the scene (polygons, meshes or groups of objects) can thus be "culled" (i.e. killed, not drawn).

An object (polygon, mesh) can be culled for several reasons, including the following:

  • it's outside of the camera's field of view (view frustum culling; a conventional test is sketched after this list)
  • it's within the camera's field of view, but it's hidden by other objects (occlusion culling)
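
As a point of reference, a conventional view frustum test might look like the following. This is a minimal sketch in C, not Oni's actual code, and the Vec3, Plane and Sphere types are illustrative assumptions: a bounding sphere is invisible as soon as it lies entirely behind any one of the six frustum planes.

 /* Minimal view frustum culling sketch (illustrative, not Oni's code).
    Plane normals are assumed to point inward, toward the frustum interior. */
 typedef struct { float x, y, z; } Vec3;
 typedef struct { Vec3 n; float d; } Plane;       /* plane: dot(n, p) + d = 0 */
 typedef struct { Vec3 center; float radius; } Sphere;

 static float dot(Vec3 a, Vec3 b) { return a.x*b.x + a.y*b.y + a.z*b.z; }

 /* Returns 1 if the sphere lies completely outside the frustum (cullable). */
 int sphere_outside_frustum(const Sphere* s, const Plane frustum[6])
 {
     for (int i = 0; i < 6; ++i) {
         /* Signed distance from the sphere's center to plane i; if the
            whole sphere is on the outer side of any plane, it's invisible. */
         if (dot(frustum[i].n, s->center) + frustum[i].d < -s->radius)
             return 1;
     }
     return 0;  /* inside or intersecting: must be drawn (or tested further) */
 }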

Culling speeds up the drawing process, but detecting at runtime which objects can be culled also takes time. One generally adopts a hierarchical subdivision of the scene, whereby the children (subparts) of a large object are automatically culled if their parent (the large object) is found to be hidden or out of sight, as sketched below.
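
Building on the frustum test above, a hierarchical culling pass could look like this sketch (again illustrative, not Oni's code; Node and draw_mesh() are hypothetical, and Sphere, Plane and sphere_outside_frustum() are reused from the previous sketch). A single failed test on a parent's bounding volume skips its whole subtree.

 /* Hierarchical culling sketch: each node's bounding sphere encloses all of
    its children, so culling the parent culls the subtree in one test. */
 typedef struct Node {
     Sphere bounds;              /* encloses this node and all its children */
     struct Node** children;
     int child_count;
 } Node;

 void draw_mesh(const Node* node);   /* hypothetical renderer entry point */

 void draw_hierarchy(const Node* node, const Plane frustum[6])
 {
     if (sphere_outside_frustum(&node->bounds, frustum))
         return;                     /* parent invisible: skip whole subtree */
     if (node->child_count == 0) {
         draw_mesh(node);            /* leaf node: actual geometry */
         return;
     }
     for (int i = 0; i < node->child_count; ++i)
         draw_hierarchy(node->children[i], frustum);
 }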

Oni's solution: Rays

[Screenshot of a ray-casting failure. If you've ever thought that you briefly saw flashes of color while turning the camera, now you know why.]

It's an original algorithm that achieves both view frustum and occlusion culling. An overview of the algorithm is presented in the following paper:
An Algorithm for Hidden Surface Complexity Reduction and Collision Detection Based On Oct Trees (figures 1 and 2 are mixed up) by Brent H. Pease (coder for Oni)

Just a sidenote: the ray-casting engine that handles occlusion/frustum culling is not to be confused with "ray tracing", a CG rendering method in which a ray is cast for every rendered pixel, achieving a high degree of photorealism. That method is very CPU-intensive and is typically used only for still images, short video sequences or CG movies such as Pixar's Cars. Implementations of real-time ray tracing for gaming are underway, but they cannot yet compete with the standard scanline rendering and rasterisation approaches.

Pease acknowledges that even occlusion testing was too CPU-intensive until the team lowered the number of rays being emitted and started varying the rays' emission angles from frame to frame, even when the camera remained still, in order to catch polygons that earlier emissions had missed. This is why distant polygons, when suddenly revealed by the camera, are sometimes culled at first and only appear a moment later: the rays didn't hit them on the first pass. More generally, a distant object can be culled if it's only visible through a narrow slit, because the rays have a low probability of passing through the slit and hitting the objects behind it.
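
The scheme Pease describes can be approximated as in the sketch below. Everything here is an assumption made for illustration: cast_ray() stands in for the engine's octree ray-cast, and the constants (64 rays per frame, an 8-frame grace period) are invented rather than Oni's actual values. The key idea is that ray directions are jittered every frame, so polygons missed on one pass are usually caught on a later one, which is also where the momentary popping described above comes from.

 #include <stdlib.h>

 #define RAYS_PER_FRAME 64   /* invented budget; few rays keep the cost low */
 #define VISIBLE_GRACE   8   /* frames a hit polygon stays marked visible */

 typedef struct { float x, y, z; } Vec3;

 /* Hypothetical octree ray-cast: index of the first polygon hit, or -1. */
 extern int cast_ray(Vec3 origin, Vec3 dir);
 extern Vec3 camera_pos, camera_dir;

 static float jitter(float spread)   /* uniform in [-spread, +spread] */
 {
     return spread * (2.0f * rand() / (float)RAND_MAX - 1.0f);
 }

 void update_visibility(int frame, int visible_until[])
 {
     for (int i = 0; i < RAYS_PER_FRAME; ++i) {
         /* Perturb the ray direction every frame, even when the camera is
            still, so later frames can hit polygons earlier frames missed. */
         Vec3 d = camera_dir;
         d.x += jitter(0.5f);
         d.y += jitter(0.5f);
         d.z += jitter(0.5f);
         int hit = cast_ray(camera_pos, d);
         if (hit >= 0)
             visible_until[hit] = frame + VISIBLE_GRACE;
     }
     /* Elsewhere: draw polygon p this frame iff visible_until[p] > frame. */
 }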


Environment mapping

Environment mapping is a technique for simulating reflectivity: the diffuse texture is mapped using regular texture coordinates (UVs), and an additional texture layer is applied whose UVs are generated from the vertex normals and the orientation of the object in space (and, ideally, the orientation of the camera). While env mapping is the most advanced shader in Oni, it's about the worst implementation of env mapping there ever was: the projection is planar, and the camera angle is not taken into account at all.
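
For illustration, a planar, camera-independent projection of the kind described above might look like the sketch below; the matrix convention and the remap from [-1,1] to [0,1] are assumptions, not Oni's actual code. Because the camera never enters the computation, the "reflection" stays glued to the object instead of shifting with the viewpoint.

 typedef struct { float x, y, z; } Vec3;
 typedef struct { float u, v; } UV;

 /* Planar env-map UV sketch: m is a row-major 3x3 rotation taking
    model-space normals to world space (a hypothetical representation). */
 UV envmap_uv(Vec3 n, const float m[3][3])
 {
     /* Rotate the vertex normal by the object's orientation. */
     Vec3 w = {
         m[0][0]*n.x + m[0][1]*n.y + m[0][2]*n.z,
         m[1][0]*n.x + m[1][1]*n.y + m[1][2]*n.z,
         m[2][0]*n.x + m[2][1]*n.y + m[2][2]*n.z
     };
     /* Planar mapping: drop the depth axis, remap [-1,1] to [0,1]. */
     UV uv = { w.x * 0.5f + 0.5f, w.y * 0.5f + 0.5f };
     return uv;
 }

A view-dependent variant (classic sphere mapping, for instance) would instead derive the UVs from the view-space normal or reflection vector, which is precisely the step Oni skips.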