==Environment mapping==
:Wikipedia: [[wp:Reflection mapping|Environment mapping]]
It's a technique to simulate reflectivity: the diffuse texture is mapped using regular texture coordinates (UVs), and an additional texture layer has its UVs generated from the vertex normals and the orientation of the object in space (and, ideally, the orientation of the camera). While env mapping is the most advanced shader in Oni, it's about the worst implementation of env mapping there ever was (the projection is planar and the camera angle is not taken into account at all).
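To make the idea concrete, here is a minimal C sketch of a planar environment-map projection of the kind described above: the UVs come from the world-space vertex normal alone, with the object's orientation applied but the camera ignored. The types and function names are purely illustrative, not Oni's actual code.

<syntaxhighlight lang="c">
#include <stdio.h>

/* Sketch of planar environment-map UV generation.
 * Assumption: the env-map UVs are taken straight from the rotated normal's
 * X/Y components, with no camera term at all (the "planar projection"
 * described in the text). Illustrative only. */

typedef struct { float x, y, z; } Vec3;

/* Rotate a vector by a row-major 3x3 matrix (the object's orientation). */
static Vec3 rotate(const float m[3][3], Vec3 v)
{
    Vec3 r;
    r.x = m[0][0]*v.x + m[0][1]*v.y + m[0][2]*v.z;
    r.y = m[1][0]*v.x + m[1][1]*v.y + m[1][2]*v.z;
    r.z = m[2][0]*v.x + m[2][1]*v.y + m[2][2]*v.z;
    return r;
}

/* Planar env-map UVs: remap the world-space normal's X/Y from [-1,1] to [0,1]. */
static void env_uv(const float objRot[3][3], Vec3 normal, float *u, float *v)
{
    Vec3 n = rotate(objRot, normal);   /* object orientation only; no camera */
    *u = n.x * 0.5f + 0.5f;
    *v = n.y * 0.5f + 0.5f;
}

int main(void)
{
    const float identity[3][3] = {{1,0,0},{0,1,0},{0,0,1}};
    float u, v;
    Vec3 n = {0.0f, 0.0f, 1.0f};       /* normal facing the viewer */
    env_uv(identity, n, &u, &v);
    printf("u=%.2f v=%.2f\n", u, v);   /* center of the environment texture */
    return 0;
}
</syntaxhighlight>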


==Fog==
:See [[BSL:Frustum_and_fog]] on how to alter fog using [[BSL]].
:''In mid-to-late 1990s games, when processing power was not enough to render far viewing distances, clipping was employed. However, the effect could be very distracting since bits and pieces of [objects] would flicker in and out of view instantly; by applying a medium-ranged fog, the clipped polygons would fade in more realistically from the haze.'' — [[wp:Distance fog|"Distance fog"]], Wikipedia
:For more technical information on fog and on frustum-based (clip) space, see [https://docs.microsoft.com/en-us/previous-versions//ms537113(v=vs.85)?redirectedfrom=MSDN here], [https://docs.microsoft.com/en-us/windows/win32/direct3d9/pixel-fog?redirectedfrom=MSDN here] and [https://web.archive.org/web/20130520191527/http://cs.fit.edu/~wds/classes/graphics/PTOC/ptoc/ elsewhere].
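As a rough illustration of how distance fog hides the far clip plane, here is a minimal C sketch of the classic linear pixel-fog formula from the Direct3D documentation linked above; the parameter names are illustrative, and Oni's own fog settings are controlled through [[BSL]].

<syntaxhighlight lang="c">
#include <stdio.h>

/* Linear distance fog, following the classic formula
 *     f = (fogEnd - depth) / (fogEnd - fogStart), clamped to [0,1]
 * f = 1 means no fog, f = 0 means fully fogged. Illustrative values only. */

static float clamp01(float x) { return x < 0.0f ? 0.0f : (x > 1.0f ? 1.0f : x); }

static float fog_factor(float depth, float fogStart, float fogEnd)
{
    return clamp01((fogEnd - depth) / (fogEnd - fogStart));
}

/* Blend one color channel toward the fog color. */
static float apply_fog(float surface, float fogColor, float f)
{
    return f * surface + (1.0f - f) * fogColor;
}

int main(void)
{
    float depths[] = { 10.0f, 300.0f, 900.0f };
    for (int i = 0; i < 3; i++) {
        float f = fog_factor(depths[i], 200.0f, 1000.0f);
        printf("depth %6.1f -> fog factor %.2f, red channel %.2f\n",
               depths[i], f, apply_fog(1.0f, 0.5f, f));
    }
    return 0;
}
</syntaxhighlight>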


==Frustum==
:See [[BSL:Frustum_and_fog|Frustum and fog]] on how to alter the far clip plane and field-of-view (FOV) using [[BSL]].
:''In 3D computer graphics, the viewing frustum or view frustum is the region of space in the modeled world that may appear on the screen; it is the field of view of the notional camera. The exact shape of this region varies depending on what kind of camera lens is being simulated, but typically it is a frustum of a rectangular pyramid (hence the name). The planes that cut the frustum perpendicular to the viewing direction are called the near plane and the far plane. Objects closer to the camera than the near plane or beyond the far plane are not drawn.'' — [[wp:Viewing frustum|"Viewing frustum"]], Wikipedia
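The following minimal C sketch tests whether a camera-space point lies inside a symmetric perspective frustum defined by a vertical FOV, an aspect ratio and near/far clip distances. The conventions used here (camera looking down +Z, symmetric pyramid) are assumptions for illustration, not Oni's actual math.

<syntaxhighlight lang="c">
#include <math.h>
#include <stdio.h>
#include <stdbool.h>

#define PI 3.14159265358979f

/* Point-in-frustum test in camera space: the point must lie between the near
 * and far planes, and within the pyramid spanned by the FOV at its depth. */
static bool point_in_frustum(Vec3_unused, ) /* see definition below */;

typedef struct { float x, y, z; } Vec3;

static bool point_in_frustum(Vec3 p, float vfovDeg, float aspect,
                             float nearZ, float farZ)
{
    if (p.z < nearZ || p.z > farZ)                 /* outside near/far planes */
        return false;
    float halfH = p.z * tanf(vfovDeg * 0.5f * PI / 180.0f);
    float halfW = halfH * aspect;                  /* horizontal extent at depth z */
    return fabsf(p.x) <= halfW && fabsf(p.y) <= halfH;
}

int main(void)
{
    Vec3 inside  = { 0.0f, 1.0f, 10.0f };
    Vec3 outside = { 0.0f, 0.0f, 2000.0f };        /* beyond the far plane */
    printf("%d %d\n",
           point_in_frustum(inside,  45.0f, 16.0f/9.0f, 1.0f, 1000.0f),
           point_in_frustum(outside, 45.0f, 16.0f/9.0f, 1.0f, 1000.0f));
    return 0;
}
</syntaxhighlight>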


;Field of view (FOV)


;Ray-casting
:Wikipedia: [[wp:Ray casting|Ray casting]]
[[Image:Ray-casting failure.jpg|thumb|right|If you've ever thought that you briefly saw flashes of color while turning the camera, now you know why. Wherever a polygon is not rendered, the level's skybox will be visible.]]


It's an original algorithm that achieves both view-frustum and occlusion culling. During each frame, 16-20 rays (depending on graphics quality) are shot from the camera into the environment, distributed randomly within the frustum. Along the path of each ray, the engine first determines the relevant leaf node of the oct tree (either by navigating the oct tree or by using the information on the neighbors of the previously traversed leaf node), and then the ray is tested for collision with the non-transparent environment quads intersecting that node (if the ray is stopped by a quad, its further path is ignored). Quads hit in this way are drawn. An overview of the algorithm is presented in the paper [http://oni.bungie.org/archives/brent_gdc00.html An Algorithm for Hidden Surface Complexity Reduction and Collision Detection Based On Oct Trees] by [[Credits|Brent H. Pease]] (note that figures 1 and 2 are mixed up).
:Just a side note: the ray-casting feature which handles occlusion/frustum culling is not to be confused with the common term "[[wp:Ray tracing (graphics)|ray-tracing]]", a CG rendering method where a ray is cast for every rendered pixel, achieving a high degree of photorealism. That method is very CPU-intensive and is typically used only for still images, short video sequences or CG movies such as Pixar's ''Cars''. Implementations of real-time ray-tracing for gaming are underway, but they still need to be complemented with the standard [[wp:Scanline rendering|scanline rendering]] and [[wp:Rasterisation|rasterisation]] approaches for performance reasons.
Pease acknowledges that even ray-casting was too CPU-intensive until the team lowered the number of rays being emitted and began altering their emission angles even while the camera remained still, in order to catch any polygons that the first batch of rays had missed. This is why distant polygons, when suddenly revealed by the camera, are sometimes culled at first (pictured, right) and only appear a moment later: the rays simply didn't hit them on the first pass. A distant face in the environment can also be accidentally culled if it is only visible through a narrow slit, because the rays have a low probability of passing through the slit and hitting the surfaces behind it. A similar problem occurs when modders experiment with outdoor levels that have uneven ground (natural terrain): many of the ground polygons get culled, making the map look like there are holes everywhere.
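To make the technique concrete, here is a minimal C sketch of the random-ray visibility sampling described above. The oct-tree traversal is replaced by a brute-force loop over camera-facing quads, and all names and data structures are illustrative, not Oni's. Each frame a handful of rays is scattered through the frustum, and the nearest opaque quad hit by each ray is flagged as visible; a fully occluded quad (like the far wall below) is never flagged, which is exactly the behavior that produces the flickering described above when occlusion is only partial.

<syntaxhighlight lang="c">
#include <math.h>
#include <stdlib.h>
#include <stdio.h>
#include <stdbool.h>

#define NUM_RAYS 16

typedef struct { float x, y, z; } Vec3;

typedef struct {
    float z;              /* quad lies in the plane z = const, facing the camera */
    float minX, maxX;
    float minY, maxY;
    bool  opaque;
    bool  visible;        /* set when a ray hits it this frame */
} Quad;

static float frand(float lo, float hi)
{
    return lo + (hi - lo) * ((float)rand() / (float)RAND_MAX);
}

/* Cast one ray from the camera at the origin along dir;
 * mark the nearest opaque quad it hits as visible. */
static void cast_ray(Vec3 dir, Quad *quads, int count)
{
    Quad *nearest = NULL;
    float nearestT = INFINITY;
    if (dir.z <= 0.0f)
        return;                                /* ray points away from the quads */
    for (int i = 0; i < count; i++) {
        if (!quads[i].opaque)
            continue;                          /* only opaque quads stop a ray */
        float t = quads[i].z / dir.z;          /* ray-plane intersection */
        float hx = t * dir.x, hy = t * dir.y;
        if (t > 0.0f && t < nearestT &&
            hx >= quads[i].minX && hx <= quads[i].maxX &&
            hy >= quads[i].minY && hy <= quads[i].maxY) {
            nearest  = &quads[i];
            nearestT = t;
        }
    }
    if (nearest)
        nearest->visible = true;               /* the nearest hit quad gets drawn */
}

int main(void)
{
    Quad quads[] = {
        {  50.0f, -40, 40, -30, 30, true, false },  /* near wall */
        { 200.0f, -80, 80, -60, 60, true, false },  /* far wall, fully hidden behind it */
    };
    /* Scatter a handful of rays randomly inside a roughly 90-degree frustum. */
    for (int r = 0; r < NUM_RAYS; r++) {
        Vec3 dir = { frand(-1.0f, 1.0f), frand(-1.0f, 1.0f), 1.0f };
        cast_ray(dir, quads, 2);
    }
    for (int i = 0; i < 2; i++)
        printf("quad %d visible: %d\n", i, quads[i].visible);
    return 0;
}
</syntaxhighlight>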