High-dynamic-range rendering

High-dynamic-range rendering (HDRR or HDR rendering), also known as high-dynamic-range lighting, is the rendering of computer graphics scenes by using lighting calculations done in high dynamic range (HDR). This allows preservation of details that may be lost due to limiting contrast ratios. Video games and computer-generated movies and special effects benefit from this as it creates more realistic scenes than with more simplistic lighting models.

A comparison of the standard fixed-aperture rendering (left) with the HDR rendering (right) in the video game Half-Life 2: Lost Coast

Graphics processor company Nvidia summarizes the motivation for HDR in three points: bright things can be really bright, dark things can be really dark, and details can be seen in both.[1]

History

The use of high-dynamic-range imaging (HDRI) in computer graphics was introduced by Greg Ward in 1985 with his open-source Radiance rendering and lighting simulation software, which created the first file format to retain a high-dynamic-range image. HDRI languished for more than a decade, held back by limited computing power, storage, and capture methods. Not until recently has the technology to put HDRI into practical use been developed.[2][3]

In 1990, Nakamae et al. presented a lighting model for driving simulators that highlighted the need for high-dynamic-range processing in realistic simulations.[4]

In 1995, Greg Spencer presented Physically-based glare effects for digital images at SIGGRAPH, providing a quantitative model for flare and blooming in the human eye.[5]

In 1997, Paul Debevec presented Recovering high dynamic range radiance maps from photographs[6] at SIGGRAPH, and the following year presented Rendering synthetic objects into real scenes.[7] These two papers laid the framework for creating HDR light probes of a location, and then using this probe to light a rendered scene.

HDRI and HDRL (high-dynamic-range image-based lighting) have, ever since, been used in many situations in 3D scenes in which inserting a 3D object into a real environment requires the light probe data to provide realistic lighting solutions.

In gaming applications, Riven: The Sequel to Myst in 1997 used an HDRI postprocessing shader directly based on Spencer's paper.[8] After E3 2003, Valve released a demo movie of their Source engine rendering a cityscape in a high dynamic range.[9] The term was not commonly used again until E3 2004, where it gained much more attention when Epic Games showcased Unreal Engine 3 and Valve announced Half-Life 2: Lost Coast in 2005, coupled with open-source engines such as OGRE 3D and open-source games like Nexuiz.

Examples

One of the primary advantages of HDR rendering is that details in a scene with a large contrast ratio are preserved. Without HDR, areas that are too dark are clipped to black and areas that are too bright are clipped to white. These are represented by the hardware as floating-point values of 0.0 and 1.0 for pure black and pure white, respectively.

Another aspect of HDR rendering is the addition of perceptual cues which increase apparent brightness. HDR rendering also affects how light is preserved in optical phenomena such as reflections and refractions, as well as transparent materials such as glass. In LDR rendering, very bright light sources in a scene (such as the sun) are capped at 1.0. When this light is reflected, the result must then be less than or equal to 1.0. However, in HDR rendering, very bright light sources can exceed the 1.0 brightness to simulate their actual values. This allows reflections off surfaces to maintain realistic brightness for bright light sources.
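
The difference can be seen in a minimal C sketch, assuming a single scalar intensity per pixel; the sun intensity and surface reflectance below are arbitrary illustrative values, not taken from any particular engine:

```c
#include <stdio.h>

/* Clamp to the displayable [0, 1] range, as an LDR pipeline does at
   every stage of the lighting calculation. */
static float clamp01(float x) {
    if (x < 0.0f) return 0.0f;
    if (x > 1.0f) return 1.0f;
    return x;
}

int main(void) {
    float sun = 5.0f;          /* hypothetical sun intensity, well above white */
    float reflectance = 0.5f;  /* hypothetical mirror-like surface */

    /* LDR: the source is clipped to 1.0 before shading, so its
       reflection is no brighter than that of a dim lamp. */
    float ldr = clamp01(clamp01(sun) * reflectance);

    /* HDR: the full value survives the lighting calculation and is
       reduced to display range only at the end, by tone mapping. */
    float hdr = sun * reflectance;

    printf("LDR reflection: %.2f\n", ldr);  /* 0.50 */
    printf("HDR reflection: %.2f\n", hdr);  /* 2.50, still above white */
    return 0;
}
```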

Limitations and compensations

Human eye

The human eye can perceive scenes with a very high dynamic contrast ratio, around 1,000,000:1. Adaptation is achieved in part through adjustments of the iris and slow chemical changes, which take some time (e.g. the delay in being able to see when switching from bright lighting to pitch darkness). At any given time, the eye's static range is smaller, around 10,000:1. However, this is still higher than the static range of most display technology.[citation needed]

Output to displays

Although many manufacturers claim very high numbers, plasma displays, liquid-crystal displays, and CRT displays can deliver only a fraction of the contrast ratio found in the real world, and these are usually measured under ideal conditions.[citation needed] The simultaneous contrast of real content under normal viewing conditions is significantly lower.

Some increase in dynamic range in LCD monitors can be achieved by automatically reducing the backlight for dark scenes. For example, LG calls this technology "Digital Fine Contrast";[10] Samsung describes it as "dynamic contrast ratio". Another technique is to have an array of brighter and darker LED backlights, for example with systems developed by BrightSide Technologies.[11]

OLED displays have better dynamic range capabilities than LCDs, similar to plasma but with lower power consumption. Rec. 709 defines the color space for HDTV, and Rec. 2020 defines a larger but still incomplete color space for ultra-high-definition television.

Light bloom

Light blooming is the result of scattering in the human lens, which the human brain interprets as a bright spot in a scene. For example, a bright light in the background will appear to bleed over onto objects in the foreground. This can be used to create an illusion to make the bright spot appear to be brighter than it really is.[5]
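
In a renderer, bloom is commonly approximated as a post-process: pixels above a brightness threshold are extracted, blurred, and added back onto the image. The following C sketch shows the idea on a single hypothetical scanline; the threshold, blur radius, and pixel values are arbitrary choices for illustration:

```c
#include <stdio.h>

#define W 8
#define THRESHOLD 1.0f  /* arbitrary bright-pass cutoff for this sketch */
#define RADIUS 1        /* arbitrary blur radius */

int main(void) {
    /* A hypothetical scanline of HDR intensities; 4.0 is a bright light. */
    float hdr[W]    = {0.2f, 0.2f, 0.2f, 4.0f, 0.2f, 0.2f, 0.2f, 0.2f};
    float bright[W] = {0};
    float bloom[W]  = {0};

    /* Bright pass: keep only the energy above the threshold. */
    for (int i = 0; i < W; i++)
        bright[i] = hdr[i] > THRESHOLD ? hdr[i] - THRESHOLD : 0.0f;

    /* Box blur: spread that energy onto neighbouring pixels. */
    for (int i = 0; i < W; i++) {
        float sum = 0.0f;
        int n = 0;
        for (int j = i - RADIUS; j <= i + RADIUS; j++)
            if (j >= 0 && j < W) { sum += bright[j]; n++; }
        bloom[i] = sum / n;
    }

    /* Composite: the blurred highlight "bleeds" onto its neighbours. */
    for (int i = 0; i < W; i++)
        printf("pixel %d: %.2f -> %.2f\n", i, hdr[i], hdr[i] + bloom[i]);
    return 0;
}
```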

Flare

Flare is the diffraction of light in the human lens, resulting in "rays" of light emanating from small light sources, and can also result in some chromatic effects. It is most visible on point light sources because of their small visual angle.[5]

Typical display devices cannot display light as bright as the Sun, and ambient room lighting prevents them from displaying true black. Thus HDR rendering systems have to map the full dynamic range of what the eye would see in the rendered situation onto the capabilities of the device. This tone mapping is done relative to what the virtual scene camera sees, combined with several full-screen effects, e.g. to simulate dust in the air which is lit by direct sunlight in a dark cavern, or the scattering in the eye.
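
Mapping "relative to what the virtual scene camera sees" is commonly implemented as automatic exposure: the renderer measures the average (often logarithmic) luminance of the current frame and scales the image so that this average lands on a mid-grey target before the tone curve is applied. A minimal C sketch of that idea follows; the 0.18 "key" is a common photographic convention, and the pixel values and helper name are illustrative assumptions:

```c
#include <math.h>
#include <stdio.h>

/* Geometric-mean luminance of a frame, a common basis for auto-exposure.
   A small epsilon avoids log(0) on pure-black pixels. */
static float average_log_luminance(const float *lum, int n) {
    float sum = 0.0f;
    for (int i = 0; i < n; i++)
        sum += logf(1e-4f + lum[i]);
    return expf(sum / n);
}

int main(void) {
    /* Hypothetical luminances of a dark cavern with one shaft of sunlight. */
    float lum[6] = {0.01f, 0.02f, 0.05f, 0.03f, 8.0f, 0.02f};
    int n = 6;

    float avg = average_log_luminance(lum, n);
    float key = 0.18f;           /* conventional mid-grey target */
    float exposure = key / avg;  /* scale so the average maps to the key */

    for (int i = 0; i < n; i++)
        printf("pixel %d: %.3f -> exposed %.3f\n", i, lum[i], lum[i] * exposure);
    return 0;
}
```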

Tone mapping and blooming shaders can be used together to help simulate these effects.

Tone mapping

Tone mapping, in the context of graphics rendering, is a technique used to map colors from high dynamic range (in which lighting calculations are performed) to a lower dynamic range that matches the capabilities of the desired display device. Typically, the mapping is non-linear – it preserves enough range for dark colors and gradually limits the dynamic range for bright colors. This technique often produces visually appealing images with good overall detail and contrast. Various tone mapping operators exist, ranging from simple real-time methods used in computer games to more sophisticated techniques that attempt to imitate the perceptual response of the human visual system.
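
One of the simplest real-time operators of this kind is the global Reinhard curve, which maps luminance L to L / (1 + L): dark values pass through nearly unchanged while arbitrarily bright values compress asymptotically toward white. A minimal C sketch (not the implementation of any particular game or paper):

```c
#include <stdio.h>

/* Global Reinhard operator: maps [0, infinity) luminance into [0, 1).
   Near-linear in the shadows, asymptotic toward white in the highlights. */
static float reinhard(float lum) {
    return lum / (1.0f + lum);
}

int main(void) {
    float samples[] = {0.05f, 0.5f, 1.0f, 4.0f, 100.0f};
    for (int i = 0; i < 5; i++)
        printf("%8.2f -> %.3f\n", samples[i], reinhard(samples[i]));
    return 0;
}
```

More sophisticated operators add a white point, per-channel curves, or local adaptation, but they share this overall shape.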

Applications in computer entertainment

HDRR has been prevalent in games, primarily those for PCs, Microsoft's Xbox 360, and Sony's PlayStation 3. It has also been simulated on the PlayStation 2, GameCube, Xbox and Amiga systems. Sproing Interactive Media has announced that their new Athena game engine for the Wii will support HDRR, adding the Wii to the list of systems that support it.

In desktop publishing and gaming, color values are often processed several times over. As this includes multiplication and division (which can accumulate rounding errors), it is useful to have the extended accuracy and range of 16-bit integer or 16-bit floating-point formats. This is useful irrespective of the aforementioned limitations in some hardware.
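
How those rounding errors accumulate can be seen in a small C sketch: a single channel is darkened by 10% for ten passes, stored in an 8-bit integer between passes, and then brightened back. The factor and pass count are arbitrary; the point is that the integer path drifts while the floating-point path restores the original value:

```c
#include <math.h>
#include <stdio.h>

int main(void) {
    unsigned char c8 = 200;  /* 8-bit integer channel, quantized each pass */
    float cf = 200.0f;       /* floating-point channel, full precision kept */

    for (int i = 0; i < 10; i++) {
        c8 = (unsigned char)(c8 * 0.9f);  /* truncated to an integer each pass */
        cf = cf * 0.9f;
    }

    /* Attempt to undo the ten darkening passes. */
    float restore = powf(1.0f / 0.9f, 10.0f);
    printf("8-bit path restored: %.1f (drifted below 200)\n", c8 * restore);
    printf("float path restored: %.1f\n", cf * restore);
    return 0;
}
```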

Development of HDRR through DirectX

Complex shader effects began with the release of Shader Model 1.0 in DirectX 8. Shader Model 1.0 illuminated 3D worlds with what is called standard lighting. Standard lighting, however, had two problems:

  1. Lighting precision was confined to 8-bit integers, which limited the contrast ratio to 256:1. Using the HSV color model, the value (V), or brightness, of a color has a range of 0–255. This means the brightest white (a value of 255) is only 255 levels brighter than the darkest shade above pure black (a value of 1).
  2. Lighting calculations were integer-based, which didn't offer as much accuracy because the real world is not confined to whole numbers; the sketch after this list illustrates the resulting quantization.
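
A minimal C sketch of both limitations, assuming an 8-bit channel and illustrative input values: every intensity is rounded to one of 256 levels, so distinct dim lights collapse onto neighbouring levels, and anything brighter than the reference white simply clips:

```c
#include <stdio.h>

/* Quantize a [0, 1] intensity to an 8-bit level, as integer-based
   lighting pipelines effectively did. */
static unsigned char quantize8(float x) {
    if (x < 0.0f) x = 0.0f;
    if (x > 1.0f) x = 1.0f;  /* anything brighter clips to white */
    return (unsigned char)(x * 255.0f + 0.5f);
}

int main(void) {
    float a = 0.0010f, b = 0.0030f;  /* dim lights differing 3x in reality */
    float sun = 40.0f;               /* 40x brighter than "white" */

    printf("%.4f -> level %u\n", a, quantize8(a));      /* level 0 */
    printf("%.4f -> level %u\n", b, quantize8(b));      /* level 1 */
    printf("%.1f  -> level %u\n", sun, quantize8(sun)); /* clipped to 255 */
    return 0;
}
```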

On December 24, 2002, Microsoft released a new version of DirectX. DirectX 9.0 introduced Shader Model 2.0, which offered one of the necessary components to enable rendering of high-dynamic-range images: lighting precision was not limited to just 8 bits. Although 8 bits was the minimum in applications, programmers could choose up to a maximum of 24 bits for lighting precision. However, all calculations were still integer-based. One of the first graphics cards to support DirectX 9.0 natively was ATI's Radeon 9700, though the effect wasn't programmed into games for years afterwards. On August 23, 2003, Microsoft updated DirectX to DirectX 9.0b, which enabled the Pixel Shader 2.x (Extended) profile for ATI's Radeon X series and NVIDIA's GeForce FX series of graphics processing units.

On August 9, 2004, Microsoft updated DirectX once more to DirectX 9.0c. This also exposed the Shader Model 3.0 profile for High-Level Shader Language (HLSL). Shader Model 3.0's lighting precision has a minimum of 32 bits, as opposed to 2.0's 8-bit minimum, and all lighting-precision calculations are now floating-point based. NVIDIA states that contrast ratios using Shader Model 3.0 can be as high as 65535:1 using 32-bit lighting precision. At first, HDRR was only possible on video cards capable of Shader Model 3.0 effects, but software developers soon added compatibility for Shader Model 2.0. As a side note, when referred to as Shader Model 3.0 HDR, HDRR is really done by FP16 blending. FP16 blending is not part of Shader Model 3.0, but is supported mostly by cards also capable of Shader Model 3.0 (exceptions include the GeForce 6200 series). FP16 blending can be used as a faster way to render HDR in video games.

Shader Model 4.0 is a feature of DirectX 10, which was released with Windows Vista. Shader Model 4.0 allows 128-bit HDR rendering, as opposed to 64-bit HDR in Shader Model 3.0 (although 128-bit rendering is theoretically possible under Shader Model 3.0 as well).

Shader Model 5.0 is a feature of DirectX 11. It allows 6:1 compression of HDR textures without the noticeable loss that was prevalent in the HDR texture compression techniques of previous DirectX versions.

Development of HDRR through OpenGL

It is possible to develop HDRR through GLSL shaders starting from OpenGL 1.4 onwards.
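
On modern OpenGL, the key ingredient is a floating-point render target, so that lighting results above 1.0 survive until a tone-mapping pass. The C sketch below shows one way to create such a target; it assumes an OpenGL 3.0+ context with entry points loaded through GLEW (error handling and cleanup omitted):

```c
#include <GL/glew.h>  /* assumes GLEW has been initialized for a GL 3.0+ context */

/* Create a framebuffer with a 16-bit-float color buffer to render HDR
   lighting into. Shader outputs above 1.0 are preserved here, unlike in
   the default 8-bit framebuffer; a real renderer would also keep the
   texture handle for the later tone-mapping pass. */
GLuint create_hdr_target(int width, int height) {
    GLuint tex, fbo;

    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_2D, tex);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA16F, width, height,
                 0, GL_RGBA, GL_HALF_FLOAT, NULL);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

    glGenFramebuffers(1, &fbo);
    glBindFramebuffer(GL_FRAMEBUFFER, fbo);
    glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                           GL_TEXTURE_2D, tex, 0);

    return fbo;  /* render the scene here, then tone-map tex to the screen */
}
```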

Game engines that support HDR rendering

See also

References

  1. ^ Simon Green and Cem Cebenoyan (2004). "High Dynamic Range Rendering (on the GeForce 6800)" (PDF). GeForce 6 Series. nVidia. p. 3.
  2. ^ Reinhard, Erik; Greg Ward; Sumanta Pattanaik; Paul Debevec (August 2005). High Dynamic Range Imaging: Acquisition, Display, and Image-Based Lighting. Westport, Connecticut: Morgan Kaufmann. ISBN 978-0-12-585263-0.
  3. ^ Greg Ward. "High Dynamic Range Imaging" (PDF). anywhere.com. Retrieved 18 August 2009.
  4. ^ Nakamae, Eihachiro; Kaneda, Kazufumi; Okamoto, Takashi; Nishita, Tomoyuki (1990). "A lighting model aiming at drive simulators". Proceedings of the 17th annual conference on Computer graphics and interactive techniques. pp. 395–404. doi:10.1145/97879.97922. ISBN 978-0201509335. S2CID 11880939.
  5. ^ a b c Spencer, Greg; Shirley, Peter; Zimmerman, Kurt; Greenberg, Donald P. (1995). "Physically-based glare effects for digital images". Proceedings of the 22nd annual conference on Computer graphics and interactive techniques - SIGGRAPH '95. p. 325. CiteSeerX 10.1.1.41.1625. doi:10.1145/218380.218466. ISBN 978-0897917018. S2CID 17643910.
  6. ^ Paul E. Debevec and Jitendra Malik (1997). "Recovering high dynamic range radiance maps from photographs". Proceedings of the 24th annual conference on Computer graphics and interactive techniques - SIGGRAPH '97. pp. 369–378. doi:10.1145/258734.258884. ISBN 0897918967.
  7. ^ Paul E. Debevec (1998). "Rendering synthetic objects into real scenes: Bridging traditional and image-based graphics with global illumination and high dynamic range photography". Proceedings of the 25th annual conference on Computer graphics and interactive techniques - SIGGRAPH '98. pp. 189–198. doi:10.1145/280814.280864. ISBN 0897919998.
  8. ^ Forcade, Tim (February 1998). "Unraveling Riven". Computer Graphics World.
  9. ^ Valve (2003). "Half-Life 2: Source DirectX 9.0 Effects Trailer (2003)". YouTube. Archived from the original on 2021-12-21.
  10. ^ Digital Fine Contrast
  11. ^ BrightSide Technologies is now part of Dolby - Archived 2007-09-10 at the Wayback Machine
  12. ^ "Rendering – Features – Unreal Technology". Epic Games. 2006. Archived from the original on 2011-03-07. Retrieved 2011-03-15.
  13. ^ "SOURCE – RENDERING SYSTEM". Valve. 2007. Archived from the original on 2011-03-23. Retrieved 2011-03-15.
  14. ^ "The Amazing Technology of The Witcher 3". PC Gamer. 2015. Retrieved 2016-05-08.
  15. ^ "FarCry 1.3: Crytek's Last Play Brings HDR and 3Dc for the First Time". X-bit Labs. 2004. Archived from the original on 2008-07-24. Retrieved 2011-03-15.
  16. ^ "CryEngine 2 – Overview". CryTek. 2011. Retrieved 2011-03-15.
  17. ^ Pereira, Chris (December 3, 2016). "Kojima Partnering With Killzone, Horizon Dev Guerrilla for Death Stranding". GameSpot. CBS Interactive. Archived from the original on December 4, 2019. Retrieved December 3, 2016.
  18. ^ "Unigine Engine – Unigine (advanced 3D engine for multi-platform games and virtual reality systems)". Unigine Corp. 2011. Retrieved 2011-03-15.
  19. ^ "BabylonDoc". Archived from the original on 2015-07-04. Retrieved 2015-07-03.
  20. ^ "MIT Licensed Open Source version of Torque 3D from GarageGames: GarageGames/Torque3D". GitHub. 2019-08-22.