Shader

An example of two kinds of shading: flat shading on the left and Phong shading on the right. Phong shading is an improvement on Gouraud shading and was one of the first computer shading models developed after the basic flat shader, greatly enhancing the appearance of curved surfaces in renders. Shaders are most commonly used to produce lit and shadowed areas in the rendering of 3D models.
Another use of shaders is for special effects, even on 2D images (e.g., a photo from a webcam). The unaltered, unshaded image is on the left, and the same image with a shader applied is on the right. This shader works by replacing all light areas of the image with white and all dark areas with a brightly colored texture.

In computer graphics, a shader is a computer program that calculates the appropriate levels of light, darkness, and color during the rendering of a 3D scene, a process known as shading. Shaders have evolved to perform a variety of specialized functions in computer graphics special effects and video post-processing, as well as general-purpose computing on graphics processing units.

Traditional shaders calculate rendering effects on graphics hardware with a high degree of flexibility. Most shaders are coded for (and run on) a graphics processing unit (GPU),[1] though this is not a strict requirement. Shading languages are used to program the GPU's rendering pipeline, which has mostly superseded the fixed-function pipeline of the past that only allowed for common geometry transformation and pixel-shading functions; with shaders, customized effects can be used. The position and color (hue, saturation, brightness, and contrast) of all pixels, vertices, and/or textures used to construct a final rendered image can be altered using algorithms defined in a shader, and can be modified by external variables or textures introduced by the computer program calling the shader.

Shaders are used widely in cinema post-processing, computer-generated imagery, and video games to produce a range of effects. Beyond simple lighting models, more complex uses of shaders include: altering the hue, saturation, brightness (HSL/HSV) or contrast of an image; producing blur, light bloom, volumetric lighting, normal mapping (for depth effects), bokeh, cel shading, posterization, bump mapping, distortion, chroma keying (for so-called "bluescreen/greenscreen" effects), and edge and motion detection, as well as psychedelic effects such as those seen in the demoscene.

History

The term "shader" was introduced to the public by Pixar with version 3.0 of their RenderMan Interface Specification, originally published in May 1988.[2]

As graphics processing units evolved, major graphics software libraries such as OpenGL and Direct3D began to support shaders. The first shader-capable GPUs only supported pixel shading, but vertex shaders were quickly introduced once developers realized the power of shaders. The first video card with a programmable pixel shader was the Nvidia GeForce 3 (NV20), released in 2001.[3] Geometry shaders were introduced with Direct3D 10 and OpenGL 3.2. Eventually, graphics hardware evolved toward a unified shader model.

Design

Shaders are simple programs that describe the traits of either a vertex or a pixel. Vertex shaders describe the attributes (position, texture coordinates, colors, etc.) of a vertex, while pixel shaders describe the traits (color, z-depth and alpha value) of a pixel. A vertex shader is called for each vertex in a primitive (possibly after tessellation); thus one vertex in, one (updated) vertex out. Each vertex is then rendered as a series of pixels onto a surface (block of memory) that will eventually be sent to the screen.

Shaders replace a section of the graphics hardware typically called the Fixed Function Pipeline (FFP), so-called because it performs lighting and texture mapping in a hard-coded manner. Shaders provide a programmable alternative to this hard-coded approach.[4]

The basic graphics pipeline is as follows:

  • The CPU sends instructions (compiled shading language programs) and geometry data to the graphics processing unit, located on the graphics card.
  • Within the vertex shader, the geometry is transformed.
  • If a geometry shader is present and active, it performs some changes to the geometry in the scene.
  • If a tessellation shader is present and active, the geometry in the scene can be subdivided.
  • The calculated geometry is triangulated (subdivided into triangles).
  • Triangles are broken down into fragment quads (one fragment quad is a 2 × 2 fragment primitive).
  • Fragment quads are modified according to the fragment shader.
  • The depth test is performed; fragments that pass are written to the screen and might be blended into the frame buffer.

The graphics pipeline uses these steps to transform three-dimensional (or two-dimensional) data into useful two-dimensional data for display; in general, this is a large pixel matrix or "frame buffer".

Types

There are three types of shaders in common use (pixel, vertex, and geometry shaders), with several more recently added. While older graphics cards utilize separate processing units for each shader type, newer cards feature unified shaders, which are capable of executing any type of shader. This allows graphics cards to make more efficient use of processing power.

2D shaders

2D shaders act on digital images, also called textures in the field of computer graphics. They modify attributes of pixels. 2D shaders may take part in rendering 3D geometry. Currently the only type of 2D shader is a pixel shader.

Pixel shaders

Pixel shaders, also known as fragment shaders, compute color and other attributes of each "fragment": a unit of rendering work affecting at most a single output pixel. The simplest kinds of pixel shaders output one screen pixel as a color value; more complex shaders with multiple inputs/outputs are also possible.[5] Pixel shaders range from simply always outputting the same color, to applying a lighting value, to doing bump mapping, shadows, specular highlights, translucency and other phenomena. They can alter the depth of the fragment (for Z-buffering), or output more than one color if multiple render targets are active. In 3D graphics, a pixel shader alone cannot produce some kinds of complex effects because it operates only on a single fragment, without knowledge of a scene's geometry (i.e. vertex data). However, pixel shaders do have knowledge of the screen coordinate being drawn, and can sample the screen and nearby pixels if the contents of the entire screen are passed as a texture to the shader. This technique can enable a wide variety of two-dimensional postprocessing effects such as blur, or edge detection/enhancement for cartoon/cel shaders. Pixel shaders may also be applied in intermediate stages to any two-dimensional images (sprites or textures) in the pipeline, whereas vertex shaders always require a 3D scene. For instance, a pixel shader is the only kind of shader that can act as a postprocessor or filter for a video stream after it has been rasterized.
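
A minimal GLSL sketch of the "applying a lighting value" case mentioned above might look as follows; the names vUV, uTexture, and uLightLevel are illustrative, not part of any standard:

    #version 330 core
    in vec2 vUV;                // texture coordinate interpolated by the rasterizer
    out vec4 fragColor;         // color this invocation writes for its fragment

    uniform sampler2D uTexture; // image supplied by the calling program
    uniform float uLightLevel;  // external lighting value set by the calling program

    void main() {
        // Sample the texture at this fragment's coordinate and scale by the light level.
        fragColor = vec4(texture(uTexture, vUV).rgb * uLightLevel, 1.0);
    }

Because the shader body runs independently for every fragment, the same few lines are executed in parallel across all pixels covered by the geometry.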

3D shaders

3D shaders act on 3D models or other geometry but may also access the colors and textures used to draw the model or mesh. Vertex shaders are the oldest type of 3D shader, generally making modifications on a per-vertex basis. Newer geometry shaders can generate new vertices from within the shader. Tessellation shaders are the newest 3D shaders; they act on batches of vertices all at once to add detail, such as subdividing a model into smaller groups of triangles or other primitives at runtime to improve things like curves and bumps, or to change other attributes.

Vertex shaders

Vertex shaders are the most established and common kind of 3D shader and are run once for each vertex given to the graphics processor. The purpose is to transform each vertex's 3D position in virtual space to the 2D coordinate at which it appears on the screen (as well as a depth value for the Z-buffer).[6] Vertex shaders can manipulate properties such as position, color and texture coordinates, but cannot create new vertices. The output of the vertex shader goes to the next stage in the pipeline, which is either a geometry shader if present, or the rasterizer. Vertex shaders can enable powerful control over the details of position, movement, lighting, and color in any scene involving 3D models.
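
A minimal GLSL vertex shader performing this transformation might look as follows; uMVP, aPosition, aUV, and vUV are illustrative names, and the model-view-projection matrix is assumed to be supplied by the calling program:

    #version 330 core
    layout(location = 0) in vec3 aPosition; // vertex position in model space
    layout(location = 1) in vec2 aUV;       // per-vertex texture coordinate

    uniform mat4 uMVP; // combined model-view-projection matrix from the host program

    out vec2 vUV;      // passed on to later pipeline stages

    void main() {
        vUV = aUV;
        // Transform the 3D position into clip space; the hardware then derives
        // the 2D screen coordinate and the depth value for the Z-buffer.
        gl_Position = uMVP * vec4(aPosition, 1.0);
    }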

Geometry shaders

Geometry shaders were introduced in Direct3D 10 and OpenGL 3.2, and were formerly available in OpenGL 2.0+ through extensions.[7] This type of shader can generate new graphics primitives, such as points, lines, and triangles, from the primitives that were sent to the beginning of the graphics pipeline.[8]

Geometry shader programs are executed after vertex shaders. They take as input a whole primitive, possibly with adjacency information. For example, when operating on triangles, the three vertices are the geometry shader's input. The shader can then emit zero or more primitives, which are rasterized and their fragments ultimately passed to a pixel shader.

Typical uses of a geometry shader include point sprite generation, geometry tessellation, shadow volume extrusion, and single-pass rendering to a cube map. A typical real-world example of the benefits of geometry shaders is automatic mesh complexity modification: a series of line strips representing control points for a curve are passed to the geometry shader, and depending on the complexity required, the shader can automatically generate extra lines, each of which provides a better approximation of the curve.
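
The structure described above (one whole primitive in, zero or more primitives out) can be sketched with a minimal pass-through geometry shader in GLSL:

    #version 330 core
    layout(triangles) in;                          // receives one whole triangle
    layout(triangle_strip, max_vertices = 3) out;  // may emit up to three vertices

    void main() {
        // Pass the triangle through unchanged; a real shader could instead
        // emit no primitive at all (culling) or several (amplification).
        for (int i = 0; i < 3; ++i) {
            gl_Position = gl_in[i].gl_Position;
            EmitVertex();
        }
        EndPrimitive();
    }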

Tessellation shaders

As of OpenGL 4.0 and Direct3D 11, a new shader class called a tessellation shader has been added. It adds two stages to the traditional model: tessellation control shaders (also known as hull shaders) and tessellation evaluation shaders (also known as domain shaders), which together allow simpler meshes to be subdivided into finer meshes at run-time according to a mathematical function. The function can be related to a variety of variables, most notably the distance from the viewing camera, to allow active level-of-detail scaling. This allows objects close to the camera to have fine detail, while objects further away can have coarser meshes, yet seem comparable in quality. It can also drastically reduce required mesh bandwidth by allowing meshes to be refined once inside the shader units instead of downsampling very complex ones from memory. Some algorithms can upsample any arbitrary mesh, while others allow for "hinting" in meshes to dictate the most characteristic vertices and edges.
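
The distance-based level-of-detail idea can be sketched in a GLSL tessellation control shader for triangle patches; uCameraPos and the scaling constant 64.0 are illustrative assumptions, not part of any standard:

    #version 400 core
    layout(vertices = 3) out;  // one output control point per input triangle vertex

    uniform vec3 uCameraPos;   // camera position in world space (assumed supplied by the host)

    void main() {
        // Pass each control point through unchanged.
        gl_out[gl_InvocationID].gl_Position = gl_in[gl_InvocationID].gl_Position;

        if (gl_InvocationID == 0) {
            // Request finer subdivision for patches close to the camera.
            float dist  = distance(uCameraPos, gl_in[0].gl_Position.xyz);
            float level = clamp(64.0 / dist, 1.0, 64.0);
            gl_TessLevelInner[0] = level;
            gl_TessLevelOuter[0] = level;
            gl_TessLevelOuter[1] = level;
            gl_TessLevelOuter[2] = level;
        }
    }

The fixed-function tessellator then subdivides each patch at the requested level, and a tessellation evaluation shader positions the vertices it generates.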

Primitive and Mesh shaders

Circa 2017, the AMD Vega microarchitecture added support for a new shader stage, primitive shaders, somewhat akin to compute shaders with access to the data necessary to process geometry.[9][10]

Nvidia introduced mesh and task shaders, which are also modelled after compute shaders, with its Turing microarchitecture in 2018.[11][12] Turing was the world's first GPU microarchitecture to support mesh shading through the DirectX 12 Ultimate API, several months before the Ampere RTX 30 series was released.[13]

In 2020, AMD and Nvidia released the RDNA 2 and Ampere microarchitectures, which both support mesh shading through DirectX 12 Ultimate.[14] These mesh shaders allow the GPU to handle more complex algorithms, offloading more work from the CPU to the GPU, and in algorithm-intense rendering can increase the frame rate or the number of triangles in a scene by an order of magnitude.[15] Intel announced that Intel Arc Alchemist GPUs shipping in Q1 2022 would support mesh shaders.[16]

Ray tracing shaders

Ray tracing shaders are supported by Microsoft via DirectX Raytracing, by the Khronos Group via Vulkan, GLSL, and SPIR-V,[17] and by Apple via Metal.

Tensor shaders

Tensor shaders are supported by Microsoft via DirectML, by the Khronos Group via OpenVX, and by Apple via Metal.

Compute shaders

Compute shaders are not limited to graphics applications, but use the same execution resources for GPGPU. They may be used in graphics pipelines, e.g. for additional stages in animation or lighting algorithms (e.g. tiled forward rendering). Some rendering APIs allow compute shaders to easily share data resources with the graphics pipeline.
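
As a sketch of non-graphics use, the following minimal GLSL compute shader scales an array of numbers in place; the buffer layout and the uScale uniform are illustrative assumptions:

    #version 430 core
    layout(local_size_x = 64) in;  // 64 invocations per work group

    layout(std430, binding = 0) buffer Data {
        float values[];            // arbitrary numeric data, not pixels or vertices
    };

    uniform float uScale;          // factor supplied by the calling program

    void main() {
        uint i = gl_GlobalInvocationID.x;
        if (i < uint(values.length())) {
            // Each invocation scales one element; invocations run in parallel on the GPU.
            values[i] *= uScale;
        }
    }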

Parallel processing

Shaders are written to apply transformations to a large set of elements at a time, for example, to each pixel in an area of the screen, or to every vertex of a model. This is well suited to parallel processing, and most modern GPUs have multiple shader pipelines to facilitate this, vastly improving computation throughput.

A programming model with shaders is similar to a higher-order function for rendering: it takes the shaders as arguments and provides a specific dataflow between intermediate results, enabling both data parallelism (across pixels, vertices, etc.) and pipeline parallelism (between stages); see also MapReduce.

Programming

The language in which shaders are programmed depends on the target environment. The official OpenGL and OpenGL ES shading language is OpenGL Shading Language, also known as GLSL, and the official Direct3D shading language is High Level Shader Language, also known as HLSL. Cg, a third-party shading language which outputs both OpenGL and Direct3D shaders, was developed by Nvidia; however, it has been deprecated since 2012. Apple released its own shading language, called Metal Shading Language, as part of the Metal framework.

GUI shader editors

Modern video game development platforms such as Unity, Unreal Engine and Godot increasingly include node-based editors that can create shaders without the need for actual code; the user is instead presented with a directed graph of connected nodes that directs various textures, maps, and mathematical functions into output values like the diffuse color, the specular color and intensity, roughness/metalness, height, normal, and so on. Automatic compilation then turns the graph into an actual, compiled shader.

References

  1. ^"LearnOpenGL - Shaders".learnopengl.RetrievedNovember 12,2019.
  2. ^"The RenderMan Interface Specification".
  3. ^Lillypublished, Paul (May 19, 2009)."From Voodoo to GeForce: The Awesome History of 3D Graphics".PC Gamer– via pcgamer.
  4. ^"ShaderWorks' update - DirectX Blog".August 13, 2003.
  5. ^"GLSL Tutorial – Fragment Shader".June 9, 2011.
  6. ^"GLSL Tutorial – Vertex Shader".June 9, 2011.
  7. ^Geometry Shader - OpenGL.Retrieved on December 21, 2011.
  8. ^"Pipeline Stages (Direct3D 10) (Windows)".msdn.microsoft.January 6, 2021.
  9. ^"Radeon RX Vega Revealed: AMD promises 4K gaming performance for $499 - Trusted Reviews".July 31, 2017.
  10. ^"The curtain comes up on AMD's Vega architecture".January 5, 2017.
  11. ^"NVIDIA Turing Architecture In-Depth".September 14, 2018.
  12. ^"Introduction to Turing Mesh Shaders".September 17, 2018.
  13. ^"DirectX 12 Ultimate Game Ready Driver Released; Also Includes Support for 9 New G-SYNC Compatible Gaming Monitors".
  14. ^"Announcing DirectX 12 Ultimate".DirectX Developer Blog.March 19, 2020.RetrievedMay 25,2021.
  15. ^"Realistic Lighting in Justice with Mesh Shading".NVIDIA Developer Blog.May 21, 2021.RetrievedMay 25,2021.
  16. ^Smith, Ryan."Intel Architecture Day 2021: A Sneak Peek At The Xe-HPG GPU Architecture".anandtech.
  17. ^"Vulkan Ray Tracing Final Specification Release".Blog.Khronos Group.November 23, 2020.Retrieved2021-02-22.
