Geometry pipelines: Geometric manipulation of modelling primitives, such as that performed by a geometry pipeline, is the first stage in computer graphics systems which perform image generation based on geometric models. While geometry pipelines were originally implemented in software, they have become highly amenable to hardware implementation, particularly since the advent of very-large-scale integration (VLSI) in the early 1980s.
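As a minimal illustration of what one stage of such a pipeline does, the C++ sketch below transforms a single vertex from model space through a combined model-view-projection matrix, performs the perspective divide, and maps the result to pixel coordinates. The `Vec4`/`Mat4` types, the identity matrix and the 640x480 viewport are illustrative assumptions, not any particular system's API.

```cpp
// Sketch of one geometry-pipeline stage: a vertex goes from model space
// to clip space, through the perspective divide, and onto the screen.
#include <array>
#include <cstdio>

struct Vec4 { float x, y, z, w; };
using Mat4 = std::array<std::array<float, 4>, 4>;

Vec4 transform(const Mat4& m, const Vec4& v) {
    return {
        m[0][0]*v.x + m[0][1]*v.y + m[0][2]*v.z + m[0][3]*v.w,
        m[1][0]*v.x + m[1][1]*v.y + m[1][2]*v.z + m[1][3]*v.w,
        m[2][0]*v.x + m[2][1]*v.y + m[2][2]*v.z + m[2][3]*v.w,
        m[3][0]*v.x + m[3][1]*v.y + m[3][2]*v.z + m[3][3]*v.w
    };
}

int main() {
    // Identity "model-view-projection" matrix, for demonstration only.
    Mat4 mvp = {{{1,0,0,0}, {0,1,0,0}, {0,0,1,0}, {0,0,0,1}}};
    Vec4 modelVertex = {0.5f, 0.25f, -1.0f, 1.0f};

    Vec4 clip = transform(mvp, modelVertex);              // model -> clip space
    float ndcX = clip.x / clip.w, ndcY = clip.y / clip.w; // perspective divide

    // Viewport mapping of normalised device coordinates to a 640x480 screen.
    float screenX = (ndcX * 0.5f + 0.5f) * 640.0f;
    float screenY = (1.0f - (ndcY * 0.5f + 0.5f)) * 480.0f;
    std::printf("screen: %.1f, %.1f\n", screenX, screenY);
}
```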
Lightmap: A lightmap is a data structure used in lightmapping, a form of surface caching in which the brightness of surfaces in a virtual scene is pre-calculated and stored in texture maps for later use. Lightmaps are most commonly applied to static objects in applications that use real-time 3D computer graphics, such as video games, in order to provide lighting effects such as global illumination at a relatively low computational cost. John Carmack's Quake was the first computer game to use lightmaps to augment rendering.
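A hedged sketch of how a pre-baked lightmap is applied at render time: the stored brightness for a surface point simply modulates the colour sampled from the object's base texture. The 2x2 lightmap, the texel indices and the colours below are invented for illustration.

```cpp
// Applying a pre-computed lightmap: stored brightness modulates base colour.
#include <cstdio>

struct Color { float r, g, b; };

int main() {
    // A tiny 2x2 lightmap holding pre-computed brightness per texel.
    float lightmap[2][2] = {{0.2f, 0.9f}, {0.5f, 1.0f}};

    Color baseColor = {0.8f, 0.6f, 0.4f};  // colour from the diffuse texture
    int u = 1, v = 0;                      // lightmap texel for this surface point

    float brightness = lightmap[v][u];
    Color lit = {baseColor.r * brightness,
                 baseColor.g * brightness,
                 baseColor.b * brightness};

    std::printf("lit colour: %.2f %.2f %.2f\n", lit.r, lit.g, lit.b);
}
```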
Scene graph: A scene graph is a general data structure commonly used by vector-based graphics editing applications and modern computer games, which arranges the logical and often spatial representation of a graphical scene. It is a collection of nodes in a graph or tree structure. A tree node may have many children but only a single parent, with the effect of a parent applied to all its child nodes; an operation performed on a group automatically propagates its effect to all of its members.
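The C++ sketch below shows a minimal scene graph in that spirit: each node stores a local transform (reduced here to a 2D translation) and its children, and a depth-first traversal composes the parent's transform into every descendant, so moving a group moves all of its members. The `Node` type and the car/wheel hierarchy are illustrative assumptions.

```cpp
// Minimal scene graph: a tree of nodes whose parent transforms
// propagate to all children during traversal.
#include <cstdio>
#include <memory>
#include <string>
#include <vector>

struct Node {
    std::string name;
    float tx = 0, ty = 0;                        // local translation
    std::vector<std::unique_ptr<Node>> children;

    Node* add(std::string childName, float x, float y) {
        auto child = std::make_unique<Node>();
        child->name = std::move(childName);
        child->tx = x;
        child->ty = y;
        children.push_back(std::move(child));
        return children.back().get();
    }
};

// Depth-first traversal: the effect of a parent (its translation)
// is automatically applied to all of its descendants.
void draw(const Node& n, float parentX, float parentY) {
    float worldX = parentX + n.tx, worldY = parentY + n.ty;
    std::printf("%s at (%.1f, %.1f)\n", n.name.c_str(), worldX, worldY);
    for (const auto& child : n.children)
        draw(*child, worldX, worldY);
}

int main() {
    Node root{"car", 10, 0};
    Node* body = root.add("body", 0, 1);
    body->add("wheel_front", 2, -1);
    body->add("wheel_rear", -2, -1);
    draw(root, 0, 0);   // moving "car" moves the body and every wheel with it
}
```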
Shading language: A shading language is a graphics programming language adapted to programming shader effects. Shading languages usually provide special data types such as "vector", "matrix", "color" and "normal". Shading languages used in offline rendering tend to be close to natural language, so that no special knowledge of programming is required. Offline rendering aims to produce maximum-quality images, at the cost of greater time and compute than real-time rendering.
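To give a feel for the kind of computation a shading language expresses, the sketch below writes a Lambertian diffuse shading step in plain C++ rather than an actual shading language; the `Vec3` helpers stand in for the built-in vector and normal types a real shading language would provide, and all input values are made up.

```cpp
// Illustration (in C++, not a shading language) of a per-surface-point
// shader computation: Lambertian diffuse lighting.
#include <cmath>
#include <cstdio>

struct Vec3 { float x, y, z; };

float dot(Vec3 a, Vec3 b) { return a.x*b.x + a.y*b.y + a.z*b.z; }
Vec3 normalize(Vec3 v) {
    float len = std::sqrt(dot(v, v));
    return {v.x/len, v.y/len, v.z/len};
}

int main() {
    Vec3 normal  = normalize({0.0f, 1.0f, 0.2f});  // surface normal
    Vec3 toLight = normalize({0.5f, 1.0f, 0.0f});  // direction to the light
    Vec3 albedo  = {0.7f, 0.3f, 0.2f};             // surface colour

    // Lambert's cosine law: brightness falls off with the angle to the light.
    float diffuse = std::fmax(dot(normal, toLight), 0.0f);
    std::printf("shaded colour: %.2f %.2f %.2f\n",
                albedo.x * diffuse, albedo.y * diffuse, albedo.z * diffuse);
}
```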
Image plane: In 3D computer graphics, the image plane is that plane in the world which is identified with the plane of the display monitor used to view the image that is being rendered. It is also referred to as screen space. If one makes the analogy of taking a photograph to rendering a 3D image, the surface of the film is the image plane. In this case, the viewing transformation is a projection that maps the world onto the image plane. A rectangular region of this plane, called the viewing window or viewport, maps to the monitor.
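A minimal sketch of the viewing transformation described above, assuming a pinhole camera at the origin looking down the negative z-axis: a world-space point is projected onto the image plane by dividing by depth, and the viewing window is then mapped to a hypothetical 640x480 viewport.

```cpp
// Projecting a world-space point onto the image plane, then mapping
// the viewing window to pixel coordinates.
#include <cstdio>

int main() {
    // Camera at the origin looking down -z; image plane at distance d.
    float d = 1.0f;
    float wx = 0.4f, wy = 0.2f, wz = -2.0f;   // world-space point

    // Perspective projection onto the image plane (divide by depth).
    float px = d * wx / -wz;
    float py = d * wy / -wz;

    // Viewing window [-1,1]x[-1,1] on the image plane maps to a 640x480 viewport.
    int screenX = static_cast<int>((px + 1.0f) * 0.5f * 640.0f);
    int screenY = static_cast<int>((1.0f - (py + 1.0f) * 0.5f) * 480.0f);
    std::printf("pixel: (%d, %d)\n", screenX, screenY);
}
```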
Stencil buffer: A stencil buffer is an extra data buffer, in addition to the color buffer and Z-buffer, found on modern graphics hardware. The buffer is per pixel and works on integer values, usually with a depth of one byte per pixel. The Z-buffer and stencil buffer often share the same area in the RAM of the graphics hardware. In the simplest case, the stencil buffer is used to limit the area of rendering (stenciling). More advanced usage of the stencil buffer makes use of the strong connection between the Z-buffer and the stencil buffer in the rendering pipeline.
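The sketch below imitates stenciling in software rather than using a graphics API: a one-byte-per-pixel stencil buffer is first written with a mask, and a second pass draws only where the stencil test passes, limiting the area of rendering. Buffer sizes and the circular mask are arbitrary illustrative choices.

```cpp
// Software imitation of stenciling: only pixels whose stencil value
// passes the test are written to the colour buffer.
#include <cstdint>
#include <cstdio>
#include <vector>

int main() {
    const int W = 16, H = 8;
    std::vector<uint8_t> stencil(W * H, 0);   // one byte per pixel, cleared to 0
    std::vector<char>    color(W * H, '.');   // the "colour buffer"

    // Pass 1: write 1 into the stencil buffer inside a circular region.
    for (int y = 0; y < H; ++y)
        for (int x = 0; x < W; ++x) {
            int dx = x - W / 2, dy = y - H / 2;
            if (dx * dx + dy * dy < 9) stencil[y * W + x] = 1;
        }

    // Pass 2: stencil test; draw only where the stencil value equals 1.
    for (int y = 0; y < H; ++y)
        for (int x = 0; x < W; ++x)
            if (stencil[y * W + x] == 1) color[y * W + x] = '#';

    for (int y = 0; y < H; ++y) {
        std::fwrite(&color[y * W], 1, W, stdout);
        std::putchar('\n');
    }
}
```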
Texture synthesis: Texture synthesis is the process of algorithmically constructing a large digital image from a small digital sample image by taking advantage of its structural content. It is an object of research in computer graphics and is used in many fields, among them 3D computer graphics and the post-production of films. Texture synthesis can be used to fill in holes in images (as in inpainting), create large non-repetitive background images and expand small pictures. Procedural textures are a related technique which may synthesise textures from scratch with no source material.
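As a rough illustration of the idea (not of any particular published algorithm), the sketch below fills a larger canvas by copying randomly chosen patches from a small sample; practical methods additionally match patch borders or grow the texture pixel by pixel.

```cpp
// Naive patch-based texture synthesis: tile a larger canvas with
// randomly chosen patches of a small sample image.
#include <cstdio>
#include <random>
#include <vector>

int main() {
    // 4x4 greyscale "sample" texture (values 0-9 for readability).
    const int SW = 4, SH = 4;
    int sample[SH][SW] = {{1,2,1,2},{3,4,3,4},{1,2,1,2},{3,4,3,4}};

    const int OW = 12, OH = 8, P = 2;                 // output size, patch size
    std::vector<int> out(OW * OH, 0);
    std::mt19937 rng(42);
    std::uniform_int_distribution<int> sx(0, SW - P), sy(0, SH - P);

    // Place patches on a grid; each patch is a random P x P block of the sample.
    for (int by = 0; by < OH; by += P)
        for (int bx = 0; bx < OW; bx += P) {
            int ox = sx(rng), oy = sy(rng);
            for (int y = 0; y < P; ++y)
                for (int x = 0; x < P; ++x)
                    out[(by + y) * OW + (bx + x)] = sample[oy + y][ox + x];
        }

    for (int y = 0; y < OH; ++y) {
        for (int x = 0; x < OW; ++x) std::printf("%d", out[y * OW + x]);
        std::putchar('\n');
    }
}
```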
Vertex (computer graphics): A vertex (plural vertices) in computer graphics is a data structure that describes certain attributes, such as the position of a point in 2D or 3D space, or of multiple points on a surface. 3D models are most often represented as triangulated polyhedra forming a triangle mesh. Non-triangular surfaces can be converted to an array of triangles through tessellation. Attributes from the vertices are typically interpolated across mesh surfaces. The vertices of triangles are associated not only with spatial position but also with other values used to render the object correctly.
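A minimal sketch of a vertex as a data structure, assuming position plus one extra attribute (a colour), and of how per-vertex attributes are interpolated across a triangle with barycentric weights.

```cpp
// A vertex carrying a position and an extra attribute, with the attribute
// interpolated across a triangle using barycentric weights.
#include <cstdio>

struct Vertex {
    float x, y, z;      // position
    float r, g, b;      // an additional attribute, e.g. colour
};

// Interpolate position and colour at barycentric coordinates (w0, w1, w2).
Vertex interpolate(const Vertex& a, const Vertex& b, const Vertex& c,
                   float w0, float w1, float w2) {
    return {
        w0*a.x + w1*b.x + w2*c.x, w0*a.y + w1*b.y + w2*c.y, w0*a.z + w1*b.z + w2*c.z,
        w0*a.r + w1*b.r + w2*c.r, w0*a.g + w1*b.g + w2*c.g, w0*a.b + w1*b.b + w2*c.b
    };
}

int main() {
    Vertex a{0, 0, 0, 1, 0, 0};   // red corner
    Vertex b{1, 0, 0, 0, 1, 0};   // green corner
    Vertex c{0, 1, 0, 0, 0, 1};   // blue corner

    // A point at the triangle's centroid blends the three vertex colours equally.
    Vertex p = interpolate(a, b, c, 1.0f/3, 1.0f/3, 1.0f/3);
    std::printf("centroid colour: %.2f %.2f %.2f\n", p.r, p.g, p.b);
}
```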
Glossary of computer hardware terms: This glossary of computer hardware terms is a list of definitions of terms and concepts related to computer hardware, i.e. the physical and structural components of computers, architectural issues, and peripheral devices.
Shadow mapping: Shadow mapping or shadowing projection is a process by which shadows are added to 3D computer graphics. This concept was introduced by Lance Williams in 1978, in a paper entitled "Casting curved shadows on curved surfaces." Since then, it has been used both in pre-rendered and realtime scenes in many console and PC games. Shadows are created by testing whether a pixel is visible from the light source, by comparing the pixel to a z-buffer or depth image of the light source's view, stored in the form of a texture.
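The sketch below performs the core shadow-map comparison in software, under the assumption that the surface point has already been projected into the light's view: the point is in shadow when its distance to the light exceeds the depth stored in the light's depth image, with a small bias to avoid self-shadowing. All values are illustrative.

```cpp
// Core shadow-map test: compare a point's depth from the light against
// the depth image rendered from the light's point of view.
#include <cstdio>

int main() {
    // A 4x4 depth image from the light's view: each texel stores the
    // distance to the nearest surface the light sees in that direction.
    float shadowMap[4][4] = {
        {5.0f, 5.0f, 2.0f, 2.0f},
        {5.0f, 5.0f, 2.0f, 2.0f},
        {5.0f, 5.0f, 5.0f, 5.0f},
        {5.0f, 5.0f, 5.0f, 5.0f},
    };

    // A surface point already projected into the light's view:
    // it falls on texel (2,1) at distance 4.5 from the light.
    int u = 2, v = 1;
    float distanceToLight = 4.5f;
    const float bias = 0.01f;   // small offset to avoid self-shadowing ("acne")

    bool inShadow = distanceToLight - bias > shadowMap[v][u];
    std::printf("%s\n", inShadow ? "in shadow (occluded at depth 2.0)" : "lit");
}
```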