Texture mapping: Texture mapping is a method for mapping a texture onto a computer-generated graphic or 3D model. Texture here can be high-frequency detail, surface texture, or color. The original technique was pioneered by Edwin Catmull in 1974. Texture mapping originally referred to diffuse mapping, a method that simply mapped pixels from a texture to a 3D surface ("wrapping" the image around the object).
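As an illustrative sketch of diffuse mapping (not part of the original text; the function name, array layout, and nearest-neighbour lookup are assumptions made for this example), a texture lookup maps a surface point's (u, v) coordinates to a texel in a 2D image:

```python
import numpy as np

def sample_texture(texture, u, v):
    """Nearest-neighbour diffuse lookup: map normalized (u, v) coordinates
    in [0, 1] to a texel of an H x W x 3 image array."""
    h, w, _ = texture.shape
    u, v = u % 1.0, v % 1.0          # wrap so the texture tiles
    x = min(int(u * w), w - 1)
    y = min(int(v * h), h - 1)
    return texture[y, x]

# Example: a 2x2 checkerboard "wrapped" onto a surface point by point.
checker = np.array([[[0, 0, 0], [255, 255, 255]],
                    [[255, 255, 255], [0, 0, 0]]], dtype=np.uint8)
print(sample_texture(checker, 0.25, 0.25))  # black texel
print(sample_texture(checker, 0.75, 0.25))  # white texel
```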
Virtual reality: Virtual reality (VR) is a simulated experience that employs pose tracking and 3D near-eye displays to give the user an immersive feel of a virtual world. Applications of virtual reality include entertainment (particularly video games), education (such as medical or military training), and business (such as virtual meetings). Other distinct types of VR-style technology include augmented reality and mixed reality, sometimes referred to as extended reality or XR, although definitions are currently changing due to the nascence of the industry.
Immersion (virtual reality): Immersion into virtual reality (VR) is a perception of being physically present in a non-physical world. The perception is created by surrounding the user of the VR system with images, sound, or other stimuli that provide an engrossing total environment. The name is a metaphoric use of the experience of submersion applied to representation, fiction, or simulation.
Texture filtering: In computer graphics, texture filtering or texture smoothing is the method used to determine the texture color for a texture-mapped pixel, using the colors of nearby texels (pixels of the texture). There are two main categories of texture filtering: magnification filtering and minification filtering. Depending on the situation, texture filtering is either a type of reconstruction filter, where sparse data is interpolated to fill gaps (magnification), or a type of anti-aliasing (AA), where texture samples exist at a higher frequency than the sample frequency needed for texture fill (minification).
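A hedged sketch of the magnification case follows (the function name and coordinate conventions are assumptions for this example): bilinear filtering reconstructs a color between texels by weighting the four nearest samples.

```python
import numpy as np

def bilinear_sample(texture, u, v):
    """Bilinear magnification filter: interpolate the four texels
    surrounding the continuous coordinate (u, v) in [0, 1]."""
    h, w, _ = texture.shape
    x = u * (w - 1)                  # continuous texel-space coordinates
    y = v * (h - 1)
    x0, y0 = int(x), int(y)
    x1, y1 = min(x0 + 1, w - 1), min(y0 + 1, h - 1)
    fx, fy = x - x0, y - y0          # fractional weights
    top = (1 - fx) * texture[y0, x0] + fx * texture[y0, x1]
    bottom = (1 - fx) * texture[y1, x0] + fx * texture[y1, x1]
    return (1 - fy) * top + fy * bottom
```

Minification typically works the other way, averaging many texels per pixel (for example via mipmapping) to avoid aliasing.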
Bump mapping: Bump mapping is a texture mapping technique in computer graphics for simulating bumps and wrinkles on the surface of an object. This is achieved by perturbing the surface normals of the object and using the perturbed normals during lighting calculations. The result is an apparently bumpy surface rather than a smooth one, although the surface of the underlying object is not changed. Bump mapping was introduced by James Blinn in 1978. Normal mapping is the most common variation of bump mapping used.
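As a minimal sketch of the idea (the finite-difference scheme over a height map and the function names are assumptions for this example), the shading normal is perturbed while the geometry stays flat, and the perturbed normal then feeds a standard diffuse lighting term.

```python
import numpy as np

def perturbed_normal(height, x, y, strength=1.0):
    """Perturb a flat +Z normal from a height (bump) map using central
    differences; the underlying geometry is left unchanged."""
    h, w = height.shape
    dhdx = height[y, min(x + 1, w - 1)] - height[y, max(x - 1, 0)]
    dhdy = height[min(y + 1, h - 1), x] - height[max(y - 1, 0), x]
    n = np.array([-strength * dhdx, -strength * dhdy, 1.0])
    return n / np.linalg.norm(n)

def lambert(normal, light_dir):
    """Diffuse lighting using the (possibly perturbed) shading normal."""
    light_dir = light_dir / np.linalg.norm(light_dir)
    return max(float(np.dot(normal, light_dir)), 0.0)
```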
Virtual reality headset: A virtual reality headset (or VR headset) is a head-mounted device that provides virtual reality for the wearer. VR headsets are widely used with VR video games, but they are also used in other applications, including simulators and trainers. VR headsets typically include a stereoscopic display (providing separate images for each eye), stereo sound, and sensors such as accelerometers and gyroscopes for tracking the pose of the user's head, so that the orientation of the virtual camera matches the user's eye positions in the real world.
Augmented reality: Augmented reality (AR) is an interactive experience that combines the real world and computer-generated content. The content can span multiple sensory modalities, including visual, auditory, haptic, somatosensory, and olfactory. AR can be defined as a system that incorporates three basic features: a combination of real and virtual worlds, real-time interaction, and accurate 3D registration of virtual and real objects. The overlaid sensory information can be constructive (i.e. additive to the natural environment) or destructive (i.e. masking of the natural environment).
Reflection mapping: In computer graphics, environment mapping, or reflection mapping, is an efficient technique for approximating the appearance of a reflective surface by means of a precomputed texture. The texture is used to store the image of the distant environment surrounding the rendered object. Several ways of storing the surrounding environment have been employed. The first technique was sphere mapping, in which a single texture contains the image of the surroundings as reflected on a spherical mirror.
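A rough sketch of the sphere-mapping variant follows (assumed conventions: eye-space vectors and the classic OpenGL-style sphere-map coordinate formula; the helper names are made up for this example). The reflection vector is computed from the view direction and the surface normal, then converted to texture coordinates into the precomputed environment image.

```python
import numpy as np

def reflect(incident, normal):
    """Reflect the incident (view) direction about the surface normal."""
    return incident - 2.0 * np.dot(incident, normal) * normal

def sphere_map_uv(r):
    """Map a reflection vector to (u, v) coordinates of a sphere map,
    i.e. the environment as seen reflected in a spherical mirror."""
    m = 2.0 * np.sqrt(r[0]**2 + r[1]**2 + (r[2] + 1.0)**2)
    return r[0] / m + 0.5, r[1] / m + 0.5

# Example: a viewer looking down -Z at a slightly tilted surface.
incident = np.array([0.0, 0.0, -1.0])
normal = np.array([0.0, 0.3, 1.0])
normal = normal / np.linalg.norm(normal)
print(sphere_map_uv(reflect(incident, normal)))
```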
Rendering (computer graphics): Rendering or image synthesis is the process of generating a photorealistic or non-photorealistic image from a 2D or 3D model by means of a computer program. The resulting image is referred to as the render. Multiple models can be defined in a scene file containing objects in a strictly defined language or data structure. The scene file contains geometry, viewpoint, texture, lighting, and shading information describing the virtual scene. The data contained in the scene file is then passed to a rendering program to be processed and output to a digital image or raster graphics image file.
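As a loose, purely illustrative sketch (the field names below are assumptions, not any particular scene file format), a scene description bundles the kinds of data listed above before it is handed to a renderer:

```python
from dataclasses import dataclass, field

@dataclass
class Camera:            # viewpoint
    position: tuple = (0.0, 0.0, 5.0)
    look_at: tuple = (0.0, 0.0, 0.0)
    fov_degrees: float = 60.0

@dataclass
class Light:             # lighting
    position: tuple = (2.0, 4.0, 3.0)
    intensity: float = 1.0

@dataclass
class Mesh:              # geometry, texture, shading
    vertices: list = field(default_factory=list)
    triangles: list = field(default_factory=list)
    texture_path: str = ""
    shading_model: str = "lambert"

@dataclass
class Scene:
    camera: Camera = field(default_factory=Camera)
    lights: list = field(default_factory=list)
    meshes: list = field(default_factory=list)

# A renderer would consume a Scene like this and output a raster image.
scene = Scene(lights=[Light()], meshes=[Mesh()])
```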
Procedural texture: In computer graphics, a procedural texture is a texture created using a mathematical description (i.e. an algorithm) rather than directly stored data. The advantages of this approach are low storage cost, unlimited texture resolution, and easy texture mapping. These kinds of textures are often used to model surface or volumetric representations of natural elements such as wood, marble, granite, metal, and stone. Usually, the natural look of the rendered result is achieved by the use of fractal noise and turbulence functions.
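As a hedged illustration (the simple value-noise construction here is an assumption chosen for brevity, not a specific standard algorithm), a procedural texture can be generated by summing smooth lattice noise over several octaves, which approximates the fractal noise and turbulence mentioned above:

```python
import numpy as np

def value_noise(width, height, frequency, seed=0):
    """Smooth lattice noise: random values on a coarse grid, bilinearly
    interpolated up to the full texture resolution."""
    rng = np.random.default_rng(seed)
    grid = rng.random((frequency + 1, frequency + 1))
    ys = np.linspace(0.0, frequency, height)
    xs = np.linspace(0.0, frequency, width)
    y0 = np.clip(np.floor(ys).astype(int), 0, frequency - 1)
    x0 = np.clip(np.floor(xs).astype(int), 0, frequency - 1)
    fy = (ys - y0)[:, None]
    fx = (xs - x0)[None, :]
    top = (1 - fx) * grid[np.ix_(y0, x0)] + fx * grid[np.ix_(y0, x0 + 1)]
    bot = (1 - fx) * grid[np.ix_(y0 + 1, x0)] + fx * grid[np.ix_(y0 + 1, x0 + 1)]
    return (1 - fy) * top + fy * bot

def turbulence(width, height, octaves=4):
    """Sum noise at increasing frequencies with decreasing amplitude,
    giving a fractal, turbulent grayscale pattern (e.g. for marble)."""
    result = np.zeros((height, width))
    amplitude, total = 1.0, 0.0
    for octave in range(octaves):
        result += amplitude * value_noise(width, height, 2 ** (octave + 2), seed=octave)
        total += amplitude
        amplitude *= 0.5
    return result / total  # values roughly in [0, 1]; nothing is stored on disk

texture = turbulence(128, 128)
```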