Bayer filter
A Bayer filter mosaic is a color filter array (CFA) for arranging RGB color filters on a square grid of photosensors. Its particular arrangement of color filters is used in most single-chip digital image sensors found in digital cameras, camcorders, and scanners to create a color image. The filter pattern is half green, one quarter red and one quarter blue, hence it is also called BGGR, RGBG, GRBG, or RGGB. It is named after its inventor, Bryce Bayer of Eastman Kodak. Bayer is also known for his recursively defined matrix used in ordered dithering.
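As an illustration of the arrangement, the sketch below (a hypothetical helper, assuming the RGGB variant of the pattern) maps a pixel's row and column to the filter color covering it; the repeating 2x2 tile contains two green, one red, and one blue filter.

```python
# Minimal sketch, assuming the RGGB variant of the Bayer pattern:
# each photosensor records only the color of the filter above it.
def bayer_color(row: int, col: int) -> str:
    """Return the filter color ('R', 'G' or 'B') at (row, col) for an RGGB mosaic."""
    if row % 2 == 0:
        return "R" if col % 2 == 0 else "G"
    return "G" if col % 2 == 0 else "B"

# Print the repeating 2x2 tile: half green, a quarter red, a quarter blue.
for r in range(2):
    print(" ".join(bayer_color(r, c) for c in range(2)))
# Output:
# R G
# G B
```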
Color filter array
In digital imaging, a color filter array (CFA), or color filter mosaic (CFM), is a mosaic of tiny color filters placed over the pixel sensors of an image sensor to capture color information. The term is also used in reference to e-paper devices, where it means a mosaic of tiny color filters placed over the grayscale display panel to reproduce color images. Color filters are needed because typical photosensors detect light intensity with little or no wavelength specificity and therefore cannot separate color information on their own.
Demosaicing
A demosaicing (also de-mosaicing, demosaicking, or debayering) algorithm is a digital image process used to reconstruct a full-color image from the incomplete color samples output from an image sensor overlaid with a color filter array (CFA). It is also known as CFA interpolation or color reconstruction. Most modern digital cameras acquire images using a single image sensor overlaid with a CFA, so demosaicing is part of the processing pipeline required to render these images into a viewable format.
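A minimal sketch of one common approach, bilinear interpolation over an RGGB mosaic, is given below; the function names and the single-array input format are assumptions for illustration, not a specific camera's pipeline.

```python
import numpy as np

def _conv3x3_sum(a: np.ndarray) -> np.ndarray:
    """Sum each pixel's 3x3 neighborhood ('same' size, zero padding)."""
    p = np.pad(a, 1)
    h, w = a.shape
    return sum(p[i:i + h, j:j + w] for i in range(3) for j in range(3))

def demosaic_bilinear(raw: np.ndarray) -> np.ndarray:
    """Reconstruct an (H, W, 3) RGB image from an RGGB Bayer mosaic
    by averaging the nearest samples of each color (bilinear interpolation)."""
    h, w = raw.shape
    rows, cols = np.indices((h, w))
    masks = [
        (rows % 2 == 0) & (cols % 2 == 0),   # red sample locations
        (rows % 2) != (cols % 2),            # green sample locations
        (rows % 2 == 1) & (cols % 2 == 1),   # blue sample locations
    ]
    rgb = np.zeros((h, w, 3), dtype=float)
    for ch, mask in enumerate(masks):
        samples = np.where(mask, raw.astype(float), 0.0)
        total = _conv3x3_sum(samples)              # sum of nearby samples of this color
        count = _conv3x3_sum(mask.astype(float))   # how many samples contributed
        interpolated = total / np.maximum(count, 1.0)
        # Keep the measured value where this color was actually sampled.
        rgb[..., ch] = np.where(mask, raw, interpolated)
    return rgb
```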
Active-pixel sensor
An active-pixel sensor (APS) is an image sensor, invented by Peter J.W. Noble in 1968, in which each pixel sensor unit cell has a photodetector (typically a pinned photodiode) and one or more active transistors. In a metal–oxide–semiconductor (MOS) active-pixel sensor, MOS field-effect transistors (MOSFETs) are used as amplifiers. There are different types of APS, including the early NMOS APS and the now much more common complementary MOS (CMOS) APS, also known as the CMOS sensor.
Image sensor
An image sensor or imager is a sensor that detects and conveys information used to form an image. It does so by converting the variable attenuation of light waves (as they pass through or reflect off objects) into signals, small bursts of current that convey the information. The waves can be light or other electromagnetic radiation. Image sensors are used in electronic imaging devices of both analog and digital types, which include digital cameras, camera modules, camera phones, optical mouse devices, medical imaging equipment, night vision equipment such as thermal imaging devices, radar, sonar, and others.
Color
Color (American English) or colour (Commonwealth English) is the visual perception based on the electromagnetic spectrum. Though color is not an inherent property of matter, color perception is related to an object's light absorption, reflection, emission spectra, and interference. For most humans, colors are perceived in the visible light spectrum with three types of cone cells (trichromacy). Other animals may have a different number of cone cell types or have eyes sensitive to different wavelengths, such as bees that can distinguish ultraviolet, and thus have a different color sensitivity range.
Digital image
A digital image is an image composed of picture elements, also known as pixels, each holding a finite, discrete numeric value for its intensity or gray level, produced as the output of a two-dimensional function of its spatial coordinates x and y. Depending on whether the image resolution is fixed, it may be of vector or raster type. Raster images have a finite set of digital values, called picture elements or pixels.
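For illustration, the short sketch below (values chosen arbitrarily) shows a raster image as nothing more than a grid of discrete intensity values addressed by its two spatial coordinates.

```python
import numpy as np

# A tiny 4x4, 8-bit grayscale raster image: each pixel holds a discrete
# gray level, addressed by its spatial coordinates (row y, column x).
image = np.array([
    [  0,  64, 128, 192],
    [ 64, 128, 192, 255],
    [128, 192, 255, 192],
    [192, 255, 192, 128],
], dtype=np.uint8)

print(image.shape)   # (4, 4): height x width in pixels
print(image[1, 2])   # 192: the gray level at row 1, column 2
```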
Image processor
An image processor, also known as an image processing engine, image processing unit (IPU), or image signal processor (ISP), is a type of media processor or specialized digital signal processor (DSP) used for image processing in digital cameras or other devices. Image processors often employ parallel computing with SIMD or MIMD technologies to increase speed and efficiency. The processing engine can perform a range of tasks. To increase system integration on embedded devices, the image processor is often a system on a chip with a multi-core processor architecture.
Digital image processing
Digital image processing is the use of a digital computer to process digital images through an algorithm. As a subcategory or field of digital signal processing, digital image processing has many advantages over analog image processing. It allows a much wider range of algorithms to be applied to the input data and can avoid problems such as the build-up of noise and distortion during processing. Since images are defined over two dimensions (perhaps more), digital image processing may be modeled in the form of multidimensional systems.
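A minimal sketch, assuming a grayscale input, is shown below: one simple algorithm (intensity thresholding) applied by a digital computer to a digital image.

```python
import numpy as np

def threshold(img: np.ndarray, level: int = 128) -> np.ndarray:
    """Binarize a grayscale image: 255 where a pixel exceeds `level`, else 0."""
    return np.where(img > level, 255, 0).astype(np.uint8)

# Apply the algorithm to a small synthetic gradient image.
gradient = np.tile(np.arange(0, 256, 64, dtype=np.uint8), (2, 1))
print(gradient)             # [[  0  64 128 192] [  0  64 128 192]]
print(threshold(gradient))  # [[  0   0   0 255] [  0   0   0 255]]
```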
Color space
A color space is a specific organization of colors. In combination with color profiling supported by various physical devices, it supports reproducible representations of color, whether such a representation entails an analog or a digital representation. A color space may be arbitrary, i.e. with physically realized colors assigned to a set of physical color swatches with corresponding assigned color names (including discrete numbers in, for example, the Pantone collection), or structured with mathematical rigor (as with the NCS System, Adobe RGB, and sRGB).
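As a concrete example of a mathematically structured color space, the sketch below applies the standard sRGB transfer function to convert encoded component values in [0, 1] to linear-light values.

```python
import numpy as np

def srgb_to_linear(c) -> np.ndarray:
    """Convert sRGB-encoded component values in [0, 1] to linear light,
    using the piecewise transfer function defined by the sRGB standard."""
    c = np.asarray(c, dtype=float)
    return np.where(c <= 0.04045, c / 12.92, ((c + 0.055) / 1.055) ** 2.4)

print(srgb_to_linear([0.0, 0.5, 1.0]))  # ~[0.0, 0.214, 1.0]
```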