Bayer filter: A Bayer filter mosaic is a color filter array (CFA) for arranging RGB color filters on a square grid of photosensors. Its particular arrangement of color filters is used in most single-chip digital image sensors found in digital cameras, camcorders, and scanners to create a color image. The filter pattern is half green, one quarter red, and one quarter blue, hence it is also called BGGR, RGBG, GRBG, or RGGB. It is named after its inventor, Bryce Bayer of Eastman Kodak. Bayer is also known for his recursively defined matrix used in ordered dithering.
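The 2x2 repeating cell can be written down directly as an index pattern. Below is a minimal Python sketch (the function name and use of NumPy are illustrative, not from the text) that builds an RGGB mask and applies it to a full-color image to simulate the one-sample-per-photosite mosaic a Bayer sensor records.

```python
import numpy as np

def bayer_mask(h, w, pattern="RGGB"):
    """Binary (h, w, 3) mask selecting one color channel per photosite."""
    chan = {"R": 0, "G": 1, "B": 2}
    mask = np.zeros((h, w, 3), dtype=np.float32)
    # The pattern string names the 2x2 repeating cell row by row.
    for i, letter in enumerate(pattern):
        dy, dx = divmod(i, 2)
        mask[dy::2, dx::2, chan[letter]] = 1.0
    return mask

# Half of all photosites end up green, a quarter red, a quarter blue:
m = bayer_mask(4, 4)
print(m.sum(axis=(0, 1)))          # -> [4. 8. 4.] for a 4x4 tile

# Simulating the raw mosaic recorded from a full-color scene:
scene = np.random.rand(4, 4, 3).astype(np.float32)
mosaic = (scene * m).sum(axis=2)   # one sample per pixel, channel set by the CFA
```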
Image sensor: An image sensor or imager is a sensor that detects and conveys information used to form an image. It does so by converting the variable attenuation of light waves (as they pass through or reflect off objects) into signals, small bursts of current that convey the information. The waves can be light or other electromagnetic radiation. Image sensors are used in electronic imaging devices of both analog and digital types, which include digital cameras, camera modules, camera phones, optical mouse devices, medical imaging equipment, night vision equipment such as thermal imaging devices, radar, sonar, and others.
Color filter array: In digital imaging, a color filter array (CFA), or color filter mosaic (CFM), is a mosaic of tiny color filters placed over the pixel sensors of an image sensor to capture color information. The term is also used in reference to e-paper devices, where it means a mosaic of tiny color filters placed over the greyscale display panel to reproduce color images. Color filters are needed because typical photosensors detect light intensity with little or no wavelength specificity and therefore cannot separate color information.
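To make that last point concrete, here is a small illustrative comparison (array shapes and values are made up for the example): without a filter array every photosite integrates all wavelengths and only an intensity survives, while with an RGGB mosaic each photosite's single value is at least tagged with a known channel that a later demosaicing step can interpolate.

```python
import numpy as np

scene = np.random.rand(4, 4, 3)            # stand-in full-color scene

# No CFA: each photosite responds to light regardless of wavelength,
# so the three channels collapse into one intensity and color is lost.
mono = scene.mean(axis=2)

# RGGB CFA: each photosite still yields one number, but the mosaic tells
# us which channel that number belongs to (0=R, 1=G, 2=B).
cfa = np.zeros((4, 4), dtype=np.int64)
cfa[0::2, 1::2] = 1
cfa[1::2, 0::2] = 1
cfa[1::2, 1::2] = 2
raw = np.take_along_axis(scene, cfa[..., None], axis=2)[..., 0]
```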
Active-pixel sensor: An active-pixel sensor (APS) is an image sensor, invented by Peter J.W. Noble in 1968, in which each pixel sensor unit cell has a photodetector (typically a pinned photodiode) and one or more active transistors. In a metal–oxide–semiconductor (MOS) active-pixel sensor, MOS field-effect transistors (MOSFETs) are used as amplifiers. There are different types of APS, including the early NMOS APS and the now much more common complementary MOS (CMOS) APS, also known as the CMOS sensor.
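As a rough illustration of what the in-pixel amplifier does (the numbers below are generic textbook-style assumptions, not figures from the text): the photodiode's collected electrons produce a voltage Q/C on a sense-node capacitance, and the source-follower MOSFET buffers that voltage out of the pixel, giving a "conversion gain" of some tens of microvolts per electron.

```python
# Illustrative back-of-the-envelope model of one APS pixel's readout chain.
# All parameter values are assumptions chosen only to show the arithmetic.
Q_E = 1.602e-19      # electron charge, coulombs
C_FD = 1.6e-15       # assumed sense-node capacitance, farads (1.6 fF)
A_SF = 0.85          # assumed source-follower (MOSFET amplifier) voltage gain

def pixel_output_volts(n_electrons):
    """Voltage swing at the pixel output for a given collected charge."""
    v_node = n_electrons * Q_E / C_FD   # charge-to-voltage conversion
    return v_node * A_SF                # buffered by the in-pixel amplifier

print(pixel_output_volts(1) * 1e6)      # ~85 microvolts per electron
print(pixel_output_volts(10_000))       # ~0.85 V for a 10k-electron signal
```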
Image resolution: Image resolution is the level of detail an image holds. The term applies to digital images, film images, and other types of images. "Higher resolution" means more image detail. Image resolution can be measured in various ways. Resolution quantifies how close lines can be to each other and still be visibly resolved. Resolution units can be tied to physical sizes (e.g. lines per mm, lines per inch), to the overall size of a picture (lines per picture height, also known simply as lines, TV lines, or TVL), or to angular subtense.
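The physical and picture-height units are related through the height of the frame, keeping in mind that a line pair in lp/mm counts as two TV lines. A small sketch of that conversion, with an assumed 24 mm frame height purely for illustration:

```python
def lp_per_mm_to_tvl(lp_per_mm, picture_height_mm):
    """Convert line pairs per millimetre to TV lines per picture height.

    One line pair (one dark + one light line) counts as two TV lines,
    so TVL = 2 * lp/mm * picture height.
    """
    return 2.0 * lp_per_mm * picture_height_mm

# Example with assumed numbers: 100 lp/mm resolved on a 24 mm-high frame
print(lp_per_mm_to_tvl(100, 24))   # -> 4800.0 TV lines per picture height
```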
Back-illuminated sensor: A back-illuminated sensor, also known as a backside-illumination (BI) sensor, is a type of digital image sensor that uses a novel arrangement of the imaging elements to increase the amount of light captured and thereby improve low-light performance. The technique was used for some time in specialized roles like low-light security cameras and astronomy sensors, but was complex to build and required further refinement to become widely used. Sony was the first to reduce these problems and their costs sufficiently to introduce a 5-megapixel back-illuminated sensor.
Pixel: In digital imaging, a pixel (abbreviated px), pel, or picture element is the smallest addressable element in a raster image, or the smallest addressable element in a dot matrix display device. In most digital display devices, pixels are the smallest element that can be manipulated through software. Each pixel is a sample of an original or synthetic image; more samples typically provide more accurate representations of the original. The intensity of each pixel is variable.
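Because pixels are the smallest addressable elements, image data is typically stored so that a pixel's location maps directly to an offset in memory. A minimal sketch of that addressing for a row-major, interleaved 8-bit RGB buffer (the layout is an assumption for illustration; real formats differ in row padding and channel order):

```python
def pixel_offset(x, y, width, channels=3):
    """Byte offset of pixel (x, y) in a row-major, interleaved 8-bit buffer."""
    return (y * width + x) * channels

# A 640x480 RGB image: the first byte of the pixel at column 10, row 2
# sits 2*640*3 + 10*3 = 3870 bytes into the buffer.
print(pixel_offset(10, 2, 640))    # -> 3870
```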
RGB color model: The RGB color model is an additive color model in which the red, green, and blue primary colors of light are added together in various ways to reproduce a broad array of colors. The name of the model comes from the initials of the three additive primary colors, red, green, and blue. The main purpose of the RGB color model is for the sensing, representation, and display of images in electronic systems, such as televisions and computers, though it has also been used in conventional photography.
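Because the model is additive, combining primaries at full intensity reproduces the familiar secondary colors and white. A small sketch using 8-bit channel values (the helper name is just for the example):

```python
def add_rgb(*colors):
    """Additive mixing: sum each channel and clip to the 8-bit range."""
    return tuple(min(sum(c[i] for c in colors), 255) for i in range(3))

RED, GREEN, BLUE = (255, 0, 0), (0, 255, 0), (0, 0, 255)

print(add_rgb(RED, GREEN))         # (255, 255, 0)   -> yellow
print(add_rgb(RED, BLUE))          # (255, 0, 255)   -> magenta
print(add_rgb(GREEN, BLUE))        # (0, 255, 255)   -> cyan
print(add_rgb(RED, GREEN, BLUE))   # (255, 255, 255) -> white
```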
Color photography: Color photography is photography that uses media capable of capturing and reproducing colors. By contrast, black-and-white or gray-monochrome photography records only a single channel of luminance (brightness) and uses media capable only of showing shades of gray. In color photography, electronic sensors or light-sensitive chemicals record color information at the time of exposure. This is usually done by analyzing the spectrum of colors into three channels of information, one dominated by red, another by green and the third by blue, in imitation of the way the normal human eye senses color.
Raw image format: A camera raw image file contains unprocessed or minimally processed data from the image sensor of either a digital camera, a motion picture film scanner, or other image scanner. Raw files are so named because they are not yet processed, and contain large amounts of potentially redundant data. Normally, the image is processed by a raw converter, in a wide-gamut internal color space where precise adjustments can be made before being converted to a viewable file format such as JPEG or PNG for storage, printing, or further manipulation.
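The work a raw converter does can be sketched end to end in a few lines. The pipeline below is a deliberately simplified illustration (the half-resolution demosaic, the white-balance gains, and the gamma value are all assumptions for the example, not the behavior of any particular converter): demosaic the RGGB mosaic, apply white balance, then encode with a display gamma and quantize to 8 bits.

```python
import numpy as np

def simple_raw_pipeline(raw, wb_gains=(2.0, 1.0, 1.5), gamma=2.2):
    """Toy raw conversion: half-resolution demosaic, white balance, gamma, 8-bit."""
    # Half-resolution demosaic of an RGGB mosaic: each 2x2 cell becomes one
    # RGB pixel, averaging the two green samples.
    r = raw[0::2, 0::2]
    g = (raw[0::2, 1::2] + raw[1::2, 0::2]) / 2.0
    b = raw[1::2, 1::2]
    rgb = np.stack([r, g, b], axis=-1)

    rgb = rgb * np.asarray(wb_gains)                 # white balance (per-channel gains)
    rgb = np.clip(rgb, 0.0, 1.0) ** (1.0 / gamma)    # display gamma encoding
    return (rgb * 255).astype(np.uint8)              # quantize for a viewable file

# Usage with a stand-in mosaic normalized to [0, 1]:
raw = np.random.rand(8, 8)
out = simple_raw_pipeline(raw)
print(out.shape, out.dtype)                          # (4, 4, 3) uint8
```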