Active-pixel sensor
An active-pixel sensor (APS) is an image sensor, invented by Peter J.W. Noble in 1968, in which each pixel sensor unit cell has a photodetector (typically a pinned photodiode) and one or more active transistors. In a metal–oxide–semiconductor (MOS) active-pixel sensor, MOS field-effect transistors (MOSFETs) are used as amplifiers. There are different types of APS, including the early NMOS APS and the now much more common complementary MOS (CMOS) APS, also known as the CMOS sensor.
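The in-pixel amplifier is what distinguishes an active pixel from a passive one. Below is a minimal Python sketch of one three-transistor (3T) readout cycle, reset, integrate, buffer; all component values (reset voltage, photodiode capacitance, source-follower gain) are illustrative assumptions, not taken from any real device.

```python
def read_pixel(photocurrent_a, integration_s,
               v_reset=3.3, c_pd=10e-15, sf_gain=0.85):
    """Return the buffered pixel voltage after one exposure (toy model)."""
    # Integration: photocurrent discharges the photodiode node (dV = I*t/C).
    dv = (photocurrent_a * integration_s) / c_pd
    v_pd = max(v_reset - dv, 0.0)   # the node cannot swing below ground
    # The in-pixel source follower (an active MOSFET) buffers the node
    # with slightly less than unity gain.
    return sf_gain * v_pd

# Example: 50 fA of photocurrent integrated for 10 ms.
print(f"{read_pixel(50e-15, 10e-3):.3f} V")
```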
Image sensor
An image sensor or imager is a sensor that detects and conveys information used to form an image. It does so by converting the variable attenuation of light waves (as they pass through or reflect off objects) into signals, small bursts of current that convey the information. The waves can be light or other electromagnetic radiation. Image sensors are used in electronic imaging devices of both analog and digital types, including digital cameras, camera modules, camera phones, optical mouse devices, medical imaging equipment, night vision equipment such as thermal imaging devices, radar, sonar, and others.
Color filter array
In digital imaging, a color filter array (CFA), or color filter mosaic (CFM), is a mosaic of tiny color filters placed over the pixel sensors of an image sensor to capture color information. The term is also used in reference to e-paper devices, where it means a mosaic of tiny color filters placed over the greyscale display panel to reproduce color images. Color filters are needed because typical photosensors detect light intensity with little or no wavelength specificity and therefore cannot separate color information on their own.
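Because each photosite sits behind a single filter, the sensor records only one color channel per pixel. The following sketch (assuming an RGGB Bayer tiling, introduced below, and using NumPy for array handling) shows how a full-color scene is reduced to a single-channel mosaic:

```python
import numpy as np

def bayer_mask(h, w):
    """Channel index per photosite: 0 = R, 1 = G, 2 = B (RGGB tiling)."""
    mask = np.zeros((h, w), dtype=np.uint8)
    mask[0::2, 1::2] = 1   # green on even rows
    mask[1::2, 0::2] = 1   # green on odd rows
    mask[1::2, 1::2] = 2   # blue
    return mask

rgb = np.random.rand(4, 6, 3)        # a full-color test scene
mask = bayer_mask(4, 6)
rows, cols = np.indices(mask.shape)
mosaic = rgb[rows, cols, mask]       # one filtered sample per photosite
print(mosaic.shape)                  # (4, 6): the other channels are discarded
```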
Optical aberration
In optics, aberration is a property of optical systems, such as lenses, that causes light to be spread out over some region of space rather than focused to a point. Aberrations cause the image formed by a lens to be blurred or distorted, with the nature of the distortion depending on the type of aberration. Aberration can be defined as a departure of the performance of an optical system from the predictions of paraxial optics.
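Spherical aberration gives a concrete instance of such a departure: rays far from the axis of a spherical surface focus short of the paraxial prediction. The sketch below traces parallel rays through a single convex refracting surface using exact Snell's law; the radius and refractive indices are assumed example values.

```python
import numpy as np

n1, n2, R = 1.0, 1.5, 50.0   # air -> glass, radius of curvature in mm

def axis_crossing(h):
    """Where a parallel ray at height h crosses the axis (mm from vertex)
    after one refraction at a convex spherical surface (exact trace)."""
    t1 = np.arcsin(h / R)                  # angle of incidence
    t2 = np.arcsin(n1 * np.sin(t1) / n2)   # Snell's law
    return R + R * np.sin(t2) / np.sin(t1 - t2)

paraxial = n2 * R / (n2 - n1)              # paraxial focus: 150 mm here
for h in (1.0, 5.0, 10.0, 15.0):
    print(f"h = {h:4.1f} mm -> focus at {axis_crossing(h):6.2f} mm "
          f"(paraxial {paraxial:.0f} mm)")
```

Marginal rays (larger h) cross the axis progressively short of the paraxial focus, which is exactly the departure from paraxial optics the definition describes.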
Digital image
A digital image is an image composed of picture elements, also known as pixels, each holding a finite, discrete numeric value for its intensity or gray level; that value is the output of a two-dimensional function whose inputs are the spatial coordinates x and y. Depending on whether the image resolution is fixed, a digital image may be of vector or raster type.

Raster image
Raster images have a finite set of digital values, called picture elements or pixels.
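The "two-dimensional function" view translates directly into code: a raster image is a 2D array whose entries are quantized samples of f(x, y). A minimal NumPy sketch (the ramp function here is arbitrary):

```python
import numpy as np

h, w = 4, 6
y, x = np.mgrid[0:h, 0:w]                 # spatial coordinates
f = (x + y) / (w + h - 2)                 # an intensity ramp on [0, 1]
img = np.round(f * 255).astype(np.uint8)  # quantize to discrete gray levels
print(img)                                # finite, discrete pixel values
```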
Bayer filter
A Bayer filter mosaic is a color filter array (CFA) for arranging RGB color filters on a square grid of photosensors. Its particular arrangement of color filters is used in most single-chip digital image sensors found in digital cameras, camcorders, and scanners to create a color image. The filter pattern is half green, one quarter red, and one quarter blue; depending on the readout order, it is also called BGGR, RGBG, GRBG, or RGGB. It is named after its inventor, Bryce Bayer of Eastman Kodak. Bayer is also known for his recursively defined matrix used in ordered dithering.
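Since each photosite records only one channel, the other two must be interpolated (demosaiced) to recover full color. The following bilinear interpolation sketch is a textbook baseline under an assumed RGGB tiling, not any camera's actual pipeline; SciPy's convolve2d handles the neighborhood averaging:

```python
import numpy as np
from scipy.signal import convolve2d

h, w = 6, 8
mask = np.zeros((h, w), dtype=np.uint8)        # 0 = R (RGGB tiling)
mask[0::2, 1::2] = mask[1::2, 0::2] = 1        # 1 = G (checkerboard)
mask[1::2, 1::2] = 2                           # 2 = B
mosaic = np.random.rand(h, w)                  # one filtered sample per pixel

k = np.array([[1., 2., 1.], [2., 4., 2.], [1., 2., 1.]])  # bilinear kernel
rgb = np.zeros((h, w, 3))
for c in range(3):
    samples = np.where(mask == c, mosaic, 0.0)
    weights = (mask == c).astype(float)
    num = convolve2d(samples, k, mode="same", boundary="symm")
    den = convolve2d(weights, k, mode="same", boundary="symm")
    rgb[..., c] = num / den                    # average nearby same-color samples
print(rgb.shape)                               # (6, 8, 3): full color estimated
```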
Lens
A lens is a transmissive optical device that focuses or disperses a light beam by means of refraction. A simple lens consists of a single piece of transparent material, while a compound lens consists of several simple lenses (elements), usually arranged along a common axis. Lenses are made from materials such as glass or plastic and are ground, polished, or molded to the required shape. A lens can focus light to form an image, unlike a prism, which refracts light without focusing.
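For a thin simple lens, the focusing behavior follows from the lensmaker's equation, 1/f = (n - 1)(1/R1 - 1/R2). A small sketch (the sign convention and example values are assumptions):

```python
def thin_lens_focal_length(n, r1, r2):
    """Lensmaker's equation, thin-lens form: 1/f = (n - 1)(1/R1 - 1/R2).
    Convention: R > 0 when the surface bulges toward the incoming light."""
    return 1.0 / ((n - 1.0) * (1.0 / r1 - 1.0 / r2))

# Symmetric biconvex lens, n = 1.5, |R1| = |R2| = 100 mm:
print(thin_lens_focal_length(1.5, 100.0, -100.0))  # 100.0 (mm)
```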
Focal length
The focal length of an optical system is a measure of how strongly the system converges or diverges light; it is the inverse of the system's optical power. A positive focal length indicates that a system converges light, while a negative focal length indicates that the system diverges light. A system with a shorter focal length bends the rays more sharply, bringing them to a focus in a shorter distance or diverging them more quickly.
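The focal-length/power relationship, and the convergence it describes, can be made concrete with the thin-lens equation 1/f = 1/u + 1/v (a sketch assuming the "real is positive" sign convention):

```python
def optical_power_dioptres(f_m):
    """Optical power is the inverse of focal length (f in metres)."""
    return 1.0 / f_m

def image_distance(f, u):
    """Thin-lens equation 1/f = 1/u + 1/v, solved for image distance v."""
    return 1.0 / (1.0 / f - 1.0 / u)

print(optical_power_dioptres(0.5))    # f = 0.5 m -> 2.0 dioptres
print(image_distance(50.0, 200.0))    # 50 mm lens, object at 200 mm -> ~66.7 mm
```

A shorter f gives a larger power and pulls the image plane closer, matching the "bends the rays more sharply" description above.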
Digital image processing
Digital image processing is the use of a digital computer to process digital images through an algorithm. As a subcategory or field of digital signal processing, digital image processing has many advantages over analog image processing. It allows a much wider range of algorithms to be applied to the input data and can avoid problems such as the build-up of noise and distortion during processing. Since images are defined over two dimensions (perhaps more), digital image processing may be modeled in the form of multidimensional systems.
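A minimal example of such an algorithm: applying a 3x3 sharpening kernel to an image by convolution. SciPy is one convenient choice here, and the kernel is a common illustrative one:

```python
import numpy as np
from scipy.ndimage import convolve

img = np.zeros((5, 5))
img[2, 2] = 1.0                        # a single bright pixel
sharpen = np.array([[ 0., -1.,  0.],
                    [-1.,  5., -1.],
                    [ 0., -1.,  0.]])  # enhances local contrast
print(convolve(img, sharpen, mode="reflect"))
```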
Eyepiece
An eyepiece, or ocular lens, is a type of lens that is attached to a variety of optical devices such as telescopes and microscopes. It is so named because it is usually the lens closest to the eye when someone looks through an optical device to observe an object or sample. The objective lens or mirror collects light from an object or sample and brings it to a focus, creating an image of the object. The eyepiece is placed near the focal point of the objective to magnify this image for the eye.
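For a telescope, the magnification delivered by a given eyepiece is the ratio of the objective's focal length to the eyepiece's, a one-line calculation (the focal lengths below are illustrative):

```python
def telescope_magnification(f_objective_mm, f_eyepiece_mm):
    """Angular magnification = objective focal length / eyepiece focal length."""
    return f_objective_mm / f_eyepiece_mm

print(telescope_magnification(900.0, 25.0))  # 900 mm objective, 25 mm eyepiece -> 36x
```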