Microscopy

Microscopy is the technical field of using microscopes to view objects and areas of objects that cannot be seen with the naked eye (objects that are not within the resolution range of the normal eye). There are three well-known branches of microscopy: optical, electron, and scanning probe microscopy, along with the emerging field of X-ray microscopy. Optical microscopy and electron microscopy involve the diffraction, reflection, or refraction of electromagnetic radiation or electron beams interacting with the specimen, and the collection of the scattered radiation or another signal in order to create an image.
Equalization (audio)

Equalization, or simply EQ, in sound recording and reproduction is the process of adjusting the volume of different frequency bands within an audio signal. The circuit or equipment used to achieve this is called an equalizer. Most hi-fi equipment uses relatively simple filters to make bass and treble adjustments. Graphic and parametric equalizers have much more flexibility in tailoring the frequency content of an audio signal.
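As a minimal sketch of a single parametric EQ band, the following applies the standard peaking biquad filter (coefficients from the widely used audio EQ cookbook) with scipy; the sample rate, center frequency, gain, and Q values are illustrative, not prescriptive:

```python
import numpy as np
from scipy.signal import lfilter

def peaking_eq(x, fs, f0, gain_db, q=1.0):
    """Boost or cut a band around f0 Hz (peaking biquad, audio EQ cookbook)."""
    a_gain = 10 ** (gain_db / 40)      # amplitude corresponding to the dB gain
    w0 = 2 * np.pi * f0 / fs           # center frequency in radians per sample
    alpha = np.sin(w0) / (2 * q)       # bandwidth parameter set by Q
    b = [1 + alpha * a_gain, -2 * np.cos(w0), 1 - alpha * a_gain]
    a = [1 + alpha / a_gain, -2 * np.cos(w0), 1 - alpha / a_gain]
    # Normalize so a[0] == 1, then filter the signal.
    return lfilter([bi / a[0] for bi in b], [ai / a[0] for ai in a], x)

# Example: cut 3 dB around 1 kHz in one second of white noise at 44.1 kHz.
fs = 44100
x = np.random.randn(fs)
y = peaking_eq(x, fs, f0=1000, gain_db=-3)
```

Raising or lowering gain_db boosts or cuts only the band around f0, which is the defining behavior of a parametric equalizer band.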
Blob detection

In computer vision, blob detection methods are aimed at detecting regions in a digital image that differ in properties, such as brightness or color, compared to surrounding regions. Informally, a blob is a region of an image in which some properties are constant or approximately constant; all the points in a blob can be considered in some sense to be similar to each other. The most common method for blob detection is convolution.
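As a small illustration of the convolution approach, the sketch below uses a scale-normalized Laplacian-of-Gaussian filter (via scipy.ndimage.gaussian_laplace), which responds most strongly at the center of a blob whose radius matches the filter scale; the synthetic image and the sigma value are assumed for the example:

```python
import numpy as np
from scipy.ndimage import gaussian_laplace

# Synthetic image: a dark background with one bright Gaussian blob.
image = np.zeros((64, 64))
yy, xx = np.mgrid[0:64, 0:64]
image += np.exp(-((yy - 32) ** 2 + (xx - 32) ** 2) / (2 * 4.0 ** 2))

# Convolve with a Laplacian-of-Gaussian kernel. Bright blobs give strong
# negative responses, so negate; multiplying by sigma**2 normalizes across scales.
sigma = 4.0
response = -gaussian_laplace(image, sigma=sigma) * sigma ** 2

peak = np.unravel_index(np.argmax(response), response.shape)
print(peak)  # near (32, 32), the blob center
```

In practice one computes this response over several sigma values and keeps local maxima in both position and scale, which also estimates each blob's size.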
Active contour model

Active contour model, also called snakes, is a framework in computer vision introduced by Michael Kass, Andrew Witkin, and Demetri Terzopoulos for delineating an object outline from a possibly noisy 2D image. The snakes model is popular in computer vision, and snakes are widely used in applications like object tracking, shape recognition, segmentation, edge detection and stereo matching. A snake is an energy minimizing, deformable spline influenced by constraint and image forces that pull it towards object contours and internal forces that resist deformation.
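A short sketch of this energy minimization using scikit-image's active_contour function follows; the synthetic disk image, the circular initialization, and the weighting parameters (alpha and beta control the internal elasticity and stiffness forces) are illustrative choices:

```python
import numpy as np
from skimage.filters import gaussian
from skimage.segmentation import active_contour

# Synthetic image: a bright disk whose boundary the snake should find.
image = np.zeros((100, 100))
yy, xx = np.mgrid[0:100, 0:100]
image[(yy - 50) ** 2 + (xx - 50) ** 2 < 20 ** 2] = 1.0

# Initialize the snake as a circle enclosing the object (rows, cols).
s = np.linspace(0, 2 * np.pi, 200)
init = np.column_stack([50 + 35 * np.sin(s), 50 + 35 * np.cos(s)])

# Smoothing the image spreads edge information so the image forces can
# pull the contour in; alpha/beta penalize stretching/bending.
snake = active_contour(gaussian(image, 3), init,
                       alpha=0.015, beta=10, gamma=0.001)
print(snake.shape)  # (200, 2) contour points settled near the disk edge
```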
Statistical inference

Statistical inference is the process of using data analysis to infer properties of an underlying distribution of probability. Inferential statistical analysis infers properties of a population, for example by testing hypotheses and deriving estimates. It is assumed that the observed data set is sampled from a larger population. Inferential statistics can be contrasted with descriptive statistics. Descriptive statistics is solely concerned with properties of the observed data, and it does not rest on the assumption that the data come from a larger population.
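The two activities named above, testing hypotheses and deriving estimates, can be sketched with scipy.stats on simulated data; the sample, the hypothesized population mean of 5.0, and the 95% confidence level are assumptions of the example:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Observed data, treated as a sample from a larger population.
sample = rng.normal(loc=5.2, scale=1.0, size=40)

# Hypothesis test: could the population mean plausibly be 5.0?
t_stat, p_value = stats.ttest_1samp(sample, popmean=5.0)

# Interval estimate for the population mean.
ci = stats.t.interval(0.95, df=len(sample) - 1,
                      loc=np.mean(sample),
                      scale=stats.sem(sample))
print(p_value, ci)
```

A descriptive analysis would stop at np.mean(sample) and similar summaries; the p-value and confidence interval are inferential, since they speak about the unseen population.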
Super-resolution microscopy

Super-resolution microscopy is a series of techniques in optical microscopy that allow images to have resolutions higher than those imposed by the diffraction limit, which is due to the diffraction of light. Super-resolution imaging techniques rely on the near-field (photon-tunneling microscopy as well as those that use the Pendry Superlens and near field scanning optical microscopy) or on the far-field.
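The diffraction limit in question is commonly stated as the Abbe limit, d = λ / (2 NA); a quick calculation shows its scale for typical visible-light imaging (the wavelength and numerical aperture here are assumed, illustrative values):

```python
# Abbe diffraction limit: d = wavelength / (2 * NA).
wavelength_nm = 550        # green light
numerical_aperture = 1.4   # a high-end oil-immersion objective

d = wavelength_nm / (2 * numerical_aperture)
print(f"{d:.0f} nm")  # ~196 nm: features closer together than this blur into one
```

Super-resolution techniques are those that recover detail below this roughly 200 nm scale.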
Confocal microscopy

Confocal microscopy, most frequently confocal laser scanning microscopy (CLSM) or laser scanning confocal microscopy (LSCM), is an optical imaging technique for increasing optical resolution and contrast of a micrograph by means of using a spatial pinhole to block out-of-focus light in image formation. Capturing multiple two-dimensional images at different depths in a sample enables the reconstruction of three-dimensional structures (a process known as optical sectioning) within an object.
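The reconstruction step amounts to stacking the two-dimensional sections into a volume; in the sketch below, random arrays stand in for real optical sections, and the stack dimensions are arbitrary:

```python
import numpy as np

# Stand-in for a confocal z-stack: 2D optical sections captured at
# successive focal depths (random data in place of real micrographs).
depths, height, width = 30, 128, 128
z_stack = [np.random.rand(height, width) for _ in range(depths)]

# Stacking the sections yields a 3D volume indexed as (z, y, x).
volume = np.stack(z_stack, axis=0)

# A maximum-intensity projection is a common way to view such a volume.
mip = volume.max(axis=0)
print(volume.shape, mip.shape)  # (30, 128, 128) (128, 128)
```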
Direct manipulation interface

In computer science, human–computer interaction, and interaction design, direct manipulation is an approach to interfaces which involves continuous representation of objects of interest together with rapid, reversible, and incremental actions and feedback. As opposed to other interaction styles, for example, the command language, the intention of direct manipulation is to allow a user to manipulate objects presented to them, using actions that correspond at least loosely to manipulation of physical objects.
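A draggable canvas object, sketched here with Python's standard tkinter toolkit, shows the ingredients named above: the object stays continuously visible, every mouse movement produces an immediate incremental update, and the action mirrors sliding a physical object (the widget sizes and colors are arbitrary):

```python
import tkinter as tk

root = tk.Tk()
canvas = tk.Canvas(root, width=300, height=200)
canvas.pack()
rect = canvas.create_rectangle(50, 50, 110, 100, fill="steelblue")

last = {"x": 0, "y": 0}

def press(event):
    # Remember where the drag started.
    last["x"], last["y"] = event.x, event.y

def drag(event):
    # Move the rectangle by the incremental mouse displacement.
    canvas.move(rect, event.x - last["x"], event.y - last["y"])
    last["x"], last["y"] = event.x, event.y

canvas.tag_bind(rect, "<ButtonPress-1>", press)
canvas.tag_bind(rect, "<B1-Motion>", drag)
root.mainloop()
```

A command-language equivalent would instead require typing something like a move command with coordinates, with no continuous feedback while the action is specified.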
Program optimization

In computer science, program optimization, code optimization, or software optimization, is the process of modifying a software system to make some aspect of it work more efficiently or use fewer resources. In general, a computer program may be optimized so that it executes more rapidly, operates with less memory or other resources, or draws less power. Although the word "optimization" shares the same root as "optimal", it is rare for the process of optimization to produce a truly optimal system.
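A small example of one common optimization, memoization, illustrates the point that optimizing one aspect (speed) usually costs another (memory) rather than producing anything optimal in an absolute sense; the Fibonacci function and timing parameters are chosen purely for illustration:

```python
from functools import lru_cache
import timeit

def fib_slow(n):
    # Naive recursion: recomputes the same subproblems exponentially often.
    return n if n < 2 else fib_slow(n - 1) + fib_slow(n - 2)

@lru_cache(maxsize=None)
def fib_fast(n):
    # Same algorithm, but each result is cached, trading memory for speed.
    return n if n < 2 else fib_fast(n - 1) + fib_fast(n - 2)

print(timeit.timeit(lambda: fib_slow(25), number=10))
print(timeit.timeit(lambda: fib_fast(25), number=10))
```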
Beta distribution

In probability theory and statistics, the beta distribution is a family of continuous probability distributions defined on the interval [0, 1] or (0, 1) in terms of two positive parameters, denoted by alpha (α) and beta (β), that appear as exponents of the variable and its complement to 1, respectively, and control the shape of the distribution. The beta distribution has been applied to model the behavior of random variables limited to intervals of finite length in a wide variety of disciplines.
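The density is proportional to x^(α-1) (1-x)^(β-1) on [0, 1], as the exponents described above suggest; a short sketch of the basic quantities using scipy.stats follows, with the shape parameters α = 2, β = 5 chosen arbitrarily:

```python
from scipy.stats import beta

# Density proportional to x**(a-1) * (1-x)**(b-1) on [0, 1].
a, b = 2.0, 5.0

print(beta.mean(a, b))        # a / (a + b) = 0.2857...
print(beta.var(a, b))         # a*b / ((a + b)**2 * (a + b + 1))
print(beta.pdf(0.25, a, b))   # density at x = 0.25

samples = beta.rvs(a, b, size=5, random_state=0)
print(samples)                # draws confined to (0, 1)
```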