Independent component analysis
In signal processing, independent component analysis (ICA) is a computational method for separating a multivariate signal into additive subcomponents. This is done by assuming that at most one subcomponent is Gaussian and that the subcomponents are statistically independent from each other. ICA is a special case of blind source separation. A common example application is the "cocktail party problem" of listening in on one person's speech in a noisy room.
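The linear mixing model behind this separation can be illustrated with a toy example. The sketch below only shows the model itself, not an ICA algorithm: two source values are mixed by a matrix, and recovery amounts to applying the matrix inverse. The point of ICA is that it estimates this inverse *blindly*, without knowing the mixing matrix; here the matrix, the source values, and all names are illustrative assumptions.

```python
# Toy sketch of the ICA mixing model x = A*s (NOT an ICA algorithm):
# two independent sources mixed by an (assumed) 2x2 matrix A, then
# recovered by applying A's inverse. ICA would estimate that inverse
# from the observed data alone.

def mix(a, s):
    """Apply a 2x2 matrix a to the pair of samples s."""
    return [a[0][0] * s[0] + a[0][1] * s[1],
            a[1][0] * s[0] + a[1][1] * s[1]]

def unmix(a, x):
    """Invert the 2x2 mixing matrix a and apply it to observations x."""
    det = a[0][0] * a[1][1] - a[0][1] * a[1][0]
    inv = [[ a[1][1] / det, -a[0][1] / det],
           [-a[1][0] / det,  a[0][0] / det]]
    return mix(inv, x)

sources = [1.0, -2.0]           # e.g. two speakers at the cocktail party
A = [[0.8, 0.3], [0.2, 0.9]]    # unknown room mixing (assumed here)
observed = mix(A, sources)      # what the microphones record
recovered = unmix(A, observed)  # ICA's goal: get back the sources
```

In a real application only `observed` is available, and the independence (and non-Gaussianity) assumptions are what make estimating the unmixing matrix possible.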
Magnetoencephalography
Magnetoencephalography (MEG) is a functional neuroimaging technique for mapping brain activity by recording magnetic fields produced by electrical currents occurring naturally in the brain, using very sensitive magnetometers. Arrays of SQUIDs (superconducting quantum interference devices) are currently the most common type of magnetometer, while the SERF (spin exchange relaxation-free) magnetometer is being investigated for future machines.
Repeated measures design
Repeated measures design is a research design that involves multiple measures of the same variable taken on the same or matched subjects either under different conditions or over two or more time periods. For instance, repeated measurements are collected in a longitudinal study in which change over time is assessed.

Crossover study
A popular repeated-measures design is the crossover study. A crossover study is a longitudinal study in which subjects receive a sequence of different treatments (or exposures).
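The core idea of analyzing repeated measures can be sketched in a few lines: because the same subjects are measured twice, the analysis works on within-subject differences rather than comparing two independent groups. The subject IDs and scores below are made-up illustrative data.

```python
# Minimal sketch of a repeated-measures comparison: each subject is
# measured at two time points, so change is computed per subject.
# Subjects and scores are invented for illustration.

before = {"s1": 10.0, "s2": 12.0, "s3": 9.0}   # first measurement
after  = {"s1": 12.0, "s2": 15.0, "s3": 10.0}  # second measurement

# Within-subject differences pair each subject with itself, which is
# what distinguishes this design from a between-subjects comparison.
diffs = [after[s] - before[s] for s in before]
mean_change = sum(diffs) / len(diffs)
```

A paired statistical test (e.g. a paired t-test) would then be applied to `diffs` rather than to the raw scores of two unrelated groups.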
Hazard (computer architecture)
In the domain of central processing unit (CPU) design, hazards are problems with the instruction pipeline in CPU microarchitectures when the next instruction cannot execute in the following clock cycle, and can potentially lead to incorrect computation results. Three common types of hazards are data hazards, structural hazards, and control hazards (branching hazards). There are several methods used to deal with hazards, including pipeline stalls/pipeline bubbling, operand forwarding, and in the case of out-of-order execution, the scoreboarding method and the Tomasulo algorithm.
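A data hazard of the read-after-write (RAW) kind can be illustrated with a small sketch: an instruction reads a register that the immediately preceding instruction writes, so without forwarding or a stall it would read a stale value. The tuple encoding of instructions below is an assumed toy representation, not any real ISA.

```python
# Hedged sketch: detecting a read-after-write (RAW) data hazard
# between two adjacent instructions in a simple in-order model.
# Each instruction is a toy (dest_register, source_registers) tuple.

def raw_hazard(prev, curr):
    """True if curr reads a register that prev writes (a RAW hazard)."""
    dest, _ = prev
    _, srcs = curr
    return dest in srcs

# ADD r1, r2, r3 writes r1; SUB r4, r1, r5 reads r1 -> RAW hazard.
add_instr = ("r1", ("r2", "r3"))
sub_instr = ("r4", ("r1", "r5"))

# OR r6, r2, r3 reads neither instruction's pending result.
or_instr = ("r6", ("r2", "r3"))
```

A pipeline facing `raw_hazard(add_instr, sub_instr)` would insert a bubble (stall) or forward the ALU result of the ADD directly to the SUB's input.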
Pipeline (software)
In software engineering, a pipeline consists of a chain of processing elements (processes, threads, coroutines, functions, etc.), arranged so that the output of each element is the input of the next; the name is by analogy to a physical pipeline. Usually some amount of buffering is provided between consecutive elements. The information that flows in these pipelines is often a stream of records, bytes, or bits, and the elements of a pipeline may be called filters; this is also called the pipe(s) and filters design pattern.
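The pipes-and-filters pattern can be sketched with generators, where each stage lazily consumes the previous stage's output, so records stream through without ever being materialized all at once. The stage names and data are illustrative.

```python
# Minimal pipes-and-filters sketch using generators: each function is
# a filter whose output feeds the next one, element by element.

def numbers(n):
    """Source filter: emit the integers 0..n-1."""
    yield from range(n)

def squares(items):
    """Transform filter: square each incoming record."""
    for x in items:
        yield x * x

def evens(items):
    """Selection filter: pass only even records downstream."""
    for x in items:
        if x % 2 == 0:
            yield x

# Composing the filters builds the pipeline; nothing runs until the
# result is consumed, so values flow through one at a time.
pipeline = evens(squares(numbers(6)))
result = list(pipeline)
```

Because each stage is a generator, the chain behaves like a buffered pipeline: a stage only produces a value when the next stage asks for one.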
Population control
Population control is the practice of artificially maintaining the size of any population. It refers to the act of limiting the size of an animal population so that it remains manageable, as opposed to protecting a species from excessive rates of extinction, which is the concern of conservation biology. While many abiotic and biotic factors influence population control, humans have a notably strong influence on animal populations.
Pipeline (Unix)
In Unix-like computer operating systems, a pipeline is a mechanism for inter-process communication using message passing. A pipeline is a set of processes chained together by their standard streams, so that the output text of each process (stdout) is passed directly as input (stdin) to the next one. The second process is started while the first process is still executing, and they are executed concurrently. The concept of pipelines was championed by Douglas McIlroy at Unix's ancestral home of Bell Labs, during the development of Unix, shaping its toolbox philosophy.
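What the shell's `|` operator does can be sketched explicitly from Python: one process's stdout is connected to the next process's stdin, and both run concurrently. The sketch below builds the equivalent of `printf hello | tr a-z A-Z`, assuming the standard `printf` and `tr` utilities are available.

```python
# Hedged sketch of a two-stage Unix pipeline built by hand:
# stdout of the first process is wired to stdin of the second,
# exactly as the shell's | operator does.

import subprocess

p1 = subprocess.Popen(["printf", "hello"], stdout=subprocess.PIPE)
p2 = subprocess.Popen(["tr", "a-z", "A-Z"],
                      stdin=p1.stdout, stdout=subprocess.PIPE)
p1.stdout.close()   # let p1 receive SIGPIPE if p2 exits early
out, _ = p2.communicate()
p1.wait()
```

Note that `p2` is started before `p1` has finished; the two processes run concurrently and the kernel's pipe buffer sits between them, which is the concurrency property the article describes.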
Typology (archaeology)
In archaeology, a typology is the result of the classification of things according to their physical characteristics. The products of the classification, i.e. the classes, are also called types. Most archaeological typologies organize portable artifacts into types, but typologies of larger structures, including buildings, field monuments, fortifications or roads, are equally possible. A typology helps to manage a large mass of archaeological data.
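The classification step that produces a typology can be sketched as grouping artifact records by their shared physical characteristics; each resulting group is a type. The attribute names and records below are invented for the example.

```python
# Illustrative sketch of typology as classification: artifact records
# grouped into types by shared physical attributes. Attributes and
# records are made-up example data.

from collections import defaultdict

artifacts = [
    {"id": 1, "material": "flint", "form": "blade"},
    {"id": 2, "material": "flint", "form": "scraper"},
    {"id": 3, "material": "flint", "form": "blade"},
]

types = defaultdict(list)
for a in artifacts:
    key = (a["material"], a["form"])  # the classifying characteristics
    types[key].append(a["id"])        # artifacts sharing a key form a type
```

Each key in `types` corresponds to one type, and its value lists the artifacts assigned to it, which is how a typology condenses a large mass of finds into a manageable set of classes.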
Lithic analysis
In archaeology, lithic analysis is the analysis of stone tools and other chipped stone artifacts using basic scientific techniques. At its most basic level, lithic analysis involves examining the artifact's morphology, measuring various physical attributes, and noting other visible features (such as the presence or absence of cortex). The term "lithic analysis" can technically refer to the study of any anthropogenic (human-created) stone, but in its usual sense it is applied to archaeological material that was produced through lithic reduction (knapping) or ground stone.