Hyperparameter optimization: In machine learning, hyperparameter optimization or tuning is the problem of choosing a set of optimal hyperparameters for a learning algorithm. A hyperparameter is a parameter whose value is used to control the learning process. By contrast, the values of other parameters (typically node weights) are learned. The same kind of machine learning model can require different constraints, weights, or learning rates to generalize to different data patterns.
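As a minimal sketch of one common tuning strategy, exhaustive grid search, the following uses scikit-learn's GridSearchCV; the model, parameter grid, and dataset are arbitrary choices for illustration, not a prescribed setup:

```python
# Minimal grid-search sketch with scikit-learn (one common tuning approach).
from sklearn.datasets import load_iris
from sklearn.model_selection import GridSearchCV
from sklearn.svm import SVC

X, y = load_iris(return_X_y=True)

# Hyperparameters control the learning process; they are not learned from data.
param_grid = {"C": [0.1, 1, 10], "gamma": [0.01, 0.1, 1]}

search = GridSearchCV(SVC(), param_grid, cv=5)  # 5-fold cross-validation
search.fit(X, y)
print(search.best_params_, search.best_score_)
```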
Response surface methodology: In statistics, response surface methodology (RSM) explores the relationships between several explanatory variables and one or more response variables. The method was introduced by George E. P. Box and K. B. Wilson in 1951. The main idea of RSM is to use a sequence of designed experiments to obtain an optimal response. Box and Wilson suggest using a second-degree polynomial model to do this. They acknowledge that this model is only an approximation, but they use it because such a model is easy to estimate and apply, even when little is known about the process.
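To make the second-degree polynomial model concrete, here is a sketch that fits a full quadratic response surface in two factors by least squares; the data is synthetic and the coefficients are invented for demonstration:

```python
# Sketch: fit a second-degree polynomial response surface in two factors
# (the approximation Box and Wilson suggest); the data here is synthetic.
import numpy as np

rng = np.random.default_rng(0)
x1 = rng.uniform(-1, 1, 30)  # factor 1 (coded units)
x2 = rng.uniform(-1, 1, 30)  # factor 2 (coded units)
y = 5 + 2*x1 - 3*x2 - 4*x1**2 - 2*x2**2 + x1*x2 + rng.normal(0, 0.1, 30)

# Design matrix for the full quadratic model:
# y ~ b0 + b1*x1 + b2*x2 + b11*x1^2 + b22*x2^2 + b12*x1*x2
X = np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1*x2])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
print(coef)  # estimated b0, b1, b2, b11, b22, b12
```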
Meta-optimization: In numerical optimization, meta-optimization is the use of one optimization method to tune another optimization method. Meta-optimization is reported to have been used as early as the late 1970s by Mercer and Sampson for finding optimal parameter settings of a genetic algorithm. Meta-optimization and related concepts are also known in the literature as meta-evolution, super-optimization, automated parameter calibration, hyper-heuristics, etc.
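A toy sketch of the idea, assuming an invented inner optimizer (a random local search) whose step size is tuned by an outer grid search; all names, settings, and the objective are illustrative:

```python
# Sketch: meta-optimization, with an outer grid search tuning the step size
# of an inner random-search optimizer. Everything here is illustrative.
import random

def inner_optimize(step, iters=200, seed=0):
    """Inner optimizer: random local search minimizing f(x) = (x - 3)^2."""
    rng = random.Random(seed)
    x, best = 0.0, (0.0 - 3) ** 2
    for _ in range(iters):
        cand = x + rng.uniform(-step, step)
        val = (cand - 3) ** 2
        if val < best:
            x, best = cand, val
    return best

# Outer (meta) optimizer: pick the step size giving the best inner result.
steps = [0.01, 0.1, 0.5, 1.0, 2.0]
best_step = min(steps, key=lambda s: inner_optimize(s))
print("best step size:", best_step)
```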
Automated machine learning: Automated machine learning (AutoML) is the process of automating the tasks of applying machine learning to real-world problems. AutoML potentially includes every stage from beginning with a raw dataset to building a machine learning model ready for deployment. AutoML was proposed as an artificial intelligence-based solution to the growing challenge of applying machine learning. The high degree of automation in AutoML aims to allow non-experts to make use of machine learning models and techniques without requiring them to become experts in machine learning.
Experiment: An experiment is a procedure carried out to support or refute a hypothesis, or determine the efficacy or likelihood of something previously untried. Experiments provide insight into cause-and-effect by demonstrating what outcome occurs when a particular factor is manipulated. Experiments vary greatly in goal and scale but always rely on repeatable procedure and logical analysis of the results. There also exist natural experimental studies.
Magnitude (astronomy): In astronomy, magnitude is a measure of the brightness of an object, usually in a defined passband. An imprecise but systematic determination of the magnitude of objects was introduced in ancient times by Hipparchus. Magnitude values do not have a unit. The scale is logarithmic and defined such that a magnitude 1 star is exactly 100 times brighter than a magnitude 6 star. Thus each step of one magnitude is 100^(1/5) ≈ 2.512 times brighter than the magnitude 1 higher.
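The logarithmic definition means a difference of Δm magnitudes corresponds to a brightness ratio of 100^(Δm/5); a short worked example:

```python
# The magnitude scale is logarithmic: a difference of delta_m magnitudes
# corresponds to a brightness ratio of 100**(delta_m / 5).
def brightness_ratio(delta_m):
    return 100 ** (delta_m / 5)

print(brightness_ratio(1))  # ~2.512: one magnitude step
print(brightness_ratio(5))  # exactly 100: magnitude 1 vs magnitude 6
```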
Apparent magnitude: Apparent magnitude (m) is a measure of the brightness of a star or other astronomical object. An object's apparent magnitude depends on its intrinsic luminosity, its distance, and any extinction of the object's light caused by interstellar dust along the line of sight to the observer. The word magnitude in astronomy, unless stated otherwise, usually refers to a celestial object's apparent magnitude. The magnitude scale dates back to the ancient Roman astronomer Claudius Ptolemy, whose star catalog listed stars from 1st magnitude (brightest) to 6th magnitude (dimmest).
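The dependence on intrinsic luminosity and distance is commonly expressed through the standard distance modulus, m = M + 5·log10(d / 10 pc), where M is the absolute magnitude (a quantity not defined in the passage above); a sketch, ignoring extinction:

```python
# Sketch: apparent magnitude m from absolute magnitude M and distance d
# (in parsecs), via the standard distance modulus; extinction is ignored.
import math

def apparent_magnitude(M, d_parsec):
    return M + 5 * math.log10(d_parsec / 10)

# Example: an object with M = 4.83 (roughly the Sun's absolute visual
# magnitude) seen from 10 pc would have m = 4.83 by construction.
print(apparent_magnitude(4.83, 10))
```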
Scientific control: A scientific control is a modern formal experiment or observation designed to minimize the effects of variables other than the independent variable (i.e. confounding variables). Informal improvements to processes and enquiries have been made for thousands of years by comparing a new method against what was done previously; a scientific control makes this comparison rigorous. Controlling for confounding variables increases the reliability of the results, often through a comparison between control measurements and the other measurements. Scientific controls are a part of the scientific method.
Absorption spectroscopy: Absorption spectroscopy refers to spectroscopic techniques that measure the absorption of electromagnetic radiation, as a function of frequency or wavelength, due to its interaction with a sample. The sample absorbs energy, i.e., photons, from the radiating field. The intensity of the absorption varies as a function of frequency, and this variation is the absorption spectrum. Absorption spectroscopy is performed across the electromagnetic spectrum.
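In practice the absorption at each frequency is often reported as absorbance, A = -log10(I / I0), where I0 is the incident and I the transmitted intensity; a minimal sketch with invented numbers:

```python
# Sketch: absorbance from incident (I0) and transmitted (I) intensity,
# A = -log10(I / I0); evaluating this across frequencies gives the spectrum.
import math

def absorbance(I, I0):
    return -math.log10(I / I0)

# Example: 10% of the light transmitted at some frequency -> A = 1.
print(absorbance(10.0, 100.0))
```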
Absorption (electromagnetic radiation): In physics, absorption of electromagnetic radiation is how matter (typically electrons bound in atoms) takes up a photon's energy, and so transforms electromagnetic energy into internal energy of the absorber (for example, thermal energy). A notable effect is attenuation, the gradual reduction of the intensity of light waves as they propagate through a medium. Although the absorption of waves does not usually depend on their intensity (linear absorption), in certain conditions (typically in optics) the medium's transparency changes by a factor that varies as a function of wave intensity, and saturable absorption (or nonlinear absorption) occurs.
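As a sketch of these two regimes: linear attenuation follows an exponential decay, I(x) = I0·exp(-αx) (the Beer-Lambert law), while saturable absorption is often modeled phenomenologically by an absorption coefficient that falls at high intensity, α(I) = α0 / (1 + I/I_sat). Both formulas below are standard textbook models, not taken from the passage above:

```python
# Sketch: linear attenuation I(x) = I0 * exp(-alpha * x), plus a common
# phenomenological model of saturable absorption, where the absorption
# coefficient drops at high intensity: alpha(I) = alpha0 / (1 + I / I_sat).
import math

def attenuate(I0, alpha, x):
    return I0 * math.exp(-alpha * x)

def saturable_alpha(I, alpha0, I_sat):
    return alpha0 / (1 + I / I_sat)

print(attenuate(1.0, 0.5, 2.0))         # linear absorption over 2 units
print(saturable_alpha(10.0, 0.5, 1.0))  # reduced absorption at high intensity
```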