Bone density, or bone mineral density, is the amount of bone mineral in bone tissue. Conceptually it is the mass of mineral per volume of bone (density in the physics sense), although clinically it is measured by proxy, as optical density per square centimetre of bone surface on imaging. Bone density measurement is used in clinical medicine as an indirect indicator of osteoporosis and fracture risk. It is measured by a procedure called densitometry, often performed in the radiology or nuclear medicine departments of hospitals or clinics.
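To make the volume-versus-area distinction concrete: densitometry scans (dual-energy X-ray absorptiometry being a common method, an assumption here since the text names only densitometry) typically report an "areal" density rather than a true volumetric one:

```latex
% Illustrative sketch: the clinically reported "areal" bone mineral density
% is the measured mineral content divided by the projected area of the bone,
% so its units are g/cm^2 rather than the g/cm^3 of a true physical density.
\[
  \mathrm{aBMD}\ (\mathrm{g/cm^{2}})
  = \frac{\text{bone mineral content}\ (\mathrm{g})}
         {\text{projected bone area}\ (\mathrm{cm^{2}})}
\]
```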
Low-density lipoprotein (LDL) is one of the five major groups of lipoprotein that transport all fat molecules around the body in extracellular water. These groups, from least dense to most dense, are chylomicrons (also known as ULDL under the overall density naming convention), very low-density lipoprotein (VLDL), intermediate-density lipoprotein (IDL), low-density lipoprotein (LDL) and high-density lipoprotein (HDL). LDL delivers fat molecules to cells. LDL is involved in atherosclerosis, a process in which it is oxidized within the walls of arteries.
Osteopenia, also known as "low bone mass" or "low bone density", is a condition in which bone mineral density is low. Because their bones are weaker, people with osteopenia may have a higher risk of fractures, and some may go on to develop osteoporosis. In 2010, 43 million older adults in the US had osteopenia. Unlike osteoporosis, osteopenia does not usually cause symptoms, and losing bone density does not in itself cause pain.
High-density lipoprotein (HDL) is one of the five major groups of lipoproteins, complex particles composed of multiple proteins that transport all fat molecules (lipids) around the body within the water outside cells. HDL particles are typically composed of 80–100 proteins per particle (organized by one, two or three ApoA). HDL particles enlarge while circulating in the blood, aggregating more fat molecules and transporting up to hundreds of fat molecules per particle.
In machine learning, boosting is an ensemble meta-algorithm primarily for reducing bias, and also variance, in supervised learning, and a family of machine learning algorithms that convert weak learners into strong ones. Boosting is based on the question posed by Kearns and Valiant (1988, 1989): "Can a set of weak learners create a single strong learner?" A weak learner is defined to be a classifier that is only slightly correlated with the true classification (it can label examples better than random guessing).
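A minimal sketch of that weak-to-strong conversion, assuming scikit-learn is available (the dataset and parameters are illustrative): a single depth-1 decision stump is only somewhat better than chance, while a boosted ensemble of such stumps is markedly more accurate.

```python
# Compare one weak learner (a decision stump) against a boosted ensemble
# of 200 stumps on a synthetic binary classification task.
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

stump = DecisionTreeClassifier(max_depth=1).fit(X_train, y_train)
# AdaBoostClassifier's default weak learner is a depth-1 stump.
boosted = AdaBoostClassifier(n_estimators=200, random_state=0).fit(X_train, y_train)

print("single weak learner accuracy:", stump.score(X_test, y_test))
print("boosted ensemble accuracy:  ", boosted.score(X_test, y_test))
```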
Modern philosophy is philosophy developed in the modern era and associated with modernity. It is not a specific doctrine or school (and thus should not be confused with Modernism), although there are certain assumptions common to much of it, which helps to distinguish it from earlier philosophy. The 17th and early 20th centuries roughly mark the beginning and the end of modern philosophy. How much of the Renaissance should be included is a matter for dispute; likewise, modernity may or may not have ended in the twentieth century and been replaced by postmodernity.
AdaBoost, short for Adaptive Boosting, is a statistical classification meta-algorithm formulated in 1995 by Yoav Freund and Robert Schapire, who won the 2003 Gödel Prize for their work. It can be used in conjunction with many other types of learning algorithms to improve performance. The output of the other learning algorithms ('weak learners') is combined into a weighted sum that represents the final output of the boosted classifier. Usually, AdaBoost is presented for binary classification, although it can be generalized to multiple classes or bounded intervals on the real line.
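A from-scratch sketch of the binary case, assuming labels in {-1, +1} and scikit-learn decision stumps as the weak learners (the function names are illustrative, not a library API): each round fits a stump to reweighted data, misclassified examples gain weight, and the final classifier is the sign of a weighted sum of the weak outputs.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def adaboost_fit(X, y, n_rounds=50):
    """Discrete AdaBoost for binary labels y in {-1, +1}."""
    n = len(y)
    w = np.full(n, 1.0 / n)                    # start with uniform example weights
    learners, alphas = [], []
    for _ in range(n_rounds):
        stump = DecisionTreeClassifier(max_depth=1)
        stump.fit(X, y, sample_weight=w)       # weak learner on reweighted data
        pred = stump.predict(X)
        err = np.clip(w @ (pred != y), 1e-10, 1 - 1e-10)
        alpha = 0.5 * np.log((1 - err) / err)  # this learner's weight in the sum
        w *= np.exp(-alpha * y * pred)         # upweight misclassified examples
        w /= w.sum()
        learners.append(stump)
        alphas.append(alpha)
    return learners, alphas

def adaboost_predict(X, learners, alphas):
    # Final output: the sign of the weighted sum of weak-learner outputs.
    score = sum(a * h.predict(X) for h, a in zip(learners, alphas))
    return np.sign(score)
```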
In statistics, an estimator is a rule for calculating an estimate of a given quantity based on observed data: thus the rule (the estimator), the quantity of interest (the estimand) and its result (the estimate) are distinguished. For example, the sample mean is a commonly used estimator of the population mean. There are point and interval estimators. Point estimators yield single-valued results, in contrast to interval estimators, whose result is a range of plausible values.
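A small sketch of both kinds on simulated data (assuming NumPy; the 1.96 factor is the usual normal-approximation 95% interval):

```python
import numpy as np

rng = np.random.default_rng(0)
sample = rng.normal(loc=10.0, scale=2.0, size=100)  # true population mean is 10

# Point estimator: the sample mean yields a single-valued estimate.
mean_hat = sample.mean()

# Interval estimator: a 95% confidence interval yields a range of
# plausible values for the population mean.
se = sample.std(ddof=1) / np.sqrt(len(sample))      # standard error of the mean
low, high = mean_hat - 1.96 * se, mean_hat + 1.96 * se

print(f"point estimate: {mean_hat:.2f}")
print(f"interval estimate: ({low:.2f}, {high:.2f})")
```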
An autoencoder is a type of artificial neural network used to learn efficient codings of unlabeled data (unsupervised learning). An autoencoder learns two functions: an encoding function that transforms the input data, and a decoding function that recreates the input data from the encoded representation. The autoencoder learns an efficient representation (encoding) for a set of data, typically for dimensionality reduction. Variants exist that aim to force the learned representations to assume useful properties.
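A minimal sketch of the two learned functions, assuming PyTorch (the layer sizes are illustrative): the encoder compresses the input to a low-dimensional code, the decoder reconstructs the input from it, and training minimizes the reconstruction error, with no labels required.

```python
import torch
import torch.nn as nn

class Autoencoder(nn.Module):
    def __init__(self, n_features=784, n_latent=32):
        super().__init__()
        # Encoding function: maps the input to a lower-dimensional code.
        self.encoder = nn.Sequential(nn.Linear(n_features, 128), nn.ReLU(),
                                     nn.Linear(128, n_latent))
        # Decoding function: recreates the input from the code.
        self.decoder = nn.Sequential(nn.Linear(n_latent, 128), nn.ReLU(),
                                     nn.Linear(128, n_features))

    def forward(self, x):
        return self.decoder(self.encoder(x))

model = Autoencoder()
x = torch.rand(64, 784)                      # a batch of unlabeled inputs
loss = nn.functional.mse_loss(model(x), x)   # reconstruction error
loss.backward()                              # gradients for one unsupervised step
```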
Estimation (or estimating) is the process of finding an estimate or approximation, which is a value that is usable for some purpose even if input data may be incomplete, uncertain, or unstable. The value is nonetheless usable because it is derived from the best information available. Typically, estimation involves "using the value of a statistic derived from a sample to estimate the value of a corresponding population parameter".
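A toy sketch of an estimate that is usable despite uncertain inputs (standard-library Python only; the target value here happens to be known, which makes the quality of the approximation easy to check): a Monte Carlo estimate of pi from random samples.

```python
import random

def estimate_pi(n_samples=100_000, seed=0):
    """Estimate pi from the fraction of random points inside a quarter circle."""
    rng = random.Random(seed)
    inside = sum(rng.random() ** 2 + rng.random() ** 2 <= 1.0
                 for _ in range(n_samples))
    return 4 * inside / n_samples

print(estimate_pi())  # approximately 3.14; the estimate improves with more samples
```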