Central tendency
In statistics, a central tendency (or measure of central tendency) is a central or typical value for a probability distribution. Colloquially, measures of central tendency are often called averages. The term central tendency dates from the late 1920s. The most common measures of central tendency are the arithmetic mean, the median, and the mode. A central tendency can be calculated for either a finite set of values or for a theoretical distribution, such as the normal distribution.
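For a finite set of values these three measures can be computed directly; a minimal Python sketch using the standard-library statistics module (the sample values are illustrative):

import statistics

data = [2, 3, 3, 5, 7, 10]          # illustrative sample

mean = statistics.mean(data)        # arithmetic mean: sum of values divided by their count
median = statistics.median(data)    # middle value of the sorted sample (here the average of 3 and 5)
mode = statistics.mode(data)        # most frequently occurring value

print(mean, median, mode)           # mean = 5, median = 4.0, mode = 3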
Reduced chi-squared statistic
In statistics, the reduced chi-squared statistic is used extensively in goodness of fit testing. It is also known as mean squared weighted deviation (MSWD) in isotopic dating and variance of unit weight in the context of weighted least squares. Its square root is called regression standard error, standard error of the regression, or standard error of the equation. It is defined as chi-square per degree of freedom:

$$\chi^2_\nu = \frac{\chi^2}{\nu},$$

where the chi-squared is a weighted sum of squared deviations:

$$\chi^2 = \sum_i \frac{(O_i - C_i)^2}{\sigma_i^2},$$

with inputs: variance $\sigma_i^2$, observations $O$, and calculated data $C$.
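A minimal numerical sketch of the computation (NumPy is assumed, and the observed values, model values, uncertainties, and choice of one fitted parameter are all illustrative):

import numpy as np

O = np.array([10.2, 9.8, 11.1, 10.5])    # observed data
C = np.array([10.0, 10.0, 11.0, 10.4])   # model-calculated data
sigma = np.array([0.3, 0.3, 0.2, 0.25])  # one-sigma uncertainties of the observations

chi2 = np.sum((O - C) ** 2 / sigma ** 2) # weighted sum of squared deviations
nu = len(O) - 1                          # degrees of freedom (assuming one fitted parameter)
chi2_red = chi2 / nu                     # reduced chi-squared
print(chi2_red)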
Deformation (physics)
In physics and continuum mechanics, deformation is the transformation of a body from a reference configuration to a current configuration. A configuration is a set containing the positions of all particles of the body. A deformation can occur because of external loads, intrinsic activity (e.g. muscle contraction), body forces (such as gravity or electromagnetic forces), or changes in temperature, moisture content, or chemical composition, among other causes. Strain is related to deformation in terms of the relative displacement of particles in the body, excluding rigid-body motions.
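As a minimal one-dimensional illustration (not part of the original lead): a bar of reference length $L_0$ stretched to a current length $L$ undergoes the engineering strain

$$\varepsilon = \frac{L - L_0}{L_0},$$

whereas a rigid translation or rotation of the bar leaves $L = L_0$ and therefore produces zero strain even though every particle has been displaced.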
Euler method
In mathematics and computational science, the Euler method (also called the forward Euler method) is a first-order numerical procedure for solving ordinary differential equations (ODEs) with a given initial value. It is the most basic explicit method for numerical integration of ordinary differential equations and is the simplest Runge–Kutta method. The Euler method is named after Leonhard Euler, who first proposed it in his book Institutionum calculi integralis (published 1768–1770).
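A minimal sketch of the forward Euler update $y_{n+1} = y_n + h\,f(t_n, y_n)$, applied to the illustrative test problem $y' = y$, $y(0) = 1$ (Python; the step size and test problem are assumptions for the example):

def euler(f, t0, y0, h, steps):
    # Advance y' = f(t, y) from (t0, y0) using the forward Euler method with fixed step h.
    t, y = t0, y0
    for _ in range(steps):
        y = y + h * f(t, y)   # first-order update using the slope at the left endpoint
        t = t + h
    return t, y

# y' = y with y(0) = 1 has exact solution e^t.
t, y = euler(lambda t, y: y, 0.0, 1.0, h=0.01, steps=100)
print(t, y)                   # t = 1.0, y ≈ 2.7048 (compare exp(1) ≈ 2.71828)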
Log-normal distribution
In probability theory, a log-normal (or lognormal) distribution is a continuous probability distribution of a random variable whose logarithm is normally distributed. Thus, if the random variable X is log-normally distributed, then Y = ln(X) has a normal distribution. Equivalently, if Y has a normal distribution, then the exponential function of Y, X = exp(Y), has a log-normal distribution. A random variable which is log-normally distributed takes only positive real values.
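A brief numerical sketch of this relationship (NumPy assumed; the parameters are illustrative):

import numpy as np

rng = np.random.default_rng(0)
mu, sigma = 0.5, 0.75                      # parameters of the underlying normal distribution

Y = rng.normal(mu, sigma, size=100_000)    # Y is normally distributed
X = np.exp(Y)                              # X = exp(Y) is log-normally distributed

print(X.min() > 0)                         # True: log-normal values are strictly positive
print(np.log(X).mean(), np.log(X).std())   # close to mu and sigma, since ln(X) = Y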
Problem solving
Problem solving is the process of achieving a goal by overcoming obstacles, a frequent part of most activities. Problems in need of solutions range from simple personal tasks (e.g. how to turn on an appliance) to complex issues in business and technical fields. The former is an example of simple problem solving (SPS) addressing one issue, whereas the latter is complex problem solving (CPS) with multiple interrelated obstacles.
Regular measure
In mathematics, a regular measure on a topological space is a measure for which every measurable set can be approximated from above by open measurable sets and from below by compact measurable sets. Let (X, T) be a topological space and let Σ be a σ-algebra on X. Let μ be a measure on (X, Σ). A measurable subset A of X is said to be inner regular if

$$\mu(A) = \sup\{\mu(K) : K \subseteq A,\ K \text{ compact and measurable}\}$$

and said to be outer regular if

$$\mu(A) = \inf\{\mu(U) : U \supseteq A,\ U \text{ open and measurable}\}.$$

A measure is called inner regular if every measurable set is inner regular.
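A standard illustrative example (added for concreteness): Lebesgue measure $\lambda$ on $\mathbb{R}$ is regular, and for $A = (0, 1)$

$$\lambda(A) = \sup_{0 < \varepsilon < 1/2} \lambda([\varepsilon, 1 - \varepsilon]) = \inf_{\varepsilon > 0} \lambda((-\varepsilon, 1 + \varepsilon)) = 1,$$

with the compact sets $[\varepsilon, 1 - \varepsilon]$ approximating $A$ from below and the open sets $(-\varepsilon, 1 + \varepsilon)$ approximating it from above.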
Least absolute deviations
Least absolute deviations (LAD), also known as least absolute errors (LAE), least absolute residuals (LAR), or least absolute values (LAV), is a statistical optimality criterion and a statistical optimization technique based on minimizing the sum of absolute deviations (also sum of absolute residuals or sum of absolute errors) or the L1 norm of such values. It is analogous to the least squares technique, except that it is based on absolute values instead of squared values.
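A minimal sketch of the criterion in its simplest setting, estimating a single location parameter c by minimizing the sum of absolute deviations (Python; the data and the crude grid search are illustrative, and in this case the minimizer is the sample median):

data = [1.0, 2.0, 2.5, 3.0, 50.0]                 # illustrative sample containing an outlier

def sum_abs_dev(c, xs):
    # L1 objective: sum of absolute deviations of xs about c
    return sum(abs(x - c) for x in xs)

candidates = [i / 100 for i in range(0, 6001)]    # crude grid 0.00, 0.01, ..., 60.00
best = min(candidates, key=lambda c: sum_abs_dev(c, data))
print(best)                                       # 2.5, the median; least squares would give the mean, 11.7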
Propagation of uncertainty
In statistics, propagation of uncertainty (or propagation of error) is the effect of variables' uncertainties (or errors, more specifically random errors) on the uncertainty of a function based on them. When the variables are the values of experimental measurements they have uncertainties due to measurement limitations (e.g., instrument precision) which propagate due to the combination of variables in the function. The uncertainty u can be expressed in a number of ways. It may be defined by the absolute error Δx.
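A minimal first-order propagation sketch for the uncorrelated product $f = xy$, where the linearized formula gives $u_f^2 = (\partial f/\partial x)^2 u_x^2 + (\partial f/\partial y)^2 u_y^2$ (Python; the measured values and uncertainties are illustrative):

import math

x, ux = 4.0, 0.1     # measured value and its standard uncertainty
y, uy = 2.5, 0.05

f = x * y
# First-order propagation for uncorrelated x and y, with df/dx = y and df/dy = x:
uf = math.sqrt((y * ux) ** 2 + (x * uy) ** 2)
print(f, uf)         # 10.0 with an uncertainty of about 0.32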
Jordan measure
In mathematics, the Peano–Jordan measure (also known as the Jordan content) is an extension of the notion of size (length, area, volume) to shapes more complicated than, for example, a triangle, disk, or parallelepiped. It turns out that for a set to have Jordan measure it should be well-behaved in a certain restrictive sense. For this reason, it is now more common to work with the Lebesgue measure, which is an extension of the Jordan measure to a larger class of sets.
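A rough numerical sketch of the idea behind the Jordan content of the closed unit disk, bounding its area between inner grid squares (contained in the disk) and outer grid squares (meeting it); the corner test used below relies on the convexity of the disk and is an illustrative simplification, not the formal definition:

def jordan_bounds(n):
    # Inner and outer grid-square approximations to the area of {x^2 + y^2 <= 1},
    # using an n-by-n grid of squares of side h covering [-1, 1] x [-1, 1].
    h = 2.0 / n
    inside = lambda x, y: x * x + y * y <= 1.0
    inner = outer = 0
    for i in range(n):
        for j in range(n):
            xs = (-1 + i * h, -1 + (i + 1) * h)
            ys = (-1 + j * h, -1 + (j + 1) * h)
            corners = [inside(x, y) for x in xs for y in ys]
            if all(corners):
                inner += 1    # all four corners inside: the square lies in the disk (by convexity)
            if any(corners):
                outer += 1    # some corner inside: the square meets the disk (heuristic for a fine grid)
    return inner * h * h, outer * h * h

print(jordan_bounds(200))     # both bounds approach pi ≈ 3.14159 as n grows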