Synthetic geometry (sometimes referred to as axiomatic geometry or even pure geometry) is geometry without the use of coordinates. It relies on the axiomatic method for proving all results from a few basic properties, initially called postulates and at present called axioms. The term "synthetic geometry" was coined only after the 17th century and the introduction by René Descartes of the coordinate method, which was called analytic geometry.
In geometry, the Beltrami–Klein model, also called the projective model, Klein disk model, and the Cayley–Klein model, is a model of hyperbolic geometry in which points are represented by the points in the interior of the unit disk (or n-dimensional unit ball) and lines are represented by the chords, straight line segments with ideal endpoints on the boundary sphere. The Beltrami–Klein model is named after the Italian geometer Eugenio Beltrami and the German Felix Klein, while "Cayley" in Cayley–Klein model refers to the English geometer Arthur Cayley.
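Concretely, distances in the Klein disk can be computed from the cross-ratio of two interior points with the ideal endpoints of the chord through them. The Python sketch below (not from the source; the sample coordinates are arbitrary) implements that formula for the two-dimensional disk, assuming points are given as Cartesian coordinates inside the unit circle.

```python
import numpy as np

def klein_distance(p, q):
    """Hyperbolic distance between two points of the open unit disk in the
    Beltrami-Klein model, via the cross-ratio with the chord's ideal endpoints:
        d(p, q) = 1/2 * |ln( (|aq| * |bp|) / (|ap| * |bq|) )|
    where a, b are the intersections of the line pq with the unit circle."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    if np.allclose(p, q):
        return 0.0
    d = q - p
    # Solve |p + t d|^2 = 1 for t: the two roots give the ideal endpoints.
    A = d @ d
    B = 2 * (p @ d)
    C = p @ p - 1.0
    disc = np.sqrt(B * B - 4 * A * C)
    t_a = (-B - disc) / (2 * A)   # endpoint on p's side (t < 0)
    t_b = (-B + disc) / (2 * A)   # endpoint on q's side (t > 1)
    a = p + t_a * d
    b = p + t_b * d
    aq = np.linalg.norm(q - a); bp = np.linalg.norm(p - b)
    ap = np.linalg.norm(p - a); bq = np.linalg.norm(q - b)
    return 0.5 * abs(np.log((aq * bp) / (ap * bq)))

# Distance grows without bound as a point approaches the boundary circle.
print(klein_distance([0.0, 0.0], [0.50, 0.0]))  # ~0.549
print(klein_distance([0.0, 0.0], [0.99, 0.0]))  # ~2.65
```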
Sensitivity analysis is the study of how the uncertainty in the output of a mathematical model or system (numerical or otherwise) can be divided and allocated to different sources of uncertainty in its inputs. A related practice is uncertainty analysis, which has a greater focus on uncertainty quantification and propagation of uncertainty; ideally, uncertainty and sensitivity analysis should be run in tandem.
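As a rough illustration of dividing output variance among inputs, the sketch below estimates first-order Sobol sensitivity indices for a made-up linear model using a Monte Carlo "pick-freeze" estimator. The model, its coefficients, and the sample size are illustrative assumptions, not anything from the source.

```python
import numpy as np

rng = np.random.default_rng(0)

def model(x1, x2, x3):
    """Toy model: the output depends strongly on x1, weakly on x3."""
    return 4.0 * x1 + 2.0 * x2 + 0.5 * x3

N = 100_000
# Two independent input samples, all inputs ~ U(0, 1).
A = rng.uniform(size=(N, 3))
B = rng.uniform(size=(N, 3))
yA = model(*A.T)
var_y = yA.var()

# First-order index S_i: share of output variance explained by input i alone,
# estimated by keeping column i from A and redrawing the other inputs from B.
for i in range(3):
    AB = B.copy()
    AB[:, i] = A[:, i]
    yAB = model(*AB.T)
    S_i = np.cov(yA, yAB)[0, 1] / var_y
    print(f"S_{i + 1} ≈ {S_i:.3f}")
# For this linear model the exact shares are 16 : 4 : 0.25 (≈ 0.79, 0.20, 0.01).
```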
In metrology, measurement uncertainty is the expression of the statistical dispersion of the values attributed to a measured quantity. All measurements are subject to uncertainty and a measurement result is complete only when it is accompanied by a statement of the associated uncertainty, such as the standard deviation. By international agreement, this uncertainty has a probabilistic basis and reflects incomplete knowledge of the quantity value. It is a non-negative parameter.
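For example, a Type A evaluation of standard uncertainty takes repeated readings of the same quantity and reports the standard deviation of their mean alongside the result. The readings below are hypothetical values chosen only to show the arithmetic.

```python
import numpy as np

# Repeated readings of the same quantity (hypothetical values, in millimetres).
readings = np.array([10.03, 10.01, 9.98, 10.02, 10.00, 9.99, 10.01, 10.02])

n = readings.size
mean = readings.mean()
s = readings.std(ddof=1)    # sample standard deviation of the readings
u = s / np.sqrt(n)          # standard uncertainty of the mean (Type A evaluation)

print(f"result = {mean:.3f} mm, standard uncertainty u = {u:.3f} mm")
# A complete statement pairs the value with its uncertainty,
# e.g. (10.008 ± 0.006) mm with coverage factor k = 1.
```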
Uncertainty quantification (UQ) is the science of quantitative characterization and estimation of uncertainties in both computational and real world applications. It tries to determine how likely certain outcomes are if some aspects of the system are not exactly known. An example would be to predict the acceleration of a human body in a head-on crash with another car: even if the speed was exactly known, small differences in the manufacturing of individual cars, how tightly every bolt has been tightened, and so on will lead to different results that can only be predicted in a statistical sense.
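A minimal forward-UQ sketch in the spirit of that example: propagate assumed scatter in the inputs through a stand-in model by Monte Carlo sampling and summarize the resulting output distribution. The function `peak_deceleration` and every number here are invented for illustration, not a real crash model.

```python
import numpy as np

rng = np.random.default_rng(1)

def peak_deceleration(speed, stiffness, crush_length):
    """Hypothetical stand-in for a crash simulation: a smooth function of
    uncertain inputs, not a physically validated vehicle model."""
    return speed**2 / (2.0 * crush_length) * stiffness

N = 50_000
speed = 15.0                              # m/s, assumed exactly known
stiffness = rng.normal(1.0, 0.05, N)      # dimensionless scatter from manufacturing
crush_length = rng.normal(0.60, 0.03, N)  # m, varies from car to car

a = peak_deceleration(speed, stiffness, crush_length)
print(f"mean ≈ {a.mean():.1f} m/s², std ≈ {a.std():.1f} m/s², "
      f"95% interval ≈ [{np.percentile(a, 2.5):.1f}, {np.percentile(a, 97.5):.1f}]")
```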
In electronics, a two-port network (a kind of four-terminal network or quadripole) is an electrical network (i.e. a circuit) or device with two pairs of terminals to connect to external circuits. Two terminals constitute a port if the currents applied to them satisfy the essential requirement known as the port condition: the current entering one terminal must equal the current emerging from the other terminal on the same port. The ports constitute interfaces where the network connects to other networks, the points where signals are applied or outputs are taken.
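Two-port behaviour is commonly summarized by parameter matrices; with ABCD (transmission) parameters, cascading networks reduces to multiplying their matrices in order. The sketch below builds a hypothetical L-section from a series impedance and a shunt admittance and computes the input impedance seen at port 1 for a given load at port 2; the component values are arbitrary.

```python
import numpy as np

# ABCD (transmission) matrices of two elementary two-ports.
# They relate (V1, I1) at port 1 to (V2, I2) at port 2.
def series_impedance(Z):
    return np.array([[1.0, Z],
                     [0.0, 1.0]])

def shunt_admittance(Y):
    return np.array([[1.0, 0.0],
                     [Y,   1.0]])

# Cascading two-ports corresponds to multiplying their ABCD matrices in order:
# here a 50 Ω series resistor followed by a 0.02 S shunt conductance.
abcd = series_impedance(50.0) @ shunt_admittance(0.02)

# Input impedance with port 2 terminated in a load ZL: Zin = (A*ZL + B) / (C*ZL + D)
A, B = abcd[0]
C, D = abcd[1]
ZL = 100.0
Zin = (A * ZL + B) / (C * ZL + D)
print(f"Zin ≈ {Zin:.1f} Ω")   # ≈ 83.3 Ω (100 Ω load ∥ 50 Ω shunt, plus 50 Ω in series)
```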
Foundations of geometry is the study of geometries as axiomatic systems. There are several sets of axioms which give rise to Euclidean geometry or to non-Euclidean geometries. These axiom systems are fundamental to the study of geometry and of historical importance, but there are a great many modern geometries that are not Euclidean which can be studied from this viewpoint. The term axiomatic geometry can be applied to any geometry that is developed from an axiom system, but is often used to mean Euclidean geometry studied from this point of view.
In mathematics, Thurston's geometrization conjecture states that each of certain three-dimensional topological spaces has a unique geometric structure that can be associated with it. It is an analogue of the uniformization theorem for two-dimensional surfaces, which states that every simply connected Riemann surface can be given one of three geometries (Euclidean, spherical, or hyperbolic). In three dimensions, it is not always possible to assign a single geometry to a whole topological space.
In statistics, propagation of uncertainty (or propagation of error) is the effect of variables' uncertainties (or errors, more specifically random errors) on the uncertainty of a function based on them. When the variables are the values of experimental measurements they have uncertainties due to measurement limitations (e.g., instrument precision) which propagate due to the combination of variables in the function. The uncertainty u can be expressed in a number of ways. It may be defined by the absolute error Δx.
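For uncorrelated inputs, the usual first-order rule combines the input standard uncertainties weighted by partial derivatives, u_f^2 = (∂f/∂x)^2 u_x^2 + (∂f/∂y)^2 u_y^2 + .... The sketch below applies it to a hypothetical measurement R = V/I (the numbers are invented) and cross-checks the linearized result with Monte Carlo sampling.

```python
import numpy as np

# First-order propagation for uncorrelated inputs:
#   u_R^2 = (dR/dV)^2 u_V^2 + (dR/dI)^2 u_I^2
# Hypothetical readings: a resistance computed from a voltage and a current.
V, u_V = 12.0, 0.1    # volts
I, u_I = 2.0, 0.05    # amperes

R = V / I
dR_dV = 1.0 / I
dR_dI = -V / I**2
u_R = np.sqrt((dR_dV * u_V)**2 + (dR_dI * u_I)**2)
print(f"R = {R:.2f} Ω ± {u_R:.2f} Ω (first-order)")

# Monte Carlo cross-check of the linearized formula.
rng = np.random.default_rng(2)
Rs = rng.normal(V, u_V, 100_000) / rng.normal(I, u_I, 100_000)
print(f"Monte Carlo: mean {Rs.mean():.2f} Ω, std {Rs.std():.2f} Ω")
```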
Uncertainty refers to epistemic situations involving imperfect or unknown information. It applies to predictions of future events, to physical measurements that are already made, or to the unknown. Uncertainty arises in partially observable or stochastic environments, as well as due to ignorance, indolence, or both. It arises in any number of fields, including insurance, philosophy, physics, statistics, economics, finance, medicine, psychology, sociology, engineering, metrology, meteorology, ecology and information science.