Reconstruction Amendments: The Reconstruction Amendments, or the Civil War Amendments, are the Thirteenth, Fourteenth, and Fifteenth Amendments to the United States Constitution, adopted between 1865 and 1870. The amendments were part of the implementation of the Reconstruction of the American South that followed the American Civil War. The Thirteenth Amendment (proposed in 1864 and ratified in 1865) abolished slavery and involuntary servitude, except as punishment for a crime of which the person has been duly convicted. The Fourteenth Amendment (proposed in 1866 and ratified in 1868) addresses citizenship rights and equal protection of the laws for all persons.
Probability interpretations: The word probability has been used in a variety of ways since it was first applied to the mathematical study of games of chance. Does probability measure the real, physical tendency of something to occur, or is it a measure of how strongly one believes it will occur, or does it draw on both of these elements? In answering such questions, mathematicians interpret the probability values of probability theory. There are two broad categories of probability interpretations, which can be called "physical" and "evidential" probabilities.
Measurement: Measurement is the quantification of attributes of an object or event, which can be used to compare with other objects or events. In other words, measurement is a process of determining how large or small a physical quantity is as compared to a basic reference quantity of the same kind. The scope and application of measurement are dependent on the context and discipline. In natural sciences and engineering, measurements do not apply to nominal properties of objects or events, which is consistent with the guidelines of the International vocabulary of metrology published by the International Bureau of Weights and Measures.
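As a rough illustration of the comparison idea above, the following sketch expresses a length as the number of times a reference quantity of the same kind fits into it; the sensor counts and calibration figure are hypothetical, not taken from the text.

```python
# Toy sketch: measurement as comparison with a reference quantity of the same
# kind. Raw readings are in arbitrary sensor counts (hypothetical values);
# the reference is the count the sensor returns for exactly one metre.
counts_per_metre = 4096          # counts corresponding to the 1 m reference quantity
rod_counts = 13107               # counts returned for the rod being measured

length_in_metres = rod_counts / counts_per_metre   # how many references fit the rod
print(f"rod length is about {length_in_metres:.3f} m")   # about 3.200 m
```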
Conditional probability: In probability theory, conditional probability is a measure of the probability of an event occurring, given that another event (by assumption, presumption, assertion or evidence) has already occurred. This approach relies on the event of interest, A, occurring in some relationship with another event, B; in that case, A can be analyzed by a conditional probability with respect to B. If the event of interest is A and the event B is known or assumed to have occurred, "the conditional probability of A given B", or "the probability of A under the condition B", is usually written as P(A|B) or occasionally P_B(A).
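The passage above does not spell out a formula, but one standard way to make the notation concrete is the ratio definition P(A|B) = P(A ∩ B) / P(B); the dice events below are only an illustrative assumption, not something the text specifies.

```python
from fractions import Fraction

# Sample space: all ordered rolls of two fair six-sided dice.
omega = [(d1, d2) for d1 in range(1, 7) for d2 in range(1, 7)]

def prob(event):
    # Probability of an event (a subset of omega) under the uniform measure.
    return Fraction(len(event), len(omega))

# A: the two dice sum to 8; B: the first die shows 3.
A = {o for o in omega if o[0] + o[1] == 8}
B = {o for o in omega if o[0] == 3}

# Conditional probability via the ratio definition P(A|B) = P(A and B) / P(B).
p_A_given_B = prob(A & B) / prob(B)
print(p_A_given_B)   # 1/6: given that the first die is 3, only (3, 5) gives a sum of 8
```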
Measurement uncertainty: In metrology, measurement uncertainty is the expression of the statistical dispersion of the values attributed to a measured quantity. All measurements are subject to uncertainty, and a measurement result is complete only when it is accompanied by a statement of the associated uncertainty, such as the standard deviation. By international agreement, this uncertainty has a probabilistic basis and reflects incomplete knowledge of the quantity value. It is a non-negative parameter.
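As a minimal sketch, assuming independent repeated readings of the same quantity (the numbers below are made up), one common choice is to report the mean together with the standard deviation of the mean as the standard uncertainty:

```python
import statistics

# Hypothetical repeated readings of one quantity (a length in millimetres).
readings = [10.03, 10.01, 10.04, 9.99, 10.02, 10.00]

mean = statistics.mean(readings)        # best estimate of the quantity value
s = statistics.stdev(readings)          # sample standard deviation of the readings
u = s / len(readings) ** 0.5            # standard uncertainty of the mean (non-negative)

print(f"{mean:.3f} mm with standard uncertainty {u:.3f} mm")
```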
Frame bundle: In mathematics, a frame bundle is a principal fiber bundle F(E) associated to any vector bundle E. The fiber of F(E) over a point x is the set of all ordered bases, or frames, for E_x. The general linear group acts naturally on F(E) via a change of basis, giving the frame bundle the structure of a principal GL(k, R)-bundle (where k is the rank of E). The frame bundle of a smooth manifold is the one associated to its tangent bundle. For this reason it is sometimes called the tangent frame bundle.
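For readers who want the definitions behind the prose, the following is one standard way to write the fiber and the change-of-basis action (the notation is chosen here, not taken from the text):

\[
F(E) \;=\; \bigsqcup_{x} F_x(E), \qquad
F_x(E) \;=\; \{\text{ordered bases } (e_1,\dots,e_k) \text{ of } E_x\},
\]
\[
(e_1,\dots,e_k)\cdot g \;=\; \Bigl(\textstyle\sum_i e_i\,g_{i1},\;\dots,\;\sum_i e_i\,g_{ik}\Bigr),
\qquad g=(g_{ij})\in GL(k,\mathbb{R}).
\]

This right action is free and transitive on each fiber, which is what gives F(E) the structure of a principal GL(k, R)-bundle.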
Associative array: In computer science, an associative array, map, symbol table, or dictionary is an abstract data type that stores a collection of (key, value) pairs, such that each possible key appears at most once in the collection. In mathematical terms, an associative array is a function with finite domain. It supports 'lookup', 'remove', and 'insert' operations. The dictionary problem is the classic problem of designing efficient data structures that implement associative arrays.
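A short sketch of the three operations, using Python's built-in dict (one concrete hash-table implementation of the abstract type; the phone-book data is made up):

```python
phone_book = {}                      # empty collection of (key, value) pairs

# insert: bind a key to a value; reinserting a key overwrites its old value,
# so each possible key appears at most once in the collection.
phone_book["alice"] = "555-0100"
phone_book["bob"] = "555-0199"

# lookup: retrieve the value bound to a key (get returns None if the key is absent).
print(phone_book.get("alice"))       # 555-0100

# remove: delete a key and its value from the collection.
del phone_book["bob"]
print("bob" in phone_book)           # False
```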
Time complexity: In computer science, the time complexity is the computational complexity that describes the amount of computer time it takes to run an algorithm. Time complexity is commonly estimated by counting the number of elementary operations performed by the algorithm, supposing that each elementary operation takes a fixed amount of time to perform. Thus, the amount of time taken and the number of elementary operations performed by the algorithm are taken to be related by a constant factor.
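To illustrate "counting elementary operations", the sketch below counts comparisons in a linear scan; in the worst case the count equals the input size n, so the running time is bounded by a constant factor times n (the function and data are invented for illustration):

```python
def count_comparisons(values, target):
    # Count one elementary operation (a comparison) per element examined.
    comparisons = 0
    for v in values:
        comparisons += 1
        if v == target:
            break
    return comparisons

for n in (10, 100, 1000):
    data = list(range(n))
    # Worst case: the target is absent, so every element is compared -> n comparisons.
    print(n, count_comparisons(data, -1))
```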