One-dimensional space
In physics and mathematics, a sequence of n numbers can specify a location in n-dimensional space. When n = 1, the set of all such locations is called a one-dimensional space. An example of a one-dimensional space is the number line, where the position of each point can be described by a single number. In algebraic geometry there are several structures that are technically one-dimensional spaces but are referred to in other terms. A field k is a one-dimensional vector space over itself.
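The final claim can be checked directly: every element of a field is a scalar multiple of the multiplicative identity. The short sketch below (the notation dim_k is the usual vector-space dimension; it is not in the original text) spells this out.

```latex
% Sketch: why a field k is a one-dimensional vector space over itself.
% Every x in k can be written as a scalar multiple of the unit element 1,
% so the single element 1 spans k; it is also linearly independent,
% since a * 1 = 0 forces a = 0. Hence {1} is a basis and the dimension is 1.
\[
  x = x \cdot 1 \ \text{ for all } x \in k
  \quad\Longrightarrow\quad
  \{1\} \text{ is a basis of } k \text{ over } k
  \quad\Longrightarrow\quad
  \dim_k k = 1 .
\]
```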
Proton decay
In particle physics, proton decay is a hypothetical form of particle decay in which the proton decays into lighter subatomic particles, such as a neutral pion and a positron. The proton decay hypothesis was first formulated by Andrei Sakharov in 1967. Despite significant experimental effort, proton decay has never been observed. If it does decay via a positron, the proton's half-life is constrained to be at least 1.67×10^34 years.
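The positron mode and the experimental bound quoted above can be written out explicitly; the conservation-law remarks in the comment are standard background and are not taken from the text.

```latex
% The positron decay mode described above: a proton decaying into a
% positron and a neutral pion. Electric charge is conserved (+1 -> +1 + 0),
% but baryon number and lepton number are not, which is why this process
% requires physics beyond the Standard Model.
\[
  p^{+} \;\longrightarrow\; e^{+} + \pi^{0},
  \qquad
  \tau_{p} \gtrsim 1.67 \times 10^{34}\ \text{years (experimental lower bound)}.
\]
```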
Particle detector
In experimental and applied particle physics, nuclear physics, and nuclear engineering, a particle detector, also known as a radiation detector, is a device used to detect, track, and/or identify ionizing particles, such as those produced by nuclear decay, cosmic radiation, or reactions in a particle accelerator. Beyond merely registering the presence of a particle, detectors can measure its energy and other attributes such as momentum, spin, charge, and particle type.
68–95–99.7 rule
In statistics, the 68–95–99.7 rule, also known as the empirical rule, is a shorthand used to remember the percentage of values that lie within an interval estimate in a normal distribution: 68%, 95%, and 99.7% of the values lie within one, two, and three standard deviations of the mean, respectively. In mathematical notation, these facts can be expressed as follows, where Pr() is the probability function, X is an observation from a normally distributed random variable, μ (mu) is the mean of the distribution, and σ (sigma) is its standard deviation:

Pr(μ - 1σ ≤ X ≤ μ + 1σ) ≈ 68.27%
Pr(μ - 2σ ≤ X ≤ μ + 2σ) ≈ 95.45%
Pr(μ - 3σ ≤ X ≤ μ + 3σ) ≈ 99.73%

The usefulness of this heuristic especially depends on the question under consideration.
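These percentages can be checked numerically. The minimal sketch below uses only the Python standard library and the standard identity that, for a normal distribution, the probability of falling within k standard deviations of the mean equals erf(k/√2); that identity is textbook material, not something stated in the text above.

```python
import math

def prob_within_k_sigma(k: float) -> float:
    """Probability that a normal observation falls within k standard
    deviations of the mean: Pr(mu - k*sigma <= X <= mu + k*sigma)."""
    # For a normal distribution this equals erf(k / sqrt(2)).
    return math.erf(k / math.sqrt(2))

for k in (1, 2, 3):
    print(f"within {k} sigma: {prob_within_k_sigma(k):.4%}")

# Expected output (approximately):
#   within 1 sigma: 68.2689%
#   within 2 sigma: 95.4500%
#   within 3 sigma: 99.7300%
```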
Sampling (statistics)
In statistics, quality assurance, and survey methodology, sampling is the selection of a subset or a statistical sample (termed sample for short) of individuals from within a statistical population to estimate characteristics of the whole population. Statisticians attempt to collect samples that are representative of the population. Sampling has lower costs and faster data collection compared to recording data from the entire population, and thus it can provide insights in cases where it is infeasible to measure an entire population.
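As a minimal illustration, a simple random sample can be drawn with the Python standard library and used to estimate a population characteristic; the population values below are made up for the example and are not from the text.

```python
import random
import statistics

random.seed(0)

# Hypothetical population: 10,000 made-up measurements.
population = [random.gauss(mu=50, sigma=10) for _ in range(10_000)]

# Draw a simple random sample of 200 individuals without replacement
# and use it to estimate a characteristic of the whole population.
sample = random.sample(population, k=200)

print("population mean:", round(statistics.mean(population), 2))
print("sample estimate:", round(statistics.mean(sample), 2))
```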
Physics beyond the Standard Model
Physics beyond the Standard Model (BSM) refers to the theoretical developments needed to explain the deficiencies of the Standard Model, such as the inability to explain the fundamental parameters of the Standard Model, the strong CP problem, neutrino oscillations, matter–antimatter asymmetry, and the nature of dark matter and dark energy. Another problem lies within the mathematical framework of the Standard Model itself: it is inconsistent with that of general relativity, and one or both theories break down under certain conditions, such as at spacetime singularities like the Big Bang and the centers of black holes.
M-theory
M-theory is a theory in physics that unifies all consistent versions of superstring theory. Edward Witten first conjectured the existence of such a theory at a string theory conference at the University of Southern California in 1995. Witten's announcement initiated a flurry of research activity known as the second superstring revolution. Prior to Witten's announcement, string theorists had identified five versions of superstring theory.
Parameter
A parameter, generally, is any characteristic that can help in defining or classifying a particular system (meaning an event, project, object, situation, etc.). That is, a parameter is an element of a system that is useful, or critical, when identifying the system, or when evaluating its performance, status, condition, etc. Parameter has more specific meanings within various disciplines, including mathematics, computer programming, engineering, statistics, logic, linguistics, and electronic musical composition.
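In the computer-programming sense mentioned above, a parameter is a named input in a function definition, distinct from the argument values supplied at call time. The short sketch below uses made-up names purely for illustration.

```python
# 'principal', 'rate', and 'years' are parameters of the function
# definition; the values 1000.0, 0.05, and 10 passed at the call site
# are the arguments bound to those parameters.
def compound_growth(principal: float, rate: float, years: int) -> float:
    """Return the value of `principal` after `years` of growth at `rate`."""
    return principal * (1 + rate) ** years

print(compound_growth(1000.0, 0.05, 10))  # -> about 1628.89
```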
Stratified sampling
In statistics, stratified sampling is a method of sampling from a population which can be partitioned into subpopulations. In statistical surveys, when subpopulations within an overall population vary, it could be advantageous to sample each subpopulation (stratum) independently. Stratification is the process of dividing members of the population into homogeneous subgroups before sampling. The strata should define a partition of the population.
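A minimal sketch of stratified sampling with proportional allocation is shown below; the stratum names and sizes are made up for illustration, and proportional allocation is only one common way to split the sample across strata.

```python
import random

random.seed(1)

# Hypothetical population, partitioned into strata (subpopulations).
strata = {
    "urban":    [("urban", i) for i in range(6000)],
    "suburban": [("suburban", i) for i in range(3000)],
    "rural":    [("rural", i) for i in range(1000)],
}

total = sum(len(members) for members in strata.values())
sample_size = 500

# Sample each stratum independently, drawing from it a number of units
# proportional to its share of the population.
sample = []
for name, members in strata.items():
    k = round(sample_size * len(members) / total)
    sample.extend(random.sample(members, k))

print(len(sample))                                 # 500 units in total
print(sum(1 for s in sample if s[0] == "urban"))   # 300 drawn from the urban stratum
```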
Margin of error
The margin of error is a statistic expressing the amount of random sampling error in the results of a survey. The larger the margin of error, the less confidence one should have that a poll result would reflect the result of a census of the entire population. The margin of error will be positive whenever a population is incompletely sampled and the outcome measure has positive variance, which is to say, whenever the measure varies. The term margin of error is often used in non-survey contexts to indicate observational error in reporting measured quantities.
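As an illustration under common assumptions (simple random sampling, an estimated proportion, 95% confidence with z ≈ 1.96), the margin of error for a survey percentage can be approximated as z·√(p(1−p)/n). The poll numbers below are made up; this is a sketch of the usual textbook formula, not a statement from the text above.

```python
import math

def margin_of_error(p: float, n: int, z: float = 1.96) -> float:
    """Approximate margin of error for an estimated proportion p from a
    simple random sample of size n, at the confidence level implied by z
    (z = 1.96 corresponds to roughly 95% confidence)."""
    return z * math.sqrt(p * (1 - p) / n)

# Hypothetical poll: 52% support among 1,000 respondents.
moe = margin_of_error(p=0.52, n=1000)
print(f"margin of error: +/- {moe:.1%}")   # about +/- 3.1 percentage points
```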