Dark energy
In physical cosmology and astronomy, dark energy is an unknown form of energy that affects the universe on the largest scales. The first observational evidence for its existence came from measurements of supernovas, which showed that the universe does not expand at a constant rate; rather, the universe's expansion is accelerating. Understanding the universe's evolution requires knowledge of its starting conditions and composition. Before these observations, scientists thought that all forms of matter and energy in the universe would only cause the expansion to slow down over time.
Dimensionless physical constant
In physics, a dimensionless physical constant is a physical constant that is dimensionless, i.e. a pure number with no units attached and a numerical value that is independent of whatever system of units may be used. In aerodynamics, for example, if one considers one particular airfoil, the Reynolds number value of the laminar–turbulent transition is one relevant dimensionless physical constant of the problem. However, its value is specific to that particular problem: for example, it depends on the airfoil being considered and also on the type of fluid in which it moves.
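As a minimal illustration of how such a dimensionless number is computed, the Python sketch below evaluates the Reynolds number Re = ρvL/μ for assumed, illustrative flow conditions. The transition threshold used here is a common flat-plate estimate, not a universal constant; as noted above, the actual transition value depends on the specific airfoil and fluid.

```python
def reynolds_number(density, velocity, length, viscosity):
    """Re = rho * v * L / mu, a dimensionless ratio of inertial to viscous forces."""
    return density * velocity * length / viscosity

# Illustrative values: air at sea level flowing over a 1 m chord at 20 m/s.
re = reynolds_number(density=1.225, velocity=20.0, length=1.0, viscosity=1.81e-5)

# Assumed flat-plate transition estimate; the real threshold is problem-specific.
RE_CRITICAL = 5e5
print(f"Re = {re:.3e}: {'turbulent' if re > RE_CRITICAL else 'laminar'} regime likely")
```

Note that Re comes out the same whatever unit system is used for the inputs, as long as they are consistent; that unit independence is what makes it a dimensionless constant of the problem.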
Anthropic principle
The anthropic principle, also known as the "observation selection effect", is the hypothesis, first proposed in 1957 by Robert Dicke, that the range of possible observations that could be made about the universe is limited by the fact that observations could only happen in a universe capable of developing intelligent life in the first place. Proponents of the anthropic principle argue that it explains why this universe has the age and the fundamental physical constants necessary to accommodate conscious life, since if either had been different, no one would have been around to make observations.
Sample mean and covariance
The sample mean (sample average) or empirical mean (empirical average), and the sample covariance or empirical covariance, are statistics computed from a sample of data on one or more random variables. The sample mean is the average value (or mean value) of a sample of numbers taken from a larger population of numbers, where "population" indicates not the number of people but the entirety of relevant data, whether collected or not. A sample of 40 companies' sales from the Fortune 500 might be used for convenience instead of examining the population, the sales of all 500 companies.
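A minimal Python sketch of both estimators, using NumPy and a hypothetical synthetic sample of 40 observations of two variables (the data and variable names are illustrative, not from the text): the sample mean averages over observations, and the sample covariance uses the unbiased 1/(N-1) normalization.

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical sample: 40 observations of 2 variables (say, sales and profit).
sample = rng.normal(loc=[100.0, 10.0], scale=[20.0, 3.0], size=(40, 2))

# Sample mean: average over the observations, one entry per variable.
mean = sample.mean(axis=0)

# Sample covariance with the unbiased 1/(N-1) normalization.
centered = sample - mean
cov = centered.T @ centered / (len(sample) - 1)

assert np.allclose(cov, np.cov(sample, rowvar=False))  # matches NumPy's estimator
print("sample mean:", mean)
print("sample covariance:\n", cov)
```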
Sample size determination
Sample size determination is the act of choosing the number of observations or replicates to include in a statistical sample. The sample size is an important feature of any empirical study in which the goal is to make inferences about a population from a sample. In practice, the sample size used in a study is usually determined based on the cost, time, or convenience of collecting the data, and the need for it to offer sufficient statistical power.
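As one concrete example of such a determination (just one of several possible criteria, alongside power calculations), the sketch below applies the textbook formula for estimating a proportion to within a margin of error E at a given confidence level: n = z² p(1-p) / E². The function name and default values are illustrative assumptions.

```python
import math

def sample_size_for_proportion(margin_of_error, confidence_z=1.96, p=0.5):
    """Smallest n giving the requested margin of error for a proportion.

    Uses the textbook formula n = z^2 * p * (1 - p) / E^2; p = 0.5 is the
    conservative (worst-case) choice when the true proportion is unknown.
    """
    n = confidence_z**2 * p * (1 - p) / margin_of_error**2
    return math.ceil(n)

# About 1068 respondents for a +/-3% margin at 95% confidence (z = 1.96).
print(sample_size_for_proportion(0.03))
```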
B-factory
In particle physics, a B-factory, or sometimes a beauty factory, is a particle collider experiment designed to produce and detect a large number of B mesons so that their properties and behavior can be measured with small statistical uncertainty. Tau leptons and D mesons are also copiously produced at B-factories. A sort of "prototype" or "precursor" B-factory was the HERA-B experiment at DESY, which was planned to study B-meson physics in the 1990s and 2000s, before the actual B-factories were constructed and became operational.
Exponential decay
A quantity is subject to exponential decay if it decreases at a rate proportional to its current value. Symbolically, this process can be expressed by the following differential equation, where N is the quantity and λ (lambda) is a positive rate called the exponential decay constant, disintegration constant, rate constant, or transformation constant:

\frac{dN}{dt} = -\lambda N

The solution to this equation (see derivation below) is:

N(t) = N_0 e^{-\lambda t}

where N(t) is the quantity at time t, and N_0 = N(0) is the initial quantity, that is, the quantity at time t = 0.
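A minimal Python sketch of the closed-form solution above, using an illustrative decay constant; it also checks the standard half-life relation t½ = ln(2)/λ, which follows from setting N(t) = N₀/2.

```python
import math

def decayed_quantity(n0, decay_constant, t):
    """N(t) = N0 * exp(-lambda * t), the solution of dN/dt = -lambda * N."""
    return n0 * math.exp(-decay_constant * t)

lam = 0.1                      # illustrative decay constant (per unit time)
half_life = math.log(2) / lam  # time for the quantity to halve

# After one half-life the quantity is half the initial amount.
assert math.isclose(decayed_quantity(100.0, lam, half_life), 50.0)
print(f"half-life = {half_life:.3f} time units")
```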
Central limit theorem
In probability theory, the central limit theorem (CLT) establishes that, in many situations, for independent and identically distributed random variables, the sampling distribution of the standardized sample mean tends towards the standard normal distribution even if the original variables themselves are not normally distributed. The theorem is a key concept in probability theory because it implies that probabilistic and statistical methods that work for normal distributions can be applicable to many problems involving other types of distributions.
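A quick simulation sketch of the theorem in Python (sample size, trial count, and the choice of an exponential source distribution are all illustrative assumptions): standardized means of draws from a clearly non-normal distribution should behave approximately like a standard normal variable.

```python
import numpy as np

rng = np.random.default_rng(1)
n, trials = 50, 100_000

# Draw from a decidedly non-normal distribution: exponential with mean 1, sd 1.
samples = rng.exponential(scale=1.0, size=(trials, n))

# Standardize each sample mean: (mean - mu) / (sigma / sqrt(n)).
z = (samples.mean(axis=1) - 1.0) / (1.0 / np.sqrt(n))

# If the CLT holds, z is approximately standard normal.
print(f"mean ~ 0: {z.mean():+.3f},  sd ~ 1: {z.std():.3f}")
print(f"P(|Z| < 1.96) ~ 0.95: {(np.abs(z) < 1.96).mean():.3f}")
```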
Quantum field theory
In theoretical physics, quantum field theory (QFT) is a theoretical framework that combines classical field theory, special relativity, and quantum mechanics. QFT is used in particle physics to construct physical models of subatomic particles and in condensed matter physics to construct models of quasiparticles. QFT treats particles as excited states (also called quanta) of their underlying quantum fields, which are more fundamental than the particles.
Proton–proton chain
The proton–proton chain, also commonly referred to as the p–p chain, is one of two known sets of nuclear fusion reactions by which stars convert hydrogen to helium. It dominates in stars with masses less than or equal to that of the Sun, whereas the CNO cycle, the other known reaction, is suggested by theoretical models to dominate in stars with masses greater than about 1.3 times that of the Sun. In general, proton–proton fusion can occur only if the kinetic energy (i.e. temperature) of the protons is high enough to overcome their mutual electrostatic repulsion.