Effects of climate change on agriculture
The effects of climate change on agriculture can include lower crop yields and reduced nutritional quality due to drought, heat waves and flooding, as well as increases in pests and plant diseases. Climate change impacts are making it harder for agricultural activities to meet human needs. The effects are unevenly distributed across the world and are caused by changes in temperature, precipitation and atmospheric carbon dioxide levels resulting from global climate change. In 2019, millions were already suffering from food insecurity due to climate change.
Effects of climate change on the water cycle
The effects of climate change on the water cycle are profound and have been described as an intensification or strengthening of the water cycle (also called the hydrologic cycle). This effect has been observed since at least 1980. One example is the intensification of heavy precipitation events. This has important negative effects on the availability of freshwater resources, as well as on other water reservoirs such as oceans, ice sheets, the atmosphere and the land surface.
Precipitation
In meteorology, precipitation is any product of the condensation of atmospheric water vapor that falls from clouds due to gravitational pull. The main forms of precipitation include drizzle, rain, sleet, snow, ice pellets, graupel and hail. Precipitation occurs when a portion of the atmosphere becomes saturated with water vapor (reaching 100% relative humidity), so that the water condenses and "precipitates" or falls. Thus, fog and mist are not precipitation but colloids, because the water vapor does not condense sufficiently to precipitate.
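For reference, the saturation criterion above is usually stated via relative humidity, the ratio of the actual water vapor pressure to the saturation vapor pressure at the same temperature (a textbook definition, added here only as illustration):

```latex
% Relative humidity: e is the partial pressure of water vapor,
% e_s(T) the saturation vapor pressure at air temperature T.
\mathrm{RH} = \frac{e}{e_s(T)} \times 100\%
% Saturation (100% relative humidity) is the condition e = e_s(T),
% at which condensation, and hence precipitation, becomes possible.
```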
Global temperature record
The global temperature record shows the fluctuations of the temperature of the atmosphere and the oceans through various spans of time. There are numerous estimates of temperatures since the end of the Pleistocene glaciation, particularly during the current Holocene epoch. Some temperature information is available through geologic evidence, going back millions of years. More recently, information from ice cores covers the period from 800,000 years before the present time until now.
Joint probability distribution
Given two random variables that are defined on the same probability space, the joint probability distribution is the corresponding probability distribution on all possible pairs of outputs. The joint distribution can just as well be considered for any given number of random variables. The joint distribution encodes the marginal distributions, i.e. the distributions of each of the individual random variables. It also encodes the conditional probability distributions, which deal with how the outputs of one random variable are distributed when given information on the outputs of the other random variable(s).
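In the discrete case this encoding can be written out directly; a minimal sketch in standard notation (the pmf names are generic, not tied to any particular pair of variables):

```latex
% Joint pmf of discrete random variables X and Y:  p_{X,Y}(x, y).
% Marginal of X: sum the joint over all values of the other variable.
p_X(x) = \sum_y p_{X,Y}(x, y)
% Conditional distribution of Y given X = x (defined when p_X(x) > 0):
p_{Y \mid X}(y \mid x) = \frac{p_{X,Y}(x, y)}{p_X(x)}
```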
Probability theory
Probability theory or probability calculus is the branch of mathematics concerned with probability. Although there are several different probability interpretations, probability theory treats the concept in a rigorous mathematical manner by expressing it through a set of axioms. Typically these axioms formalise probability in terms of a probability space, which assigns a measure taking values between 0 and 1, termed the probability measure, to a set of outcomes called the sample space.
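The axioms in question are standardly Kolmogorov's; in sketch form, for a probability measure P on a sample space Ω with a σ-algebra of events:

```latex
% Non-negativity: every event has probability at least 0.
P(E) \ge 0 \quad \text{for all events } E \in \mathcal{F}
% Unit measure: the whole sample space has probability 1.
P(\Omega) = 1
% Countable additivity: for pairwise disjoint events E_1, E_2, \ldots
P\left( \bigcup_{i=1}^{\infty} E_i \right) = \sum_{i=1}^{\infty} P(E_i)
```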
Poisson distribution
In probability theory and statistics, the Poisson distribution is a discrete probability distribution that expresses the probability of a given number of events occurring in a fixed interval of time or space if these events occur with a known constant mean rate and independently of the time since the last event. It is named after French mathematician Siméon Denis Poisson (/ˈpwɑːsɒn/; French: [pwasɔ̃]). The Poisson distribution can also be used for the number of events in other specified interval types such as distance, area, or volume.
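Concretely, the probability mass function has the standard closed form (with λ denoting the constant mean rate):

```latex
% Probability of observing exactly k events in the interval:
P(X = k) = \frac{\lambda^k e^{-\lambda}}{k!}, \qquad k = 0, 1, 2, \ldots
% A distinctive property: the mean and variance both equal the rate.
\mathbb{E}[X] = \operatorname{Var}(X) = \lambda
```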
Maximum entropy probability distribution
In statistics and information theory, a maximum entropy probability distribution has entropy that is at least as great as that of all other members of a specified class of probability distributions. According to the principle of maximum entropy, if nothing is known about a distribution except that it belongs to a certain class (usually defined in terms of specified properties or measures), then the distribution with the largest entropy should be chosen as the least-informative default.
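The simplest worked instance of the principle: among all distributions over a fixed finite set of n outcomes, with no further constraints, the uniform distribution is the maximum entropy choice.

```latex
% Shannon entropy of a discrete distribution p = (p_1, \ldots, p_n):
H(p) = -\sum_{i=1}^{n} p_i \log p_i
% Subject only to p_i >= 0 and \sum_i p_i = 1, H is maximized by the
% uniform distribution p_i = 1/n, attaining the largest possible value:
H_{\max} = \log n
```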
Indecomposable distribution
In probability theory, an indecomposable distribution is a probability distribution that cannot be represented as the distribution of the sum of two or more non-constant independent random variables: Z ≠ X + Y. If it can be so expressed, it is decomposable: Z = X + Y. If, further, it can be expressed as the distribution of the sum of two or more independent identically distributed random variables, then it is divisible: Z = X1 + X2. The simplest examples are Bernoulli-distributed random variables: if X = 1 with probability p and X = 0 with probability 1 − p (for some 0 < p < 1), then the probability distribution of X is indecomposable.
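A sketch of why the Bernoulli case is indecomposable (the standard counting-of-values argument, written out here for completeness):

```latex
% Suppose X = Y + Z with Y, Z independent and non-constant. Pick values
% y_1 < y_2 of Y and z_1 < z_2 of Z, each occurring with positive
% probability. By independence all four sums occur with positive
% probability, and
y_1 + z_1 \;<\; y_1 + z_2 \;<\; y_2 + z_2
% so X would take at least three distinct values, contradicting that a
% Bernoulli variable takes only the two values 0 and 1.
```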
Prior probability
A prior probability distribution of an uncertain quantity, often simply called the prior, is its assumed probability distribution before some evidence is taken into account. For example, the prior could be the probability distribution representing the relative proportions of voters who will vote for a particular politician in a future election. The unknown quantity may be a parameter of the model or a latent variable rather than an observable variable.
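To make "before some evidence is taken into account" concrete, a commonly cited conjugate sketch (the Beta-binomial pairing is an illustration, not drawn from the text above): model the unknown support rate θ with a Beta prior; a poll then updates it in closed form.

```latex
% Prior belief about the support rate, before seeing any poll:
\theta \sim \mathrm{Beta}(\alpha, \beta)
% After observing k supporters among n polled voters (binomial likelihood),
% the posterior is again Beta, with the data folded into the parameters:
\theta \mid k \sim \mathrm{Beta}(\alpha + k,\; \beta + n - k)
```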