Hamming bound
In mathematics and computer science, in the field of coding theory, the Hamming bound is a limit on the parameters of an arbitrary block code. It is also known as the sphere-packing bound or the volume bound, from an interpretation in terms of packing balls in the Hamming metric into the space of all possible words. It gives an important limitation on the efficiency with which any error-correcting code can utilize the space in which its code words are embedded. A code that attains the Hamming bound is said to be a perfect code.
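Stated formally (in notation assumed here rather than given above: A_q(n,d) denotes the maximum possible number of codewords in a q-ary block code of length n and minimum distance d), the bound reads:

\[
A_q(n,d) \;\le\; \frac{q^n}{\sum_{k=0}^{t} \binom{n}{k} (q-1)^k},
\qquad t = \left\lfloor \frac{d-1}{2} \right\rfloor .
\]

The denominator counts the words in a Hamming ball of radius t, so equality means the balls around the codewords exactly tile the space; the Hamming codes and the two Golay codes are the classic nontrivial codes attaining it.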
Logistic regression
In statistics, the logistic model (or logit model) is a statistical model that models the probability of an event taking place by having the log-odds for the event be a linear combination of one or more independent variables. In regression analysis, logistic regression (or logit regression) estimates the parameters of a logistic model (the coefficients in the linear combination).
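Concretely (in notation assumed here: p the event probability, x_1, ..., x_k the independent variables, and beta_i the coefficients), the model is

\[
\log \frac{p}{1-p} \;=\; \beta_0 + \beta_1 x_1 + \cdots + \beta_k x_k ,
\qquad\text{equivalently}\qquad
p \;=\; \frac{1}{1 + e^{-(\beta_0 + \beta_1 x_1 + \cdots + \beta_k x_k)}} ,
\]

so fitting the model means estimating the beta_i, typically by maximum likelihood.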
Stable count distribution
In probability theory, the stable count distribution is the conjugate prior of a one-sided stable distribution. This distribution was discovered by Stephen Lihn (Chinese: 藺鴻圖) in his 2017 study of daily distributions of the S&P 500 and the VIX. The stable distribution family is also sometimes referred to as the Lévy alpha-stable distribution, after Paul Lévy, the first mathematician to have studied it. Of the three parameters defining the distribution, the stability parameter is most important.
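One standard way to write the density (a sketch in assumed notation, with L_alpha a standardized one-sided stable density and Gamma the gamma function) is

\[
\mathfrak{N}_\alpha(\nu) \;=\; \frac{\alpha}{\Gamma(1/\alpha)} \, \frac{1}{\nu} \, L_\alpha\!\left(\frac{1}{\nu}\right),
\qquad \nu > 0, \; 0 < \alpha < 1 .
\]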
Lempel–Ziv–Markov chain algorithm
The Lempel–Ziv–Markov chain algorithm (LZMA) is an algorithm used to perform lossless data compression. It has been developed by Igor Pavlov since either 1996 or 1998 and was first used in the 7z format of the 7-Zip archiver. The algorithm uses a dictionary compression scheme somewhat similar to the LZ77 algorithm published by Abraham Lempel and Jacob Ziv in 1977. It features a high compression ratio (generally higher than bzip2) and a variable compression-dictionary size (up to 4 GB), while still maintaining decompression speed similar to other commonly used compression algorithms.
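As a minimal sketch of the algorithm in use (assuming Python's standard-library lzma module, which provides an LZMA implementation in the .xz container rather than the 7z format mentioned above):

import lzma

# Compress a highly redundant byte string with LZMA; preset 9 trades
# compression speed for the best ratio.
data = b"the quick brown fox jumps over the lazy dog " * 1000
compressed = lzma.compress(data, preset=9)

# Decompression is comparatively fast, as noted above.
restored = lzma.decompress(compressed)
assert restored == data
print(f"{len(data)} bytes -> {len(compressed)} bytes")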
Market distortion
In neoclassical economics, a market distortion is any event in which a market reaches a clearing price for an item that is substantially different from the price the market would achieve under conditions of perfect competition, with state enforcement of legal contracts and of private property ownership. A distortion is "any departure from the ideal of perfect competition that therefore interferes with economic agents maximizing social welfare when they maximize their own".
Entropy rate
In the mathematical theory of probability, the entropy rate or source information rate of a stochastic process is, informally, the time density of the average information in a stochastic process. For stochastic processes with a countable index, the entropy rate H(X) is the limit of the joint entropy of n members of the process divided by n, as n tends to infinity:

\[
H(X) \;=\; \lim_{n \to \infty} \frac{1}{n} H(X_1, X_2, \dots, X_n)
\]

when the limit exists. An alternative, related quantity is

\[
H'(X) \;=\; \lim_{n \to \infty} H(X_n \mid X_{n-1}, X_{n-2}, \dots, X_1).
\]

For strongly stationary stochastic processes, H(X) = H'(X).
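As a simple check of the definition (an illustrative example, not from the text above): for an i.i.d. process the joint entropy is additive, so the rate reduces to the entropy of a single member:

\[
H(X) \;=\; \lim_{n \to \infty} \frac{1}{n} \sum_{i=1}^{n} H(X_i) \;=\; H(X_1).
\]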
Excess burden of taxation
In economics, the excess burden of taxation, also known as the deadweight cost or deadweight loss of taxation, is one of the economic losses that society suffers as the result of taxes or subsidies. Economic theory posits that distortions change the amount and type of economic behavior from that which would occur in a free market without the tax. Excess burdens can be measured using the average cost of funds or the marginal cost of funds (MCF). Excess burdens were first discussed by Adam Smith.