Structural analysis: Structural analysis is a branch of solid mechanics that uses simplified models of solids, such as bars, beams and shells, for engineering decision making. Its main objective is to determine the effect of loads on physical structures and their components. In contrast to the theory of elasticity, the models used in structural analysis are often differential equations in a single spatial variable. Structures subject to this type of analysis include all that must withstand loads, such as buildings, bridges, aircraft and ships.
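As an illustration of such a simplified one-variable model, the sketch below evaluates the Euler-Bernoulli closed-form midspan deflection of a simply supported beam under a uniform load; the beam dimensions, load and material constants are illustrative assumptions, not values from the text.

```python
# Midspan deflection of a simply supported beam under a uniform load,
# using the Euler-Bernoulli closed-form result delta = 5*w*L^4 / (384*E*I).
# All numerical values are illustrative assumptions.

E = 200e9          # Young's modulus of steel, Pa
I = 8.0e-6         # second moment of area of the cross-section, m^4
L = 6.0            # span length, m
w = 2_000.0        # uniformly distributed load, N/m

delta_max = 5 * w * L**4 / (384 * E * I)   # midspan deflection, m
print(f"midspan deflection: {delta_max * 1000:.2f} mm")
```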
ErgodicityIn mathematics, ergodicity expresses the idea that a point of a moving system, either a dynamical system or a stochastic process, will eventually visit all parts of the space that the system moves in, in a uniform and random sense. This implies that the average behavior of the system can be deduced from the trajectory of a "typical" point. Equivalently, a sufficiently large collection of random samples from a process can represent the average statistical properties of the entire process.
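A minimal sketch of this idea, assuming a small two-state Markov chain with an illustrative transition matrix: the long-run fraction of time one trajectory spends in each state (a time average) matches the chain's stationary distribution (an ensemble average).

```python
import numpy as np

# Two-state ergodic Markov chain with an illustrative transition matrix P.
rng = np.random.default_rng(0)
P = np.array([[0.9, 0.1],
              [0.4, 0.6]])

n_steps = 200_000
state = 0
visits = np.zeros(2)
for _ in range(n_steps):
    visits[state] += 1
    state = rng.choice(2, p=P[state])

time_average = visits / n_steps            # time average along one trajectory

# Stationary distribution: left eigenvector of P with eigenvalue 1.
eigvals, eigvecs = np.linalg.eig(P.T)
pi = np.real(eigvecs[:, np.argmax(np.real(eigvals))])
pi /= pi.sum()

print("time average    :", time_average)
print("stationary (pi) :", pi)
```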
Markov chain Monte Carlo: In statistics, Markov chain Monte Carlo (MCMC) methods comprise a class of algorithms for sampling from a probability distribution. By constructing a Markov chain that has the desired distribution as its equilibrium distribution, one can obtain a sample of the desired distribution by recording states from the chain. The more steps that are included, the more closely the distribution of the sample matches the actual desired distribution. Various algorithms exist for constructing chains, including the Metropolis–Hastings algorithm.
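A minimal random-walk Metropolis–Hastings sketch, assuming an unnormalized target density proportional to exp(-x²/2) (a standard normal); the step size and chain length are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(1)

def log_target(x):
    return -0.5 * x**2   # log of the unnormalized target density

n_samples, step = 50_000, 1.0
x = 0.0
samples = np.empty(n_samples)
for i in range(n_samples):
    proposal = x + step * rng.normal()
    # Accept with probability min(1, p(proposal) / p(x)).
    if np.log(rng.random()) < log_target(proposal) - log_target(x):
        x = proposal
    samples[i] = x   # record the current state of the chain

print("sample mean (should be near 0):", samples.mean())
print("sample std  (should be near 1):", samples.std())
```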
Market risk: Market risk is the risk of losses in positions arising from movements in market variables such as prices and volatility. There is no unique classification, as each classification may refer to different aspects of market risk. Nevertheless, the most commonly used types of market risk are equity risk, the risk that stock or stock-index prices (e.g. Euro Stoxx 50) or their implied volatility will change, and interest rate risk, the risk that interest rates (e.g. Libor, Euribor) or their implied volatility will change.
Type system: In computer programming, a type system is a logical system comprising a set of rules that assigns a property called a type (for example, integer, floating point, string) to every "term" (a word, phrase, or other set of symbols). Usually the terms are various constructs of a computer program, such as variables, expressions, functions, or modules. A type system dictates the operations that can be performed on a term. For variables, the type system determines the allowed values of that term.
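A small illustration, using Python as an assumed example language: every value carries a type, operations not defined for a term's type are rejected with a TypeError, and the optional annotations are hints that static checkers such as mypy can verify before the program runs.

```python
# Types constrain which operations are allowed on a term.

def total_length(words: list[str]) -> int:
    # The annotations say this function expects a list of strings
    # and produces an integer.
    return sum(len(w) for w in words)

print(total_length(["type", "system"]))   # 10

try:
    "3" + 4   # str + int is not an allowed operation on these types
except TypeError as exc:
    print("rejected by the type system:", exc)
```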
Financial risk modeling: Financial risk modeling is the use of formal mathematical and econometric techniques to measure, monitor and control the market risk, credit risk, and operational risk on a firm's balance sheet, on a bank's trading book, or on a fund manager's portfolio value; see Financial risk management. Risk modeling is one of many subtasks within the broader area of financial modeling. Risk modeling uses a variety of techniques, including value at risk (VaR), historical simulation (HS), and extreme value theory (EVT), in order to analyze a portfolio and forecast the likely losses that would be incurred for a variety of risks.
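A minimal historical-simulation VaR sketch: rank past portfolio returns and read off the loss exceeded only (1 − confidence) of the time. The return series below is simulated placeholder data, and the portfolio size and confidence level are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)
daily_returns = rng.normal(loc=0.0005, scale=0.01, size=1_000)  # assumed history

portfolio_value = 1_000_000.0   # illustrative portfolio size
confidence = 0.99

# The 1% quantile of the historical return distribution; the loss at that
# quantile, scaled by portfolio value, is the 1-day 99% VaR.
quantile = np.quantile(daily_returns, 1.0 - confidence)
var_99 = -quantile * portfolio_value

print(f"1-day 99% VaR (historical simulation): {var_99:,.0f}")
```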
Convex optimization: Convex optimization is a subfield of mathematical optimization that studies the problem of minimizing convex functions over convex sets (or, equivalently, maximizing concave functions over convex sets). Many classes of convex optimization problems admit polynomial-time algorithms, whereas mathematical optimization is in general NP-hard.
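A minimal sketch of a convex problem, assuming SciPy is available: minimize the convex objective ‖Ax − b‖² over the convex box 0 ≤ x ≤ 1. Because both the objective and the feasible set are convex, any local minimum found is the global one. The matrix A and vector b are small illustrative arrays.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(3)
A = rng.normal(size=(5, 3))
b = rng.normal(size=5)

def objective(x):
    r = A @ x - b
    return r @ r          # convex quadratic objective ||Ax - b||^2

# Box constraints keep the feasible set convex; SciPy uses L-BFGS-B here.
result = minimize(objective, x0=np.zeros(3), bounds=[(0.0, 1.0)] * 3)
print("minimizer:", result.x)
print("minimum  :", result.fun)
```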
Maximum likelihood estimation: In statistics, maximum likelihood estimation (MLE) is a method of estimating the parameters of an assumed probability distribution, given some observed data. This is achieved by maximizing a likelihood function so that, under the assumed statistical model, the observed data is most probable. The point in the parameter space that maximizes the likelihood function is called the maximum likelihood estimate. The logic of maximum likelihood is both intuitive and flexible, and as such the method has become a dominant means of statistical inference.
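A minimal sketch, assuming a normal model N(μ, σ²) and simulated placeholder data: maximize the log-likelihood numerically and compare the result with the closed-form estimates (sample mean and sample standard deviation).

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(4)
data = rng.normal(loc=2.0, scale=1.5, size=500)   # simulated observations

def neg_log_likelihood(params):
    mu, log_sigma = params
    sigma = np.exp(log_sigma)    # parameterize sigma > 0 via its log
    # Negative log-likelihood up to an additive constant.
    return 0.5 * np.sum(((data - mu) / sigma) ** 2) + data.size * log_sigma

result = minimize(neg_log_likelihood, x0=np.array([0.0, 0.0]))
mu_hat, sigma_hat = result.x[0], np.exp(result.x[1])

print("numerical MLE :", mu_hat, sigma_hat)
print("closed form   :", data.mean(), data.std())   # MLE of sigma uses ddof=0
```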
Estimation theory: Estimation theory is a branch of statistics that deals with estimating the values of parameters based on measured empirical data that has a random component. The parameters describe an underlying physical setting in such a way that their value affects the distribution of the measured data. An estimator attempts to approximate the unknown parameters using the measurements.
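A classic textbook setup, sketched under illustrative assumptions: measurements x[n] = A + w[n], where A is an unknown constant and w[n] is zero-mean Gaussian noise. The sample mean is an unbiased estimator of A, and its error shrinks as more measurements are averaged.

```python
import numpy as np

rng = np.random.default_rng(5)
A_true, noise_std = 3.0, 2.0       # illustrative true parameter and noise level

for n in (10, 100, 10_000):
    x = A_true + noise_std * rng.normal(size=n)   # noisy measurements
    A_hat = x.mean()                              # estimator of the parameter
    print(f"n={n:>6}  estimate={A_hat:.3f}  error={A_hat - A_true:+.3f}")
```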
Loss function: In mathematical optimization and decision theory, a loss function or cost function (sometimes also called an error function) is a function that maps an event or values of one or more variables onto a real number intuitively representing some "cost" associated with the event. An optimization problem seeks to minimize a loss function. An objective function is either a loss function or its opposite (in specific domains, variously called a reward function, a profit function, a utility function, a fitness function, etc.), in which case it is to be maximized.
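A small sketch with illustrative data comparing two common loss functions: squared error, whose mean is minimized by the sample mean, and absolute error, minimized by the median.

```python
import numpy as np

data = np.array([1.0, 2.0, 2.5, 3.0, 10.0])   # illustrative observations

def squared_loss(prediction):
    return np.mean((data - prediction) ** 2)

def absolute_loss(prediction):
    return np.mean(np.abs(data - prediction))

# The mean minimizes average squared loss; the median minimizes average absolute loss.
print("squared loss at mean   :", squared_loss(data.mean()))
print("squared loss at median :", squared_loss(np.median(data)))
print("absolute loss at median:", absolute_loss(np.median(data)))
print("absolute loss at mean  :", absolute_loss(data.mean()))
```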