Markov model
In probability theory, a Markov model is a stochastic model used to model pseudo-randomly changing systems. It is assumed that future states depend only on the current state, not on the events that occurred before it (that is, it assumes the Markov property). Generally, this assumption enables reasoning and computation with the model that would otherwise be intractable. For this reason, in the fields of predictive modelling and probabilistic forecasting, it is desirable for a given model to exhibit the Markov property.
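The Markov property can be made concrete with a small simulation. The following is a minimal sketch of a two-state Markov chain; the states and transition probabilities are hypothetical, chosen only for illustration:

```python
import random

# Hypothetical two-state weather chain: the distribution of the next
# state depends only on the current state (the Markov property).
TRANSITIONS = {
    "sunny": [("sunny", 0.8), ("rainy", 0.2)],
    "rainy": [("sunny", 0.4), ("rainy", 0.6)],
}

def step(state, rng):
    """Sample the next state using only the current state."""
    r = rng.random()
    cumulative = 0.0
    for nxt, p in TRANSITIONS[state]:
        cumulative += p
        if r < cumulative:
            return nxt
    return nxt  # numerical safety net; probabilities sum to 1

def simulate(start, n, seed=0):
    """Generate a path of n transitions from the start state."""
    rng = random.Random(seed)
    path = [start]
    for _ in range(n):
        path.append(step(path[-1], rng))
    return path

print(simulate("sunny", 5, seed=1))
```

Because each step reads only the current state, long-run behaviour (e.g. the fraction of time spent in each state) can be computed from the transition table alone, which is what makes such models tractable.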
Process (computing)
In computing, a process is the instance of a computer program that is being executed by one or many threads. There are many different process models, some of which are lightweight, but almost all processes (even entire virtual machines) are rooted in an operating system (OS) process, which comprises the program code, assigned system resources, physical and logical access permissions, and data structures to initiate, control, and coordinate execution activity.
Efficient frontier
In modern portfolio theory, the efficient frontier (or portfolio frontier) is an investment portfolio which occupies the "efficient" parts of the risk–return spectrum. Formally, it is the set of portfolios which satisfy the condition that no other portfolio exists with a higher expected return but with the same standard deviation of return (i.e., the risk). The efficient frontier was first formulated by Harry Markowitz in 1952; see Markowitz model. A combination of assets, i.e. a portfolio, is referred to as "efficient" if it has the best possible expected return for its level of risk.
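The defining condition ("no other portfolio has a higher expected return at the same or lower risk") can be checked directly. The sketch below uses a hypothetical two-asset universe with made-up expected returns, standard deviations, and correlation, sweeps portfolio weights, and keeps only the non-dominated points:

```python
import math

# Hypothetical two-asset example; these numbers are illustrative only.
mu = (0.06, 0.12)      # expected returns of asset 0 and asset 1
sigma = (0.10, 0.25)   # standard deviations (risk)
rho = 0.2              # correlation between the two assets

def portfolio(w):
    """(std, expected return) for weight w in asset 0 and 1-w in asset 1."""
    m = w * mu[0] + (1 - w) * mu[1]
    var = (w * sigma[0]) ** 2 + ((1 - w) * sigma[1]) ** 2 \
          + 2 * w * (1 - w) * rho * sigma[0] * sigma[1]
    return math.sqrt(var), m

points = [portfolio(w / 100) for w in range(101)]

# A point is on the efficient frontier if no other portfolio offers a
# higher expected return at the same or lower standard deviation.
frontier = [p for p in points
            if not any(q[1] > p[1] and q[0] <= p[0] for q in points)]
```

In this sketch the frontier is the upper branch of the curve of feasible portfolios; the all-asset-1 portfolio (the highest-return point) always survives the filter, while low-return, high-risk combinations are dominated and dropped.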
Mathematics education
In contemporary education, mathematics education—known in Europe as the didactics or pedagogy of mathematics—is the practice of teaching, learning, and carrying out scholarly research into the transfer of mathematical knowledge. Although research into mathematics education is primarily concerned with the tools, methods, and approaches that facilitate practice or the study of practice, it also covers an extensive field of study encompassing a variety of different concepts, theories and methods.
Experimental economics
Experimental economics is the application of experimental methods to study economic questions. Data collected in experiments are used to estimate effect size, test the validity of economic theories, and illuminate market mechanisms. Economic experiments usually use cash to motivate subjects, in order to mimic real-world incentives. Experiments are used to help understand how and why markets and other exchange systems function as they do. Experimental economics has also expanded to the study of institutions and the law (experimental law and economics).
Price discovery
In economics and finance, the price discovery process (also called the price discovery mechanism) is the process of determining the price of an asset in the marketplace through the interactions of buyers and sellers. Price discovery is different from valuation. The price discovery process involves buyers and sellers arriving at a transaction price for a specific item at a given time.
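A stylized way to see buyers and sellers arriving at transaction prices is a call auction. The sketch below is a toy illustration with made-up limit prices, not a description of any real exchange's mechanism: the most eager buyer is matched with the cheapest seller while a trade is still possible, and each pair trades at the midpoint of their limits:

```python
# Hypothetical limit orders: buyers' maximum prices (sorted high to low)
# and sellers' minimum prices (sorted low to high).
bids = [11, 10, 9, 8]
asks = [7, 8, 9, 12]

trades = []
# Keep matching while the best bid still meets or exceeds the best ask.
while bids and asks and bids[0] >= asks[0]:
    # Price each matched pair at the midpoint of the two limits.
    trades.append((bids.pop(0) + asks.pop(0)) / 2)

print(trades)  # the sequence of discovered transaction prices
```

Here three trades clear, all at 9.0: the interaction of the order books concentrates transactions near the price where remaining demand and supply balance, which is the essence of price discovery.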
Light-weight process
In computer operating systems, a light-weight process (LWP) is a means of achieving multitasking. In the traditional meaning of the term, as used in Unix System V and Solaris, an LWP runs in user space on top of a single kernel thread and shares its address space and system resources with other LWPs within the same process. Multiple user-level threads, managed by a thread library, can be placed on top of one or many LWPs, allowing multitasking to be done at the user level, which can have some performance benefits.
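The key property described above, threads of one process sharing a single address space and coordinating access to it, can be illustrated with ordinary threads (a sketch of the shared-memory model, not of Solaris LWPs specifically):

```python
import threading

# All threads in a process share its address space, so they all see the
# same `counter` variable; a lock serializes the updates.
counter = 0
lock = threading.Lock()

def work(n):
    global counter
    for _ in range(n):
        with lock:          # coordinate access to shared memory
            counter += 1

threads = [threading.Thread(target=work, args=(10_000,)) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(counter)  # 40000: every thread updated the same shared variable
```

Sharing one address space is what makes threads (and LWPs) cheaper to create and switch between than full processes, at the cost of needing explicit synchronization such as the lock above.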
Adverse selection
In economics, insurance, and risk management, adverse selection is a market situation where buyers and sellers have different information. The result is the unequal distribution of benefits to both parties, with the party having the key information benefiting more. In an ideal world, buyers should pay a price which reflects their willingness to pay and the value to them of the product or service, and sellers should sell at a price which reflects the quality of their goods and services.
Information engineering
Information engineering is the engineering discipline that deals with the generation, distribution, analysis, and use of information, data, and knowledge in systems. The field first became identifiable in the early 21st century. The components of information engineering include more theoretical fields such as machine learning, artificial intelligence, control theory, signal processing, and information theory, and more applied fields such as computer vision, natural language processing, bioinformatics, cheminformatics, autonomous robotics, mobile robotics, and telecommunications.
Beta (finance)
In finance, the beta (β, market beta, or beta coefficient) is a statistic that measures the expected increase or decrease of an individual stock price in proportion to movements of the stock market as a whole. Beta can be used to indicate the contribution of an individual asset to the market risk of a portfolio when it is added in small quantity. It is referred to as an asset's non-diversifiable risk, systematic risk, or market risk. Beta is not a measure of idiosyncratic risk.
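Beta is commonly estimated as the covariance of an asset's returns with the market's returns divided by the variance of the market's returns. The sketch below applies that formula to short, made-up return series (the numbers are illustrative, not real market data):

```python
# Hypothetical period returns for one asset and the market index.
asset  = [0.02, -0.01, 0.03, 0.00, 0.04]
market = [0.01, -0.02, 0.02, 0.00, 0.03]

n = len(asset)
mean_a = sum(asset) / n
mean_m = sum(market) / n

# Sample covariance and variance (dividing by n - 1).
cov = sum((a - mean_a) * (m - mean_m)
          for a, m in zip(asset, market)) / (n - 1)
var_m = sum((m - mean_m) ** 2 for m in market) / (n - 1)

beta = cov / var_m
print(round(beta, 3))  # prints 1.054
```

A beta above 1 means the asset is expected to amplify market moves, while a beta below 1 means it is expected to dampen them; here the hypothetical asset moves slightly more than one-for-one with the market.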