Statistical theory
The theory of statistics provides a basis for the whole range of techniques, in both study design and data analysis, that are used within applications of statistics. The theory covers approaches to statistical decision problems and to statistical inference, and the actions and deductions that satisfy the basic principles stated for each of these approaches. Within a given approach, statistical theory gives ways of comparing statistical procedures; it can find the best possible procedure within a given context for given statistical problems, or can provide guidance on the choice between alternative procedures.
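One way statistical theory compares procedures is by their mean squared error. A minimal Monte Carlo sketch (the scenario, seed, and sample sizes below are illustrative assumptions, not from the text) comparing the sample mean and sample median as estimators of a normal mean:

```python
import random
import statistics

def mse(estimator, true_value=0.0, n=25, reps=2000):
    """Monte Carlo mean squared error of an estimator of a N(0, 1) mean.

    A fresh fixed-seed generator is used per call, so both estimators
    are evaluated on identical simulated samples (a paired comparison).
    """
    rng = random.Random(42)
    errs = []
    for _ in range(reps):
        sample = [rng.gauss(true_value, 1.0) for _ in range(n)]
        errs.append((estimator(sample) - true_value) ** 2)
    return sum(errs) / reps

mse_mean = mse(statistics.mean)
mse_median = mse(statistics.median)
# For normal data, theory predicts the mean beats the median
# (asymptotic MSE ratio of mean to median is about 2 / pi).
```

Here the comparison has a known theoretical answer; in less tractable settings, the same simulation framework guides the choice between alternative procedures.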
Graph database
A graph database (GDB) is a database that uses graph structures for semantic queries, with nodes, edges, and properties to represent and store data. A key concept of the system is the graph (or edge or relationship), which relates the data items in the store to a collection of nodes and edges, the edges representing the relationships between the nodes. These relationships allow data in the store to be linked together directly and, in many cases, retrieved with a single operation.
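The node/edge/property model above can be sketched as a tiny in-memory structure (an illustrative toy, not a real GDB engine; the node names and relationship labels are assumptions):

```python
# Nodes carry properties directly.
nodes = {
    "alice": {"label": "Person", "age": 34},
    "bob":   {"label": "Person", "age": 27},
    "acme":  {"label": "Company"},
}

# Each edge is (source, relationship, target, properties).
edges = [
    ("alice", "KNOWS",    "bob",  {"since": 2019}),
    ("alice", "WORKS_AT", "acme", {"role": "engineer"}),
]

def neighbors(node, rel=None):
    """Follow edges out of `node` directly: a single traversal, no join."""
    return [t for (s, r, t, _) in edges if s == node
            and (rel is None or r == rel)]

everyone_alice_links_to = neighbors("alice")
alices_acquaintances = neighbors("alice", rel="KNOWS")
```

Because relationships are stored as first-class edges, retrieving a node's neighbours is one direct operation, which is the contrast with join-based retrieval in relational stores that the text alludes to.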
Level (logarithmic quantity)
In science and engineering, a power level and a field level (also called a root-power level) are logarithmic magnitudes of certain quantities referenced to a standard reference value of the same type. A power level is a logarithmic quantity used to measure power, power density, or sometimes energy, with the decibel (dB) as its commonly used unit. A field level (or root-power level) is a logarithmic quantity used to measure quantities whose square is typically proportional to power (for instance, the square of a voltage is proportional to the power dissipated in a conductor, scaled by the inverse of the conductor's resistance).
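The two definitions differ only in the logarithm's prefactor: 10 for power quantities and 20 for field (root-power) quantities, so that squaring a field quantity gives a consistent power level. A short sketch:

```python
import math

def power_level_db(p, p_ref):
    """Power level: L = 10 * log10(P / P_ref), in decibels."""
    return 10.0 * math.log10(p / p_ref)

def field_level_db(f, f_ref):
    """Field (root-power) level: L = 20 * log10(F / F_ref).

    The factor 20 comes from power being proportional to the
    square of the field quantity: 10 * log10(F^2) = 20 * log10(F).
    """
    return 20.0 * math.log10(f / f_ref)

doubled_power = power_level_db(2.0, 1.0)   # ~3.01 dB
doubled_field = field_level_db(2.0, 1.0)   # ~6.02 dB
```

Doubling a power thus adds about 3 dB, while doubling a field quantity such as voltage adds about 6 dB, since the corresponding power quadruples.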
Mathematical statistics
Mathematical statistics is the application of probability theory, a branch of mathematics, to statistics, as opposed to techniques for collecting statistical data. Specific mathematical techniques which are used for this include mathematical analysis, linear algebra, stochastic analysis, differential equations, and measure theory. Statistical data collection is concerned with the planning of studies, especially with the design of randomized experiments and with the planning of surveys using random sampling.
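The two collection designs mentioned, random sampling for surveys and randomization for experiments, can each be sketched in a few lines (the frame size, seed, and group sizes are illustrative assumptions):

```python
import random

rng = random.Random(7)  # fixed seed for reproducibility

# Survey design: a simple random sample from a sampling frame,
# so every unit has the same inclusion probability.
frame = [f"household_{i}" for i in range(1000)]
survey_sample = rng.sample(frame, k=50)

# Experimental design: randomize units into treatment and control arms
# by shuffling, then splitting.
units = list(range(20))
rng.shuffle(units)
treatment, control = units[:10], units[10:]
```

Randomization in both cases is what justifies the probability-theoretic analysis that mathematical statistics then applies to the resulting data.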
Java Database Connectivity
Java Database Connectivity (JDBC) is an application programming interface (API) for the Java programming language that defines how a client may access a database. Part of the Java Standard Edition platform from Oracle Corporation, it provides methods to query and update data in a database and is oriented toward relational databases. A JDBC-to-ODBC bridge enables connections to any ODBC-accessible data source in the Java virtual machine (JVM) host environment.
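JDBC itself is a Java API, but the connect / execute / fetch / commit pattern it defines is shared by database APIs in other languages. As a language-neutral sketch of that flow, here it is with Python's standard-library `sqlite3` (the table and data are illustrative; the `cf.` comments map each step to its rough JDBC counterpart):

```python
import sqlite3

# Open a connection; cf. DriverManager.getConnection(url) in JDBC.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()  # cf. obtaining a Statement from the Connection

cur.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)")

# Parameterized update; cf. PreparedStatement with '?' placeholders.
cur.execute("INSERT INTO users (name) VALUES (?)", ("Ada",))
conn.commit()  # cf. Connection.commit()

# Query and fetch results; cf. iterating a ResultSet.
rows = cur.execute("SELECT id, name FROM users").fetchall()
conn.close()
```

In real JDBC code the same steps appear as `getConnection`, `prepareStatement`, `executeQuery`, and `ResultSet.next()`, with a driver for the target database on the classpath.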
Financial engineering
Financial engineering is a multidisciplinary field involving financial theory, methods of engineering, tools of mathematics, and the practice of programming. It has also been defined as the application of technical methods, especially from mathematical finance and computational finance, to the practice of finance. Financial engineering plays a key role in a bank's customer-driven derivatives business, which delivers bespoke OTC contracts and "exotics" and implements various structured products; this work encompasses quantitative modelling, quantitative programming, and risk-managing financial products in compliance with regulations and Basel capital and liquidity requirements.
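As one concrete instance of the quantitative modelling used in a derivatives business, the Black-Scholes formula prices a European call option; the formula and parameter values below are a standard textbook example chosen for illustration, not something stated in the text above:

```python
import math

def norm_cdf(x):
    """Standard normal CDF, via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def bs_call(S, K, T, r, sigma):
    """Black-Scholes price of a European call on a non-dividend stock.

    S: spot price, K: strike, T: years to expiry,
    r: risk-free rate, sigma: volatility.
    """
    d1 = (math.log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * math.sqrt(T))
    d2 = d1 - sigma * math.sqrt(T)
    return S * norm_cdf(d1) - K * math.exp(-r * T) * norm_cdf(d2)

# At-the-money call, one year out, 5% rate, 20% volatility.
price = bs_call(S=100.0, K=100.0, T=1.0, r=0.05, sigma=0.2)
```

Production desks layer much more on top of such models (smiles, exotics, XVA, regulatory capital), but this is the kind of closed-form building block the quantitative modelling work starts from.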
Database theory
Database theory encapsulates a broad range of topics related to the study and research of the theoretical realm of databases and database management systems. Theoretical aspects of data management include, among other areas, the foundations of query languages, computational complexity and expressive power of queries, finite model theory, database design theory, dependency theory, foundations of concurrency control and database recovery, deductive databases, temporal and spatial databases, real-time databases, managing uncertain data and probabilistic databases, and Web data.
All models are wrong
"All models are wrong" is a common aphorism and anapodoton in statistics; it is often expanded as "All models are wrong, but some are useful". The aphorism acknowledges that statistical models always fall short of the complexities of reality but can still be useful. It originally referred only to statistical models, but it is now sometimes applied to scientific models in general. The aphorism is generally attributed to the statistician George Box, though the underlying concept predates his writings.
Archival appraisal
In archival science and archive administration, appraisal is a process, usually conducted by members of the record-holding institution (often professional archivists), in which a body of records is examined to determine its value for that institution and how long that value will last. Determining the archival value of specific records is one of the central tasks of an archivist. When appraisal occurs prior to acquisition, it involves assessing records for inclusion in the archives.
Panel data
In statistics and econometrics, panel data and longitudinal data are both multi-dimensional data involving measurements over time. Panel data is a subset of longitudinal data where observations are for the same subjects each time. Time series and cross-sectional data can be thought of as special cases of panel data that are in one dimension only (one panel member or individual for the former, one time point for the latter). A literature search often involves time series, cross-sectional, or panel data.
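The subject-by-time structure, and how time series and cross-sections fall out as one-dimensional slices, can be sketched with a tiny panel (the subjects, years, and values are illustrative assumptions):

```python
# A tiny panel: the same subjects observed at several time points,
# keyed by (subject, time).
panel = {
    ("alice", 2020): 1.0, ("alice", 2021): 1.2, ("alice", 2022): 1.4,
    ("bob",   2020): 0.8, ("bob",   2021): 0.9, ("bob",   2022): 1.1,
}

def time_series(subject):
    """Fix one panel member, vary time: a time series."""
    return {t: v for (s, t), v in panel.items() if s == subject}

def cross_section(year):
    """Fix one time point, vary subjects: cross-sectional data."""
    return {s: v for (s, t), v in panel.items() if t == year}

alice_over_time = time_series("alice")
everyone_in_2021 = cross_section(2021)
```

Each slicing function collapses one of the panel's two dimensions, which is exactly the sense in which time series and cross-sectional data are special cases.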