Nonlinear system

In mathematics and science, a nonlinear system (or a non-linear system) is a system in which the change of the output is not proportional to the change of the input. Nonlinear problems are of interest to engineers, biologists, physicists, mathematicians, and many other scientists, since most systems are inherently nonlinear. Nonlinear dynamical systems, which describe changes in variables over time, may appear chaotic, unpredictable, or counterintuitive, in contrast to much simpler linear systems.
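As a minimal illustration of how a simple nonlinear rule can behave in this way, the sketch below iterates the logistic map, a standard textbook example of a nonlinear dynamical system; the parameter values and initial conditions are chosen only for illustration and are not taken from the text above.

```python
# Minimal sketch: the logistic map, a classic nonlinear dynamical system
# whose output is not proportional to its input. Parameters are illustrative.

def logistic_map(x0: float, r: float, steps: int) -> list[float]:
    """Iterate x_{n+1} = r * x_n * (1 - x_n) for a given number of steps."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs

# Two nearby initial conditions diverge quickly in the chaotic regime (r = 4),
# illustrating the sensitive dependence typical of nonlinear dynamics.
a = logistic_map(0.200, r=4.0, steps=20)
b = logistic_map(0.201, r=4.0, steps=20)
print(a[-1], b[-1])  # very different final values despite nearly equal inputs
```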
Principal component analysis

Principal component analysis (PCA) is a popular technique for analyzing large datasets containing a high number of dimensions/features per observation, increasing the interpretability of data while preserving the maximum amount of information, and enabling the visualization of multidimensional data. Formally, PCA is a statistical technique for reducing the dimensionality of a dataset. This is accomplished by linearly transforming the data into a new coordinate system where (most of) the variation in the data can be described with fewer dimensions than the initial data.
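One compact way to see the linear transformation PCA performs is to centre the data and take its singular value decomposition. The sketch below does this with NumPy on synthetic data; the array shapes, variable names, and number of retained components are illustrative assumptions, not part of the text above.

```python
import numpy as np

# Minimal PCA sketch via SVD on synthetic data (illustrative only).
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))          # 200 observations, 5 features

X_centered = X - X.mean(axis=0)        # centre each feature
U, S, Vt = np.linalg.svd(X_centered, full_matrices=False)

components = Vt                        # rows are the principal directions
explained_variance = S**2 / (len(X) - 1)

k = 2                                  # keep the first two components
X_reduced = X_centered @ components[:k].T   # project onto the new coordinate system
print(X_reduced.shape)                 # (200, 2)
```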
Nonlinear optics

Nonlinear optics (NLO) is the branch of optics that describes the behaviour of light in nonlinear media, that is, media in which the polarization density P responds non-linearly to the electric field E of the light. The non-linearity is typically observed only at very high light intensities (when the electric field of the light is >10⁸ V/m and thus comparable to the atomic electric field of ~10¹¹ V/m) such as those provided by lasers. Above the Schwinger limit, the vacuum itself is expected to become nonlinear.
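The nonlinear response is commonly expressed, schematically, as a power-series expansion of the polarization density in the electric field (quoted here for reference; it is the standard textbook form rather than something stated above):

```latex
P(t) \;=\; \varepsilon_0 \left( \chi^{(1)} E(t) \;+\; \chi^{(2)} E^{2}(t) \;+\; \chi^{(3)} E^{3}(t) \;+\; \cdots \right)
```

where the χ⁽ⁿ⁾ are the nth-order susceptibilities; at ordinary intensities only the linear term matters, while the higher-order terms become significant at laser-level fields.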
Z-transform

In mathematics and signal processing, the Z-transform converts a discrete-time signal, which is a sequence of real or complex numbers, into a complex frequency-domain (z-domain or z-plane) representation. It can be considered as a discrete-time equivalent of the Laplace transform (s-domain). This similarity is explored in the theory of time-scale calculus. Whereas the continuous-time Fourier transform is evaluated on the Laplace s-domain's imaginary line, the discrete-time Fourier transform is evaluated over the unit circle of the z-domain.
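For reference, the (bilateral) Z-transform of a sequence x[n] is conventionally defined as below; evaluating it on the unit circle, z = e^{jω}, yields the discrete-time Fourier transform mentioned above.

```latex
X(z) \;=\; \mathcal{Z}\{x[n]\} \;=\; \sum_{n=-\infty}^{\infty} x[n]\, z^{-n},
\qquad
X\!\left(e^{j\omega}\right) \;=\; \sum_{n=-\infty}^{\infty} x[n]\, e^{-j\omega n}
```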
Tensor–vector–scalar gravity

Tensor–vector–scalar gravity (TeVeS), developed by Jacob Bekenstein in 2004, is a relativistic generalization of Mordehai Milgrom's Modified Newtonian dynamics (MOND) paradigm. The main features of TeVeS can be summarized as follows: because it is derived from an action principle, TeVeS respects conservation laws; in the weak-field approximation of the spherically symmetric, static solution, it reproduces the MOND acceleration formula; it avoids the problems of earlier attempts to generalize MOND, such as superluminal propagation; and, as a relativistic theory, it can accommodate gravitational lensing.
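For context, the MOND acceleration formula that TeVeS reproduces in this limit is usually written with an interpolating function μ relating the true acceleration a to the Newtonian acceleration a_N; this is the standard MOND relation, quoted here for reference rather than taken from the text above.

```latex
\mu\!\left(\frac{a}{a_0}\right) a \;=\; a_N,
\qquad
\mu(x) \to 1 \ \ (x \gg 1), \qquad \mu(x) \to x \ \ (x \ll 1)
```

In the deep-MOND regime (a ≪ a₀) this gives a ≈ √(a_N a₀), with a₀ roughly 1.2 × 10⁻¹⁰ m s⁻² as the MOND acceleration scale.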
Soil retrogression and degradation

Soil retrogression and degradation are two regressive evolution processes associated with the loss of equilibrium of a stable soil. Retrogression is primarily due to soil erosion and corresponds to a phenomenon where succession reverts the land to its natural physical state. Degradation is an evolution, different from natural evolution, that is related to the local climate and vegetation. It is due to the replacement of primary plant communities (known as climax vegetation) by secondary communities.
Nonlinear programming

In mathematics, nonlinear programming (NLP) is the process of solving an optimization problem where some of the constraints or the objective function are nonlinear. An optimization problem is the calculation of the extrema (maxima, minima or stationary points) of an objective function over a set of unknown real variables, subject to the satisfaction of a system of equalities and inequalities, collectively termed constraints. It is the sub-field of mathematical optimization that deals with problems that are not linear.
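As a concrete illustration, the sketch below solves a small nonlinear program with SciPy's general-purpose minimize routine; the objective, constraint, and starting point are made up for the example and are not part of the text above.

```python
import numpy as np
from scipy.optimize import minimize

# Minimal NLP sketch (illustrative problem): minimize a nonlinear objective
# subject to a nonlinear inequality constraint.

def objective(x):
    # Rosenbrock-style nonlinear objective
    return (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2

constraints = [
    # Feasible region: points inside the unit disc (SLSQP uses g(x) >= 0)
    {"type": "ineq", "fun": lambda x: 1.0 - x[0]**2 - x[1]**2},
]

result = minimize(objective, x0=np.array([0.0, 0.0]),
                  method="SLSQP", constraints=constraints)
print(result.x, result.fun)
```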
Computer performance

In computing, computer performance is the amount of useful work accomplished by a computer system. Outside of specific contexts, computer performance is estimated in terms of accuracy, efficiency and speed of executing computer program instructions. When it comes to high computer performance, one or more of the following factors might be involved: short response time for a given piece of work; high throughput (rate of processing work); low utilization of computing resources; and fast (or highly compact) data compression and decompression.
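The sketch below shows one simple way to estimate two of these factors, response time and throughput, for an arbitrary piece of work in Python; the workload and repetition count are placeholders chosen for the example.

```python
import time

# Illustrative measurement of response time (latency) and throughput
# for an arbitrary unit of work; the workload itself is a placeholder.

def unit_of_work():
    return sum(i * i for i in range(10_000))

N = 1_000
start = time.perf_counter()
for _ in range(N):
    unit_of_work()
elapsed = time.perf_counter() - start

print(f"average response time: {elapsed / N * 1e3:.3f} ms per call")
print(f"throughput:            {N / elapsed:.1f} calls per second")
```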
Time evolution

Time evolution is the change of state brought about by the passage of time, applicable to systems with internal state (also called stateful systems). In this formulation, time is not required to be a continuous parameter, but may be discrete or even finite. In classical physics, time evolution of a collection of rigid bodies is governed by the principles of classical mechanics. In their most rudimentary form, these principles express the relationship between forces acting on the bodies and their acceleration given by Newton's laws of motion.
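As a minimal sketch of discrete-time evolution under Newton's second law, the code below advances the state (position, velocity) of a single body with explicit Euler steps; the spring force law, masses, and step size are illustrative assumptions only.

```python
# Minimal sketch: discrete time evolution of a one-dimensional state
# (position x, velocity v) under Newton's second law F = m * a,
# using explicit Euler steps. Force law and parameters are illustrative.

def step(x: float, v: float, m: float, k: float, dt: float) -> tuple[float, float]:
    """Advance the state by one time step for a spring force F = -k * x."""
    a = -k * x / m              # acceleration from Newton's second law
    return x + v * dt, v + a * dt

x, v = 1.0, 0.0                 # initial state
for _ in range(1000):
    x, v = step(x, v, m=1.0, k=1.0, dt=0.01)
print(x, v)                     # state after 10 time units
```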
Kernel density estimation

In statistics, kernel density estimation (KDE) is the application of kernel smoothing for probability density estimation, i.e., a non-parametric method to estimate the probability density function of a random variable based on kernels as weights. KDE answers a fundamental data smoothing problem where inferences about the population are made, based on a finite data sample. In some fields such as signal processing and econometrics it is also termed the Parzen–Rosenblatt window method, after Emanuel Parzen and Murray Rosenblatt, who are usually credited with independently creating it in its current form.
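A small NumPy sketch of a Gaussian KDE follows; the sample, grid, and bandwidth are illustrative choices (libraries such as SciPy also provide scipy.stats.gaussian_kde for the same purpose).

```python
import numpy as np

# Minimal Gaussian kernel density estimate (illustrative bandwidth and data).
rng = np.random.default_rng(1)
sample = rng.normal(loc=0.0, scale=1.0, size=500)   # finite data sample

def kde(x_grid, data, bandwidth=0.3):
    """Evaluate a Gaussian KDE of `data` at the points in `x_grid`."""
    # Each data point contributes a Gaussian bump centred on it.
    diffs = (x_grid[:, None] - data[None, :]) / bandwidth
    kernels = np.exp(-0.5 * diffs**2) / np.sqrt(2 * np.pi)
    return kernels.mean(axis=1) / bandwidth

x = np.linspace(-4, 4, 200)
density = kde(x, sample)
print(density.sum() * (x[1] - x[0]))   # crude check: integrates to roughly 1
```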