Admittance parameters
Admittance parameters or Y-parameters (the elements of an admittance matrix or Y-matrix) are properties used in many areas of electrical engineering, such as power, electronics, and telecommunications. These parameters are used to describe the electrical behavior of linear electrical networks. They are also used to describe the small-signal (linearized) response of non-linear networks. Y-parameters are also known as short-circuited admittance parameters.
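As an illustration, in the standard two-port case the Y-parameters relate the port currents to the port voltages (a textbook formulation, not specific to any particular network):

\[
I_1 = Y_{11} V_1 + Y_{12} V_2, \qquad I_2 = Y_{21} V_1 + Y_{22} V_2,
\]

where each parameter Y_{jk} = I_j / V_k is measured with every other port short-circuited (V_m = 0 for m ≠ k), which is the origin of the name short-circuited admittance parameters.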
Electromagnetic field
An electromagnetic field (also EM field or EMF) is a classical (i.e. non-quantum) field produced by moving electric charges. It is the field described by classical electrodynamics (a classical field theory) and is the classical counterpart to the quantized electromagnetic field tensor in quantum electrodynamics (a quantum field theory). The electromagnetic field propagates at the speed of light (in fact, this field can be identified as light) and interacts with charges and currents.
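The interaction with charges is summarized by the Lorentz force law; for a point charge q moving with velocity v through the field (a standard result, quoted here only as illustration):

\[
\mathbf{F} = q\,(\mathbf{E} + \mathbf{v} \times \mathbf{B}),
\]

where E and B are the electric and magnetic parts of the electromagnetic field.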
Magnetic dipole
In electromagnetism, a magnetic dipole is the limit of either a closed loop of electric current or a pair of poles as the size of the source is reduced to zero while keeping the magnetic moment constant. It is a magnetic analogue of the electric dipole, but the analogy is not perfect. In particular, a true magnetic monopole, the magnetic analogue of an electric charge, has never been observed in nature. However, magnetic monopole quasiparticles have been observed as emergent properties of certain condensed matter systems.
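For the current-loop picture, the quantity held fixed in the limit is the loop's magnetic moment; for a planar loop carrying current I around an area A (a standard definition, included for illustration):

\[
m = I A,
\]

and the dipole limit shrinks the loop (A → 0) while increasing the current so that m stays constant.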
Deep belief network
In machine learning, a deep belief network (DBN) is a generative graphical model, or alternatively a class of deep neural network, composed of multiple layers of latent variables ("hidden units"), with connections between the layers but not between units within each layer. When trained on a set of examples without supervision, a DBN can learn to probabilistically reconstruct its inputs. The layers then act as feature detectors. After this learning step, a DBN can be further trained with supervision to perform classification.
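A minimal sketch of this two-stage procedure, using scikit-learn's BernoulliRBM to stand in for the layers of latent variables; the layer sizes, learning rates, and toy data below are illustrative assumptions, not values from any reference implementation:

```python
# Sketch: greedy layer-wise unsupervised pretraining of two RBM layers,
# followed by a supervised classifier trained on the learned features.
import numpy as np
from sklearn.neural_network import BernoulliRBM
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import Pipeline

rng = np.random.RandomState(0)
X = rng.rand(500, 64)            # toy inputs in [0, 1]
y = rng.randint(0, 2, size=500)  # toy labels for the supervised stage

dbn = Pipeline([
    # Each RBM layer learns features of the layer below it ("feature detectors").
    ("rbm1", BernoulliRBM(n_components=32, learning_rate=0.05, n_iter=20, random_state=0)),
    ("rbm2", BernoulliRBM(n_components=16, learning_rate=0.05, n_iter=20, random_state=0)),
    # Supervised step: classification on top of the learned representation.
    ("clf", LogisticRegression(max_iter=1000)),
])
dbn.fit(X, y)
print("training accuracy:", dbn.score(X, y))
```

In a full DBN the supervised stage would also fine-tune the pretrained RBM weights (for example by unrolling them into a feed-forward network); here only the top classifier sees the labels, which keeps the sketch short.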
Interplanetary magnetic field
The interplanetary magnetic field (IMF), now more commonly referred to as the heliospheric magnetic field (HMF), is the component of the solar magnetic field that is dragged out from the solar corona by the solar wind flow to fill the Solar System. The coronal and solar wind plasmas are highly electrically conductive, meaning the magnetic field lines and the plasma flows are effectively "frozen" together and the magnetic field cannot diffuse through the plasma on time scales of interest.
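This "frozen-in" behavior corresponds to the ideal limit for a highly conductive plasma, in which the electric field in the plasma's rest frame vanishes (a standard statement of the frozen-in condition, included here as context):

\[
\mathbf{E} + \mathbf{v} \times \mathbf{B} = \mathbf{0},
\]

so the magnetic flux through any surface moving with the plasma is conserved and the field is carried outward with the solar wind.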
Boltzmann distribution
In statistical mechanics and mathematics, a Boltzmann distribution (also called Gibbs distribution) is a probability distribution or probability measure that gives the probability that a system will be in a certain state as a function of that state's energy and the temperature of the system. The distribution is expressed in the form

\[
p_i \propto \exp\!\left(-\frac{\varepsilon_i}{kT}\right),
\]

where p_i is the probability of the system being in state i, exp is the exponential function, ε_i is the energy of that state, and the constant kT of the distribution is the product of the Boltzmann constant k and thermodynamic temperature T.
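Normalizing over all states gives the probabilities explicitly; a small numerical sketch, in which the state energies and temperature are arbitrary illustrative values:

```python
# Boltzmann probabilities p_i = exp(-e_i / kT) / sum_j exp(-e_j / kT)
import numpy as np

k = 1.380649e-23                                   # Boltzmann constant, J/K
T = 300.0                                          # temperature in kelvin (illustrative)
energies = np.array([0.0, 1.0, 2.0]) * 1.602e-21   # illustrative state energies, J

weights = np.exp(-energies / (k * T))
p = weights / weights.sum()   # normalization (the partition function is the denominator)
print(p, p.sum())             # probabilities sum to 1; lower-energy states are more likely
```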
Magnetic pressure
In physics, magnetic pressure is an energy density associated with a magnetic field. In SI units, the energy density of a magnetic field with strength B can be expressed as

\[
p_B = \frac{B^2}{2\mu_0},
\]

where μ_0 is the vacuum permeability. Any magnetic field has an associated magnetic pressure contained by the boundary conditions on the field. It is identical to any other physical pressure except that it is carried by the magnetic field rather than (in the case of a gas) by the kinetic energy of gas molecules.
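As a quick worked example, using an arbitrary illustrative field strength of 1 T:

```python
# Magnetic pressure p_B = B^2 / (2 * mu_0) in SI units
import math

mu_0 = 4 * math.pi * 1e-7   # vacuum permeability, H/m (approximate SI value)
B = 1.0                     # field strength in tesla (illustrative)
p_B = B**2 / (2 * mu_0)
print(f"{p_B:.3e} Pa")      # ~3.979e+05 Pa, roughly four atmospheres
```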
Variational autoencoder
In machine learning, a variational autoencoder (VAE) is an artificial neural network architecture introduced by Diederik P. Kingma and Max Welling. It is part of the families of probabilistic graphical models and variational Bayesian methods. Variational autoencoders are often associated with the autoencoder model because of their architectural affinity, but they differ significantly in goal and mathematical formulation. Variational autoencoders are probabilistic generative models that require neural networks as only a part of their overall structure.
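A minimal sketch of the architecture in PyTorch, showing the encoder, the reparameterized latent sample, the decoder, and the two-term training objective (reconstruction plus KL divergence); all layer sizes and the toy batch are illustrative assumptions:

```python
# Minimal VAE sketch: encoder -> (mu, log_var) -> reparameterized z -> decoder.
import torch
import torch.nn as nn
import torch.nn.functional as F

class VAE(nn.Module):
    def __init__(self, x_dim=784, h_dim=200, z_dim=20):
        super().__init__()
        self.enc = nn.Linear(x_dim, h_dim)       # encoder hidden layer
        self.mu = nn.Linear(h_dim, z_dim)        # mean of q(z|x)
        self.log_var = nn.Linear(h_dim, z_dim)   # log-variance of q(z|x)
        self.dec = nn.Sequential(nn.Linear(z_dim, h_dim), nn.ReLU(),
                                 nn.Linear(h_dim, x_dim))

    def forward(self, x):
        h = F.relu(self.enc(x))
        mu, log_var = self.mu(h), self.log_var(h)
        z = mu + torch.exp(0.5 * log_var) * torch.randn_like(mu)  # reparameterization trick
        return torch.sigmoid(self.dec(z)), mu, log_var

def loss_fn(x, x_hat, mu, log_var):
    # Negative ELBO: reconstruction term + KL(q(z|x) || N(0, I))
    recon = F.binary_cross_entropy(x_hat, x, reduction="sum")
    kl = -0.5 * torch.sum(1 + log_var - mu.pow(2) - log_var.exp())
    return recon + kl

model = VAE()
x = torch.rand(16, 784)                  # toy batch of inputs in [0, 1]
x_hat, mu, log_var = model(x)
print(loss_fn(x, x_hat, mu, log_var))
```

The encoder and decoder networks are only part of the model; the full generative model also includes the prior over the latent code and the probabilistic interpretation of the decoder output, which is what distinguishes it from a plain autoencoder.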
Finite field
In mathematics, a finite field or Galois field (so-named in honor of Évariste Galois) is a field that contains a finite number of elements. As with any field, a finite field is a set on which the operations of multiplication, addition, subtraction and division are defined and satisfy certain basic rules. The most common examples of finite fields are given by the integers mod p when p is a prime number. The order of a finite field is its number of elements, which is either a prime number or a prime power.
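For the integers-mod-p case mentioned above, a small Python illustration with p = 7; the choice of p and of the operands is arbitrary:

```python
# Arithmetic in the finite field GF(7): integers mod 7 with all four operations defined.
p = 7
a, b = 3, 5

print((a + b) % p)          # addition:       3 + 5 = 1 (mod 7)
print((a - b) % p)          # subtraction:    3 - 5 = 5 (mod 7)
print((a * b) % p)          # multiplication: 3 * 5 = 1 (mod 7)
inv_b = pow(b, -1, p)       # multiplicative inverse of 5 mod 7 (Python 3.8+)
print((a * inv_b) % p)      # division:       3 / 5 = 3 * 3 = 2 (mod 7)
```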
Finite ring
In mathematics, more specifically abstract algebra, a finite ring is a ring that has a finite number of elements. Every finite field is an example of a finite ring, and the additive part of every finite ring is an example of an abelian finite group, but the concept of finite rings in their own right has a more recent history. Although rings have more structure than groups, the theory of finite rings is simpler than that of finite groups.
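A small illustration of a finite ring that is not a field is the integers mod 6; the modulus is an arbitrary illustrative choice:

```python
# Z/6Z is a finite ring: addition and multiplication mod 6 are defined,
# but it is not a field, because some nonzero elements have no inverse.
n = 6
print((2 * 3) % n)   # 0 -> 2 and 3 are zero divisors, so 2 has no multiplicative inverse
try:
    pow(2, -1, n)    # raises ValueError: base is not invertible for the given modulus
except ValueError as e:
    print(e)
```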