Memory cell (computing): The memory cell is the fundamental building block of computer memory. A memory cell is an electronic circuit that stores one bit of binary information: it is set to store a logic 1 (high voltage level) and reset to store a logic 0 (low voltage level). Its value is maintained until it is changed by the set/reset process, and it can be accessed by reading it. Over the history of computing, different memory cell architectures have been used, including core memory and bubble memory.
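The set/reset/read behaviour can be made concrete by modelling a single cell in software. The sketch below is only an illustration of the interface (the MemoryCell class and its method names are invented here), not a description of any particular hardware cell.

```python
class MemoryCell:
    """Toy model of a one-bit memory cell (illustrative only).

    A real cell is an electronic circuit (e.g. a latch or a DRAM
    capacitor); here the stored bit is just a Python attribute.
    """

    def __init__(self):
        self.value = 0  # power-up state assumed to be logic 0

    def set(self):
        """Store a logic 1 (corresponds to the high voltage level)."""
        self.value = 1

    def reset(self):
        """Store a logic 0 (corresponds to the low voltage level)."""
        self.value = 0

    def read(self):
        """Access the stored value without changing it."""
        return self.value


cell = MemoryCell()
cell.set()
assert cell.read() == 1   # value is maintained until the next set/reset
cell.reset()
assert cell.read() == 0
```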
AI accelerator: An AI accelerator is a class of specialized hardware accelerator or computer system designed to accelerate artificial intelligence and machine learning applications, including artificial neural networks and machine vision. Typical applications include algorithms for robotics, Internet of Things, and other data-intensive or sensor-driven tasks. They are often manycore designs and generally focus on low-precision arithmetic, novel dataflow architectures, or in-memory computing capability.
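As one illustration of the low-precision arithmetic such accelerators favour, the sketch below quantizes 32-bit floating-point values to 8-bit integers and back. The symmetric, per-tensor scaling scheme shown is just one common choice made here for illustration and is not tied to any specific accelerator.

```python
import numpy as np

def quantize_int8(x: np.ndarray):
    """Symmetric per-tensor quantization of float32 values to int8."""
    scale = np.max(np.abs(x)) / 127.0 if np.any(x) else 1.0
    q = np.clip(np.round(x / scale), -127, 127).astype(np.int8)
    return q, scale

def dequantize(q: np.ndarray, scale: float) -> np.ndarray:
    """Map int8 codes back to approximate float32 values."""
    return q.astype(np.float32) * scale

weights = np.random.randn(4, 4).astype(np.float32)
q, scale = quantize_int8(weights)
approx = dequantize(q, scale)
print("max quantization error:", np.max(np.abs(weights - approx)))
```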
Deep learning: Deep learning is part of a broader family of machine learning methods based on artificial neural networks with representation learning. The adjective "deep" in deep learning refers to the use of multiple layers in the network. The methods used can be supervised, semi-supervised, or unsupervised.
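The "multiple layers" idea can be seen in a tiny forward pass: each layer applies a linear map followed by a nonlinearity, and stacking several such layers is what makes the network deep. The layer sizes and the ReLU nonlinearity below are arbitrary choices for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    return np.maximum(0.0, x)

# Three stacked layers, 8 -> 16 -> 16 -> 4 (sizes chosen arbitrarily).
layer_shapes = [(8, 16), (16, 16), (16, 4)]
weights = [rng.standard_normal(s) * 0.1 for s in layer_shapes]
biases = [np.zeros(s[1]) for s in layer_shapes]

def forward(x):
    """Pass the input through every layer in turn; depth = number of layers."""
    h = x
    for W, b in zip(weights, biases):
        h = relu(h @ W + b)
    return h

print(forward(rng.standard_normal(8)).shape)  # (4,)
```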
Memristor: A memristor (/ˈmɛmrᵻstər/; a portmanteau of memory resistor) is a non-linear two-terminal electrical component relating electric charge and magnetic flux linkage. It was described and named in 1971 by Leon Chua, completing a theoretical quartet of fundamental electrical components that also comprises the resistor, capacitor, and inductor. Chua and Kang later generalized the concept to memristive systems. Such a system comprises a circuit of multiple conventional components which mimics key properties of the ideal memristor component and is also commonly referred to as a memristor.
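Concretely, the ideal charge-controlled memristor is defined by a relation between flux linkage φ and charge q; differentiating that relation gives a resistance-like law whose value, the memristance M(q), depends on the charge that has already flowed. The equations below state these standard definitions for reference.

```latex
% Defining relation of the ideal (charge-controlled) memristor:
% flux linkage is a function of the charge that has flowed through it.
\varphi = \varphi(q)

% Differentiating with respect to time gives an Ohm's-law-like relation
% whose "resistance" depends on the charge history:
v(t) = \frac{\mathrm{d}\varphi}{\mathrm{d}t}
     = \frac{\mathrm{d}\varphi}{\mathrm{d}q}\,\frac{\mathrm{d}q}{\mathrm{d}t}
     = M\!\big(q(t)\big)\, i(t),
\qquad
M(q) \equiv \frac{\mathrm{d}\varphi(q)}{\mathrm{d}q}
```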
Unconventional computing: Unconventional computing is computing by any of a wide range of new or unusual methods. It is also known as alternative computing. The term "unconventional computation" was coined by Cristian S. Calude and John Casti and used at the First International Conference on Unconventional Models of Computation in 1998. The general theory of computation allows for a variety of models. Computing technology first developed using mechanical systems and then evolved into the use of electronic devices.
Rprop: Rprop, short for resilient backpropagation, is a learning heuristic for supervised learning in feedforward artificial neural networks. It is a first-order optimization algorithm, created by Martin Riedmiller and Heinrich Braun in 1992. Similarly to the Manhattan update rule, Rprop takes into account only the sign of the partial derivative over all patterns (not the magnitude), and acts independently on each weight.
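The core of the update rule can be sketched in a few lines: only the sign of each gradient component is used, and every weight keeps its own step size, which grows while the sign stays the same and shrinks when it flips. The sketch below is a simplified version of the basic idea, assuming the common constants η+ = 1.2 and η− = 0.5 and omitting the weight-backtracking variants; the variable names are mine.

```python
import numpy as np

def rprop_step(w, grad, prev_grad, step,
               eta_plus=1.2, eta_minus=0.5,
               step_min=1e-6, step_max=50.0):
    """One simplified Rprop update (no weight backtracking).

    Only the sign of each gradient component is used; every weight
    has its own step size that adapts independently of the others.
    """
    sign_change = grad * prev_grad
    # Same sign as last time: grow the step. Sign flipped: shrink it.
    step = np.where(sign_change > 0, np.minimum(step * eta_plus, step_max), step)
    step = np.where(sign_change < 0, np.maximum(step * eta_minus, step_min), step)
    # Move each weight opposite to the sign of its own gradient component.
    w = w - np.sign(grad) * step
    return w, step

# Toy usage: minimise f(w) = sum(w**2), whose gradient is 2*w.
w = np.array([3.0, -2.0])
step = np.full_like(w, 0.1)
prev_grad = np.zeros_like(w)
for _ in range(50):
    grad = 2 * w
    w, step = rprop_step(w, grad, prev_grad, step)
    prev_grad = grad
print(w)  # both components driven toward 0
```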
Multilayer perceptron: A multilayer perceptron (MLP) is a modern feedforward artificial neural network consisting of fully connected neurons with nonlinear activation functions, organized in at least three layers, and notable for being able to distinguish data that is not linearly separable. The name is a misnomer because the original perceptron used a Heaviside step function rather than the nonlinear activation functions used by modern networks.
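A concrete way to see the "not linearly separable" claim is XOR, which no single linear threshold unit can compute. The hand-picked weights below implement XOR with one ReLU hidden layer; they are chosen purely for illustration, not learned.

```python
import numpy as np

def relu(x):
    return np.maximum(0.0, x)

# Hidden layer: two ReLU units, each reading both inputs.
W1 = np.array([[1.0, 1.0],
               [1.0, 1.0]])      # each hidden unit sums x1 + x2
b1 = np.array([0.0, -1.0])       # second unit only fires when both inputs are 1
# Output layer: h1 - 2*h2 gives 0, 1, 1, 0 on the four input patterns.
W2 = np.array([1.0, -2.0])

def mlp_xor(x):
    h = relu(x @ W1 + b1)
    return h @ W2

for x in ([0, 0], [0, 1], [1, 0], [1, 1]):
    print(x, int(mlp_xor(np.array(x, dtype=float))))
```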
Electrical resistance and conductance: The electrical resistance of an object is a measure of its opposition to the flow of electric current. Its reciprocal quantity is electrical conductance, measuring the ease with which an electric current passes. Electrical resistance shares some conceptual parallels with mechanical friction. The SI unit of electrical resistance is the ohm (Ω), while electrical conductance is measured in siemens (S) (formerly called the 'mho' and represented by ℧). The resistance of an object depends in large part on the material it is made of.
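Since conductance is simply the reciprocal of resistance, the conversion is a one-liner; the 50 Ω resistor value below is just an example figure.

```python
def conductance_siemens(resistance_ohms: float) -> float:
    """Conductance G = 1 / R; resistance in ohms in, siemens out."""
    return 1.0 / resistance_ohms

print(conductance_siemens(50.0))  # 0.02 S, i.e. 20 millisiemens
```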
Quantum computing: A quantum computer is a computer that exploits quantum mechanical phenomena. At small scales, physical matter exhibits properties of both particles and waves, and quantum computing leverages this behavior, specifically quantum superposition and entanglement, using specialized hardware that supports the preparation and manipulation of quantum states. Classical physics cannot explain the operation of these quantum devices, and a scalable quantum computer could perform some calculations exponentially faster than any modern "classical" computer.
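Superposition and entanglement can be illustrated numerically by simulating two qubits as a four-component state vector: a Hadamard gate puts the first qubit into superposition and a CNOT then entangles it with the second, producing a Bell state. This is a classical simulation written for illustration only, not something a quantum computer itself would run.

```python
import numpy as np

# Two-qubit state vector, amplitudes ordered |00>, |01>, |10>, |11>.
state = np.array([1, 0, 0, 0], dtype=complex)   # start in |00>

H = np.array([[1, 1],
              [1, -1]]) / np.sqrt(2)             # Hadamard: creates superposition
I2 = np.eye(2)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])                  # flips qubit 2 when qubit 1 is 1

state = np.kron(H, I2) @ state   # superpose the first qubit
state = CNOT @ state             # entangle the two qubits
print(np.round(state, 3))        # equal amplitudes on |00> and |11>: a Bell state
```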
Magnetic-core memory: Magnetic-core memory was the predominant form of random-access computer memory for 20 years, between about 1955 and 1975. Such memory is often just called core memory or, informally, core. Core memory uses toroids (rings) of a hard magnetic material (usually a semi-hard ferrite) as transformer cores, where each wire threaded through the core serves as a transformer winding. Two or more wires pass through each core. Magnetic hysteresis allows each of the cores to "remember", or store, a state.
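The role of hysteresis can be sketched with a toy model: a core flips its magnetization only when the drive current through it exceeds a switching threshold, so sending half the threshold current down one X wire and one Y wire flips only the core at their intersection (the coincident-current selection scheme used in practice). The class and numbers below are purely illustrative.

```python
class Core:
    """Toy model of one magnetic core: hysteresis keeps the state until
    the drive current exceeds the switching threshold."""

    THRESHOLD = 1.0  # arbitrary units; the half-select current is 0.5

    def __init__(self):
        self.state = 0  # magnetization direction, encoding the stored bit

    def drive(self, current):
        # Only a full-strength pulse overcomes hysteresis and flips the core.
        if current >= self.THRESHOLD:
            self.state = 1
        elif current <= -self.THRESHOLD:
            self.state = 0


# A 3x3 core plane: writing to (1, 2) sends half-select current down
# X line 1 and Y line 2; only the core at their intersection sees 1.0.
plane = [[Core() for _ in range(3)] for _ in range(3)]
x, y = 1, 2
for i in range(3):
    for j in range(3):
        current = (0.5 if i == x else 0.0) + (0.5 if j == y else 0.0)
        plane[i][j].drive(current)

print([[c.state for c in row] for row in plane])  # only plane[1][2] is set
```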