Artificial neuron
An artificial neuron is a mathematical function conceived as a model of biological neurons in a neural network. Artificial neurons are elementary units in an artificial neural network. The artificial neuron receives one or more inputs (representing excitatory postsynaptic potentials and inhibitory postsynaptic potentials at neural dendrites) and sums them to produce an output (or activation, representing a neuron's action potential which is transmitted along its axon).
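The summation described above can be sketched as a weighted sum plus a bias, passed through an activation function. The sigmoid used here is one common choice, not the only one (a threshold or ReLU would serve equally well); the weights, bias, and input values are illustrative assumptions, not values from the text.

```python
import math

def artificial_neuron(inputs, weights, bias):
    """Model neuron: weighted sum of inputs plus bias, squashed by a sigmoid.

    Positive weights play the role of excitatory synapses, negative
    weights the role of inhibitory ones. Sigmoid is an assumed choice
    of activation function.
    """
    total = sum(w * x for w, x in zip(weights, inputs)) + bias
    return 1.0 / (1.0 + math.exp(-total))  # sigmoid maps to (0, 1)

# Two inputs: one excitatory (weight 0.8), one inhibitory (weight -0.4).
out = artificial_neuron([1.0, 0.5], [0.8, -0.4], bias=0.1)
```

A strongly positive weighted sum drives the output toward 1 (the neuron "fires"); a strongly negative one drives it toward 0.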
Electrotonic potential
In physiology, electrotonus refers to the passive spread of charge inside a neuron and between cardiac muscle cells or smooth muscle cells. Passive means that voltage-dependent changes in membrane conductance do not contribute. Neurons and other excitable cells produce two types of electrical potential: the electrotonic potential (or graded potential), a non-propagated local potential resulting from a local change in ionic conductance (e.g., synaptic or sensory input) that engenders a local current, and the action potential, a propagated impulse.
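The passive, non-propagated character of an electrotonic potential is often summarized by the steady-state cable-theory result that the voltage deflection decays exponentially with distance, V(x) = V0 · exp(-x/λ). A minimal sketch, with λ (the membrane length constant) set to an illustrative 1 mm:

```python
import math

def electrotonic_potential(v0_mv, x_mm, length_constant_mm):
    """Steady-state passive decay of a graded potential with distance.

    Implements V(x) = V0 * exp(-x / lambda), the standard result for an
    infinite passive cable. The numeric values below are assumptions for
    illustration, not measurements from the text.
    """
    return v0_mv * math.exp(-x_mm / length_constant_mm)

# A 10 mV deflection falls to ~37% of its amplitude one length constant
# away -- which is why graded potentials stay local, unlike action
# potentials, which are actively regenerated along the axon.
v = electrotonic_potential(10.0, 1.0, 1.0)
```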
Axon terminal
Axon terminals (also called synaptic boutons, terminal boutons, or end-feet) are distal terminations of the telodendria (branches) of an axon. An axon, also called a nerve fiber, is a long, slender projection of a nerve cell, or neuron, that conducts electrical impulses called action potentials away from the neuron's cell body, or soma, in order to transmit those impulses to other neurons, muscle cells or glands.
Long-term potentiation
In neuroscience, long-term potentiation (LTP) is a persistent strengthening of synapses based on recent patterns of activity. These are patterns of synaptic activity that produce a long-lasting increase in signal transmission between two neurons. The opposite of LTP is long-term depression, which produces a long-lasting decrease in synaptic strength. It is one of several phenomena underlying synaptic plasticity, the ability of chemical synapses to change their strength.
Synaptic vesicle
In a neuron, synaptic vesicles (or neurotransmitter vesicles) store various neurotransmitters that are released at the synapse. The release is regulated by a voltage-dependent calcium channel. Vesicles are essential for propagating nerve impulses between neurons and are constantly recreated by the cell. The area in the axon that holds groups of vesicles is an axon terminal or "terminal bouton". Up to 130 vesicles can be released per bouton over a ten-minute period of stimulation at 0.2 Hz.
Models of neural computation
Models of neural computation are attempts to elucidate, in an abstract and mathematical fashion, the core principles that underlie information processing in biological nervous systems, or functional components thereof. This article aims to provide an overview of the most definitive models of neurobiological computation as well as the tools commonly used to construct and analyze them.
Neural oscillation
Neural oscillations, or brainwaves, are rhythmic or repetitive patterns of neural activity in the central nervous system. Neural tissue can generate oscillatory activity in many ways, driven either by mechanisms within individual neurons or by interactions between neurons. In individual neurons, oscillations can appear either as oscillations in membrane potential or as rhythmic patterns of action potentials, which then produce oscillatory activation of post-synaptic neurons.
Homeostatic plasticity
In neuroscience, homeostatic plasticity refers to the capacity of neurons to regulate their own excitability relative to network activity. The term homeostatic plasticity derives from two opposing concepts: 'homeostatic' (a product of the Greek words for 'same' and 'state' or 'condition') and plasticity (or 'change'), thus homeostatic plasticity means "staying the same through change". Homeostatic synaptic plasticity is a means of maintaining the synaptic basis for learning, respiration, and locomotion, in contrast to the Hebbian plasticity associated with learning and memory.
Neuroplasticity
Neuroplasticity, also known as neural plasticity or brain plasticity, is the ability of neural networks in the brain to change through growth and reorganization. It occurs when the brain is rewired to function in some way that differs from how it previously functioned. These changes range from individual neuron pathways making new connections to systematic adjustments such as cortical remapping. Examples of neuroplasticity include circuit and network changes that result from learning a new ability, information acquisition, environmental influences, practice, and psychological stress.
Hopfield network
A Hopfield network (or Amari-Hopfield network, Ising model of a neural network, or Ising–Lenz–Little model) is a form of recurrent artificial neural network and a type of spin glass system. It was popularised by John Hopfield in 1982, having been described earlier by Shun'ichi Amari in 1972 and by Little in 1974, building on Ernst Ising's work with Wilhelm Lenz on the Ising model. Hopfield networks serve as content-addressable ("associative") memory systems with binary threshold nodes, or with continuous variables.
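The content-addressable memory idea can be sketched in a few lines: store patterns with the Hebbian outer-product rule, then recall by repeated binary-threshold updates, which drive a corrupted probe toward the nearest stored pattern. This is a minimal sketch with an assumed 6-unit network and one stored bipolar (±1) pattern; real treatments usually use asynchronous updates, which guarantee convergence for a symmetric weight matrix.

```python
def train_hopfield(patterns):
    """Hebbian rule: w[i][j] = sum over patterns of p[i]*p[j], zero diagonal."""
    n = len(patterns[0])
    w = [[0.0] * n for _ in range(n)]
    for p in patterns:
        for i in range(n):
            for j in range(n):
                if i != j:
                    w[i][j] += p[i] * p[j]
    return w

def recall(w, state, steps=5):
    """Synchronous binary-threshold updates (a simplifying assumption;
    synchronous dynamics can oscillate in general)."""
    n = len(state)
    for _ in range(steps):
        state = [1 if sum(w[i][j] * state[j] for j in range(n)) >= 0 else -1
                 for i in range(n)]
    return state

# Store one pattern, then recover it from a probe with one flipped bit.
stored = [1, -1, 1, -1, 1, -1]
w = train_hopfield([stored])
probe = [1, -1, 1, 1, 1, -1]   # bit 3 corrupted
result = recall(w, probe)
```

With a single stored pattern the corrupted bit is corrected in one update, illustrating the "associative" retrieval the entry describes.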