Learning
Learning is the process of acquiring new understanding, knowledge, behaviors, skills, values, attitudes, and preferences. The ability to learn is possessed by humans, animals, and some machines; there is also evidence for some kind of learning in certain plants. Some learning is immediate, induced by a single event (e.g. being burned by a hot stove), but much skill and knowledge accumulate from repeated experiences. The changes induced by learning often last a lifetime, and it is hard to distinguish learned material that seems to be "lost" from that which cannot be retrieved.
Chemical synapse
Chemical synapses are biological junctions through which neurons' signals can be sent to each other and to non-neuronal cells such as those in muscles or glands. Chemical synapses allow neurons to form circuits within the central nervous system. They are crucial to the biological computations that underlie perception and thought. They allow the nervous system to connect to and control other systems of the body. At a chemical synapse, one neuron releases neurotransmitter molecules into a small space (the synaptic cleft) that is adjacent to another neuron.
Reinforcement learning
Reinforcement learning (RL) is an area of machine learning concerned with how intelligent agents ought to take actions in an environment in order to maximize cumulative reward. Reinforcement learning is one of three basic machine learning paradigms, alongside supervised learning and unsupervised learning. Reinforcement learning differs from supervised learning in not needing labelled input/output pairs to be presented, and in not needing sub-optimal actions to be explicitly corrected.
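As an illustration of the paradigm, the following minimal sketch applies tabular Q-learning, one classic reinforcement-learning algorithm, to a small made-up chain environment. The environment, reward values, and hyperparameters are illustrative assumptions, not part of any standard benchmark; the point is that the agent improves purely from the rewards it receives, without labelled input/output pairs.

```python
# Minimal sketch: tabular Q-learning on a tiny chain environment.
# The environment, reward, and hyperparameters are illustrative assumptions.
import random

N_STATES = 5            # states 0..4; reaching state 4 ends the episode with reward
ACTIONS = [-1, +1]      # move left or right
ALPHA, GAMMA, EPSILON = 0.1, 0.9, 0.1

Q = {(s, a): 0.0 for s in range(N_STATES) for a in ACTIONS}

def step(state, action):
    """Apply an action; reward 1.0 only at the terminal right end."""
    next_state = min(max(state + action, 0), N_STATES - 1)
    reward = 1.0 if next_state == N_STATES - 1 else 0.0
    return next_state, reward, next_state == N_STATES - 1

for episode in range(500):
    state, done = 0, False
    while not done:
        # Epsilon-greedy action selection: explore occasionally, otherwise exploit.
        if random.random() < EPSILON:
            action = random.choice(ACTIONS)
        else:
            action = max(ACTIONS, key=lambda a: Q[(state, a)])
        next_state, reward, done = step(state, action)
        # Q-learning update: move the estimate toward reward + discounted best future value.
        best_next = max(Q[(next_state, a)] for a in ACTIONS)
        Q[(state, action)] += ALPHA * (reward + GAMMA * best_next - Q[(state, action)])
        state = next_state

# After training, the greedy policy moves right toward the rewarding state.
print({s: max(ACTIONS, key=lambda a: Q[(s, a)]) for s in range(N_STATES)})
```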
Machine learning
Machine learning (ML) is an umbrella term for approaches that solve problems for which developing explicit algorithms by hand would be cost-prohibitive; instead, machines 'discover' their 'own' algorithms from data, without being explicitly programmed for the task. Recently, generative artificial neural networks have been able to surpass the results of many previous approaches.
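A toy sketch of this "learn from data instead of hand-coding rules" idea: rather than programming the rule y = 3x + 1 directly, the parameters below are recovered from example pairs by gradient descent. The data and learning rate are arbitrary choices for illustration.

```python
# Minimal sketch: fit y ≈ w*x + b from examples by stochastic gradient descent.
# The data and learning rate are illustrative assumptions.
examples = [(x, 3.0 * x + 1.0) for x in range(10)]  # "unknown" rule: y = 3x + 1

w, b, lr = 0.0, 0.0, 0.01
for _ in range(2000):
    for x, y in examples:
        error = (w * x + b) - y          # prediction error on one example
        w -= lr * error * x              # nudge parameters to reduce the error
        b -= lr * error

print(round(w, 2), round(b, 2))          # approaches 3.0 and 1.0
```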
Postsynaptic potential
Postsynaptic potentials are changes in the membrane potential of the postsynaptic terminal of a chemical synapse. Postsynaptic potentials are graded potentials and should not be confused with action potentials, although their function is to initiate or inhibit action potentials. They are caused by the presynaptic neuron releasing neurotransmitters from the terminal bouton at the end of an axon into the synaptic cleft. The neurotransmitters bind to receptors on the postsynaptic terminal, which may be a neuron or a muscle cell in the case of a neuromuscular junction.
Activity-dependent plasticity
Activity-dependent plasticity is a form of functional and structural neuroplasticity that arises from the use of cognitive functions and personal experience; hence, it is the biological basis for learning and the formation of new memories. Activity-dependent plasticity is a form of neuroplasticity that arises from intrinsic or endogenous activity, as opposed to forms of neuroplasticity that arise from extrinsic or exogenous factors, such as electrical brain stimulation- or drug-induced neuroplasticity.
Inhibitory postsynaptic potential
An inhibitory postsynaptic potential (IPSP) is a kind of synaptic potential that makes a postsynaptic neuron less likely to generate an action potential. IPSPs were first investigated in motoneurons by David P. C. Lloyd, John Eccles and Rodolfo Llinás in the 1950s and 1960s. The opposite of an inhibitory postsynaptic potential is an excitatory postsynaptic potential (EPSP), which is a synaptic potential that makes a postsynaptic neuron more likely to generate an action potential.
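A toy model of how these graded potentials combine: the sketch below sums excitatory and inhibitory contributions in a leaky integrator and emits a spike when the membrane variable crosses a threshold. The numerical values are illustrative, not physiological measurements.

```python
# Toy sketch of EPSP/IPSP summation at the postsynaptic membrane: a leaky
# integrator accumulates graded potentials and fires when it crosses a
# threshold. Parameter values are illustrative assumptions, not physiology.
REST, THRESHOLD, LEAK = -70.0, -55.0, 0.9   # mV-scale toy values; LEAK pulls V back toward rest

def run(inputs):
    """inputs: per-step net synaptic drive in mV (positive = EPSP, negative = IPSP)."""
    v = REST
    spikes = []
    for t, psp in enumerate(inputs):
        v = REST + LEAK * (v - REST) + psp   # decay toward rest, then add this step's PSPs
        if v >= THRESHOLD:                   # enough net excitation -> action potential
            spikes.append(t)
            v = REST                         # reset after firing
    return spikes

# Excitation alone eventually reaches threshold and the cell spikes repeatedly...
print(run([6.0] * 10))
# ...but interleaved IPSPs keep the membrane below threshold, so no spikes occur.
print(run([6.0, -6.0] * 5))
```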
Complex network
In the context of network theory, a complex network is a graph (network) with non-trivial topological features—features that do not occur in simple networks such as lattices or random graphs but often occur in networks representing real systems. The study of complex networks is a young and active area of scientific research (since 2000) inspired largely by empirical findings of real-world networks such as computer networks, biological networks, technological networks, brain networks, climate networks and social networks.
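One way to see such non-trivial features is to compare the degree statistics of a plain random graph with those of a preferential-attachment (scale-free) graph, which develops hubs like many real-world networks. The sketch below assumes the networkx library is available; the graph sizes and parameters are arbitrary choices.

```python
# Sketch contrasting a simple random graph with a graph that has non-trivial
# topology (a heavy-tailed degree distribution). Assumes networkx is installed;
# sizes and parameters are illustrative.
import networkx as nx

n = 1000
random_graph = nx.erdos_renyi_graph(n, p=6 / n)    # ~6 edges per node, no structure
scale_free = nx.barabasi_albert_graph(n, m=3)      # preferential attachment -> hubs

for name, g in [("random", random_graph), ("scale-free", scale_free)]:
    degrees = [d for _, d in g.degree()]
    print(f"{name:>10}: max degree {max(degrees):4d}, "
          f"mean degree {sum(degrees) / n:.1f}, "
          f"clustering {nx.average_clustering(g):.3f}")

# The scale-free graph typically shows a far larger maximum degree (hubs),
# a hallmark of many real-world networks that random graphs lack.
```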
Complex system
A complex system is a system composed of many components which may interact with each other. Examples of complex systems are Earth's global climate, organisms, the human brain, infrastructure such as power grids, transportation networks, or communication systems, complex software and electronic systems, social and economic organizations (like cities), an ecosystem, a living cell, and ultimately the entire universe.
Synaptic plasticity
In neuroscience, synaptic plasticity is the ability of synapses to strengthen or weaken over time, in response to increases or decreases in their activity. Since memories are postulated to be represented by vastly interconnected neural circuits in the brain, synaptic plasticity is one of the important neurochemical foundations of learning and memory (see Hebbian theory). Plastic change often results from the alteration of the number of neurotransmitter receptors located on a synapse.
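A minimal sketch of a Hebbian-style plasticity rule in the spirit of "cells that fire together wire together": a synaptic weight strengthens when presynaptic and postsynaptic activity coincide and slowly decays otherwise. The rate constants and activity traces below are illustrative assumptions, not a model from the literature.

```python
# Minimal sketch of a Hebbian-style plasticity rule: correlated pre/post
# activity strengthens the weight, and a slow decay weakens unused synapses.
# Rate constants and activity traces are illustrative assumptions.
ETA, DECAY = 0.1, 0.01

def update(weight, pre, post):
    """One plasticity step; pre/post are activity levels in [0, 1]."""
    weight += ETA * pre * post        # coincident activity strengthens the synapse
    weight -= DECAY * weight          # slow decay weakens it otherwise
    return weight

w = 0.5
for pre, post in [(1.0, 1.0)] * 20:   # pre and post active together
    w = update(w, pre, post)
print(f"after correlated activity:   {w:.2f}")   # weight grows

w = 0.5
for pre, post in [(1.0, 0.0)] * 20:   # pre active, post silent
    w = update(w, pre, post)
print(f"after uncorrelated activity: {w:.2f}")   # weight decays
```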