A progenitor cell is a biological cell that can differentiate into a specific cell type. Stem cells and progenitor cells have this ability in common. However, stem cells are less specified than progenitor cells. Progenitor cells can only differentiate into their "target" cell type. The most important difference between stem cells and progenitor cells is that stem cells can replicate indefinitely, whereas progenitor cells can divide only a limited number of times. Controversy about the exact definition remains and the concept is still evolving.
Neural stem cells (NSCs) are self-renewing, multipotent cells that first generate the radial glial progenitor cells, which in turn generate the neurons and glia of the nervous system of all animals during embryonic development. Some neural progenitor stem cells persist in highly restricted regions in the adult vertebrate brain and continue to produce neurons throughout life. Differences in the size of the central nervous system are among the most important distinctions between species, and thus mutations in the genes that regulate the size of the neural stem cell compartment are among the most important drivers of vertebrate evolution.
The development of the nervous system, or neural development (neurodevelopment), refers to the processes that generate, shape, and reshape the nervous system of animals, from the earliest stages of embryonic development to adulthood. The field of neural development draws on both neuroscience and developmental biology to describe and provide insight into the cellular and molecular mechanisms by which complex nervous systems develop, from nematodes and fruit flies to mammals.
The central nervous system (CNS) is the part of the nervous system consisting primarily of the brain and spinal cord. The CNS is so named because the brain integrates the received information and coordinates and influences the activity of all parts of the bodies of bilaterally symmetric and triploblastic animals—that is, all multicellular animals except sponges and diploblasts. It is a structure composed of nervous tissue positioned along the rostral (nose end) to caudal (tail end) axis of the body and may have an enlarged section at the rostral end which is a brain.
A recurrent neural network (RNN) is one of the two broad types of artificial neural network, characterized by the direction of the flow of information between its layers. In contrast to the uni-directional feedforward neural network, it is a bi-directional artificial neural network, meaning that it allows the output from some nodes to affect subsequent input to the same nodes. The ability to use internal state (memory) to process arbitrary sequences of inputs makes RNNs applicable to tasks such as unsegmented, connected handwriting recognition and speech recognition.
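As a concrete illustration, the following Python sketch shows how the hidden state of a recurrent layer is fed back into the same nodes at every time step, giving the network an internal memory over the sequence. The layer sizes, weight names, and choice of tanh are illustrative assumptions, not taken from any particular library or from the text above.

```python
import numpy as np

# Minimal sketch of a single recurrent layer (illustrative names and sizes):
# the hidden state h is fed back into the cell at every step, which is what
# lets the network "remember" earlier inputs.
rng = np.random.default_rng(0)
input_size, hidden_size = 3, 4
W_xh = rng.normal(scale=0.1, size=(hidden_size, input_size))   # input -> hidden
W_hh = rng.normal(scale=0.1, size=(hidden_size, hidden_size))  # hidden -> hidden (the recurrence)
b_h = np.zeros(hidden_size)

def rnn_step(x_t, h_prev):
    """One time step: the new state depends on the current input and the previous state."""
    return np.tanh(W_xh @ x_t + W_hh @ h_prev + b_h)

sequence = rng.normal(size=(5, input_size))  # a toy sequence of 5 input vectors
h = np.zeros(hidden_size)                    # initial internal state (memory)
for x_t in sequence:
    h = rnn_step(x_t, h)                     # output of each step feeds back in
print(h)
```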
In mathematics and computer algebra, automatic differentiation (auto-differentiation, autodiff, or AD), also called algorithmic differentiation or computational differentiation, is a set of techniques to evaluate the partial derivative of a function specified by a computer program. Automatic differentiation exploits the fact that every computer calculation, no matter how complicated, executes a sequence of elementary arithmetic operations (addition, subtraction, multiplication, division, etc.) and elementary functions (exp, log, sin, cos, etc.). By applying the chain rule repeatedly to these operations, partial derivatives can be computed automatically and accurately to working precision.
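The idea can be illustrated with forward-mode automatic differentiation using dual numbers. The Python sketch below is a minimal, illustrative implementation (the Dual class and the sin helper are hypothetical names, not part of any standard library): every elementary operation propagates both a value and a derivative, so the chain rule is applied step by step as the program executes.

```python
from dataclasses import dataclass
import math

@dataclass
class Dual:
    val: float   # value of the expression
    der: float   # derivative with respect to the chosen input

    def __add__(self, other):
        # sum rule
        return Dual(self.val + other.val, self.der + other.der)

    def __mul__(self, other):
        # product rule
        return Dual(self.val * other.val,
                    self.der * other.val + self.val * other.der)

def sin(d: Dual) -> Dual:
    # chain rule for an elementary function
    return Dual(math.sin(d.val), math.cos(d.val) * d.der)

# Differentiate f(x) = x * sin(x) at x = 2 by seeding dx/dx = 1.
x = Dual(2.0, 1.0)
f = x * sin(x)
print(f.val, f.der)   # f(2) and f'(2) = sin(2) + 2*cos(2)
```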
Pyramidal cells, or pyramidal neurons, are a type of multipolar neuron found in areas of the brain including the cerebral cortex, the hippocampus, and the amygdala. Pyramidal cells are the primary excitation units of the mammalian prefrontal cortex and the corticospinal tract. Pyramidal neurons are also one of two cell types in which Negri bodies, the characteristic sign of rabies infection, are found post mortem. Pyramidal neurons were first discovered and studied by Santiago Ramón y Cajal.
Artificial neural networks (ANNs, also shortened to neural networks (NNs) or neural nets) are a branch of machine learning models built on principles of neuronal organization, studied in connectionism, that are found in the biological neural networks constituting animal brains. An ANN is based on a collection of connected units or nodes called artificial neurons, which loosely model the neurons in a biological brain. Each connection, like the synapses in a biological brain, can transmit a signal to other neurons.
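A single artificial neuron can be sketched in a few lines of Python. The weights, inputs, and the choice of tanh as the nonlinearity below are illustrative assumptions: incoming signals are weighted by their connections, summed with a bias, and passed through an activation function to produce the signal sent on to other neurons.

```python
import numpy as np

# Minimal sketch of one artificial neuron (weights and inputs are made up
# for illustration): each incoming connection has a weight, the weighted
# inputs are summed with a bias, and a nonlinearity produces the output
# signal passed on to other neurons.
def neuron(inputs, weights, bias):
    pre_activation = np.dot(weights, inputs) + bias  # weighted sum of incoming signals
    return np.tanh(pre_activation)                   # nonlinear activation

inputs = np.array([0.5, -1.2, 0.3])
weights = np.array([0.8, 0.1, -0.4])
print(neuron(inputs, weights, bias=0.05))
```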
Cellular differentiation is the process in which a stem cell changes from one type to a differentiated one. Usually, the cell changes to a more specialized type. Differentiation happens multiple times during the development of a multicellular organism as it changes from a simple zygote to a complex system of tissues and cell types. Differentiation continues in adulthood as adult stem cells divide and create fully differentiated daughter cells during tissue repair and during normal cell turnover.
In the context of artificial neural networks, the rectifier or ReLU (rectified linear unit) is an activation function defined as the positive part of its argument: f(x) = max(0, x), where x is the input to a neuron. This is also known as a ramp function and is analogous to half-wave rectification in electrical engineering. This activation function was introduced by Kunihiko Fukushima in 1969 in the context of visual feature extraction in hierarchical neural networks.
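A minimal Python sketch of the rectifier (NumPy is used for convenience, and the function name is illustrative): it passes positive inputs through unchanged and clamps negative inputs to zero, which is exactly the "positive part" of the argument.

```python
import numpy as np

# The rectifier: positive inputs pass through, negative inputs become zero.
def relu(x):
    return np.maximum(0.0, x)

x = np.array([-2.0, -0.5, 0.0, 1.5, 3.0])
print(relu(x))   # [0.  0.  0.  1.5 3. ]
```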