Deep brain stimulation (DBS) is a neurosurgical procedure involving the placement of a medical device called a neurostimulator, which sends electrical impulses, through implanted electrodes, to specific targets in the brain (brain nuclei) for the treatment of movement disorders, including Parkinson's disease, essential tremor, and dystonia, as well as other conditions such as obsessive-compulsive disorder (OCD) and epilepsy. While its underlying principles and mechanisms are not fully understood, DBS directly changes brain activity in a controlled manner.
A spinal cord stimulator (SCS) or dorsal column stimulator (DCS) is a type of implantable neuromodulation device (sometimes called a "pain pacemaker") that is used to send electrical signals to select areas of the spinal cord (dorsal columns) for the treatment of certain pain conditions. SCS is a consideration for people who have a pain condition that has not responded to more conservative therapy. There are also spinal cord stimulators under research and development that could enable patients with spinal cord injury to walk again via epidural electrical stimulation (EES).
Artificial neural networks (ANNs, also shortened to neural networks (NNs) or neural nets) are a class of machine learning models built on principles of neuronal organization, discovered by connectionism, in the biological neural networks constituting animal brains. An ANN is based on a collection of connected units or nodes called artificial neurons, which loosely model the neurons in a biological brain. Each connection, like the synapses in a biological brain, can transmit a signal to other neurons.
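The description above can be made concrete with a minimal sketch, assuming NumPy (the text names no library): each artificial neuron sums its weighted inputs, applies an activation function, and passes its output along a connection to other neurons.

```python
import numpy as np

def neuron(inputs, weights, bias):
    # Each connection scales an incoming signal by a weight, loosely
    # analogous to a synapse; the neuron sums these and applies tanh.
    return np.tanh(np.dot(weights, inputs) + bias)

x = np.array([0.5, -1.2, 3.0])                             # incoming signals
hidden = neuron(x, np.array([0.1, 0.4, -0.2]), bias=0.05)  # first neuron
output = neuron(np.array([hidden]), np.array([1.5]), bias=-0.3)  # its signal feeds a second neuron
print(output)
```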
Spiking neural networks (SNNs) are artificial neural networks that more closely mimic natural neural networks. In addition to neuronal and synaptic state, SNNs incorporate the concept of time into their operating model. The idea is that neurons in the SNN do not transmit information at each propagation cycle (as happens with typical multi-layer perceptron networks), but rather transmit information only when a membrane potential (an intrinsic quality of the neuron related to its membrane electrical charge) reaches a specific value, called the threshold.
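As a hedged illustration of this threshold behaviour (a leaky integrate-and-fire neuron in plain Python; the constants are assumptions, not taken from the text), the membrane potential integrates an input current and a spike is transmitted only when the threshold is crossed:

```python
dt, tau = 1.0, 20.0                            # time step and membrane time constant (ms)
v, v_rest, v_threshold = -65.0, -65.0, -50.0   # membrane potentials (mV)
spike_times = []

for t in range(200):                           # 200 ms of simulated time
    input_current = 20.0 if 50 <= t < 150 else 0.0
    v += dt / tau * (-(v - v_rest) + input_current)   # leaky integration toward rest
    if v >= v_threshold:                       # threshold reached: emit a spike
        spike_times.append(t)
        v = v_rest                             # reset the membrane potential
print(spike_times)                             # spikes occur only while input arrives
```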
A spinal cord injury (SCI) is damage to the spinal cord that causes temporary or permanent changes in its function. Symptoms may include loss of muscle function, sensation, or autonomic function in the parts of the body served by the spinal cord below the level of the injury. Injury can occur at any level of the spinal cord and can be complete, with a total loss of sensation and muscle function at the lower sacral segments, or incomplete, meaning some nervous signals are able to travel past the injured area of the cord up to the sacral S4-S5 spinal cord segments.
Spinal cord injury research seeks new ways to cure or treat spinal cord injury in order to lessen the debilitating effects of the injury in the short or long term. There is no cure for SCI, and current treatments are mostly focused on rehabilitation and management of the secondary effects of the condition. Two major areas of research are neuroprotection, which aims to prevent damage to cells caused by biological processes that take place in the body after the injury, and neuroregeneration, which aims to regrow or replace damaged neural circuits.
In the context of artificial neural networks, the rectifier or ReLU (rectified linear unit) activation function is defined as the positive part of its argument: f(x) = max(0, x), where x is the input to a neuron. This is also known as a ramp function and is analogous to half-wave rectification in electrical engineering. This activation function was introduced by Kunihiko Fukushima in 1969 in the context of visual feature extraction in hierarchical neural networks.
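A direct transcription of this definition, assuming NumPy (not named in the text), keeps the positive part of each input and zeroes the rest:

```python
import numpy as np

def relu(x):
    # f(x) = max(0, x): passes positive inputs unchanged, zeroes negative ones.
    return np.maximum(0, x)

print(relu(np.array([-2.0, -0.5, 0.0, 1.5, 3.0])))  # -> 0, 0, 0, 1.5, 3.0
```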
A convolutional neural network (CNN) is a regularized type of feed-forward neural network that learns features by itself via filter (or kernel) optimization. Vanishing and exploding gradients, seen during backpropagation in earlier neural networks, are prevented by using regularized weights over fewer connections. For example, in a fully connected layer, each neuron would require 10,000 weights to process an image of 100 × 100 pixels; a convolutional layer instead reuses a small shared kernel across the whole image.
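The parameter-count contrast can be sketched with PyTorch (an assumed choice; the text names no framework): a single fully connected neuron needs one weight per pixel, while a 3 × 3 convolutional kernel is shared across every position in the image.

```python
import torch
import torch.nn as nn

fully_connected = nn.Linear(in_features=100 * 100, out_features=1)  # one output neuron
conv = nn.Conv2d(in_channels=1, out_channels=1, kernel_size=3, padding=1)

print(fully_connected.weight.numel())  # 10000 weights for a single neuron
print(conv.weight.numel())             # 9 shared weights, reused across the image

image = torch.randn(1, 1, 100, 100)    # (batch, channels, height, width)
feature_map = conv(image)              # same spatial size thanks to padding=1
print(feature_map.shape)               # torch.Size([1, 1, 100, 100])
```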
Neural oscillations, or brainwaves, are rhythmic or repetitive patterns of neural activity in the central nervous system. Neural tissue can generate oscillatory activity in many ways, driven either by mechanisms within individual neurons or by interactions between neurons. In individual neurons, oscillations can appear either as oscillations in membrane potential or as rhythmic patterns of action potentials, which then produce oscillatory activation of post-synaptic neurons.
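As a purely illustrative sketch (NumPy assumed; the 10 Hz figure is an arbitrary alpha-band example, not from the text), a noisy sinusoid can stand in for an oscillating membrane potential, and its dominant rhythm can be recovered from the spectrum:

```python
import numpy as np

fs = 1000                            # sampling rate in Hz
t = np.arange(0, 2.0, 1 / fs)        # 2 seconds of simulated "recording"
signal = np.sin(2 * np.pi * 10 * t) + 0.2 * np.random.randn(t.size)  # 10 Hz rhythm plus noise

spectrum = np.abs(np.fft.rfft(signal))
freqs = np.fft.rfftfreq(t.size, d=1 / fs)
print(f"dominant frequency: {freqs[np.argmax(spectrum)]:.1f} Hz")  # ~10 Hz
```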
Microelectrode arrays (MEAs) (also referred to as multielectrode arrays) are devices that contain multiple (tens to thousands) microelectrodes through which neural signals are obtained or delivered, essentially serving as neural interfaces that connect neurons to electronic circuitry. There are two general classes of MEAs: implantable MEAs, used in vivo, and non-implantable MEAs, used in vitro. Neurons and muscle cells create ion currents through their membranes when excited, causing a change in voltage between the inside and the outside of the cell.
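A hedged sketch of processing one such channel in software (NumPy assumed; the sampling rate, noise level, and the -5 × standard-deviation threshold are illustrative conventions, not from the text): excited cells produce brief voltage deflections that simple thresholding can pick out.

```python
import numpy as np

fs = 20000                                   # 20 kHz sampling of one electrode
trace = 5e-6 * np.random.randn(fs)           # 1 s of baseline noise, in volts
trace[[2000, 9000, 15000]] -= 60e-6          # three simulated spikes (negative peaks)

threshold = -5 * np.std(trace)               # detection threshold relative to noise
spike_samples = np.where(trace < threshold)[0]
print(spike_samples / fs)                    # detected spike times in seconds
```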