Deep learning
Deep learning is part of a broader family of machine learning methods based on artificial neural networks with representation learning. The adjective "deep" refers to the use of multiple layers in the network. The methods used can be supervised, semi-supervised or unsupervised.
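To make the "multiple layers" concrete, here is a minimal sketch of a feed-forward deep network in plain NumPy; the layer sizes, initialization scale, and ReLU choice are illustrative assumptions, not taken from any particular model:

```python
import numpy as np

def relu(x):
    return np.maximum(0.0, x)

class DeepNet:
    """Minimal feed-forward network: the 'deep' in deep learning
    means several layers are stacked between input and output."""
    def __init__(self, sizes, seed=0):
        rng = np.random.default_rng(seed)
        # One (weights, biases) pair per layer, e.g. sizes=[4, 16, 16, 3].
        self.layers = [(rng.normal(0, 0.1, (m, n)), np.zeros(n))
                       for m, n in zip(sizes, sizes[1:])]

    def forward(self, x):
        # Each hidden layer re-represents its input; stacking layers is
        # what lets the network learn increasingly abstract representations.
        for W, b in self.layers[:-1]:
            x = relu(x @ W + b)
        W, b = self.layers[-1]
        return x @ W + b  # final linear layer

net = DeepNet([4, 16, 16, 3])
print(net.forward(np.ones(4)).shape)  # (3,)
```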
Types of artificial neural networks
There are many types of artificial neural networks (ANN). Artificial neural networks are computational models inspired by biological neural networks, and are used to approximate functions that are generally unknown. In particular, they are inspired by the behaviour of neurons and the electrical signals they convey between input (such as from the eyes or nerve endings in the hand), processing, and output from the brain (such as reacting to light, touch, or heat). The way neurons semantically communicate is an area of ongoing research.
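As a sketch of the biological analogy, a single artificial neuron sums weighted input signals (loosely analogous to synaptic inputs) and passes the result through a nonlinearity; the input values, weights, and sigmoid choice below are arbitrary illustrative assumptions:

```python
import numpy as np

def artificial_neuron(inputs, weights, bias):
    """One unit of an ANN: a weighted sum of incoming signals
    followed by a nonlinearity that sets how strongly the
    unit 'fires' in response."""
    activation = np.dot(inputs, weights) + bias
    return 1.0 / (1.0 + np.exp(-activation))  # sigmoid firing strength

# Three input signals standing in for stimuli (values are made up).
print(artificial_neuron(np.array([0.5, -1.2, 0.3]),
                        np.array([0.8, 0.1, -0.4]),
                        bias=0.2))
```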
Brain cell
Brain cells make up the functional tissue of the brain. The rest of the brain tissue is structural or connective tissue, called the stroma, which includes blood vessels. The two main types of cells in the brain are neurons, also known as nerve cells, and glial cells, also known as neuroglia. Neurons are the excitable cells of the brain; they function by communicating with other neurons and interneurons (via synapses) in neural circuits and larger brain networks.
Estimation theory
Estimation theory is a branch of statistics that deals with estimating the values of parameters based on measured empirical data that has a random component. The parameters describe an underlying physical setting in such a way that their value affects the distribution of the measured data. An estimator attempts to approximate the unknown parameters using the measurements.
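A minimal worked example, assuming the textbook setting of a fixed but unknown parameter observed through additive Gaussian noise: the sample mean serves as the estimator, and its standard error shrinks as the number of measurements grows:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical setup: an unknown constant theta (say, a DC voltage)
# observed through noisy measurements x_i = theta + noise.
theta_true = 3.0
measurements = theta_true + rng.normal(0.0, 0.5, size=1000)

# The sample mean is a classic estimator of theta: it is unbiased,
# and its variance shrinks as 1/N, so more data tightens the estimate.
theta_hat = measurements.mean()
std_error = measurements.std(ddof=1) / np.sqrt(len(measurements))
print(f"estimate = {theta_hat:.3f} +/- {std_error:.3f}")
```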
Large-scale brain network
Large-scale brain networks (also known as intrinsic brain networks) are collections of widespread brain regions showing functional connectivity by statistical analysis of the fMRI BOLD signal or other recording methods such as EEG, PET and MEG. An emerging paradigm in neuroscience is that cognitive tasks are performed not by individual brain regions working in isolation but by networks consisting of several discrete brain regions that are said to be "functionally connected".
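A rough sketch of the statistical analysis involved: correlate region-wise time series and threshold the result to define network edges. The synthetic signals and the 0.3 threshold below are purely illustrative assumptions; real pipelines extract the time series from preprocessed fMRI data:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical BOLD-like time series for 5 brain regions
# (rows = regions, columns = time points).
shared = rng.normal(size=200)        # common fluctuation
ts = rng.normal(size=(5, 200))
ts[0] += shared                      # regions 0 and 1 co-activate,
ts[1] += shared                      # so they should appear connected

# Functional connectivity as pairwise Pearson correlation of signals.
fc = np.corrcoef(ts)

# Call two regions "functionally connected" above a chosen threshold;
# 0.3 here is illustrative, not a standard value.
edges = [(i, j) for i in range(5) for j in range(i + 1, 5) if fc[i, j] > 0.3]
print(edges)  # expect (0, 1) to appear
```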
Magnetic resonance imaging
Magnetic resonance imaging (MRI) is a medical imaging technique used in radiology to form pictures of the anatomy and the physiological processes of the body. MRI scanners use strong magnetic fields, magnetic field gradients, and radio waves to generate images of the organs in the body. MRI does not involve X-rays or the use of ionizing radiation, which distinguishes it from computed tomography (CT) and positron emission tomography (PET) scans.
Molecular diffusion
Molecular diffusion, often simply called diffusion, is the thermal motion of all (liquid or gas) particles at temperatures above absolute zero. The rate of this movement is a function of temperature, the viscosity of the fluid and the size (mass) of the particles. Diffusion explains the net flux of molecules from a region of higher concentration to one of lower concentration. Once the concentrations are equal, the molecules continue to move, but since there is no concentration gradient the process of molecular diffusion has ceased; the motion is then described as self-diffusion, originating from the random motion of the molecules.
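A toy simulation illustrating the point: each particle moves entirely at random, yet the net flux runs from the crowded region toward the empty one until the concentrations roughly equalize. The box size, step rule, and particle count are arbitrary assumptions for the sketch:

```python
import numpy as np

rng = np.random.default_rng(1)

# 1-D random walk: start all particles in the left half of a box
# (high concentration) and let each take unbiased +/-1 steps,
# standing in for thermal motion.
positions = rng.integers(0, 50, size=10_000).astype(float)

for _ in range(2000):
    positions += rng.choice([-1.0, 1.0], size=positions.size)
    positions = np.clip(positions, 0, 99)  # reflecting walls of the box

# Despite purely random individual motion, the counts in the two
# halves converge; after equilibrium, the same random motion
# continues as self-diffusion with no net flux.
left = int(np.sum(positions < 50))
print(f"left half: {left}, right half: {positions.size - left}")
```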
Connectome
A connectome (/kəˈnɛktoʊm/) is a comprehensive map of neural connections in the brain, and may be thought of as its "wiring diagram". An organism's nervous system is made up of neurons which communicate through synapses. A connectome is constructed by tracing the neurons in a nervous system and mapping where they are connected through synapses. The significance of the connectome stems from the realization that the structure and function of the human brain are intricately linked through multiple levels and modes of brain connectivity.
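As a data-structure sketch, a "wiring diagram" is naturally a directed graph with neurons as nodes and synapses as edges; the neuron names and connections below are invented for illustration, not real anatomy:

```python
# Toy connectome as an adjacency list: neuron -> downstream synaptic targets.
connectome = {
    "sensory_1": ["inter_1"],
    "sensory_2": ["inter_1", "motor_1"],
    "inter_1":   ["motor_1", "motor_2"],
    "motor_1":   [],
    "motor_2":   [],
}

def reachable(graph, start):
    """Follow edges from a starting neuron: which neurons can a
    signal originating there eventually reach?"""
    seen, stack = set(), [start]
    while stack:
        node = stack.pop()
        for nxt in graph[node]:
            if nxt not in seen:
                seen.add(nxt)
                stack.append(nxt)
    return seen

print(reachable(connectome, "sensory_1"))  # {'inter_1', 'motor_1', 'motor_2'}
```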
Transformer (machine learning model)
A transformer is a deep learning architecture that relies on the parallel multi-head attention mechanism. The modern transformer was proposed in the 2017 paper "Attention Is All You Need" by Ashish Vaswani et al. of the Google Brain team. It is notable for requiring less training time than previous recurrent architectures such as long short-term memory (LSTM), and its later variants have been widely adopted for training large language models on large (language) datasets such as the Wikipedia corpus and Common Crawl, by virtue of their parallelized processing of the input sequence.
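A minimal sketch of the scaled dot-product attention at the core of the architecture: every position attends to every other position in one matrix product, which is what makes the computation parallelizable compared with step-by-step recurrent models. The shapes below are toy assumptions; real transformers add learned per-head projections, masking, and concatenation of head outputs:

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention(Q, K, V):
    """Scaled dot-product attention: weights each value by how well
    its key matches the query, for all positions simultaneously."""
    d_k = Q.shape[-1]
    scores = Q @ K.swapaxes(-1, -2) / np.sqrt(d_k)  # (heads, seq, seq)
    return softmax(scores) @ V                      # (heads, seq, d_k)

# Toy shapes: (heads, sequence length, head dimension). Running several
# heads at once is the "multi-head" part of the mechanism.
rng = np.random.default_rng(0)
Q = rng.normal(size=(8, 10, 64))
K = rng.normal(size=(8, 10, 64))
V = rng.normal(size=(8, 10, 64))
print(attention(Q, K, V).shape)  # (8, 10, 64)
```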
Squid giant axon
The squid giant axon is the very large (up to 1.5 mm in diameter; typically around 0.5 mm) axon that controls part of the water jet propulsion system in squid. It was first described by L. W. Williams in 1909, but the discovery was forgotten until English zoologist and neurophysiologist J. Z. Young demonstrated the axon's function in the 1930s while working at the Stazione Zoologica in Naples, the Marine Biological Association in Plymouth and the Marine Biological Laboratory in Woods Hole.