Information theory
Information theory is the mathematical study of the quantification, storage, and communication of information. The field was originally established by the works of Harry Nyquist and Ralph Hartley in the 1920s and Claude Shannon in the 1940s. A branch of applied mathematics, it lies at the intersection of probability theory, statistics, computer science, statistical mechanics, information engineering, and electrical engineering. A key measure in information theory is entropy.
Mutual information
In probability theory and information theory, the mutual information (MI) of two random variables is a measure of the mutual dependence between the two variables. More specifically, it quantifies the "amount of information" (in units such as shannons (bits), nats, or hartleys) obtained about one random variable by observing the other random variable. The concept of mutual information is intimately linked to that of the entropy of a random variable, a fundamental notion in information theory that quantifies the expected "amount of information" held in a random variable.
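As an illustration of this definition, the sketch below (the joint distribution and the function name are invented for the example) computes the mutual information of two discrete random variables directly from a joint probability table:

```python
import math

def mutual_information(joint, base=2.0):
    """I(X;Y) for a discrete joint distribution; joint[i][j] = P(X=i, Y=j)."""
    px = [sum(row) for row in joint]          # marginal p(x)
    py = [sum(col) for col in zip(*joint)]    # marginal p(y)
    mi = 0.0
    for i, row in enumerate(joint):
        for j, pxy in enumerate(row):
            if pxy > 0:
                # I(X;Y) = sum_{x,y} p(x,y) * log( p(x,y) / (p(x) p(y)) )
                mi += pxy * math.log(pxy / (px[i] * py[j]), base)
    return mi

# Hypothetical joint distribution of two correlated binary variables.
joint = [[0.4, 0.1],
         [0.1, 0.4]]
print(mutual_information(joint))          # ~0.278 shannons (bits)
print(mutual_information(joint, math.e))  # the same quantity in nats
```

The base argument mirrors the choice of units mentioned above: base 2 yields shannons (bits), base e yields nats, and base 10 yields hartleys.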
Sensory-motor coupling
Sensory-motor coupling is the coupling or integration of the sensory system and motor system. Sensorimotor integration is not a static process: for a given stimulus, there is no single motor command. "Neural responses at almost every stage of a sensorimotor pathway are modified at short and long timescales by biophysical and synaptic processes, recurrent and feedback connections, and learning, as well as many other internal and external variables".
Developmental coordination disorder
Developmental coordination disorder (DCD), also known as developmental motor coordination disorder, developmental dyspraxia, or simply dyspraxia (from Ancient Greek praxis 'activity'), is a neurodevelopmental disorder characterized by impaired coordination of physical movements as a result of brain messages not being accurately transmitted to the body. Deficits in fine or gross motor skills interfere with activities of daily living.
Motor coordination
In physiology, motor coordination is the orchestrated movement of multiple body parts as required to accomplish intended actions, like walking. This coordination is achieved by adjusting kinematic and kinetic parameters associated with each body part involved in the intended movement. The modification of these parameters typically relies on sensory feedback from one or more sensory modalities (see multisensory integration), such as proprioception and vision.
Sensory neuron
Sensory neurons, also known as afferent neurons, are neurons in the nervous system that convert a specific type of stimulus, via their receptors, into action potentials or graded receptor potentials. This process is called sensory transduction. The cell bodies of sensory neurons are located in the dorsal root ganglia of the spinal cord. Sensory information travels along afferent nerve fibers in a sensory nerve to the brain via the spinal cord.
Sensory processing
Sensory processing is the process that organizes and distinguishes sensation (sensory information) from one's own body and the environment, making it possible to use the body effectively within the environment. Specifically, it deals with how the brain processes inputs from multiple sensory modalities, such as proprioception, vision, audition, touch, olfaction, the vestibular sense, interoception, and taste, into usable functional outputs. It has been believed for some time that inputs from different sensory organs are processed in different areas of the brain.
Entropy (information theory)
In information theory, the entropy of a random variable is the average level of "information", "surprise", or "uncertainty" inherent to the variable's possible outcomes. Given a discrete random variable $X$, which takes values in the alphabet $\mathcal{X}$ and is distributed according to $p : \mathcal{X} \to [0, 1]$, the entropy is
$$H(X) = -\sum_{x \in \mathcal{X}} p(x) \log p(x),$$
where $\sum_{x \in \mathcal{X}}$ denotes the sum over the variable's possible values. The choice of base for $\log$, the logarithm, varies for different applications: base 2 gives the unit of bits (or "shannons"), base $e$ gives "natural units" (nats), and base 10 gives units of "dits", "bans", or "hartleys".
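A minimal numerical sketch of this definition (the example distribution is invented) evaluates the entropy of a discrete distribution in the three units listed above:

```python
import math

def entropy(probs, base=2.0):
    """Shannon entropy H(X) = -sum_x p(x) log p(x); zero-probability terms contribute 0."""
    return -sum(p * math.log(p, base) for p in probs if p > 0)

# Hypothetical distribution over four outcomes.
p = [0.5, 0.25, 0.125, 0.125]
print(entropy(p, 2))        # 1.75 bits (shannons)
print(entropy(p, math.e))   # ~1.213 nats
print(entropy(p, 10))       # ~0.527 hartleys (bans)
```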
Algorithmic information theory
Algorithmic information theory (AIT) is a branch of theoretical computer science that concerns itself with the relationship between computation and the information content of computably generated objects (as opposed to stochastically generated ones), such as strings or any other data structure. Within algorithmic information theory it is shown that computational incompressibility "mimics" (except for a constant that depends only on the chosen universal programming language) the relations or inequalities found in information theory.
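Kolmogorov complexity itself is uncomputable, but a common practical hedge is to approximate incompressibility with an off-the-shelf compressor. The sketch below (the inputs and the use of zlib as a stand-in description length are illustrative assumptions, not part of AIT proper) contrasts a highly regular string with pseudorandom bytes:

```python
import os
import zlib

def compressed_size(data: bytes) -> int:
    """Length of a zlib-compressed encoding, used as a crude upper bound
    on the (uncomputable) Kolmogorov complexity of `data`."""
    return len(zlib.compress(data, 9))

regular = b"ab" * 500          # highly regular: compresses far below its raw length
random_ish = os.urandom(1000)  # pseudorandom: stays close to incompressible

print(len(regular), compressed_size(regular))
print(len(random_ish), compressed_size(random_ish))
```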
Motor control
Motor control is the regulation of movement in organisms that possess a nervous system. Motor control includes reflexes as well as directed movement. To control movement, the nervous system must integrate multimodal sensory information (both from the external world and from proprioception) and elicit the necessary signals to recruit muscles to carry out a goal. This pathway spans many disciplines, including multisensory integration, signal processing, coordination, biomechanics, and cognition, and the computational challenges are often discussed under the term sensorimotor control.
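As a toy illustration of the closed sensorimotor loop described above (the one-dimensional plant, noise level, and gain are invented for the example, not a model from the literature), the sketch below uses noisy sensory feedback and a proportional motor command to drive an effector toward a goal position:

```python
import random

def simulate(goal=1.0, steps=50, gain=0.3, sensor_noise=0.02):
    """Toy closed-loop control: sense, compare with the goal, issue a motor command."""
    position = 0.0
    for _ in range(steps):
        sensed = position + random.gauss(0.0, sensor_noise)  # noisy sensory estimate
        error = goal - sensed                                 # discrepancy from the goal
        position += gain * error                              # proportional motor correction
    return position

print(simulate())  # ends close to the goal of 1.0 despite sensory noise
```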