Extended evolutionary synthesis
The extended evolutionary synthesis consists of a set of theoretical concepts argued to be more comprehensive than the earlier modern synthesis of evolutionary biology, which took place between 1918 and 1942. The extended evolutionary synthesis was called for in the 1950s by C. H. Waddington, argued for on the basis of punctuated equilibrium by Stephen Jay Gould and Niles Eldredge in the 1980s, and was reconceptualized in 2007 by Massimo Pigliucci and Gerd B. Müller.
Evolutionary developmental biology
Evolutionary developmental biology (informally, evo-devo) is a field of biological research that compares the developmental processes of different organisms to infer how developmental processes evolved. The field grew from 19th-century beginnings, where embryology faced a mystery: zoologists did not know how embryonic development was controlled at the molecular level. Charles Darwin noted that having similar embryos implied common ancestry, but little progress was made until the 1970s.
Genetic programming
In artificial intelligence, genetic programming (GP) is a technique for evolving programs, starting from a population of unfit (usually random) programs and making them fit for a particular task by applying operations analogous to natural genetic processes. The operations are selection of the fittest programs for reproduction (crossover), replication, and/or mutation, guided by a predefined fitness measure, usually proficiency at the desired task.
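As a minimal sketch of these ideas, the following Python program evolves arithmetic expression trees toward a target function using tournament selection, subtree crossover, and subtree mutation. Everything here (the operator and terminal sets, the target x**2 + x, the population size, and the depth cap) is an illustrative assumption, not drawn from any particular GP library:

```python
import random
import operator

# Illustrative function and terminal sets (assumptions, not from a specific GP system).
OPS = [(operator.add, '+'), (operator.sub, '-'), (operator.mul, '*')]
TERMINALS = ['x', 1.0, 2.0]

def random_tree(depth=3):
    """Grow a random expression tree: 3-tuples are operator nodes, leaves are terminals."""
    if depth == 0 or random.random() < 0.3:
        return random.choice(TERMINALS)
    return (random.choice(OPS), random_tree(depth - 1), random_tree(depth - 1))

def evaluate(tree, x):
    """Interpret a tree as a function of x."""
    if tree == 'x':
        return x
    if isinstance(tree, float):
        return tree
    (fn, _), left, right = tree
    return fn(evaluate(left, x), evaluate(right, x))

def fitness(tree):
    """Negative squared error against the (assumed) target x**2 + x on sample points."""
    return -sum((evaluate(tree, x) - (x * x + x)) ** 2 for x in range(-5, 6))

def subtrees(tree, path=()):
    """Yield (path, subtree) for every node, so crossover/mutation points can be picked."""
    yield path, tree
    if isinstance(tree, tuple):
        yield from subtrees(tree[1], path + (1,))
        yield from subtrees(tree[2], path + (2,))

def replace(tree, path, new):
    """Return a copy of tree with the subtree at path swapped for new."""
    if not path:
        return new
    op, left, right = tree
    if path[0] == 1:
        return (op, replace(left, path[1:], new), right)
    return (op, left, replace(right, path[1:], new))

def crossover(a, b):
    """Subtree crossover: graft a random subtree of b onto a random point of a."""
    path, _ = random.choice(list(subtrees(a)))
    _, donor = random.choice(list(subtrees(b)))
    return replace(a, path, donor)

def mutate(tree):
    """Subtree mutation: overwrite a random node with a fresh random subtree."""
    path, _ = random.choice(list(subtrees(tree)))
    return replace(tree, path, random_tree(depth=2))

def depth(tree):
    return 1 + max(depth(tree[1]), depth(tree[2])) if isinstance(tree, tuple) else 0

def tournament(pop, k=3):
    """Selection of the fittest: best of k randomly sampled programs."""
    return max(random.sample(pop, k), key=fitness)

def breed(pop):
    child = crossover(tournament(pop), tournament(pop))
    if random.random() < 0.2:
        child = mutate(child)
    # Crude bloat control: fall back to a selected parent if the child grew too deep.
    return child if depth(child) <= 8 else tournament(pop)

pop = [random_tree() for _ in range(100)]
for _ in range(20):
    pop = [breed(pop) for _ in range(100)]
best = max(pop, key=fitness)
print('best fitness:', fitness(best))
```

Real GP systems add refinements omitted here (typed trees, elitism, more careful bloat control), but the loop above exhibits the core cycle of selection, crossover, and mutation against a fitness measure.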
Evolutionary algorithm
In computational intelligence (CI), an evolutionary algorithm (EA) is a subset of evolutionary computation, a generic population-based metaheuristic optimization algorithm. An EA uses mechanisms inspired by biological evolution, such as reproduction, mutation, recombination, and selection. Candidate solutions to the optimization problem play the role of individuals in a population, and the fitness function determines the quality of the solutions (see also loss function).
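A minimal sketch of this loop in Python, under illustrative assumptions (the sphere function as the fitness/loss to minimize, Gaussian mutation, uniform recombination, and truncation selection):

```python
import random

def fitness(x):
    """Toy objective: the sphere function; lower is better, so it acts as a loss."""
    return sum(v * v for v in x)

def mutate(x, sigma=0.1):
    """Gaussian mutation: perturb each coordinate slightly."""
    return [v + random.gauss(0, sigma) for v in x]

def recombine(a, b):
    """Uniform crossover: each coordinate comes from one of the two parents."""
    return [random.choice(pair) for pair in zip(a, b)]

POP, DIM, GENS = 50, 5, 200
pop = [[random.uniform(-5, 5) for _ in range(DIM)] for _ in range(POP)]
for _ in range(GENS):
    # Reproduction: recombine random parents, then mutate the offspring.
    offspring = [mutate(recombine(random.choice(pop), random.choice(pop)))
                 for _ in range(POP)]
    # Selection: keep the best individuals from parents plus offspring.
    pop = sorted(pop + offspring, key=fitness)[:POP]
print('best loss:', min(fitness(x) for x in pop))
```

The population should converge toward the minimizer at the origin; swapping in a different fitness function adapts the same loop to other optimization problems.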
Synapse
In the nervous system, a synapse is a structure that permits a neuron (or nerve cell) to pass an electrical or chemical signal to another neuron or to the target effector cell. Synapses are essential to the transmission of nervous impulses from one neuron to another. Neurons are specialized to pass signals to individual target cells, and synapses are the means by which they do so. At a synapse, the plasma membrane of the signal-passing neuron (the presynaptic neuron) comes into close apposition with the membrane of the target (postsynaptic) cell.
Electrical synapse
An electrical synapse is a mechanical and electrically conductive link between two neighboring neurons that is formed at a narrow gap between the pre- and postsynaptic neurons known as a gap junction. At gap junctions, such cells approach within about 3.8 nm of each other, a much shorter distance than the 20 to 40 nm that separate cells at a chemical synapse. In many animals, electrical synapse-based systems co-exist with chemical synapses.
Chemical synapse
Chemical synapses are biological junctions through which neurons' signals can be sent to each other and to non-neuronal cells such as those in muscles or glands. Chemical synapses allow neurons to form circuits within the central nervous system. They are crucial to the biological computations that underlie perception and thought. They allow the nervous system to connect to and control other systems of the body. At a chemical synapse, one neuron releases neurotransmitter molecules into a small space (the synaptic cleft) that is adjacent to another neuron.
Limit cycle
In mathematics, in the study of dynamical systems with two-dimensional phase space, a limit cycle is a closed trajectory in phase space having the property that at least one other trajectory spirals into it either as time approaches infinity or as time approaches negative infinity. Such behavior is exhibited in some nonlinear systems. Limit cycles have been used to model the behavior of many real-world oscillatory systems. The study of limit cycles was initiated by Henri Poincaré (1854–1912).
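A standard textbook illustration (supplied here for concreteness, not taken from the text above) is the planar system written in polar coordinates:

```latex
\dot{r} = r\,(1 - r^{2}), \qquad \dot{\theta} = 1
```

Since \dot{r} > 0 for 0 < r < 1 and \dot{r} < 0 for r > 1, every trajectory starting off the origin spirals onto the unit circle r = 1 as time approaches infinity, so the unit circle is a stable limit cycle and the origin is an unstable equilibrium.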
Lyapunov function
In the theory of ordinary differential equations (ODEs), Lyapunov functions, named after Aleksandr Lyapunov, are scalar functions that may be used to prove the stability of an equilibrium of an ODE. Their use, known as Lyapunov's second method for stability, is important to the stability theory of dynamical systems and to control theory. A similar concept appears in the theory of general state space Markov chains, usually under the name Foster–Lyapunov functions.
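As a minimal worked example (a textbook case, stated here for illustration): for the scalar ODE \dot{x} = -x, the function V(x) = x^2 serves as a Lyapunov function for the equilibrium x = 0,

```latex
V(x) = x^{2} > 0 \ \text{for } x \neq 0, \qquad
\dot{V}(x) = 2x\,\dot{x} = -2x^{2} < 0 \ \text{for } x \neq 0
```

Because V is positive definite and strictly decreasing along every nonzero solution, the equilibrium x = 0 is asymptotically stable.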
Lyapunov stability
Various types of stability may be discussed for the solutions of differential equations or difference equations describing dynamical systems. The most important type is that concerning the stability of solutions near to a point of equilibrium. This may be discussed by the theory of Aleksandr Lyapunov. In simple terms, if the solutions that start out near an equilibrium point x_e stay near x_e forever, then x_e is Lyapunov stable. More strongly, if x_e is Lyapunov stable and all solutions that start out near x_e converge to x_e, then x_e is said to be asymptotically stable (see asymptotic analysis).
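For concreteness, the standard epsilon-delta form of Lyapunov stability (a textbook formulation, not quoted from the text above) reads:

```latex
\forall \varepsilon > 0 \;\, \exists \delta > 0 : \quad
\| x(0) - x_{e} \| < \delta \;\Longrightarrow\; \| x(t) - x_{e} \| < \varepsilon \ \text{for all } t \geq 0
```

Asymptotic stability additionally requires \lim_{t \to \infty} x(t) = x_e for all solutions starting sufficiently close to x_e.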