Cartesian genetic programming
Cartesian genetic programming is a form of genetic programming that uses a graph representation to encode computer programs. It grew from a method of evolving digital circuits developed by Julian F. Miller and Peter Thomson in 1997. The term ‘Cartesian genetic programming’ first appeared in 1999 and was proposed as a general form of genetic programming in 2000. It is called ‘Cartesian’ because it represents a program using a two-dimensional grid of nodes. Miller's keynote explains how CGP works.
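As a rough illustration of the grid encoding, the sketch below decodes a Cartesian genotype into a feed-forward expression and evaluates it. It uses a single row of nodes for brevity (classic CGP arranges nodes in rows and columns), and the function set, gene layout, and the `evaluate` helper are illustrative assumptions rather than Miller and Thomson's exact formulation.

```python
# Minimal sketch of a Cartesian genetic programming genotype and its
# evaluation; the gene encoding and function set are illustrative assumptions.
import operator

FUNCTIONS = [operator.add, operator.sub, operator.mul]  # function set

# Each node is (function_index, input_a, input_b); connection genes refer to
# earlier columns (program inputs occupy columns 0..n_inputs-1), which keeps
# the graph feed-forward as in classic CGP.
genotype = [
    (0, 0, 1),   # node 2: x + y
    (2, 2, 1),   # node 3: (x + y) * y
    (1, 3, 0),   # node 4: (x + y) * y - x
]
output_node = 4  # output gene selecting which node feeds the program output

def evaluate(genotype, inputs, output_node):
    values = list(inputs)                      # columns 0..n_inputs-1
    for f_idx, a, b in genotype:               # decode each node in order
        values.append(FUNCTIONS[f_idx](values[a], values[b]))
    return values[output_node]

print(evaluate(genotype, inputs=(3, 5), output_node=output_node))  # -> 37
```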
Evolution
In biology, evolution is the change in heritable characteristics of biological populations over successive generations. Evolution occurs when evolutionary processes such as natural selection (including sexual selection) and genetic drift act on genetic variation, resulting in certain characteristics becoming more or less common within a population over successive generations. The process of evolution has given rise to biodiversity at every level of biological organisation.
Extended evolutionary synthesis
The extended evolutionary synthesis consists of a set of theoretical concepts argued to be more comprehensive than the earlier modern synthesis of evolutionary biology that took place between 1918 and 1942. The extended evolutionary synthesis was called for in the 1950s by C. H. Waddington, argued for on the basis of punctuated equilibrium by Stephen Jay Gould and Niles Eldredge in the 1980s, and was reconceptualized in 2007 by Massimo Pigliucci and Gerd B. Müller.
Complexity class
In computational complexity theory, a complexity class is a set of computational problems "of related resource-based complexity". The two most commonly analyzed resources are time and memory. In general, a complexity class is defined in terms of a type of computational problem, a model of computation, and a bounded resource like time or memory. In particular, most complexity classes consist of decision problems that are solvable with a Turing machine, and are differentiated by their time or space (memory) requirements.
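As a small, hedged illustration of the definition, the snippet below pairs a toy decision problem with a solver whose time and memory are both bounded by a polynomial in the input size, which is the sense in which such a problem would fall into a class like P; the problem and the `has_zero_sum_pair` function are invented for the example.

```python
# Illustrative decision problem ("do any two numbers in the list sum to
# zero?") with a solver running in linear time and memory, i.e. within a
# polynomial resource bound. The problem is a made-up example.
def has_zero_sum_pair(numbers):
    seen = set()
    for x in numbers:        # one pass over the input: O(n) time, O(n) memory
        if -x in seen:
            return True      # accept
        seen.add(x)
    return False             # reject

print(has_zero_sum_pair([3, -1, 4, 1]))   # True  (-1 + 1 == 0)
print(has_zero_sum_pair([2, 5, 7]))       # False
```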
Recurrent neural network
A recurrent neural network (RNN) is one of the two broad types of artificial neural network, characterized by the direction of the flow of information between its layers. In contrast to the uni-directional feedforward neural network, it is a bi-directional artificial neural network, meaning that it allows the output from some nodes to affect subsequent input to the same nodes. Its ability to use an internal state (memory) to process arbitrary sequences of inputs makes it applicable to tasks such as unsegmented, connected handwriting recognition and speech recognition.
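A minimal sketch of that recurrent state update is shown below, assuming a plain tanh cell in NumPy; the weight shapes, random initialization, and the `run_rnn` helper are illustrative choices, not a reference implementation.

```python
# Minimal sketch of a recurrent cell: the hidden state produced for one
# element of the sequence is fed back in for the next element, which is the
# "internal state (memory)" referred to above. Sizes and weights are
# arbitrary illustrative choices.
import numpy as np

rng = np.random.default_rng(0)
n_in, n_hidden = 3, 4
W_in = rng.normal(scale=0.1, size=(n_hidden, n_in))       # input -> hidden
W_rec = rng.normal(scale=0.1, size=(n_hidden, n_hidden))  # hidden -> hidden (recurrence)
b = np.zeros(n_hidden)

def run_rnn(sequence):
    h = np.zeros(n_hidden)                     # initial state
    for x in sequence:                         # process inputs one step at a time
        h = np.tanh(W_in @ x + W_rec @ h + b)  # new state depends on old state
    return h                                   # final state summarizes the sequence

sequence = [rng.normal(size=n_in) for _ in range(5)]
print(run_rnn(sequence))
```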
Modern synthesis (20th century)
The modern synthesis was the early 20th-century synthesis of Charles Darwin's theory of evolution and Gregor Mendel's ideas on heredity into a joint mathematical framework. Julian Huxley coined the term in his 1942 book, Evolution: The Modern Synthesis. The synthesis combined the ideas of natural selection, Mendelian genetics, and population genetics. It also related the broad-scale macroevolution seen by palaeontologists to the small-scale microevolution of local populations.
Evolutionary algorithm
In computational intelligence (CI), an evolutionary algorithm (EA) is a subset of evolutionary computation, a generic population-based metaheuristic optimization algorithm. An EA uses mechanisms inspired by biological evolution, such as reproduction, mutation, recombination, and selection. Candidate solutions to the optimization problem play the role of individuals in a population, and the fitness function determines the quality of the solutions (see also loss function).
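The sketch below shows those pieces in miniature, assuming real-valued individuals, a toy sum-of-squares fitness function, Gaussian mutation, and truncation selection; recombination is omitted for brevity, and all names and parameter values are illustrative.

```python
# Bare-bones evolutionary algorithm sketch: candidate solutions are real
# vectors, fitness is a toy objective (minimize the sum of squares), and
# variation is Gaussian mutation followed by keep-the-best selection.
import random

def fitness(individual):                       # lower is better here
    return sum(x * x for x in individual)

def mutate(individual, sigma=0.1):             # Gaussian mutation
    return [x + random.gauss(0.0, sigma) for x in individual]

population = [[random.uniform(-1, 1) for _ in range(5)] for _ in range(20)]
for generation in range(100):
    offspring = [mutate(random.choice(population)) for _ in range(20)]
    # selection: keep the best individuals among parents and offspring
    population = sorted(population + offspring, key=fitness)[:20]

print(min(fitness(ind) for ind in population))  # should be close to 0
```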
Computational complexity
In computer science, the computational complexity or simply complexity of an algorithm is the amount of resources required to run it. Particular focus is given to computation time (generally measured by the number of needed elementary operations) and memory storage requirements. The complexity of a problem is the complexity of the best algorithms that allow solving the problem. The study of the complexity of explicitly given algorithms is called analysis of algorithms, while the study of the complexity of problems is called computational complexity theory.
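As a small, informal example of analysis of algorithms, the snippet below counts the elementary comparisons made by linear search and binary search on the same sorted input, showing roughly n versus log2(n) growth; the counting scheme and function names are assumptions made for the demonstration.

```python
# Count elementary comparisons to contrast the time complexity of two
# search algorithms on the same sorted input.
def linear_search(items, target):
    comparisons = 0
    for value in items:
        comparisons += 1
        if value == target:
            break
    return comparisons

def binary_search(items, target):
    comparisons, lo, hi = 0, 0, len(items) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        comparisons += 1
        if items[mid] == target:
            break
        elif items[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return comparisons

data = list(range(1_000_000))
print(linear_search(data, 999_999))   # ~10**6 comparisons (linear in n)
print(binary_search(data, 999_999))   # ~20 comparisons (logarithmic in n)
```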
Evolution of biological complexity
The evolution of biological complexity is one important outcome of the process of evolution. Evolution has produced some remarkably complex organisms – although the actual level of complexity is very hard to define or measure accurately in biology, with properties such as gene content, the number of cell types or morphology all proposed as possible metrics. Many biologists used to believe that evolution was progressive (orthogenesis) and had a direction that led towards so-called "higher organisms", despite a lack of evidence for this viewpoint.
Evolutionary programming
Evolutionary programming is one of the four major evolutionary algorithm paradigms. It is similar to genetic programming, but the structure of the program to be optimized is fixed, while its numerical parameters are allowed to evolve. It was first used by Lawrence J. Fogel in the US in 1960 in order to use simulated evolution as a learning process aiming to generate artificial intelligence. Fogel used finite-state machines as predictors and evolved them.
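The toy sketch below works in that spirit: the program structure (a fixed quadratic model) never changes, and only its numerical parameters are mutated and selected. It is not Fogel's finite-state-machine setup, and the target coefficients, mutation scheme, and settings are invented for illustration.

```python
# Sketch in the spirit of evolutionary programming: a fixed model structure
# whose numerical parameters evolve by mutation only (no recombination).
import random

SAMPLES = [(x, 2.0 * x * x - 3.0 * x + 1.0) for x in range(-5, 6)]  # hidden target

def error(params):                             # fitness: squared error on the samples
    a, b, c = params
    return sum((a * x * x + b * x + c - y) ** 2 for x, y in SAMPLES)

def mutate(params, sigma=0.1):                 # Gaussian perturbation of parameters
    return [p + random.gauss(0.0, sigma) for p in params]

population = [[random.uniform(-1, 1) for _ in range(3)] for _ in range(30)]
for generation in range(500):
    offspring = [mutate(parent) for parent in population]   # one child per parent
    population = sorted(population + offspring, key=error)[:30]

print(min(population, key=error))   # moves toward [2.0, -3.0, 1.0]
```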