Quantum complexity theory
Quantum complexity theory is the subfield of computational complexity theory that deals with complexity classes defined using quantum computers, a computational model based on quantum mechanics. It studies the hardness of computational problems in relation to these complexity classes, as well as the relationship between quantum complexity classes and classical (i.e., non-quantum) complexity classes. Two important quantum complexity classes are BQP and QMA.
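For orientation, the known containments among these classes can be summarized in one chain; notably, none of the inclusions below is currently known to be strict. A sketch in LaTeX notation:

```latex
% Known containments between classical and quantum classes.
% Whether any inclusion is strict remains an open problem.
\mathsf{P} \subseteq \mathsf{BPP} \subseteq \mathsf{BQP}
           \subseteq \mathsf{QMA} \subseteq \mathsf{PSPACE}
```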
Splay tree
A splay tree is a binary search tree with the additional property that recently accessed elements are quick to access again. Like self-balancing binary search trees, a splay tree performs basic operations such as insertion, look-up and removal in O(log n) amortized time. For access patterns drawn from a non-uniform random distribution, the amortized time per access can be faster than logarithmic, proportional to the entropy of the access pattern.
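The defining operation is the splay step, which moves an accessed node to the root through zig, zig-zig, and zig-zag rotations. A minimal recursive sketch in Python (names are illustrative; production implementations are usually iterative bottom-up):

```python
class Node:
    def __init__(self, key):
        self.key = key
        self.left = None
        self.right = None

def rotate_right(x):
    y = x.left
    x.left = y.right
    y.right = x
    return y

def rotate_left(x):
    y = x.right
    x.right = y.left
    y.left = x
    return y

def splay(root, key):
    # Brings the node holding key (or the last node on the search
    # path, if key is absent) to the root via rotations.
    if root is None or root.key == key:
        return root
    if key < root.key:
        if root.left is None:
            return root
        if key < root.left.key:                   # zig-zig
            root.left.left = splay(root.left.left, key)
            root = rotate_right(root)
        elif key > root.left.key:                 # zig-zag
            root.left.right = splay(root.left.right, key)
            if root.left.right is not None:
                root.left = rotate_left(root.left)
        return root if root.left is None else rotate_right(root)
    else:
        if root.right is None:
            return root
        if key < root.right.key:                  # zig-zag
            root.right.left = splay(root.right.left, key)
            if root.right.left is not None:
                root.right = rotate_right(root.right)
        elif key > root.right.key:                # zig-zig
            root.right.right = splay(root.right.right, key)
            root = rotate_left(root)
        return root if root.right is None else rotate_left(root)
```

Because every look-up splays the found node to the root, a repeated access to the same key touches only the root, which is what makes frequently accessed elements cheap.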
Game complexity
Combinatorial game theory measures game complexity in several ways:
- State-space complexity: the number of legal game positions reachable from the initial position.
- Game tree size: the total number of possible games.
- Decision complexity: the number of leaf nodes in the smallest decision tree for the initial position.
- Game-tree complexity: the number of leaf nodes in the smallest full-width decision tree for the initial position.
- Computational complexity: the asymptotic difficulty of a game as it grows arbitrarily large.
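As a toy illustration of state-space complexity, consider tic-tac-toe: treating each of the 9 cells as independently empty, X, or O gives a crude upper bound of 3^9 = 19,683, which a consistency check on move counts then tightens. A hypothetical Python sketch:

```python
from itertools import product

# Crude upper bound: every cell is one of ' ', 'X', 'O'.
upper_bound = 3 ** 9  # 19683

def count_consistent():
    # Keep only boards whose piece counts are consistent with
    # alternating play where X moves first.
    n = 0
    for board in product(" XO", repeat=9):
        x, o = board.count("X"), board.count("O")
        if x == o or x == o + 1:
            n += 1
    return n

print(upper_bound, count_consistent())
```

A full reachability analysis (discarding positions that continue past a win) tightens this further, to the commonly cited figure of 5,478 legal positions.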
Speculative execution
Speculative execution is an optimization technique in which a computer system performs work before it is known whether that work is actually needed, avoiding the delay that would be incurred by waiting until the need is known. If the work turns out not to be needed after all, most changes made by it are reverted and the results are ignored. The objective is to provide more concurrency when extra resources are available.
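Hardware speculation happens transparently inside the CPU, but the control flow can be mimicked in software. A minimal Python analogue, with illustrative stand-in functions, that starts expensive work before a slow check decides whether the result is used:

```python
from concurrent.futures import ThreadPoolExecutor

def expensive_lookup(key):
    # Stand-in for work that may turn out to be unnecessary.
    return sum(i * i for i in range(10_000)) + key

def is_needed(key):
    # Stand-in for a slow check that decides whether the work is used.
    return key % 2 == 0

def speculative_fetch(key):
    with ThreadPoolExecutor() as pool:
        # Start the work *before* knowing whether it is needed;
        # it runs concurrently with the check below.
        future = pool.submit(expensive_lookup, key)
        if is_needed(key):
            return future.result()   # the speculation paid off
        # Otherwise discard the result (cancel succeeds only if the
        # task has not yet started running).
        future.cancel()
        return None
```

If the check usually says yes, the latency of the lookup is hidden behind the check, which is the whole point of speculating.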
A* search algorithm
A* (pronounced "A-star") is a graph traversal and path search algorithm, which is used in many fields of computer science due to its completeness, optimality, and optimal efficiency. One major practical drawback is its space complexity, as it stores all generated nodes in memory. Thus, in practical travel-routing systems, it is generally outperformed by algorithms that can pre-process the graph to attain better performance, as well as memory-bounded approaches; however, A* is still the best solution in many cases.
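A compact Python sketch using a binary heap; `neighbors` and `h` are caller-supplied and all names are illustrative. With an admissible heuristic (one that never overestimates the remaining cost), the path returned when the goal is popped is optimal:

```python
import heapq
from itertools import count

def a_star(start, goal, neighbors, h):
    """Shortest path from start to goal.

    neighbors(n) yields (neighbor, edge_cost) pairs; h(n) is an
    admissible heuristic estimate of the cost from n to goal.
    """
    tie = count()                  # tiebreaker so the heap never compares nodes
    open_heap = [(h(start), next(tie), start)]
    g = {start: 0}                 # best known cost from start
    parent = {start: None}
    while open_heap:
        f, _, node = heapq.heappop(open_heap)
        if node == goal:           # reconstruct the path by walking parents
            path = []
            while node is not None:
                path.append(node)
                node = parent[node]
            return path[::-1], g[goal]
        if f > g[node] + h(node):
            continue               # stale heap entry; a cheaper route was found
        for nbr, cost in neighbors(node):
            tentative = g[node] + cost
            if tentative < g.get(nbr, float("inf")):
                g[nbr] = tentative
                parent[nbr] = node
                heapq.heappush(open_heap, (tentative + h(nbr), next(tie), nbr))
    return None, float("inf")      # goal unreachable
```

Note how `g`, `parent`, and the heap retain every generated node; this is exactly the space cost the paragraph above describes.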
M-ary tree
In graph theory, an m-ary tree (for nonnegative integers m), also known as an n-ary, k-ary or k-way tree, is an arborescence (or, for some authors, an ordered tree) in which each node has no more than m children. A binary tree is the special case where m = 2, and a ternary tree is the case where m = 3. A full m-ary tree is an m-ary tree in which every node has either 0 or m children. A complete m-ary tree (or, less commonly, a perfect m-ary tree) is a full m-ary tree in which all leaf nodes are at the same depth.
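For the perfect case the node counts follow directly from the definition. In LaTeX, for a perfect m-ary tree of height h (with m ≥ 2, so the geometric sum has a closed form):

```latex
\text{leaves} = m^{h}, \qquad
\text{total nodes} = \sum_{i=0}^{h} m^{i} = \frac{m^{h+1} - 1}{m - 1}
```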
Persistent data structure
In computing, a persistent data structure (as opposed to an ephemeral data structure) is a data structure that always preserves the previous version of itself when it is modified. Such data structures are effectively immutable, as their operations do not (visibly) update the structure in place, but instead always yield a new updated structure. The term was introduced in Driscoll, Sarnak, Sleator, and Tarjan's 1986 article. A data structure is partially persistent if all versions can be accessed but only the newest version can be modified.
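A minimal Python sketch of persistence through structural sharing; this is an illustration with made-up names, not the node-copying construction from the Driscoll et al. paper. Because every version here can be both read and updated, it is fully (not just partially) persistent:

```python
class PStack:
    """A persistent stack: push and pop return new versions and leave
    every earlier version intact, sharing nodes rather than copying."""

    def __init__(self):
        self._node = None              # linked (value, next_node) pairs

    @classmethod
    def _from_node(cls, node):
        s = cls()
        s._node = node
        return s

    def push(self, value):
        # O(1): the new version just points at the old chain.
        return PStack._from_node((value, self._node))

    def pop(self):
        if self._node is None:
            raise IndexError("pop from empty stack")
        value, rest = self._node
        return value, PStack._from_node(rest)

# Every version stays usable after later modifications:
v0 = PStack()
v1 = v0.push(1)
v2 = v1.push(2)
top, back_to_v1 = v2.pop()   # v1 and v2 both remain valid versions
```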
Spectre (security vulnerability)
Spectre refers to one of the two original transient execution CPU vulnerabilities (the other being Meltdown), which involve microarchitectural timing side-channel attacks. These affect modern microprocessors that perform branch prediction and other forms of speculation. On most processors, the speculative execution resulting from a branch misprediction may leave observable side effects that may reveal private data to attackers.
Dijkstra's algorithm
Dijkstra's algorithm (/ˈdaɪkstrəz/) is an algorithm for finding the shortest paths between nodes in a weighted graph, which may represent, for example, road networks. It was conceived by computer scientist Edsger W. Dijkstra in 1956 and published three years later. The algorithm exists in many variants. Dijkstra's original algorithm found the shortest path between two given nodes, but a more common variant fixes a single node as the "source" node and finds shortest paths from the source to all other nodes in the graph, producing a shortest-path tree.
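A sketch of that common single-source variant in Python with a binary heap; the graph layout and names are illustrative, and edge weights are assumed non-negative (Dijkstra's precondition). The returned parent map encodes the shortest-path tree the paragraph mentions:

```python
import heapq

def dijkstra(graph, source):
    """Single-source shortest paths for non-negative edge weights.

    graph maps each node to a list of (neighbor, weight) pairs.
    Returns (dist, parent): shortest distances from source and the
    predecessor map encoding the shortest-path tree.
    """
    dist = {source: 0}
    parent = {source: None}
    heap = [(0, source)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist[u]:
            continue                       # stale entry; u already settled
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                parent[v] = u
                heapq.heappush(heap, (nd, v))
    return dist, parent

# Example: dist comes out as {"a": 0, "b": 2, "c": 3}.
g = {"a": [("b", 2), ("c", 5)], "b": [("c", 1)], "c": []}
dist, parent = dijkstra(g, "a")
```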
Irreducible complexity
Irreducible complexity (IC) is the argument that certain biological systems with multiple interacting parts would not function if one of the parts were removed, so supposedly could not have evolved by successive small modifications from earlier less complex systems through natural selection, which would need all intermediate precursor systems to have been fully functional. This negative argument is then complemented by the claim that the only alternative explanation is a "purposeful arrangement of parts" inferring design by an intelligent agent.