Intersection graph
In graph theory, an intersection graph is a graph that represents the pattern of intersections of a family of sets. Any graph can be represented as an intersection graph, but some important special classes of graphs can be defined by the types of sets that are used to form an intersection representation of them. Formally, an intersection graph G is an undirected graph formed from a family of sets by creating one vertex v_i for each set S_i, and connecting two vertices v_i and v_j by an edge whenever the corresponding two sets have a nonempty intersection, that is, E(G) = \{ \{v_i, v_j\} : i \neq j,\ S_i \cap S_j \neq \varnothing \}. Any undirected graph G may be represented as an intersection graph.
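As a concrete illustration of this definition, the following short Python sketch (the function name intersection_graph is chosen here purely for exposition) builds the edge set of the intersection graph of a list of finite sets:

```python
from itertools import combinations

def intersection_graph(sets):
    """Build the edge set of the intersection graph of a family of sets.

    Vertices are the indices of the sets; an edge joins i and j
    whenever S_i and S_j share at least one element.
    """
    edges = set()
    for i, j in combinations(range(len(sets)), 2):
        if sets[i] & sets[j]:          # nonempty intersection
            edges.add((i, j))
    return edges

# Example: S_0 = {1, 2}, S_1 = {2, 3}, S_2 = {4} gives the single edge (0, 1).
print(intersection_graph([{1, 2}, {2, 3}, {4}]))
```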
Graph theory
In mathematics, graph theory is the study of graphs, which are mathematical structures used to model pairwise relations between objects. A graph in this context is made up of vertices (also called nodes or points) which are connected by edges (also called links or lines). A distinction is made between undirected graphs, where edges link two vertices symmetrically, and directed graphs, where edges link two vertices asymmetrically. Graphs are one of the principal objects of study in discrete mathematics.
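As a small illustration of the distinction just described, the adjacency-list dictionaries below (an informal representation assumed for exposition, not a standard library type) encode one undirected and one directed graph:

```python
# Plain dictionaries used as adjacency lists; vertex names are illustrative.

undirected = {           # edges link vertices symmetrically
    "a": {"b", "c"},
    "b": {"a"},
    "c": {"a"},
}

directed = {             # edges link vertices asymmetrically
    "a": {"b"},          # arc a -> b, but not b -> a
    "b": set(),
}

# In the undirected graph both directions of each edge are present;
# in the directed graph only the stated direction holds.
assert "a" in undirected["b"] and "b" in undirected["a"]
assert "b" in directed["a"] and "a" not in directed["b"]
```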
Optimizing compiler
In computing, an optimizing compiler is a compiler that tries to minimize or maximize some attributes of an executable computer program. Common requirements are to minimize a program's execution time, memory footprint, storage size, and power consumption (the last three being popular for portable computers). Compiler optimization is generally implemented using a sequence of optimizing transformations, algorithms which take a program and transform it to produce a semantically equivalent output program that uses fewer resources or executes faster.
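As an illustration of such a transformation, the sketch below applies constant folding to a toy expression tree; the tuple-based representation and the function name fold are assumptions made for exposition, not the internals of any particular compiler:

```python
# A toy optimizing transformation: constant folding on a tiny expression tree.

def fold(expr):
    """Replace constant subexpressions such as ('+', 2, 3) with their value."""
    if isinstance(expr, tuple):
        op, left, right = expr
        left, right = fold(left), fold(right)
        if isinstance(left, int) and isinstance(right, int):
            if op == "+":
                return left + right
            if op == "*":
                return left * right
        return (op, left, right)
    return expr

# x * (2 + 3) is rewritten to the semantically equivalent, cheaper x * 5.
print(fold(("*", "x", ("+", 2, 3))))
```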
Planar separator theorem
In graph theory, the planar separator theorem is a form of isoperimetric inequality for planar graphs, which states that any planar graph can be split into smaller pieces by removing a small number of vertices. Specifically, the removal of O(\sqrt{n}) vertices from an n-vertex graph (where the O invokes big O notation) can partition the graph into disjoint subgraphs each of which has at most 2n/3 vertices.
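The bound can be checked by hand in a simple special case: in a k × k grid graph with n = k^2 vertices, the middle column is a separator of exactly \sqrt{n} vertices whose removal leaves two pieces of at most roughly n/2 ≤ 2n/3 vertices each. The Python sketch below (the function name grid_separator is illustrative) verifies this; it covers only this special case and is not the general separator algorithm:

```python
import math

def grid_separator(k):
    """Split a k x k grid at its middle column: separator, left part, right part."""
    mid = k // 2
    separator = [(r, mid) for r in range(k)]
    left = [(r, c) for r in range(k) for c in range(mid)]
    right = [(r, c) for r in range(k) for c in range(mid + 1, k)]
    return separator, left, right

k = 10
sep, left, right = grid_separator(k)
n = k * k
assert len(sep) == math.isqrt(n)            # separator of sqrt(n) vertices
assert max(len(left), len(right)) <= 2 * n / 3   # pieces within the 2n/3 bound
```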
Graph operations
In the mathematical field of graph theory, graph operations are operations which produce new graphs from initial ones. They include both unary (one input) and binary (two input) operations. Unary operations create a new graph from a single initial graph. Elementary operations or editing operations, which are also known as graph edit operations, create a new graph from one initial one by a simple local change, such as addition or deletion of a vertex or of an edge, merging and splitting of vertices, edge contraction, etc.
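As an example of one such editing operation, the sketch below performs edge contraction on an adjacency-set representation; the helper name contract_edge and the representation are assumptions chosen for exposition:

```python
def contract_edge(adj, u, v):
    """Merge vertex v into vertex u and return the new adjacency dict."""
    new = {w: set(nbrs) for w, nbrs in adj.items() if w != v}
    for w in adj[v]:
        if w != u:
            new[u].add(w)        # u inherits v's neighbors
            new[w].discard(v)    # redirect w's edge from v to u
            new[w].add(u)
    new[u].discard(v)            # drop the contracted edge itself
    new[u].discard(u)            # avoid creating a self-loop
    return new

# Contracting edge (1, 2) in the path 0 - 1 - 2 - 3 yields the path 0 - 1 - 3.
path = {0: {1}, 1: {0, 2}, 2: {1, 3}, 3: {2}}
print(contract_edge(path, 1, 2))
```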
Adequate equivalence relation
In algebraic geometry, a branch of mathematics, an adequate equivalence relation is an equivalence relation on algebraic cycles of smooth projective varieties used to obtain a well-working theory of such cycles, and in particular, well-defined intersection products. Pierre Samuel formalized the concept of an adequate equivalence relation in 1958. Since then it has become central to the theory of motives. For every adequate equivalence relation, one may define the category of pure motives with respect to that relation.
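The central requirement can be sketched as a moving-lemma-style condition; the statement below is an informal paraphrase of only one of the defining axioms, not Samuel's complete list:

```latex
% Informal sketch: for cycles on a smooth projective variety X, a cycle can be
% moved within its equivalence class so that its intersection product with any
% other cycle is defined and well defined modulo the relation.
\[
\forall\,\alpha,\beta \in Z^{*}(X)\;\;\exists\,\alpha' \sim \alpha :\quad
\alpha' \text{ and } \beta \text{ intersect properly, and }
\alpha' \cdot \beta \bmod{\sim} \text{ is independent of the choice of } \alpha'.
\]
```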
Bridge (graph theory)
In graph theory, a bridge, isthmus, cut-edge, or cut arc is an edge of a graph whose deletion increases the graph's number of connected components. Equivalently, an edge is a bridge if and only if it is not contained in any cycle. For a connected graph, a bridge can uniquely determine a cut. A graph is said to be bridgeless or isthmus-free if it contains no bridges. This type of bridge should be distinguished from an unrelated meaning of "bridge" in graph theory, a subgraph separated from the rest of the graph by a specified subset of vertices; see bridge.
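Bridges can be found by a depth-first search using lowpoint values; the recursive sketch below (for simple graphs, with illustrative names) reports an edge (u, v) as a bridge exactly when no back edge from v's subtree reaches u or an ancestor of u:

```python
def find_bridges(adj):
    """Return the bridges of a simple undirected graph given as an adjacency dict."""
    time = 0
    disc, low, bridges = {}, {}, []

    def dfs(u, parent):
        nonlocal time
        disc[u] = low[u] = time
        time += 1
        for v in adj[u]:
            if v not in disc:
                dfs(v, u)
                low[u] = min(low[u], low[v])
                if low[v] > disc[u]:          # no back edge escapes v's subtree
                    bridges.append((u, v))
            elif v != parent:
                low[u] = min(low[u], disc[v])

    for u in adj:
        if u not in disc:
            dfs(u, None)
    return bridges

# A triangle 0-1-2 with a pendant edge 2-3: only (2, 3) is a bridge.
g = {0: [1, 2], 1: [0, 2], 2: [0, 1, 3], 3: [2]}
print(find_bridges(g))
```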
Data-flow analysis
Data-flow analysis is a technique for gathering information about the possible set of values calculated at various points in a computer program. A program's control-flow graph (CFG) is used to determine those parts of a program to which a particular value assigned to a variable might propagate. The information gathered is often used by compilers when optimizing a program. A canonical example of a data-flow analysis is reaching definitions.
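A minimal sketch of reaching definitions as an iterative fixed point over a CFG is given below; the block names and the GEN/KILL sets are written by hand for illustration, whereas a compiler would derive them from its intermediate representation:

```python
# Tiny CFG: entry -> b1 -> b2 (with a self-loop) -> exit.
# d1 (in b1) and d2 (in b2) are definitions of the same variable.
preds = {"entry": [], "b1": ["entry"], "b2": ["b1", "b2"], "exit": ["b2"]}
gen   = {"entry": set(), "b1": {"d1"}, "b2": {"d2"}, "exit": set()}
kill  = {"entry": set(), "b1": {"d2"}, "b2": {"d1"}, "exit": set()}

IN  = {b: set() for b in preds}
OUT = {b: set() for b in preds}

changed = True
while changed:                      # iterate until the sets stabilise
    changed = False
    for b in preds:
        IN[b] = set().union(*(OUT[p] for p in preds[b]))
        new_out = gen[b] | (IN[b] - kill[b])
        if new_out != OUT[b]:
            OUT[b], changed = new_out, True

print(IN["exit"])   # definitions that reach the exit block: {'d2'}
```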
Time complexity
In computer science, the time complexity is the computational complexity that describes the amount of computer time it takes to run an algorithm. Time complexity is commonly estimated by counting the number of elementary operations performed by the algorithm, supposing that each elementary operation takes a fixed amount of time to perform. Thus, the amount of time taken and the number of elementary operations performed by the algorithm are taken to be related by a constant factor.
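For example, counting the element comparisons made by a linear search shows that the count grows in direct proportion to the input size n, so the algorithm runs in O(n) time in the worst case; the Python sketch below (names chosen for exposition) makes the count explicit:

```python
def linear_search(items, target):
    """Scan the list, counting one comparison per element examined."""
    comparisons = 0
    for x in items:
        comparisons += 1            # one elementary operation per step
        if x == target:
            return True, comparisons
    return False, comparisons

for n in (10, 100, 1000):
    found, ops = linear_search(list(range(n)), -1)   # worst case: target absent
    print(n, ops)                   # ops == n, i.e. linear in the input size
```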
Planarization
In the mathematical field of graph theory, planarization is a method of extending graph drawing methods from planar graphs to graphs that are not planar, by embedding the non-planar graphs within a larger planar graph. Planarization may be performed by using any method to find a drawing (with crossings) for the given graph, and then replacing each crossing point by a new artificial vertex, causing each crossed edge to be subdivided into a path. The original graph will be represented as an immersion minor of its planarization.
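The crossing-replacement step can be sketched as follows; the edge-list representation and the function name replace_crossing are assumptions made for illustration, and the example assumes a drawing of K5 in which the edges (0, 2) and (1, 3) form the single crossing pair:

```python
def replace_crossing(edges, e1, e2, dummy):
    """Replace crossing edges e1 = (a, b) and e2 = (c, d) by paths through a dummy vertex."""
    (a, b), (c, d) = e1, e2
    new_edges = [e for e in edges if e not in (e1, e2)]
    new_edges += [(a, dummy), (dummy, b), (c, dummy), (dummy, d)]
    return new_edges

# K5 has all 10 edges on vertices 0..4; replacing its one crossing with the
# artificial vertex "x" subdivides both crossed edges into two-edge paths.
k5 = [(i, j) for i in range(5) for j in range(i + 1, 5)]
print(replace_crossing(k5, (0, 2), (1, 3), "x"))
```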