Concurrency (computer science)
In computer science, concurrency is the ability of different parts or units of a program, algorithm, or problem to be executed out-of-order or in partial order, without affecting the outcome. This allows for parallel execution of the concurrent units, which can significantly improve the overall speed of execution on multi-processor and multi-core systems. In more technical terms, concurrency refers to the decomposability of a program, algorithm, or problem into order-independent or partially ordered components or units of computation.
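As a minimal sketch of this idea (in Go, chosen here only for illustration; the example and its names are not from the article), the program below decomposes a sum into two order-independent units. Whichever goroutine finishes first, the combined result is the same.

```go
package main

import (
	"fmt"
	"sync"
)

// Sum the halves of a slice in two independent goroutines.
// The two partial sums are order-independent units: the order
// in which they complete does not affect the final outcome.
func main() {
	data := []int{1, 2, 3, 4, 5, 6, 7, 8}
	mid := len(data) / 2

	var wg sync.WaitGroup
	partial := make([]int, 2) // one slot per unit; no shared writes

	for i, chunk := range [][]int{data[:mid], data[mid:]} {
		wg.Add(1)
		go func(i int, chunk []int) {
			defer wg.Done()
			for _, v := range chunk {
				partial[i] += v
			}
		}(i, chunk)
	}
	wg.Wait()
	fmt.Println(partial[0] + partial[1]) // always 36
}
```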
Concurrent computing
Concurrent computing is a form of computing in which several computations are executed during overlapping time periods (concurrently) rather than sequentially, with one completing before the next starts. This is a property of a system, whether a program, a computer, or a network, in which there is a separate execution point or "thread of control" for each process. A concurrent system is one where a computation can advance without waiting for all other computations to complete. Concurrent computing is a form of modular programming.
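A minimal sketch of overlapping execution, again in Go and again illustrative rather than canonical: the main computation advances without waiting for the background one, and the two threads of control synchronize only when the result is actually needed.

```go
package main

import (
	"fmt"
	"time"
)

// Two computations run during overlapping time periods.
// main does not block on the background work; the two
// rendezvous at the channel receive only at the end.
func main() {
	result := make(chan string)

	go func() { // separate thread of control
		time.Sleep(50 * time.Millisecond) // stand-in for slow I/O
		result <- "fetched"
	}()

	fmt.Println("main keeps working") // advances without waiting
	fmt.Println(<-result)             // synchronize when needed
}
```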
Trace theory
In mathematics and computer science, trace theory aims to provide a concrete mathematical underpinning for the study of concurrent computation and process calculi. That underpinning is an algebraic one: the free partially commutative monoid, also called the trace monoid (or, equivalently, the history monoid), which plays a role analogous to that of the free monoid in the theory of formal languages.
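Concretely, a standard definition, stated here as a sketch (the notation M(Σ, I) is one common choice, not taken from the text above): the trace monoid is the free monoid on an alphabet Σ, taken modulo the commutation of letters declared independent by a relation I.

```latex
% Trace monoid: the free monoid modulo commutation of independent
% letters. Sigma is a finite alphabet; I is a symmetric, irreflexive
% independence relation on Sigma (pairs of letters that may be swapped).
\[
  \mathbb{M}(\Sigma, I) \;=\; \Sigma^{*} / \equiv_{I},
\]
% where \equiv_I is the smallest congruence on Sigma^* such that
\[
  u\,a\,b\,v \;\equiv_{I}\; u\,b\,a\,v
  \quad \text{for all } u, v \in \Sigma^{*},\ (a,b) \in I .
\]
% When I is empty, no letters commute and M(Sigma, I) is just the
% free monoid Sigma^*; its elements, the equivalence classes of
% words, are the "traces".
```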