In computing, a Las Vegas algorithm is a randomized algorithm that always gives correct results; that is, it either produces the correct result or reports that it has failed. Its runtime, however, is not fixed: it varies with the input and with the random choices made during execution. The usual definition of a Las Vegas algorithm includes the restriction that the expected runtime be finite, where the expectation is taken over the space of random information, or entropy, used in the algorithm. An alternative definition requires that a Las Vegas algorithm always terminate (be effective), but allows it to output a symbol that is not part of the solution space to indicate failure to find a solution. The nature of Las Vegas algorithms makes them suitable in situations where the number of possible solutions is relatively limited, and where verifying the correctness of a candidate solution is relatively easy while finding a solution is complex.
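To illustrate the alternative definition, here is a minimal sketch (an illustration added here, not taken from the text above; the function name random_prime, the trial budget, and the use of None as the failure symbol are choices made for this sketch). It samples candidates, verifies each one cheaply, and always terminates, either with a verified solution or with a symbol outside the solution space:

    # Always-terminating variant: verify random candidates, return None on failure.
    import random

    def random_prime(lo, hi, max_trials=10000):
        def is_prime(m):                        # verification is easy
            if m < 2:
                return False
            d = 2
            while d * d <= m:
                if m % d == 0:
                    return False
                d += 1
            return True

        for _ in range(max_trials):             # bounded number of trials
            candidate = random.randrange(lo, hi)
            if is_prime(candidate):
                return candidate                # any returned value is correct
        return None                             # failure symbol outside the solution space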
Las Vegas algorithms are prominent in the field of artificial intelligence, and in other areas of computer science and operations research. In AI, stochastic local search (SLS) algorithms are considered to be of Las Vegas type. SLS algorithms have been used to address NP-complete decision problems and NP-hard combinatorial optimization problems. However, some systematic search methods, such as modern variants of the Davis–Putnam algorithm for propositional satisfiability (SAT), also utilize non-deterministic decisions, and can thus also be considered Las Vegas algorithms.
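To make the connection to SAT concrete, the following is a minimal sketch of a stochastic local search procedure of the kind described above (a WalkSAT-style random walk; the function name sls_sat, the DIMACS-style clause encoding, and the flip budget are illustrative assumptions, not taken from any particular solver):

    # Stochastic local search for SAT: flip variables of unsatisfied clauses at random.
    import random

    def sls_sat(clauses, n_vars, max_flips=100000):
        # clauses: list of clauses, each a list of non-zero ints
        # (positive = variable, negative = negated variable).
        assign = {v: random.choice([True, False]) for v in range(1, n_vars + 1)}

        def satisfied(clause):
            return any(assign[abs(lit)] == (lit > 0) for lit in clause)

        for _ in range(max_flips):
            unsat = [c for c in clauses if not satisfied(c)]
            if not unsat:
                return assign                   # verified solution: always correct
            var = abs(random.choice(random.choice(unsat)))  # variable in a random unsatisfied clause
            assign[var] = not assign[var]       # flip it
        return None                             # report failure (Las Vegas-style)

Whenever this procedure returns an assignment, the assignment has been verified against every clause, so a reported solution is always correct; only the number of flips, and whether a solution is found within the budget, is random.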
Las Vegas algorithms were introduced by László Babai in 1979, in the context of the graph isomorphism problem, as a dual to Monte Carlo algorithms. Babai introduced the term "Las Vegas algorithm" alongside an example involving coin flips: the algorithm depends on a series of independent coin flips, and there is a small chance of failure (no result). However, in contrast to Monte Carlo algorithms, the Las Vegas algorithm can guarantee the correctness of any reported result.
For example, the following sketch searches for an index k with A[k] == 1 in an array A that is assumed to contain at least one 1 (so the expected number of iterations is finite); it keeps guessing indices uniformly at random until it succeeds:

    # Las Vegas algorithm: guess indices until one hits a 1.
    import random

    def find_one(A):
        n = len(A)
        while True:
            k = random.randrange(n)   # pick an index uniformly at random
            if A[k] == 1:
                return k              # a returned result is always correct
As mentioned above, Las Vegas algorithms always return correct results; in this example only the number of random guesses, and hence the runtime, is left to chance.
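For contrast, a Monte Carlo counterpart of the same search (a hypothetical variant written here for illustration, not part of the original example) runs for a bounded number of guesses and then returns whatever index it last tried, so its runtime is fixed but its answer may be wrong:

    # Monte Carlo counterpart: bounded runtime, possibly incorrect result.
    import random

    def find_one_monte_carlo(A, max_trials=300):
        n = len(A)
        k = 0
        for _ in range(max_trials):
            k = random.randrange(n)
            if A[k] == 1:
                break                  # correct answer found early
        return k                       # may be an index of a 0 if every trial missed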
A randomized algorithm is an algorithm that employs a degree of randomness as part of its logic or procedure. The algorithm typically uses uniformly random bits as an auxiliary input to guide its behavior, in the hope of achieving good performance in the "average case" over all possible choices of random bits; thus either the running time, or the output (or both) are random variables.
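A standard example of this (sketched here for illustration; it is not discussed in the text above) is quicksort with a uniformly random pivot: the output is always the correctly sorted sequence, while the running time is a random variable determined by the pivot choices.

    # Randomized quicksort: random bits guide pivot selection; the output is
    # deterministic, the running time is a random variable (expected O(n log n)).
    import random

    def randomized_quicksort(xs):
        if len(xs) <= 1:
            return list(xs)
        pivot = random.choice(xs)
        smaller = [x for x in xs if x < pivot]
        equal   = [x for x in xs if x == pivot]
        larger  = [x for x in xs if x > pivot]
        return randomized_quicksort(smaller) + equal + randomized_quicksort(larger)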
In computational complexity theory, a branch of computer science, bounded-error probabilistic polynomial time (BPP) is the class of decision problems solvable by a probabilistic Turing machine in polynomial time with an error probability bounded by 1/3 for all instances. BPP is one of the largest practical classes of problems, meaning most problems of interest in BPP have efficient probabilistic algorithms that can be run quickly on real modern machines.
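One practical consequence of the 1/3 error bound is that it can be driven down by repetition: running the decider independently several times and taking a majority vote makes the overall error probability exponentially small in the number of runs (by a Chernoff bound). A minimal sketch, where bpp_decider is a hypothetical stand-in for any such probabilistic decision procedure that takes the input and its random coins:

    # Error amplification for a BPP-style decider by independent repetition.
    import random
    from collections import Counter

    def amplified(bpp_decider, x, runs=51):
        # bpp_decider(x, coins) -> True/False, wrong with probability at most 1/3.
        votes = Counter(bpp_decider(x, random.getrandbits(64)) for _ in range(runs))
        return votes.most_common(1)[0][0]       # majority answer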
In computational complexity theory, a complexity class is a set of computational problems "of related resource-based complexity". The two most commonly analyzed resources are time and memory. In general, a complexity class is defined in terms of a type of computational problem, a model of computation, and a bounded resource like time or memory. In particular, most complexity classes consist of decision problems that are solvable with a Turing machine, and are differentiated by their time or space (memory) requirements.
An introduction to artificial intelligence techniques, complemented by programming exercises that illustrate the algorithms and by examples of their application to practical problems.
Background: Significant efforts have been made in building large-scale kinetic models of cellular metabolism in the past two decades. However, most kinetic models published to date remain focused on central carbon pathways or are built around ad hoc re ...
This paper is devoted to the distributed complexity of finding an approximation of the maximum cut (MAXCUT) in graphs. A classical algorithm consists of letting each vertex choose its side of the cut uniformly at random. This does not require any communication and ...