
# Lecture: Computation & Algorithms I

Description

This lecture covers the complexity of algorithms, focusing on time complexity and asymptotic analysis. It explains how to compute the time complexity of an algorithm by estimating the number of basic instructions executed as a function of the input size. The lecture introduces the Landau notations (Big O, Big Omega, Big Theta) for analyzing the behavior of algorithms as the input size grows, and discusses examples of functions and their asymptotic complexities, demonstrating how to approximate complexity using Big O/Theta notation.
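The counting approach described above can be sketched concretely. The snippet below is an illustrative example, not taken from the lecture: it sums a list while tallying a chosen set of "basic instructions", so the operation count can be compared against the input size. Which operations count as basic is an assumption made here for illustration.

```python
def count_ops_sum(values):
    """Sum a list while tallying basic instructions (here: assignments
    and additions) to estimate time complexity empirically.
    The exact set of counted operations is an illustrative choice."""
    ops = 0
    total = 0
    ops += 1          # one assignment to initialise total
    for v in values:
        total += v
        ops += 1      # one addition per element
    return total, ops

# For a list of n elements the count is n + 1, i.e. the running
# time grows linearly: Theta(n).
```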





In course

CS-119(g): Information, Computation, Communication

The objective of this course is to introduce students to algorithmic thinking, to familiarize them with the fundamentals of computer science and communications, and to develop a first competence

Related concepts (75)

Computational complexity

In computer science, the computational complexity or simply complexity of an algorithm is the amount of resources required to run it. Particular focus is given to computation time (generally measured by the number of needed elementary operations) and memory storage requirements. The complexity of a problem is the complexity of the best algorithms that allow solving the problem. The study of the complexity of explicitly given algorithms is called analysis of algorithms, while the study of the complexity of problems is called computational complexity theory.
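The distinction above between the complexity of an algorithm and the complexity of a problem can be illustrated with searching in a sorted list: linear search takes O(n) elementary comparisons, while binary search takes O(log n), so the problem itself is at most O(log n). A minimal sketch (not from the source page, using Python's standard `bisect` module):

```python
import bisect

def linear_search(sorted_xs, target):
    """O(n) algorithm: examine elements one by one."""
    for i, x in enumerate(sorted_xs):
        if x == target:
            return i
    return -1

def binary_search(sorted_xs, target):
    """O(log n) algorithm: halve the search range each step."""
    i = bisect.bisect_left(sorted_xs, target)
    if i < len(sorted_xs) and sorted_xs[i] == target:
        return i
    return -1
```

Both algorithms solve the same problem; the existence of the faster one is what bounds the problem's complexity.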

Complexity class

In computational complexity theory, a complexity class is a set of computational problems "of related resource-based complexity". The two most commonly analyzed resources are time and memory. In general, a complexity class is defined in terms of a type of computational problem, a model of computation, and a bounded resource like time or memory. In particular, most complexity classes consist of decision problems that are solvable with a Turing machine, and are differentiated by their time or space (memory) requirements.

P (complexity)

In computational complexity theory, P, also known as PTIME or DTIME(n^O(1)), is a fundamental complexity class. It contains all decision problems that can be solved by a deterministic Turing machine using a polynomial amount of computation time, or polynomial time. Cobham's thesis holds that P is the class of computational problems that are "efficiently solvable" or "tractable". This is inexact: in practice, some problems not known to be in P have practical solutions, and some that are in P do not, but this is a useful rule of thumb.
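A standard example of a decision problem in P (an illustration added here, not from the source page) is directed-graph reachability: "is there a path from s to t?" Breadth-first search answers it in O(V + E) time, which is polynomial in the input size:

```python
from collections import deque

def reachable(adj, s, t):
    """Decide whether t is reachable from s in a directed graph
    given as an adjacency dict. BFS runs in O(V + E) time,
    so this decision problem is in P."""
    seen = {s}
    queue = deque([s])
    while queue:
        u = queue.popleft()
        if u == t:
            return True
        for v in adj.get(u, ()):
            if v not in seen:
                seen.add(v)
                queue.append(v)
    return False
```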

Big O notation

Big O notation is a mathematical notation that describes the limiting behavior of a function when the argument tends towards a particular value or infinity. Big O is a member of a family of notations invented by Paul Bachmann, Edmund Landau, and others, collectively called Bachmann–Landau notation or asymptotic notation. The letter O was chosen by Bachmann to stand for Ordnung, meaning the order of approximation. In computer science, big O notation is used to classify algorithms according to how their run time or space requirements grow as the input size grows.
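The Bachmann–Landau notations named above have standard formal definitions, sketched here in LaTeX for reference:

```latex
f(n) = O(\,g(n)\,)      \iff \exists\, c > 0,\ \exists\, n_0 : \forall n \ge n_0,\ |f(n)| \le c\,|g(n)|
f(n) = \Omega(\,g(n)\,) \iff \exists\, c > 0,\ \exists\, n_0 : \forall n \ge n_0,\ f(n) \ge c\,g(n)
f(n) = \Theta(\,g(n)\,) \iff f(n) = O(g(n)) \ \text{and}\ f(n) = \Omega(g(n))
```

For example, 3n^2 + 5n = Theta(n^2): take c = 4 and n_0 = 5 for the upper bound, and c = 3 for the lower bound.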

Computational complexity theory

In theoretical computer science and mathematics, computational complexity theory focuses on classifying computational problems according to their resource usage, and relating these classes to each other. A computational problem is a task solved by a computer, solvable by mechanical application of mathematical steps, such as an algorithm. A problem is regarded as inherently difficult if its solution requires significant resources, whatever the algorithm used.

Related lectures (15)

Algorithmic Complexity: Big-O Notation

Introduces Big-O notation for analyzing algorithm efficiency, covering logic, structures, and growth functions.

Quantum Chaos and Scrambling

Explores the concept of scrambling in quantum chaotic systems, connecting classical chaos to quantum chaos and emphasizing sensitivity to initial conditions.

Complexity & Induction: Algorithms & Proofs

Covers worst-case complexity, algorithms, and proofs including mathematical induction and recursion.

Algorithms & Growth of Functions

Covers optimization algorithms, stable matching, and Big-O notation for algorithm efficiency.

Harmonic Forms and Riemann Surfaces

Explores harmonic forms on Riemann surfaces, covering uniqueness of solutions and the Riemann bilinear identity.