
Publication: Unambiguous DNFs and Alon-Saks-Seymour

Abstract

We exhibit an unambiguous k-DNF formula that requires CNF width $\widetilde{\Omega}(k^2)$, which is optimal up to logarithmic factors. As a consequence, we get a near-optimal solution to the Alon-Saks-Seymour problem in graph theory (posed in 1991), which asks: How large a gap can there be between the chromatic number of a graph and its biclique partition number? Our result is also known to imply several other improved separations in query and communication complexity.
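As a toy illustration of the two graph quantities in the Alon-Saks-Seymour problem (not taken from the paper), the following Python sketch checks the standard star partition of the complete graph K_5 into bicliques: K_5 needs 5 colors, yet its edges can be partitioned into only 4 bicliques, matching the Graham-Pollak lower bound of n-1.

```python
from itertools import combinations

def star_biclique_partition(n):
    """Partition the edges of K_n into n-1 bicliques (stars).

    Biclique i is the complete bipartite graph between {i} and {i+1, ..., n-1}.
    """
    return [({i}, set(range(i + 1, n))) for i in range(n - 1)]

def edges_of_biclique(left, right):
    return {frozenset((u, v)) for u in left for v in right}

n = 5
parts = star_biclique_partition(n)
covered = [e for left, right in parts for e in edges_of_biclique(left, right)]

# A valid biclique *partition*: every edge of K_5 is covered exactly once.
all_edges = {frozenset(e) for e in combinations(range(n), 2)}
assert set(covered) == all_edges and len(covered) == len(all_edges)
print(len(parts))  # 4 bicliques, while the chromatic number of K_5 is 5
```

For complete graphs the gap is thus only linear; the publication above concerns how much larger the chromatic number can be than the biclique partition number in general.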


Related concepts

Graph theory

In mathematics, graph theory is the study of graphs, which are mathematical structures used to model pairwise relations between objects. A graph in this context is made up of vertices (also called nodes or points) which are connected by edges (also called links or lines). A distinction is made between undirected graphs, where edges link two vertices symmetrically, and directed graphs, where edges link two vertices asymmetrically. Graphs are one of the principal objects of study in discrete mathematics.
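The distinction between undirected and directed graphs can be made concrete with adjacency lists; this is a generic illustration (not tied to the publication above), where the edge sets are chosen arbitrarily.

```python
# Undirected graph: each edge is stored in both endpoints' lists (symmetric).
undirected = {"A": ["B"], "B": ["A", "C"], "C": ["B"]}

# Directed graph: an edge A -> B appears only in A's list (asymmetric).
directed = {"A": ["B"], "B": ["C"], "C": []}

assert "A" in undirected["B"] and "B" in undirected["A"]  # symmetric link
assert "B" in directed["A"] and "A" not in directed["B"]  # one-way link
```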

Communication complexity

In theoretical computer science, communication complexity studies the amount of communication required to solve a problem when the input to the problem is distributed among two or more parties. The study of communication complexity was first introduced by Andrew Yao in 1979, while studying the problem of computation distributed among several machines. The problem is usually stated as follows: two parties (traditionally called Alice and Bob) each receive a (potentially different) n-bit string, x and y respectively.
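A minimal sketch of the setting, using the EQUALITY problem as an example: in the trivial protocol, Alice sends her entire string and Bob compares, costing n bits. For EQUALITY this is essentially optimal for deterministic protocols (a classical fooling-set lower bound), which is what makes the measure interesting.

```python
def trivial_equality_protocol(x, y):
    """Alice sends her whole n-bit string; Bob compares locally and answers.

    Returns (answer, bits_communicated). Cost: n bits, the trivial upper
    bound, which deterministic protocols for EQUALITY cannot beat by much.
    """
    transcript = x               # everything Alice communicates to Bob
    answer = (transcript == y)   # Bob's local computation
    return answer, len(transcript)

ans, cost = trivial_equality_protocol("1011", "1011")
assert ans is True and cost == 4
```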

Computational complexity theory

In theoretical computer science and mathematics, computational complexity theory focuses on classifying computational problems according to their resource usage, and relating these classes to each other. A computational problem is a task solved by a computer. A computational problem is solvable by mechanical application of mathematical steps, such as an algorithm. A problem is regarded as inherently difficult if its solution requires significant resources, whatever the algorithm used.
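"Resource usage" can be made concrete by counting the steps two algorithms spend on the same problem; this is a generic illustration (membership in a sorted list), not tied to the publication above.

```python
def linear_search_steps(xs, target):
    """Scan left to right, counting comparisons (linear resource usage)."""
    steps = 0
    for x in xs:
        steps += 1
        if x == target:
            break
    return steps

def binary_search_steps(xs, target):
    """Halve the sorted range each step (logarithmic resource usage)."""
    steps, lo, hi = 0, 0, len(xs) - 1
    while lo <= hi:
        steps += 1
        mid = (lo + hi) // 2
        if xs[mid] == target:
            break
        if xs[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return steps

xs = list(range(1024))
assert linear_search_steps(xs, 1023) == 1024  # worst case: every element
assert binary_search_steps(xs, 1023) <= 11    # at most ~log2(1024) + 1 steps
```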

Related publications

Since the 2008 Global Financial Crisis, the financial market has become more unpredictable than ever, and it seems set to remain so for the foreseeable future. An investor therefore faces unprecedented risks, hence the growing need for robust portfolio optimization to protect against uncertainty; such uncertainty can be devastating if left unaddressed, yet it is ignored in the classical Markowitz model, whose other deficiency is the absence of higher moments in its assumed distribution of asset returns. We establish an equivalence between the Markowitz model and the portfolio return value-at-risk optimization problem under multivariate normality of asset returns, so that we can add these excluded features to the former implicitly by incorporating them into the latter.

We also provide a probabilistic smoothing spline approximation method and a deterministic model within the location-scale framework, under an elliptical distribution of the asset returns, to solve the robust portfolio return value-at-risk optimization problem. In particular, for the deterministic model, we introduce a novel eigendecomposition uncertainty set that lives in the positive-definite space for the scale matrix without compromising the computational complexity or conservativeness of the optimization problem, devise a method to determine the size of the uncertainty sets involved, test it on real data, and explore its diversification properties.

Although value-at-risk has been the standard risk measure adopted by the banking and insurance industry since the early nineties, it has since attracted many criticisms, in particular from McNeil et al. (2005) and the Basel Committee on Banking Supervision in 2012 (also known as Basel 3.5). Basel 4 even suggests a move away from the 'what' value-at-risk to the 'what-if' conditional value-at-risk measure. We shall see that the former may easily be replaced with the latter, or even with other risk measures, in our formulations.

Related MOOCs

Digital Signal Processing I

Basic signal processing concepts, Fourier analysis, and filters. This module can
be used as a starting point or as a basic refresher in elementary DSP.
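As a flavor of the filtering concepts this module covers, here is a minimal moving-average FIR filter in pure Python (an illustrative sketch, not course material).

```python
def moving_average(signal, taps=3):
    """Smooth a signal by averaging each sample with its recent past."""
    out = []
    for n in range(len(signal)):
        window = signal[max(0, n - taps + 1): n + 1]  # last `taps` samples
        out.append(sum(window) / len(window))
    return out

noisy = [0, 3, 0, 3, 0, 3]
smoothed = moving_average(noisy)
# The filter attenuates the fast alternation, pulling values toward the mean.
```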

Digital Signal Processing II

Adaptive signal processing, A/D and D/A conversion. This module provides the basic
tools for adaptive filtering and a solid mathematical framework for sampling and
quantization.
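Two illustrative sketches of this module's themes (not course material): a one-tap LMS adaptive filter identifying an unknown gain, and a uniform mid-rise quantizer as the basic operation behind A/D conversion.

```python
def lms_identify(inputs, desired, mu=0.1):
    """Adapt a single filter weight w so that w * x tracks the desired signal."""
    w = 0.0
    for x, d in zip(inputs, desired):
        e = d - w * x      # error between desired signal and filter output
        w += mu * e * x    # LMS weight update
    return w

def quantize(sample, bits=3):
    """Map a sample in [-1, 1) to the midpoint of one of 2**bits uniform cells."""
    levels = 2 ** bits
    step = 2.0 / levels
    index = min(int((sample + 1.0) / step), levels - 1)
    return -1.0 + (index + 0.5) * step

xs = [1.0, -1.0] * 50
w = lms_identify(xs, [0.7 * x for x in xs])  # identify an unknown gain of 0.7
assert abs(w - 0.7) < 1e-3

assert abs(quantize(0.3) - 0.3) <= 0.125     # error at most half a step
```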

Digital Signal Processing III

Advanced topics: this module covers real-time audio processing (with
examples on a hardware board), image processing and communication system design.