Publication

Wavelet packet best basis search using generalized Rényi entropy

Volkan Cevher
2002
Conference paper
Abstract

This paper introduces an approach to wavelet packet best basis search using the generalized Rényi entropy. It extends the work of R.R. Coifman and M.V. Wickerhauser, who showed how Shannon entropy can be used as an additive cost function in wavelet packet best basis selection (IEEE Trans. Inform. Theory, vol. 38, no. 2, pp. 713-718, 1992). The paper first recasts the idea of an additive cost function as an arithmetic mean, in a way consistent with the Coifman-Wickerhauser approach, and then extends the arithmetic mean to the geometric mean. This second extension makes it possible to use the generalized Rényi entropy as a cost function in the best basis search. Together, the two extensions also allow the use of incomplete probability distributions, whereas Coifman and Wickerhauser's entropy-based cost function is limited to complete probability distributions.
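As an illustration of the underlying search, the sketch below implements the standard bottom-up best basis comparison over a wavelet packet tree, with a Rényi entropy of the normalized coefficient energies as the cost. This is a minimal sketch under stated assumptions, not the paper's exact construction: the Haar filters, the energy normalization, and the choice alpha = 0.5 are all illustrative, and the simple sum-of-children comparison follows Coifman and Wickerhauser's additive scheme rather than the mean-based redefinition developed in the paper.

```python
# Illustrative best basis search over a Haar wavelet packet tree,
# pruned with a Renyi entropy cost (assumptions noted above).
import numpy as np

def haar_split(x):
    """One packet split: orthonormal Haar lowpass/highpass halves."""
    even, odd = x[0::2], x[1::2]
    return (even + odd) / np.sqrt(2), (even - odd) / np.sqrt(2)

def renyi_cost(coeffs, alpha=0.5, eps=1e-12):
    """Renyi entropy of the normalized coefficient energies."""
    p = coeffs**2 / (np.sum(coeffs**2) + eps)
    return np.log(np.sum(p**alpha) + eps) / (1.0 - alpha)

def best_basis(x, depth, cost=renyi_cost):
    """Keep a node if its cost beats the total cost of its children's
    best bases; otherwise recurse into the split."""
    if depth == 0:
        return [x], cost(x)
    low, high = haar_split(x)
    basis_l, cost_l = best_basis(low, depth - 1, cost)
    basis_h, cost_h = best_basis(high, depth - 1, cost)
    if cost(x) <= cost_l + cost_h:   # parent wins: stop splitting
        return [x], cost(x)
    return basis_l + basis_h, cost_l + cost_h

# Usage: pick the minimum-cost basis over a 3-level packet tree.
signal = np.cos(2 * np.pi * 0.3 * np.arange(64)) + 0.1 * np.random.randn(64)
basis, total_cost = best_basis(signal, depth=3)
print(len(basis), "basis nodes, total cost", total_cost)
```

Keeping a parent node whenever splitting does not lower the total cost prunes the tree in a single bottom-up pass, which is what makes this family of searches efficient.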

Related concepts (33)
Rényi entropy
In information theory, the Rényi entropy is a quantity that generalizes various notions of entropy, including Hartley entropy, Shannon entropy, collision entropy, and min-entropy. The Rényi entropy is named after Alfréd Rényi, who looked for the most general way to quantify information while preserving additivity for independent events. In the context of fractal dimension estimation, the Rényi entropy forms the basis of the concept of generalized dimensions. The Rényi entropy is also important in ecology and statistics as an index of diversity.
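As a quick illustration (not from the publication above), the Rényi entropy of a distribution $p$ is $H_\alpha(p) = \frac{1}{1-\alpha} \log \sum_i p_i^\alpha$, and the named special cases arise at particular values or limits of $\alpha$:

```python
# Illustrative computation of the Renyi entropy and its special cases.
import numpy as np

def renyi_entropy(p, alpha):
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                      # convention: 0 * log 0 = 0
    if alpha == 1.0:                  # limit alpha -> 1: Shannon entropy
        return -np.sum(p * np.log2(p))
    if np.isinf(alpha):               # limit alpha -> inf: min-entropy
        return -np.log2(np.max(p))
    return np.log2(np.sum(p**alpha)) / (1.0 - alpha)

p = [0.5, 0.25, 0.125, 0.125]
print(renyi_entropy(p, 0))        # Hartley entropy: log2(4) = 2 bits
print(renyi_entropy(p, 1.0))      # Shannon entropy: 1.75 bits
print(renyi_entropy(p, 2.0))      # collision entropy
print(renyi_entropy(p, np.inf))   # min-entropy: -log2(0.5) = 1 bit
```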
Entropy (information theory)
In information theory, the entropy of a random variable is the average level of "information", "surprise", or "uncertainty" inherent to the variable's possible outcomes. Given a discrete random variable $X$, which takes values in the alphabet $\mathcal{X}$ and is distributed according to $p : \mathcal{X} \to [0, 1]$, the entropy is defined as $H(X) = -\sum_{x \in \mathcal{X}} p(x) \log p(x)$, where $\sum$ denotes the sum over the variable's possible values. The choice of base for $\log$, the logarithm, varies for different applications: base 2 gives the unit of bits (or "shannons"), base $e$ gives "natural units" (nats), and base 10 gives "dits", "bans", or "hartleys".
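A self-contained example of the base choice: the entropy of a fair coin is 1 bit, which is the same quantity expressed as $\ln 2 \approx 0.693$ nats or $\log_{10} 2 \approx 0.301$ hartleys.

```python
# The same entropy in different units just rescales the logarithm.
import math
p = [0.5, 0.5]
H_bits = -sum(q * math.log2(q) for q in p)    # base 2: bits/shannons
H_nats = -sum(q * math.log(q) for q in p)     # base e: nats
H_dits = -sum(q * math.log10(q) for q in p)   # base 10: hartleys/bans
print(H_bits, H_nats, H_dits)                 # 1.0, 0.693..., 0.301...
```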
Min-entropy
The min-entropy, in information theory, is the smallest of the Rényi family of entropies, corresponding to the most conservative way of measuring the unpredictability of a set of outcomes, as the negative logarithm of the probability of the most likely outcome. The various Rényi entropies are all equal for a uniform distribution, but measure the unpredictability of a nonuniform distribution in different ways.
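A brief numeric check of this statement, with illustrative distributions and the same definition as in the sketch above:

```python
# For a uniform distribution all Renyi entropies coincide at log2(n);
# for a nonuniform one they decrease in alpha, min-entropy being smallest.
import numpy as np

def renyi(p, alpha):
    p = np.asarray(p, dtype=float)
    if np.isinf(alpha):
        return -np.log2(p.max())
    if alpha == 1.0:
        return -np.sum(p * np.log2(p))
    return np.log2(np.sum(p**alpha)) / (1.0 - alpha)

uniform = [0.25] * 4
skewed = [0.7, 0.1, 0.1, 0.1]
for a in (0.5, 1.0, 2.0, np.inf):
    print(a, renyi(uniform, a), renyi(skewed, a))
# uniform: 2.0 bits for every alpha; skewed: strictly decreasing in alpha
```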
Related publications (44)

Information Spectrum Converse for Minimum Entropy Couplings and Functional Representations

Given two jointly distributed random variables (X,Y), a functional representation of X is a random variable Z independent of Y, and a deterministic function g(⋅,⋅) such that X=g(Y,Z). The problem of finding a minimum entropy functional representation is kn ...
2023

A Functional Perspective on Information Measures

Amedeo Roberto Esposito

Since the birth of Information Theory, researchers have defined and exploited various information measures, as well as endowed them with operational meanings. Some were born as a "solution to a problem", like Shannon's Entropy and Mutual Information. Other ...
EPFL, 2022

From Generalisation Error to Transportation-cost Inequalities and Back

Michael Christoph Gastpar, Amedeo Roberto Esposito

In this work, we connect the problem of bounding the expected generalisation error with transportation-cost inequalities. Exposing the underlying pattern behind both approaches we are able to generalise them and go beyond Kullback-Leibler Divergences/Mutu ...
2022
Related MOOCs (6)
Advanced statistical physics
We explore statistical physics in both classical and open quantum systems. Additionally, we will cover probabilistic data analysis that is extremely useful in many applications.
Selected Topics on Discrete Choice
Discrete choice models are used extensively in many disciplines where it is important to predict human behavior at a disaggregate level. This course is a follow-up of the online course "Introduction t ...
