The conditional mean is a fundamental quantity whose applications include the theories of estimation and rate-distortion. It is also notoriously difficult to work with. This paper establishes novel bounds on the differential entropy of the co ...
This paper considers an additive Gaussian noise channel with arbitrarily distributed finite variance input signals. It studies the differential entropy of the minimum mean-square error (MMSE) estimator and provides a new lower bound which connects the diff ...
Information concentration of probability measures has important implications in learning theory. Recently, it was discovered that the information content of a log-concave distribution concentrates around its differential entropy, albeit with an unpleasan ...
Pressurized fluid-distribution networks are key strategic elements of infrastructure. Drinking water is a precious resource and will become more and more important with the depletion of reserves. With the growth of the human population, challenges related ...
The entropy power inequality (EPI) yields lower bounds on the differential entropy of the sum of two independent real-valued random variables in terms of the individual entropies. Versions of the EPI for discrete random variables have been obtained for spe ...
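For reference, the classical EPI for continuous random variables (a standard statement, not quoted from the abstract above) reads: if $X$ and $Y$ are independent real-valued random variables with densities and differential entropies $h(X)$ and $h(Y)$, then

$$
e^{2h(X+Y)} \;\ge\; e^{2h(X)} + e^{2h(Y)},
$$

with equality if and only if $X$ and $Y$ are Gaussian. The discrete analogues discussed in the abstract replace differential entropy with Shannon entropy, where no such clean inequality holds in general.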
Institute of Electrical and Electronics Engineers, 2014
Good prediction of the behavior of wind around buildings improves designs for natural ventilation in warm climates. However, wind modeling is complex, and predictions are often inaccurate due to large uncertainties in parameter values. The goal of this work ...
We consider the N-relay Gaussian diamond network when the source and the destination have ns ≥ 2 and nd ≥ 2 antennas, respectively. We show that when ns = nd = 2, and when the individual MISO channels from the source to each relay and the SIMO channels from ...