Investigation of Alternative Measures for Mutual Information

Bulut Kuskonmaz, Jaron Skovsted Gundersen, Rafal Wisniewski*

*Corresponding author for this work

Research output: Contribution to journal › Conference article in Journal › Research › peer-review


Abstract

Mutual information I(X; Y) is a quantity in information theory that measures how much information the random variable Y carries about the random variable X. One way to define the mutual information is by comparing the joint distribution of X and Y with the product of the marginals through the Kullback-Leibler (KL) divergence. If these two distributions are close to each other, Y leaks almost no information about X, since the two variables are then close to being independent. In the discrete setting, the mutual information has the appealing interpretation of how many bits Y reveals about X. In the continuous case, however, this interpretation no longer applies, which motivates trying other metrics or divergences in its place. In this paper, we evaluate different metrics and divergences as alternatives to the mutual information in the continuous case. We deploy different methods to estimate or bound these metrics and divergences and evaluate their performance.
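As a concrete illustration of the KL-based definition mentioned in the abstract (not taken from the paper; the joint distribution p_xy below is a made-up example), the following minimal Python sketch computes I(X;Y) = D_KL(P_XY || P_X P_Y) for two discrete variables, using log base 2 so that the result is expressed in bits.

    import numpy as np

    # Hypothetical joint distribution of two binary variables X and Y
    # (illustrative values only, not from the paper).
    p_xy = np.array([[0.30, 0.10],
                     [0.05, 0.55]])

    p_x = p_xy.sum(axis=1, keepdims=True)   # marginal distribution of X
    p_y = p_xy.sum(axis=0, keepdims=True)   # marginal distribution of Y

    # Mutual information as the KL divergence between the joint
    # distribution and the product of the marginals,
    # I(X;Y) = sum_{x,y} p(x,y) * log2( p(x,y) / (p(x) p(y)) ).
    mask = p_xy > 0                          # skip zero-probability cells
    mi_bits = np.sum(p_xy[mask] * np.log2(p_xy[mask] / (p_x * p_y)[mask]))

    print(f"I(X;Y) = {mi_bits:.4f} bits")

In this discrete setting the output directly reads as the number of bits Y reveals about X; the paper's point is that this reading is lost in the continuous case, which opens the door to alternative metrics and divergences.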

Original language: English
Book series: IFAC-PapersOnLine
Volume: 55
Issue number: 16
Pages (from-to): 154-159
Number of pages: 6
ISSN: 2405-8963
DOIs
Publication status: Published - 1 Jul 2022
Event: 18th IFAC Workshop on Control Applications of Optimization CAO 2022 - Gif Sur Yvette, France
Duration: 18 Jul 2022 - 22 Jul 2022

Conference

Conference: 18th IFAC Workshop on Control Applications of Optimization CAO 2022
Country/Territory: France
City: Gif Sur Yvette
Period: 18/07/2022 - 22/07/2022
