Investigation of Alternative Measures for Mutual Information

Publication: Contribution to journal › Conference article in journal › Research › peer review

2 Citations (Scopus)
48 Downloads (Pure)

Abstract

Mutual information I(X; Y) is a useful quantity in information theory for estimating how much information the random variable Y holds about the random variable X. One way to define the mutual information is by comparing the joint distribution of X and Y with the product of the marginals through the Kullback-Leibler (KL) divergence. If the two distributions are close to each other, there is almost no leakage of X from Y, since the two variables are close to being independent. In the discrete setting the mutual information has the nice interpretation as the number of bits Y reveals about X. In the continuous case, however, this interpretation does not carry over. This fact allows us to try different metrics or divergences to define the mutual information. In this paper, we evaluate different metrics and divergences as alternatives to the mutual information in the continuous case. We deploy different methods to estimate or bound these metrics and divergences and evaluate their performance.
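For reference, the KL-based definition sketched in the abstract can be written (a standard formulation, not quoted from the paper) as

\[
I(X;Y) \;=\; D_{\mathrm{KL}}\!\left(P_{XY}\,\middle\|\,P_X \otimes P_Y\right)
\;=\; \int p_{XY}(x,y)\,\log\frac{p_{XY}(x,y)}{p_X(x)\,p_Y(y)}\,\mathrm{d}x\,\mathrm{d}y ,
\]

so I(X;Y) = 0 exactly when X and Y are independent. As a minimal sketch of how such a quantity can be estimated in the continuous case (not the paper's methods; the bivariate Gaussian example and the scikit-learn k-NN estimator are assumptions chosen for illustration), one can check an estimator against the Gaussian closed form I(X;Y) = -0.5 * log(1 - rho^2):

import numpy as np
from sklearn.feature_selection import mutual_info_regression

# Illustrative example only: jointly Gaussian (X, Y) with correlation rho,
# for which the mutual information has a closed form in nats.
rng = np.random.default_rng(0)
rho, n = 0.8, 10_000
x = rng.standard_normal(n)
y = rho * x + np.sqrt(1.0 - rho**2) * rng.standard_normal(n)

mi_closed_form = -0.5 * np.log(1.0 - rho**2)

# k-nearest-neighbour estimate of I(X; Y) (scikit-learn, returns nats)
mi_knn = mutual_info_regression(x.reshape(-1, 1), y, n_neighbors=3, random_state=0)[0]

print(f"closed form: {mi_closed_form:.3f} nats, kNN estimate: {mi_knn:.3f} nats")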

Original language: English
Book series: IFAC-PapersOnLine
Volume: 55
Issue number: 16
Pages (from-to): 154-159
Number of pages: 6
ISSN: 2405-8963
DOI
Status: Published - 1 Jul 2022
Event: 18th IFAC Workshop on Control Applications of Optimization CAO 2022 - Gif Sur Yvette, France
Duration: 18 Jul 2022 - 22 Jul 2022

Conference

Conference: 18th IFAC Workshop on Control Applications of Optimization CAO 2022
Country/Territory: France
City: Gif Sur Yvette
Period: 18/07/2022 - 22/07/2022
