Abstract
Mutual information I(X; Y) is a central quantity in information theory that measures how much information the random variable Y carries about the random variable X. One way to define mutual information is to compare the joint distribution of X and Y with the product of their marginals through the Kullback-Leibler (KL) divergence. If the two distributions are close to each other, Y leaks almost no information about X, since the two variables are then close to independent. In the discrete setting, mutual information has the appealing interpretation of the number of bits Y reveals about X. In the continuous case, however, this interpretation no longer applies, which motivates trying other metrics or divergences to define mutual information. In this paper, we evaluate different metrics and divergences as alternatives to mutual information in the continuous case. We deploy different methods to estimate or bound these metrics and divergences and evaluate their performance.
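For intuition (not part of the paper itself), a minimal sketch of the discrete KL-based definition the abstract refers to: I(X; Y) = KL(p(x, y) || p(x)p(y)), computed here in bits for a joint probability table. The function name and example distributions are illustrative assumptions.

```python
import numpy as np

def mutual_information(p_xy: np.ndarray) -> float:
    """Discrete mutual information I(X; Y) in bits, computed as the
    KL divergence between the joint p(x, y) and the product of the
    marginals p(x) p(y)."""
    p_x = p_xy.sum(axis=1, keepdims=True)   # marginal of X (column vector)
    p_y = p_xy.sum(axis=0, keepdims=True)   # marginal of Y (row vector)
    prod = p_x * p_y                        # product-of-marginals table
    mask = p_xy > 0                         # 0 * log 0 is taken as 0
    return float(np.sum(p_xy[mask] * np.log2(p_xy[mask] / prod[mask])))

# Independent fair bits: the joint equals the product of marginals,
# so the KL divergence (and hence the mutual information) is 0.
p_indep = np.outer([0.5, 0.5], [0.5, 0.5])

# Perfectly correlated fair bits: Y reveals X completely, giving 1 bit.
p_corr = np.array([[0.5, 0.0],
                   [0.0, 0.5]])
```

The two examples bracket the interpretation in the abstract: independence gives zero leakage, while full correlation gives the maximal one bit.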
Original language | English |
---|---|
Book series | IFAC-PapersOnLine |
Volume | 55 |
Issue number | 16 |
Pages (from-to) | 154-159 |
Number of pages | 6 |
ISSN | 2405-8963 |
DOIs | |
Publication status | Published - 1 Jul 2022 |
Event | 18th IFAC Workshop on Control Applications of Optimization CAO 2022 - Gif Sur Yvette, France Duration: 18 Jul 2022 → 22 Jul 2022 |
Conference
Conference | 18th IFAC Workshop on Control Applications of Optimization CAO 2022 |
---|---|
Country/Territory | France |
City | Gif Sur Yvette |
Period | 18/07/2022 → 22/07/2022 |
Projects
SWIFT
Wisniewski, R. (PI), Misra, R. (Project Participant), Rathore, S. S. (Project Participant), Kuskonmaz, B. (Project Participant), Jessen, J. F. (Project Participant), Andersen, A. O. (CoPI) & Gundersen, J. S. (Project Participant)
01/10/2019 → 30/09/2024
Project: Research