Abstract
Mutual information I(X; Y) is a central quantity in information theory that measures how much information the random variable Y carries about the random variable X. One way to define the mutual information is by comparing the joint distribution of X and Y with the product of their marginals through the Kullback-Leibler (KL) divergence. If the two distributions are close to each other, almost nothing about X leaks from Y, since the two variables are then close to being independent. In the discrete setting the mutual information has the appealing interpretation of how many bits Y reveals about X. In the continuous case, however, this interpretation is lost, which opens the door to using other metrics or divergences to define the mutual information. In this paper, we evaluate different metrics and divergences as alternatives to the mutual information in the continuous case. We deploy several methods to estimate or bound these metrics and divergences and evaluate their performance.
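The KL-based definition described above can be made concrete in the discrete setting, where the bit interpretation holds. The sketch below is illustrative only and is not the paper's method: it computes I(X; Y) as KL(P_XY || P_X P_Y) for a discrete joint distribution, and, as a stand-in for the "alternative measures" the paper investigates, also evaluates the total variation distance between the joint and the product of marginals (the specific alternatives studied in the paper may differ).

```python
import numpy as np

def mutual_information_kl(joint):
    """I(X; Y) as the KL divergence between the joint distribution and
    the product of its marginals (discrete case, log base 2, in bits).
    `joint` is a 2-D array of probabilities summing to 1."""
    px = joint.sum(axis=1, keepdims=True)   # marginal distribution of X
    py = joint.sum(axis=0, keepdims=True)   # marginal distribution of Y
    prod = px * py                          # product of the marginals
    mask = joint > 0                        # 0 * log(0) is taken as 0
    return float(np.sum(joint[mask] * np.log2(joint[mask] / prod[mask])))

def total_variation_alternative(joint):
    """A hypothetical alternative measure: total variation distance
    between the joint and the product of the marginals."""
    px = joint.sum(axis=1, keepdims=True)
    py = joint.sum(axis=0, keepdims=True)
    return float(0.5 * np.abs(joint - px * py).sum())

# Perfectly correlated bits: Y reveals exactly 1 bit about X.
correlated = np.array([[0.5, 0.0],
                       [0.0, 0.5]])
print(mutual_information_kl(correlated))       # 1.0
print(total_variation_alternative(correlated)) # 0.5

# Independent bits: both measures vanish, no information leaks.
independent = np.outer([0.5, 0.5], [0.5, 0.5])
print(mutual_information_kl(independent))      # 0.0
```

Both quantities are zero exactly when X and Y are independent, which is the property that makes other divergences plausible substitutes for KL in the continuous case, where the bit interpretation no longer applies.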
Original language | English |
---|---|
Book series | IFAC-PapersOnLine |
Volume | 55 |
Issue number | 16 |
Pages (from-to) | 154-159 |
Number of pages | 6 |
ISSN | 2405-8963 |
DOI | |
Status | Published - 1 Jul 2022 |
Event | 18th IFAC Workshop on Control Applications of Optimization CAO 2022 - Gif Sur Yvette, France. Duration: 18 Jul 2022 → 22 Jul 2022 |
Conference
Conference | 18th IFAC Workshop on Control Applications of Optimization CAO 2022 |
---|---|
Country/Territory | France |
City | Gif Sur Yvette |
Period | 18/07/2022 → 22/07/2022 |
Fingerprint
Dive into the research topics of 'Investigation of Alternative Measures for Mutual Information'. Together they form a unique fingerprint.
Projects
- 1 Completed
SWIFT
Wisniewski, R. (PI (principal investigator)), Misra, R. (Project participant), Rathore, S. S. (Project participant), Kuskonmaz, B. (Project participant), Jessen, J. F. (Project participant), Andersen, A. O. (CoPI) & Gundersen, J. S. (Project participant)
01/10/2019 → 30/09/2024
Projects: Project › Research