Decorrelation of Neutral Vector Variables: Theory and Applications

Zhanyu Ma, Jing-Hao Xue, Arne Leijon, Zheng-Hua Tan, Zhen Yang, Jun Guo

Research output: Contribution to journal › Journal article › Research › peer-review

91 Citations (Scopus)

Abstract

In this paper, we propose novel strategies for decorrelating neutral vector variables. Two fundamental invertible transformations, namely the serial nonlinear transformation and the parallel nonlinear transformation, are proposed to carry out the decorrelation. For a neutral vector variable, which is not multivariate-Gaussian distributed, conventional principal component analysis cannot yield mutually independent scalar variables. With the two proposed transformations, a highly negatively correlated neutral vector can be transformed into a set of mutually independent scalar variables with the same degrees of freedom. We also evaluate the decorrelation performance for vectors generated from a single Dirichlet distribution and from a mixture of Dirichlet distributions. The mutual independence is verified using the distance correlation measure. The advantages of the proposed decorrelation strategies are extensively studied and demonstrated with synthesized data and practical application evaluations.
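To make the decorrelation idea concrete, the following is a minimal sketch, assuming the serial nonlinear transformation coincides with the standard stick-breaking ratios implied by the neutrality of the Dirichlet distribution; the parameter values and variable names are illustrative and not taken from the paper. It draws Dirichlet samples, whose components are negatively correlated because they sum to one, applies the ratio transformation, and compares sample correlations before and after.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative Dirichlet parameters (not from the paper); K = 4 components.
alpha = np.array([2.0, 5.0, 3.0, 4.0])
X = rng.dirichlet(alpha, size=10_000)  # shape (N, K), rows sum to one

def serial_transform(X):
    """Stick-breaking-style serial transformation of a neutral vector.

    u_1 = x_1,  u_k = x_k / (1 - x_1 - ... - x_{k-1}),  k = 2, ..., K-1.
    For a Dirichlet vector the resulting u_k are mutually independent
    Beta variables, preserving the K-1 degrees of freedom of the input.
    """
    N, K = X.shape
    U = np.empty((N, K - 1))
    remaining = np.ones(N)           # mass left on the "stick"
    for k in range(K - 1):
        U[:, k] = X[:, k] / remaining
        remaining = remaining - X[:, k]
    return U

U = serial_transform(X)

# The original components show strong negative correlation; the
# transformed ones are (empirically) uncorrelated.
print("correlation of original components:")
print(np.corrcoef(X[:, :-1], rowvar=False).round(2))
print("correlation of transformed components:")
print(np.corrcoef(U, rowvar=False).round(2))
```

Pearson correlation only indicates linear decorrelation; verifying full mutual independence, as the paper does, would require a measure such as distance correlation applied to the transformed components.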

Original language: English
Article number: 7676372
Journal: IEEE Transactions on Neural Networks and Learning Systems
Volume: 29
Issue number: 1
Pages (from-to): 129-143
Number of pages: 15
ISSN: 2162-237X
DOIs
Publication status: Published - Jan 2018

Keywords

  • Decorrelation
  • Dirichlet variable
  • neutral vector
  • neutrality
  • non-Gaussian

