TY - JOUR

T1 - On Hölder Projective Divergences

AU - Nielsen, Frank

AU - Sun, Ke

AU  - Marchand-Maillet, Stéphane

N1 - KAUST Repository Item: Exported on 2020-10-01
Acknowledgements: The authors gratefully thank the referees for their comments. Ke Sun is funded by King Abdullah University of Science and Technology (KAUST).

PY - 2017/3/16

Y1 - 2017/3/16

N2  - We describe a framework to build distances by measuring the tightness of inequalities and introduce the notion of proper statistical divergences and improper pseudo-divergences. We then consider the Hölder ordinary and reverse inequalities and present two novel classes of Hölder divergences and pseudo-divergences that both encapsulate the special case of the Cauchy-Schwarz divergence. We report closed-form formulas for those statistical dissimilarities when considering distributions belonging to the same exponential family provided that the natural parameter space is a cone (e.g., multivariate Gaussians) or affine (e.g., categorical distributions). Those new classes of Hölder distances are invariant to rescaling and thus do not require distributions to be normalized. Finally, we show how to compute statistical Hölder centroids with respect to those divergences and carry out center-based clustering toy experiments on a set of Gaussian distributions which demonstrate empirically that symmetrized Hölder divergences outperform the symmetric Cauchy-Schwarz divergence.

AB  - We describe a framework to build distances by measuring the tightness of inequalities and introduce the notion of proper statistical divergences and improper pseudo-divergences. We then consider the Hölder ordinary and reverse inequalities and present two novel classes of Hölder divergences and pseudo-divergences that both encapsulate the special case of the Cauchy-Schwarz divergence. We report closed-form formulas for those statistical dissimilarities when considering distributions belonging to the same exponential family provided that the natural parameter space is a cone (e.g., multivariate Gaussians) or affine (e.g., categorical distributions). Those new classes of Hölder distances are invariant to rescaling and thus do not require distributions to be normalized. Finally, we show how to compute statistical Hölder centroids with respect to those divergences and carry out center-based clustering toy experiments on a set of Gaussian distributions which demonstrate empirically that symmetrized Hölder divergences outperform the symmetric Cauchy-Schwarz divergence.

UR - http://hdl.handle.net/10754/624040

UR - http://www.mdpi.com/1099-4300/19/3/122

UR - http://www.scopus.com/inward/record.url?scp=85024381563&partnerID=8YFLogxK

U2 - 10.3390/e19030122

DO - 10.3390/e19030122

M3 - Article

VL - 19

SP - 122

JO - Entropy

JF - Entropy

SN - 1099-4300

IS - 3

ER -