Appears in collection: GDR ISIS - Transport Optimal et Apprentissage Statistique
The Fisher information metric provides a Riemannian framework for comparing probability distributions within a parametric family. The best-known example is the univariate Gaussian family, on which the Fisher information geometry amounts to hyperbolic geometry. In this talk we will investigate the Fisher information geometry of Dirichlet distributions and show that it is negatively curved and geodesically complete. This guarantees the uniqueness of the Riemannian mean and makes it a suitable setting for the Riemannian K-means algorithm. We illustrate the use of the Fisher information geometry of beta distributions, a particular case of Dirichlet distributions, to compare and classify histograms of medical data.
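As a concrete illustration (not part of the abstract itself), the Fisher information metric of the Dirichlet family admits a closed form: for Dirichlet parameters α = (α_1, …, α_K), the metric tensor is G_ij(α) = ψ'(α_i) δ_ij − ψ'(α_1 + … + α_K), where ψ' is the trigamma function. The following minimal sketch computes this matrix with NumPy/SciPy; the function name is ours, not from any library discussed in the talk.

```python
import numpy as np
from scipy.special import polygamma


def dirichlet_fisher_information(alpha):
    """Fisher information matrix of Dirichlet(alpha).

    Closed form: G_ij = psi'(alpha_i) * delta_ij - psi'(sum(alpha)),
    where psi' is the trigamma function (polygamma of order 1).
    """
    alpha = np.asarray(alpha, dtype=float)
    trigamma_diag = polygamma(1, alpha)          # psi'(alpha_i) on the diagonal
    trigamma_total = polygamma(1, alpha.sum())   # psi'(alpha_0), subtracted everywhere
    return np.diag(trigamma_diag) - trigamma_total


# Example: the beta distribution Beta(a, b) is the K = 2 Dirichlet case,
# so its Fisher metric is the 2x2 instance of the same formula.
G = dirichlet_fisher_information([2.0, 3.0])
```

The resulting matrix is symmetric positive definite, as a Riemannian metric must be; the beta case used for the medical-histogram application is exactly the K = 2 instance.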