Information Geometry is the differential geometric study of the manifold of probability density functions. Divergence functions (such as Bregman or KL divergence), as measures of proximity on this manifold, play an important role in machine learning, statistical inference, optimization, etc. This talk will explain the various geometric structures induced by any divergence function. Most importantly, a Riemannian metric (Fisher information) together with a family of torsion-free affine connections (alpha-connections) can be induced on the manifold; this is the so-called "statistical structure" of Information Geometry. Divergence functions also induce other important structures and quantities, such as bi-orthogonal coordinates (namely the expectation and natural parameters), a parallel volume form (used in modeling Bayesian priors), a symplectic structure (for Hamiltonian dynamics, as in the HMC algorithm), and a Kähler/para-Kähler structure that synthesizes the metric, symplectic, and complex structures on the manifold.
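
As a rough illustration of the first point (not part of the talk itself): differentiating a divergence function twice in its second argument and evaluating on the diagonal recovers the induced Riemannian metric, and for the KL divergence on the Bernoulli family this is exactly the Fisher information 1/(theta(1-theta)). The short JAX sketch below checks this numerically; the names kl_bernoulli and induced_metric are illustrative, not from the talk.

    # Sketch (assumed example, not the speaker's code): the metric induced by a
    # divergence D(theta, theta') is the second derivative in theta' on the
    # diagonal theta' = theta. For Bernoulli + KL this is the Fisher information.
    import jax
    import jax.numpy as jnp

    def kl_bernoulli(theta, theta_prime):
        """KL divergence D(p_theta || p_theta') between Bernoulli distributions."""
        return (theta * jnp.log(theta / theta_prime)
                + (1.0 - theta) * jnp.log((1.0 - theta) / (1.0 - theta_prime)))

    def induced_metric(theta):
        """g(theta) = d^2/dtheta'^2 D(theta, theta') evaluated at theta' = theta."""
        d2 = jax.grad(jax.grad(kl_bernoulli, argnums=1), argnums=1)
        return d2(theta, theta)

    theta = 0.3
    print(induced_metric(theta))           # ~ 4.7619
    print(1.0 / (theta * (1.0 - theta)))   # Fisher information of Bernoulli: 4.7619...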
