Information Geometry: From Divergence Function to Metric, Equiaffine, Symplectic, and Kähler Structures on a Manifold
Monday, October 28, 2013 - 2:00pm - 2:50pm
Jun Zhang (University of Michigan)
Divergence functions, which serve as a proximity measure on a smooth manifold and often act as a surrogate for the (symmetric) metric function, play an important role in machine learning, statistical inference, optimization, etc. This talk will review the various geometric structures induced by a divergence function defined on a manifold. Most importantly, a Riemannian metric together with a pair of torsion-free affine connections can be induced on the manifold, the so-called “statistical structure” of Information Geometry. Additional structures may emerge depending on the functional form of the divergence. A general family of divergence functions can be constructed from a smooth and strictly convex function (Zhang, 2004). Such divergence functions result in a manifold equipped with (i) a pair of biorthogonal coordinates, and hence a Hessian structure, reflecting “reference-representation biduality”; (ii) an equiaffine structure, so that parallel volume forms exist; (iii) a symplectic structure, so that Legendre maps are induced; (iv) a complex (and hence Kähler) structure, so that complex coordinates can be introduced to represent a pair of points on the manifold. Computational advantages of such divergence functions will be discussed.
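As a small numerical illustration (not taken from the talk), the sketch below uses the Bregman divergence generated by a smooth strictly convex function f, here the negative entropy, and checks that the Riemannian metric induced by the divergence, g_ij(p) = -∂²D(x,y)/∂x_i∂y_j at x = y = p, equals the Hessian of f. The function names and the finite-difference scheme are illustrative choices, not anything specified in the abstract.

```python
import numpy as np

def f(x):
    # Negative entropy: a smooth, strictly convex function on the positive orthant.
    return np.sum(x * np.log(x))

def grad_f(x):
    return np.log(x) + 1.0

def bregman(x, y):
    # Bregman divergence D_f(x, y) = f(x) - f(y) - <grad f(y), x - y>.
    # For negative entropy this is the generalized Kullback-Leibler divergence.
    return f(x) - f(y) - grad_f(y) @ (x - y)

def induced_metric(p, h=1e-5):
    # Approximate g_ij(p) = -d/dx_i d/dy_j D(x, y) at x = y = p
    # with a central finite difference in each of the two arguments.
    n = len(p)
    g = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            ei = np.zeros(n); ei[i] = h
            ej = np.zeros(n); ej[j] = h
            g[i, j] = -(bregman(p + ei, p + ej) - bregman(p + ei, p - ej)
                        - bregman(p - ei, p + ej) + bregman(p - ei, p - ej)) / (4 * h * h)
    return g

p = np.array([0.2, 0.5, 0.3])
g = induced_metric(p)
# The Hessian of the negative entropy at p is diag(1/p), i.e. the
# Fisher metric in these coordinates; g should match it closely.
print(np.round(g, 3))
```

The same recipe applies to any smooth strictly convex generator: swapping f (and grad_f) changes the divergence, and the induced metric is always the Hessian of f, which is the Hessian structure referred to in the abstract.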