Information Geometry: Unified Framework for Information, Machine Learning, and Statistical Inference – UROP Spring Symposium 2021

Xun Wang

Research Mentor(s): Jun Zhang, Research Assistant
Research Mentor School/College/Department: Psychology and Statistics, College of Literature, Science, and the Arts
Presentation Date: Thursday, April 22, 2021
Session: Session 1 (10am-10:50am)
Breakout Room: Room 15
Presenter: 5

Abstract

This study centers on computing KL divergences between probability density functions under different parametrizations. We compute second and third derivatives of the divergence under different metrics, in particular the affine and Levi-Civita connections, in order to characterize the flatness of the statistical manifolds we work on. Given the nature of this independent study, most of these computations reproduce results that have already been proven. We computed KL divergences, Jacobians, and Fisher information matrices under different parametrizations. We then derived integral forms for the Fisher information matrices, affine connections, and dual connections, as well as the derivative forms governing the coordinate transformation of these quantities. We found that in the natural (exponential) coordinates the affine connections vanish everywhere, while in the expectation coordinates the dual connections vanish; in either case, we are working with a flat space. No comparable structure was found in the standard (mean, standard deviation) coordinates.
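As an illustration of one of the computations described above, the sketch below (our own reconstruction of the setup, not the authors' code) numerically checks a standard fact used in this kind of study: for the univariate normal family, the Fisher information matrix in (mean, standard deviation) coordinates equals the Hessian of the KL divergence with respect to the second argument, evaluated at coincident parameters.

```python
import numpy as np

def kl_normal(mu1, s1, mu2, s2):
    # Closed-form KL divergence KL(N(mu1, s1^2) || N(mu2, s2^2)).
    return np.log(s2 / s1) + (s1**2 + (mu1 - mu2)**2) / (2 * s2**2) - 0.5

def fisher_normal(mu, s):
    # Known Fisher information matrix of N(mu, s^2) in (mu, s) coordinates:
    # diag(1/s^2, 2/s^2).
    return np.array([[1 / s**2, 0.0], [0.0, 2 / s**2]])

def fisher_from_kl(mu, s, h=1e-4):
    # Recover the Fisher matrix numerically as the Hessian of
    # KL(p_theta || p_theta') in theta', evaluated at theta' = theta,
    # using central finite differences with step h.
    def f(dmu, ds):
        return kl_normal(mu, s, mu + dmu, s + ds)
    F = np.empty((2, 2))
    F[0, 0] = (f(h, 0) - 2 * f(0, 0) + f(-h, 0)) / h**2
    F[1, 1] = (f(0, h) - 2 * f(0, 0) + f(0, -h)) / h**2
    F[0, 1] = F[1, 0] = (f(h, h) - f(h, -h) - f(-h, h) + f(-h, -h)) / (4 * h**2)
    return F

if __name__ == "__main__":
    mu, s = 0.3, 1.5
    print(np.round(fisher_from_kl(mu, s), 4))
    print(fisher_normal(mu, s))
```

The two printed matrices agree up to finite-difference error, confirming that the Fisher metric arises as the quadratic term in the expansion of the KL divergence around coincident distributions.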

Authors: Xun Wang, Yifan Lu
Research Method: Computer Programming
