Interpretable Maximal Discrepancy Metrics for Analyzing and Improving Generative Models

Office of Naval Research, Grant # N00014-21-1-2300, Principal Investigator: Austin J. Brockmeier, 4/2021–4/2024.

Overview

Divergence measures quantify the dissimilarity between probability distributions (in some cases defining a proper distance) and are fundamental to hypothesis testing, information theory, and the estimation and criticism of statistical models. Recently, there has been renewed interest in divergences in the context of generative adversarial networks (GANs). While a multitude of divergences exist, they vary in their characteristics. Importantly, not all divergences are equally interpretable: a divergence between samples is considered interpretable if it directly answers the question “Which instances best exhibit the discrepancy between the samples?”

In other words, where do the distributions differ?
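
As an illustrative sketch (not necessarily the project's specific method), one divergence that admits this kind of instance-level interpretation is the kernel maximum mean discrepancy (MMD): its witness function assigns each point a score whose magnitude indicates how strongly that point reflects the difference between the two samples. The snippet below assumes a Gaussian RBF kernel and synthetic data; the function names and bandwidth parameter `gamma` are hypothetical choices for this example.

```python
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    """Gaussian RBF kernel matrix between rows of A and rows of B."""
    sq = (A**2).sum(1)[:, None] + (B**2).sum(1)[None, :] - 2 * A @ B.T
    return np.exp(-gamma * sq)

def mmd_witness(X, Y, Z, gamma=1.0):
    """Evaluate the (unnormalized) MMD witness function at points Z:
    f(z) = mean_i k(z, x_i) - mean_j k(z, y_j).
    Large |f(z)| marks points where the two samples disagree most."""
    return rbf_kernel(Z, X, gamma).mean(1) - rbf_kernel(Z, Y, gamma).mean(1)

# Toy example: two Gaussian samples with shifted means.
rng = np.random.default_rng(0)
X = rng.normal(0.0, 1.0, size=(500, 2))   # sample from P
Y = rng.normal(1.5, 1.0, size=(500, 2))   # sample from Q
pooled = np.vstack([X, Y])

scores = mmd_witness(X, Y, pooled, gamma=0.5)
# Instances that best exhibit the discrepancy, in either direction.
top_instances = pooled[np.argsort(-np.abs(scores))[:5]]
print(top_instances)
```

Ranking points by the witness function in this way directly answers the question posed above: the highest-scoring instances are those where one distribution places noticeably more mass than the other.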