Minimization problems based on relative $\alpha$-entropy I: Forward projection
M. Ashok Kumar, Rajesh Sundaresan
Published in: 2015
Volume: 61
Issue: 9
Pages: 5063 - 5080
Abstract
Minimization problems with respect to a one-parameter family of generalized relative entropies are studied. These relative entropies, which we term relative $\alpha$-entropies (denoted $I_\alpha$), arise as redundancies under mismatched compression when cumulants of compressed lengths are considered instead of expected compressed lengths. These parametric relative entropies are a generalization of the usual relative entropy (Kullback-Leibler divergence). Just like relative entropy, these relative $\alpha$-entropies behave like squared Euclidean distance and satisfy the Pythagorean property. Minimizers of these relative $\alpha$-entropies on closed and convex sets are shown to exist. Such minimizations generalize the maximum Rényi or Tsallis entropy principle. The minimizing probability distribution (termed forward $I_\alpha$-projection) for a linear family is shown to obey a power-law. Other results in connection with statistical inference, namely subspace transitivity and iterated projections, are also established. In a companion paper, a related minimization problem of interest in robust statistics that leads to a reverse $I_\alpha$-projection is studied.
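For orientation, here is a sketch of the central quantity, assuming the definition used in Sundaresan's earlier work on guessing and mismatched source coding; the normalization below is a reconstruction, not a quotation from this paper. For distributions $P$ and $Q$ on a finite alphabet, with $\|q\|_\alpha = \big(\sum_x q(x)^\alpha\big)^{1/\alpha}$,
\[
  I_\alpha(P, Q) = \frac{\alpha}{1-\alpha} \log \sum_{x} p(x) \left( \frac{q(x)}{\|q\|_\alpha} \right)^{\alpha-1} - \frac{1}{1-\alpha} \log \sum_{x} p(x)^{\alpha}.
\]
As $\alpha \to 1$ this recovers the Kullback-Leibler divergence $D(P \| Q)$, and $I_\alpha(P, P) = 0$, consistent with the abstract's claim that these divergences generalize relative entropy. The Pythagorean property mentioned above is then, by analogy with squared Euclidean distance, an inequality of the form $I_\alpha(P, Q) \ge I_\alpha(P, P^*) + I_\alpha(P^*, Q)$ for all $P$ in the closed convex set, where $P^*$ is the forward $I_\alpha$-projection of $Q$ onto that set.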
About the journal
Journal: IEEE Transactions on Information Theory
ISSN: 0018-9448