### Details

#### Title

Information geometry of divergence functions

#### Journal title

Bulletin of the Polish Academy of Sciences: Technical Sciences

#### Yearbook

2010

#### Number

No. 1, March

#### Publication authors

#### Divisions of PAS

Nauki Techniczne (Technical Sciences)

#### Publisher

Polish Academy of Sciences

#### Date

2010

#### Identifier

ISSN 0239-7528, eISSN 2300-1917

#### References

- Amari S. (2000), Methods of Information Geometry.
- Cichocki A. (2009), Nonnegative Matrix and Tensor Factorizations.
- Nielsen F. (2009), Emerging trends in visual computing, Lecture Notes in Computer Science, 6.
- Bregman L. (1967), The relaxation method of finding a common point of convex sets and its application to the solution of problems in convex programming, USSR Comput. Math. Math. Phys., 7, 200.
- Banerjee A. (2005), Clustering with Bregman divergences, J. Machine Learning Research, 6, 1705.
- Ali M. (1966), A general class of coefficients of divergence of one distribution from another, J. Royal Statistical Society B, 28, 131.
- Csiszár I. (1967), Information-type measures of difference of probability distributions and indirect observations, Studia Sci. Math. Hungar., 2, 299.
- Csiszár I. (1974), Information measures: a critical survey, 1, 83.
- Taneja I. (2004), Relative information of type *s*, Csiszár's *f*-divergence, and information inequalities, Information Sciences, 166, 105.
- Csiszár I. (1991), Why least squares and maximum entropy? An axiomatic approach to inference for linear problems, Annals of Statistics, 19, 2032.
- Chentsov N. (1972), Statistical Decision Rules and Optimal Inference.
- Csiszár I. (2008), Axiomatic characterizations of information measures, Entropy, 10, 261.
- Pistone G. (1995), An infinite-dimensional geometric structure on the space of all the probability measures equivalent to a given one, Annals of Statistics, 23, 1543.
- Amari S. (2009), Alpha-divergence is unique, belonging to both classes of *f*-divergence and Bregman divergence, IEEE Trans. Information Theory, 55, 4925.
- Tsallis C. (1988), Possible generalization of Boltzmann-Gibbs statistics, J. Stat. Phys., 52, 479.
- Rényi A. (1961), On measures of entropy and information, 1, 547.
- Naudts J. (2004), Estimators, escort probabilities, and phi-exponential families in statistical physics, J. Ineq. Pure Appl. Math., 5, 102.
- Naudts J. (2008), Generalized exponential families and associated entropy functions, Entropy, 10, 131.
- Suyari H. (2006), Mathematical structures derived from the *q*-multinomial coefficient in Tsallis statistics, Physica A, 368, 63.
- Amari S. (2009), Information geometry and its applications: convex function and dually flat manifold, Emerging Trends in Visual Computing, Lecture Notes in Computer Science, 5416.
- Grasselli M. (2004), Duality, monotonicity and Wigner-Yanase-Dyson metrics, Infinite Dimensional Analysis, Quantum Probability and Related Topics, 7, 215.
- Hasegawa H. (1993), α-divergence of the non-commutative information geometry, Reports on Mathematical Physics, 33, 87.
- Petz D. (1996), Monotone metrics on matrix spaces, Linear Algebra and its Applications, 244, 81.
- Dhillon I. (2007), Matrix nearness problems with Bregman divergences, SIAM J. on Matrix Analysis and Applications, 29, 1120.
- Nesterov Yu. (2002), On the Riemannian geometry defined by self-concordant barriers and interior-point methods, Foundations of Computational Mathematics, 2, 333.
- Ohara A. and Tsuchiya T., An Information Geometric Approach to Polynomial-time Interior-point Algorithms, (to be published).
- Ohara A. (1999), Information geometric analysis of an interior point method for semidefinite programming, Geometry in Present Day Science, 1, 49.
- Murata N. (2004), Information geometry of *U*-boost and Bregman divergence, Neural Computation, 16, 1437.
- Eguchi S. (2002), A class of logistic-type discriminant functions, Biometrika, 89, 1.
- Minami M. (2002), Robust blind source separation by beta-divergence, Neural Computation, 14, 1859.
- Fujisawa H. (2008), Robust parameter estimation with a small bias against heavy contamination, J. Multivariate Analysis, 99, 2053.
- Havrda J. (1967), Quantification method of classification processes. Concept of structural α-entropy, Kybernetika, 3, 30.
- Chernoff H. (1952), A measure of asymptotic efficiency for tests of a hypothesis based on a sum of observations, Annals of Mathematical Statistics, 23, 493.
- Amari S. (2007), Integration of stochastic models by minimizing α-divergence, Neural Computation, 19, 2780.
- Matsuyama Y. (2003), The α-EM algorithm: surrogate likelihood maximization using α-logarithmic information measures, IEEE Trans. on Information Theory, 49, 692.
- Zhang J. (2004), Divergence function, duality, and convex analysis, Neural Computation, 16, 159.
- Eguchi S. (1983), Second order efficiency of minimum contrast estimators in a curved exponential family, Annals of Statistics, 11, 793.

#### DOI

10.2478/v10175-010-0019-1