Authors: Shivakumar Jolad¹; Ahmed Roman²; Mahesh C. Shastry³; Mihir Gadgil⁴ and Ayanendranath Basu⁵
Affiliations: ¹ Indian Institute of Technology Gandhinagar, India; ² Virginia Tech, United States; ³ Indian Institute of Science Education and Research Bhopal, India; ⁴ Oregon Health & Science University, United States; ⁵ Indian Statistical Institute, India
Keyword(s):
Divergence Measures, Bhattacharyya Distance, Error Probability, F-divergence, Pattern Recognition, Signal Detection, Signal Classification.
Related Ontology Subjects/Areas/Topics: Applications; Bayesian Models; Classification; Gaussian Processes; Object Recognition; Pattern Recognition; Software Engineering; Theory and Methods
Abstract:
We introduce a new one-parameter family of divergence measures, called bounded Bhattacharyya distance (BBD) measures, for quantifying the dissimilarity between probability distributions. These measures are bounded, symmetric and positive semi-definite, and do not require absolute continuity. In the asymptotic limit, the BBD measure approaches the squared Hellinger distance. A generalized BBD measure for multiple distributions is also introduced. We prove an extension of a theorem of Bradt and Karlin for BBD, relating Bayes error probability and divergence ranking. We show that BBD belongs to the class of generalized Csiszár f-divergences and derive some of its properties, such as its curvature and its relation to Fisher information. For distributions with vector-valued parameters, the curvature matrix is related to the Fisher-Rao metric. We derive certain inequalities between BBD and well-known measures such as the Hellinger and Jensen-Shannon divergences, and we also derive bounds on the Bayesian error probability. We give an application of these measures to the problem of signal detection, where we compare two monochromatic signals buried in white noise and differing in frequency and amplitude.
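The abstract does not reproduce the BBD formula itself, so the exact measure is not sketched here. As a rough, illustrative companion to the quantities it refers to, the following minimal Python sketch computes the Bhattacharyya coefficient ρ = Σ √(p q) and the squared Hellinger distance H² = 1 − ρ for two discretized Gaussian densities; per the abstract, the BBD family is bounded and approaches the squared Hellinger distance in an asymptotic limit. The function names and the grid-based discretization are illustrative assumptions, not taken from the paper.

```python
import numpy as np

# Illustrative sketch (not the paper's code): Bhattacharyya coefficient
# rho = sum_x sqrt(p(x) q(x)) and squared Hellinger distance H^2 = 1 - rho
# for two discretized probability distributions on a common grid.

def bhattacharyya_coefficient(p, q):
    """Bhattacharyya coefficient of two discrete distributions p and q."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    return np.sum(np.sqrt(p * q))

def squared_hellinger(p, q):
    """Squared Hellinger distance, H^2 = 1 - rho."""
    return 1.0 - bhattacharyya_coefficient(p, q)

# Example: two unit-variance Gaussians differing in mean, discretized on a grid.
x = np.linspace(-10.0, 10.0, 2001)
dx = x[1] - x[0]

def gaussian_pmf(mu, sigma):
    pdf = np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))
    return pdf * dx / np.sum(pdf * dx)  # normalize to a discrete distribution

p = gaussian_pmf(0.0, 1.0)
q = gaussian_pmf(1.0, 1.0)

rho = bhattacharyya_coefficient(p, q)
print(f"Bhattacharyya coefficient rho = {rho:.4f}")
print(f"Squared Hellinger distance    = {squared_hellinger(p, q):.4f}")
# For equal-variance Gaussians, rho = exp(-(mu1 - mu2)^2 / (8 sigma^2)),
# so here rho ≈ exp(-1/8) ≈ 0.8825 and H^2 ≈ 0.1175.
```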