The Alpha-Beta-Symmetric divergence and its positive definite kernels
In statistical modelling, distance and divergence measures are widely used tools for theoretical and applied problems in statistical inference and data processing. In this paper, we deal with the well-known Alpha-Beta-divergences (which we shall refer to as AB-divergences), a family of cost functions parametrized by two hyperparameters, and their tight connections with the notions of Hilbertian metrics and positive definite (pd) kernels on probability measures. We describe this dissimilarity measure, which can be symmetrized using its two tuning parameters, alpha and beta. We compute the degree of symmetry of the AB-divergence on the basis of Hilbertian metrics, and we investigate the properties required to build a positive definite kernel corresponding to this symmetric AB-divergence.
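For reference, the AB-divergence family is commonly written in the parametrization of Cichocki, Cruces and Amari; the form below is that standard version, shown as a sketch only, since the paper's exact conventions may differ. Particular choices of (alpha, beta) recover classical divergences such as the Hellinger and Itakura-Saito divergences mentioned later in the abstract.

```latex
% AB-divergence between discrete measures P = (p_i) and Q = (q_i),
% standard parametrization (for alpha, beta, alpha+beta all nonzero):
D_{AB}^{(\alpha,\beta)}(P \,\|\, Q)
  = -\frac{1}{\alpha\beta} \sum_{i}
    \left( p_i^{\alpha} q_i^{\beta}
         - \frac{\alpha}{\alpha+\beta}\, p_i^{\alpha+\beta}
         - \frac{\beta}{\alpha+\beta}\, q_i^{\alpha+\beta} \right),
  \qquad \alpha,\ \beta,\ \alpha+\beta \neq 0 .
```

Swapping alpha and beta exchanges the roles of P and Q, which is why the two hyperparameters can be used to control and symmetrize the divergence.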
We establish the effectiveness of our approach with experiments conducted on Support Vector Machines (SVM), and we describe an algorithm, built from this symmetric divergence, for image classification.
We perform experiments using the conditionally positive definite and kernel-transformed divergences, and show that the resulting kernels yield the same proportion of errors for the Euclidean divergence and the Hellinger divergence. We also observe large reductions in classification error for the Itakura-Saito divergence with the proposed kernel compared with classical kernel methods.
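As an illustration of building a positive definite kernel on probability measures, the sketch below computes the Hellinger (Bhattacharyya-coefficient) Gram matrix, one of the classical cases related to the symmetrized AB-divergence, and checks that it is positive semidefinite. The function name and the plain-NumPy setup are ours, not the paper's.

```python
import numpy as np

def hellinger_kernel(P, Q):
    """Bhattacharyya-coefficient kernel k(p, q) = sum_i sqrt(p_i * q_i).

    Rows of P and Q are discrete probability vectors.  The Gram matrix
    is an inner product of element-wise square roots, hence a positive
    definite kernel on probability measures.
    """
    return np.sqrt(P) @ np.sqrt(Q).T

# Toy check: random probability vectors give a symmetric PSD Gram matrix
# with unit diagonal, since k(p, p) = sum_i p_i = 1.
rng = np.random.default_rng(0)
X = rng.random((5, 4))
X /= X.sum(axis=1, keepdims=True)             # normalise rows to sum to 1
K = hellinger_kernel(X, X)
assert np.allclose(K, K.T)
assert np.linalg.eigvalsh(K).min() > -1e-10   # PSD up to rounding
```

A Gram matrix of this kind can be fed directly to an SVM, for instance via scikit-learn's `SVC(kernel='precomputed')`, which is in the spirit of the SVM experiments described above.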
Author(s): Mactar Ndaw, Macoumba Ndour and Papa Ngom
Pages: 75-100
Year of publication: 2018
Journal: Journal of Mathematical Sciences: Advances and Applications (J. Math. Sci. Adv. Appl.)
Volume: Vol. 53, Issue 1
Type: Article
Posted online by: NGOM Papa