Distance Measure Machines - Laboratoire d'informatique fondamentale de Marseille
Preprints, Working Papers, ... Year: 2018

Distance Measure Machines

Abstract

This paper presents a distance-based discriminative framework for learning with probability distributions. Instead of using kernel mean embeddings or generalized radial basis kernels, we introduce embeddings based on the dissimilarity of distributions to some reference distributions denoted as templates. Our framework extends the similarity theory of \citet{balcan2008theory} to the population distribution case, and we show that, for some learning problems, certain dissimilarities between distributions achieve low-error linear decision functions with high probability. Our key result is to prove that the theory also holds for empirical distributions. Algorithmically, the proposed approach consists in computing a mapping, based on pairwise dissimilarities to the templates, in which learning a linear decision function is amenable. Our experimental results show that the Wasserstein distance embedding performs better than kernel mean embeddings, and that computing the Wasserstein distance is far more tractable than estimating pairwise Kullback-Leibler divergences between empirical distributions.
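
The pipeline described in the abstract can be sketched in a few lines: embed each empirical distribution by its vector of Wasserstein distances to a small set of template distributions, then learn a linear classifier on those features. The snippet below is a minimal illustrative sketch, not the authors' reference implementation; it assumes the POT library (`ot`) and scikit-learn are available, that each distribution is given as a set of sample points with uniform weights, that templates are simply a few training distributions picked at random, and that the ground cost is squared Euclidean.

```python
# Illustrative sketch of a Wasserstein distance embedding, based on the idea
# described in the abstract (assumptions: POT + scikit-learn, uniform weights,
# squared-Euclidean ground cost, templates drawn from the training set).
import numpy as np
import ot
from sklearn.svm import LinearSVC

def wasserstein_embedding(samples, templates):
    """Map each empirical distribution (an (n_i, d) array of points) to the
    vector of its exact Wasserstein (EMD) distances to the template clouds."""
    features = np.zeros((len(samples), len(templates)))
    for i, X in enumerate(samples):
        a = np.full(len(X), 1.0 / len(X))        # uniform weights on sample points
        for j, T in enumerate(templates):
            b = np.full(len(T), 1.0 / len(T))    # uniform weights on template points
            M = ot.dist(X, T)                    # squared-Euclidean cost matrix
            features[i, j] = ot.emd2(a, b, M)    # exact optimal transport cost
    return features

# Hypothetical usage: `train_sets` / `test_sets` are lists of (n_i, d) arrays,
# one per distribution, with labels `y_train` / `y_test`.
# idx = np.random.choice(len(train_sets), size=10, replace=False)
# templates = [train_sets[k] for k in idx]
# clf = LinearSVC().fit(wasserstein_embedding(train_sets, templates), y_train)
# print(clf.score(wasserstein_embedding(test_sets, templates), y_test))
```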
Main file: WDMM.pdf (1.21 MB)
Origin: Files produced by the author(s)

Dates and versions

hal-01717940, version 1 (27-02-2018)
hal-01717940, version 2 (07-11-2018)

Cite

Alain Rakotomamonjy, Abraham Traore, Maxime Berar, Rémi Flamary, Nicolas Courty. Distance Measure Machines. 2018. ⟨hal-01717940v1⟩