Distance Metric Learning through the Maximization of the Jeffrey Divergence (DMLMJ)

An information-theoretic distance metric learning algorithm. It learns a linear transformation that maximizes the Jeffrey divergence between the Gaussian distributions associated with the difference spaces of same-class neighbors and different-class neighbors, respectively. The algorithm can also be used for dimensionality reduction.
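The idea above can be sketched as follows. This is a minimal illustration, not the library's implementation: it builds the two difference spaces from k-nearest same-class and different-class neighbors, forms their second-moment matrices, and takes the generalized eigenvectors whose eigenvalues contribute most to the Jeffrey divergence (the function name `dmlmj_transform` and its parameters are hypothetical).

```python
import numpy as np
from scipy.linalg import eigh

def dmlmj_transform(X, y, k=3, dim=None, eps=1e-8):
    """Sketch of DMLMJ: learn a linear map W that maximizes the Jeffrey
    divergence between the same-class and different-class difference spaces."""
    X = np.asarray(X, dtype=float)
    y = np.asarray(y)
    n, d = X.shape
    dists = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=2)
    U, V = [], []  # same-class / different-class difference vectors
    for i in range(n):
        same = np.where((y == y[i]) & (np.arange(n) != i))[0]
        diff = np.where(y != y[i])[0]
        for j in same[np.argsort(dists[i, same])[:k]]:
            U.append(X[i] - X[j])
        for j in diff[np.argsort(dists[i, diff])[:k]]:
            V.append(X[i] - X[j])
    U, V = np.array(U), np.array(V)
    # Second-moment matrices of the two difference spaces
    # (small ridge eps keeps them positive definite).
    S_u = U.T @ U / len(U) + eps * np.eye(d)
    S_v = V.T @ V / len(V) + eps * np.eye(d)
    # Generalized eigenproblem S_v w = lam * S_u w.
    lam, W = eigh(S_v, S_u)
    # Keep the directions scoring highest on lam + 1/lam, which is large
    # when lam is far from 1 (i.e. the two distributions differ most).
    order = np.argsort(-(lam + 1.0 / lam))
    return W[:, order[: (dim or d)]]
```

Passing `dim` smaller than the input dimension yields the dimensionality-reduction variant mentioned above, since only the top-scoring eigenvectors are kept.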

See the full DMLMJ documentation here.

References

Bac Nguyen, Carlos Morell and Bernard De Baets. "Supervised distance metric learning through maximization of the Jeffrey divergence". In: Pattern Recognition 64 (2017), pp. 215-225.