| Package | Description | 
|---|---|
| de.lmu.ifi.dbs.elki.distance.distancefunction.probabilistic | Distances from probability theory, mostly divergences such as the Kullback-Leibler divergence, J-divergence, f-divergence, and χ² divergence. |
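
For orientation, most of the classes listed below are instances of the general f-divergence family. A standard textbook definition for two discrete distributions p and q (stated here for reference; this is not part of the ELKI API) is:

```latex
% General f-divergence of two discrete distributions p and q,
% where f is convex with f(1) = 0:
D_f(p \,\|\, q) = \sum_i q_i \, f\!\left(\frac{p_i}{q_i}\right)
% Named special cases (up to convention-dependent constant factors):
%   f(t) = t \log t          \Rightarrow \text{Kullback-Leibler divergence}
%   f(t) = (t - 1)^2         \Rightarrow \text{Pearson } \chi^2 \text{ divergence}
%   f(t) = (\sqrt{t} - 1)^2  \Rightarrow \text{squared Hellinger distance}
```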
| Class | Description |
|---|---|
| ChiDistanceFunction | χ distance function, symmetric version. |
| ChiSquaredDistanceFunction | χ² distance function, symmetric version. |
| FisherRaoDistanceFunction | Fisher-Rao Riemannian metric for (discrete) probability distributions. |
| HellingerDistanceFunction | Hellinger metric / affinity / kernel, also known as the Bhattacharyya coefficient, fidelity similarity, Matusita distance, or Hellinger-Kakutani metric on probability distributions. |
| JeffreyDivergenceDistanceFunction | Jeffrey divergence for NumberVectors, a symmetric, smoothed version of the KullbackLeiblerDivergenceAsymmetricDistanceFunction. |
| JensenShannonDivergenceDistanceFunction | Jensen-Shannon divergence for NumberVectors, a symmetric, smoothed version of the KullbackLeiblerDivergenceAsymmetricDistanceFunction. |
| KullbackLeiblerDivergenceAsymmetricDistanceFunction | Kullback-Leibler divergence, also known as relative entropy, information deviation, or simply KL-distance (asymmetric). |
| KullbackLeiblerDivergenceReverseAsymmetricDistanceFunction | Reverse Kullback-Leibler divergence: the same relative entropy with the arguments swapped (also asymmetric). |
| SqrtJensenShannonDivergenceDistanceFunction | The square root of the Jensen-Shannon divergence, which is a metric. |
| TriangularDiscriminationDistanceFunction | Triangular discrimination; has relatively tight upper and lower bounds on the Jensen-Shannon divergence, but is much less expensive to compute. |
| TriangularDistanceFunction | Triangular distance; has relatively tight upper and lower bounds on the (square root of the) Jensen-Shannon divergence, but is much less expensive to compute. |
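
The sketch below illustrates textbook forms of several of the divergences listed above, for two discrete distributions given as `double[]` arrays. This is a minimal standalone sketch, not the ELKI implementation: the class and method names are hypothetical, ELKI's classes add smoothing and other engineering details, and constant factors (e.g. for the symmetric χ² and the triangular variants) vary between references.

```java
/**
 * Minimal standalone sketch of common probabilistic divergences.
 * Hypothetical names; not the ELKI API. Inputs are assumed to be
 * nonnegative and to sum to 1.
 */
public final class DivergenceSketch {
  /** Kullback-Leibler divergence KL(p||q) = sum_i p_i * log(p_i / q_i),
   *  using the convention 0 * log 0 = 0. Asymmetric in p and q. */
  static double kl(double[] p, double[] q) {
    double sum = 0;
    for(int i = 0; i < p.length; i++) {
      if(p[i] > 0 && q[i] > 0) {
        sum += p[i] * Math.log(p[i] / q[i]);
      }
    }
    return sum;
  }

  /** Reverse Kullback-Leibler divergence: KL with the arguments swapped. */
  static double klReverse(double[] p, double[] q) {
    return kl(q, p);
  }

  /** Jensen-Shannon divergence (KL(p||m) + KL(q||m)) / 2 with the midpoint
   *  m = (p+q)/2; symmetric, and "smoothed" because m is nonzero wherever
   *  p or q is. Its square root is a metric. */
  static double jensenShannon(double[] p, double[] q) {
    double[] m = new double[p.length];
    for(int i = 0; i < p.length; i++) {
      m[i] = 0.5 * (p[i] + q[i]);
    }
    return 0.5 * (kl(p, m) + kl(q, m));
  }

  /** Triangular discrimination sum_i (p_i - q_i)^2 / (p_i + q_i); this sum
   *  is also the usual symmetric chi-squared form, up to a factor of 2
   *  depending on the reference. Its square root is the triangular distance. */
  static double triangularDiscrimination(double[] p, double[] q) {
    double sum = 0;
    for(int i = 0; i < p.length; i++) {
      double s = p[i] + q[i];
      if(s > 0) {
        double d = p[i] - q[i];
        sum += d * d / s;
      }
    }
    return sum;
  }

  /** Bhattacharyya coefficient BC = sum_i sqrt(p_i * q_i). */
  static double bhattacharyyaCoefficient(double[] p, double[] q) {
    double bc = 0;
    for(int i = 0; i < p.length; i++) {
      bc += Math.sqrt(p[i] * q[i]);
    }
    return bc;
  }

  /** Hellinger distance H(p,q) = sqrt(1 - BC(p,q)). */
  static double hellinger(double[] p, double[] q) {
    // Clamping guards against tiny negative values from rounding.
    return Math.sqrt(Math.max(0, 1 - bhattacharyyaCoefficient(p, q)));
  }

  /** Fisher-Rao geodesic distance on the probability simplex:
   *  d(p,q) = 2 * arccos(BC(p,q)). */
  static double fisherRao(double[] p, double[] q) {
    return 2 * Math.acos(Math.min(1, bhattacharyyaCoefficient(p, q)));
  }

  public static void main(String[] args) {
    double[] p = { 0.1, 0.4, 0.5 };
    double[] q = { 0.3, 0.3, 0.4 };
    System.out.println("KL(p||q)   = " + kl(p, q));
    System.out.println("KL(q||p)   = " + klReverse(p, q));
    System.out.println("JSD(p,q)   = " + jensenShannon(p, q));
    System.out.println("TriDisc    = " + triangularDiscrimination(p, q));
    System.out.println("Hellinger  = " + hellinger(p, q));
    System.out.println("Fisher-Rao = " + fisherRao(p, q));
  }
}
```

Under the common convention, the Jeffrey divergence is twice the Jensen-Shannon form above (KL to the midpoint in both directions), which is why both are described as symmetric, smoothed variants of the Kullback-Leibler divergence.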
Copyright © 2019 ELKI Development Team.