Package elki.distance.probabilistic
Distances from probability theory, mostly divergences such as the Kullback-Leibler (KL) divergence, Jeffrey (J) divergence, F-divergence, χ² divergence, etc.
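To make the listed quantities concrete, the following minimal plain-Java sketch evaluates two of them in their common textbook forms: the (asymmetric) Pearson χ² divergence and the Hellinger distance via the Bhattacharyya coefficient. The classes in this package implement symmetric and numerically guarded variants, so the exact formulas and constants may differ; the class and method names below are purely illustrative and are not part of the ELKI API.

```java
public final class ChiHellingerSketch {
  // Pearson chi^2 divergence: chi2(p || q) = sum_i (p_i - q_i)^2 / q_i (asymmetric textbook form).
  // The ChiSquaredDistance class in this package is described as a symmetric version of this idea.
  static double pearsonChiSquared(double[] p, double[] q) {
    double sum = 0;
    for (int i = 0; i < p.length; i++) {
      if (q[i] > 0) {
        double d = p[i] - q[i];
        sum += d * d / q[i];
      }
    }
    return sum;
  }

  // Hellinger distance via the Bhattacharyya coefficient BC = sum_i sqrt(p_i * q_i):
  // H(p, q) = sqrt(1 - BC), a metric on probability vectors.
  static double hellinger(double[] p, double[] q) {
    double bc = 0;
    for (int i = 0; i < p.length; i++) {
      bc += Math.sqrt(p[i] * q[i]);
    }
    return Math.sqrt(Math.max(0., 1 - bc)); // clamp to guard against rounding error
  }

  public static void main(String[] args) {
    double[] p = { 0.5, 0.3, 0.2 };
    double[] q = { 0.4, 0.4, 0.2 };
    System.out.println("chi^2(p||q)    = " + pearsonChiSquared(p, q));
    System.out.println("Hellinger(p,q) = " + hellinger(p, q));
  }
}
```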
Class Summary

ChiDistance: χ distance function, symmetric version.
ChiDistance.Par: Parameterization class, using the static instance.
ChiSquaredDistance: χ² distance function, symmetric version.
ChiSquaredDistance.Par: Parameterization class, using the static instance.
FisherRaoDistance: Fisher-Rao Riemannian metric for (discrete) probability distributions.
FisherRaoDistance.Par: Parameterization class.
HellingerDistance: Hellinger metric / affinity / kernel, Bhattacharyya coefficient, fidelity similarity, Matusita distance, Hellinger-Kakutani metric on a probability distribution.
HellingerDistance.Par: Parameterization class.
JeffreyDivergenceDistance: Jeffrey Divergence for NumberVectors is a symmetric, smoothed version of the KullbackLeiblerDivergenceAsymmetricDistance.
JeffreyDivergenceDistance.Par: Parameterization class, using the static instance.
JensenShannonDivergenceDistance: Jensen-Shannon Divergence for NumberVectors is a symmetric, smoothed version of the KullbackLeiblerDivergenceAsymmetricDistance; see the sketch after this table.
JensenShannonDivergenceDistance.Par: Parameterization class, using the static instance.
KullbackLeiblerDivergenceAsymmetricDistance: Kullback-Leibler divergence, also known as relative entropy, information deviation, or just KL-distance (albeit asymmetric).
KullbackLeiblerDivergenceAsymmetricDistance.Par: Parameterization class, using the static instance.
KullbackLeiblerDivergenceReverseAsymmetricDistance: Kullback-Leibler divergence, also known as relative entropy, information deviation, or just KL-distance (albeit asymmetric), computed with the arguments reversed.
KullbackLeiblerDivergenceReverseAsymmetricDistance.Par: Parameterization class, using the static instance.
SqrtJensenShannonDivergenceDistance: The square root of the Jensen-Shannon divergence is a metric.
SqrtJensenShannonDivergenceDistance.Par: Parameterization class, using the static instance.
TriangularDiscriminationDistance: Triangular Discrimination has relatively tight upper and lower bounds to the Jensen-Shannon divergence, but is much less expensive.
TriangularDiscriminationDistance.Par: Parameterization class, using the static instance.
TriangularDistance: Triangular Distance has relatively tight upper and lower bounds to the (square root of the) Jensen-Shannon divergence, but is much less expensive.
TriangularDistance.Par: Parameterization class, using the static instance.
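The relationship between the asymmetric Kullback-Leibler divergence and its symmetric, smoothed relatives can be illustrated with a short self-contained sketch. It uses the common textbook definitions (KL against the mixture distribution, averaged, for Jensen-Shannon; Σ(p−q)²/(p+q) for the triangular discrimination); the ELKI classes may differ in constant factors and in how they guard against zero probabilities, so treat this as a conceptual sketch rather than a reproduction of their code.

```java
public final class DivergenceSketch {
  // Kullback-Leibler divergence KL(p || q) = sum_i p_i * log(p_i / q_i).
  // Terms with p_i == 0 contribute 0 by the usual convention.
  static double kl(double[] p, double[] q) {
    double sum = 0;
    for (int i = 0; i < p.length; i++) {
      if (p[i] > 0) {
        sum += p[i] * Math.log(p[i] / q[i]);
      }
    }
    return sum;
  }

  // Jensen-Shannon divergence: KL of each input against the mixture m = (p + q) / 2, averaged.
  // This is the "symmetric, smoothed" construction referred to in the class descriptions above.
  static double jensenShannon(double[] p, double[] q) {
    double[] m = new double[p.length];
    for (int i = 0; i < p.length; i++) {
      m[i] = 0.5 * (p[i] + q[i]);
    }
    return 0.5 * kl(p, m) + 0.5 * kl(q, m);
  }

  // Triangular discrimination: sum_i (p_i - q_i)^2 / (p_i + q_i), a cheap quantity
  // that bounds the Jensen-Shannon divergence from both sides.
  static double triangularDiscrimination(double[] p, double[] q) {
    double sum = 0;
    for (int i = 0; i < p.length; i++) {
      double s = p[i] + q[i];
      if (s > 0) {
        double d = p[i] - q[i];
        sum += d * d / s;
      }
    }
    return sum;
  }

  public static void main(String[] args) {
    double[] p = { 0.5, 0.3, 0.2 };
    double[] q = { 0.4, 0.4, 0.2 };
    System.out.println("KL(p||q)   = " + kl(p, q));
    System.out.println("JSD(p,q)   = " + jensenShannon(p, q));
    System.out.println("Delta(p,q) = " + triangularDiscrimination(p, q));
    // Only the square root of the Jensen-Shannon divergence satisfies the triangle inequality.
    System.out.println("sqrt(JSD)  = " + Math.sqrt(jensenShannon(p, q)));
  }
}
```

As noted in the table, only the square root of the Jensen-Shannon divergence is a metric, which is why SqrtJensenShannonDivergenceDistance exists as a separate class.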