Package elki.distance.probabilistic
Distances from probability theory, mostly divergences such as the KL divergence,
J-divergence, f-divergence, χ² divergence, etc.

Class Summary

ChiDistance: χ distance function, symmetric version.
ChiDistance.Par: Parameterization class, using the static instance.
ChiSquaredDistance: χ² distance function, symmetric version.
ChiSquaredDistance.Par: Parameterization class, using the static instance.
FisherRaoDistance: Fisher-Rao Riemannian metric for (discrete) probability distributions.
FisherRaoDistance.Par: Parameterization class.
HellingerDistance: Hellinger metric / affinity / kernel, Bhattacharyya coefficient, fidelity similarity, Matusita distance, Hellinger-Kakutani metric on a probability distribution.
HellingerDistance.Par: Parameterization class.
JeffreyDivergenceDistance: Jeffrey divergence for NumberVectors; a symmetric, smoothened version of the KullbackLeiblerDivergenceAsymmetricDistance.
JeffreyDivergenceDistance.Par: Parameterization class, using the static instance.
JensenShannonDivergenceDistance: Jensen-Shannon divergence for NumberVectors; a symmetric, smoothened version of the KullbackLeiblerDivergenceAsymmetricDistance.
JensenShannonDivergenceDistance.Par: Parameterization class, using the static instance.
KullbackLeiblerDivergenceAsymmetricDistance: Kullback-Leibler divergence, also known as relative entropy, information deviation, or just KL distance (albeit asymmetric).
KullbackLeiblerDivergenceAsymmetricDistance.Par: Parameterization class, using the static instance.
KullbackLeiblerDivergenceReverseAsymmetricDistance: Kullback-Leibler divergence, also known as relative entropy, information deviation, or just KL distance (albeit asymmetric).
KullbackLeiblerDivergenceReverseAsymmetricDistance.Par: Parameterization class, using the static instance.
SqrtJensenShannonDivergenceDistance: The square root of the Jensen-Shannon divergence is a metric.
SqrtJensenShannonDivergenceDistance.Par: Parameterization class, using the static instance.
TriangularDiscriminationDistance: Triangular Discrimination has relatively tight upper and lower bounds to the Jensen-Shannon divergence, but is much less expensive.
TriangularDiscriminationDistance.Par: Parameterization class, using the static instance.
TriangularDistance: Triangular Distance has relatively tight upper and lower bounds to the (square root of the) Jensen-Shannon divergence, but is much less expensive.
TriangularDistance.Par: Parameterization class, using the static instance.
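The relationships between several of these divergences can be sketched with plain Java on discrete probability vectors. This is an illustrative sketch of the textbook formulas only, not the ELKI implementations (which operate on NumberVector, are parameterized via the .Par classes, and may apply additional smoothing, scaling, and edge-case handling); the class and method names below are hypothetical.

```java
/**
 * Illustrative sketch (not ELKI code) of some divergences on discrete
 * probability vectors p and q, assumed non-negative and summing to 1.
 */
public class DivergenceSketch {
  /** Kullback-Leibler divergence D(p||q) = sum p_i * log(p_i / q_i); asymmetric. */
  static double kl(double[] p, double[] q) {
    double sum = 0;
    for(int i = 0; i < p.length; i++) {
      if(p[i] > 0) {
        sum += p[i] * Math.log(p[i] / q[i]);
      }
    }
    return sum;
  }

  /** Jeffrey divergence: symmetrized KL, D(p||q) + D(q||p). */
  static double jeffrey(double[] p, double[] q) {
    return kl(p, q) + kl(q, p);
  }

  /** Jensen-Shannon divergence: smoothed symmetric KL via midpoint m = (p+q)/2;
   *  its square root is a metric. */
  static double jensenShannon(double[] p, double[] q) {
    double[] m = new double[p.length];
    for(int i = 0; i < p.length; i++) {
      m[i] = 0.5 * (p[i] + q[i]);
    }
    return 0.5 * kl(p, m) + 0.5 * kl(q, m);
  }

  /** Triangular discrimination: sum (p_i - q_i)^2 / (p_i + q_i); a cheap
   *  bound on the Jensen-Shannon divergence. */
  static double triangularDiscrimination(double[] p, double[] q) {
    double sum = 0;
    for(int i = 0; i < p.length; i++) {
      double s = p[i] + q[i];
      if(s > 0) {
        double d = p[i] - q[i];
        sum += d * d / s;
      }
    }
    return sum;
  }

  public static void main(String[] args) {
    double[] p = { 0.5, 0.3, 0.2 }, q = { 0.4, 0.4, 0.2 };
    System.out.println("KL(p||q)  = " + kl(p, q));
    System.out.println("Jeffrey   = " + jeffrey(p, q));
    System.out.println("JS        = " + jensenShannon(p, q));
    System.out.println("Triangular= " + triangularDiscrimination(p, q));
  }
}
```

Note how the symmetric variants (Jeffrey, Jensen-Shannon) are built directly from the asymmetric KL divergence, mirroring how the class descriptions above relate JeffreyDivergenceDistance and JensenShannonDivergenceDistance to KullbackLeiblerDivergenceAsymmetricDistance.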