Class JensenShannonDivergenceDistance

  • All Implemented Interfaces:
    Distance<NumberVector>, NumberVectorDistance<NumberVector>, PrimitiveDistance<NumberVector>, SpatialPrimitiveDistance<NumberVector>
  • Direct Known Subclasses:
    SqrtJensenShannonDivergenceDistance

    @Reference(authors="J. Lin",title="Divergence measures based on the Shannon entropy",booktitle="IEEE Transactions on Information Theory 37(1)",url="https://doi.org/10.1109/18.61115",bibkey="DBLP:journals/tit/Lin91") @Reference(authors="D. M. Endres, J. E. Schindelin",title="A new metric for probability distributions",booktitle="IEEE Transactions on Information Theory 49(7)",url="https://doi.org/10.1109/TIT.2003.813506",bibkey="DBLP:journals/tit/EndresS03") @Reference(authors="M.-M. Deza, E. Deza",title="Dictionary of distances",booktitle="Dictionary of distances",url="https://doi.org/10.1007/978-3-642-00234-2",bibkey="doi:10.1007/978-3-642-00234-2")
    public class JensenShannonDivergenceDistance
    extends JeffreyDivergenceDistance
    Jensen-Shannon divergence for NumberVectors is a symmetric, smoothed version of the KullbackLeiblerDivergenceAsymmetricDistance.

    It is essentially the same as JeffreyDivergenceDistance, only scaled by one half; both are included for completeness.

    \[ \mathrm{JS}(\vec{x},\vec{y}) := \frac{1}{2} \sum_i \left( x_i \log\frac{2 x_i}{x_i + y_i} + y_i \log\frac{2 y_i}{x_i + y_i} \right) = \frac{1}{2} \mathrm{KL}\!\left(\vec{x}, \tfrac{1}{2}(\vec{x}+\vec{y})\right) + \frac{1}{2} \mathrm{KL}\!\left(\vec{y}, \tfrac{1}{2}(\vec{x}+\vec{y})\right) \]
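    Purely for illustration, the following is a minimal standalone Java sketch of this formula on plain double arrays; the class and method names are chosen here for exposition, and this is not the ELKI implementation (which operates on NumberVector). Terms with a zero component are skipped, following the convention 0 · log 0 = 0.

    // Illustrative sketch only, not the ELKI implementation.
    public final class JSDivergenceSketch {
      /** JS(x,y) = 1/2 · Σ_i ( x_i log(2x_i/(x_i+y_i)) + y_i log(2y_i/(x_i+y_i)) ), assuming nonnegative inputs. */
      public static double jensenShannon(double[] x, double[] y) {
        double agg = 0.;
        for (int i = 0; i < x.length; i++) {
          final double xi = x[i], yi = y[i], mi = xi + yi;
          if (xi > 0) {
            agg += xi * Math.log(2. * xi / mi); // KL term of x against the midpoint
          }
          if (yi > 0) {
            agg += yi * Math.log(2. * yi / mi); // KL term of y against the midpoint
          }
        }
        return 0.5 * agg; // exactly half of the Jeffrey divergence of x and y
      }
    }

    Because swapping x and y merely exchanges the two summands, the result is symmetric, and dropping the factor 0.5 recovers the Jeffrey divergence.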

    There exists a variant definition where the two vectors are weighted with β and 1−β, which for the common choice of β = ½ yields this version.
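    As a sketch of that weighted variant (the method name and this generalized form are illustrative assumptions, not part of this class): the midpoint becomes the β-weighted mixture m = β·x + (1−β)·y, and the two Kullback-Leibler terms are weighted accordingly.

    /** Sketch only: JS_β(x,y) = β·KL(x,m) + (1−β)·KL(y,m) with m = β·x + (1−β)·y. */
    public static double weightedJensenShannon(double[] x, double[] y, double beta) {
      double agg = 0.;
      for (int i = 0; i < x.length; i++) {
        final double mi = beta * x[i] + (1 - beta) * y[i]; // weighted mixture component
        if (x[i] > 0) {
          agg += beta * x[i] * Math.log(x[i] / mi);
        }
        if (y[i] > 0) {
          agg += (1 - beta) * y[i] * Math.log(y[i] / mi);
        }
      }
      return agg;
    }

    With β = 0.5, the mixture m is the plain midpoint ½(x+y) and the expression reduces term by term to the jensenShannon sketch above.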

    Reference:

    J. Lin
    Divergence measures based on the Shannon entropy
    IEEE Transactions on Information Theory 37(1)

    D. M. Endres, J. E. Schindelin
    A new metric for probability distributions
    IEEE Transactions on Information Theory 49(7)

    M.-M. Deza, E. Deza
    Dictionary of distances

    Since:
    0.6.0
    Author:
    Erich Schubert