The Jensen–Shannon divergence (JSD) is a method of measuring the similarity between two probability distributions. It is based on the Kullback–Leibler divergence, with some notable (and useful) differences: it is symmetric, and it always has a finite value. The Jensen–Shannon divergence is a symmetrized and smoothed version of the Kullback–Leibler divergence. It is defined by

\[
\mathrm{JSD}(P \parallel Q) = \tfrac{1}{2}\, D(P \parallel M) + \tfrac{1}{2}\, D(Q \parallel M),
\]

where \( M = \tfrac{1}{2}(P + Q) \) and \( D \) denotes the Kullback–Leibler divergence.

The Jensen–Shannon divergence is bounded by 1 for two probability distributions, provided the base 2 logarithm is used. For log base e (the natural logarithm, ln), which is commonly used in statistical thermodynamics, the upper bound is ln(2):

\[
0 \le \mathrm{JSD}(P \parallel Q) \le \ln(2).
\]
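The definition above translates directly into a few lines of code. The following is a minimal sketch of the Jensen–Shannon divergence for two discrete distributions using NumPy; the function names (kl_divergence, jsd) and the choice of base-2 logarithm are illustrative assumptions, not part of the original text.

```python
# Sketch: Jensen-Shannon divergence for discrete distributions (base-2 logs,
# so the result lies in [0, 1]). Names and conventions here are assumptions.
import numpy as np

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p || q), skipping terms where p == 0."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    mask = p > 0
    return np.sum(p[mask] * np.log2(p[mask] / q[mask]))

def jsd(p, q):
    """Jensen-Shannon divergence: symmetric, finite, bounded by 1 in base 2."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    m = 0.5 * (p + q)  # the mixture distribution M
    return 0.5 * kl_divergence(p, m) + 0.5 * kl_divergence(q, m)

# Example with two simple distributions over three outcomes.
p = [0.5, 0.3, 0.2]
q = [0.1, 0.4, 0.5]
print(jsd(p, q))  # same value as jsd(q, p), and always between 0 and 1
```

Because the comparison runs against the mixture M rather than directly between P and Q, the divergence stays finite even when one distribution assigns zero probability to an outcome the other does not, which is what makes the JSD better behaved than the raw Kullback–Leibler divergence in practice.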