Jensen–Shannon Divergence

In probability theory and statistics, the Jensen–Shannon divergence is a method of measuring the similarity between two probability distributions. It is also known as information radius (IRad) or total divergence to the average. It is based on the Kullback–Leibler divergence, with the notable (and useful) differences that it is symmetric and always takes a finite value. The square root of the Jensen–Shannon divergence is a metric, often called the Jensen–Shannon distance.
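As a rough illustration of how the divergence relates to the Kullback–Leibler divergence, the following minimal Python sketch computes it for two discrete distributions as the average Kullback–Leibler divergence of each distribution to their mixture. The function names (kl_divergence, js_divergence) and the example arrays p and q are illustrative assumptions, not part of the original text.

    # Minimal sketch: Jensen-Shannon divergence for discrete distributions,
    # assuming p and q are NumPy arrays of the same length that each sum to 1.
    import numpy as np

    def kl_divergence(p, q):
        """Kullback-Leibler divergence D(p || q) in bits (log base 2)."""
        # Entries where p is zero contribute nothing to the sum.
        mask = p > 0
        return np.sum(p[mask] * np.log2(p[mask] / q[mask]))

    def js_divergence(p, q):
        """Jensen-Shannon divergence: mean KL divergence to the mixture m."""
        m = 0.5 * (p + q)
        return 0.5 * kl_divergence(p, m) + 0.5 * kl_divergence(q, m)

    p = np.array([0.5, 0.5, 0.0])
    q = np.array([0.0, 0.1, 0.9])
    print(js_divergence(p, q))           # finite even though KL(p || q) diverges
    print(np.sqrt(js_divergence(p, q)))  # the Jensen-Shannon distance, a metric

Because the mixture m is nonzero wherever either p or q is nonzero, both Kullback–Leibler terms stay finite, which is why the Jensen–Shannon divergence is always a finite value.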

Read more about Jensen–Shannon Divergence:  Definition, Bounds, Relation To Mutual Information, Quantum Jensen–Shannon Divergence, Applications