Jensen Shannon Divergence Calculator Online

Jensen Shannon Divergence Calculator Description


The Jensen-Shannon divergence (JS) measures how much the label distributions of different facets diverge from one another entropically. It is based on the Kullback-Leibler divergence but, unlike it, is symmetric.

Jensen Shannon Calculator Formula:

JSD(P ‖ Q) = ½ KL(P ‖ M) + ½ KL(Q ‖ M),  where M = ½ (P + Q) and KL is the Kullback-Leibler divergence

Example Input X: 0.1, 0.2, 0.7 (the values must sum to 1: 0.1 + 0.2 + 0.7 = 1)

Example Input Y: 0.2, 0.3, 0.5 (the values must sum to 1: 0.2 + 0.3 + 0.5 = 1)

Each value must be between 0 and 1.
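The formula can be applied directly to the example inputs above. The following is a minimal Python sketch; the helper names `kl` and `jsd` are illustrative, and natural logarithms are assumed, since that matches the calculator's results:

```python
import math

def kl(p, q):
    """Kullback-Leibler divergence KL(p || q) in nats."""
    # Terms with p_i = 0 contribute 0 by convention.
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def jsd(p, q):
    """Jensen-Shannon divergence: average KL to the mixture M = (p + q) / 2."""
    m = [(pi + qi) / 2 for pi, qi in zip(p, q)]
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

x = [0.1, 0.2, 0.7]
y = [0.2, 0.3, 0.5]
print(jsd(x, y))  # small positive value; 0 only when x == y
```

Identical distributions give a divergence of exactly 0, since the mixture M then equals both inputs.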

KL divergence does measure how similar two distributions are, but it is not a metric because it violates both symmetry and the triangle inequality. The square root of the Jensen-Shannon divergence, however, is a metric, which is why it is often referred to as the Jensen-Shannon distance.
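The symmetry difference can be checked numerically; a small sketch with illustrative helper names, again using natural logarithms:

```python
import math

def kl(p, q):
    # KL(p || q) in nats; terms with p_i = 0 contribute 0 by convention
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def jsd(p, q):
    # Jensen-Shannon divergence: average KL to the mixture m
    m = [(pi + qi) / 2 for pi, qi in zip(p, q)]
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

p, q = [0.1, 0.9], [0.5, 0.5]
print(kl(p, q), kl(q, p))    # the two values differ: KL is not symmetric
print(jsd(p, q), jsd(q, p))  # the two values agree: JS is symmetric
```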

In high-dimensional settings, such as generative adversarial networks, JS divergence is utilised to address the problems of KL divergence. Empirically, most high-dimensional real-world data lies near a low-dimensional manifold. Because the model and data distributions rarely overlap, minimising the KL (Kullback-Leibler) divergence between the model distribution and the data distribution is like trying to find a needle in a haystack.

When the distributions do not overlap, or barely overlap, the KL divergence becomes infinite and the gradient signal is zero. JS divergence behaves better in this case because it does not become infinite where pθ(x) = 0, but it still suffers from the same zero-gradient problem when there is little or no overlap.
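The non-overlap behaviour can be illustrated with two completely disjoint distributions, where KL blows up to infinity while JS saturates at log 2. A sketch with illustrative helper names, using natural logarithms:

```python
import math

def kl(p, q):
    # KL(p || q) in nats; infinite when q_i = 0 at some point where p_i > 0
    total = 0.0
    for pi, qi in zip(p, q):
        if pi == 0:
            continue         # 0 * log(0/q) = 0 by convention
        if qi == 0:
            return math.inf  # supports do not overlap here
        total += pi * math.log(pi / qi)
    return total

def jsd(p, q):
    # The mixture m has full support, so the KL terms stay finite
    m = [(pi + qi) / 2 for pi, qi in zip(p, q)]
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

p, q = [1.0, 0.0], [0.0, 1.0]  # completely disjoint supports
print(kl(p, q))                # inf
print(jsd(p, q))               # log(2): finite, but flat (zero gradient)
```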

How to calculate Jensen-Shannon divergence using the calculator?

  • Enter the X probabilities in the first input field of the form. Each probability must be separated by a comma, and each value must be between 0 and 1.
  • Then enter the Y probabilities in the second input field. The same rules apply: comma-separated values, each between 0 and 1.
  • Click on the button submit, and the results will automatically appear below in the result section.
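The input handling described above can be sketched as a small validation routine; `parse_probabilities` is a hypothetical helper mirroring the form's rules, not the calculator's actual code:

```python
def parse_probabilities(text):
    # Hypothetical parser: comma-separated values, each between 0 and 1,
    # together summing to 1 (with a small floating-point tolerance).
    values = [float(token) for token in text.split(",")]
    if any(v < 0 or v > 1 for v in values):
        raise ValueError("each probability must be between 0 and 1")
    if abs(sum(values) - 1.0) > 1e-9:
        raise ValueError("probabilities must sum to 1")
    return values

print(parse_probabilities("0.1, 0.2, 0.7"))  # [0.1, 0.2, 0.7]
```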

More Calculation Examples (Jensen Shannon Divergence Examples)

  • Example 1
  • Probabilities - X : 0.1, 0.3, 0.6
  • Probabilities - Y : 0.1, 0.2, 0.7
  • Result : 0.006958856339414

  • Example 2
  • Probabilities - X : 0.2, 0.3, 0.5
  • Probabilities - Y : 0.6, 0.3, 0.1
  • Result : 0.12510060588455

  • Example 3
  • Probabilities - X : 0.5, 0.2, 0.3
  • Probabilities - Y : 0.2, 0.3, 0.5
  • Result : 0.050874612539598
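The three results above can be reproduced with a short Python check; the helper names are illustrative, and natural logarithms are used, which match the listed values:

```python
import math

def kl(p, q):
    # KL(p || q) in nats; terms with p_i = 0 contribute 0 by convention
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def jsd(p, q):
    # Jensen-Shannon divergence via the mixture m = (p + q) / 2
    m = [(pi + qi) / 2 for pi, qi in zip(p, q)]
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

examples = [
    (([0.1, 0.3, 0.6], [0.1, 0.2, 0.7]), 0.006958856339414),
    (([0.2, 0.3, 0.5], [0.6, 0.3, 0.1]), 0.12510060588455),
    (([0.5, 0.2, 0.3], [0.2, 0.3, 0.5]), 0.050874612539598),
]
for (x, y), expected in examples:
    assert abs(jsd(x, y) - expected) < 1e-9
print("all three examples reproduced")
```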


Author

Hey there, I'm the developer of this website. As a Laravel developer, I'm proficient in building web applications using the Laravel PHP framework. I have a strong understanding of object-oriented programming principles and have experience with database design and management. I'm skilled in developing RESTful APIs, implementing authentication and authorization, and integrating third-party services. I'm also familiar with front-end technologies such as HTML, CSS, and JavaScript, and have experience with popular front-end frameworks such as Vue.js and React. I'm committed to writing clean, maintainable code and staying up-to-date with the latest industry trends and best practices. I hope this website helps you with your calculations. Visit the link for Python tutorials and many other helpful materials.