JS Divergence (Jensen-Shannon Divergence)


JS divergence measures the similarity between two probability distributions. It is a variant of KL divergence that resolves KL divergence's asymmetry: JS divergence is symmetric, and its value is bounded between 0 and 1.

The definition is as follows:
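In standard notation, with P and Q the two distributions and M = ½(P + Q) their average:

$$
\mathrm{JS}(P \,\|\, Q) = \frac{1}{2}\,\mathrm{KL}(P \,\|\, M) + \frac{1}{2}\,\mathrm{KL}(Q \,\|\, M),
\qquad M = \frac{1}{2}(P + Q),
$$

where

$$
\mathrm{KL}(P \,\|\, Q) = \sum_x P(x)\,\log\frac{P(x)}{Q(x)}.
$$

With a base-2 logarithm, the JS divergence ranges from 0 (identical distributions) to 1 (distributions with disjoint supports).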

There is a problem when measuring with either KL divergence or JS divergence:

If the two distributions P and Q are far apart and have no overlap at all, the KL divergence is infinite and therefore meaningless, while the JS divergence is a constant (1 with base-2 logarithms, i.e. log 2 in nats). Because the value no longer changes with the distributions, the gradient at such a point is 0.
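As a concrete illustration, here is a minimal NumPy sketch (not from the original article; the function names kl_divergence and js_divergence are chosen for this example) that computes the JS divergence from the definition above using base-2 logarithms. It shows both a normal case and the non-overlapping case, where the value saturates at the constant 1.

```python
import numpy as np

def kl_divergence(p, q):
    """KL(P || Q) over a shared discrete support, base-2 logs; terms with p == 0 contribute 0."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    mask = p > 0
    return float(np.sum(p[mask] * np.log2(p[mask] / q[mask])))

def js_divergence(p, q):
    """JSD(P || Q) = 0.5 * KL(P || M) + 0.5 * KL(Q || M), with M = (P + Q) / 2."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    m = 0.5 * (p + q)
    return 0.5 * kl_divergence(p, m) + 0.5 * kl_divergence(q, m)

# Overlapping distributions: a value strictly between 0 and 1.
print(js_divergence([0.5, 0.5, 0.0], [0.25, 0.5, 0.25]))  # ~0.156

# Completely non-overlapping distributions: the value saturates at the constant 1,
# so it carries no gradient information about how far apart P and Q actually are.
print(js_divergence([1.0, 0.0], [0.0, 1.0]))              # 1.0
```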
