Jensen-Shannon Divergence (JS-Divergence)
Tags: #machine learning

Equation
$$JS(P||Q)=\frac{1}{2}KL(P||\frac{P+Q}{2})+\frac{1}{2}KL(Q||\frac{P+Q}{2})$$

Latex Code
JS(P||Q)=\frac{1}{2}KL(P||\frac{P+Q}{2})+\frac{1}{2}KL(Q||\frac{P+Q}{2})
Explanation
Latex code for the Jensen-Shannon Divergence (JS-Divergence). The notation in this formula is briefly introduced below; a minimal numerical sketch follows the list.
- KL(P||Q): the Kullback-Leibler (KL) divergence between distributions P and Q
- JS(P||Q): the JS divergence between P and Q, which is a symmetric divergence measure between the distributions P and Q
- P(x): the distribution P(x) over x
- Q(x): the distribution Q(x) over x
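Because JS-Divergence is just two KL terms evaluated against the mixture M = (P+Q)/2, it is straightforward to compute for discrete distributions. Below is a minimal NumPy sketch, assuming P and Q are probability vectors over the same support; the names kl_divergence and js_divergence are illustrative, not from any particular library.

import numpy as np

def kl_divergence(p, q, eps=1e-12):
    """KL(P||Q) for discrete distributions given as probability vectors (in nats)."""
    p = np.clip(np.asarray(p, dtype=float), eps, 1.0)  # clip to avoid log(0)
    q = np.clip(np.asarray(q, dtype=float), eps, 1.0)
    return float(np.sum(p * np.log(p / q)))

def js_divergence(p, q):
    """JS(P||Q) = 1/2 KL(P||M) + 1/2 KL(Q||M), where M = (P+Q)/2."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    m = 0.5 * (p + q)  # the mixture distribution
    return 0.5 * kl_divergence(p, m) + 0.5 * kl_divergence(q, m)

p = [0.4, 0.6]
q = [0.6, 0.4]
print(js_divergence(p, q))  # ~0.0201 nats
print(js_divergence(q, p))  # same value, since JS is symmetric

Unlike KL-Divergence, JS-Divergence is symmetric and bounded above by ln 2 in nats (1 bit with base-2 logarithms). Note that scipy.spatial.distance.jensenshannon returns the square root of this quantity, known as the JS distance.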