Jensen-Shannon Divergence (JS-Divergence)

Tags: #machine learning

Equation

$$JS(P||Q)=\frac{1}{2}KL(P||\frac{(P+Q)}{2})+\frac{1}{2}KL(Q||\frac{(P+Q)}{2})$$
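Equivalently, writing $M$ for the mixture of the two distributions (a shorthand added here for readability; the original formula inlines it):

$$M=\frac{P+Q}{2},\qquad JS(P||Q)=\frac{1}{2}KL(P||M)+\frac{1}{2}KL(Q||M)$$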

Latex Code

JS(P||Q)=\frac{1}{2}KL(P||\frac{(P+Q)}{2})+\frac{1}{2}KL(Q||\frac{(P+Q)}{2})



Explanation

LaTeX code for the Jensen-Shannon Divergence (JS-Divergence). I will briefly introduce the notations in this formulation; a short numerical sketch follows the list.

  • $KL(P||Q)$: KL Divergence between P and Q
  • $JS(P||Q)$: JS Divergence between P and Q, which is a symmetric divergence measure between distributions P and Q
  • $P(x)$: Distribution of P over x
  • $Q(x)$: Distribution of Q over x
  • $\frac{P+Q}{2}$: the mixture (average) of the two distributions P and Q
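As a quick illustration (a minimal sketch of my own, not part of the original page), the formula can be computed directly for discrete distributions with NumPy; the helper names kl_divergence and js_divergence are hypothetical:

import numpy as np

def kl_divergence(p, q):
    # KL(P||Q) for discrete distributions; terms with p_i = 0 contribute 0.
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

def js_divergence(p, q):
    # JS(P||Q) = 1/2 * KL(P||M) + 1/2 * KL(Q||M), with M = (P + Q) / 2.
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    m = 0.5 * (p + q)
    return 0.5 * kl_divergence(p, m) + 0.5 * kl_divergence(q, m)

p = [0.1, 0.4, 0.5]
q = [0.3, 0.3, 0.4]
print(js_divergence(p, q))  # equals js_divergence(q, p): JS is symmetric

With the natural logarithm the value always lies in [0, ln 2]. For reference, SciPy's scipy.spatial.distance.jensenshannon returns the square root of this quantity (the Jensen-Shannon distance), not the divergence itself.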

