Kullback-Leibler Divergence

Tags: #machine learning #kl divergence

Equation

$$KL(P\|Q)=\sum_{x}P(x)\log\left(\frac{P(x)}{Q(x)}\right)$$

Latex Code

    KL(P\|Q)=\sum_{x}P(x)\log\left(\frac{P(x)}{Q(x)}\right)


Explanation

LaTeX code for the Kullback-Leibler (KL) divergence. The notation in this formula is as follows; a short numerical sketch in Python follows the list.

  • KL(P||Q): the KL divergence of distribution P from distribution Q
  • P(x): the probability that distribution P assigns to outcome x
  • Q(x): the probability that distribution Q assigns to outcome x
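
To make the definition concrete, below is a minimal Python sketch of the discrete KL divergence, assuming NumPy is installed; the function name kl_divergence and the eps clipping constant are illustrative choices, not part of the original formula.

    import numpy as np

    def kl_divergence(p, q, eps=1e-12):
        # Discrete KL divergence: KL(P||Q) = sum_x P(x) * log(P(x) / Q(x)).
        p = np.asarray(p, dtype=float)
        q = np.asarray(q, dtype=float)
        # Clip zeros so log() stays finite, following the 0 * log(0) = 0 convention.
        p = np.clip(p, eps, None)
        q = np.clip(q, eps, None)
        return float(np.sum(p * np.log(p / q)))

    # Example: two distributions over three outcomes.
    p = [0.5, 0.4, 0.1]
    q = [0.4, 0.4, 0.2]
    print(kl_divergence(p, q))  # KL(P||Q)
    print(kl_divergence(q, p))  # KL(Q||P) generally differs

Note that the two printed values differ: the KL divergence is not symmetric, so KL(P||Q) and KL(Q||P) are in general not equal.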
