KL-Divergence

Tags: #machine learning

Equation

$$KL(P||Q)=\sum_{x}P(x)\log(\frac{P(x)}{Q(x)})$$

Latex Code

KL(P||Q)=\sum_{x}P(x)\log(\frac{P(x)}{Q(x)})

Explanation

LaTeX code for the Kullback-Leibler (KL) divergence. The notation in this formula is explained below, followed by a short numerical sketch.

  • $KL(P||Q)$: the KL divergence between distributions P and Q
  • $P(x)$: the probability that distribution P assigns to outcome x
  • $Q(x)$: the probability that distribution Q assigns to outcome x
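
As a quick illustration, here is a minimal Python sketch of this sum for two discrete distributions. The function name kl_divergence and the example probabilities are illustrative assumptions, not part of the original.

import numpy as np

def kl_divergence(p, q):
    # KL(P||Q) = sum_x P(x) * log(P(x) / Q(x)), in nats (natural log).
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    # By convention, terms with P(x) = 0 contribute 0 (since 0 * log 0 = 0).
    mask = p > 0
    return np.sum(p[mask] * np.log(p[mask] / q[mask]))

# Hypothetical example: two distributions over three outcomes.
p = [0.5, 0.3, 0.2]
q = [0.4, 0.4, 0.2]
print(kl_divergence(p, q))  # ~0.0253 nats
print(kl_divergence(q, p))  # ~0.0258 nats

Note that the two calls give different values: KL(P||Q) is not symmetric in P and Q in general.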
