KL-Divergence

Tags: #machine learning

Equation

$$KL(P||Q)=\sum_{x}P(x)\log(\frac{P(x)}{Q(x)})$$

Latex Code

    KL(P||Q)=\sum_{x}P(x)\log(\frac{P(x)}{Q(x)})

Explanation

LaTeX code for the Kullback-Leibler (KL) Divergence. I will briefly introduce the notations in this formula below, followed by a small numerical sketch after the list.

  • KL(P||Q): KL divergence between distributions P and Q
  • P(x): probability that distribution P assigns to outcome x
  • Q(x): probability that distribution Q assigns to outcome x
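
To make the definition concrete, here is a minimal Python sketch (assuming NumPy is available; the function name kl_divergence and the example distributions p and q are illustrative, not from the source) that evaluates the sum above for two discrete distributions:

    import numpy as np

    def kl_divergence(p, q):
        """Discrete KL divergence: sum over x of P(x) * log(P(x) / Q(x))."""
        p = np.asarray(p, dtype=float)
        q = np.asarray(q, dtype=float)
        # By convention, terms with P(x) == 0 contribute 0 (0 * log 0 = 0).
        # KL(P||Q) is finite only if Q(x) > 0 wherever P(x) > 0.
        mask = p > 0
        return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

    # Two example distributions over the same three outcomes.
    p = [0.5, 0.3, 0.2]
    q = [0.4, 0.4, 0.2]
    print(kl_divergence(p, q))  # ~0.0253 nats
    print(kl_divergence(q, p))  # ~0.0258 nats

The two printed values differ, illustrating that the KL divergence is not symmetric in P and Q.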
