Hacker News
Six (and a half) intuitions for KL divergence
abetusk
Let's say you're a company providing an internet connection to a business. The business trusts you, so their bits are only compressed over the wire, not encrypted, and you know the compression scheme they use to send you their bits. You're charging the business a premium for the line you manage, but you also lease the line yourself, so it's in your interest to compress what they give you as well as possible to make a profit.
Say the business's compression scheme is imperfect. They have a Huffman coding built from their (imperfect) model of the tokens they send, call it q(x) (that is, they think token x shows up with probability q(x)). You've determined the true distribution, p(x) (token x actually shows up with probability p(x)).
The business has tokens that show up with probability p(x), but they encode each token x with -lg(q(x)) bits, giving an average token bit size of:
-\sum_x p(x) lg(q(x))
If you instead use an optimal Huffman encoding, you will send tokens with an average bit length of: -\sum_x p(x) lg(p(x))
How many bits, on average, do you save? The difference between their average and yours: -\sum_x p(x) lg(q(x)) - (-\sum_x p(x) lg(p(x))) = \sum_x p(x) lg(p(x)/q(x))
Which is the Kullback-Leibler divergence D(p||q). To me, this is a much more intuitive explanation. I made a blog post about it [0], if anyone cares.
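A quick numeric sanity check of the argument above (the toy distributions are mine, not from the comment): the expected bits wasted by coding with the wrong model q equals D(p||q).

```python
import math

# Toy distributions over three tokens (made up for illustration).
p = {'a': 0.50, 'b': 0.25, 'c': 0.25}  # true token frequencies
q = {'a': 0.25, 'b': 0.25, 'c': 0.50}  # the business's (wrong) model

# Average bits/token when tokens drawn from p are coded with a q-optimal code.
bits_with_q_code = -sum(p[x] * math.log2(q[x]) for x in p)

# Average bits/token under the optimal code for the true distribution p.
bits_with_p_code = -sum(p[x] * math.log2(p[x]) for x in p)

# Bits saved per token: the KL divergence D(p||q).
kl_pq = sum(p[x] * math.log2(p[x] / q[x]) for x in p)

print(bits_with_q_code, bits_with_p_code, kl_pq)
```

Here the mismatched code costs 1.75 bits/token, the optimal one 1.5, and D(p||q) = 0.25 bits/token, exactly the savings. (Strictly, Huffman codes only achieve these averages exactly when the probabilities are powers of 1/2, as they are in this toy example.)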
[0] https://mechaelephant.com/dev/Kullback-Leibler-Divergence.ht...
srean
David MacKay's book hand-holds a little more than Cover and Thomas, although its remit is more than just information theory.
cubefox
I don't think this particular interpretation actually makes sense or would explain why KL divergence is not symmetric.
First of all, the "difference" between P and Q would be the same independently of whether P, Q, or some other distribution is the "true" distribution.
For example, assume we have a coin and P(Heads)=0.4 and Q(Heads)=0.6. Now the difference between the two distributions is clearly the same irrespective of whether P, Q or neither is "true". So this interpretation doesn't explain why the KL divergence is asymmetric.
Second, there are plausible cases where it arguably doesn't even make sense to speak of a "true" distribution in the first place.
For example, consider the probability that there was once life on Mars. Assume P(Life)=0.4 and Q(Life)=0.6. What would it even mean for P to be "true"? P and Q could simply represent the subjective beliefs of two different people, without any requirement of assuming that one of these probabilities could be "correct".
Clearly the KL divergence can still be calculated and presumably sensibly interpreted even in the subjective case. But the interpretations in this article don't help us here since they require objective probabilities where one distribution is the "true" one.
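For what it's worth, the asymmetry itself is easy to see numerically with a coin whose two candidate biases aren't mirror images of each other (the numbers here are mine, not from the thread):

```python
import math

def kl_bernoulli(p, q):
    """D(P||Q) in bits between coin-flip distributions P(Heads)=p, Q(Heads)=q."""
    return p * math.log2(p / q) + (1 - p) * math.log2((1 - p) / (1 - q))

d_pq = kl_bernoulli(0.1, 0.5)  # coding cost of believing q=0.5 when the truth is p=0.1
d_qp = kl_bernoulli(0.5, 0.1)  # coding cost of believing q=0.1 when the truth is p=0.5
print(d_pq, d_qp)
```

The two directions differ (roughly 0.53 vs 0.74 bits). Incidentally, for the 0.4/0.6 pair above the two directions happen to coincide, since swapping p and q there just relabels heads and tails.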