Can KL-Divergence ever be greater than 1? – stats.stackexchange.com

I've been working on building some test statistics based on the KL-Divergence, \begin{equation} D_{KL}(p \| q) = \sum_i p(i) \log\left(\frac{p(i)}{q(i)}\right), \end{equation} and I ended up with a ...
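As a quick numeric illustration of the formula above (a minimal sketch using only the standard library; the helper name `kl_divergence` and the example distributions are my own, not from the question), the divergence in nats can easily exceed 1 when `p` puts mass where `q` is small:

```python
import math

def kl_divergence(p, q):
    """D_KL(p || q) = sum_i p(i) * log(p(i)/q(i)), in nats.

    Terms with p(i) == 0 contribute 0 by convention."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# A mild mismatch stays well below 1 nat...
print(kl_divergence([0.5, 0.5], [0.9, 0.1]))      # ~0.511

# ...but KL is unbounded above: concentrate p where q is tiny.
print(kl_divergence([0.999, 0.001], [0.001, 0.999]))  # ~6.89
```

Note the value depends on the log base: base-2 logs give bits, natural logs give nats, but in either unit the divergence has no upper bound.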

from Hot Questions - Stack Exchange OnStackOverflow
