Why is mean squared error the cross-entropy between the empirical distribution and a Gaussian model? – stats.stackexchange.com

In Section 5.5 of Deep Learning (by Ian Goodfellow, Yoshua Bengio, and Aaron Courville), it is stated that "Any loss consisting of a negative log-likelihood is a cross-entropy between the empirical distribution ..."
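The full sentence in Section 5.5 identifies the cross-entropy as being between the empirical distribution defined by the training set and the probability distribution defined by the model. Below is a minimal worked sketch of why mean squared error is that cross-entropy in the Gaussian case; it assumes a conditional model $p_{\text{model}}(y \mid x) = \mathcal{N}(y;\, \hat{y}(x;\theta), \sigma^2)$ with fixed variance $\sigma^2$ (the symbols $\hat{y}$, $\theta$, and $m$ are introduced here for illustration, not taken from the excerpt).

The negative log-likelihood over $m$ training pairs $(x^{(i)}, y^{(i)})$ is

$$
-\sum_{i=1}^{m} \log p_{\text{model}}\!\left(y^{(i)} \mid x^{(i)}\right)
= \frac{m}{2}\log\!\left(2\pi\sigma^{2}\right)
+ \sum_{i=1}^{m} \frac{\left(y^{(i)} - \hat{y}\!\left(x^{(i)};\theta\right)\right)^{2}}{2\sigma^{2}},
$$

which differs from the mean squared error $\frac{1}{m}\sum_{i=1}^{m}\bigl(y^{(i)} - \hat{y}(x^{(i)};\theta)\bigr)^{2}$ only by a positive scale factor and an additive constant, neither of which depends on $\theta$. Because the average negative log-likelihood under the empirical distribution $\hat{p}_{\text{data}}$ is exactly the cross-entropy $H(\hat{p}_{\text{data}}, p_{\text{model}}) = -\mathbb{E}_{\hat{p}_{\text{data}}}\!\left[\log p_{\text{model}}\right]$, minimizing MSE is equivalent to minimizing the cross-entropy between the empirical distribution and the Gaussian model.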

From Hot Questions – Stack Exchange, via the OnStackOverflow Blogspot feed
