Entropy is a measure of the uncertainty of a random variable under a given
probability model. In most cases the true probability distribution is
unknown, so cross entropy, the entropy computed with respect to an
estimated probability model, is used to estimate the true entropy and to
gauge how close the estimated model is to the actual one. Mutual
information measures the amount of information that one random variable
contains about another random variable; it is the reduction in the
uncertainty of one random variable due to knowledge of the other.
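For concreteness, these are the standard discrete-case definitions behind the statements above (the symbols X, Y, p, and q are notation introduced here for illustration, with p the true distribution and q the estimated one):

\[
H(X) = -\sum_{x} p(x) \log p(x),
\qquad
H(p, q) = -\sum_{x} p(x) \log q(x),
\]
\[
I(X; Y) = \sum_{x, y} p(x, y) \log \frac{p(x, y)}{p(x)\, p(y)}
        = H(X) - H(X \mid Y).
\]

Since \(H(p, q) \ge H(p)\) with equality exactly when \(q = p\), the gap between cross entropy and true entropy (the KL divergence) is what quantifies how close the estimated model is to the actual one.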
In this talk, I will introduce:
Additional items of interest: