Date: Fri, 8 Oct 1999 10:37:54 -0500 (CDT)
X-Authentication-Warning: isip18.isip.msstate.edu: zhao set sender to zhao@isip.msstate.edu using -f
From: Jie Zhao
To: picone@isip.msstate.edu
In-reply-to: <199910081058.FAA19873@isip02.isip.msstate.edu> (message from Joe Picone - The Terminal Man on Fri, 8 Oct 1999 05:58:28 -0500 (CDT))
Subject: Re: ece_8000: schedule
Reply-to: zhao@ISIP.MsState.EDU
Content-Type: text
Content-Length: 1360

> 11/10 03b. Cross-entropy, mutual information,... JZ

Abstract:

Information theory plays an important role in the field of natural language processing. A language model is a statistical model that predicts the next word given the preceding word sequence, so it is necessary to measure how accurate such a model is. Cross entropy and mutual information can be used to estimate model quality.

Entropy is a measure of the uncertainty of a random variable given a probability model. In most cases the true probability distribution is unknown; hence cross entropy, the entropy calculated with respect to an estimated probability model, is used to estimate the true entropy and to measure how close our estimated model is to the actual one.

Mutual information is a measure of the amount of information that one random variable contains about another random variable. It is the reduction in the uncertainty of one random variable due to knowledge of the other. (Two short numeric sketches of these quantities are appended after my signature.)

In this talk, I will introduce:

1. fundamentals of information theory: entropy, cross entropy, relative entropy, and mutual information
2. language models
3. applications of information theory in automatic speech recognition
4. applications of information theory in machine translation, i.e., translating one natural language into another

-zhao
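
A minimal sketch of entropy and cross entropy over a toy unigram vocabulary (the distributions p and q below are made-up numbers, purely for illustration):

    import math

    def entropy(p):
        # H(p) = -sum_x p(x) * log2 p(x)
        return -sum(px * math.log2(px) for px in p.values() if px > 0)

    def cross_entropy(p, q):
        # H(p, q) = -sum_x p(x) * log2 q(x), where p is the true
        # distribution and q is the model's estimate; H(p, q) >= H(p),
        # with equality exactly when q matches p.
        return -sum(px * math.log2(q[x]) for x, px in p.items() if px > 0)

    # Toy unigram distributions over a three-word vocabulary.
    p = {"the": 0.5, "cat": 0.3, "sat": 0.2}  # "true" distribution
    q = {"the": 0.4, "cat": 0.4, "sat": 0.2}  # estimated language model

    print(entropy(p))           # ~1.485 bits
    print(cross_entropy(p, q))  # ~1.522 bits, >= H(p)

The gap between the two numbers is what tells us q is not quite right: the closer the cross entropy comes to the true entropy, the better the model.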
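
A matching sketch for mutual information, computed from a toy joint distribution (again, made-up numbers; the helper assumes the joint probabilities sum to 1):

    import math

    def mutual_information(joint):
        # I(X; Y) = sum_{x,y} p(x,y) * log2( p(x,y) / (p(x) * p(y)) )
        px, py = {}, {}
        for (x, y), pxy in joint.items():
            px[x] = px.get(x, 0.0) + pxy  # marginal p(x)
            py[y] = py.get(y, 0.0) + pxy  # marginal p(y)
        return sum(pxy * math.log2(pxy / (px[x] * py[y]))
                   for (x, y), pxy in joint.items() if pxy > 0)

    # X and Y agree most of the time, so knowing Y removes
    # some of the uncertainty about X.
    joint = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}
    print(mutual_information(joint))  # ~0.278 bits

If X and Y were independent, p(x,y) = p(x)p(y) everywhere and the result would be 0 bits; here knowing Y is worth about 0.278 bits of information about X.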