Search for Information Bearing Components in Speech
Howard Hua Yang, Hynek Hermansky
Neural Information Processing Systems
In this paper, we use mutual information to characterize the distributions of phonetic and speaker/channel information in a time-frequency space. We estimate the mutual information (MI) between the phonetic label and one feature, and the joint mutual information (JMI) between the phonetic label and two or three features. Miller's bias formulas for entropy and mutual information estimates are extended to include higher-order terms. The MI and the JMI for speaker/channel recognition are also estimated; the results are complementary to those for phonetic classification. Our results show how phonetic information is locally spread, and how speaker/channel information is globally spread, in time and frequency.
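The abstract refers to Miller's bias formulas for entropy and MI estimates. As context, the following is a minimal Python sketch of the standard first-order (Miller-Madow) correction that such formulas extend; the paper's higher-order terms are not reproduced here, and the function names are illustrative, not from the paper.

```python
import numpy as np

def entropy_mm(counts):
    """Plug-in entropy estimate (in nats) from bin counts, with the
    first-order Miller-Madow bias correction (m - 1) / (2n), where m is
    the number of occupied bins and n the sample size."""
    counts = np.asarray(counts, dtype=float)
    n = counts.sum()
    p = counts[counts > 0] / n
    h_plugin = -np.sum(p * np.log(p))
    m = np.count_nonzero(counts)          # occupied bins
    return h_plugin + (m - 1) / (2 * n)   # first-order bias correction

def mutual_information_mm(joint_counts):
    """MI(X; Y) in nats from a 2-D contingency table of joint counts,
    via MI = H(X) + H(Y) - H(X, Y) with each entropy bias-corrected."""
    joint = np.asarray(joint_counts, dtype=float)
    hx = entropy_mm(joint.sum(axis=1))    # marginal over Y
    hy = entropy_mm(joint.sum(axis=0))    # marginal over X
    hxy = entropy_mm(joint.ravel())       # joint entropy
    return hx + hy - hxy
```

For example, a perfectly dependent table such as `[[50, 0], [0, 50]]` yields approximately `log(2)` nats, while an independent uniform table yields a value near zero (slightly negative, since the joint-entropy correction exceeds the two marginal corrections).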
Dec-31-2000