Search for Information Bearing Components in Speech

Yang, Howard Hua, Hermansky, Hynek

Neural Information Processing Systems 

In this paper, we use mutual information to characterize the distributions of phonetic and speaker/channel information in a time-frequency space. The mutual information (MI) between the phonetic label and one feature, and the joint mutual information (JMI) between the phonetic label and two or three features, are estimated. Miller's bias formulas for entropy and mutual information estimates are extended to include higher-order terms. The MI and the JMI for speaker/channel recognition are also estimated. The results are complementary to those for phonetic classification. Our results show how the phonetic information is locally spread and how the speaker/channel information is globally spread in time and frequency.
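The abstract builds on Miller's bias correction for plug-in entropy and mutual information estimates. As a minimal sketch (not the paper's higher-order extension), the standard first-order Miller-Madow correction adds (m - 1)/(2N) to the plug-in entropy, where m is the number of occupied bins and N the sample size; MI can then be formed from the bias-corrected marginal and joint entropies:

```python
import numpy as np

def entropy_miller(counts):
    """Plug-in entropy estimate (in nats) with the first-order
    Miller-Madow bias correction: H_hat + (m - 1) / (2N),
    where m is the number of occupied bins and N the sample size."""
    counts = np.asarray(counts, dtype=float)
    n = counts.sum()
    p = counts[counts > 0] / n
    h_plugin = -np.sum(p * np.log(p))
    m = np.count_nonzero(counts)
    return h_plugin + (m - 1) / (2.0 * n)

def mutual_information_miller(joint_counts):
    """MI(X;Y) = H(X) + H(Y) - H(X,Y), with each entropy
    estimate bias-corrected via entropy_miller."""
    joint_counts = np.asarray(joint_counts, dtype=float)
    h_x = entropy_miller(joint_counts.sum(axis=1))   # marginal over rows
    h_y = entropy_miller(joint_counts.sum(axis=0))   # marginal over columns
    h_xy = entropy_miller(joint_counts.ravel())      # joint entropy
    return h_x + h_y - h_xy

# Independent variables: MI should be near zero after correction.
mi_indep = mutual_information_miller([[25, 25], [25, 25]])

# Perfectly dependent variables: MI should approach H(X) = ln 2.
mi_dep = mutual_information_miller([[50, 0], [0, 50]])
```

The function names and the 2x2 count tables are illustrative assumptions; the paper itself estimates MI between phonetic (and speaker/channel) labels and time-frequency features and extends the correction to higher-order terms.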
