Deformed Statistics Formulation of the Information Bottleneck Method

Venkatesan, R. C., Plastino, A.

arXiv.org Machine Learning 

The theoretical basis for a candidate variational principle for the information bottleneck (IB) method is formulated within the ambit of the generalized nonadditive statistics of Tsallis. Given a nonadditivity parameter $ q $, the role of the \textit{additive duality} of nonadditive statistics ($ q^* = 2 - q $) in relating Tsallis entropies for the ranges $ q < 1 $ and $ q > 1 $ of the nonadditivity parameter is described. Defining $ X $, $ \tilde X $, and $ Y $ to be the source alphabet, the compressed reproduction alphabet, and the \textit{relevance variable}, respectively, it is demonstrated that minimization of a generalized IB (gIB) Lagrangian, defined in terms of the nonadditivity parameter $ q^* $, self-consistently yields the \textit{nonadditive effective distortion measure} as the \textit{$ q $-deformed} generalized Kullback-Leibler divergence $ D_{K-L}^{q}[p(Y|X)||p(Y|\tilde X)] $. This result is obtained without enforcing any \textit{a priori} assumptions. Next, it is proven that the $ q^* $-deformed nonadditive free energy of the system is non-negative and convex. Finally, the update equations for the gIB method are derived. These results generalize critical features of the IB method to the case of Tsallis statistics.
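
For concreteness, the deformed quantities invoked above can be written out explicitly. The following is a sketch under the standard Tsallis conventions for the $ q $-logarithm and the generalized Kullback-Leibler divergence; the paper's own sign and normalization conventions may differ:

\[
\ln_q(x) = \frac{x^{1-q} - 1}{1-q}, \qquad \lim_{q \to 1} \ln_q(x) = \ln x,
\]
\[
\ln_{q^*}(x) = \ln_{2-q}(x) = \frac{x^{q-1} - 1}{q-1} = -\ln_q\!\left(\frac{1}{x}\right),
\]
\[
D_{K-L}^{q}\left[p(Y|X)||p(Y|\tilde X)\right] = \sum_{y} p(y|x)\, \ln_q\!\frac{p(y|x)}{p(y|\tilde x)},
\]
which recovers the ordinary Kullback-Leibler divergence, and hence the effective distortion measure of the classical IB method, in the additive limit $ q \to 1 $. The second identity expresses the \textit{additive duality}: it maps statements established for $ q < 1 $ into statements for $ q^* = 2 - q > 1 $, and vice versa.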
