Partial information decomposition

Partial information decomposition is an extension of information theory that aims to generalize the pairwise relations described by information theory to the interaction of multiple variables.[1]

Motivation

Information theory can quantify the amount of information a single source variable $X_1$ has about a target variable $Y$ via the mutual information $I(X_1; Y)$. If we now consider a second source variable $X_2$, classical information theory can only describe the mutual information of the joint variable $(X_1, X_2)$ with $Y$, given by $I(X_1, X_2; Y)$. In general, however, it would be interesting to know how exactly the individual variables $X_1$ and $X_2$ and their interactions relate to $Y$.

Consider that we are given two source variables $X_1, X_2 \in \{0, 1\}$ and a target variable $Y = \operatorname{XOR}(X_1, X_2)$. In this case the total mutual information is $I(X_1, X_2; Y) = 1$ bit, while the individual mutual informations are $I(X_1; Y) = I(X_2; Y) = 0$. That is, there is synergistic information arising from the interaction of $X_1$ and $X_2$ about $Y$, which cannot easily be captured with classical information-theoretic quantities.
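This can be checked numerically. The following is a minimal sketch (illustrative only, not taken from any particular PID library; the helper names `marginal` and `mutual_information` are ad hoc) that builds the joint distribution of the XOR example above and evaluates the classical mutual information terms.

```python
from itertools import product
from math import log2

# Joint distribution of the XOR example: X1, X2 uniform on {0, 1}, Y = X1 XOR X2.
p = {(x1, x2, x1 ^ x2): 0.25 for x1, x2 in product((0, 1), repeat=2)}

def marginal(joint, keep):
    """Marginalise a dict {(x1, x2, y): prob} onto the index positions in `keep`."""
    out = {}
    for event, prob in joint.items():
        key = tuple(event[i] for i in keep)
        out[key] = out.get(key, 0.0) + prob
    return out

def mutual_information(joint, a, b):
    """I(A; B) in bits, where `a` and `b` are tuples of index positions."""
    pa, pb, pab = marginal(joint, a), marginal(joint, b), marginal(joint, a + b)
    return sum(
        pr * log2(pr / (pa[k[:len(a)]] * pb[k[len(a):]]))
        for k, pr in pab.items() if pr > 0
    )

print(mutual_information(p, (0,), (2,)))    # I(X1; Y)     -> 0.0
print(mutual_information(p, (1,), (2,)))    # I(X2; Y)     -> 0.0
print(mutual_information(p, (0, 1), (2,)))  # I(X1, X2; Y) -> 1.0
```

Each source alone carries zero bits about the target, yet the pair together determines it completely; this gap is exactly what partial information decomposition sets out to describe.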

Definition

Partial information decomposition further decomposes the mutual information between the source variables $(X_1, X_2)$ and the target variable $Y$ as

$$I(X_1, X_2; Y) = \text{Unq}(X_1; Y \setminus X_2) + \text{Unq}(X_2; Y \setminus X_1) + \text{Syn}(X_1, X_2; Y) + \text{Red}(X_1, X_2; Y)$$

Here the individual information atoms are defined as

  • $\text{Unq}(X_1; Y \setminus X_2)$ is the unique information that $X_1$ has about $Y$, which is not in $X_2$ (and analogously $\text{Unq}(X_2; Y \setminus X_1)$ for $X_2$)
  • $\text{Syn}(X_1, X_2; Y)$ is the synergistic information about $Y$ that is in the interaction of $X_1$ and $X_2$
  • $\text{Red}(X_1, X_2; Y)$ is the redundant information about $Y$ that is in both $X_1$ and $X_2$

There is, thus far, no universal agreement on how these terms should be defined, with different approaches that decompose information into redundant, unique, and synergistic components appearing in the literature.[1][2][3][4]
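As an illustration of one such approach, the sketch below implements the two-source decomposition using the redundancy measure $I_{\min}$ proposed by Williams and Beer,[1] which takes redundancy to be the expected minimum "specific information" the sources carry about each target outcome; the unique and synergistic atoms then follow from the decomposition equation above. This is a minimal hand-rolled example for discrete variables, not a reference implementation, and the function name `pid_imin` is illustrative.

```python
from itertools import product
from math import log2

def pid_imin(joint):
    """Two-source PID using the I_min redundancy of Williams & Beer (2010).

    `joint` maps (x1, x2, y) to its probability. Returns the four atoms in bits.
    """
    def marg(keep):
        out = {}
        for e, pr in joint.items():
            k = tuple(e[i] for i in keep)
            out[k] = out.get(k, 0.0) + pr
        return out

    def mi(a, b):
        pa, pb, pab = marg(a), marg(b), marg(a + b)
        return sum(pr * log2(pr / (pa[k[:len(a)]] * pb[k[len(a):]]))
                   for k, pr in pab.items() if pr > 0)

    p_y = marg((2,))

    def specific(y, i):
        """Specific information I(Y = y; X_i): what observing X_i says about this outcome."""
        p_iy, p_i = marg((i, 2)), marg((i,))
        total = 0.0
        for (xi, yy), pr in p_iy.items():
            if yy != y[0] or pr == 0:
                continue
            p_xi_given_y = pr / p_y[y]       # p(x_i | y)
            p_y_given_xi = pr / p_i[(xi,)]   # p(y | x_i)
            total += p_xi_given_y * log2(p_y_given_xi / p_y[y])
        return total

    # Redundancy: expected minimum specific information over the two sources.
    red = sum(p_y[y] * min(specific(y, 0), specific(y, 1)) for y in p_y)
    unq1 = mi((0,), (2,)) - red
    unq2 = mi((1,), (2,)) - red
    syn = mi((0, 1), (2,)) - unq1 - unq2 - red
    return {"Red": red, "Unq(X1)": unq1, "Unq(X2)": unq2, "Syn": syn}

# XOR gate: all information is synergistic (Red = Unq = 0, Syn = 1 bit).
xor = {(x1, x2, x1 ^ x2): 0.25 for x1, x2 in product((0, 1), repeat=2)}
print(pid_imin(xor))

# AND gate: I_min assigns ~0.311 bits of redundancy, 0 unique, 0.5 bits of synergy.
and_gate = {(x1, x2, x1 & x2): 0.25 for x1, x2 in product((0, 1), repeat=2)}
print(pid_imin(and_gate))
```

Under this particular measure, the XOR gate yields one bit of pure synergy, while the AND gate splits its 0.811 bits of total mutual information into roughly 0.311 bits of redundancy and 0.5 bits of synergy with no unique information; other proposed measures can assign different values to the same distributions.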

Applications

Despite the lack of universal agreement, partial information decomposition has been applied to diverse fields, including climatology,[5] neuroscience,[6][7][8] sociology,[9] and machine learning.[10] Partial information decomposition has also been proposed as a possible foundation on which to build a mathematically robust definition of emergence in complex systems,[11] and may be relevant to formal theories of consciousness.[12]

References

  1. ^ a b Williams PL, Beer RD (2010-04-14). "Nonnegative Decomposition of Multivariate Information". arXiv:1004.2515 [cs.IT].
  2. ^ Quax R, Har-Shemesh O, Sloot PM (February 2017). "Quantifying Synergistic Information Using Intermediate Stochastic Variables". Entropy. 19 (2): 85. arXiv:1602.01265. doi:10.3390/e19020085. ISSN 1099-4300.
  3. ^ Rosas FE, Mediano PA, Rassouli B, Barrett AB (2020-12-04). "An operational information decomposition via synergistic disclosure". Journal of Physics A: Mathematical and Theoretical. 53 (48): 485001. arXiv:2001.10387. Bibcode:2020JPhA...53V5001R. doi:10.1088/1751-8121/abb723. ISSN 1751-8113. S2CID 210932609.
  4. ^ Kolchinsky A (March 2022). "A Novel Approach to the Partial Information Decomposition". Entropy. 24 (3): 403. arXiv:1908.08642. Bibcode:2022Entrp..24..403K. doi:10.3390/e24030403. PMC 8947370. PMID 35327914.
  5. ^ Goodwell AE, Jiang P, Ruddell BL, Kumar P (February 2020). "Debates—Does Information Theory Provide a New Paradigm for Earth Science? Causality, Interaction, and Feedback". Water Resources Research. 56 (2). Bibcode:2020WRR....5624940G. doi:10.1029/2019WR024940. ISSN 0043-1397. S2CID 216201598.
  6. ^ Newman EL, Varley TF, Parakkattu VK, Sherrill SP, Beggs JM (July 2022). "Revealing the Dynamics of Neural Information Processing with Multivariate Information Decomposition". Entropy. 24 (7): 930. Bibcode:2022Entrp..24..930N. doi:10.3390/e24070930. PMC 9319160. PMID 35885153.
  7. ^ Luppi AI, Mediano PA, Rosas FE, Holland N, Fryer TD, O'Brien JT, et al. (June 2022). "A synergistic core for human brain evolution and cognition". Nature Neuroscience. 25 (6): 771–782. doi:10.1038/s41593-022-01070-0. PMC 7614771. PMID 35618951. S2CID 249096746.
  8. ^ Wibral M, Priesemann V, Kay JW, Lizier JT, Phillips WA (March 2017). "Partial information decomposition as a unified approach to the specification of neural goal functions". Brain and Cognition. Perspectives on Human Probabilistic Inferences and the 'Bayesian Brain'. 112: 25–38. arXiv:1510.00831. doi:10.1016/j.bandc.2015.09.004. PMID 26475739. S2CID 4394452.
  9. ^ Varley TF, Kaminski P (October 2022). "Untangling Synergistic Effects of Intersecting Social Identities with Partial Information Decomposition". Entropy. 24 (10): 1387. Bibcode:2022Entrp..24.1387V. doi:10.3390/e24101387. ISSN 1099-4300. PMC 9611752. PMID 37420406.
  10. ^ Tax TM, Mediano PA, Shanahan M (September 2017). "The Partial Information Decomposition of Generative Neural Network Models". Entropy. 19 (9): 474. Bibcode:2017Entrp..19..474T. doi:10.3390/e19090474. hdl:10044/1/50586. ISSN 1099-4300.
  11. ^ Mediano PA, Rosas FE, Luppi AI, Jensen HJ, Seth AK, Barrett AB, et al. (July 2022). "Greater than the parts: a review of the information decomposition approach to causal emergence". Philosophical Transactions. Series A, Mathematical, Physical, and Engineering Sciences. 380 (2227): 20210246. doi:10.1098/rsta.2021.0246. PMC 9125226. PMID 35599558.
  12. ^ Luppi AI, Mediano PA, Rosas FE, Harrison DJ, Carhart-Harris RL, Bor D, Stamatakis EA (2021). "What it is like to be a bit: an integrated information decomposition account of emergent mental phenomena". Neuroscience of Consciousness. 2021 (2): niab027. doi:10.1093/nc/niab027. PMC 8600547. PMID 34804593.