Python packages for information theory

Preface

Two Python packages whose functions can compute mutual information, conditional mutual information, co-information, dual total correlation, residual entropy, and related measures, followed by pointers to other information-theory research articles.

dit

This Python package for discrete information theory provides measures for both the bivariate and the multivariate case:

  1. Basic Shannon measure of mutual information for bivariate distributions

  2. Measures for multivariate distributions

    • Co-Information: quantifies the amount of information that all variables participate in
    • Total Correlation: the amount of information each individual variable carries above and beyond the joint entropy
    • Dual Total Correlation: also known as the binding information; the amount of information shared among the variables
    • Cohesion: a family of measures spanning from the total correlation to the dual total correlation
    • CAEKL Mutual Information:
      Generalized as the smallest quantity that can be subtracted from the joint entropy, and from the entropy of each part of a partition of the variables, such that the joint entropy minus this quantity equals the sum of each partition entropy minus this quantity.
    • Interaction Information: equal in magnitude to the co-information, but takes the opposite sign for an odd number of variables
    • DeWeese-like Measures:
      based on the principle that a local modification of a single variable cannot increase the amount of correlation or dependence it has with the other variables.
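In dit, these measures are exposed under `dit.multivariate` (e.g. `coinformation`, `total_correlation`, `dual_total_correlation`). As a package-free sanity check on the definitions above, here is a minimal sketch that computes three of them directly from a joint pmf; the XOR distribution and all helper names are illustrative, not part of dit's API:

```python
from itertools import combinations, product
from math import log2

def entropy(pmf):
    """Shannon entropy in bits of a {outcome: probability} dict."""
    return -sum(p * log2(p) for p in pmf.values() if p > 0)

def marginal(joint, idxs):
    """Marginalize a joint pmf over tuple-valued outcomes onto the given indices."""
    out = {}
    for outcome, p in joint.items():
        key = tuple(outcome[i] for i in idxs)
        out[key] = out.get(key, 0.0) + p
    return out

# XOR distribution: X and Y are uniform bits, Z = X xor Y
joint = {(x, y, x ^ y): 0.25 for x, y in product((0, 1), repeat=2)}
n = 3

H_joint = entropy(joint)
H_single = [entropy(marginal(joint, (i,))) for i in range(n)]
H_rest = [entropy(marginal(joint, tuple(j for j in range(n) if j != i)))
          for i in range(n)]

# Total correlation: sum of marginal entropies minus the joint entropy
total_correlation = sum(H_single) - H_joint            # 1 bit here

# Dual total correlation (binding information): joint entropy minus
# the sum of the residual entropies H(X_i | rest)
dual_total_correlation = H_joint - sum(H_joint - h for h in H_rest)  # 2 bits

# Co-information: inclusion-exclusion over the entropies of all
# non-empty subsets of the variables
co_information = sum((-1) ** (len(s) + 1) * entropy(marginal(joint, s))
                     for k in range(1, n + 1)
                     for s in combinations(range(n), k))  # -1 bit: pure synergy

print(total_correlation, dual_total_correlation, co_information)
```

The negative co-information of the XOR distribution is the classic example of synergy: no single variable tells you anything about another, yet any two jointly determine the third.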

pyitlib

A Python library of information-theoretic methods.

Below are the mutual-information measures found in the pyitlib package:

  • Mutual information
  • Normalised mutual information (7 variants)
  • Variation of information
  • Lautum information
  • Conditional mutual information
  • Co-information
  • Interaction information
  • Multi-information
  • Binding information
  • Residual entropy
  • Exogenous local information
  • Enigmatic information
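pyitlib exposes these estimators through its `discrete_random_variable` module (typically imported as `drv`). To make the first few measures concrete without requiring the package, here is a small plug-in estimate from paired samples; all function names below are my own, not pyitlib's:

```python
from collections import Counter
from math import log2

def entropy(samples):
    """Empirical (plug-in) Shannon entropy in bits of a sequence."""
    n = len(samples)
    return -sum(c / n * log2(c / n) for c in Counter(samples).values())

def mutual_information(xs, ys):
    """I(X;Y) = H(X) + H(Y) - H(X,Y), from empirical counts."""
    return entropy(xs) + entropy(ys) - entropy(list(zip(xs, ys)))

def variation_of_information(xs, ys):
    """VI(X;Y) = H(X,Y) - I(X;Y) = H(X|Y) + H(Y|X); a true metric."""
    return entropy(list(zip(xs, ys))) - mutual_information(xs, ys)

xs = [0, 0, 1, 1]
print(mutual_information(xs, xs))            # identical variables: 1 bit
print(variation_of_information(xs, xs))      # zero distance
print(mutual_information(xs, [0, 1, 0, 1]))  # independent variables: 0 bits
```

Plug-in estimates like this are biased for small samples; pyitlib's value is largely in offering corrected estimators and the normalised variants listed above.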

Other measures from the research community

  1. Part mutual information: a new information-theoretic measure that accurately quantifies nonlinear direct associations between measured variables. For details, see "Part mutual information for quantifying direct associations".
  2. Mutual information via recursive adaptive partitioning: this paper focuses on mutual information between discrete variables with many categories, estimated using recursive adaptive partitioning.
  3. Comparative redundancy calculations: a comparative study of existing redundancy calculations alongside a new bivariate redundancy measure. See "A Bivariate Measure of Redundant Information".
  4. Synergistic mutual information: briefly explains how a single PI-region is either redundant, unique, or synergistic. Research paper: "Quantifying synergistic mutual information".
  5. Partial Information Decomposition: the redundancy measure proposed by Williams and Beer, which introduces partial-information atoms (PI-atoms) to decompose multivariate mutual information into non-negative terms. Refer to "Nonnegative Decomposition of Multivariate Information".
  6. Absolute mutual information: calculated using algorithmic (Kolmogorov) complexity.
  7. Pairwise adjusted mutual information
  8. Partial correlation: however, it can only measure linear direct associations.
  9. Conditional mutual information: quantifies nonlinear direct relationships among variables, ideally for more than two variables.
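For item 8, partial correlation can be sketched in a few lines by correlating the residuals left after linearly regressing the conditioning variable out of both sides. The chain example below (X and Y are associated only through Z) is illustrative and uses only NumPy; the function name is my own:

```python
import numpy as np

def partial_correlation(x, y, z):
    """Correlation of x and y after linearly regressing z out of both."""
    design = np.column_stack([z, np.ones_like(z)])  # slope + intercept
    rx = x - design @ np.linalg.lstsq(design, x, rcond=None)[0]
    ry = y - design @ np.linalg.lstsq(design, y, rcond=None)[0]
    return np.corrcoef(rx, ry)[0, 1]

rng = np.random.default_rng(0)
z = rng.normal(size=10_000)
x = z + rng.normal(size=10_000)  # X depends on Z
y = z + rng.normal(size=10_000)  # Y depends on Z, not directly on X

raw = np.corrcoef(x, y)[0, 1]           # clearly nonzero: indirect association
partial = partial_correlation(x, y, z)  # near zero: no direct association
print(raw, partial)
```

Because the regression step is linear, a purely nonlinear direct link (e.g. y depending on x squared) can slip past this measure unchanged, which is exactly the limitation that motivates conditional mutual information in item 9.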

Reposted from: https://datascience.stackexchange.com/questions/97775/a-measure-of-redundancy-in-mutual-information
