Information Theory and Computing Advances

Research in information theory and computing is converging on a more nuanced treatment of complex systems and their interactions. New frameworks and models aim to capture the intricacies of information processing, belief representation, and computational performance. A key direction is the integration of information theory with neighboring disciplines, such as graph theory and combinatorial design, to produce more efficient methods for analyzing and optimizing complex systems. Notable papers in this area include:

  • A new approach to measuring semantic information based on the unit circle, which resolves the Bar-Hillel-Carnap paradox (the classical result that a contradiction carries maximal semantic information).
  • A graph-theoretic model of belief systems that distinguishes between credibility and confidence, enabling a richer classification of epistemic states (a toy illustration of this distinction follows the list).
  • A novel computing performance unit grounded in information theory, which measures the mutual information between a system's inputs and outputs (a minimal estimation sketch also appears after this list).

Together, these advances could deepen our understanding of complex systems and improve the performance of computational models.
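The credibility/confidence distinction from the belief-model paper can be made concrete with a small, purely illustrative data structure. The sketch below is not the paper's construction: it assumes each proposition node carries a credibility score drawn from its sources and derives a confidence score from how strongly it is supported within the graph, with all names and the scoring rule chosen only for illustration.

```python
from dataclasses import dataclass, field

# Illustrative sketch only: a directed "support" graph over propositions,
# where credibility comes from external sources and confidence is derived
# from structural support within the graph. The scoring rule is an assumption,
# not the construction used in the paper.

@dataclass
class Belief:
    statement: str
    credibility: float  # assumed: trust in the proposition's sources, in [0, 1]
    supported_by: list = field(default_factory=list)  # keys of supporting propositions

class BeliefGraph:
    def __init__(self):
        self.nodes: dict[str, Belief] = {}

    def add(self, key: str, statement: str, credibility: float, supported_by=()):
        self.nodes[key] = Belief(statement, credibility, list(supported_by))

    def confidence(self, key: str) -> float:
        """Toy rule: confidence grows with the credibility of supporting beliefs."""
        belief = self.nodes[key]
        supports = [self.nodes[k].credibility for k in belief.supported_by if k in self.nodes]
        structural = sum(supports) / len(supports) if supports else 0.0
        # Blend the node's own credibility with its structural support (arbitrary weights).
        return 0.5 * belief.credibility + 0.5 * structural

# Example: a claim can be only moderately credible yet strongly supported.
g = BeliefGraph()
g.add("a", "The sensor is calibrated", credibility=0.9)
g.add("b", "The reading is accurate", credibility=0.4, supported_by=["a"])
print(round(g.confidence("b"), 2))  # 0.65
```

The point of the example is only that the two quantities can diverge, which is what makes a richer classification of epistemic states possible.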
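For the proposed computing performance unit, the quantity involved is the standard Shannon mutual information I(X; Y) between inputs and outputs. A minimal plug-in estimate from paired discrete samples might look like the following; the variable names and test data are assumptions for illustration, not the paper's benchmark procedure.

```python
from collections import Counter
from math import log2

def mutual_information(inputs, outputs) -> float:
    """Plug-in estimate of I(X; Y) in bits from paired discrete samples."""
    n = len(inputs)
    joint = Counter(zip(inputs, outputs))
    px = Counter(inputs)
    py = Counter(outputs)
    mi = 0.0
    for (x, y), c in joint.items():
        p_xy = c / n
        # p_xy * log2( p_xy / (p_x * p_y) ), with counts cancelled into one expression
        mi += p_xy * log2(p_xy * n * n / (px[x] * py[y]))
    return mi

# Example: an identity map transmits 1 bit per uniform binary input symbol,
# while a constant output transmits 0 bits.
xs = [0, 1, 0, 1, 0, 1, 0, 1]
print(round(mutual_information(xs, xs), 3))             # 1.0
print(round(mutual_information(xs, [0] * len(xs)), 3))  # 0.0
```

Under such a framework, a system whose outputs are statistically independent of its inputs would score zero, however many operations it performs.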

Sources

Towards a Measure Theory of Semantic Information

Toward a Graph-Theoretic Model of Belief: Confidence, Credibility, and Structural Coherence

On the entropy growth of sums of iid discrete random variables

Communication-Efficient Distributed Computing Through Combinatorial Multi-Access Models

Multivariate Partial Information Decomposition: Constructions, Inconsistencies, and Alternative Measures

Back to Bits: Extending Shannon's communication performance framework to computing
