The field of information theory is moving toward the development of capacity-achieving codes and a deeper understanding of channel behavior. Researchers are exploring new frameworks and tools to analyze and optimize the performance of coding schemes such as sparse superposition codes, systematic Bernoulli generator matrix codes, and Reed-Muller codes. The focus is on deriving analytic capacity formulas and asymptotically optimal input distributions, and on designing low-complexity receiver structures that approach capacity. There is also growing interest in the properties of binary symmetric channels, binary erasure channels, and other binary memoryless symmetric channels (their standard capacity formulas are sketched after the paper list below). Noteworthy papers in this area include:
- The paper on bridging Bayesian asymptotics and information theory to analyze the asymptotic Shannon capacity of large-scale MIMO channels, which presents a unifying framework and derives an analytic capacity formula.
- The paper on spatially coupled VAMP decoder for sparse superposition codes, which demonstrates capacity-achieving performance and outperforms previous decoders.
- The paper on systematic Bernoulli generator matrix codes, which proves that these codes are capacity-achieving over binary-input output-symmetric channels in terms of frame-error rate (a minimal encoding sketch follows this list).
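
As background for the channels named in the opening paragraph, the sketch below evaluates the standard analytic capacity formulas for the binary symmetric channel (BSC) and the binary erasure channel (BEC). It is a minimal illustration and is not taken from any of the cited papers.

```python
# Standard capacity formulas for two binary memoryless symmetric channels:
#   BSC with crossover probability p:  C = 1 - H2(p)  bits per channel use
#   BEC with erasure probability eps:  C = 1 - eps    bits per channel use
import math

def binary_entropy(p: float) -> float:
    """Binary entropy H2(p) in bits; H2(0) = H2(1) = 0 by convention."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1.0 - p) * math.log2(1.0 - p)

def bsc_capacity(crossover_p: float) -> float:
    """Capacity of a BSC: C = 1 - H2(p)."""
    return 1.0 - binary_entropy(crossover_p)

def bec_capacity(erasure_eps: float) -> float:
    """Capacity of a BEC: C = 1 - eps."""
    return 1.0 - erasure_eps

if __name__ == "__main__":
    print(f"BSC(p=0.11) capacity ~ {bsc_capacity(0.11):.3f} bits/use")
    print(f"BEC(eps=0.5) capacity = {bec_capacity(0.5):.3f} bits/use")
```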
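
To make the "systematic Bernoulli generator matrix" construction concrete, here is a minimal encoding sketch, assuming a systematic generator matrix G = [I_k | P] whose parity part P has i.i.d. Bernoulli(1/2) entries; the function names and the toy parameters (k = 4, n = 8) are illustrative assumptions, not code from the paper.

```python
# Illustrative sketch: encoding with a systematic Bernoulli generator matrix
# G = [I_k | P], where P is an i.i.d. Bernoulli(1/2) matrix over GF(2).
# Names and parameters are assumptions for illustration only.
import numpy as np

def sample_systematic_bernoulli_G(k: int, n: int, rng: np.random.Generator) -> np.ndarray:
    """Draw G = [I_k | P] with P an i.i.d. Bernoulli(1/2) matrix of shape (k, n - k)."""
    P = rng.integers(0, 2, size=(k, n - k), dtype=np.uint8)
    return np.concatenate([np.eye(k, dtype=np.uint8), P], axis=1)

def encode(u: np.ndarray, G: np.ndarray) -> np.ndarray:
    """Encode a length-k message u over GF(2): codeword x = u @ G (mod 2)."""
    return (u @ G) % 2

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    k, n = 4, 8                      # toy rate-1/2 code
    G = sample_systematic_bernoulli_G(k, n, rng)
    u = rng.integers(0, 2, size=k, dtype=np.uint8)
    x = encode(u, G)
    # Systematic property: the first k code bits reproduce the message.
    assert np.array_equal(x[:k], u)
    print("message :", u)
    print("codeword:", x)
```

The systematic form means the message appears verbatim in the codeword, while the random Bernoulli parity part supplies the redundancy analyzed in the paper's frame-error-rate result.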