Advances in Information Retrieval

The field of Information Retrieval continues to move toward methods that query and retrieve information both efficiently and effectively. Recent work concentrates on improving the efficiency-effectiveness trade-off, with particular emphasis on sparse retrieval methods. Pre-trained language models and deep learning techniques show promise for raising retrieval accuracy while reducing the computational resources required. Another line of research applies natural language processing techniques to retrieval, including reinforcement learning and pretrained transformer models. There is also growing interest in new formulations such as rational retrieval acts, which leverage pragmatic reasoning to improve sparse retrieval. Noteworthy papers include:

  • Effective Inference-Free Retrieval for Learned Sparse Representations, which proposes an inference-free retrieval approach that yields state-of-the-art effectiveness in both in-domain and out-of-domain evaluation; a generic sketch of inference-free sparse scoring follows this list.
  • Rational Retrieval Acts: Leveraging Pragmatic Reasoning to Improve Sparse Retrieval, which adapts the Rational Speech Acts framework to improve sparse retrieval models and achieves state-of-the-art performance on out-of-domain datasets; an illustrative sketch of RSA-style re-weighting also appears below.
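
To give a flavour of the first item, the sketch below shows a generic form of inference-free scoring over learned sparse representations: document-side term weights are assumed to be produced offline by a document encoder, while the query side uses no model inference at all (uniform or precomputed term weights). This is a minimal illustration under those assumptions, not the paper's specific method; all identifiers and weights are made up.

```python
from collections import defaultdict

# Toy inverted index over learned sparse document representations:
# term -> list of (doc_id, learned_weight). The weights are assumed to come
# from an offline document-side encoder; nothing here is taken from the paper.
inverted_index = defaultdict(list)

def index_document(doc_id, term_weights):
    """Add one document's learned sparse representation to the index."""
    for term, weight in term_weights.items():
        inverted_index[term].append((doc_id, weight))

def score_query(query_terms, query_weights=None):
    """Inference-free scoring: a dot product between a simple query vector
    (uniform or precomputed weights such as IDF) and each document's learned
    term weights, accumulated over posting lists."""
    scores = defaultdict(float)
    for term in query_terms:
        q_w = query_weights.get(term, 1.0) if query_weights else 1.0
        for doc_id, d_w in inverted_index.get(term, []):
            scores[doc_id] += q_w * d_w
    return sorted(scores.items(), key=lambda item: item[1], reverse=True)

# Hypothetical documents with made-up learned weights.
index_document("d1", {"sparse": 1.8, "retrieval": 2.1, "index": 0.4})
index_document("d2", {"dense": 1.5, "retrieval": 1.2})
print(score_query(["sparse", "retrieval"]))  # d1 should rank above d2
```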

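For the second item, the sketch below is a minimal, generic version of iterated Rational Speech Acts reasoning applied to a document-term weight matrix, assuming documents play the role of meanings and terms the role of utterances. It is intended only to convey how pragmatic re-weighting can favour discriminative terms, not to reproduce the paper's exact formulation.

```python
import numpy as np

def rsa_reweight(weights, alpha=1.0, iterations=2, eps=1e-9):
    """Iterated RSA-style re-weighting of an (n_docs, n_terms) matrix of
    non-negative sparse term weights. Listener steps normalize over documents
    for each term; speaker steps normalize over terms for each document."""
    w = np.asarray(weights, dtype=float)
    listener = w / (w.sum(axis=0, keepdims=True) + eps)                  # literal listener
    speaker = w                                                          # fallback if iterations == 0
    for _ in range(iterations):
        speaker = listener ** alpha
        speaker = speaker / (speaker.sum(axis=1, keepdims=True) + eps)   # pragmatic speaker
        listener = speaker / (speaker.sum(axis=0, keepdims=True) + eps)  # pragmatic listener
    return speaker  # re-weighted term distribution per document

# Toy example: the term shared by both documents ends up down-weighted
# relative to terms that are discriminative for a single document.
doc_term = np.array([[2.0, 1.0, 0.0],
                     [0.0, 1.0, 2.0]])
print(rsa_reweight(doc_term))
```
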
Sources

Effective Inference-Free Retrieval for Learned Sparse Representations

Exploring new Approaches for Information Retrieval through Natural Language Processing

Running a Data Integration Lab in the Context of the EHRI Project: Challenges, Lessons Learnt and Future Directions

Rational Retrieval Acts: Leveraging Pragmatic Reasoning to Improve Sparse Retrieval

Artifact Sharing for Information Retrieval Research
