Efficient Named Entity Recognition and Entity Disambiguation

The field of Natural Language Processing (NLP) is moving toward more efficient and accurate methods for Named Entity Recognition (NER) and Entity Disambiguation (ED). Researchers are exploring ways to reduce the computational cost and training time of state-of-the-art models such as BERT while maintaining their high accuracy. One direction integrates external knowledge sources, such as Knowledge Graphs (KGs), to improve the performance of Large Language Models (LLMs) in zero-shot ED. Another develops efficient NER architectures and decoding schemes, from positional attention over pre-trained representations to logits-constrained decoding that rules out invalid label sequences.

Noteworthy papers include Positional Attention for Efficient BERT-Based Named Entity Recognition, which proposes a cost-efficient NER approach that reuses pre-trained parameters, and Knowledge Graphs for Enhancing Large Language Models in Entity Disambiguation, which leverages KGs to enhance LLMs for zero-shot ED. Related work applies a logits-constrained RoBERTa framework to ancient Chinese NER and evaluates LLMs on long-tail entity linking in historical documents. Minimal sketches of these ideas follow.
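
To make the KG-integration idea concrete, the sketch below retrieves candidate entities together with their KG descriptions and assembles a zero-shot disambiguation prompt for an LLM. The fetch_candidates helper, the hard-coded Wikidata-style candidates, and the prompt wording are illustrative assumptions, not the pipeline of the cited paper.

```python
# Minimal sketch of KG-augmented prompting for zero-shot entity disambiguation.
# `fetch_candidates` stands in for a real KG lookup (e.g. a Wikidata search
# endpoint); the candidates and QIDs below are hard-coded for illustration.
from dataclasses import dataclass

@dataclass
class Candidate:
    qid: str          # KG identifier, e.g. a Wikidata QID
    label: str        # canonical entity name
    description: str  # short KG description used as disambiguation context

def fetch_candidates(mention: str) -> list:
    # Placeholder: in practice, query a KG search API for `mention`.
    return [
        Candidate("Q90", "Paris", "capital and largest city of France"),
        Candidate("Q_TOY", "Paris", "city in Texas, United States"),
    ]

def build_prompt(mention: str, context: str, candidates: list) -> str:
    lines = [
        f'Disambiguate the mention "{mention}" in the sentence:',
        f'"{context}"',
        "Candidates from the knowledge graph:",
    ]
    for i, c in enumerate(candidates, 1):
        lines.append(f"{i}. {c.label} ({c.qid}): {c.description}")
    lines.append("Answer with the identifier of the correct candidate.")
    return "\n".join(lines)

mention, context = "Paris", "The treaty was signed in Paris in 1951."
prompt = build_prompt(mention, context, fetch_candidates(mention))
print(prompt)  # this prompt would then be sent to an LLM, zero-shot
```

The LLM's reply (a candidate identifier) is mapped back to the KG entry; because the KG descriptions supply all task context, no task-specific fine-tuning is needed.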
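
The efficiency direction can be sketched in the same spirit: freeze a pre-trained BERT encoder and train only a small positional-attention head for token classification. The head design, label count, and hyperparameters below are assumptions for illustration, not the architecture proposed in the paper.

```python
# Minimal sketch: frozen BERT encoder + small trainable positional-attention
# head for token-level NER. Head design and sizes are illustrative assumptions.
import torch
import torch.nn as nn
from transformers import AutoModel, AutoTokenizer

class PositionalAttentionNER(nn.Module):
    def __init__(self, model_name="bert-base-cased", num_labels=9, max_len=512):
        super().__init__()
        self.encoder = AutoModel.from_pretrained(model_name)
        for p in self.encoder.parameters():
            p.requires_grad = False  # reuse pre-trained parameters as-is
        hidden = self.encoder.config.hidden_size
        # Learned per-position queries attend over the frozen token states.
        self.pos_queries = nn.Parameter(torch.randn(max_len, hidden) * 0.02)
        self.attn = nn.MultiheadAttention(hidden, num_heads=8, batch_first=True)
        self.classifier = nn.Linear(hidden, num_labels)

    def forward(self, input_ids, attention_mask):
        with torch.no_grad():  # encoder is frozen, so skip its gradients
            states = self.encoder(input_ids, attention_mask=attention_mask).last_hidden_state
        q = self.pos_queries[: states.size(1)].unsqueeze(0).expand(states.size(0), -1, -1)
        ctx, _ = self.attn(q, states, states, key_padding_mask=~attention_mask.bool())
        return self.classifier(ctx)  # (batch, seq_len, num_labels) token logits

tokenizer = AutoTokenizer.from_pretrained("bert-base-cased")
model = PositionalAttentionNER()
batch = tokenizer(["Angela Merkel visited Paris."], return_tensors="pt")
print(model(batch["input_ids"], batch["attention_mask"]).shape)
```

Because gradients flow only through the head, training updates a few million parameters rather than the full encoder, which is where the cost savings would come from.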
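
Logits-constrained decoding is easiest to see on BIO tags: before each greedy argmax, mask the logits of labels that would form an invalid transition (for example, I-PER directly after O). The label set and the greedy strategy below are simplifying assumptions rather than the exact framework of the cited paper.

```python
# Minimal sketch of logits-constrained decoding for BIO tagging: mask labels
# that would create an invalid transition before each greedy argmax.
import numpy as np

LABELS = ["O", "B-PER", "I-PER", "B-LOC", "I-LOC"]

def valid_transition(prev: str, curr: str) -> bool:
    # I-X is only valid right after B-X or I-X of the same entity type.
    if curr.startswith("I-"):
        return prev.endswith(curr[2:]) and prev != "O"
    return True  # O and B-X may follow anything

def constrained_decode(logits: np.ndarray) -> list:
    """Greedy decode over (seq_len, num_labels) scores with masking."""
    tags, prev = [], "O"
    for step in logits:
        scores = step.copy()
        for j, label in enumerate(LABELS):
            if not valid_transition(prev, label):
                scores[j] = -np.inf  # this label can never be chosen here
        prev = LABELS[int(np.argmax(scores))]
        tags.append(prev)
    return tags

rng = np.random.default_rng(0)
print(constrained_decode(rng.normal(size=(6, len(LABELS)))))  # always valid BIO
```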

Sources

Positional Attention for Efficient BERT-Based Named Entity Recognition

Knowledge Graphs for Enhancing Large Language Models in Entity Disambiguation

Logits-Constrained Framework with RoBERTa for Ancient Chinese NER

Evaluation of LLMs on Long-tail Entity Linking in Historical Documents
