The field of natural language processing is moving towards more integrated and nuanced approaches to text analysis, combining multiple features and techniques to improve accuracy. This is evident in the development of models that jointly consider entities and discourse relations for coherence assessment, as well as the use of attention-based models for functional syntax analysis. These approaches enable more sophisticated and automated analysis of text, with applications in areas such as literary studies and news analytics. Notable papers include:
- A RoBERTa-Based Functional Syntax Annotation Model for Chinese Texts, which introduces a new method for automated Chinese functional syntax analysis.
- Joint Modeling of Entities and Discourse Relations for Coherence Assessment, which demonstrates the benefits of integrating both entity and discourse relation features for coherence evaluation.
- Modelling Intertextuality with N-gram Embeddings, which proposes a new quantitative model for analyzing intertextual relationships (a minimal n-gram similarity sketch follows this list).
- JEL: A Novel Model Linking Knowledge Graph entities to News Mentions, which presents an entity linking model that outperforms current state-of-the-art approaches.
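To make the n-gram idea concrete, here is a minimal sketch of quantifying intertextual overlap by comparing texts through their shared n-grams. It uses a TF-IDF character n-gram baseline with cosine similarity rather than trained n-gram embeddings, so it only illustrates the general approach; the sample texts, the choice of `ngram_range`, and the use of scikit-learn are illustrative assumptions, not details from the paper.

```python
# Minimal sketch: quantifying intertextual similarity from n-gram statistics.
# NOT the model from "Modelling Intertextuality with N-gram Embeddings";
# a simple TF-IDF baseline over character n-grams, shown only to illustrate
# the general idea of comparing texts through the n-grams they share.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

texts = {
    "text_a": "In the beginning was the word, and the word was with the author.",
    "text_b": "In the beginning was the deed; the author borrowed the word.",
    "text_c": "Completely unrelated sentence about stochastic gradient descent.",
}

# Character 3- to 5-grams capture sub-word reuse (quotation, paraphrase, shared idiom).
vectorizer = TfidfVectorizer(analyzer="char_wb", ngram_range=(3, 5))
matrix = vectorizer.fit_transform(texts.values())

# Pairwise cosine similarity: higher values suggest stronger intertextual overlap.
sims = cosine_similarity(matrix)
names = list(texts)
for i in range(len(names)):
    for j in range(i + 1, len(names)):
        print(f"{names[i]} vs {names[j]}: {sims[i, j]:.3f}")
```

A full intertextuality model would replace the TF-IDF vectors with learned n-gram embeddings, but the comparison step (scoring text pairs by vector similarity) follows the same pattern.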