Advances in Word Sense Disambiguation and Semantic Understanding

The field of natural language processing is moving toward more fine-grained semantic understanding, with a focus on word sense disambiguation, contextualized embeddings, and the representation of nuanced meaning. Recent work emphasizes capturing subtle differences in word meaning, particularly in low-resource languages. New frameworks and datasets have enabled more accurate modeling of semantic relations, including those involving idiomatic and figurative language. Notable papers include QA-Noun, which introduces a question-answer-based framework for capturing noun-centered semantic relations; LANE, which proposes an adversarial training strategy built on lexical negative examples for word sense disambiguation; and ViConBERT, which learns Vietnamese contextualized embeddings by combining contrastive learning with gloss-based distillation.
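To make the contrastive context-gloss idea behind approaches like ViConBERT concrete, here is a minimal sketch of an in-batch InfoNCE loss that pulls each word-in-context embedding toward the gloss embedding of its gold sense while pushing it away from other glosses in the batch. This is a generic illustration of the technique, not the paper's actual training objective; the encoder, temperature, and batching details are assumptions.

```python
import numpy as np

def infonce_loss(context_vecs, gloss_vecs, temperature=0.07):
    """In-batch contrastive (InfoNCE) loss aligning word-in-context
    embeddings with the gloss embeddings of their gold senses.
    Row i of each matrix forms a positive pair; every other gloss
    in the batch serves as an in-batch negative."""
    # L2-normalize so dot products become cosine similarities
    c = context_vecs / np.linalg.norm(context_vecs, axis=1, keepdims=True)
    g = gloss_vecs / np.linalg.norm(gloss_vecs, axis=1, keepdims=True)
    logits = c @ g.T / temperature  # (batch, batch) similarity matrix
    # Row-wise log-softmax; the diagonal holds the positive pairs
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_probs))

rng = np.random.default_rng(0)
batch, dim = 4, 8
gloss = rng.normal(size=(batch, dim))
# Contexts near their gold glosses should incur a lower loss than random ones
aligned = gloss + 0.05 * rng.normal(size=(batch, dim))
random_ctx = rng.normal(size=(batch, dim))
assert infonce_loss(aligned, gloss) < infonce_loss(random_ctx, gloss)
```

In a full system the context and gloss vectors would come from a fine-tuned encoder, and gloss-based distillation would add a term matching the student's gloss representations to a teacher model's; the loss above covers only the contrastive alignment component.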

Sources

Adverbs Revisited: Enhancing WordNet Coverage of Adverbs with a Supersense Taxonomy

LANE: Lexical Adversarial Negative Examples for Word Sense Disambiguation

ViConBERT: Context-Gloss Aligned Vietnamese Word Embedding for Polysemous and Sense-Aware Representations

QA-Noun: Representing Nominal Semantics via Natural Language Question-Answer Pairs

NLP Datasets for Idiom and Figurative Language Tasks

Anatomy of an Idiom: Tracing Non-Compositionality in Language Models

Integrating Symbolic Natural Language Understanding and Language Models for Word Sense Disambiguation
