Continual Learning in Large Language Models

Research on large language models (LLMs) is increasingly focused on catastrophic forgetting in continual learning: the tendency of a model to lose previously acquired capabilities when it is adapted to new tasks and domains. To mitigate this, researchers are exploring strategies such as model growth, parameter-efficient fine-tuning, and forgetting-aware pruning, all of which aim to retain prior capabilities while learning new ones; a brief illustrative sketch of the parameter-efficient approach appears after the list below. Noteworthy papers in this area include:

Mitigating Catastrophic Forgetting in Continual Learning through Model Growth, which explores using model growth to structure the training of larger models.

Forward-Only Continual Learning, which proposes a forward-only, gradient-free continual learning method.

LAMDAS: LLM as an Implicit Classifier for Domain-specific Data Selection, which selects domain-specific training data by using the pre-trained LLM itself as an implicit classifier.

Mitigating Catastrophic Forgetting in Large Language Models with Forgetting-aware Pruning, which proposes a pruning-based approach that balances catastrophic forgetting against downstream task performance.
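The techniques above are named only at a high level in this digest. As a minimal, generic sketch of the parameter-efficient fine-tuning idea (not the method of any specific paper listed here), the PyTorch snippet below wraps a frozen pretrained linear layer with a trainable low-rank LoRA-style update; the class name LoRALinear and the hyperparameters r and alpha are illustrative assumptions:

```python
import torch
import torch.nn as nn

class LoRALinear(nn.Module):
    """Generic LoRA-style adapter sketch (illustrative, not from the papers above).

    Computes y = W x + (alpha / r) * B A x, where the pretrained weight W is
    frozen and only the low-rank factors A and B are trained. Because W never
    changes, the original capabilities are fully recoverable by dropping the
    adapter, which is the sense in which such methods limit forgetting.
    """

    def __init__(self, base: nn.Linear, r: int = 8, alpha: float = 16.0):
        super().__init__()
        self.base = base
        for p in self.base.parameters():
            p.requires_grad = False  # freeze the pretrained weights
        self.A = nn.Parameter(torch.randn(r, base.in_features) * 0.01)
        self.B = nn.Parameter(torch.zeros(base.out_features, r))  # zero init: adapter output starts at zero
        self.scale = alpha / r

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.base(x) + self.scale * ((x @ self.A.T) @ self.B.T)

# Usage: adapt one projection; only A and B receive gradients.
base = nn.Linear(768, 768)
adapted = LoRALinear(base, r=8, alpha=16.0)
y = adapted(torch.randn(2, 768))  # same shape as base(x)
```

Zero-initializing B makes the adapted layer exactly equal to the pretrained layer at the start of fine-tuning, so training departs from the original model only gradually.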

Sources

Mitigating Catastrophic Forgetting in Continual Learning through Model Growth

DaMoC: Efficiently Selecting the Optimal Large Language Model for Fine-tuning Domain Tasks Based on Data and Model Compression

Can Smaller LLMs do better? Unlocking Cross-Domain Potential through Parameter-Efficient Fine-Tuning for Text Summarization

Forward-Only Continual Learning

How Instruction-Tuning Imparts Length Control: A Cross-Lingual Mechanistic Analysis

SelfAug: Mitigating Catastrophic Forgetting in Retrieval-Augmented Generation via Distribution Self-Alignment

Delta Activations: A Representation for Finetuned Large Language Models

Accelerate Scaling of LLM Alignment via Quantifying the Coverage and Depth of Instruction Set

LAMDAS: LLM as an Implicit Classifier for Domain-specific Data Selection

Mitigating Catastrophic Forgetting in Large Language Models with Forgetting-aware Pruning
