A prompt regularization approach to enhance few-shot class-incremental learning with Two-Stage Classifier
2025 (English). In: Neural Networks, ISSN 0893-6080, E-ISSN 1879-2782, Vol. 188, p. 1-11, article id 107453. Article in journal (Refereed). In press.
Abstract [en]
With a limited number of labeled samples, Few-Shot Class-Incremental Learning (FSCIL) seeks to efficiently train and update models without forgetting previously learned tasks. Because pre-trained models learn extensive feature representations from large existing datasets, they offer strong knowledge foundations and transferability, which makes them useful in both few-shot and incremental learning scenarios. Additionally, Prompt Learning improves the performance of pre-trained deep learning models on downstream tasks, particularly in large-scale language or vision models. In this paper, we propose a novel Prompt Regularization (PrRe) approach that maximizes the fusion of prompts by embedding two different prompts, the Task Prompt and the Global Prompt, inside a pre-trained Vision Transformer (ViT). In the classification phase, we propose a Two-Stage Classifier (TSC), using a K-Nearest Neighbors classifier for the base session and a Prototype Classifier for the incremental sessions, integrated with a global self-attention module. Through experiments on multiple benchmarks, we demonstrate the effectiveness and superiority of our method. The code is available at https://github.com/gyzzzzzzzz/PrRe. © 2025 Elsevier Ltd
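As a rough illustration of the Two-Stage Classifier described in the abstract, the minimal Python sketch below pairs a K-Nearest Neighbors classifier over base-session features with a nearest-prototype classifier for classes added in incremental sessions. The class name, the distance-based routing rule between the two stages, and the omission of the global self-attention module are all assumptions made here for illustration; this is not the authors' implementation (see the linked repository for that).

    import numpy as np

    # Hypothetical sketch of a two-stage classifier in the spirit of the
    # abstract: KNN over stored base-session features, plus one prototype
    # (class-mean feature) per class learned in incremental sessions.
    # Feature extraction (e.g. from a prompted ViT) is assumed to happen
    # upstream; this class operates on precomputed feature vectors.
    class TwoStageClassifier:
        def __init__(self, k=5):
            self.k = k
            self.base_feats = None   # (N, D) base-session features
            self.base_labels = None  # (N,) integer labels
            self.prototypes = {}     # class id -> mean feature vector

        def fit_base(self, feats, labels):
            # Stage 1: memorize base-session features for KNN.
            self.base_feats = feats
            self.base_labels = labels

        def add_incremental(self, feats, labels):
            # Stage 2: one prototype (class mean) per novel class.
            for c in np.unique(labels):
                self.prototypes[int(c)] = feats[labels == c].mean(axis=0)

        def predict(self, feat):
            # KNN vote over base classes, plus the nearest base distance.
            d_base = np.linalg.norm(self.base_feats - feat, axis=1)
            nn = np.argsort(d_base)[: self.k]
            knn_label = int(np.bincount(self.base_labels[nn]).argmax())
            knn_dist = d_base[nn].min()

            # Assumed routing rule: a query goes to whichever stage holds
            # the closer match in feature space.
            if self.prototypes:
                protos = np.stack(list(self.prototypes.values()))
                d_proto = np.linalg.norm(protos - feat, axis=1)
                j = int(d_proto.argmin())
                if d_proto[j] < knn_dist:
                    return list(self.prototypes.keys())[j]
            return knn_label

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        clf = TwoStageClassifier(k=3)
        # Base session: 3 classes, 20 samples each, 8-dim features.
        clf.fit_base(rng.normal(size=(60, 8)), np.repeat(np.arange(3), 20))
        # Incremental session: one novel class (id 3) with few shots.
        clf.add_incremental(rng.normal(loc=3.0, size=(10, 8)), np.full(10, 3))
        print(clf.predict(rng.normal(loc=3.0, size=8)))  # likely class 3

The nearest-distance routing between the two stages is purely an assumption for the sketch; the paper's actual session handling and its global self-attention module are not reproduced here.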
Place, publisher, year, edition, pages
Oxford: Elsevier, 2025. Vol. 188, p. 1-11, article id 107453
Keywords [en]
Few-shot class-incremental learning, Global self-attention module, Prompt regularization, Two-Stage Classifier
National Category
Natural Language Processing; Signal Processing
Identifiers
URN: urn:nbn:se:hh:diva-55930
DOI: 10.1016/j.neunet.2025.107453
ISI: 001469409400001
Scopus ID: 2-s2.0-105002288715
OAI: oai:DiVA.org:hh-55930
DiVA, id: diva2:1955438
Available from: 2025-04-30 Created: 2025-04-30 Last updated: 2025-04-30 Bibliographically approved