Learning optimal inter-class margin adaptively for few-shot class-incremental learning via neural collapse-based meta-learning
2024 (English). In: Information Processing & Management, ISSN 0306-4573, E-ISSN 1873-5371, Vol. 61, no 3, article id 103664. Article in journal (Refereed). Published.
Abstract [en]
Few-Shot Class-Incremental Learning (FSCIL) aims to learn new classes incrementally from a limited number of samples per class. It faces two issues: forgetting previously learned classes and overfitting on the few-shot classes. An effective strategy is to learn features that are discriminative in both the base and the incremental sessions. Current methods improve discriminability by manually designing inter-class margins based on empirical observations, which can be suboptimal. The emerging Neural Collapse (NC) theory provides a theoretically optimal inter-class margin for classification, serving as a basis for computing the margin adaptively. However, NC is derived for closed, balanced data, not for sequential or few-shot imbalanced data. To address this gap, we propose a meta-learning- and NC-based FSCIL method, MetaNC-FSCIL, which computes the optimal margin adaptively and maintains it at each incremental session. Specifically, we first compute the theoretically optimal margin based on NC theory. We then introduce a novel loss function that is minimized precisely when the inter-class margin reaches its theoretical optimum. Motivated by the intuition that “learning how to preserve the margin” matches meta-learning's goal of “learning how to learn”, we embed the loss function in base-session meta-training to preserve the margin in future meta-testing sessions. Experimental results demonstrate the effectiveness of MetaNC-FSCIL, which achieves superior performance on multiple datasets. The code is available at https://github.com/qihangran/metaNC-FSCIL. © 2024 The Author(s)
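The abstract references the NC-optimal margin without giving its construction. As a minimal sketch (not the authors' released implementation): under Neural Collapse, the class prototypes form a simplex equiangular tight frame (ETF), whose pairwise cosine of −1/(K−1) is the maximal attainable inter-class separation for K classes. The helper names simplex_etf and etf_alignment_loss below are illustrative assumptions, as is the cosine-alignment form of the loss; the paper's actual loss and meta-training loop may differ.

```python
import torch
import torch.nn.functional as F

def simplex_etf(num_classes: int, feat_dim: int) -> torch.Tensor:
    """Build a simplex equiangular tight frame (ETF): num_classes unit-norm
    prototypes in R^feat_dim whose pairwise cosine is -1/(K-1), the
    NC-optimal (maximal) inter-class margin. Requires feat_dim >= num_classes.
    """
    K, d = num_classes, feat_dim
    # Random partial orthogonal matrix U (d x K): U^T U = I_K.
    U, _ = torch.linalg.qr(torch.randn(d, K))
    # M = sqrt(K/(K-1)) * U @ (I_K - (1/K) 1 1^T); columns are the prototypes.
    M = (K / (K - 1)) ** 0.5 * U @ (torch.eye(K) - torch.ones(K, K) / K)
    return M  # shape (d, K), unit-norm columns

def etf_alignment_loss(features: torch.Tensor, labels: torch.Tensor,
                       M: torch.Tensor) -> torch.Tensor:
    """Illustrative margin-preserving loss: pull each normalized feature onto
    its fixed ETF prototype. The loss is zero exactly when features reach
    the ETF geometry, i.e., the theoretically optimal inter-class margin.
    """
    feats = F.normalize(features, dim=1)   # (B, d), unit norm
    protos = M[:, labels].T                # (B, d), target prototypes
    return (1.0 - (feats * protos).sum(dim=1)).mean()

# Usage sketch: fix the ETF for all (base + incremental) classes up front,
# then add this loss to the base-session meta-training objective.
M = simplex_etf(num_classes=100, feat_dim=512)
feats = torch.randn(8, 512)                # stand-in backbone features
labels = torch.randint(0, 100, (8,))
loss = etf_alignment_loss(feats, labels, M)
```

Fixing the classifier as a non-learnable ETF spanning both base and future classes is a common design choice in NC-inspired FSCIL work, since it keeps the target geometry, and hence the margin, constant across incremental sessions.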
Place, publisher, year, edition, pages
London: Elsevier, 2024. Vol. 61, no 3, article id 103664
Keywords [en]
Few-shot class-incremental learning, Meta-learning, Neural collapse
National Category
Computer and Information Sciences
Identifiers
URN: urn:nbn:se:hh:diva-52738
DOI: 10.1016/j.ipm.2024.103664
ISI: 001170976700001
Scopus ID: 2-s2.0-85183769285
OAI: oai:DiVA.org:hh-52738
DiVA, id: diva2:1840416
Note
This work is supported by the National Natural Science Foundation of China (No. 62373343) and the Beijing Natural Science Foundation, China (No. L233036).
Available from: 2024-02-23. Created: 2024-02-23. Last updated: 2024-06-27. Bibliographically approved.