
Learning optimal inter-class margin adaptively for few-shot class-incremental learning via neural collapse-based meta-learning

2024-03-22


Author(s): Ran, H (Ran, Hang); Li, WJ (Li, Weijun); Li, LS (Li, Lusi); Tian, SS (Tian, Songsong); Ning, X (Ning, Xin); Tiwari, P (Tiwari, Prayag)

Source: INFORMATION PROCESSING & MANAGEMENT, Volume: 61, Issue: 3, Article Number: 103664, DOI: 10.1016/j.ipm.2024.103664, Published: MAY 2024

Abstract: Few-Shot Class-Incremental Learning (FSCIL) aims to learn new classes incrementally with a limited number of samples per class. It faces issues of forgetting previously learned classes and overfitting on few-shot classes. An efficient strategy is to learn features that are discriminative in both base and incremental sessions. Current methods improve discriminability by manually designing inter-class margins based on empirical observations, which can be suboptimal. The emerging Neural Collapse (NC) theory provides a theoretically optimal inter-class margin for classification, serving as a basis for adaptively computing the margin. Yet, it is designed for closed, balanced data, not for sequential or few-shot imbalanced data. To address this gap, we propose a meta-learning- and NC-based FSCIL method, MetaNC-FSCIL, to compute the optimal margin adaptively and maintain it at each incremental session. Specifically, we first compute the theoretically optimal margin based on the NC theory. Then we introduce a novel loss function to ensure that the loss value is minimized precisely when the inter-class margin reaches its theoretical optimum. Motivated by the intuition that "learn how to preserve the margin" matches the meta-learning goal of "learn how to learn", we embed the loss function in base-session meta-training to preserve the margin for future meta-testing sessions. Experimental results demonstrate the effectiveness of MetaNC-FSCIL, achieving superior performance on multiple datasets. The code is available at https://github.com/qihangran/metaNC-FSCIL.
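The "theoretically optimal inter-class margin" from Neural Collapse theory refers to the simplex equiangular tight frame (ETF), in which the cosine similarity between any two class prototypes equals -1/(K-1) for K classes. As a minimal NumPy sketch of that construction (illustrative only; not taken from the authors' released code, and the function names here are hypothetical):

```python
import numpy as np

def simplex_etf(num_classes, dim, seed=0):
    """Build a simplex equiangular tight frame (ETF): the Neural
    Collapse prediction for optimal class prototypes. Each column is a
    unit-norm prototype, and the cosine similarity between any two
    distinct prototypes is exactly -1/(K-1)."""
    K = num_classes
    assert dim >= K, "ambient dimension must be at least num_classes"
    rng = np.random.default_rng(seed)
    # Random matrix with orthonormal columns U (dim x K), via QR.
    U, _ = np.linalg.qr(rng.standard_normal((dim, K)))
    # Standard ETF formula: M = sqrt(K/(K-1)) * U (I - (1/K) 11^T)
    M = np.sqrt(K / (K - 1)) * U @ (np.eye(K) - np.ones((K, K)) / K)
    return M

K, d = 10, 64
M = simplex_etf(K, d)
cos = M.T @ M  # Gram matrix of prototype cosines
off_diag = cos[~np.eye(K, dtype=bool)]
print(np.allclose(np.diag(cos), 1.0))        # unit-norm prototypes
print(np.allclose(off_diag, -1.0 / (K - 1))) # NC-optimal margin
```

A margin-preserving loss of the kind the abstract describes would then penalize deviation of the learned prototypes' pairwise cosines from this -1/(K-1) target, reaching its minimum exactly at the ETF configuration.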

Accession Number: WOS:001170976700001

ISSN: 0306-4573

eISSN: 1873-5371



