Few-shot class incremental learning via robust transformer approach
Author
Naeem Paeedeh (1); Mahardhika Pratama (2); Dr.Eng. Ir. Sunu Wibirama, S.T., M.Eng., IPM. (3); Wolfgang Mayer (4); Zehong Cao (5); Ryszard Kowalczyk (6)
Date
22 2024
Keywords
Abstract
Few-Shot Class-Incremental Learning (FSCIL) extends the Class-Incremental Learning (CIL) problem: a model must cope with data scarcity while also addressing the Catastrophic Forgetting (CF) problem. The problem remains open because recent works are built upon Convolutional Neural Networks (CNNs), which perform sub-optimally compared to transformer approaches. Our paper presents the Robust Transformer Approach (ROBUSTA), built upon the Compact Convolutional Transformer (CCT). Overfitting due to few samples is overcome with a stochastic classifier, in which the classifier's weights are sampled from a distribution parameterized by mean and variance vectors, thus increasing the likelihood of correct classification, together with a batch-norm layer that stabilizes the training process. CF is addressed with delta parameters, small task-specific trainable parameters, while the backbone network is kept frozen. A non-parametric approach is developed to infer which delta parameters to use for the model's predictions. A prototype rectification approach is applied to avoid biased prototype calculations caused by data scarcity. The advantage of ROBUSTA is demonstrated through a series of experiments on benchmark problems, where it outperforms prior art by large margins without any data augmentation protocols.
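The stochastic classifier mentioned in the abstract can be illustrated with a minimal NumPy sketch. This is not the authors' implementation; it only shows the general idea under stated assumptions: per-class weight means and log-variances, weights drawn via the reparameterization trick, and cosine-similarity scores averaged over several draws. All names (`stochastic_classify`, `w_mean`, `w_logvar`) are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def stochastic_classify(features, w_mean, w_logvar, n_samples=20):
    """Sketch of a stochastic classifier: class weights are sampled
    from a Gaussian with per-class mean and variance vectors, and the
    cosine-similarity scores are averaged over the sampled weights.

    features: (d,) feature vector, e.g. from a frozen backbone
    w_mean, w_logvar: (n_classes, d) distribution parameters
    """
    scores = np.zeros(w_mean.shape[0])
    f_norm = features / np.linalg.norm(features)
    for _ in range(n_samples):
        eps = rng.standard_normal(w_mean.shape)
        # reparameterization: w = mu + sigma * eps
        w = w_mean + np.exp(0.5 * w_logvar) * eps
        w_norm = w / np.linalg.norm(w, axis=1, keepdims=True)
        scores += w_norm @ f_norm
    return scores / n_samples

# toy usage: 3 classes, 8-dim features; the query feature points
# along class 1's mean direction, so class 1 should score highest
mu = rng.standard_normal((3, 8))
logvar = np.full((3, 8), -2.0)
f = mu[1] * 5.0
scores = stochastic_classify(f, mu, logvar)
print(int(np.argmax(scores)))
```

Sampling the weights rather than using a single point estimate acts as a regularizer when only a few examples per class are available, which is the data-scarcity setting FSCIL targets.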
Field of Study
Electrical Engineering
Original Language
English
Level
International
Status
Work Documents
No
Title
Document Type
Action
1
Few-shot class incremental learning via robust transformer approach.pdf
[PAK] Full Document