Open Access

ARTICLE

Support Vector–Guided Class-Incremental Learning: Discriminative Replay with Dual-Alignment Distillation

Moyi Zhang, Yixin Wang*, Yu Cheng

College of Computer and Artificial Intelligence, Lanzhou University of Technology, Lanzhou, 730050, China

* Corresponding Author: Yixin Wang. Email: email

Computers, Materials & Continua 2026, 86(3), 88. https://doi.org/10.32604/cmc.2025.071021

Abstract

Modern intelligent systems, such as autonomous vehicles and face recognition, must continuously adapt to new scenarios while preserving their ability to handle previously encountered situations. However, when neural networks learn new classes sequentially, they suffer from catastrophic forgetting—the tendency to lose knowledge of earlier classes. This challenge, which lies at the core of class-incremental learning, severely limits the deployment of continual learning systems in real-world applications with streaming data. Existing approaches, including rehearsal-based methods and knowledge distillation techniques, have attempted to address this issue but often struggle to preserve decision boundaries and discriminative features under limited memory constraints. To overcome these limitations, we propose a support vector–guided framework for class-incremental learning. The framework integrates an enhanced feature extractor with a Support Vector Machine classifier, which generates boundary-critical support vectors to guide both replay and distillation. Building on this architecture, we design a joint feature retention strategy that combines boundary proximity with feature diversity, and a Support Vector Distillation Loss that enforces dual alignment in the decision and semantic spaces. In addition, triple attention modules are incorporated into the feature extractor to enhance representation power. Extensive experiments on CIFAR-100 and Tiny-ImageNet demonstrate consistent improvements. On CIFAR-100 and Tiny-ImageNet with 5 tasks, our method achieves 71.68% and 58.61% average accuracy, outperforming strong baselines by 3.34% and 2.05%. These gains hold across different task splits, highlighting the robustness and generalization of the proposed approach. Beyond benchmark evaluations, the framework also shows potential in few-shot and resource-constrained applications such as edge computing and mobile robotics.
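To illustrate the kind of retention strategy the abstract describes—selecting replay exemplars that are close to the decision boundary while remaining mutually diverse—the following is a minimal sketch, not the authors' implementation. It uses toy Gaussian features in place of the paper's feature extractor, a linear SVM as the boundary model, and a hypothetical greedy score that trades off distance-to-boundary against distance-to-already-selected samples; all names and the scoring formula are illustrative assumptions.

```python
import numpy as np
from sklearn.svm import LinearSVC

rng = np.random.default_rng(0)
# Toy 16-D features for two "old" classes (stand-ins for extractor outputs)
X = np.vstack([rng.normal(-2, 1, (100, 16)), rng.normal(2, 1, (100, 16))])
y = np.array([0] * 100 + [1] * 100)

# Fit a linear SVM; small |decision value| means close to the boundary
svm = LinearSVC(C=1.0).fit(X, y)
margin = np.abs(svm.decision_function(X))

# Greedy selection under a fixed memory budget:
# prefer near-boundary samples, but penalize redundancy by weighting
# with the distance to the nearest already-selected exemplar.
budget = 10
selected = []
for _ in range(budget):
    if selected:
        d = np.min(np.linalg.norm(X[:, None] - X[selected][None], axis=-1), axis=1)
    else:
        d = np.ones(len(X))          # first pick: diversity term is neutral
    score = d / (1.0 + margin)       # high diversity, low margin -> high score
    score[selected] = -np.inf        # never re-pick an exemplar
    selected.append(int(np.argmax(score)))

print(f"kept {len(selected)} exemplars, mean |margin| = {margin[selected].mean():.3f}")
```

In a class-incremental pipeline, a buffer like `selected` would be filled per class after each task and replayed alongside new-task data; the paper's full method additionally distills against the support vectors, which this sketch omits.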

Keywords

Class-incremental learning; catastrophic forgetting; support vector machine; knowledge distillation

Cite This Article

APA Style
Zhang, M., Wang, Y., Cheng, Y. (2026). Support Vector–Guided Class-Incremental Learning: Discriminative Replay with Dual-Alignment Distillation. Computers, Materials & Continua, 86(3), 88. https://doi.org/10.32604/cmc.2025.071021
Vancouver Style
Zhang M, Wang Y, Cheng Y. Support Vector–Guided Class-Incremental Learning: Discriminative Replay with Dual-Alignment Distillation. Comput Mater Contin. 2026;86(3):88. https://doi.org/10.32604/cmc.2025.071021
IEEE Style
M. Zhang, Y. Wang, and Y. Cheng, “Support Vector–Guided Class-Incremental Learning: Discriminative Replay with Dual-Alignment Distillation,” Comput. Mater. Contin., vol. 86, no. 3, pp. 88, 2026. https://doi.org/10.32604/cmc.2025.071021



Copyright © 2026 The Author(s). Published by Tech Science Press.
This work is licensed under a Creative Commons Attribution 4.0 International License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.