Abstract:
Support Vector Machines (SVMs) have been successfully applied to a large number of classification and regression problems. However, SVMs suffer from the catastrophic forgetting phenomenon, which results in the loss of previously learned information. Learn++ has recently been introduced as an incremental learning algorithm. The strength of Learn++ lies in its ability to learn new data without forgetting previously acquired knowledge and without requiring access to any of the previously seen data, even when the new data introduce new classes. To address the catastrophic forgetting problem and to add incremental learning capability to SVMs, we propose using an ensemble of SVMs trained with Learn++. Simulation results on real-world and benchmark datasets suggest that the proposed approach is promising.
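To make the idea concrete, the following is a minimal sketch, not the authors' implementation, of a Learn++-style ensemble of SVMs. It assumes scikit-learn's SVC as the base classifier and synthetic data from make_classification; the function names, loop structure, and parameter choices (e.g., n_classifiers, the RBF kernel) are illustrative assumptions rather than the exact procedure used in the paper.

```python
# Sketch of a Learn++-style ensemble of SVMs: each incoming data batch adds new
# weighted SVM hypotheses to the ensemble, and earlier SVMs are kept unchanged,
# so previously learned information is not discarded.
import numpy as np
from sklearn.svm import SVC
from sklearn.datasets import make_classification


def predict_ensemble(ensemble, votes, X):
    """Weighted majority voting across all SVMs accumulated so far."""
    classes = np.unique(np.concatenate([clf.classes_ for clf in ensemble]))
    scores = np.zeros((len(X), len(classes)))
    for clf, w in zip(ensemble, votes):
        preds = clf.predict(X)
        for j, c in enumerate(classes):
            scores[:, j] += w * (preds == c)
    return classes[np.argmax(scores, axis=1)]


def learnpp_train_batch(X, y, n_classifiers=5, rng=None):
    """Train several SVM hypotheses on one incoming data batch (Learn++-style)."""
    if rng is None:
        rng = np.random.default_rng(0)
    n = len(y)
    D = np.full(n, 1.0 / n)              # instance weight distribution
    ensemble, votes = [], []
    while len(ensemble) < n_classifiers:
        # Draw a training subset according to the current distribution.
        idx = rng.choice(n, size=n, replace=True, p=D)
        clf = SVC(kernel="rbf", gamma="scale").fit(X[idx], y[idx])
        # Weighted error of the new hypothesis on the full batch.
        err = np.sum(D[clf.predict(X) != y])
        if err > 0.5:                    # reject weak hypotheses and redraw
            continue
        beta = err / (1.0 - err + 1e-12)
        ensemble.append(clf)
        votes.append(np.log(1.0 / (beta + 1e-12)))
        # Composite hypothesis: weighted vote of all SVMs trained so far.
        H = predict_ensemble(ensemble, votes, X)
        comp_err = np.sum(D[H != y])
        if comp_err == 0:                # composite already perfect on this batch
            break
        B = comp_err / (1.0 - comp_err + 1e-12)
        # Reduce weights of correctly classified instances, then renormalize,
        # focusing subsequent SVMs on the still-difficult instances.
        D = D * np.where(H == y, B, 1.0)
        D = D / D.sum()
    return ensemble, votes


if __name__ == "__main__":
    X, y = make_classification(n_samples=600, n_features=10, random_state=0)
    # Present the data as two sequential batches to mimic incremental learning;
    # old SVMs are retained across batches, so nothing previously learned is lost.
    ensemble, votes = [], []
    for start in (0, 300):
        e, v = learnpp_train_batch(X[start:start + 300], y[start:start + 300])
        ensemble += e
        votes += v
    acc = np.mean(predict_ensemble(ensemble, votes, X) == y)
    print(f"ensemble accuracy on all seen data: {acc:.3f}")
```

The key design point illustrated above is that incremental learning is obtained at the ensemble level: each new batch contributes additional weighted SVMs rather than retraining or overwriting the existing ones, which is how the Learn++ framework avoids catastrophic forgetting without revisiting earlier data.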