Continual Learning with Neural Networks: An Introduction and Approaches
Sungmin Cha (New York University, USA)
■ Abstract
In this talk, I will briefly introduce the research field of continual learning, which studies how to train a neural network in scenarios where new datasets or tasks arrive continually, while overcoming catastrophic forgetting. First, I will present the necessity of continual learning research. Second, I will give a brief overview of the three main categories of continual learning algorithms. Third, I will introduce my recent research, which addresses an issue with the Batch Normalization layer in exemplar-based class-incremental learning. Finally, I will share future directions in continual learning research.
■ Bio
Sungmin Cha is a Faculty Fellow at New York University, working under the supervision of Prof. Kyunghyun Cho. Previously, he earned a Ph.D. in Electrical and Computer Engineering from Seoul National University (SNU), advised by Prof. Taesup Moon. Before that, he obtained a Master's degree in Information and Communication Engineering from DGIST and a Bachelor's degree in Computer Engineering from Pukyong National University. During his doctoral studies, he worked as a visiting researcher at Harvard University and as a research scientist intern at both NAVER AI Lab and the Fundamental Research Lab at LG AI Research. He has also received several awards, including the Qualcomm Innovation Fellowship Korea 2021, the Yulchon AI START Fellowship 2022, and the 2023 Distinguished Doctoral Dissertation Award from the Dept. of ECE, SNU.
Sungmin’s primary research goal is to develop more cost-efficient ways to train neural networks. Specifically, he is interested in developing continual learning algorithms that allow neural networks to learn a sequence of tasks continuously, mirroring the human learning process. He also has a keen interest in other topics, such as unsupervised image denoising and machine unlearning, and is continually exploring new research directions.