Gating Mechanism in Deep Neural Networks for Resource-Efficient Continual Learning
Catastrophic forgetting is the well-known tendency of a deep neural network in continual learning to forget previously acquired knowledge when optimizing for sequentially arriving tasks. To address this issue, several methods have been proposed in continual learning research. However, these methods cannot fully preserve the previously learned knowledge.
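The idea behind gating can be illustrated with a minimal sketch (an assumption for illustration, not the paper's exact method): per-task binary gates mask a shared hidden layer so each task activates only its own subset of units, limiting interference between tasks. The gate assignment below (first half of the units for task 0, second half for task 1) is a hypothetical choice made purely for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

HIDDEN = 8
W = rng.normal(size=(4, HIDDEN))  # shared input-to-hidden weights

# Hypothetical fixed gate assignment: disjoint masks mean the two tasks
# read out non-overlapping hidden units.
gates = {
    0: np.array([1] * 4 + [0] * 4, dtype=float),
    1: np.array([0] * 4 + [1] * 4, dtype=float),
}

def forward(x, task_id):
    """Gated hidden activation: units outside the task's gate are zeroed."""
    h = np.maximum(x @ W, 0.0)  # ReLU hidden layer
    return h * gates[task_id]

x = rng.normal(size=(1, 4))
h0 = forward(x, 0)  # task 0 sees only the first four units
h1 = forward(x, 1)  # task 1 sees only the last four units
```

Because the masks are disjoint, gradient updates for one task would leave the other task's active units untouched, which is one way such gating can trade a fixed resource budget for reduced forgetting.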