“Loss of plasticity in deep continual learning”
Dohare, S., Hernandez-Garcia, J.F., Lan, Q. et al. Nature 632, 768–774 (2024).
One of the biggest differences between the brain and deep neural networks (DNNs) is probably that synaptic strengths in the brain change constantly, whereas DNN weights are typically frozen after training. How to make machines learn continually, as the brain does, is a long-standing question.
In this journal club, we will read a paper that asks two questions about continual learning: Does the classical backpropagation algorithm still work in a continual-learning setting? And how can backpropagation be modified to learn continually?
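For orientation before the discussion: the paper's proposed remedy, "continual backpropagation", augments ordinary gradient descent with a generate-and-test step that occasionally reinitializes a small fraction of the least-useful hidden units, so the network retains plasticity under a nonstationary data stream. The NumPy sketch below is a minimal, hedged illustration of that idea, not the paper's exact algorithm; the network size, `lr`, `replace_rate`, `decay`, the utility formula, and the drifting-target toy task are all our own illustrative choices.

```python
import numpy as np

# Sketch of continual backpropagation: plain backprop on a one-hidden-layer
# ReLU network, plus occasional reinitialization of low-utility hidden units.
rng = np.random.default_rng(0)
n_in, n_hidden, n_out = 10, 64, 1
lr, replace_rate, decay = 0.01, 1e-4, 0.99   # illustrative hyperparameters

W1 = rng.normal(0, 1 / np.sqrt(n_in), (n_hidden, n_in))
b1 = np.zeros(n_hidden)
W2 = rng.normal(0, 1 / np.sqrt(n_hidden), (n_out, n_hidden))
utility = np.zeros(n_hidden)   # running estimate of each hidden unit's usefulness
to_replace = 0.0               # fractional units accumulated for replacement

def step(x, y):
    global W1, b1, W2, to_replace
    # Forward pass.
    h = np.maximum(0, W1 @ x + b1)
    err = W2 @ h - y
    # Backward pass (ordinary backprop, squared error).
    dh = (W2.T @ err) * (h > 0)
    W2 -= lr * np.outer(err, h)
    W1 -= lr * np.outer(dh, x)
    b1 -= lr * dh
    # Simplified utility: running average of |activation| x |outgoing weight|.
    utility[:] = decay * utility + (1 - decay) * np.abs(h) * np.abs(W2).sum(0)
    # Generate-and-test: reinitialize the lowest-utility unit at a slow rate.
    to_replace += replace_rate * n_hidden
    while to_replace >= 1.0:
        j = int(np.argmin(utility))
        W1[j] = rng.normal(0, 1 / np.sqrt(n_in), n_in)  # fresh input weights
        b1[j] = 0.0
        W2[:, j] = 0.0             # zero outgoing weights: swap is non-disruptive
        utility[j] = np.median(utility)  # crude stand-in for the paper's maturity threshold
        to_replace -= 1.0

# Toy usage: regress a slowly drifting target to mimic a nonstationary task.
for t in range(10_000):
    x = rng.normal(size=n_in)
    y = np.array([np.sin(x[0] + t / 2000)])
    step(x, y)
```

Zeroing the outgoing weights of a replaced unit means the reset cannot immediately disturb the network's output; the new unit only gains influence as training updates its outgoing weights. The paper reports its own utility measure and a maturity threshold for new units, which this sketch only approximates.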
The AI & NS Journal Club discusses computational and theoretical neuroscience publications. There is no official presenter each time; instead, we take turns presenting and discussing the paper, which requires every attendee to read it in advance. Go here to add to and/or view the list of potential papers for future discussion.
For inquiries, contact Kaining Zhang or Zeyuan Ye.