# Selective Class Attention Distillation (SCAD)

Official repository of the paper "An Attention-based Representation Distillation Baseline for Multi-Label Continual Learning" by Martin Menabue, Emanuele Frascaroli, Matteo Boschini, Lorenzo Bonicelli, Angelo Porrello, and Simone Calderara, accepted at LOD 2024, the 10th International Conference on Machine Learning, Optimization, and Data Science.