
Knowledge Distillation for Image Classification

This repository includes the official implementations of the following papers:

  • NKD and tf-NKD: Rethinking Knowledge Distillation via Cross-Entropy

  • ViTKD: Practical Guidelines for ViT Feature Knowledge Distillation

It also provides unofficial implementations of several other distillation methods; see the distillation folder for the full list.

If this repository is helpful, please give it a star ⭐ and cite the relevant papers.
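To give a feel for the two methods, below is a minimal PyTorch sketch of the NKD loss as the paper presents it: the teacher's target probability weights the student's target log-probability, while the non-target classes are re-normalized into distributions of their own before a temperature-scaled matching term. The default values for gamma and temp are placeholders, and the sketch is a reading of the paper rather than this repo's exact code (see the distillation folder for that).

import torch
import torch.nn.functional as F

def nkd_loss(logit_s, logit_t, target, gamma=1.5, temp=1.0):
    """Minimal NKD sketch; gamma/temp values are placeholders."""
    N, _ = logit_s.shape
    label = target.view(N, 1)

    # Target term: the teacher's target probability weights the
    # student's target log-probability (plain softmax, no temperature).
    s_log = F.log_softmax(logit_s, dim=1)
    t_prob = F.softmax(logit_t, dim=1)
    s_t = torch.gather(s_log, 1, label)
    t_t = torch.gather(t_prob, 1, label).detach()
    loss_target = -(t_t * s_t).mean()

    # Non-target term: drop the target column, then apply a temperature
    # softmax so the remaining classes sum to 1 on their own for both
    # student and teacher before matching them.
    mask = torch.ones_like(logit_s).scatter_(1, label, 0).bool()
    s_non = F.log_softmax(logit_s[mask].reshape(N, -1) / temp, dim=1)
    t_non = F.softmax(logit_t[mask].reshape(N, -1) / temp, dim=1)
    loss_non = -gamma * (temp ** 2) * (t_non * s_non).sum(dim=1).mean()

    return loss_target + loss_non

ViTKD's guidelines split by depth: shallow ViT layers are best distilled by directly mimicking the teacher's tokens through a linear projection, while deep layers are better served by a generation objective that reconstructs masked teacher features. The sketch below covers only the shallow-layer mimicking part; the widths (384/768) are illustrative and the generation branch is omitted.

import torch.nn as nn
import torch.nn.functional as F

class ShallowMimic(nn.Module):
    """Sketch of ViTKD's shallow-layer 'mimicking': project student
    patch tokens to the teacher's width and match them with MSE."""
    def __init__(self, dim_s=384, dim_t=768):  # illustrative widths
        super().__init__()
        self.align = nn.Linear(dim_s, dim_t)

    def forward(self, feat_s, feat_t):
        # feat_s: (B, N, dim_s) student tokens from a shallow block
        # feat_t: (B, N, dim_t) teacher tokens from the same depth
        return F.mse_loss(self.align(feat_s), feat_t.detach())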

Install

  • Prepare the dataset in data/imagenet.
  • Set up the environment:

    pip install -r requirements.txt
    pip install torch==1.8.1+cu111 torchvision==0.9.1+cu111 torchaudio==0.8.1 -f https://download.pytorch.org/whl/torch_stable.html

  • This repo uses mmcls 0.23.2. If you want to use a lower mmcls version for distillation, you can refer to MGD to see how to change the code.
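As an optional sanity check (not part of the repo's docs, just reading version strings), you can confirm that the pinned versions resolved:

# Optional sanity check: confirm the pinned versions were picked up.
import torch
import torchvision
import mmcls

print(torch.__version__)        # expect 1.8.1+cu111
print(torchvision.__version__)  # expect 0.9.1+cu111
print(mmcls.__version__)        # expect 0.23.2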

Run

  • Please refer to nkd.md and vitkd.md to train the student and get the weights.
  • You can modify the configs to choose different distillation methods and teacher-student pairs (a hypothetical config sketch follows this list).
  • The implementation details of the different methods can be found in the distillation folder.
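For orientation, the snippet below sketches what choosing a method and pair in a config might look like. mmcls configs are plain Python, but every field name, path, and value here is hypothetical; check the actual files in this repo's configs folder for the real schema.

# Hypothetical config sketch; all names below are illustrative only.
_base_ = ['../resnet/resnet18_8xb32_in1k.py']        # student baseline

distiller = dict(
    type='NKD',                                      # distillation method
    teacher_cfg='../resnet/resnet34_8xb32_in1k.py',  # teacher model
    teacher_ckpt='path/to/teacher_weights.pth',      # placeholder path
    loss_weight=1.0,
)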

Citing NKD and tf-NKD

@article{yang2022rethinking,
  title={Rethinking Knowledge Distillation via Cross-Entropy},
  author={Yang, Zhendong and Li, Zhe and Gong, Yuan and Zhang, Tianke and Lao, Shanshan and Yuan, Chun and Li, Yu},
  journal={arXiv preprint arXiv:2208.10139},
  year={2022}
}

Citing ViTKD

@article{yang2022vitkd,
  title={ViTKD: Practical Guidelines for ViT feature knowledge distillation},
  author={Yang, Zhendong and Li, Zhe and Zeng, Ailing and Li, Zexian and Yuan, Chun and Li, Yu},
  journal={arXiv preprint arXiv:2209.02432},
  year={2022}
}

Acknowledgement

Our code is based on the MMClassification project.
