# Quantization Schemes For Training Neural Networks

This work shows how to train neural networks with quantized weights directly via backpropagation. The code quantizes both weights and activations, and runs LeNet-300 on MNIST and ResNet-18 on CIFAR-10. More details are in the accompanying PDF.
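A common way to train with quantized weights via backpropagation is the straight-through estimator (STE): the forward pass uses rounded weights, while the backward pass treats rounding as identity so gradients update a full-precision copy. The sketch below illustrates the idea with a uniform symmetric quantizer; the function name `quantize_weights` and the exact rounding scheme are assumptions for illustration and may differ from this repo's implementation (see the PDF for details).

```python
import numpy as np

def quantize_weights(w, bits):
    """Uniform symmetric quantization of `w` to `bits` bits.

    Hypothetical helper sketching the idea; the repo's exact
    scheme may differ.
    """
    levels = 2 ** (bits - 1) - 1      # e.g. 7 positive levels for 4 bits
    max_abs = np.max(np.abs(w))
    if max_abs == 0:
        return w.copy()
    scale = max_abs / levels
    # Snap each weight to the nearest representable level,
    # then map back to floating point.
    return np.round(w / scale) * scale

# With an STE, the forward pass would use quantize_weights(W)
# while the backward pass updates the full-precision W as if
# the rounding step were the identity function.
```

Because rounding has zero gradient almost everywhere, the STE trick is what makes direct backpropagation through the quantizer workable in practice.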

## Run Experiment

- `--wbits` specifies the number of bits for weights
- `--abits` specifies the number of bits for activations

```shell
python Trainer.py --wbits 4 --abits 4
```
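Activation quantization (controlled by `--abits`) typically differs from weight quantization: activations after ReLU are non-negative, so an unsigned quantizer over a clipped range is common. The helper below is a hypothetical sketch of that pattern, not necessarily this repo's exact method.

```python
import numpy as np

def quantize_act(a, bits):
    """Unsigned uniform quantization of activations clipped to [0, 1].

    Hypothetical illustration of `--abits`-style activation
    quantization; the repo's scheme may differ.
    """
    levels = 2 ** bits - 1            # e.g. 15 levels for 4 bits
    a = np.clip(a, 0.0, 1.0)          # bound the dynamic range first
    # Round to the nearest of the `levels + 1` representable values.
    return np.round(a * levels) / levels
```

Clipping before rounding keeps the quantization grid fixed across batches, which is why many schemes bound activations to a known range rather than rescaling per input.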

## Results