Gradients without Backpropagation

This repository contains an unofficial implementation of the paper Gradients without Backpropagation.

Description

To better understand the mathematical concepts involved, we recommend starting with the notebook prerequisites.ipynb.
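
As a taste of the ideas involved, the sketch below shows forward-mode differentiation with dual numbers (see references 4, 6, and 8 below). It is a standalone toy written for this README under the assumption that the prerequisites cover dual numbers as those references suggest, not code taken from prerequisites.ipynb.

```python
# Toy forward-mode differentiation with dual numbers (illustrative only).
from dataclasses import dataclass

@dataclass
class Dual:
    val: float   # function value
    eps: float   # derivative carried alongside the value

    def __add__(self, other):
        return Dual(self.val + other.val, self.eps + other.eps)

    def __mul__(self, other):
        # Product rule: (uv)' = u'v + uv'
        return Dual(self.val * other.val,
                    self.eps * other.val + self.val * other.eps)

def f(x):
    # f(x) = x*x + x, so f'(x) = 2x + 1
    return x * x + x

x = Dual(3.0, 1.0)   # seed the derivative of x w.r.t. itself with 1
print(f(x))          # Dual(val=12.0, eps=7.0) -> f(3) = 12, f'(3) = 7
```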

Once you are comfortable with those concepts, we recommend proceeding with fwdgrad.ipynb: in this notebook, the Forward Gradient Descent (FGD) algorithm is implemented on some well-known test functions and on a convolutional network trained on the MNIST dataset. The notebook also compares this against training the same model with SGD and backpropagation.
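
For orientation, here is a minimal sketch of forward gradient descent on a toy function, assuming PyTorch >= 2.0 and its torch.func.jvp API; the function and variable names are illustrative and not taken from fwdgrad.ipynb.

```python
import torch
from torch.func import jvp

def loss_fn(x):
    # Toy objective: f(x) = sum(x^2), with minimum at x = 0.
    return (x ** 2).sum()

x = torch.randn(10)
lr = 0.01

for step in range(500):
    # Sample a random tangent direction v ~ N(0, I).
    v = torch.randn_like(x)
    # A single forward pass returns f(x) together with the directional
    # derivative (grad f(x) . v) as a Jacobian-vector product; no backward pass.
    loss, dir_deriv = jvp(loss_fn, (x,), (v,))
    # The forward gradient g = (grad f(x) . v) v is an unbiased estimate of grad f(x).
    x = x - lr * dir_deriv * v

print(loss.item())  # should be close to 0
```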

Results

Compared with the results reported in the paper, no improvement in execution time was observed for the forward gradient descent algorithm. Although the losses are roughly the same, backpropagation still turns out to be faster in execution than the forward gradient in this implementation.
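
One rough way to run such a runtime comparison on the toy function above is sketched below; this is an illustrative benchmark under the same PyTorch >= 2.0 assumption, not the measurement code from the notebooks.

```python
import time
import torch
from torch.func import grad, jvp

def loss_fn(x):
    return (x ** 2).sum()

x = torch.randn(100_000)
steps = 1_000

# Forward gradient: one JVP per step, no backward pass.
t0 = time.perf_counter()
for _ in range(steps):
    v = torch.randn_like(x)
    _, dir_deriv = jvp(loss_fn, (x,), (v,))
    g_fwd = dir_deriv * v
t_fwd = time.perf_counter() - t0

# Backpropagation: one reverse-mode gradient per step.
t0 = time.perf_counter()
for _ in range(steps):
    g_bwd = grad(loss_fn)(x)
t_bwd = time.perf_counter() - t0

print(f"forward gradient: {t_fwd:.3f} s, backpropagation: {t_bwd:.3f} s")
```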

Installation

Implemented in Python 3 using PyTorch. Required packages: Matplotlib, NumPy, and PyTorch.

References

All the references can also be found inside the Python notebooks:

  1. Derivative.
  2. Kofi Asiedu Brempong, 2020. I Finally Understood Backpropagation: And you can too....
  3. Atılım Güneş Baydin, Barak A. Pearlmutter, Don Syme, Frank Wood, Philip Torr, 2022. Gradients without Backpropagation.
  4. Dual number.
  5. Mark Saroufim, 2019. Automatic Differentiation Step by Step.
  6. Robert Lange, 2019. Forward Mode Automatic Differentiation & Dual Numbers.
  7. Atılım Güneş Baydin, Barak A. Pearlmutter, Alexey Andreyevich Radul, Jeffrey Mark Siskind, 2018. Automatic Differentiation in Machine Learning: a Survey.
  8. Daniel Worrall, 2021. Dual Numbers.
  9. Robert Kübler, 2022. Papers Simplified: Gradients without Backpropagation.

License

This project is licensed under the MIT License.

Authors

Copyright (c) 2023 Luigi Gallo
