Sparse Network

Requirements

  • PyTorch, Torchvision...

References

  1. How do neurons operate on sparse distributed representations? A mathematical theory of sparsity, neurons and active dendrites

  2. How Can We Be So Dense? The Benefits of Using Highly Sparse Representations

  3. K-Winner implementation (see the sketch after this list):
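Refs. 1 and 2 build sparse representations with a k-winners-take-all activation: after each layer, only the k most active units keep their values and the rest are zeroed. Below is a minimal PyTorch sketch of that idea; the class name `KWinners` is an assumption, and the duty-cycle boosting used in the papers is deliberately omitted.

```python
import torch
import torch.nn as nn

class KWinners(nn.Module):
    """k-winners-take-all: keep the k largest activations per sample,
    zero the rest. Minimal sketch -- no duty-cycle boosting (assumption)."""
    def __init__(self, k: int):
        super().__init__()
        self.k = k

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Indices of the k strongest units along the feature dimension.
        _, idx = torch.topk(x, self.k, dim=-1)
        mask = torch.zeros_like(x)
        mask.scatter_(-1, idx, 1.0)
        # Gradients flow only through the k surviving units.
        return x * mask

# Example: 2 samples, 8 units, keep the 3 strongest per sample.
out = KWinners(k=3)(torch.randn(2, 8))
print((out != 0).sum(dim=-1))  # tensor([3, 3])
```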

Miscellaneous

  1. Sparse Coding
  2. Sparse Distributed Representations
  3. ISTA Implementation
  4. Bayesian Bits
  5. Sparse AutoEncoder
  6. Sparse AutoEncoder examples
  7. Understanding PyTorch hooks

...

$ KL(\rho||\hat{\rho})_{Ber} = \rho \log\left(\dfrac{\rho}{\hat{\rho}}\right) + (1-\rho)\log\left(\dfrac{1-\rho}{1-\hat{\rho}}\right)$

Gradient with respect to $\hat{\rho}$

$ \dfrac{\partial KL(\rho||\hat{\rho})_{Ber}}{\partial \hat{\rho}} = \dfrac{1-\rho}{1-\hat{\rho}} - \dfrac{\rho}{\hat{\rho}} = \dfrac{\hat{\rho}-\rho}{\hat{\rho}\,(1-\hat{\rho})}$
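In a sparse autoencoder (miscellaneous items 5 and 6), this penalty is summed over hidden units, with $\hat{\rho}$ taken as the mean activation of each unit over a batch. A minimal PyTorch sketch follows, checking the closed-form gradient above against autograd; the function name `kl_bernoulli` is an assumption.

```python
import torch

def kl_bernoulli(rho: float, rho_hat: torch.Tensor) -> torch.Tensor:
    """Elementwise KL(rho || rho_hat) between Bernoulli distributions."""
    return (rho * torch.log(rho / rho_hat)
            + (1 - rho) * torch.log((1 - rho) / (1 - rho_hat)))

# rho_hat would normally be the batch mean of sigmoid activations,
# e.g. rho_hat = hidden.mean(dim=0); fixed values here for the check.
rho = 0.05
rho_hat = torch.tensor([0.10, 0.30], requires_grad=True)
kl_bernoulli(rho, rho_hat).sum().backward()

# Autograd agrees with the closed form derived above.
closed_form = (rho_hat - rho) / (rho_hat * (1 - rho_hat))
print(torch.allclose(rho_hat.grad, closed_form.detach()))  # True
```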

TODO

  • Blind Spot Convolution (see the sketch after this list)
    • Observe only the noisy context of a pixel
    • 'Efficient Blind-Spot Neural Network Architecture for Image Denoising' [Ref]
  • Learning Hybrid Sparsity Prior for Image Restoration
  • Sparse Linear
  • Important! Include a configuration option that sets the sparsity behaviour to either 'constant pruning' or 'gradual pruning'.
  • Notebook plotting the KL divergence and its gradient estimate
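On the blind-spot TODO item: the simplest way to observe only the noisy context of a pixel is a convolution whose kernel center is masked to zero. The sketch below takes that masking approach; the class name is an assumption, and this is not necessarily the architecture of the cited paper, which achieves the blind spot differently. Note also that stacking such masked layers lets the center pixel re-enter the receptive field, which is part of what a dedicated blind-spot architecture is designed to avoid.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class CenterMaskedConv2d(nn.Conv2d):
    """Conv2d whose kernel center is forced to zero, so the output at
    each location never sees the input pixel at that same location."""
    def __init__(self, in_ch, out_ch, kernel_size=3, **kwargs):
        super().__init__(in_ch, out_ch, kernel_size,
                         padding=kernel_size // 2, **kwargs)
        mask = torch.ones_like(self.weight)
        mask[:, :, kernel_size // 2, kernel_size // 2] = 0.0
        self.register_buffer("mask", mask)

    def forward(self, x):
        # Re-apply the mask on every forward pass so the center weight
        # stays zero even after gradient updates.
        return F.conv2d(x, self.weight * self.mask, self.bias,
                        self.stride, self.padding, self.dilation, self.groups)

conv = CenterMaskedConv2d(1, 8, kernel_size=3)
y = conv(torch.randn(1, 1, 32, 32))
print(y.shape)  # torch.Size([1, 8, 32, 32])
```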
