jaber-jaber/jnet


JNET is my own ML library built from scratch.

Dependencies

  1. NumPy
  2. typing (Python standard library)

Features

  1. Gradient descent (steepest descent) optimization
  2. Extensive type annotations throughout the Python code
  3. Tanh activation layer
  4. Mean-squared error loss
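The core pieces above (tanh activation, MSE loss, a gradient-descent update) can be sketched as follows. This is a minimal illustration with NumPy, not JNET's actual API; the class and function names here are assumptions, not the library's own.

```python
import numpy as np


class Tanh:
    """Tanh activation layer: forward pass plus backprop (illustrative, not JNET's API)."""

    def forward(self, x: np.ndarray) -> np.ndarray:
        self.out = np.tanh(x)
        return self.out

    def backward(self, grad: np.ndarray) -> np.ndarray:
        # d/dx tanh(x) = 1 - tanh(x)^2
        return grad * (1.0 - self.out ** 2)


def mse_loss(pred: np.ndarray, target: np.ndarray):
    """Mean-squared error and its gradient with respect to pred."""
    diff = pred - target
    return np.mean(diff ** 2), 2.0 * diff / diff.size


# One gradient-descent step on a single weight matrix.
rng = np.random.default_rng(0)
W = rng.normal(size=(3, 2))
x = rng.normal(size=(4, 3))
y = np.zeros((4, 2))

act = Tanh()
pred = act.forward(x @ W)
loss, dloss = mse_loss(pred, y)
dW = x.T @ act.backward(dloss)  # chain rule back to the weights
W -= 0.1 * dW                   # gradient-descent update
```

After the update, re-running the forward pass should yield a lower loss, which is the whole point of the descent step.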

TODO:

  1. Cross-entropy or another loss function
  2. Implement a custom Tensor class (currently NumPy arrays serve as tensors)
  3. Add support for ReLU, sigmoid, or leaky ReLU activation layers
  4. Research optimization strategies other than SGD
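TODO items 1 and 3 could look roughly like the sketch below, assuming layers expose the same forward/backward interface as the existing tanh layer. These names and signatures are hypothetical, not code that exists in JNET.

```python
import numpy as np


class ReLU:
    """Possible ReLU layer matching a forward/backward interface (TODO item 3)."""

    def forward(self, x: np.ndarray) -> np.ndarray:
        self.mask = x > 0
        return x * self.mask

    def backward(self, grad: np.ndarray) -> np.ndarray:
        # Gradient passes through only where the input was positive.
        return grad * self.mask


def cross_entropy(pred: np.ndarray, target: np.ndarray, eps: float = 1e-12):
    """Possible cross-entropy loss for one-hot targets (TODO item 1)."""
    pred = np.clip(pred, eps, 1.0)  # avoid log(0)
    loss = -np.mean(np.sum(target * np.log(pred), axis=1))
    grad = -target / (pred * pred.shape[0])
    return loss, grad
```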
