JNET is my own ML library built from scratch.

Features:
- NumPy as the numerical backend
- Extensive type hints throughout the Python code
- Steepest gradient descent (SGD) optimization
- Tanh activation layer
- Mean-squared error (MSE) loss
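To illustrate how those pieces fit together, here is a minimal sketch in plain NumPy of a single tanh layer trained with MSE loss and gradient descent. The variable names and structure are hypothetical and do not reflect JNET's actual API:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression data: the target is a tanh of a known linear map,
# so the model below can fit it exactly in principle.
X = rng.standard_normal((64, 2))
y = np.tanh(X @ np.array([[1.0], [-1.0]]))

W = rng.standard_normal((2, 1)) * 0.1   # layer weights
b = np.zeros((1, 1))                    # layer bias
lr = 0.1                                # learning rate

for _ in range(2000):
    z = X @ W + b                       # linear layer
    a = np.tanh(z)                      # tanh activation
    loss = np.mean((a - y) ** 2)        # mean-squared error
    # Backward pass: dL/da, through tanh' = 1 - tanh^2, then the linear layer.
    grad_a = 2 * (a - y) / len(X)
    grad_z = grad_a * (1 - a ** 2)
    W -= lr * (X.T @ grad_z)
    b -= lr * grad_z.sum(axis=0, keepdims=True)

print(f"final MSE: {loss:.6f}")
```

The backward pass hand-derives the gradients; a library would instead route them through each layer's `backward` method.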
TODO:
- Add cross-entropy or another loss function
- Implement my own Tensor class (currently using NumPy arrays as tensors)
- Add support for ReLU, sigmoid, or leaky ReLU activation layers
- Research optimization strategies other than SGD
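Two of the TODO items above (cross-entropy loss and a ReLU activation) can be sketched in a few lines of NumPy. These are illustrative stand-alone functions, not code that exists in JNET yet:

```python
import numpy as np

def relu(x: np.ndarray) -> np.ndarray:
    """ReLU activation: elementwise max(0, x)."""
    return np.maximum(0.0, x)

def cross_entropy(probs: np.ndarray, targets: np.ndarray,
                  eps: float = 1e-12) -> float:
    """Mean cross-entropy between predicted probabilities and one-hot targets.

    eps guards against log(0) when a predicted probability is exactly zero.
    """
    return float(-np.mean(np.sum(targets * np.log(probs + eps), axis=1)))

p = np.array([[0.9, 0.1], [0.2, 0.8]])   # predicted class probabilities
t = np.array([[1.0, 0.0], [0.0, 1.0]])   # one-hot targets
print(relu(np.array([-1.0, 2.0])))       # [0. 2.]
print(round(cross_entropy(p, t), 4))     # 0.1643
```

In practice cross-entropy is usually paired with a softmax output layer, and the two are fused for numerical stability.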