Machine learning on the MNIST dataset, built with Zig 0.13.0

This was harder than I thought. I spent almost no time optimizing the code, so as it stands it's pretty slow.

The project was purely educational for me.

There are many things I could add to improve the performance of the model, like better weight and bias initialization, regularization, softmax, etc., although I would probably want to get it running faster first.
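For example, softmax would turn the output layer's raw activations into a probability distribution over the ten digits. A minimal sketch of what that could look like in Zig (a hypothetical helper, not code from this repository):

const std = @import("std");

// Numerically stable softmax over a slice of activations, in place.
// Hypothetical helper, not taken from this repository.
fn softmax(values: []f64) void {
    var max = values[0];
    for (values[1..]) |v| {
        if (v > max) max = v;
    }
    var sum: f64 = 0.0;
    for (values) |*v| {
        v.* = @exp(v.* - max); // subtract the max to avoid overflow
        sum += v.*;
    }
    for (values) |*v| {
        v.* /= sum;
    }
}

test "softmax sums to one" {
    var logits = [_]f64{ 1.0, 2.0, 3.0 };
    softmax(&logits);
    var sum: f64 = 0.0;
    for (logits) |v| sum += v;
    try std.testing.expectApproxEqAbs(@as(f64, 1.0), sum, 1e-9);
}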

To Start / How to use it

In its simplest form:

make
./MNIST

But that's disingenuous, because that won't work.

You'll have to specify whether it's going to run in training mode or inference mode:

./MNIST -t

vs

./MNIST -i

When running in training mode, you need to provide an IDX3 file and an IDX1 file, plus either a pre-existing model or the hidden layer sizes you want. You also have to specify the hyperparameters: epochs, batch size, and learning rate. You must always specify an output file to store the model in.

Example creating a new model with a 100-neuron hidden layer, a batch size of 10, 30 epochs, and a learning rate of 1.

./MNIST -t -f3 data/train-images.idx3-ubyte -f1 data/train-labels.idx1-ubyte -o model.byte -b 10 -e 30 -r 1 -l 100

Example creating a new model with two 10-neuron hidden layers.

./MNIST -t -f3 data/train-images.idx3-ubyte -f1 data/train-labels.idx1-ubyte -o model.byte -b 10 -e 30 -r 1 -l 10,10

Example training an existing model.

./MNIST -t -f3 data/train-images.idx3-ubyte -f1 data/train-labels.idx1-ubyte -m model.byte -o model.file -b 10 -e 30 -r 1
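The batch size, epoch count, and learning rate above suggest a plain mini-batch gradient descent loop. As a rough illustration (assuming that training scheme; this is hypothetical code, not the repository's), one update step might look like:

const std = @import("std");

// Minimal sketch of one mini-batch gradient descent update, assuming
// plain SGD; hypothetical code, not taken from this repository.
fn applyGradients(weights: []f64, grads: []const f64, learning_rate: f64, batch_size: usize) void {
    // Average the accumulated gradients over the batch, then step downhill.
    const scale = learning_rate / @as(f64, @floatFromInt(batch_size));
    for (weights, grads) |*w, g| {
        w.* -= scale * g;
    }
}

test "applyGradients moves weights against the gradient" {
    var w = [_]f64{ 0.5, -0.25 };
    const g = [_]f64{ 1.0, -1.0 };
    applyGradients(&w, &g, 1.0, 10);
    try std.testing.expectApproxEqAbs(@as(f64, 0.4), w[0], 1e-9);
}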

When running in inference mode, you still have to specify both IDX files, but this time you only need a model file generated by previous training.

./MNIST -i -f3 data/t10k-images.idx3-ubyte -f1 data/t10k-labels.idx1-ubyte -m model.byte
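The IDX files referenced above are the standard MNIST format: a big-endian header (magic number, item count, and, for images, row and column counts) followed by raw pixel or label bytes. A minimal sketch of reading an IDX3 image header in Zig (hypothetical code, not this repository's parser):

const std = @import("std");

// Minimal sketch of reading an IDX3 image-file header (big-endian).
// Hypothetical code, not the parser used by this repository.
const Idx3Header = struct {
    magic: u32, // 2051 for IDX3 image files
    count: u32, // number of images
    rows: u32, // image height (28 for MNIST)
    cols: u32, // image width (28 for MNIST)
};

fn readIdx3Header(reader: anytype) !Idx3Header {
    return .{
        .magic = try reader.readInt(u32, .big),
        .count = try reader.readInt(u32, .big),
        .rows = try reader.readInt(u32, .big),
        .cols = try reader.readInt(u32, .big),
    };
}

pub fn main() !void {
    var file = try std.fs.cwd().openFile("data/train-images.idx3-ubyte", .{});
    defer file.close();
    const header = try readIdx3Header(file.reader());
    std.debug.print("{d} images of {d}x{d} pixels\n", .{ header.count, header.rows, header.cols });
}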
