A minimalist, fully handcrafted neural network built in pure Rust, trained to solve the classic XOR problem. No frameworks, no magic: just neurons, layers, and backpropagation.
- ✅ Feedforward neural network (MLP)
- ✅ Manual backpropagation implementation
- ✅ Learns XOR from scratch
- ✅ Only uses the `rand` crate, no ML dependencies
- ✅ Fully commented and beginner-friendly
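To make "manual backpropagation" concrete: the core building blocks are usually an activation function and its derivative. A minimal sketch (function names are illustrative and may differ from the repository's actual code):

```rust
// Sigmoid activation: squashes any input into the (0, 1) range.
fn sigmoid(x: f64) -> f64 {
    1.0 / (1.0 + (-x).exp())
}

// Derivative of sigmoid, expressed in terms of its output y = sigmoid(x).
// Backprop multiplies this into the error signal at each layer.
fn sigmoid_derivative(y: f64) -> f64 {
    y * (1.0 - y)
}

fn main() {
    let y = sigmoid(0.0);
    println!("sigmoid(0) = {}, derivative there = {}", y, sigmoid_derivative(y));
}
```

Expressing the derivative in terms of the activation's output (rather than its input) is a common trick: the forward pass already computed `y`, so the backward pass reuses it for free.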
The XOR logic gate:
| Input A | Input B | Expected Output |
|---------|---------|-----------------|
| 0       | 0       | 0               |
| 0       | 1       | 1               |
| 1       | 0       | 1               |
| 1       | 1       | 0               |
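In code, the truth table above becomes the entire training set, only four examples. A sketch of how it might be laid out (the helper name is hypothetical):

```rust
// The full XOR training set, mirroring the truth table: four input pairs
// and their expected outputs.
fn xor_dataset() -> Vec<([f64; 2], f64)> {
    vec![
        ([0.0, 0.0], 0.0),
        ([0.0, 1.0], 1.0),
        ([1.0, 0.0], 1.0),
        ([1.0, 1.0], 0.0),
    ]
}

fn main() {
    for (input, target) in xor_dataset() {
        println!("{:?} -> {}", input, target);
    }
}
```

XOR is the classic example of a problem a single-layer perceptron cannot solve: no straight line separates the `1` cases from the `0` cases, which is why the hidden layer is needed.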
```bash
git clone https://github.com/your-username/rust-xor-mlp.git
cd rust-xor-mlp
cargo run --release
```
You can tweak the training parameters in `main.rs`:
```rust
let mut network = Network::new(2, 2); // hidden layer size
network.train(100_000, 0.1);          // epochs, learning rate
```
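The learning rate controls how far each weight moves per update. A minimal sketch of a single gradient-descent step, where `grad` stands in for the gradient that backpropagation would compute (names are illustrative, not the repository's API):

```rust
// One SGD update: nudge the weight against the gradient, scaled by the
// learning rate. Repeated over every weight, every example, every epoch.
fn sgd_step(weight: f64, grad: f64, learning_rate: f64) -> f64 {
    weight - learning_rate * grad
}

fn main() {
    // With the learning rate of 0.1 from above:
    let updated = sgd_step(0.5, 0.2, 0.1);
    println!("updated weight = {}", updated);
}
```

A rate that is too large makes training oscillate or diverge; one that is too small needs many more epochs, which is why the two parameters are tuned together.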
```
Input: [0.0, 0.0] → Output: 0.02 (Expected: 0)
Input: [0.0, 1.0] → Output: 0.97 (Expected: 1)
Input: [1.0, 0.0] → Output: 0.98 (Expected: 1)
Input: [1.1, 1.0] → Output: 0.03 (Expected: 0)
```
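The sigmoid output is continuous, so it never reaches exactly 0 or 1; values near 0.02 and 0.97 are the expected behavior. To turn them into hard XOR answers, a simple 0.5 threshold suffices (an illustrative helper, not part of the repository):

```rust
// Map a continuous sigmoid output to a binary XOR prediction.
fn classify(output: f64) -> u8 {
    if output >= 0.5 { 1 } else { 0 }
}

fn main() {
    // The example outputs from the run above.
    for out in [0.02, 0.97, 0.98, 0.03] {
        println!("{} -> {}", out, classify(out));
    }
}
```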