minitorch — building PyTorch from scratch

Reverse-mode autograd, a module system, conv2d via im2col, and optimizers, in ~1,300 lines of Python + NumPy. Trains MNIST to 96.4%.
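The core of reverse-mode autograd fits in a few dozen lines: each op records its parents and a closure that pushes the output gradient back to them, and `backward()` replays those closures in reverse topological order. This is an illustrative toy, not minitorch's actual `Tensor` class; the names and signatures here are assumptions.

```python
import numpy as np

class Tensor:
    """Minimal tensor with reverse-mode autograd (illustrative sketch)."""
    def __init__(self, data, parents=()):
        self.data = np.asarray(data, dtype=float)
        self.grad = np.zeros_like(self.data)
        self._parents = parents
        self._backward_fn = None  # propagates self.grad to parents

    def __add__(self, other):
        out = Tensor(self.data + other.data, parents=(self, other))
        def _backward():
            self.grad += out.grad
            other.grad += out.grad
        out._backward_fn = _backward
        return out

    def __mul__(self, other):
        out = Tensor(self.data * other.data, parents=(self, other))
        def _backward():
            self.grad += other.data * out.grad
            other.grad += self.data * out.grad
        out._backward_fn = _backward
        return out

    def backward(self):
        # Topologically order the graph, then apply the chain rule in reverse.
        order, seen = [], set()
        def visit(t):
            if id(t) not in seen:
                seen.add(id(t))
                for p in t._parents:
                    visit(p)
                order.append(t)
        visit(self)
        self.grad = np.ones_like(self.data)
        for t in reversed(order):
            if t._backward_fn is not None:
                t._backward_fn()

# d(a*b + a)/da = b + 1 = 4, d(a*b + a)/db = a = 2
a, b = Tensor(2.0), Tensor(3.0)
out = a * b + a
out.backward()
```

Gradients accumulate with `+=` so a tensor used in several ops (like `a` above) collects contributions from every path through the graph.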

Total lines: ~1,300
Files: 11
Dependencies: NumPy
Best accuracy: 96.4%


Training loop flow
Each batch runs the usual cycle: forward pass, loss, backward pass, optimizer step, zero grads.
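That cycle can be sketched on a toy regression problem. Gradients are written out by hand here because minitorch's own `Tensor`/`Module`/optimizer class names aren't shown on this page; the real loop would call them instead, and the projects above use Adam rather than the plain SGD step below.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy problem: learn y = 2x with a single linear weight.
X = rng.normal(size=(64, 1))
y = 2.0 * X

W = np.zeros((1, 1))
lr = 0.1
for epoch in range(100):
    pred = X @ W                             # forward pass
    loss = np.mean((pred - y) ** 2)          # MSE loss
    grad_W = 2 * X.T @ (pred - y) / len(X)   # backward (chain rule by hand)
    W -= lr * grad_W                         # optimizer step (SGD here)
```

After training, `W` converges to the true slope of 2.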
MNIST results
MLP (2-layer, 784 → 128 → 10): 95.15% (15 epochs, 10k samples, Adam optimizer)
CNN (2 conv blocks, Conv→Pool→Conv→Pool→FC): 96.40% (10 epochs, 2k samples, Adam optimizer)
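The CNN path relies on the conv2d-via-im2col trick mentioned at the top: unfold every input patch into a row of a matrix, so the whole convolution becomes one matrix multiply. A minimal single-channel, stride-1, valid-padding sketch (minitorch's real version presumably handles batches and channels; these function names are my own):

```python
import numpy as np

def im2col(x, kh, kw):
    """Unfold every (kh, kw) patch of a 2-D image into a row of a matrix."""
    H, W = x.shape
    out_h, out_w = H - kh + 1, W - kw + 1
    cols = np.empty((out_h * out_w, kh * kw))
    for i in range(out_h):
        for j in range(out_w):
            cols[i * out_w + j] = x[i:i + kh, j:j + kw].ravel()
    return cols, out_h, out_w

def conv2d_im2col(x, kernel):
    """Valid cross-correlation, stride 1, as a single matrix multiply."""
    kh, kw = kernel.shape
    cols, out_h, out_w = im2col(x, kh, kw)
    return (cols @ kernel.ravel()).reshape(out_h, out_w)

x = np.arange(16.0).reshape(4, 4)
k = np.ones((2, 2))
y = conv2d_im2col(x, k)  # each output is the sum of a 2x2 window
```

Trading the nested Python loops of a naive conv for one big matmul is what makes convolution fast in a NumPy-only implementation, since the heavy lifting happens inside BLAS.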