# Prerequisites & Self-Assessment

**Purpose:** Ensure you have the foundational knowledge to succeed in TinyTorch and discover complementary resources for deeper learning.
## Core Requirements
You need TWO things to start building:
### 1. Python Programming

- Comfortable writing functions and classes
- Familiar with basic NumPy arrays
- No ML framework experience required—you’ll build your own!

**Self-check:** Can you write a Python class with `__init__` and methods?
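A minimal sketch of that self-check (the class and its names are illustrative, not part of TinyTorch):

```python
# Self-test: a class with __init__, a method, and instance state.
class Counter:
    def __init__(self, start=0):
        self.value = start  # state initialized in the constructor

    def increment(self, step=1):
        self.value += step
        return self.value

c = Counter()
c.increment()      # value is now 1
c.increment(2)     # value is now 3
print(c.value)     # 3
```

If you can write something like this from scratch without looking anything up, your Python is ready.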
### 2. Basic Linear Algebra

- Understand matrix multiplication conceptually
- Know what a gradient (derivative) represents at a high level

**Self-check:** Do you know what multiplying two matrices means?
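To verify both points concretely, a short NumPy sketch (the arrays and the function here are illustrative):

```python
import numpy as np

# Matrix multiplication: each output entry is the dot product of a row of A
# with a column of B, so a (2x3) @ (3x2) product yields a (2x2) result.
A = np.array([[1, 2, 3],
              [4, 5, 6]])
B = np.array([[1, 0],
              [0, 1],
              [1, 1]])
C = A @ B  # rows: [4, 5] and [10, 11]

# A gradient is the slope of a function at a point; a centered finite
# difference approximates it numerically. For f(x) = x**2, f'(3) = 6.
f = lambda x: x ** 2
h = 1e-6
grad_at_3 = (f(3 + h) - f(3 - h)) / (2 * h)  # approximately 6.0
```

If both results make sense to you, the linear algebra prerequisite is covered.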
That’s it. You’re ready to start building.
## “Nice to Have” Background

We teach these concepts as you build—you don’t need them upfront:

- **Calculus (derivatives):** Module 05 (Autograd) teaches this through implementation
- **Deep learning theory:** You’ll learn by building, not lectures
- **Advanced NumPy:** We introduce operations as needed in each module

**Learning Philosophy:** TinyTorch teaches ML systems through implementation. You’ll understand backpropagation by building it, not by watching lectures about it.
## Self-Assessment: Which Learning Path Fits You?
### Path A: Foundation-First Builder (Recommended for most)

**You are:**

- A strong Python programmer
- Curious about ML systems
- Eager to understand how frameworks work

**Start with:** Module 01 (Tensor)

**Best for:** CS students, software engineers transitioning to ML, anyone wanting deep systems understanding
### Path B: Focused Systems Engineer

**You are:**

- A professional ML engineer
- Looking for specific optimization skills
- Seeking production deployment knowledge

**Start with:** Review the Foundation Tier (01-07), then focus on the Optimization Tier (14-19)

**Best for:** Working engineers debugging production systems, performance optimization specialists
### Path C: Academic Researcher

**You are:**

- From an ML theory background
- Building implementation skills
- Aiming to prototype novel architectures

**Start with:** Module 01, accelerating through familiar concepts

**Best for:** PhD students, research engineers, anyone implementing custom operations
## Complementary Learning Resources
### Essential Systems Context

*Machine Learning Systems* by Prof. Vijay Janapa Reddi (Harvard)

- TinyTorch’s companion textbook, providing a systems perspective
- Covers production ML engineering, hardware acceleration, and deployment
- Perfect pairing: TinyTorch teaches implementation; the ML Systems book teaches context
### Mathematical Foundations

*Deep Learning* by Goodfellow, Bengio, and Courville

- Comprehensive theoretical foundations
- Mathematical background for the concepts you’ll implement
- Use alongside TinyTorch for deeper understanding
### Visual Intuition

- Visual explanations of backpropagation, gradient descent, and neural networks
- A perfect visual complement to TinyTorch’s hands-on implementation
- Geometric intuition for vectors, matrices, and transformations
- A helpful refresher for tensor operations and matrix multiplication
### Python & NumPy

- Essential NumPy operations and array manipulation
- Review these before Module 01 if NumPy is unfamiliar
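If a refresher would help, these are the kinds of NumPy operations the early modules lean on (a sketch, not TinyTorch code):

```python
import numpy as np

x = np.arange(6).reshape(2, 3)   # [[0 1 2], [3 4 5]]
shape = x.shape                  # (2, 3): rows, columns
transposed = x.T                 # shape becomes (3, 2)
doubled = x * 2                  # elementwise arithmetic via broadcasting
col_sums = x.sum(axis=0)         # reduce down the rows -> [3 5 7]
second_col = x[:, 1]             # slicing: second column -> [1 4]
relu = np.maximum(x - 2, 0)      # elementwise clamp at zero (ReLU-style)
```

Shapes, broadcasting, axis-wise reductions, and slicing cover most of what Module 01 assumes.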
## Ready to Begin?
If you can:

- ✅ Write a Python class with methods
- ✅ Explain what matrix multiplication does
- ✅ Debug Python code using print statements

Then you’re ready to start building!
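The third item, print-statement debugging, is just tracing values at suspect points. A hypothetical example:

```python
# Print-statement debugging: log intermediate values to find where a
# computation goes wrong. The function and data here are illustrative.
def normalize(values):
    total = sum(values)
    print(f"normalize: total={total}")  # checkpoint before the division
    return [v / total for v in values]

result = normalize([1, 2, 2])
print(result)  # [0.2, 0.4, 0.4]
```

If the output surprised you, the checkpoint print tells you whether the bug is in the sum or in the division.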
Not quite there? Work through the resources above, then return when ready. TinyTorch will still be here, and you’ll get more value once foundations are solid.
## Next Steps

**Ready to Build:**

- See the Quick Start Guide for hands-on experience
- See the Student Workflow for the development process
- See the Course Structure for the full curriculum

**Need More Context:**

- See Additional Resources for broader ML learning materials
- See the FAQ for common questions about TinyTorch
- See Community to connect with other learners
Your journey from ML user to ML systems engineer starts here.