Day 36 · DL · Neural network foundations
Perceptron, MLP, forward pass
- Walk me through the forward pass of a 2-layer MLP for binary classification.
- Why can't a single perceptron solve XOR, and how does adding a hidden layer fix it?
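The second prompt has a compact answer: XOR is not linearly separable, so no single perceptron (a single hyperplane) can classify it, while one hidden layer of two units can. A minimal NumPy sketch of the 2-layer forward pass, with hand-set weights chosen to solve XOR (illustrative only, not taken from the roadmap content):

```python
import numpy as np

def relu(x):
    return np.maximum(0.0, x)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def forward(X, W1, b1, W2, b2):
    """Forward pass of a 2-layer MLP: linear -> ReLU -> linear -> sigmoid."""
    h = relu(X @ W1 + b1)          # hidden activations
    return sigmoid(h @ W2 + b2)    # probability of the positive class

# Hand-set weights that solve XOR: hidden unit 1 fires when at least one
# input is on, hidden unit 2 fires only when both are on; the output
# layer subtracts the second from the first.
W1 = np.array([[1.0, 1.0],
               [1.0, 1.0]])
b1 = np.array([0.0, -1.0])
W2 = np.array([[1.0], [-2.0]])
b2 = np.array([-0.5])

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
probs = forward(X, W1, b1, W2, b2)
preds = (probs > 0.5).astype(int).ravel()
print(preds)  # [0 1 1 0], the XOR truth table
```

Walking an interviewer through the two matrix multiplies, the nonlinearity between them, and why the hidden layer's folded feature space makes XOR separable covers both prompts above.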
Reason about architectures: move from backprop and optimization into CNNs, sequence modeling, and transformer intuition.
Featured topics
Each topic includes a summary, practical learning goals, representative interview prompts, and a suggested roadmap day.
Learning objectives: Build an interview-safe explanation for gradient flow, learning dynamics, and optimization choices.
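Gradient flow becomes concrete once you derive a backward pass by hand and verify it numerically. A small NumPy sketch (my own illustration, assuming a tanh hidden layer and mean-squared-error loss) that checks the manual gradient against a finite difference:

```python
import numpy as np

rng = np.random.default_rng(0)

# Tiny 2-layer regression net: x -> tanh(x W1) -> (h W2), MSE loss.
x = rng.normal(size=(4, 3))
y = rng.normal(size=(4, 1))
W1 = rng.normal(size=(3, 5)) * 0.5
W2 = rng.normal(size=(5, 1)) * 0.5

def loss(W1, W2):
    h = np.tanh(x @ W1)
    return 0.5 * np.mean((h @ W2 - y) ** 2)

# Manual backward pass: apply the chain rule layer by layer.
h = np.tanh(x @ W1)
err = (h @ W2 - y) / y.size          # dL/d(output), mean over 4 samples
gW2 = h.T @ err                      # gradient at the top layer
dh = err @ W2.T                      # flow back through the matmul
gW1 = x.T @ (dh * (1 - h ** 2))      # through tanh's local derivative

# Finite-difference check on one entry of W1.
eps = 1e-6
Wp = W1.copy(); Wp[0, 0] += eps
Wm = W1.copy(); Wm[0, 0] -= eps
numeric = (loss(Wp, W2) - loss(Wm, W2)) / (2 * eps)
print(abs(numeric - gW1[0, 0]))  # ~0: analytic and numeric gradients agree
```

Being able to point at the `(1 - h ** 2)` factor is exactly the "gradient flow" story: every saturating nonlinearity multiplies the upstream gradient by a local derivative that can shrink it toward zero.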
Learning objectives: Cover convolutions, receptive fields, and architecture choices that still surface in vision-heavy roles.
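Receptive-field questions usually reduce to one recurrence: each layer adds `(kernel - 1) * jump` to the receptive field, and each stride multiplies the jump. A small helper (my own sketch, ignoring dilation and padding) makes the classic "three 3x3 convs equal one 7x7" claim checkable:

```python
def receptive_field(layers):
    """Receptive field of a stack of conv/pool layers.

    layers: list of (kernel_size, stride) tuples in input -> output order.
    Recurrence: rf += (k - 1) * jump; jump *= stride.
    """
    rf, jump = 1, 1
    for k, s in layers:
        rf += (k - 1) * jump
        jump *= s
    return rf

# Three stacked 3x3 convs, stride 1: same 7x7 receptive field as a single
# 7x7 conv, but with fewer parameters and two extra nonlinearities.
print(receptive_field([(3, 1), (3, 1), (3, 1)]))  # 7
print(receptive_field([(7, 1)]))                  # 7

# A stride-2 pool doubles the jump, so later layers grow twice as fast.
print(receptive_field([(3, 1), (3, 1), (2, 2), (3, 1)]))  # 10
```

This is the kind of quick calculation that backs up architecture-choice answers (why VGG stacks small kernels, why downsampling early enlarges receptive fields cheaply).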
Learning objectives: Understand the transformer stack deeply enough to explain scaling, context handling, and attention trade-offs.
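The attention trade-offs are easiest to explain from the scaled dot-product formula itself, softmax(QKᵀ/√d_k)V. A minimal NumPy sketch (illustrative, single head, no masking) that makes the quadratic seq_q × seq_k score matrix, and hence the context-length cost, visible:

```python
import numpy as np

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)  # subtract max for stability
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def attention(Q, K, V):
    """Scaled dot-product attention: softmax(Q K^T / sqrt(d_k)) V."""
    d_k = Q.shape[-1]
    scores = Q @ K.swapaxes(-1, -2) / np.sqrt(d_k)  # (seq_q, seq_k)
    weights = softmax(scores, axis=-1)              # each row sums to 1
    return weights @ V, weights

rng = np.random.default_rng(0)
Q = rng.normal(size=(4, 8))   # 4 query positions, d_k = 8
K = rng.normal(size=(6, 8))   # 6 key positions
V = rng.normal(size=(6, 8))
out, w = attention(Q, K, V)
print(out.shape, w.shape)  # (4, 8) (4, 6)
```

The `(seq_q, seq_k)` score matrix is where the quadratic memory cost of long contexts lives, and the `1/sqrt(d_k)` scale is what keeps the softmax from saturating as head dimension grows: two points interviewers reliably probe.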
Practice prompts
These are pulled from the same 133-day roadmap content used by Browse Questions.