Day 44 of 133
CNN architectures (AlexNet → ResNet → EfficientNet) + DSA Tries
What problem each generation solved: residual connections, 1×1 convs, depthwise separable convolutions.
DSA · NeetCode Tries
- Implement Trie (Prefix Tree)
- Design Add and Search Words Data Structure
Interview questions to prep
- Why a trie over a hash map — what queries does the trie make cheaper?
- What's the time/space trade-off vs storing all suffixes?
- How would you support deletion or wildcard matching efficiently?
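A minimal Python sketch covering both problems, assuming lowercase words: a dict-based trie with insert and prefix lookup, plus a DFS `search` that treats `.` as a wildcard (the Add and Search Words variant). Class and method names are my own.

```python
class TrieNode:
    def __init__(self):
        self.children = {}    # char -> TrieNode
        self.is_word = False  # marks the end of an inserted word

class WordTrie:
    def __init__(self):
        self.root = TrieNode()

    def insert(self, word):
        node = self.root
        for ch in word:
            node = node.children.setdefault(ch, TrieNode())
        node.is_word = True

    def starts_with(self, prefix):
        # Prefix queries walk one edge per character: O(len(prefix)),
        # which is the query a hash map can't answer without scanning.
        node = self.root
        for ch in prefix:
            node = node.children.get(ch)
            if node is None:
                return False
        return True

    def search(self, word):
        # DFS so '.' can branch into every child at that position.
        def dfs(node, i):
            if i == len(word):
                return node.is_word
            if word[i] == '.':
                return any(dfs(child, i + 1) for child in node.children.values())
            child = node.children.get(word[i])
            return child is not None and dfs(child, i + 1)
        return dfs(self.root, 0)
```

With words "bad" and "dad" inserted, `search(".ad")` matches both, while `starts_with("ba")` is true even though "ba" was never inserted as a word.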
DL · CNN architectures
Interview questions to prep
- What problem did ResNet's residual connections actually solve?
- Why did 1×1 convs become so important (Inception, bottleneck blocks)?
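A back-of-the-envelope check worth being able to do on a whiteboard (my numbers, not from the source): weight counts for a plain 3×3 conv at 256 channels versus a ResNet-style 1×1 → 3×3 → 1×1 bottleneck that squeezes to 64 channels.

```python
def conv_params(c_in, c_out, k):
    # Weight count of a k x k convolution, ignoring bias terms.
    return c_in * c_out * k * k

plain = conv_params(256, 256, 3)        # one full-width 3x3 layer
bottleneck = (conv_params(256, 64, 1)   # 1x1 reduce channels
              + conv_params(64, 64, 3)  # cheap 3x3 at 64 channels
              + conv_params(64, 256, 1))  # 1x1 expand back
print(plain, bottleneck)  # 589824 69632, roughly 8.5x fewer weights
```

The 1×1 layers do the channel mixing, so the expensive 3×3 only ever runs at the reduced width — that's the whole bottleneck trick.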
Interview questions to prep
- Explain why training error went UP with depth before ResNet.
- Walk me through a residual block.
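A toy numpy sketch of the residual idea (a fully-connected stand-in for the conv layers, my own simplification): the block computes relu(F(x) + x), so when F's weights are near zero the block is close to the identity — extra depth no longer has to make the network worse.

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def residual_block(x, W1, W2):
    # F(x) = W2 @ relu(W1 @ x); the skip connection adds x back in.
    return relu(W2 @ relu(W1 @ x) + x)

x = np.array([1.0, 2.0])
W_zero = np.zeros((2, 2))
# With F's weights at zero, the block passes x straight through.
print(residual_block(x, W_zero, W_zero))  # [1. 2.]
```

That identity path is the answer to the previous question too: a plain stack has to *learn* the identity through every layer, which is why training error rose with depth pre-ResNet.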
Interview questions to prep
- How do depthwise separable convolutions reduce compute?
- What does EfficientNet's compound scaling do that one-axis scaling doesn't?
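Rough multiply counts per output position (my arithmetic, the standard MobileNet bookkeeping): a depthwise separable conv replaces one k×k×c_in×c_out convolution with a per-channel k×k depthwise pass plus a 1×1 pointwise pass, cutting cost by a factor of roughly 1/c_out + 1/k².

```python
def standard_mults(c_in, c_out, k):
    # Multiplies per output spatial position for a standard conv.
    return k * k * c_in * c_out

def separable_mults(c_in, c_out, k):
    # Depthwise k x k (one filter per input channel) + 1x1 pointwise mixing.
    return k * k * c_in + c_in * c_out

std = standard_mults(256, 256, 3)   # 589824
sep = separable_mults(256, 256, 3)  # 67840
print(round(std / sep, 1))          # about 8.7x cheaper
```

Compound scaling is the orthogonal trick: instead of spending a bigger FLOP budget on depth alone, EfficientNet grows depth, width, and input resolution together with fixed ratios.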
References & further reading
- CS231n — CNNs for visual recognition (Stanford)
- ResNet paper (He et al.)
- fast.ai — Practical Deep Learning