Day 21 of 133
Hyperparameter tuning: why random search usually beats grid search; Bayesian optimization and Hyperband.
Designing search spaces; pruning bad runs early.
DSA · NeetCode Linked List
- Reverse Linked List
Interview questions to prep
- Show both iterative (3 pointers) and recursive solutions. Compare stack space. (Both are sketched after this list.)
- What if you only want to reverse a sub-range [m, n]?
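A minimal Python sketch of both solutions, assuming the usual singly linked ListNode definition (val/next). The iterative version is the classic three-pointer walk in O(1) extra space; the recursive one pays O(n) call-stack space.

```python
from typing import Optional

class ListNode:
    def __init__(self, val: int = 0, next: "Optional[ListNode]" = None):
        self.val = val
        self.next = next

def reverse_iterative(head: Optional[ListNode]) -> Optional[ListNode]:
    # Three pointers: prev trails curr, and nxt saves the rest of the
    # list before the link is flipped. O(n) time, O(1) extra space.
    prev, curr = None, head
    while curr:
        nxt = curr.next
        curr.next = prev
        prev, curr = curr, nxt
    return prev

def reverse_recursive(head: Optional[ListNode]) -> Optional[ListNode]:
    # Each frame holds one node until the unwind, so stack space is O(n);
    # very long lists can hit Python's recursion limit.
    if head is None or head.next is None:
        return head
    new_head = reverse_recursive(head.next)
    head.next.next = head  # point the node after head back at head
    head.next = None
    return new_head
```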
- Merge Two Sorted Lists
Interview questions to prep
- How would you generalize this to merging k sorted lists efficiently?
- Can you do it in-place without a dummy node? What's gained / lost? (A dummy-node version is sketched after this list.)
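A sketch of both merges, reusing the ListNode class from the sketch above. The dummy node in merge_two avoids special-casing which list supplies the head; merge_k is one way to generalize to k lists, keeping one node per list in a min-heap for O(N log k) total work.

```python
import heapq

def merge_two(l1: "Optional[ListNode]", l2: "Optional[ListNode]") -> "Optional[ListNode]":
    # The dummy node means we never special-case which list supplies the head.
    dummy = tail = ListNode()
    while l1 and l2:
        if l1.val <= l2.val:
            tail.next, l1 = l1, l1.next
        else:
            tail.next, l2 = l2, l2.next
        tail = tail.next
    tail.next = l1 or l2  # splice in whichever list still has nodes
    return dummy.next

def merge_k(lists: "list[Optional[ListNode]]") -> "Optional[ListNode]":
    # One heap entry per non-empty list; the index i breaks value ties so
    # the heap never has to compare ListNode objects. O(N log k) overall.
    heap = [(node.val, i, node) for i, node in enumerate(lists) if node]
    heapq.heapify(heap)
    dummy = tail = ListNode()
    while heap:
        _, i, node = heapq.heappop(heap)
        tail.next = node
        tail = node
        if node.next:
            heapq.heappush(heap, (node.next.val, i, node.next))
    return dummy.next
```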
ML · Hyperparameter tuning
Interview questions to prep
- Why does random search often beat grid search? (See the sketch after this list.)
- When is Bayesian optimization worth the complexity over random search?
- How does Hyperband save compute by early-stopping bad configs?
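A minimal sketch of the grid-vs-random intuition using scikit-learn's GridSearchCV and RandomizedSearchCV. The SVC model, dataset, and 9-trial budget are placeholder assumptions, not from these notes: the point is that the grid only ever tries 3 distinct values per axis, while random search samples 9 distinct values per axis under the same budget, so whichever hyperparameter actually matters gets finer coverage.

```python
from scipy.stats import loguniform
from sklearn.datasets import make_classification
from sklearn.model_selection import GridSearchCV, RandomizedSearchCV
from sklearn.svm import SVC

X, y = make_classification(n_samples=500, random_state=0)

# Grid: 3 x 3 = 9 configs, but only 3 distinct values along each axis.
grid = GridSearchCV(
    SVC(),
    {"C": [0.1, 1, 10], "gamma": [1e-3, 1e-2, 1e-1]},
    cv=3,
).fit(X, y)

# Random: same 9-trial budget, but 9 distinct values along each axis,
# sampled log-uniformly over a wider range.
rand = RandomizedSearchCV(
    SVC(),
    {"C": loguniform(1e-2, 1e2), "gamma": loguniform(1e-4, 1e0)},
    n_iter=9,
    cv=3,
    random_state=0,
).fit(X, y)

print(f"grid best CV accuracy:   {grid.best_score_:.3f}")
print(f"random best CV accuracy: {rand.best_score_:.3f}")
```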
Interview questions to prep
- How would you set up a tuning study for a GBDT model with limited compute?
- What's the difference between Optuna's TPE sampler and a random sampler, and when does TPE actually help? (See the sketch below.)
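A minimal Optuna sketch tying both questions together: TPE as the sampler, Hyperband as the pruner, tuning a small GBDT under a tight budget. The dataset, search space, and 50-tree cap are placeholder assumptions; scikit-learn's warm_start is one way to grow trees incrementally so bad configs can be pruned mid-training.

```python
import optuna
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=2000, random_state=0)
X_tr, X_val, y_tr, y_val = train_test_split(X, y, random_state=0)

def objective(trial: optuna.Trial) -> float:
    model = GradientBoostingClassifier(
        learning_rate=trial.suggest_float("learning_rate", 1e-3, 0.3, log=True),
        max_depth=trial.suggest_int("max_depth", 2, 8),
        subsample=trial.suggest_float("subsample", 0.5, 1.0),
        n_estimators=1,
        warm_start=True,  # lets us add trees one at a time
        random_state=0,
    )
    acc = 0.0
    for n_trees in range(1, 51):
        model.n_estimators = n_trees  # grow one more tree
        model.fit(X_tr, y_tr)
        acc = model.score(X_val, y_val)
        trial.report(acc, step=n_trees)
        if trial.should_prune():  # Hyperband stops hopeless configs early
            raise optuna.TrialPruned()
    return acc

study = optuna.create_study(
    direction="maximize",
    sampler=optuna.samplers.TPESampler(seed=0),  # swap in RandomSampler to compare
    pruner=optuna.pruners.HyperbandPruner(),
)
study.optimize(objective, n_trials=30)
print(study.best_params, study.best_value)
```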
References & further reading
- scikit-learn user guide (hyperparameter search)
- XGBoost docs (parameters + tuning)