Day 20 of 133

Bagging, Random Forest, boosting intuition + DSA Binary Search finish

Theme: bagging reduces variance (averaging many de-correlated trees); boosting reduces bias (sequentially correcting the previous learner's errors).

DSA · NeetCode Binary Search

  • Time Based Key Value Store (DSA · Binary Search)

    Interview questions to prep

    1. State your loop invariant precisely — what must be true on every iteration?
    2. Why does the loop terminate, and how do you avoid infinite loops on the search-space update?
    3. Walk through edge cases: empty array, target smaller than min, target larger than max, duplicates.
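    The invariant and termination questions above can be rehearsed on the problem itself. A minimal sketch of the standard approach (assumes LeetCode 981's `set`/`get` API, where timestamps arrive in strictly increasing order so each key's list stays sorted):

    ```python
    import bisect
    from collections import defaultdict

    class TimeMap:
        def __init__(self):
            self.store = defaultdict(list)  # key -> sorted list of (timestamp, value)

        def set(self, key, value, timestamp):
            # Timestamps are strictly increasing per the problem, so append keeps order.
            self.store[key].append((timestamp, value))

        def get(self, key, timestamp):
            pairs = self.store[key]
            # Rightmost entry with ts <= timestamp: every such pair compares
            # less than the 1-tuple (timestamp + 1,), so bisect_right finds
            # the boundary in O(log n) with no hand-rolled loop to get wrong.
            i = bisect.bisect_right(pairs, (timestamp + 1,))
            return pairs[i - 1][1] if i else ""
    ```

    The edge cases from question 3 map directly: an empty list for the key, or a target below the minimum timestamp, both give `i == 0` and hence `""`.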
  • Median of Two Sorted Arrays (DSA · Binary Search)

    Interview questions to prep

    1. Walk through the partition idea: pick i in A, derive j in B from it.
    2. Why is the binary search on the smaller array, and what's the worst-case complexity?
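    The partition idea from question 1, as a from-scratch sketch (my own version, not any particular reference solution): choose `i` elements from A's left side, derive `j = half - i` from B, and binary-search `i` until both cross-conditions hold. Searching the smaller array bounds the work at O(log min(m, n)).

    ```python
    def find_median_sorted(A, B):
        # Always binary-search the smaller array.
        if len(A) > len(B):
            A, B = B, A
        m, n = len(A), len(B)
        half = (m + n + 1) // 2
        lo, hi = 0, m
        while lo <= hi:
            i = (lo + hi) // 2   # elements taken from A's left side
            j = half - i         # elements taken from B's left side
            A_left = A[i - 1] if i > 0 else float("-inf")
            A_right = A[i] if i < m else float("inf")
            B_left = B[j - 1] if j > 0 else float("-inf")
            B_right = B[j] if j < n else float("inf")
            if A_left <= B_right and B_left <= A_right:
                # Valid partition: everything on the left <= everything on the right.
                if (m + n) % 2:
                    return max(A_left, B_left)
                return (max(A_left, B_left) + min(A_right, B_right)) / 2
            if A_left > B_right:
                hi = i - 1       # took too many from A
            else:
                lo = i + 1       # took too few from A
    ```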

ML · Ensembles: bagging, RF, boosting

  • Bagging & Random Forest (Traditional ML · StatQuest)

    Interview questions to prep

    1. How does bagging reduce variance? Why doesn't it reduce bias?
    2. What two extra ingredients does Random Forest add on top of bagging?
    3. Which Random Forest hyperparameters control tree diversity, and how does out-of-bag validation work?
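    The out-of-bag claim in question 3 is easy to verify numerically. A stdlib-only sketch (function names are mine): each bootstrap round leaves roughly (1 - 1/n)^n ≈ 1/e ≈ 36.8% of rows out of the sample, and those unseen rows are what OOB validation scores against.

    ```python
    import random

    def bootstrap_indices(n, rng):
        """One bagging round: draw n indices with replacement."""
        return [rng.randrange(n) for _ in range(n)]

    def oob_fraction(n=10_000, rounds=50, seed=0):
        """Average fraction of rows missing from a bootstrap sample.
        Theory: (1 - 1/n)^n -> 1/e ~ 0.368 as n grows."""
        rng = random.Random(seed)
        total = 0.0
        for _ in range(rounds):
            in_bag = set(bootstrap_indices(n, rng))
            total += 1 - len(in_bag) / n
        return total / rounds
    ```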
  • Boosting intuition: AdaBoost, GBM (Traditional ML · StatQuest)

    Interview questions to prep

    1. Compare bagging vs boosting — what's reduced and how?
    2. Walk me through how AdaBoost reweights samples after each weak learner.
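    The reweighting step from question 2 as a standalone sketch (assumes normalized sample weights, 0/1 loss, and weighted error strictly between 0 and 1; `adaboost_reweight` is a name I made up): misclassified samples get multiplied by e^alpha, correct ones by e^-alpha, then weights are renormalized, so after the update the misclassified set carries exactly half the total weight.

    ```python
    import math

    def adaboost_reweight(weights, correct):
        """One AdaBoost round. `correct[i]` is True if the weak
        learner classified sample i correctly."""
        err = sum(w for w, c in zip(weights, correct) if not c)
        alpha = 0.5 * math.log((1 - err) / err)  # learner's vote weight
        # Up-weight mistakes, down-weight correct samples, renormalize.
        new = [w * math.exp(alpha if not c else -alpha)
               for w, c in zip(weights, correct)]
        z = sum(new)
        return [w / z for w in new], alpha
    ```

    A useful interview fact falls out of this: the next weak learner sees a distribution on which the previous one is no better than chance.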
  • Stacking & blending (Traditional ML · MLWave)

    Interview questions to prep

    1. When does stacking actually help vs just adding overhead?
    2. Walk me through how you'd avoid leakage when training a stacked model with k-fold meta-features.
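    The leakage rule from question 2, sketched with placeholder `fit`/`predict` callables (contiguous folds, no shuffling, both simplifications of mine): row i's meta-feature must come from a model that never trained on row i.

    ```python
    def kfold_indices(n, k):
        """Split range(n) into k contiguous folds."""
        fold_size, folds, start = n // k, [], 0
        for f in range(k):
            end = start + fold_size + (1 if f < n % k else 0)
            folds.append(list(range(start, end)))
            start = end
        return folds

    def out_of_fold_preds(X, y, fit, predict, k=5):
        """Leakage-free meta-features: each row is predicted by a model
        trained only on the folds that exclude it."""
        n = len(X)
        meta = [None] * n
        for val_idx in kfold_indices(n, k):
            held = set(val_idx)
            train_idx = [i for i in range(n) if i not in held]
            model = fit([X[i] for i in train_idx], [y[i] for i in train_idx])
            for i in val_idx:
                meta[i] = predict(model, X[i])
        return meta
    ```

    The meta-learner then trains on `meta`; refitting the base models on all of the data happens only afterward, for inference.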

References & further reading