Traditional ML

Trees, Random Forest, and Boosting

Understand why tree ensembles remain the default baseline for many tabular interview cases.

Recommended on day 15 · 100 minutes · Intermediate

Learning objectives

  • Contrast bagging and boosting clearly
  • Explain bias-variance trade-offs for trees and ensembles
  • Reason about interpretability, calibration, and feature importance limitations
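The bagging-versus-boosting contrast in the objectives above can be sketched concretely. This is a minimal illustration assuming scikit-learn and a synthetic dataset; the estimator settings are illustrative, not a recommendation.

```python
# Contrast bagging (random forest) with boosting (gradient boosting)
# on a synthetic tabular task. Assumes scikit-learn is installed.
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier, RandomForestClassifier
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=2000, n_features=20, n_informative=5,
                           random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Bagging: deep trees trained independently on bootstrap samples, then
# averaged -- this mainly reduces variance.
rf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)

# Boosting: shallow trees fit sequentially, each correcting the errors of
# the current ensemble -- this mainly reduces bias.
gb = GradientBoostingClassifier(n_estimators=200, max_depth=3,
                                random_state=0).fit(X_tr, y_tr)

print(f"random forest accuracy:     {rf.score(X_te, y_te):.3f}")
print(f"gradient boosting accuracy: {gb.score(X_te, y_te):.3f}")
```

Note that `feature_importances_` on either model reflects split usage, not causal effect, which is one of the interpretability limitations flagged above.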

Interview prompts

  • Why does gradient boosting often outperform logistic regression on tabular data?
  • What failure mode appears when trees overfit a sparse feature space?
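For the first prompt, one concrete answer is that linear models cannot represent feature interactions, while axis-aligned tree splits can. A hedged sketch, assuming scikit-learn, using a label that depends purely on the product of two features (an XOR-style interaction with no linear signal):

```python
# Why boosting can beat logistic regression on tabular data: the label
# below depends only on an interaction between two features, which a
# linear model cannot capture. Assumes scikit-learn is installed.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(4000, 2))
y = (X[:, 0] * X[:, 1] > 0).astype(int)  # pure interaction, no linear term
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

lr = LogisticRegression().fit(X_tr, y_tr)
gb = GradientBoostingClassifier(random_state=0).fit(X_tr, y_tr)

print(f"logistic regression: {lr.score(X_te, y_te):.3f}")  # near chance
print(f"gradient boosting:   {gb.score(X_te, y_te):.3f}")  # far above chance
```

Real tabular data rarely has zero linear signal, but partial interactions of this kind are common, which is one reason boosted trees are a strong default there.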