Day 17 of 133

Regularization (L1, L2, ElasticNet) + DSA Stack finish

Sparsity geometry, Bayesian priors, why ElasticNet exists.

DSA · NeetCode Stack

  • Car Fleet

    Interview questions to prep

    1. Why a stack here — what LIFO property does the problem exploit?
    2. If this uses a monotonic stack, state the monotonic invariant and how it's restored on each push.
    3. Walk through complexity: each element is pushed and popped at most once, so the total work is O(n).
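A minimal sketch of the stack solution to Car Fleet (signature assumed to match the usual LeetCode form: a `target` mile marker plus `position` and `speed` arrays):

```python
def car_fleet(target, position, speed):
    # Sort cars by starting position, closest to the target first.
    cars = sorted(zip(position, speed), reverse=True)
    stack = []  # arrival times; one entry per fleet (LIFO: top = fleet just behind)
    for pos, spd in cars:
        time = (target - pos) / spd  # time to reach target if driving alone
        # A car that would arrive no later than the fleet ahead of it
        # catches up and merges into that fleet: don't push a new entry.
        if not stack or time > stack[-1]:
            stack.append(time)
    return len(stack)
```

Each car is considered once and pushes at most one entry, so after the O(n log n) sort the stack pass is O(n).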
  • Largest Rectangle in Histogram

    Interview questions to prep

    1. Walk through the monotonic stack — what does popping a bar tell you about its rectangle?
    2. How does this generalize to 'maximal rectangle in a binary matrix'?
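Assuming the questions above refer to Largest Rectangle in Histogram, a sketch of the monotonic-stack solution (popping a bar fixes the right edge of its maximal rectangle; the stored start index fixes the left edge):

```python
def largest_rectangle(heights):
    stack = []  # monotonic non-decreasing stack of (start_index, height)
    best = 0
    for i, h in enumerate(heights):
        start = i
        # Pop taller bars: their rectangles cannot extend past index i.
        while stack and stack[-1][1] > h:
            idx, height = stack.pop()
            best = max(best, height * (i - idx))
            start = idx  # the current, shorter bar extends back to idx
        stack.append((start, h))
    # Bars still on the stack extend to the end of the array.
    for idx, height in stack:
        best = max(best, height * (len(heights) - idx))
    return best
```

Every bar is pushed and popped at most once, so the pass is O(n). The binary-matrix generalization runs this per row over cumulative column heights.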

ML · Regularization (L1, L2, ElasticNet)

  • Interview questions to prep (sparsity geometry & ElasticNet)

    1. Why does L1 produce sparse solutions while L2 doesn't? Show the geometric picture.
    2. When do you use ElasticNet over L1 or L2 alone?
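One concrete way to see question 1 is through the proximal operators of the two penalties (a plain-numpy sketch; `lam` is the penalty weight):

```python
import numpy as np

def prox_l1(v, lam):
    """Proximal operator of lam * ||w||_1 (soft-thresholding).

    Entries with |v_i| <= lam land exactly on zero -- the algebraic
    face of the 'corners of the L1 ball' geometric picture.
    """
    return np.sign(v) * np.maximum(np.abs(v) - lam, 0.0)

def prox_l2(v, lam):
    """Proximal operator of (lam / 2) * ||w||_2^2 (uniform shrinkage).

    Every entry is scaled toward zero, but a nonzero entry never
    becomes exactly zero -- hence no sparsity from L2 alone.
    """
    return v / (1.0 + lam)

v = np.array([3.0, 0.4, -0.2, 1.5, -0.05])
print(prox_l1(v, 0.5))  # small entries zeroed out
print(prox_l2(v, 0.5))  # all entries shrunk, none zeroed

# ElasticNet's prox composes the two: threshold, then shrink.
# It keeps L1's sparsity while the L2 part stabilizes the choice
# among strongly correlated features.
print(prox_l2(prox_l1(v, 0.5), 0.5))
```

Coordinate descent for the lasso applies exactly this soft-threshold update one coefficient at a time, which is why its solution paths hit zero and stay there.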
  • Interview questions to prep (early stopping as regularization)

    1. Why is early stopping equivalent to L2 regularization in some cases?
    2. How would you choose the early-stopping patience and what happens when it's too small or too large?
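A toy patience loop for question 2 (plain-numpy gradient descent on a linear model; the data, learning rate, and patience values are all illustrative):

```python
import numpy as np

# Synthetic linear data with a held-out validation split.
rng = np.random.default_rng(1)
n, p = 200, 10
X = rng.standard_normal((n, p))
w_true = rng.standard_normal(p)
y = X @ w_true + 0.5 * rng.standard_normal(n)
X_tr, y_tr = X[:150], y[:150]
X_va, y_va = X[150:], y[150:]

def mse(w, X, y):
    r = X @ w - y
    return float(r @ r) / len(y)

w = np.zeros(p)
lr, patience = 0.01, 10          # patience: steps tolerated without improvement
best_val, best_w, bad = np.inf, w.copy(), 0
for step in range(5000):
    grad = 2.0 * X_tr.T @ (X_tr @ w - y_tr) / len(y_tr)
    w -= lr * grad
    val = mse(w, X_va, y_va)
    if val < best_val - 1e-9:    # meaningful improvement: reset the counter
        best_val, best_w, bad = val, w.copy(), 0
    else:
        bad += 1
        if bad >= patience:      # stalled for `patience` steps: stop and
            break                # fall back to the best checkpoint
print(f"best validation MSE = {best_val:.4f}")
```

Too small a patience stops on validation noise before convergence; too large a patience wastes compute and can carry training past the validation minimum, eroding the regularization effect.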
  • Interview questions to prep (Bayesian priors)

    1. Show that L2 regularization corresponds to a Gaussian prior on weights and L1 to a Laplace prior.
    2. When does the Bayesian framing actually change a modeling decision in practice?
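The standard MAP derivation behind question 1, as a quick reference (Gaussian noise with variance \(\sigma^2\) assumed):

```latex
\hat{w}_{\text{MAP}}
  = \arg\max_w \; \log p(y \mid X, w) + \log p(w)
  = \arg\min_w \; \frac{1}{2\sigma^2}\lVert y - Xw \rVert_2^2 \;-\; \log p(w)
```

A Gaussian prior \(p(w) \propto \exp(-\lVert w \rVert_2^2 / 2\tau^2)\) makes \(-\log p(w)\) an L2 penalty, i.e. ridge with \(\lambda = \sigma^2/\tau^2\); a Laplace prior \(p(w) \propto \exp(-\lVert w \rVert_1 / b)\) gives the L1 penalty with \(\lambda = \sigma^2/b\).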
