Regularization for Sparsity

This module covers the special requirements of models trained on high-dimensional feature vectors.

Let's Go Back to Feature Crosses

  • Caveat: Sparse feature crosses may significantly increase feature space
  • Possible issues:
    • Model size (RAM) may become huge
    • "Noise" coefficients (causes overfitting)

L1 Regularization

  • Would like to penalize the L0 norm of the weights (the count of nonzero weights)
    • Non-convex optimization; NP-hard
  • Relax to L1 regularization:
    • Penalize the sum of abs(weights)
    • Convex problem
    • Encourages sparsity, unlike L2
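The sparsity-inducing behavior can be seen in the update rule itself. A minimal sketch, assuming a one-dimensional proximal (soft-thresholding) step, which is the operator L1 solvers apply; the weights and strength below are made up for illustration:

```python
def soft_threshold(w, lam):
    """Proximal operator of lam * |w|: shifts w toward zero by lam,
    and snaps it to exactly zero if |w| <= lam."""
    if w > lam:
        return w - lam
    if w < -lam:
        return w + lam
    return 0.0

weights = [0.8, -0.05, 0.3, 0.02, -0.6]
lam = 0.1

# L1 shrinkage: small weights become exactly 0.0
l1_weights = [soft_threshold(w, lam) for w in weights]

# L2 shrinkage (closed form for 0.5*(v-w)^2 + lam*w^2): all weights
# shrink proportionally but none reach exactly zero
l2_weights = [w / (1 + 2 * lam) for w in weights]

print(l1_weights)
print(l2_weights)
```

The L1 step zeroes out the two small weights (a sparse model that needs no RAM for them), while L2 merely scales every weight down, leaving all of them nonzero.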

Machine Learning Crash Course