
Imagine a linear model with 100 input features, all with values between -1 and 1:
• 10 are highly informative.
• 90 are non-informative.

Which of the following statements are true?
• L2 regularization will encourage many of the non-informative weights to be nearly (but not exactly) 0.0.
True. L2 regularization encourages weights to be near 0.0, but not exactly 0.0.
• L2 regularization will encourage most of the non-informative weights to be exactly 0.0.
False. L2 regularization does not tend to force weights to exactly 0.0. Because L2 regularization penalizes larger weights more than smaller weights, as a weight gets close to 0.0, L2 "pushes" less forcefully toward 0.0.
• L2 regularization may cause the model to learn a moderate weight for some non-informative features.
True. Surprisingly, this can happen when a non-informative feature happens to be correlated with the label. In this case, the model incorrectly gives such non-informative features some of the "credit" that should have gone to informative features.
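You can check the first two statements empirically. The sketch below (all names, sizes, and the regularization strength λ are illustrative assumptions, not part of the exercise) builds a dataset matching the setup above — 100 features in [-1, 1], only 10 of which drive the label — and fits an L2-regularized (ridge) linear model using the closed-form solution. The non-informative weights end up near 0.0, but none of them land at exactly 0.0.

```python
import numpy as np

rng = np.random.default_rng(0)

# Dataset matching the exercise setup: 100 features in [-1, 1],
# only the first 10 actually influence the label.
n_samples, n_features, n_informative = 1000, 100, 10
X = rng.uniform(-1, 1, size=(n_samples, n_features))

true_w = np.zeros(n_features)
true_w[:n_informative] = rng.uniform(1, 2, size=n_informative)
y = X @ true_w + rng.normal(0, 0.1, size=n_samples)  # small label noise

# L2-regularized (ridge) closed-form solution: w = (X^T X + lam*I)^(-1) X^T y
lam = 10.0  # illustrative regularization strength
w = np.linalg.solve(X.T @ X + lam * np.eye(n_features), X.T @ y)

informative = np.abs(w[:n_informative])
noise = np.abs(w[n_informative:])
print(f"mean |w|, informative features:     {informative.mean():.4f}")
print(f"mean |w|, non-informative features: {noise.mean():.4f}")
print(f"non-informative weights at exactly 0.0: {(noise == 0).sum()}")  # typically 0
```

The non-informative weights are pushed close to 0.0 but stay nonzero, illustrating the difference from L1 regularization, which does drive weights to exactly 0.0.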