Suppose we have a regularized linear regression model: \[ \operatorname{argmin}_{\mathbf{w}} \left\| \mathbf{Y} - \mathbf{Xw} \right\|_2^2 + \lambda \left\| \mathbf{w} \right\|_1. \] What is the effect of increasing \( \lambda \) on bias and variance?

(a) Increases bias, increases variance

(b) Increases bias, decreases variance

(c) Decreases bias, increases variance

(d) Decreases bias, decreases variance

(e) Not enough information to tell

1 Answer

Understanding Bias and Variance:

  • Bias: Refers to the difference between the expected prediction of a model and the true value. A model with high bias tends to underfit the data, meaning it's too simple to capture the underlying patterns.
  • Variance: Refers to the variability of a model's predictions when trained on different datasets. A model with high variance tends to overfit the data, meaning it's too complex and captures noise in the training data.
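These two definitions fit together in the standard bias-variance decomposition of expected squared error (a standard result, added here for context): for a test point \( \mathbf{x} \) with \( y = f(\mathbf{x}) + \varepsilon \), noise variance \( \operatorname{Var}(\varepsilon) = \sigma^2 \), and learned predictor \( \hat{f} \),

\[ \mathbb{E}\left[ \left( y - \hat{f}(\mathbf{x}) \right)^2 \right] = \underbrace{\left( \mathbb{E}[\hat{f}(\mathbf{x})] - f(\mathbf{x}) \right)^2}_{\text{Bias}^2} + \underbrace{\mathbb{E}\left[ \left( \hat{f}(\mathbf{x}) - \mathbb{E}[\hat{f}(\mathbf{x})] \right)^2 \right]}_{\text{Variance}} + \sigma^2. \]

Regularization trades the first term against the second; the noise term \( \sigma^2 \) is irreducible.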

Effect of Increasing λ:

  • Regularization: The \( \lambda \|\mathbf{w}\|_1 \) term in the given model is a regularization term, and \( \lambda \) controls its strength. It is used to prevent overfitting by penalizing large model weights.
  • L1 Regularization: The \( \|\mathbf{w}\|_1 \) term specifically is L1 regularization, which encourages sparsity in the model weights. This means it pushes some weights exactly to zero, effectively simplifying the model (see the sketch below).
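A minimal sketch of this sparsity effect, using scikit-learn's Lasso (the synthetic dataset, true weights, and λ values below are illustrative assumptions, not part of the original question):

```python
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 10))                    # 100 samples, 10 features
true_w = np.array([3.0, -2.0, 1.5] + [0.0] * 7)   # only 3 informative weights
y = X @ true_w + rng.normal(scale=0.5, size=100)

# As lambda (called alpha in scikit-learn) grows, more weights are
# driven to exactly zero, i.e. the fitted model becomes sparser.
for lam in [0.01, 0.1, 1.0]:
    w = Lasso(alpha=lam).fit(X, y).coef_
    print(f"lambda={lam}: nonzero weights = {np.count_nonzero(w)}")
```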

Bias and Variance Trade-off:

  • Increasing λ increases the penalty on model weights:
    • This shrinks the weights overall (and, with the L1 penalty, sets some exactly to zero), making the model simpler.
    • Simpler models tend to have higher bias because they might not fully capture the underlying patterns in the data.
    • However, simpler models also tend to have lower variance because they are less sensitive to noise in the training data.

Therefore, increasing λ increases bias but decreases variance, so the correct option is (b).
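A rough empirical check of this trade-off (a sketch only; the synthetic data, noise level, and λ grid are assumptions made for illustration). It retrains a Lasso model on many independently drawn training sets and measures the squared bias and the variance of its predictions at fixed test points:

```python
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(1)
n_train, n_features, n_trials = 50, 10, 200
true_w = rng.normal(size=n_features)
X_test = rng.normal(size=(20, n_features))
f_test = X_test @ true_w                      # noiseless true values at test points

for lam in [0.01, 0.1, 1.0, 10.0]:
    preds = []
    for _ in range(n_trials):
        # Fresh training set each trial: same true function, new inputs and noise.
        X = rng.normal(size=(n_train, n_features))
        y = X @ true_w + rng.normal(scale=1.0, size=n_train)
        preds.append(Lasso(alpha=lam).fit(X, y).predict(X_test))
    preds = np.array(preds)                   # shape: (n_trials, n_test)
    bias_sq = np.mean((preds.mean(axis=0) - f_test) ** 2)
    variance = np.mean(preds.var(axis=0))
    print(f"lambda={lam}: bias^2={bias_sq:.3f}, variance={variance:.3f}")
```

As λ grows, the printed bias² values should rise while the variance values fall, matching option (b).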
