Suppose we have a regularized linear regression model: \[ \operatorname{argmin}_{\mathbf{w}} \left\| \mathbf{Y} - \mathbf{Xw} \right\|^2 + k \|\mathbf{w}\|_p^p. \] What is the effect of increasing \( p \) on bias and variance (for \( p \geq 1 \)) if the weights are all larger than \( 1 \)?

(a) Increases bias, increases variance

(b) Increases bias, decreases variance

(c) Decreases bias, increases variance

(d) Decreases bias, decreases variance

(e) Not enough information to tell

1 Answer


Regularization and Bias-Variance Trade-off:

Regularization controls the complexity of a model to reduce overfitting and improve generalization. Increasing the strength of regularization generally increases bias (pushing toward underfitting) but decreases variance (reducing overfitting).
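To make this trade-off concrete, here is a minimal simulation sketch (not part of the original question): it fits closed-form ridge regression (the \( p = 2 \) case) on many resampled training sets and estimates the squared bias and variance of the predictions as the regularization strength \( k \) grows. The quadratic ground-truth function, noise level, and sample sizes are arbitrary choices for illustration.

```python
# Sketch: estimate bias and variance of ridge predictions as k grows.
import numpy as np

rng = np.random.default_rng(0)

def true_f(x):
    return 3.0 * x + 2.0 * x**2  # hypothetical ground-truth function

def fit_ridge(X, y, k):
    # Closed-form ridge solution: w = (X^T X + k I)^{-1} X^T y
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + k * np.eye(d), X.T @ y)

x_test = np.linspace(-1, 1, 50)
X_test = np.column_stack([x_test, x_test**2])

for k in [0.0, 1.0, 10.0, 100.0]:
    preds = []
    for _ in range(200):                       # 200 resampled training sets
        x = rng.uniform(-1, 1, 30)
        y = true_f(x) + rng.normal(0, 1, 30)   # noisy observations
        w = fit_ridge(np.column_stack([x, x**2]), y, k)
        preds.append(X_test @ w)
    preds = np.array(preds)
    bias2 = np.mean((preds.mean(axis=0) - true_f(x_test))**2)
    var = preds.var(axis=0).mean()
    print(f"k={k:6.1f}  bias^2={bias2:.3f}  variance={var:.3f}")
```

As \( k \) grows, the printed squared bias rises while the variance across training sets falls, which is the pattern the question exploits.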

Effect of Increasing \( p \):

  • Penalty on Larger Weights: The \( \|\mathbf{w}\|_p^p \) term penalizes large weights. For any weight with \( |w_i| > 1 \), \( |w_i|^p \) grows as \( p \) increases, so the penalty on such weights becomes more severe (see the snippet after this list). This drives the weights toward smaller values, making the model simpler.
  • Impact on Bias: Simpler models tend to have higher bias because they might not fully capture the underlying patterns in the data.
  • Impact on Variance: Simpler models also tend to have lower variance because they are less sensitive to noise in the training data.
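A quick numeric illustration of the first bullet, using assumed weights (any values larger than \( 1 \) would do):

```python
# For |w_i| > 1, |w_i|^p grows with p, so the L_p penalty becomes harsher.
import numpy as np

w = np.array([1.5, 2.0, 3.0])  # hypothetical weights, all larger than 1
for p in [1, 2, 3, 4]:
    penalty = np.sum(np.abs(w) ** p)  # ||w||_p^p
    print(f"p={p}: ||w||_p^p = {penalty:.2f}")
```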

Key Point: The condition that the weights are larger than \( 1 \) matters: for \( |w_i| > 1 \), \( |w_i|^p \) increases with \( p \), so raising \( p \) strengthens the penalty and shrinks the weights. (If all weights were smaller than \( 1 \) in magnitude, the penalty would instead weaken as \( p \) grows, and the effect would reverse.)

Therefore, increasing \( p \) in this model increases bias but decreases variance, so the answer is (b).
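For a fuller picture, the following sketch minimizes the full objective \( \|\mathbf{Y} - \mathbf{Xw}\|^2 + k\|\mathbf{w}\|_p^p \) numerically on synthetic data, with a hypothetical \( k = 1 \) and true weights all larger than \( 1 \), using scipy.optimize.minimize as a generic solver. The fitted weights shrink steadily as \( p \) increases.

```python
# Sketch: minimize ||y - Xw||^2 + k * ||w||_p^p and watch weights shrink
# as p grows (synthetic data, hypothetical k = 1.0).
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
w_true = np.array([2.0, 3.0, 4.0])        # true weights, all larger than 1
y = X @ w_true + rng.normal(0, 0.5, 100)

def objective(w, p, k=1.0):
    return np.sum((y - X @ w) ** 2) + k * np.sum(np.abs(w) ** p)

for p in [1, 2, 3, 4]:
    res = minimize(objective, x0=np.ones(3), args=(p,))
    print(f"p={p}: w = {np.round(res.x, 3)}, ||w||_1 = {np.abs(res.x).sum():.3f}")
```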
