in Machine Learning

To achieve weight sparsity in a neural network, which regularization method should be your go-to choice?

Options:

A) L1 regularization
B) L2 regularization
C) Both L1 and L2 regularization
D) No regularization needed for weight sparsity

1 Answer

  1. A) L1 regularization (Lasso), because the L1 penalty forces the weights of non-informative features to become exactly 0, which is what produces sparsity; a small sketch illustrating this is shown below.
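A minimal sketch (assuming scikit-learn is available) comparing how many coefficients L1 (Lasso) versus L2 (Ridge) regularization drive to exactly zero on data where only a few features are informative; the data, penalty strength, and feature counts are illustrative choices, not part of the original question.

```python
import numpy as np
from sklearn.linear_model import Lasso, Ridge

rng = np.random.default_rng(0)
n_samples, n_features, n_informative = 200, 50, 5

# Only the first 5 features carry signal; the remaining 45 are noise.
X = rng.normal(size=(n_samples, n_features))
true_w = np.zeros(n_features)
true_w[:n_informative] = rng.normal(size=n_informative)
y = X @ true_w + 0.1 * rng.normal(size=n_samples)

lasso = Lasso(alpha=0.1).fit(X, y)   # L1 penalty
ridge = Ridge(alpha=0.1).fit(X, y)   # L2 penalty

# L1 typically zeroes out most of the non-informative weights;
# L2 only shrinks them toward zero without reaching it exactly.
print("exact zeros with L1 (Lasso):", int(np.sum(lasso.coef_ == 0)))
print("exact zeros with L2 (Ridge):", int(np.sum(ridge.coef_ == 0)))
```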

1 comment

did you like the question?
