in Machine Learning

Suppose that you are training a neural network for classification, but you notice that the training loss is much lower than the validation loss. Which of the following can be used to address the issue (select all that apply)?

  1. Use a network with fewer layers √
  2. Decrease dropout probability
  3. Increase $L_2$ regularization weight √
  4. Increase the size of each hidden layer

1 Answer

Since the training loss is much lower than the validation loss, the model is overfitting, so we have to reduce model complexity. Options 1 and 3 do exactly that. Dropout is itself a form of regularization, so decreasing the dropout probability means *less* regularization and will not help; likewise, increasing the size of each hidden layer adds capacity and makes overfitting worse.
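The effect of increasing the $L_2$ weight (option 3) can be illustrated with a minimal sketch using closed-form ridge regression on hypothetical random data; a stronger penalty shrinks the weights, reducing effective model complexity:

```python
import numpy as np

rng = np.random.default_rng(0)
# Overfitting-prone setup: fewer samples (10) than features (20), synthetic data.
X = rng.normal(size=(10, 20))
y = rng.normal(size=10)

def ridge_weights(X, y, l2):
    """Closed-form ridge solution: w = (X^T X + l2*I)^{-1} X^T y."""
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + l2 * np.eye(d), X.T @ y)

w_weak = ridge_weights(X, y, l2=1e-6)    # almost no regularization
w_strong = ridge_weights(X, y, l2=10.0)  # strong L2 regularization

# A larger L2 weight shrinks the parameter norm, constraining the model.
print(np.linalg.norm(w_strong) < np.linalg.norm(w_weak))  # True
```

The same principle carries over to neural networks: a larger $L_2$ (weight-decay) coefficient pulls the weights toward zero, which limits how closely the network can fit the training set.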
