Change the loss function:
Choosing a loss function that includes a regularization term (for example, an L1 or L2 penalty on the weights) can guide the model toward better generalization by discouraging overly large or complex parameter values.
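As a minimal sketch of this idea, here is a mean-squared-error loss with an added L2 weight penalty; the function name and the regularization strength `lam` are illustrative choices, not from the original text:

```python
import numpy as np

def mse_with_l2(y_true, y_pred, weights, lam=0.01):
    # Standard mean squared error on the predictions.
    mse = np.mean((y_true - y_pred) ** 2)
    # L2 penalty: large weights increase the loss, nudging the
    # optimizer toward simpler solutions that generalize better.
    penalty = lam * np.sum(weights ** 2)
    return mse + penalty
```

With `lam = 0`, this reduces to plain MSE; increasing `lam` trades training-set fit for smaller weights.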
Reduce model complexity:
Simplifying the model architecture, like using fewer layers or parameters, can make it less prone to memorizing the training data and more likely to generalize well to new, unseen data.
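To make "fewer layers or parameters" concrete, this hypothetical helper counts the parameters of a fully connected network from its layer sizes, showing how much a smaller architecture shrinks capacity (the specific layer sizes are illustrative assumptions):

```python
def mlp_param_count(layer_sizes):
    # Each fully connected layer has n_in * n_out weights plus n_out biases.
    return sum(n_in * n_out + n_out
               for n_in, n_out in zip(layer_sizes[:-1], layer_sizes[1:]))

big = mlp_param_count([10, 256, 256, 1])  # wider, deeper network
small = mlp_param_count([10, 16, 1])      # simplified network
```

The simplified network has orders of magnitude fewer parameters, leaving it far less room to memorize individual training examples.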
Increase the training data:
More data can provide the model with a broader view of the underlying patterns in the data, making it less likely to overfit specific examples. It's like giving the model a richer experience to learn from.
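One way to see this effect is a toy experiment (entirely illustrative, not from the original): fit a flexible degree-9 polynomial to noisy samples of sin(x) and measure error on a held-out grid. With few samples the fit tends to chase noise; with many samples it tends to track the underlying pattern:

```python
import numpy as np

rng = np.random.default_rng(0)

def fit_and_test_error(n_train, degree=9):
    # Draw noisy training samples of sin(x) on [-3, 3].
    x = rng.uniform(-3, 3, n_train)
    y = np.sin(x) + rng.normal(0, 0.1, n_train)
    # Fit a high-capacity polynomial model.
    coeffs = np.polyfit(x, y, degree)
    # Evaluate against the true function on a dense held-out grid.
    x_test = np.linspace(-3, 3, 200)
    return np.mean((np.polyval(coeffs, x_test) - np.sin(x_test)) ** 2)
```

Comparing `fit_and_test_error(12)` with `fit_and_test_error(500)` typically shows the held-out error dropping sharply as the training set grows, even though the model class is unchanged.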
Increase the number of optimization routine steps:
Increasing the number of optimization steps tends to make the model more prone to overfitting, especially if the model is already complex: training for longer gives it more opportunity to fit noise in the training set. This option is counterproductive here, so it is better to focus on the other strategies.
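A common remedy related to this point is early stopping: halt training once validation loss stops improving instead of running the optimizer for a fixed, large number of steps. A minimal sketch (the function and its `patience` parameter are assumptions for illustration):

```python
def early_stop(val_losses, patience=3):
    # Return the step at which to stop training: the first step where
    # validation loss has not improved for `patience` consecutive steps.
    best, best_step = float("inf"), 0
    for step, loss in enumerate(val_losses):
        if loss < best:
            best, best_step = loss, step
        elif step - best_step >= patience:
            return step
    return len(val_losses) - 1
```

In a typical overfitting run, validation loss falls and then rises again while training loss keeps falling; stopping near the minimum of the validation curve limits the damage from extra optimization steps.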
So, the effective ways to combat overfitting from your list would be:
Change the loss function.
Reduce model complexity.
Increase the training data.
So, the correct answer is C.