in Machine Learning
1 vote

Suppose a classifier predicts each possible class with equal probability. If there are 10 classes, what will the cross-entropy error be on a single example?

  1. $-\log(10)$
  2. $-0.1 \log(1)$
  3. $-\log(0.1)$
  4. $-10 \log(0.1)$

2 Answers

2 votes
Since the prediction is uniform over 10 classes, $p_{i} = 0.1$ for every class. With a one-hot target $y$, only the true-class term of the cross-entropy sum survives, so the answer is option 3, $-\log(0.1)$:

$L = -\sum_{i=1}^{10} y_{i}\log(p_{i}) = -\log(p_{\text{true}}) = -\log(0.1)$
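
A minimal Python sketch (illustrative only, not part of the original answer; the true-class index and variable names are arbitrary) that evaluates this sum for a uniform 10-class prediction:

```python
import math

num_classes = 10
probs = [1.0 / num_classes] * num_classes          # uniform prediction: p_i = 0.1

true_class = 3                                     # arbitrary choice of true label
targets = [1.0 if i == true_class else 0.0
           for i in range(num_classes)]            # one-hot target y

# Cross-entropy L = -sum_i y_i * log(p_i); only the true-class term is nonzero.
loss = -sum(y * math.log(p) for y, p in zip(targets, probs))

print(loss)               # 2.302585...  (natural log)
print(-math.log(0.1))     # identical value
```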
0 votes
Since each of the 10 classes is predicted with equal probability, the probability assigned to the true class is $1/10$. The cross-entropy error $H$ for a single example is therefore:

$H = -\log(1/10) = -\log(0.1)$

The base of the logarithm is typically 2 or e (natural logarithm).
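
To make the base dependence concrete (an illustrative check, not part of the original answer), the same error evaluated with the natural logarithm and with base 2:

```python
import math

p_true = 0.1
print(-math.log(p_true))     # ~2.3026 nats (natural logarithm)
print(-math.log2(p_true))    # ~3.3219 bits (base-2 logarithm)
```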

Hence option 3 ($-\log(0.1)$) is the correct answer.
