In this case, since each class is predicted with equal probability, the probability assigned to the true class is 1/10 (assuming there are 10 classes). The cross-entropy error H for a single example is therefore:
H = -log(1/10) = -log(0.1)
The base of the logarithm is typically e (natural logarithm) or 2; the natural logarithm is the usual convention in machine learning, while base 2 gives the value in bits.
Hence, answer 'C' is correct.
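The calculation above can be checked with a short snippet (a minimal sketch, assuming 10 classes and a uniform prediction, as in the example):

```python
import math

num_classes = 10                  # assumed number of classes from the example
p_true = 1 / num_classes          # uniform prediction assigns 1/10 to the true class

# Cross-entropy for a single example: H = -log(p_true)
h_nats = -math.log(p_true)        # natural log (base e), the usual ML convention
h_bits = -math.log2(p_true)       # base 2, measured in bits

print(f"H (nats): {h_nats:.4f}")  # ≈ 2.3026
print(f"H (bits): {h_bits:.4f}")  # ≈ 3.3219
```

Note that the two values differ only by the constant factor log(2), so the choice of base does not change which class assignment minimizes the loss.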