Suppose you have a three-class problem where class label \( y \in \{0, 1, 2\} \), and each training example \( \mathbf{X} \) has 3 binary attributes \( X_1, X_2, X_3 \in \{0, 1\} \). How many parameters do you need to know to classify an example using the Naive Bayes classifier?

(a) 5

(b) 9

(c) 11

(d) 13

(e) 23
in Artificial Intelligence

1 Answer

1 vote

I think option (c), i.e. 11, is correct.

Under the Naive Bayes assumption the attributes are conditionally independent given the class, so

P(Y | X1, X2, X3) ∝ P(Y) · P(X1 | Y) · P(X2 | Y) · P(X3 | Y)

We only need the numerator: the denominator P(X1, X2, X3) is the same for every class, so it cancels when the final probabilities are compared with one another.
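This comparison of unnormalised numerators can be sketched in Python. The parameter values below are hypothetical, purely for illustration; they are not given in the question:

```python
# Hypothetical parameters for illustration (not given in the question).
priors = [0.5, 0.3, 0.2]                  # P(Y=y) for y = 0, 1, 2
p_x1 = [[0.9, 0.2, 0.4],                  # p_x1[y][i] = P(X_i = 1 | Y = y)
        [0.1, 0.8, 0.5],
        [0.3, 0.6, 0.7]]

def classify(x):
    """Return argmax_y of the unnormalised posterior
    P(Y=y) * prod_i P(X_i = x_i | Y=y); the evidence term
    P(X1, X2, X3) is skipped since it is the same for all y."""
    scores = []
    for y in range(3):
        score = priors[y]
        for i, xi in enumerate(x):
            score *= p_x1[y][i] if xi == 1 else 1 - p_x1[y][i]
        scores.append(score)
    return scores.index(max(scores))

print(classify([1, 0, 0]))  # class 0 wins under these made-up parameters
```

Note that only one likelihood value per attribute per class is stored: P(X_i = 0 | Y) is recovered as the complement of P(X_i = 1 | Y).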

For the priors, we need only two of the three values, since the last one can be found using:

       P(Y=0) + P(Y=1) + P(Y=2) = 1

For the likelihoods, we need only one parameter per attribute per class, because P(X_i = 0 | Y) + P(X_i = 1 | Y) = 1.

So the total is 3 attributes × 3 classes = 9 likelihood parameters, plus 2 prior parameters, giving 11.
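The counting argument generalises to any number of classes and binary attributes; a minimal sketch (the function name is my own, for illustration):

```python
def naive_bayes_param_count(num_classes, num_binary_features):
    """Count the independent parameters of a Naive Bayes classifier
    with binary attributes.

    Priors: the class probabilities sum to 1, so only
    num_classes - 1 of them are free.
    Likelihoods: each attribute needs one value P(X_i = 1 | Y = y)
    per class; P(X_i = 0 | Y = y) is just its complement.
    """
    priors = num_classes - 1
    likelihoods = num_binary_features * num_classes
    return priors + likelihoods

print(naive_bayes_param_count(3, 3))  # 2 + 9 = 11
```

Plugging in the question's values (3 classes, 3 binary attributes) gives 11, matching option (c).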

1 comment

Yes C is the answer
