
Consider the conditional entropy and mutual information for the binary symmetric channel. The input source has alphabet $X=\{0,1\}$ with associated probabilities $\left\{\dfrac{1}{2}, \dfrac{1}{2}\right\}$. The channel matrix is $\begin{pmatrix} 1-p & p \\ p & 1-p \end{pmatrix}$, where $p$ is the transition probability. Then the conditional entropy is given by:

  1. $1$
  2. $-p \log(p)-(1-p) \log(1-p)$
  3. $1+p \log(p)+(1-p) \log(1-p)$
  4. $0$

1 comment

Kindly help, anyone?

1 Answer


The answer is B: $-p \log(p) - (1-p)\log(1-p)$.

2 Comments

Explanation, please?
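For reference, a short derivation sketch (logarithms base 2, so entropies are in bits):

Each input symbol is flipped with probability $p$, so for either input value
$$H(Y \mid X = x) = -p\log(p) - (1-p)\log(1-p) = H(p).$$
Averaging over the uniform input distribution gives
$$H(Y \mid X) = \tfrac{1}{2}H(p) + \tfrac{1}{2}H(p) = -p\log(p) - (1-p)\log(1-p),$$
which is option B. With a uniform input and a symmetric channel the output is also uniform, so $H(Y) = 1$ and, by the same symmetry, $H(X \mid Y) = H(p)$ as well; the conditional entropy is option B whichever direction is meant. The mutual information referred to in the question is then $I(X;Y) = H(Y) - H(Y \mid X) = 1 + p\log(p) + (1-p)\log(1-p)$, which is option C.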