In the Hamming method, to transmit a message of size 12 B of data, what is the minimum number of padding bits needed to correct a single-bit error?
in Computer Networks

1 comment

7 ??

1 Answer

2 votes
Best answer

Using a Hamming code, we can correct single-bit errors. For that we need to insert some extra bits among the bits of the message at specified locations; these are the padding bits (usually called check or parity bits). They are placed at the positions that are powers of 2, i.e. at position numbers 1, 2, 4, 8, etc., counting from the left, i.e. from the MSB (most significant bit).
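As a small illustration (my own sketch, not part of the original answer), the following Python snippet lists the check-bit positions in an (m + r)-bit codeword, assuming positions are numbered 1, 2, 3, ... and the check bits sit at the powers of two as described above:

```python
# A minimal sketch: positions of the check (padding) bits in an (m + r)-bit
# Hamming codeword, assuming 1-based position numbering and check bits at
# the powers of two.
def check_bit_positions(m, r):
    total = m + r
    # p is a power of two iff it has exactly one bit set: p & (p - 1) == 0
    return [p for p in range(1, total + 1) if p & (p - 1) == 0]

print(check_bit_positions(m=8, r=4))   # [1, 2, 4, 8] for a (12, 8) codeword
```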

Now, to find how many padding bits are needed, we have the following inequality:

2^r  >=  m + r + 1

The reason behind this inequality is that the r check bits form a syndrome that must identify an error in any one of the m + r codeword positions, plus the no-error case; so the 2^r possible syndromes must cover m + r + 1 cases.

Here the given message size = 12 B = 96 bits.

Hence m = 96 bits.

Thus we have:

          2^r  >=  96 + r + 1

==>    2^r  >=  97 + r

==>    r_min  =  7      (since 2^6 = 64 < 97 + 6 = 103, while 2^7 = 128 >= 97 + 7 = 104)

Hence the minimum number of padding bits = 7
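As a quick check of this arithmetic (my own sketch, not part of the original answer), a few lines of Python that search for the smallest r satisfying the inequality:

```python
# Find the smallest r with 2^r >= m + r + 1 for m data bits.
def min_check_bits(m):
    r = 1
    while 2 ** r < m + r + 1:
        r += 1
    return r

m = 12 * 8                 # 12 B = 96 data bits
print(min_check_bits(m))   # 7, since 2^6 = 64 < 103 but 2^7 = 128 >= 104
```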


4 Comments

Hamming code is an error-correcting code, right? It can correct a single-bit error.

It does not work for multiple-bit errors.

Here we are asked about the padding bits to be added to the data, which is an entirely different question. What you are talking about relates to the minimum Hamming distance. Regarding that we have:

a) To correct 'd' errors, minimum Hamming distance = 2d + 1

b) To detect 'd' errors, minimum Hamming distance = d + 1

But here we are talking about Hamming code usage instead. It is not to be confused with Hamming distance. :)
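As a small illustration of those two rules (my own sketch, not from the comment), stated the other way around in Python:

```python
# A code with minimum Hamming distance d_min can detect up to d_min - 1
# errors and correct up to (d_min - 1) // 2 errors, which restates
# "correct d errors => d_min = 2d + 1" and "detect d errors => d_min = d + 1".
def capabilities(d_min):
    return {"detect": d_min - 1, "correct": (d_min - 1) // 2}

print(capabilities(3))   # {'detect': 2, 'correct': 1} -> single-bit correction
```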

yeah got confused.. thanks :) @habib