in Digital Logic recategorized by
How many bits are required to represent a $32$-digit decimal number in binary?

  1. $6$ bits
  2. $32$ bits
  3. $106$ bits
  4. $107$ bits

1 Answer

$\underbrace{9 \: 9 \: 9 \: \dots \: 9 \: 9 \: 9}_{32 \text{ decimal digits}}{}_{10}$ is the largest $32$-digit decimal number possible, so we need $2^n \geq \underbrace{99 \dots 9}_{32 \text{ digits}}$.
Hint: approximate the bound.
$\begin{array}{l} 99 \approx 10^2 \\ 999 \approx 10^3 \\ 9999 \approx 10^4 \\ \vdots \\ \underbrace{99 \dots 9}_{32 \text{ digits}} \approx 10^{32} \end{array}$
So it suffices to find $n$ with $2^n > 10^{32}$ (the RHS has been incremented by $1$ for the approximation).
Apply $\log_{10}$ on both sides:
$n \log_{10} 2 > 32 \log_{10} 10$
$n > \dfrac{32}{\log_{10} 2} \Rightarrow n > 106.30$
Do not round $106.30$ down to $106$, because $n$ must be greater than $106.30$; the smallest integer greater than $106.30$ is $107$. So $107$ bits are required.
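The derivation above can be sanity-checked in a few lines of Python (an illustrative sketch, not part of the original answer): Python integers report their own bit length exactly, and the logarithm route matches it.

```python
import math

# Largest 32-digit decimal number: 99...9 (32 nines) = 10**32 - 1.
largest = 10**32 - 1

# Exact bit count, straight from the integer itself.
exact_bits = largest.bit_length()

# Via logarithms, as in the derivation: smallest integer n with n > 32 / log10(2).
approx_bits = math.ceil(32 / math.log10(2))

print(exact_bits)   # 107
print(approx_bits)  # 107
```

Both routes agree on $107$, confirming that rounding $106.30$ up (not down) is the correct step.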
Answer: $107$ bits (option 4).
