2 votes

Consider the following program code, where pow() is the exponentiation function.

int f(int n)
{
    if (n <= 1) return 1;
    return f(n-1) + g(n) + g(pow(2, n-1));
}

int g(int n)
{
    if (n <= 1) return 1;
    return 1 + g(n/2);
}

What is the worst-case time complexity of $f(n)$?

  1. $O(2^n)$
  2. $O(n^2)$
  3. $O(2^n \log (n))$
  4. $O( \log ^n (n))$

1 comment

$f(n)=f(n-1)+g(n)+g(2^{n-1})$


$g(n) = g(n/2)+1$

$\Rightarrow g(n) = \Theta(\log n)$ (by the Master Theorem)
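
Spelling out the Master Theorem step (an added aside, not part of the original comment): the recurrence has $a=1$, $b=2$, and a driving term of $\Theta(1)$; since $n^{\log_b a}=n^{\log_2 1}=n^0=\Theta(1)$ matches the driving term, case 2 gives

$g(n)=\Theta(n^{0}\log n)=\Theta(\log n)$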

 

$g(2^{n-1})=g(2^{n-2})+1$

The recursion stops when the argument reaches 1, i.e. when the exponent reaches 0. Each call halves the argument, which decreases the exponent by 1, so the exponent goes from $n-1$ down to $0$. $\Rightarrow \Theta(n)$
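
Unrolling the recurrence makes this count explicit (a worked step added here, not in the original comment):

$g(2^{n-1}) = g(2^{n-2}) + 1 = g(2^{n-3}) + 2 = \dots = g(2^0) + (n-1) = \Theta(n)$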


So,

$f(n)=f(n-1)+\Theta(\log n)+\Theta(n)$

$\Rightarrow f(n)=f(n-1)+\Theta(n)$

$\Rightarrow f(n)=O(n^2)$
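
Expanding the recurrence term by term shows where the quadratic comes from (a sketch added here, assuming a constant $c$ that bounds the non-recursive work at each level):

$f(n) = f(n-1) + cn = f(n-2) + c(n-1) + cn = \dots = f(1) + c\sum_{k=2}^{n} k = O(n^2)$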

 

Sorry, I don't know how to solve it formally. :(


2 Answers

2 votes
Here $T_g$ denotes the running time of g and $T$ the running time of f.

$\begin{array}{ll} T_g(n) &= T_g\left(\frac{n}{2}\right)+\Theta(1) \\ T_g(n) &= O(\log_2 n) \quad \dots (1) \\ T(n) &= T(n-1)+T_g(n)+T_g(2^{n-1}) \\ \Rightarrow T(n) &= T(n-1)+O(\log_2 n)+O(n) \quad \text{(using (1))} \\ T(n) &= \sum_{i=1}^{n} O(i) + \sum_{i=1}^{n} O(\log i) \\ &= O(n^2)+O(\log n!) \\ &= O(n^2)+O(\log n^n) \\ &= O(n^2)+O(n \log n) \\ &= O(n^2) \end{array}$
So option B, $O(n^2)$, is correct.

1 comment

@Applied Course either you or only God knows what logic you're applying in your explanations. Perhaps you don't know the definition of big-O: since it is only an upper bound, all of the options are technically correct :/

0 votes

@Arjun

Sir, can you please verify my answer?

From observation, we can say the time complexity of g(n) is $O(\log n)$, since each call halves n.

Hence the time complexity of g($2^{n-1}$) will be $O(\log(2^{n-1})) = O(n)$.

Now

f(n-1) + g(n) + g(pow(2, n-1));

This statement is executed about $n$ times, once for each level of the recursion on f. Per execution, the work contributed by each term is:

- f(n-1): no extra work beyond the recursive call itself
- g(n): $O(\log n)$
- g(pow(2, n-1)): $O(n)$

Hence, overall, the total work is $\max\big(n \times O(\log n),\; n \times O(n)\big)$.

So the time complexity is $O(n^{2})$.
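
As a sanity check (not part of the original answer), here is a minimal C sketch that counts every call to f and g and compares the count against $n^2$; the calls counter and the left shift used in place of pow() are illustrative assumptions, not the original code:

#include <stdio.h>

/* Counts every call to f and g. */
static long long calls = 0;

long long g(long long n)
{
    calls++;
    if (n <= 1) return 1;
    return 1 + g(n / 2);
}

long long f(int n)
{
    calls++;
    if (n <= 1) return 1;
    /* 1LL << (n - 1) stands in for pow(2, n - 1) to stay in integer arithmetic. */
    return f(n - 1) + g(n) + g(1LL << (n - 1));
}

int main(void)
{
    for (int n = 8; n <= 32; n *= 2) {
        calls = 0;
        f(n);
        printf("n = %2d  calls = %5lld  calls / n^2 = %.2f\n",
               n, calls, (double)calls / ((double)n * n));
    }
    return 0;
}

The total call count grows as $\Theta(n^2)$, so the ratio calls / $n^2$ stays bounded by a constant as $n$ increases, which is what the $O(n^2)$ answer predicts.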

                            

 

