in Algorithms
31,860 views
50 votes

What is the $\text{time complexity}$ of the following recursive function?

int DoSomething (int n) {
    if (n <= 2)
        return 1;
    else
        return (DoSomething (floor (sqrt(n))) + n);
}
  1. $\Theta(n^2)$

  2. $\Theta(n \log_2n)$

  3. $\Theta(\log_2n)$

  4. $\Theta(\log_2\log_2n)$

4 Comments

@ASNR1010

The recurrence relation is:

T(n) = T(sqrt(n)) + 1

because in a recurrence, the time complexity depends on the total number of recursive calls, not on the value added to the result.

Suppose instead the code were:

int DoSomething (int n) {
    if (n <= 2)
        return 1;
    else {
        for (int i = 0; i < n; i++)
            printf ("*");
        return (DoSomething (floor (sqrt(n))) + n);
    }
}

(The loop must come before the return; if it were placed after the return, it would be unreachable.)

In that case recurrence is:

T(n) = T(sqrt(n)) + n (the n term appears because the loop runs n times in each call)
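For what it's worth, a quick sketch of how this modified recurrence solves out (each term is the square root of the previous one, so the first term dominates):

$T(n) = n + \sqrt n + n^{1/4} + \cdots + \Theta(1) \le n + \sqrt n \cdot \log\log n = \Theta(n)$

since there are only about $\log\log n$ terms after the first, each at most $\sqrt n$.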
8
It is a pretty easy question considering the level of GATE; it is more suitable for exams like ISRO and UGC NET.
1
In your given case, what will be the solution of the recurrence relation?
0

8 Answers

88 votes
Best answer

We are asked for the time complexity, which here is just the number of recursive calls, since each call performs a constant number of operations plus one recursive call. The recurrence relation (taking the constant time "$c$" as $1$) is

$T(n) = T(\sqrt n) + 1$

$\qquad= T\left(n^{1/4}\right) + 2 $
$\\\qquad=T\left(n^{1/8}\right) + 3 $

Going like this, we will eventually reach $T(3)$ or $T(2)$. Asymptotically this doesn't matter, so we can assume we reach $T(2)$ and in the next step reach $T(1)$. All we want to know is how many steps it takes to reach $T(1)$, which is $1 + $ the number of steps to reach $T(2)$.

From the recurrence relation we know that $T(2)$ happens when $n^{\left(\frac{1}{2^k}\right)} = 2$.

Taking $\log $ and equating,

$\frac{1}{2^k} \log n = 1$
$ \\\implies 2^k = \log n$
$ \\\implies k = \log \log n$. 

So, $T(1)$ happens in $\log \log n + 1$ calls, but for asymptotic complexity we write this as $\Theta \left( \log \log n\right)$.


Alternatively,

Substituting values

$T(1) = 1$
$T(2) = 1$
$T(3) = T(1) + 1 = 2$
$\vdots$
$T(8) = T(2) + 1 = 2$
$T(9) = T(3) + 1 = 3$
$\vdots$

$T\left(\left({\left({2^2}\right)^2}\right)^2\right) = T\left(\left({2^2}\right)^2\right) + 1$
$ \\\quad= T(2^2)+ 2$
$ \\\quad= T(2) + 3 = 1 + 3 = 4, $
$\\ \log \log n = 3 \text{ as } n = 256$.


$T\left(\left({\left({\left({\left({2^2}\right)^2}\right)^2}\right)^2}\right)^2\right) = 6,$
$\\\quad \log \log n = 5 \text{ as } n = 65536 \times 65536 = 2^{32}$

$T\left(2^{(2^{10})}\right) = T\left(2^{512}\right) + 1 $
$\\\quad= T(2^{256}) + 2 $
$\\\quad= T(2^{128}) + 3$
$\\\quad = T(2^{64}) + 4 $
$\\\quad= T(2^{32}) + 5 $
$\\\quad= T(2^{16}) + 6 $
$\\\quad= T(2^8)+7 $
$\\\quad= T(2^4) + 8 $
$\\\quad= T(2^2) + 9 $
$\\\quad= T(2) + 10 = 11,$
$\\\quad \log \log n = 10$

So, answer is D.

http://stackoverflow.com/questions/16472012/what-would-cause-an-algorithm-to-have-olog-log-n-complexity

4 Comments

I guess a lot of people, like me, didn't notice that the 'n' in the return statement is outside the DoSomething recursive call, hence the confusion.
8
@arjun sir thanks for this beautiful explanation
0

@aspi_r_osa

I made the same mistake of taking the recurrence relation for time as T(n) = T(sqrt(n)) + n, which gives the answer Θ(n).

But this is wrong: the addition takes only constant time. That n is there only to create confusion, so treat it as a constant; it is not an argument to the function being called recursively.

So the correct recurrence relation for time is T(n) = T(sqrt(n)) + c, and solving it gives

Θ(log log n).

Hope it is clear now!

1
32 votes

The recurrence relation for DoSomething() is

  T(n) = T(√n) + C1, if n > 2  

We have ignored the floor() part, as it doesn't matter here whether it is a floor or a ceiling.

  Let n = 2^m, so T(n) = T(2^m)
  Let T(2^m) = S(m)

  From the above two, T(n) = S(m)

  S(m) = S(m/2) + C1   /* This is simply the binary search recurrence */
  S(m) = O(log m)
       = O(log log n)  /* Since m = log n */

  Now, let us go back to the original recursive function:
  T(n) = S(m)
       = O(log log n)

4 Comments

thanks got it
0

If it had been

T(n) = T(√n) + n, it would mean that to solve a problem of size n we need to solve one of size √n and then do O(n) extra work, i.e., iterate over n items, which is not the case here. The extra work per call takes only C1 time (simple statements take O(1)).
0
What is the n here? How is it 2^m? Can someone please explain?
0
21 votes

T(n) = T(√n) + c

Let n = 2^m; hence m = log n.
T(2^m) = T(√(2^m)) + c = T(2^(m/2)) + c
Let S(m) = T(2^m).

Now S(m) = S(m/2) + c.
Apply the (extended) master theorem:
a = 1, b = 2, k = 0, p = 0
a = b^k and p > -1,
hence S(m) = Θ(log m),
and hence T(n) = Θ(log log n)  (since m = log n).
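The substitution step can be spelled out in the same notation:

$n = 2^m \implies \sqrt{n} = \left(2^m\right)^{1/2} = 2^{m/2}$

so $T(n) = T(\sqrt n) + c$ becomes $T\left(2^m\right) = T\left(2^{m/2}\right) + c$, and writing $S(m) = T\left(2^m\right)$ gives $S(m) = S(m/2) + c$.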


4 Comments

@Ahwan Why did you take "c" instead of "n"?
0
It is calling itself and doing constant work. Except for calling itself, is it doing anything of O(n)?
1

  T(2^m) = S(m)

  S(m) = S(m/2) + c

How did this come? I don't want to memorize it.

0
Here we have p > -1, and the case for that is T(n) = Θ(n^(log_b a) · log^(p+1) n).
0
4 votes

return (DoSomething(floor(sqrt(n))) + n)

Here the added n costs only constant time, so

$T(n) = T(\sqrt{n}) + 1$

$T(n) = T(n^{1/2}) + 1$  (EQU-A)

Putting $n^{1/2}$ in place of $n$, we get

$T(n^{1/2}) = T(n^{1/4}) + 1$  (EQU-1)

Now put EQU-1 into EQU-A:

$T(n) = T(n^{1/4}) + 1 + 1$. Doing more of this back substitution, we get the generalized equation

$T(n) = T\left(n^{1/2^i}\right) + \sum_{1}^{i}1$  (EQU-F)

The recursion stops when $n^{1/2^i} = 2$. Taking log on both sides, we get

$\frac{1}{2^i}\log n = 1 \implies \log n = 2^i$. Taking log again on both sides,

$\log\log n = i$

$\sum_{1}^{i}1 = i$

Putting the value of $i$ into EQU-F, we get time complexity $= O(\log\log n)$.

2 Comments

Thanks
0
How can you tell that n is constant, as it varies in every call?

I am not able to understand that point; can you please elaborate?
1
Answer: D
