35 votes
double foo(int n)
{
    int i;
    double sum;
    if(n == 0)
    {
        return 1.0;
    }
    else
    {
        sum = 0.0;
        for(i = 0; i < n; i++)
        {
            sum += foo(i);
        }
        return sum;
    }
}

Suppose we modify the above function $foo()$ so that it stores the values of $foo(i)$, $0 \leq i < n$, as and when they are computed. With this modification the time complexity for function $foo()$ is significantly reduced. The space complexity of the modified function would be:

  1. $O(1)$
  2. $O(n)$
  3. $O(n^2)$
  4. $n!$

2 Comments

Here the time complexity is reduced because we already have all the values of foo(i) for 0 <= i < n. But to store these values we obviously need extra space, i.e., an array of size n which holds the value of foo(i) for i = 0 to n-1, so this takes an additional O(n) space. In other words, the reduction in time is paid for with extra space.
1
All previous GATE questions are answered on site. You can either search for them or browse from the "Previous" tab.
1

5 Answers

60 votes
Best answer

Given program:

#include <stdio.h>
double foo(int n) {
	int i;
	double sum;
	if(n == 0) {
	    return 1.0;
	}
	else {
		sum = 0.0;
		for(i = 0; i < n; i++) {
		    sum += foo(i);
		}
		return sum;
	}
}

int main() {
	double a = foo(5);
	printf("%.2f\n",a);
}

And here is the present situation:

Here we can see that we have lots of overlapping subproblems and far too many function calls:

  1. Number of function calls = $2^n$
  2. Stack depth used = $O(n)$

Therefore space is linear and time is exponential.

If we take a small number, say $n = 4$, then we get $8.0$ as the answer, i.e. $foo(n) = 2^{n-1}$ for $n \geq 1$ (a short derivation is given just after the list below),

and 

  1. Stack depth used = $5$.
  2. Number of function calls = $2^4 = 16$.
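
A short derivation of both figures (added here; it follows directly from the recurrence in the code): for $n \geq 2$,

$$foo(n) = \sum_{i=0}^{n-1} foo(i) = foo(n-1) + \sum_{i=0}^{n-2} foo(i) = 2\,foo(n-1),$$

and since $foo(1) = foo(0) = 1$, we get $foo(n) = 2^{n-1}$ for all $n \geq 1$. The call count obeys the same pattern: $C(n) = 1 + \sum_{i=0}^{n-1} C(i)$ with $C(0) = 1$, so $C(n) = 2\,C(n-1) = 2^{n}$, which matches the $2^4 = 16$ calls counted above for $n = 4$.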

Now, using a one-dimensional ($1D$) table we can reduce the number of function calls and the depth of stack space used as well.

Here is what we want to achieve:

We reuse already computed $foo(x)$ values. For this purpose, we will use one $1D$ array of doubles of size $n+1$ (indices $0$ to $n$, matching the code below).

Here is what we are going to do:

  1. First, check the $1D$ table for the value of the required call.
  2. If a computed value is found there, return it without making any recursive calls.
  3. Only if it is not found, run the loop of recursive calls and store the result.

Here is the code:

#include <stdio.h>
#include <stdlib.h>
#define UNUSED 0.5    // sentinel: foo() only ever returns 1.0, 2.0, 4.0, ..., never 0.5
double *memo_table;   // memo_table[i] caches the value of foo(i)
double foo(int n) {
	int i;
	double sum;
	if(memo_table[n] != UNUSED) {
		return memo_table[n];       // already computed: table lookup, no further recursion
	} else {
		sum = 0.0;
		for(i=0;i<n;i++) {
			sum += foo(i);
		}
		return memo_table[n] = sum; // store the result in the table before returning it
	}
}
int main() {
	int n,i;
	scanf("%d",&n);
	memo_table = malloc((1+n)*sizeof(double));
	for(i=0;i<=n;i++) memo_table[i] = UNUSED;
	// base case
	memo_table[0] = 1.0;
	double a = foo(n);
	printf("%lf\n",a);
	free(memo_table);
}

Improvements over the given program:

  1. Stack space used by function calls is reduced to a constant number of levels, i.e. $O(1)$ depth, since every nested call hits the table and returns immediately.
  2. Extra space used for the $1D$ array = $O(n)$.
  3. More importantly, time is reduced to $O(n^2)$. (Exponential to quadratic!)

Overall space complexity = stack + extra = $O(1) + O(n) = O(n) $
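
A quick check of the quadratic time figure (added note): each $foo(i)$ is actually computed only once, and that single computation runs a loop of $i$ iterations (all further calls to it are table hits), so

$$\text{total loop iterations} = n + \sum_{i=1}^{n-1} i = \frac{n(n+1)}{2} = O(n^2),$$

while the $O(n)$ term in the space bound above is exactly the memo_table allocated in main().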

Answer is B.


4 Comments

For each function call to foo(), a loop will run. There are a total of $n$ different foo(x) values to compute, and computing each one takes linear time, so the total time is $O(n^2)$.
4

@dd
Yeah. Space complexity is $\mathrm{O}(n)$ but the time complexity is $\mathrm{O}(n^2)$.

Time complexity is measured by how many times the $\mathrm{O}(1)$ operations execute, not merely by how many times the function is called. Here the '+=' operator is used $\mathrm{O\left(\frac{n(n-1)}{2}\right)}=\mathrm{O}(n^2)$ times even with the prescribed optimization.

3

Time complexity is $O(n^2)$.

E.g., when foo(4) executes:

  1. foo(0) is computed first.
  2. foo(1) uses the stored value of foo(0).
  3. foo(2): its loop runs 2 times and uses the stored values of foo(0) and foo(1).
  4. foo(3): its loop runs 3 times and uses the stored values of foo(0), foo(1) and foo(2).

This is similar to the sum of the first $n$ natural numbers, so the complexity is $O(n^2)$.

6
18 votes

The modification being proposed is the memoization (top-down) technique of Dynamic Programming.

So let us have an auxiliary array which stores the value of $foo(i)$ once it is computed, so that it can be reused at a later stage. $foo(i)$ is then never recomputed; instead a lookup from the table takes place. It would be something like:

#include <stdio.h>
#define MAX 1000
double table[MAX];          // table[i] holds foo(i) once computed; -1 means "not computed yet"
double foo(int n) {
	if(table[n] == -1) {    // not computed yet: compute it and store at table[n]
		double sum = 0.0;
		int i;
		for(i = 0; i < n; i++)
			sum += foo(i);
		table[n] = (n == 0) ? 1.0 : sum;
	}
	return table[n];        // otherwise return table[n] directly
}
int main(void) {
	int i;
	for(i = 0; i < MAX; i++)   // initialize all elements of table[] to -1
		table[i] = -1;
	printf("%.2f\n", foo(5));  // prints 16.00
	return 0;
}

Using this approach, we need an auxiliary array of size $n$ to store $foo(i)$, where $i$ ranges from $0$ to $n-1$.

So the space complexity of this method is $O(n)$, and the function computes $foo(n) = 2^{n-1}$.

Hence the correct option should be (B).


3 Comments

What would be the running time complexity?
0

Dulqar  

time complexity is O(n) in this code.

3

@Bikram sir, is the time complexity for the above program $O(n)$ or $O(n^2)$?

1
2 votes
Could anybody find the time complexity of foo(int n) through the recursion tree method?

Thank you
2 votes

For part one of the question (non-DP version):

Space complexity: $S(n) = S(n-1) + 1$, which gives $O(n)$ stack depth.

Time complexity: $T(n) = 2T(n-1) + 1$, which gives $O(2^n)$.

Reason: the calls for $i = 0$ to $n-2$ together cost roughly $T(n-1)$, and the single call for $i = n-1$ costs another $T(n-1)$.

For the second part of the question (DP version):

Space complexity: no recurrence relation required; the memo table itself is $O(n)$.

Time complexity: $T(n) = T(n-1) + n$, which gives $O(n^2)$.

Reason: only one $T(n-1)$ (each of $foo(0), \dots, foo(n-1)$ is computed once) plus $n$ units of work for the loop.
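
Unrolling these recurrences (an added sanity check, not part of the original answer) gives the figures quoted elsewhere in the thread:

$$T(n) = 2T(n-1) + 1 \;\Rightarrow\; T(n) = 2^{n}T(0) + (2^{n} - 1) = O(2^{n}),$$

$$S(n) = S(n-1) + 1 \;\Rightarrow\; S(n) = S(0) + n = O(n),$$

$$T(n) = T(n-1) + n \;\Rightarrow\; T(n) = T(0) + \sum_{i=1}^{n} i = O(n^{2}).$$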

