in Algorithms
20,109 views
66 votes

Let $A[1,\ldots,n]$ be an array storing a bit ($1$ or $0$) at each location, and let $f(m)$ be a function whose time complexity is $\Theta(m)$. Consider the following program fragment written in a C-like language:

counter = 0;
for (i = 1; i <= n; i++) {
    if (a[i] == 1) counter++;
    else { f(counter); counter = 0; }
}

The complexity of this program fragment is

  1. $\Omega(n^2)$

  2. $\Omega (n\log n) \text{ and } O(n^2)$

  3. $\Theta(n)$

  4. $o(n)$


4 Comments

Sorry, it's C.

@ Ankit Chourasiya

Then why is the complexity of f(m) given?

For those who think the complexity will be O(n^2) when the bit string is all ones with a 0 as the last element:

just remember that at the end we should not multiply but add, so it will be (n-1) + Θ(n-1), i.e., Θ(n).

13 Answers

87 votes
Best answer

The key part in the code is "$\text{counter} = 0$" in the else part as we can see below.

Let's take the best case. This happens when $a[i] = 1$ for all $i$: the loop then executes with time complexity $\Theta(1)$ per iteration, giving an overall time complexity of $\Theta(n)$. So the time complexity of the code fragment is $\Omega(n)$, and hence options A and B are false.

Now, consider the worst case. This happens when $a[i] = 0$, i.e., when the else part is executed. Here, the time complexity of each such iteration is $\Theta(\text{counter})$, and after each else, counter is reset to $0$. Let $k$ iterations go to the else part in the worst case. Then the worst case time complexity is $\Theta(x_1) + \Theta(x_2) + \dots +\Theta(x_k) + \Theta(n-k)$, where $x_i$ is the value of the counter when $A[i] = 0$ and $f(\text{counter})$ is called. But due to $\text{counter} = 0$ after each call to $f()$, we have

$x_1+x_2+\ldots + x_k = n-k$ (counter is incremented $n-k$ times in total by the "if" part).

$\implies \Theta(x_1) + \Theta(x_2) + \dots +\Theta(x_k) + \Theta(n-k) = \underbrace{\Theta(n-k)}_{\text{combined execution of f across all iterations}} + \underbrace{\Theta(k)}_{k \text{ times else part statement}} + \underbrace{\Theta(n-k)}_{n-k\text{ times if part}}$

$\qquad \qquad = \Theta(k) + \Theta(n-k) = \Theta(n)$.

Since the time complexity is $\Omega(n)$ as well as $O(n)$, we can say it is $\Theta(n)$ - Option (C). (Option D is false because little $o$ requires the growth rate to be STRICTLY lower, not equal-or-lower as with big $O$.)

If $\text{counter} = 0$ were not there in the else part, then the time complexity would be $\Omega(n)$ and $O(n^2)$: in the worst case we could have alternating $1$'s and $0$'s in array $a$, giving time complexity $\Theta(1) + \Theta(2) + \cdots + \Theta(n/2)$ for the calls to $f$ plus $\Theta(n)$ for the loop itself, which is $\Theta(n^2)$.

Correct Answer: C.


4 Comments

@samarpita @Abhrajyoti00 Is it fine now?


@gatecse yes sir

@Abhrajyoti00😀


If counter = 0 were not there, then the worst-case array content would be like 1010…, i.e., alternating and equal numbers of 1's and 0's.

25 votes

 Please note that inside the else condition, f() is called first, then counter is set to 0.

Consider the following cases:

a) All 1s in A[]: Time taken is Θ(n), as
                  only counter++ is executed, n times.

b) All 0s in A[]: Time taken is Θ(n), as
                  f(0) is called n times (each call does constant work).

c) Half 1s, then half 0s: Time taken is Θ(n), as
                  f(n/2) is called once at the first 0, and f(0) on the rest.

1 comment

One more case: the first (n-1) positions are 1 and the last (nth) position is 0.

In this case the if branch executes n-1 times, making counter = n-1.

When i = n, the else branch executes f(n-1) once.

Therefore

T(n) = (n-1)·O(1) + 1·Θ(n-1) = O(n).

Since both the best case and the worst case are Θ(n), option C is correct.
4 votes
The answer should be C, which is Θ(n),

because in the best case the array contains all 1's, so only the if branch executes, n times, giving Ω(n).

In the worst case the time complexity is O(n):

suppose the if branch executes n/2 times, so counter = n/2; on the (n/2 + 1)-th iteration the else branch runs f(n/2) once, taking Θ(n/2) time, and resets counter to 0; the remaining n/2 iterations run the if branch. The total time is (n/2) + (n/2) + (n/2), so the worst-case time complexity is O(n).
3 votes

I think the best case is when all of the array elements contain 1; then it will be Ω(n).

In the worst case, when all of the array elements contain 0, it will be f(0) + f(0) + … + f(0), i.e., n calls of f with counter = 0 (it totally depends on f(m)).

Or if the array is like 111…1 (m ones) followed by 000…, then the time is 1 + 1 + … + 1 + f(m) + f(0) + … + f(0) (then also it totally depends on f(m)).


2 Comments

I think your analysis of the worst-case time complexity is wrong, because each time f(m) is invoked counter becomes 0, and as the time complexity of f(m) is O(m), the call would take O(0) time when counter is zero. The worst case would be O(n).
No, when the function is called counter remains the same, because counter = 0 comes after the call to f(counter), so counter is unchanged at the time of the call.
Answer: C