19 votes

For constants $a \geq 1$ and $b>1$, consider the following recurrence defined on the non-negative integers:

$$T(n) = a \cdot T \left(\dfrac{n}{b} \right) + f(n)$$ Which one of the following options is correct about the recurrence $T(n)$?

  1. If $f(n)$ is $n \log_2(n)$, then $T(n)$ is $\Theta(n \log_2(n))$
  2. If $f(n)$ is $\dfrac{n}{\log_2(n)}$, then $T(n)$ is $\Theta(\log_2(n))$
  3. If $f(n)$ is $O(n^{\log_b(a)-\epsilon})$ for some $\epsilon >0$, then $T(n)$ is $\Theta(n^{\log_b(a)})$
  4. If $f(n)$ is $\Theta(n^{\log_b(a)})$, then $T(n)$ is $\Theta(n^{\log_b(a)})$
in Algorithms retagged by
8.0k views

4 Comments

No, but this time the questions were jumbled up.
1
Questions are jumbled, but if you look at the right-hand corner of every question you'll observe a question ID. For instance, for this question the ID was ...8232513139…, so the last 2 digits, "39", denote that it is question 39; hence you can be sure that it's a 2-marker.
PS: questions 01-25 are 1-markers and 26-55 are 2-markers; the range of GA question IDs is 8232512991-8232513100, with a similar marking scheme.
2
Thanks @pranav_mehta, this was really helpful.
0

3 Answers

24 votes
Best answer

Options A and B are definitely wrong: the condition on $f(n)$ cannot be independent of $a$ and $b$; in any case, it has to take both $a$ and $b$ into account (see the concrete check below).
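For a concrete check (the values $a = 4$, $b = 2$ are chosen here just for illustration): with $a = 4$ and $b = 2$ we get $n^{\log_b a} = n^2$, and $f(n) = n \log_2 n = O(n^{2-\varepsilon})$ (take $\varepsilon = 0.5$), so case $1$ below gives $T(n) = \Theta(n^2)$, not $\Theta(n \log_2 n)$; option A therefore fails for this choice of $a$ and $b$. For option B, note that $T(n) \geq f(n) = \dfrac{n}{\log_2 n}$ (assuming $T$ is non-negative), which already grows faster than $\log_2 n$, so $T(n)$ cannot be $\Theta(\log_2 n)$ for any $a \geq 1$ and $b > 1$.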

Option C is correct: it is the standard case $1$ of the Master theorem. If $f(n)$ is polynomially smaller than $n^{\log_b a}$, then $T(n) = \Theta(n^{\log_b a})$ (see case $1$ below).

$\textbf{Theorem 4.1 (Master theorem)}$

Let $a \geq 1$ and $b>1$ be constants, let $f(n)$ be a function, and let $T(n)$ be defined on the nonnegative integers by the recurrence

$$T(n) = aT(n/b) + f(n),$$

where we interpret $n/b$ to mean either $\left \lfloor n/b \right \rfloor$ or $\left \lceil n/b \right \rceil.$ Then $T(n)$ has the following asymptotic bounds:

  1. If $f(n) = O(n^{\log_{b}a-\varepsilon})$ for some constant $\varepsilon >0,$ then $T(n) = \Theta (n^{\log_{b}a}).$
  2. If $f(n) = \Theta (n^{\log_{b}a}),$ then $T(n) = \Theta(n^{\log_{b}a}\lg n).$
  3. If $f(n) = \Omega(n^{\log_{b}a+\varepsilon})$ for some constant $\varepsilon > 0,$ and if $af(n/b)\leq cf(n)$ for some constant $c < 1$ and all sufficiently large $n,$ then $T(n) = \Theta (f(n)).$

Reference

Option D is wrong (see case $2$ above): when $f(n) = \Theta(n^{\log_b a})$, the Master theorem gives $T(n) = \Theta(n^{\log_b a} \lg n)$, not $\Theta(n^{\log_b a})$.
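A quick numerical sanity check for this (my own sketch; the values $a = 2$, $b = 2$, $f(n) = n$ are example choices that satisfy $f(n) = \Theta(n^{\log_b a})$):

```python
# Sketch: evaluate T(n) = a*T(n/b) + f(n) for example values a = 2, b = 2, f(n) = n,
# so that f(n) = Theta(n^(log_b a)) = Theta(n).
# Case 2 of the Master theorem predicts T(n) = Theta(n log n), not Theta(n).
import math
from functools import lru_cache

A, B = 2, 2                 # example constants of the recurrence

def f(n):
    return n                # driving function with f(n) = Theta(n^(log_b a))

@lru_cache(maxsize=None)
def T(n):
    if n <= 1:
        return 1            # constant work at the base case
    return A * T(n // B) + f(n)

for n in (2**10, 2**15, 2**20):
    # T(n)/n keeps growing, while T(n)/(n*log2(n)) settles near a constant.
    print(n, T(n) / n, T(n) / (n * math.log2(n)))
```

The first ratio grows without bound while the second one approaches a constant, which is exactly what case $2$ predicts.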

A good slide deck to understand the Master theorem and the idea behind it.

edited by

2 Comments

Best👍.

Sir, a doubt: if the regularity condition in case $3$ is not satisfied, what are the further steps to take?

0
0 votes
  1. It is a standard case of the extended Master theorem (recalled below).
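For reference, the extended form of case $2$ as I recall it: if $f(n) = \Theta\!\left(n^{\log_b a} \log^k n\right)$ with $k > -1$, then $T(n) = \Theta\!\left(n^{\log_b a} \log^{k+1} n\right)$, and if $k = -1$, then $T(n) = \Theta\!\left(n^{\log_b a} \log\log n\right)$. For example, with $a = b$ and $f(n) = \dfrac{n}{\log_2 n}$ (option B with $k = -1$), this gives $T(n) = \Theta(n \log\log n)$, which is again not $\Theta(\log_2 n)$.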
0 votes
In the recurrence relation, $f(n)$ is the extra time beyond the time taken by the smaller subproblems.

So we can say that the total time is

$T(n) = $ time taken by the smaller subproblems $+$ any other extra time.

With this, we can conclude the following relation:

$T(n) \geq f(n)$
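Unrolling the recurrence makes this explicit (assuming $f(n) \geq 0$, a non-negative base case, and $n$ a power of $b$ for simplicity):

$$T(n) = \sum_{i=0}^{\log_b n - 1} a^i \, f\!\left(\frac{n}{b^i}\right) + a^{\log_b n}\, T(1) \;\geq\; f(n),$$

since the $i = 0$ term alone is already $f(n)$ and every term is non-negative.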

And the above inequality is satisfied only in option C.
edited by

4 Comments

But bro, you wrote it as the time taken for merging the solutions; shouldn't it be "time taken to split the subproblems + time to merge them + some extra time"?

Plus, I'm not getting your logic here. How are you concluding that only option C is true from this? It doesn't seem a valid conclusion.

You did an actual time comparison there, but the question gives you an asymptotic comparison. How can you say that $f(n) < T(n)$ holds only in option C? It might be true in the other options as well; we don't know the actual constants, or do we?
0
Yes, to be specific, it's an asymptotic comparison only. And I think the meaning of "merge" was not very clear and could be confused with the merge procedure, so I have updated my answer.

You can take any recursive equation and check whether this property holds or not. If it doesn't, please let me know, because this answer was based only on my basic understanding.
0
How do options (A) or (D) not satisfy the inequality $T(n) \geq f(n)$?

And even if you decide to drop the equality sign, this method still won't work.

Take $T(n) = T(n/2) + O(n)$: here $f(n) = O(n)$, and so is $T(n)$. I hope you see the flaw here.
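(A quick expansion, assuming $f(n) \leq cn$ for all $n$ and a constant base case: $T(n) \leq cn + c\dfrac{n}{2} + c\dfrac{n}{4} + \cdots + T(1) \leq 2cn + T(1)$, so $T(n) = O(n)$ here, even though $T(n) \geq f(n)$ still holds.)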

I agree this will be true for all such shrinking recurrences, but I don't think this procedure is the correct way to do it.

Read the topic from CLRS or from any good resource.
0
Answer:
