Which one of the following notations is most relevant for finding the best algorithm for a problem?

(A) $o(f(n))$

(B) $O(f(n))$

(C) $\omega (f(n))$

(D) $ \Omega (f(n))$

11 Comments

Arjun's reply:

First of all, when we talk about an algorithm, it must work for ALL inputs. So, we should consider the worst case only.

Remember -- the minimum number of comparisons needed to sort $n$ numbers is $\Omega(n \cdot \log n)$ -- not $n$, even though in the best case we need only $O(n)$ comparisons.

Now, for any problem, if we have a lower bound, say $\Omega(f(n))$, then it bounds the performance of the best possible algorithm. If any algorithm matches it, then we have got the best algorithm -- at least in asymptotic terms.


We can say that any algorithm gives a big O for a problem, and when this big O matches the big Omega, it becomes big Theta and we have solved the problem in the best way possible (again, only in asymptotic terms).

Now, as mentioned in the comment above, what is the criterion for the best algorithm? It can be the number of operations, the time taken, the number of memory operations, etc. By default we can assume time taken. Anyway, whichever we take, it works for the asked question.

Please do not say that big O means worst case, big Theta means average case, and big Omega means best case.
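To make the sorting example concrete, here is a small Python sketch (my own illustration, not from the thread): it runs a comparison-counting merge sort over all $5! = 120$ possible inputs and checks that the worst-case comparison count never beats the lower bound $\lceil \log_2(n!) \rceil$, which is where the $\Omega(n \cdot \log n)$ comes from.

```python
import itertools, math

def merge_sort(a, counter):
    # Sort a list while counting element comparisons in counter[0].
    if len(a) <= 1:
        return list(a)
    mid = len(a) // 2
    left = merge_sort(a[:mid], counter)
    right = merge_sort(a[mid:], counter)
    out, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        counter[0] += 1                  # one element comparison
        if left[i] <= right[j]:
            out.append(left[i]); i += 1
        else:
            out.append(right[j]); j += 1
    return out + left[i:] + right[j:]

n = 5
worst = 0
for perm in itertools.permutations(range(n)):   # all n! = 120 inputs
    counter = [0]
    merge_sort(list(perm), counter)
    worst = max(worst, counter[0])

bound = math.ceil(math.log2(math.factorial(n))) # decision-tree bound: 7 for n = 5
print(worst, bound)                             # prints: 8 7
```

Note that on some individual inputs merge sort uses fewer than $\lceil \log_2(n!) \rceil$ comparisons; the $\Omega(n \cdot \log n)$ bound constrains only the worst case, which is exactly why the best case is irrelevant for judging an algorithm.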

I am not able to understand several statements in the comment above. It would be great if you could explain.

> "the minimum number of comparisons needed to sort $n$ numbers is $\Omega(n \cdot \log n)$ -- not $n$, even though in the best case we need only $O(n)$ comparisons."

- Why is this statement here? Is it incorrect to say that the minimum number of comparisons needed to sort $n$ numbers is $\Omega(n)$, since the best case takes $O(n)$ comparisons?

> for any problem, if we have a lower bound, say $\Omega(f(n))$ ...

- I don't understand this part; the rest is okay. How do we know a lower bound for a problem without even knowing the algorithm?
> We can say that any algorithm gives a big O for a problem, and when this big O matches the big Omega, it becomes big Theta and we have solved the problem in the best way possible (again, only in asymptotic terms).

- Does that not mean that the big O of an algorithm is just as "relevant" as its big Omega?

> Please do not say that big O means worst case, big Theta means average case, and big Omega means best case.

- Why?

@toxicdesire

"the minimum number of comparisons needed to sort n numbers is Ω(n⋅logn)"

> for any problem, if we have a lower bound say Ω(f(n)) ...

- I don't understand this part, rest is okay. How do we know a lower bound to a problem, without even knowing the algorithm?

It is related to the decision-tree model for comparison-based sorting; please check Section 8.1 ("Lower bounds for sorting") of CLRS.
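In brief, the argument: any comparison sort can be modeled as a binary decision tree whose leaves are the possible output orderings. The tree must have at least $n!$ leaves (one per permutation), and a binary tree of height $h$ has at most $2^h$ leaves, so

$$2^h \geq n! \implies h \geq \log_2(n!) \geq \log_2\left(\left(\tfrac{n}{2}\right)^{n/2}\right) = \tfrac{n}{2}\log_2\tfrac{n}{2} = \Omega(n \log n),$$

using $n! \geq (n/2)^{n/2}$. The height $h$ is exactly the worst-case number of comparisons, which is why no comparison sort can beat $\Omega(n \log n)$.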

If you are still not getting it, then please check this video lecture:

If you are still not getting it, then please tell me.


> "the minimum number of comparisons needed to sort n numbers is Ω(n⋅logn) -- not n even though in best case we need O(n) comparisons only. "

- Why is this statement here? Is it incorrect to say that minimum number of comparisons needed to sort n numbers is Ω(n) since best case takes O(n) comparisons?

We analyze best, worst, and average cases when we know the algorithm for a particular problem. For example, for the sorting problem, if we use the insertion sort algorithm, then we analyze the best, worst, and average cases of this algorithm. In the best case of this algorithm the elements are already sorted, and the running time (in terms of number of comparisons + number of swaps) will be $O(n)$.
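As a small illustration (my own sketch, not from the thread), here is insertion sort instrumented to count comparisons; on already-sorted input it performs exactly $n - 1$ comparisons, i.e. $O(n)$:

```python
def insertion_sort(a):
    # Sort `a` in place and return the number of element comparisons made.
    comparisons = 0
    for i in range(1, len(a)):
        key, j = a[i], i - 1
        while j >= 0:
            comparisons += 1         # compare key against a[j]
            if a[j] <= key:
                break                # key already in place: best case exits here
            a[j + 1] = a[j]          # shift the larger element right
            j -= 1
        a[j + 1] = key
    return comparisons

print(insertion_sort(list(range(10))))        # sorted input: 9 comparisons, O(n)
print(insertion_sort(list(range(10, 0, -1)))) # reverse input: 45 comparisons, O(n^2)
```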

> for any problem, if we have a lower bound, say $\Omega(f(n))$ ...
>
> - I don't understand this part; the rest is okay. How do we know a lower bound for a problem without even knowing the algorithm?

That's why a lower bound for a problem is difficult to prove. For a lower bound, we have to prove that no algorithm anyone could ever think of for the problem can run faster than a certain speed. That is much harder to do.
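Roughly, the quantifier structure is what makes this harder: an upper bound is witnessed by a single algorithm, while a lower bound must rule out every algorithm:

$$\text{upper bound } O(f(n)):\ \exists \text{ algorithm } A\ \forall \text{ inputs } x:\ T_A(x) \leq c \cdot f(|x|), \qquad \text{lower bound } \Omega(f(n)):\ \forall A\ \exists x:\ T_A(x) \geq c' \cdot f(|x|).$$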

> We can say that any algorithm gives a big O for a problem, and when this big O matches the big Omega, it becomes big Theta and we have solved the problem in the best way possible (again, only in asymptotic terms).
>
> - Does that not mean that the big O of an algorithm is just as "relevant" as its big Omega?

Sorry, I am not getting you.

> Please do not say that big O means worst case, big Theta means average case, and big Omega means best case.
>
> - Why?

Because asymptotic notations represent sets of functions. By themselves they do not give information about best, worst, or average cases.
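For reference, the standard set definitions (as given in CLRS) are:

$$O(f(n)) = \{\, g(n) : \exists\, c > 0,\ n_0 > 0 \text{ such that } 0 \leq g(n) \leq c \cdot f(n) \text{ for all } n \geq n_0 \,\}$$

$$\Omega(f(n)) = \{\, g(n) : \exists\, c > 0,\ n_0 > 0 \text{ such that } 0 \leq c \cdot f(n) \leq g(n) \text{ for all } n \geq n_0 \,\}$$

$$\Theta(f(n)) = O(f(n)) \cap \Omega(f(n))$$

Any of the three can be applied to any case: the worst-case running time of insertion sort, for instance, is $\Theta(n^2)$, hence also $O(n^2)$ and $\Omega(n^2)$.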


@toxicdesire

> the minimum number of comparisons needed to sort n numbers is $\Omega(n \cdot \log n)$

Where did we get the minimum number of comparisons $n \log n$ from?


@ankitgupta.1729 So it's $\Omega(n \cdot \log n)$ for a comparison-based sort, in the worst case. I get it now.

However, coming back to the original question: let's say $\Omega(f(n))$ is relevant for finding the best algorithm for a problem (as the answer suggests).

$\Theta(f(n))$ seems to be the most accurate answer, but it's not an option. If $\Theta(f(n))$ is the correct answer, does that not mean both $O(f(n))$ and $\Omega(f(n))$ are equally relevant?

Now, say there's a problem A, and assume its lower bound is $\Omega(n \cdot \log n)$. Assume the problem can be solved by two algorithms, both with $\Omega(n \cdot \log n)$ as their lower bound. Tell me, which one is the best?

What gives $\Omega(f(n))$ an edge over $O(f(n))$ for choosing the best algorithm for a problem?

@srestha 


@toxicdesire

yes,

but how will $\Theta$ be a better choice than $\Omega$?

Also, can you tell me whether $\Omega$ or $\omega$ would be the better option?

See, $\Omega$ gives a lower bound for any algorithm, while in the case of $\Theta$ a tight bound is not even possible for some algorithms.


@toxicdesire

"Θ(f(n)) seems to be most accurate answer, but it's not an option. If Θ(f(n)) is the correct answer, does that not mean both O(f(n)) and Ω(f(n)) are equally relevant?"

Yes. For example, in the case of the sorting problem, heap sort and merge sort are asymptotically optimal comparison sorts (given as a corollary in CLRS), because in the worst case the $O(n \log n)$ upper bound matches the $\Omega(n \log n)$ lower bound, which implies $\Theta(n \log n)$. So we can say the "best" algorithms for comparison-based sorting are merge sort and heap sort, whose running time is $\Theta(n \log n)$. Here big O, big Omega, and big Theta all say that we have found the best algorithm for sorting, one that works for all inputs.

"Now, let's say there's a problem A. Now, let's assume lower bound to the problem A is Ω(n⋅logn). Now assume the problem can be solved with two algorithms, both have Ω(n⋅logn) as lower bounds. Tell me which one is the best one? "

Suppose you have proved the same lower bound on the number of comparisons for both algorithms; then I will check how much memory each algorithm takes, along with other parameters, and then say which one is optimal. For instance, merge sort and heap sort both run in $\Theta(n \log n)$ time, but heap sort sorts in place with $O(1)$ extra space while a standard merge sort needs $\Theta(n)$ auxiliary space.

"What gives Ω(f(n)) an edge over O(f(n)) for choosing best algorithm to a problem?"

$\Omega(f(n))$ gives a lower bound, i.e., it tells us whether we can do better than something or not. For example, we cannot sort $n$ numbers with fewer than $c \cdot n \log n$ comparisons in the worst case (for comparison-based sorting); that is, we cannot do better than $n \log n$. Lower bounds are harder to prove.
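To see the bound concretely, here is a tiny sketch (my own illustration) printing the decision-tree minimum $\lceil \log_2(n!) \rceil$ next to $n \log_2 n$ for a few values of $n$; the constants differ, but both grow as $\Theta(n \log n)$:

```python
import math

# Decision-tree lower bound on worst-case comparisons vs. n*log2(n).
for n in (4, 8, 16, 32):
    min_cmp = math.ceil(math.log2(math.factorial(n)))  # ceil(log2(n!))
    print(n, min_cmp, round(n * math.log2(n)))
# prints: 4 5 8 / 8 16 24 / 16 45 64 / 32 118 160
```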


@ankitgupta.1729 Alright, got it. Also,

> Suppose you have proved the same lower bound on the number of comparisons for both algorithms; then I will check how much memory each algorithm takes, along with other parameters, and then say which one is optimal.

- Won't we check the upper bounds of the two algorithms first, and choose the one with the lower $O(f(n))$?


@toxicdesire yes, we will check.


