in Algorithms
768 views
0 votes
How many key comparisons are needed to find one of the k smallest of n keys? What are the lower bound and the upper bound?

For calculating the lower bound, should we consider the case where the keys are all in non-increasing order, so that after n-k comparisons we have found one of the k smallest keys and are done? And for calculating the upper bound, what order of keys should be considered?

1 Answer

2 votes
Best answer

I suppose here the upper bound and lower bound cases would be the same.
To find one of the k smallest keys, we need to look at at least n-k+1 elements to be sure that we have chosen one among the k smallest: at most k-1 keys lie outside any window of n-k+1 elements, so the minimum of that window is beaten by at most k-1 keys overall and must be among the k smallest. Finding that minimum takes n-k comparisons.
The question says nothing about the order of the keys, so we must always look at n-k+1 elements, i.e. make n-k comparisons.
Time complexity should be ϴ(n-k).
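The argument above can be sketched in code (an illustrative Python sketch, not part of the original answer; the function name is made up):

```python
def one_of_k_smallest(keys, k):
    """Return a key guaranteed to be among the k smallest of `keys`,
    plus the number of comparisons used.

    Take the minimum of the first n-k+1 elements: at most k-1 keys
    lie outside that window, so its minimum is beaten by at most k-1
    keys overall and must be among the k smallest. This uses exactly
    n-k comparisons.
    """
    n = len(keys)
    window = keys[:n - k + 1]
    best = window[0]
    comparisons = 0
    for x in window[1:]:
        comparisons += 1
        if x < best:
            best = x
    return best, comparisons
```

For keys = [10, 4, 8, 5, 2, 1, 6] and k = 3, this inspects 5 elements and makes 7 - 3 = 4 comparisons; the result 2 is indeed among the 3 smallest keys.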


11 Comments

The lower bound and upper bound being the same, we can use $\Theta$ :)

If the question were like this:

We are given n keys and an integer k such that 1 ≤ k ≤ n. Give an efficient algorithm to find the kth smallest key.

Suppose the list is 10,4,8,5,2,1,6

So, what is the number of comparisons needed to find the 3rd smallest element?

What should we do here?

First we do sorting.

The best case of any sort takes O(n) comparisons.

Then get the element in O(1) time.

Right?
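A minimal sketch of the sort-then-index idea being discussed (illustrative Python; `kth_smallest_by_sorting` is a made-up name):

```python
def kth_smallest_by_sorting(keys, k):
    """Return the kth smallest key (1-indexed) by sorting first.

    Comparison-based sorting costs O(n log n) in general; reading the
    kth element off the sorted result then costs O(1).
    """
    return sorted(keys)[k - 1]
```

For the list above, `kth_smallest_by_sorting([10, 4, 8, 5, 2, 1, 6], 3)` yields 4.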


@srestha goel.. the question you have asked is about finding the kth order statistic.
There are two good ways to find it:
1. Sort the array first, which takes Ω(n log n) time (for comparison-based sorting), then read off the kth element of the array, which takes O(1). Overall the algorithm takes ϴ(n log n).

2. There is an algorithm, similar to quicksort, called quickselect.
We first partition the array around a pivot and then look for the kth smallest element in the partition that must contain it.
You can see the algorithm here: https://en.wikipedia.org/wiki/Quickselect
It has a worst-case time of O(n²), but a best-case and expected time of O(n).
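A short quickselect sketch along the lines of the Wikipedia description (illustrative Python; it uses list partitions rather than in-place swaps for clarity):

```python
def quickselect(a, k):
    """Return the kth smallest element (1-indexed) of sequence `a`.

    Partition around a pivot, then recurse only into the part that
    must contain the answer: expected O(n), worst case O(n^2).
    """
    pivot = a[len(a) // 2]
    less = [x for x in a if x < pivot]
    equal = [x for x in a if x == pivot]
    greater = [x for x in a if x > pivot]
    if k <= len(less):
        return quickselect(less, k)
    if k <= len(less) + len(equal):
        return pivot
    return quickselect(greater, k - len(less) - len(equal))
```

For the earlier example, `quickselect([10, 4, 8, 5, 2, 1, 6], 3)` returns 4, the 3rd smallest key.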

Is it necessary that we restrict ourselves to comparison-based sorting (merge sort, quick sort, selection sort), if those take more time, Ω(n log n)? Other sorts take Ω(n) in the best case.
Even if you use those, in the best case the complexity would remain the same, I think.
Also, there are other disadvantages of using such sorts: the space complexity can be really huge when the range of the numbers is really large.
So it may not be a really efficient solution.
And here we cannot predict how often a best or worst case would occur.
I am not at all concerned about other things.

But why could we not take the time complexity as O(n) if we are searching in a doubly linked list,

and O(n log n) if we are searching in an array [because after sorting we do binary search to find the kth element]?

Right?
In an array, once it is sorted, we can directly access A[k] (assuming the array index starts from 1) and get the answer; there is no need for binary search.

And I didn't get your point about the doubly linked list. How can we find the kth smallest element in O(n) in a doubly linked list?
We can do the search in both (1) an array and (2) a linked list.

Yes, for an array there is no need for binary search.

And in a circular doubly linked list (sorry, not just doubly), the access time of any element is O(1), right?

Then the time complexity will be O(n) [for the kth smallest element].

Why would we take O(n log n) time when we can do it in only O(n)?

Well, I don't think it is possible to do that in O(n) time.
1. The sorting you are talking about has a best case of Ω(n), but we cannot tell directly whether a given array is a best case or not. In the worst case, non-comparison sorts like bucket sort can degrade to O(n²).
So saying that it will take O(n) time is clearly not correct.
2. In a doubly or circular linked list, the access time of an arbitrary element is not O(1).
Suppose we want to access the (n/2)th element: we need to start from the head or the tail (which are the same node in a circular list) and traverse n/2 elements, which is O(n).
And to first sort the list using your method, we may require O(n²) time.
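To illustrate point 2, here is a small walk through a linked list showing that reaching the kth node costs k-1 pointer hops (illustrative Python; the `Node` class and `kth_node` helper are made up for this sketch):

```python
class Node:
    def __init__(self, value):
        self.value = value
        self.next = None

def kth_node(head, k):
    """Return (value, hops) for the kth node (1-indexed).

    There is no direct indexing into a linked list: we must follow
    k-1 `next` pointers, so access is O(n), not O(1).
    """
    node = head
    hops = 0
    for _ in range(k - 1):
        node = node.next
        hops += 1
    return node.value, hops
```

For a 7-node list, reaching the 4th node takes 3 hops; on average this traversal is linear in the list length.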

OK, getting your view. So if we have to select a sort between (A) insertion sort and (B) merge sort for this algorithm, we will select merge sort, right? As insertion sort's worst and average cases have the possibility of O(n²) complexity. Right?

Best case and worst case can be known only after the operation is performed, right?

And if we have to select between (A) insertion sort and (B) quick sort?

Then we can select insertion sort, as quick sort's worst case also gives O(n²). Right?

Your first two points are right, that's exactly what I wanted to say :)

But the last one isn't.
In quicksort, the worst case occurs very rarely (assuming each permutation of the input is equally likely).
I don't remember the exact analysis now, but it's given in Cormen.
Although the worst-case time of quicksort is O(n²), its EXPECTED time is still O(n log n). The probability of the worst case can be lowered even further by using the randomized version of quicksort.

So, to answer your last question: if nothing is mentioned about the input, we should choose quicksort (asymptotically better), because it has the smaller expected running time.
But if we are told the numbers in the array are nearly sorted, then we can choose insertion sort, as it produces better results for such cases, close to Θ(n) time.
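The trade-off in the last paragraph can be made concrete by counting comparisons (an illustrative Python sketch; the example inputs are made up):

```python
def insertion_sort(a):
    """Sort `a` in place; return the number of key comparisons made.

    On nearly sorted input each key moves only a few positions, so
    the comparison count stays close to n rather than n^2/2.
    """
    comparisons = 0
    for i in range(1, len(a)):
        key = a[i]
        j = i - 1
        while j >= 0:
            comparisons += 1
            if a[j] > key:
                a[j + 1] = a[j]   # shift larger key one slot right
                j -= 1
            else:
                break
        a[j + 1] = key
    return comparisons
```

For the nearly sorted input [1, 2, 4, 3, 5, 6, 8, 7] this makes 9 comparisons (close to n = 8), while the shuffled [8, 3, 6, 1, 7, 2, 5, 4] needs 22.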

