@hemant
I think you have a point, but isn't there always a constant-factor deviation?
I mean, the definition states that
g(n) is in O(f(n)) if, for some CONSTANT c,
g(n) <= c * f(n) for all sufficiently large n.
But why aren't we expressing the bounds as constants, rather than in terms of n, n^2, etc.?
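To make the constant-factor point concrete, here is a minimal sketch (the functions g and f, and the witness values c and n0, are made up for illustration) checking the definition numerically: g(n) = 3n + 5 is in O(n) because it sits under c * f(n) once n is large enough.

```python
def g(n):
    return 3 * n + 5  # hypothetical function we want to bound

def f(n):
    return n  # candidate bound: claim is g(n) in O(n)

c, n0 = 4, 5  # witness constant and threshold: 3n + 5 <= 4n whenever n >= 5

# Verify g(n) <= c * f(n) for a range of n >= n0
print(all(g(n) <= c * f(n) for n in range(n0, 10_000)))
```

The constant c absorbs any fixed multiplicative deviation, which is exactly why O(3n) and O(n) denote the same class.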