in Algorithms
The conventional way of multiplying two n-bit integers requires O(n^2) time. There are much better ways of multiplying long integers, based on a divide-and-conquer approach. For example, we can break each n-bit integer into n/2-bit halves, and produce the result by making 3 recursive calls to n/2-bit multiplications (which are themselves done in this clever way) and several additions of n/2-bit integers. Since addition of n-bit numbers only requires O(n) time, we get a recurrence for the running time of T(n) = 3T(n/2) + cn for some constant c (which, asymptotically, doesn't matter). This algorithm is sometimes referred to as the "Karatsuba-Ofman (KO) Algorithm."
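The three-multiplication scheme described above can be sketched in a few lines of Python. This is a minimal illustration of the idea (the variable and function names are my own, and the base-case cutoff is arbitrary), not an optimized implementation:

```python
def karatsuba(a, b):
    """Multiply two non-negative integers using 3 recursive multiplications.

    Splits each operand into high/low halves of roughly n/2 bits, then
    recovers the cross term with additions/subtractions only, as in the
    KO scheme described above.
    """
    if a < 16 or b < 16:                       # base case: small operands
        return a * b
    n = max(a.bit_length(), b.bit_length())
    h = n // 2                                 # split point (~n/2 bits)
    a_hi, a_lo = a >> h, a & ((1 << h) - 1)
    b_hi, b_lo = b >> h, b & ((1 << h) - 1)
    p1 = karatsuba(a_hi, b_hi)                 # high * high
    p2 = karatsuba(a_lo, b_lo)                 # low * low
    p3 = karatsuba(a_hi + a_lo, b_hi + b_lo)   # (sum) * (sum)
    # cross term a_hi*b_lo + a_lo*b_hi = p3 - p1 - p2, no 4th multiplication
    return (p1 << (2 * h)) + ((p3 - p1 - p2) << h) + p2
```

The point of the p3 trick is that one recursive multiplication is traded for a constant number of additions and subtractions, which is exactly what turns 4T(n/2) into 3T(n/2) in the recurrence.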

Hypothetically, we can beat the KO approach if we can break our two n-bit numbers into y pieces of length n/y, and perform the multiplication of n-bit numbers by using x recursive multiplications of n/y-bit numbers plus any constant number of additions of n/y-bit numbers. What is the relationship between x and y that would make this hypothetical algorithm have a lower running time than the KO Algorithm? Identify one such pair below.

Note: because the exact calculation requires the computation of logarithms, you may wish to find and use a scientific calculator of some sort.

 
  a)  x=53; y=12.
  b)  x=44; y=11.
  c)  x=40; y=10.
  d)  x=27; y=8.
 
1 Answer


The complexity of the Karatsuba-Ofman (KO) Algorithm is O(n^(log_2 3)) = O(n^1.58496).  ---------------------(1)

The recurrence for the improved hypothetical version will be

T(n) = x*T(n/y) + O(n)

(a constant number of additions of n/y-bit numbers costs only O(n)). Since log_y x > 1 in every option, the time complexity for this recurrence is O(n^(log_y x)).    -------------------(2)

Comparing (1) and (2), we need

log_y x < log_2 3 = 1.58496...              --------------(3)

Now check the options:

(a) log_12 53 ≈ 1.5978, larger than log_2 3, so no.
(b) log_11 44 ≈ 1.5781, smaller than log_2 3, so yes.
(c) log_10 40 ≈ 1.6021, larger, so no.
(d) log_8 27 = log_2 3 exactly (since 27 = 3^3 and 8 = 2^3), equal but not smaller, so no.

Only (b) satisfies (3).
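As a quick sanity check, the exponent log_y x for each option can be computed with the standard library instead of a calculator (the option labels here are just my own bookkeeping):

```python
import math

# Candidate (x, y) pairs from the four options.
options = {"a": (53, 12), "b": (44, 11), "c": (40, 10), "d": (27, 8)}

ko_exponent = math.log2(3)  # Karatsuba exponent, ~1.58496

# log_y x via the change-of-base formula.
exponents = {opt: math.log(x) / math.log(y) for opt, (x, y) in options.items()}

for opt, e in sorted(exponents.items()):
    print(f"({opt}) log_y x = {e:.5f}   beats KO: {e < ko_exponent}")
```

Option (d) comes out essentially equal to the KO exponent (it is exactly log_2 3), which is why it ties rather than wins.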

Correct me if I am wrong.
