Consider the problem of multiplying two *n* digit integers. The algorithm that we
learned in school multiplies each digit of one number by each digit of the other.
This requires *n*^{2} multiplications followed by
some appropriately aligned additions. So the algorithm is O(*n*^{2}) -- which was the answer
to problem 2.10b in the text. For large values of *n* this is not very good.
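As a concrete baseline, the school algorithm might be sketched as follows (a minimal sketch -- the class and method names are our own, not from the text; digits are stored least-significant first):

```java
import java.util.Arrays;

// Schoolbook multiplication of two numbers stored as base-10 digit
// arrays, least-significant digit first. Every digit of x is multiplied
// by every digit of y, so this is O(n^2).
public class Schoolbook {
    public static int[] multiply(int[] x, int[] y) {
        int[] result = new int[x.length + y.length]; // room for all product digits
        for (int i = 0; i < x.length; i++) {
            int carry = 0;
            for (int j = 0; j < y.length; j++) {
                int t = result[i + j] + x[i] * y[j] + carry;
                result[i + j] = t % 10;
                carry = t / 10;
            }
            result[i + y.length] += carry;
        }
        return result;
    }

    public static void main(String[] args) {
        // 1234 * 5678 = 7006652, printed low-order digit first
        int[] p = multiply(new int[]{4, 3, 2, 1}, new int[]{8, 7, 6, 5});
        System.out.println(Arrays.toString(p)); // [2, 5, 6, 6, 0, 0, 7, 0]
    }
}
```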

One might try a recursive divide and conquer approach. Assume that *n* is an even power of 2.
It is easy to see that an *n* digit integer can be viewed as the sum
of two *n*/2 digit integers
with one of the integers shifted *n*/2 places to the left before the addition.
Hence, the *n* digit integer
*X* in base 10 can be expressed as:

X = 10^{n/2}a + b

and the *n* digit integer *Y* as:

Y = 10^{n/2}c + d

where *a,b,c,d* are *n*/2 digit integers. This suggests that we could solve the
multiplication problem recursively by finding the product:

XY = (10^{n/2}a + b)(10^{n/2}c + d)

Unfortunately, multiplying this out gives us:

XY = 10^{n}ac + 10^{n/2}(ad + cb) + bd

which seems to leave us with 4 multiplications, each half the size of the original
problem. According to the master theorem, this would still be an O(*n*^{2})
algorithm.

Can we improve on this? Can we get by with only 3 multiplications? The trick is to notice that:

ad + cb = (a + b)(c + d) - ac - bd

and thus that

10^{n/2}(ad + cb) = 10^{n/2}((a + b)(c + d) - ac - bd)

Since we would be computing *ac* and *bd* in any case, this means we
only need one further *n*/2 digit multiplication, (*a* + *b*)(*c* + *d*). So we
can use as a basis for recursion the formula:

XY = 10^{n}ac + 10^{n/2}((a + b)(c + d) - ac - bd) + bd

which only requires 3 multiplications of *n*/2 digit numbers along with some shifting
and addition. According to the master theorem, this is an
O(*n*^{log_2 3}) = O(*n*^{1.59}) algorithm (this was the
solution to recursion d in assignment #2). For large values of *n* this is a significant
improvement over the school algorithm. Note that this approach works equally well for base 2
-- shifts in modern processors are extremely cheap.
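For concreteness, here is a minimal sketch of this recursion in Java (our own code, not from the text), using BigInteger for nonnegative operands so that carries are handled for us and base-10 shifts become multiplication by a power of ten:

```java
import java.math.BigInteger;

// A sketch of the 3-multiplication recursion described above,
// for nonnegative operands. Not optimized -- just the idea.
public class Karatsuba {
    public static BigInteger multiply(BigInteger x, BigInteger y) {
        int n = Math.max(x.toString().length(), y.toString().length());
        if (n <= 2) return x.multiply(y); // solve small cases directly

        int half = n / 2;
        BigInteger shift = BigInteger.TEN.pow(half);

        // X = 10^{n/2} a + b,  Y = 10^{n/2} c + d
        BigInteger[] xs = x.divideAndRemainder(shift);
        BigInteger[] ys = y.divideAndRemainder(shift);
        BigInteger a = xs[0], b = xs[1], c = ys[0], d = ys[1];

        // only three recursive multiplications
        BigInteger ac = multiply(a, c);
        BigInteger bd = multiply(b, d);
        BigInteger mid = multiply(a.add(b), c.add(d)).subtract(ac).subtract(bd);

        // XY = 10^n ac + 10^{n/2}(ad + cb) + bd
        return ac.multiply(shift).multiply(shift).add(mid.multiply(shift)).add(bd);
    }

    public static void main(String[] args) {
        System.out.println(multiply(BigInteger.valueOf(1234),
                                    BigInteger.valueOf(5678))); // prints 7006652
    }
}
```

Worked by hand for X = 1234 and Y = 5678: a = 12, b = 34, c = 56, d = 78, so ac = 672, bd = 2652, and (a + b)(c + d) - ac - bd = 46 · 134 - 672 - 2652 = 2840 = ad + cb; combining gives 6,720,000 + 284,000 + 2,652 = 7,006,652.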

The multiplication of polynomials in one variable works exactly the same way -- except polynomials are easier
to multiply than integers, since one does not need to worry about carries. Two polynomials of
degree
*n* can be multiplied either by the school algorithm, given in figure 3.19 on p. 65 of
our text, or by a divide and conquer recursion similar to the algorithm suggested above for
integers. The former is
O(*n*^{2}) while the latter is O(*n*^{log_2 3}).
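The school algorithm for polynomials is just a convolution of the coefficient arrays; a minimal sketch (our own names, with coefficients stored low-order first) looks like this:

```java
import java.util.Arrays;

// Schoolbook polynomial multiplication: coefficient i + j of the product
// accumulates x[i] * y[j]. O(n^2) multiplications, and no carries.
public class PolySchool {
    public static int[] multiply(int[] x, int[] y) {
        int[] result = new int[x.length + y.length - 1];
        for (int i = 0; i < x.length; i++)
            for (int j = 0; j < y.length; j++)
                result[i + j] += x[i] * y[j];
        return result;
    }

    public static void main(String[] args) {
        // (1 + 2x)(3 + 4x) = 3 + 10x + 8x^2
        System.out.println(Arrays.toString(multiply(new int[]{1, 2},
                                                    new int[]{3, 4}))); // [3, 10, 8]
    }
}
```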

Assume, for the divide and conquer algorithm,
that the number of terms is an integral power of 2, and that we are given two polynomials
*X* and *Y* of degree *n* - 1. Then the algorithm could be described as follows:

Use an Array or ArrayList to store the coefficients of a polynomial, and implement the divide and conquer approach to polynomial multiplication.

1. If *n* = 1 or *n* = 2, solve the problem directly; otherwise:

2. Construct four half-size polynomials *a*, *b*, *c* and *d*, of degree *n*/2 - 1, out of *X* and *Y*. Divide *X* up into parts *a* and *b*, and divide *Y* up into parts *c* and *d* -- so *a* and *c* will have the low order coefficients (indices 0 through *n*/2 - 1) of *X* and *Y*, and *b* and *d* will have the high order coefficients (indices *n*/2 through *n* - 1).

3. Calculate the sums *x'* = *a* + *b* and *y'* = *c* + *d* -- note that *x'* and *y'* are also polynomials of degree *n*/2 - 1.

4. Calculate *temp* = *x'* · *y'* recursively.

5. Calculate *ac* and *bd* recursively.

6. Let *temp* = *temp* - *ac* - *bd*.

7. Combine *ac*, *bd*, and *temp*, which are all polynomials of degree *n* - 2 (each fitting in *n* - 1 coefficients), into a new polynomial with 2*n* coefficients, placing *ac* in indices 0 through *n* - 1, *bd* in indices *n* through 2*n* - 1, and adding *temp* to the values now in indices *n*/2 through 3*n*/2 - 1. Notice that the highest coefficient will be 0.
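The seven steps above can be sketched as follows (a minimal sketch with our own names, assuming plain int arrays of coefficients stored low-order first, and *n* a power of 2):

```java
import java.util.Arrays;

// Divide and conquer polynomial multiplication following steps 1-7.
// Inputs have n coefficients (degree n - 1); the result has 2n
// coefficients, with the highest one always 0.
public class PolyDC {
    public static int[] multiply(int[] x, int[] y) {
        int n = x.length;
        int[] result = new int[2 * n];
        if (n <= 2) { // step 1: solve small cases directly
            for (int i = 0; i < n; i++)
                for (int j = 0; j < n; j++)
                    result[i + j] += x[i] * y[j];
            return result;
        }
        int h = n / 2;
        // step 2: a, c take the low-order halves; b, d the high-order halves
        int[] a = Arrays.copyOfRange(x, 0, h), b = Arrays.copyOfRange(x, h, n);
        int[] c = Arrays.copyOfRange(y, 0, h), d = Arrays.copyOfRange(y, h, n);
        // step 3: x' = a + b and y' = c + d, coefficient by coefficient
        int[] xp = new int[h], yp = new int[h];
        for (int i = 0; i < h; i++) { xp[i] = a[i] + b[i]; yp[i] = c[i] + d[i]; }
        // steps 4-5: three recursive multiplications
        int[] temp = multiply(xp, yp);
        int[] ac = multiply(a, c);
        int[] bd = multiply(b, d);
        // step 6: temp = temp - ac - bd
        for (int i = 0; i < temp.length; i++) temp[i] -= ac[i] + bd[i];
        // step 7: ac at index 0, bd at index n, temp added at index n/2
        for (int i = 0; i < n; i++) {
            result[i] += ac[i];
            result[i + n] += bd[i];
            result[i + h] += temp[i];
        }
        return result;
    }

    public static void main(String[] args) {
        int[] p = multiply(new int[]{1, 2, 3, 4}, new int[]{5, 6, 7, 8});
        System.out.println(Arrays.toString(p)); // highest coefficient is 0
    }
}
```

The placement in step 7 mirrors the integer formula: with *X* = *a* + *x*^{n/2}*b* and *Y* = *c* + *x*^{n/2}*d*, the product is *ac* + *x*^{n/2}*temp* + *x*^{n}*bd*.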

EXTRA EXTRA CREDIT: Using random arrays, compare
the running times of both the school algorithm and the divide and conquer algorithm for *n* ranging over the even powers of 2 up through 128. Do your experimental results confirm your theoretical prediction? Why or why not?

EXTRA EXTRA EXTRA CREDIT: Can this be done any faster by dividing the polynomial into 3 parts? How many
multiplications would be required? What is the general formula for the number of multiplications
required given division into *k* parts? What does this say about the theoretical lower limit
on the order of growth for multiplication?