Polynomial Multiplication Problem

Consider the problem of multiplying two n-digit integers. The algorithm we learned in school multiplies each digit of one number by each digit of the other. This requires n^2 multiplications, followed by some appropriately aligned additions. So the algorithm is O(n^2) -- which was the answer to problem 2.10b in the text. For large values of n this is not very good.

One might try a recursive divide-and-conquer approach. Assume that n is an even power of 2. An n-digit integer can be viewed as the sum of two n/2-digit integers, with one of them shifted n/2 places to the left before the addition. Hence the n-digit integer X in base 10 can be expressed as:

X = 10^(n/2) a + b

and the n-digit integer Y as:

Y = 10^(n/2) c + d

where a, b, c, and d are n/2-digit integers. This suggests that we could solve the multiplication problem recursively by finding the product:

XY = (10^(n/2) a + b)(10^(n/2) c + d)

Unfortunately, multiplying this out gives us:

XY = 10^n ac + 10^(n/2) (ad + cb) + bd

which seems to leave us with 4 multiplications, each half the size of the original problem. That gives the recurrence T(n) = 4T(n/2) + O(n), so according to the master theorem this would still be an O(n^2) algorithm.
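To make the four-multiplication split concrete, here is a small Java sketch (the class and method names are illustrative, not from the text) that splits two small integers into halves and recombines them with the expansion above:

```java
public class FourMult {
    // Multiply two n-digit non-negative integers (n even, small enough to
    // fit in a long) via the straightforward four-multiplication split:
    // XY = 10^n * ac + 10^(n/2) * (ad + cb) + bd
    static long splitMultiply(long x, long y, int n) {
        long shift = (long) Math.pow(10, n / 2);
        long a = x / shift, b = x % shift;   // X = 10^(n/2) * a + b
        long c = y / shift, d = y % shift;   // Y = 10^(n/2) * c + d
        return shift * shift * (a * c) + shift * (a * d + c * b) + b * d;
    }

    public static void main(String[] args) {
        // 1234 * 5678 computed from the four half-size products
        System.out.println(splitMultiply(1234, 5678, 4)); // prints 7006652
    }
}
```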

Can we improve on this? Can we get by with only 3 multiplications? The trick is to notice that:

ad + cb = (a + b)(c + d) - ac - bd

and thus that

10^(n/2) (ad + cb) = 10^(n/2) ((a + b)(c + d) - ac - bd)

Since we would be computing ac and bd in any case, the only additional multiplication needed is the n/2-digit product (a + b)(c + d). So we can use as a basis for recursion the formula:

XY = 10^n ac + 10^(n/2) ((a + b)(c + d) - ac - bd) + bd

which requires only 3 multiplications of n/2-digit numbers, along with some shifting and addition. According to the master theorem, this is an O(n^(log2 3)) ≈ O(n^1.59) algorithm (this was the solution to recursion d in assignment #2). For large values of n this is a significant improvement over the school algorithm. Note that this approach works equally well in base 2 -- shifts in modern processors are extremely cheap.
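A minimal Java sketch of the three-multiplication recursion (names are illustrative; it assumes non-negative operands small enough to fit in a long, so it demonstrates the idea rather than true arbitrary-precision arithmetic):

```java
public class Karatsuba {
    // Three-multiplication recursive scheme on non-negative longs.
    // Base case: either operand has a single digit.
    static long multiply(long x, long y) {
        if (x < 10 || y < 10) return x * y;
        int n = Math.max(numDigits(x), numDigits(y));
        long shift = pow10(n / 2);
        long a = x / shift, b = x % shift;   // X = shift * a + b
        long c = y / shift, d = y % shift;   // Y = shift * c + d
        long ac = multiply(a, c);
        long bd = multiply(b, d);
        long cross = multiply(a + b, c + d) - ac - bd;  // = ad + cb
        return ac * shift * shift + cross * shift + bd;
    }

    static int numDigits(long v) { int k = 0; while (v > 0) { k++; v /= 10; } return k; }
    static long pow10(int e) { long r = 1; while (e-- > 0) r *= 10; return r; }

    public static void main(String[] args) {
        System.out.println(multiply(1234, 5678)); // prints 7006652
    }
}
```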

The multiplication of polynomials in one variable works exactly the same way -- except that polynomials are easier to multiply than integers, since one does not need to worry about carries. Two polynomials of degree n can be multiplied either by the school algorithm, given in figure 3.19 on p. 65 of our text, or by a divide-and-conquer recursion similar to the algorithm suggested above for integers. The former is O(n^2) while the latter is O(n^(log2 3)).
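For reference, the school algorithm for polynomials is just a double loop over the coefficients (a sketch, not the exact code of figure 3.19):

```java
public class SchoolPoly {
    // School polynomial multiplication: every coefficient of p times every
    // coefficient of q, accumulated by exponent. Coefficients are stored
    // low-order first, so p[i] is the coefficient of x^i.
    static int[] multiply(int[] p, int[] q) {
        int[] r = new int[p.length + q.length - 1];
        for (int i = 0; i < p.length; i++)
            for (int j = 0; j < q.length; j++)
                r[i + j] += p[i] * q[j];
        return r;
    }

    public static void main(String[] args) {
        // (1 + 2x)(3 + 4x) = 3 + 10x + 8x^2
        System.out.println(java.util.Arrays.toString(
            multiply(new int[]{1, 2}, new int[]{3, 4}))); // prints [3, 10, 8]
    }
}
```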

Assume, for the divide-and-conquer algorithm, that the number of terms n is an integral power of 2, and that we are given two polynomials X and Y of degree n - 1. Then the algorithm could be described as follows:

1. If n = 1 or n = 2, solve the problem directly; otherwise:
2. Construct four half-size polynomials a, b, c, and d, of degree n/2 - 1, out of X and Y. Divide X up into parts, a and b, and divide Y up into parts c and d -- so a and c will have the low-order coefficients (indices 0 through n/2 - 1) of X and Y, and b and d will have the high-order coefficients (indices n/2 through n - 1).
3. Calculate the sums x' = a + b and y' = c + d -- note that x' and y' are also polynomials of degree n/2 - 1.
4. Calculate temp = x'y' recursively.
5. Calculate ac and bd recursively.
6. Let temp = temp - ac - bd.
7. Combine ac, bd, and temp into a new polynomial of degree 2n - 1, placing ac in indices 0 through n - 1, bd in indices n through 2n - 1, and adding temp to the values now in indices n/2 through 3n/2 - 1. Notice that the highest coefficient will be 0 (each product of two polynomials of degree n/2 - 1 actually has degree n - 2, so the top coefficient of each n-slot block is 0).

Use an Array or ArrayList to store the coefficients of a polynomial, and implement the divide-and-conquer approach to polynomial multiplication.
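One possible shape for the implementation, sketched in Java with plain arrays (it stores each recursive product in the minimal n - 1 coefficients rather than padding to n slots, so its index arithmetic differs slightly from step 7 above):

```java
import java.util.Arrays;

public class DCPoly {
    // Divide-and-conquer polynomial multiplication; x and y hold the
    // coefficients (low-order first) and have equal length n, a power of 2.
    static int[] multiply(int[] x, int[] y) {
        int n = x.length;
        int[] result = new int[2 * n - 1];
        if (n <= 2) {                       // step 1: solve directly
            for (int i = 0; i < n; i++)
                for (int j = 0; j < n; j++)
                    result[i + j] += x[i] * y[j];
            return result;
        }
        int half = n / 2;                   // step 2: split into a, b, c, d
        int[] a = Arrays.copyOfRange(x, 0, half);   // low-order half of X
        int[] b = Arrays.copyOfRange(x, half, n);   // high-order half of X
        int[] c = Arrays.copyOfRange(y, 0, half);   // low-order half of Y
        int[] d = Arrays.copyOfRange(y, half, n);   // high-order half of Y
        int[] xp = add(a, b);               // step 3: x' = a + b
        int[] yp = add(c, d);               //         y' = c + d
        int[] temp = multiply(xp, yp);      // step 4: temp = x'y'
        int[] ac = multiply(a, c);          // step 5
        int[] bd = multiply(b, d);
        for (int i = 0; i < temp.length; i++)   // step 6: temp -= ac + bd
            temp[i] -= ac[i] + bd[i];
        for (int i = 0; i < ac.length; i++) {   // step 7: combine the parts
            result[i] += ac[i];             // low block
            result[i + n] += bd[i];         // high block
            result[i + half] += temp[i];    // middle block
        }
        return result;
    }

    static int[] add(int[] p, int[] q) {
        int[] r = new int[p.length];
        for (int i = 0; i < p.length; i++) r[i] = p[i] + q[i];
        return r;
    }

    public static void main(String[] args) {
        int[] p = {1, 2, 3, 4};   // 1 + 2x + 3x^2 + 4x^3
        int[] q = {5, 6, 7, 8};
        System.out.println(Arrays.toString(multiply(p, q)));
        // prints [5, 16, 34, 60, 61, 52, 32]
    }
}
```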

EXTRA EXTRA CREDIT: Using random arrays, compare the running times of both the school algorithm and the divide-and-conquer algorithm for n ranging over the even powers of 2 up through 128. Do your experimental results confirm your theoretical prediction? Why or why not?
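One possible shape for the timing experiment (the seed, coefficient range, and output format are arbitrary choices, and it cross-checks that the two algorithms agree on each input):

```java
import java.util.Arrays;
import java.util.Random;

public class Timing {
    // School polynomial multiplication, O(n^2).
    static int[] school(int[] p, int[] q) {
        int[] r = new int[p.length + q.length - 1];
        for (int i = 0; i < p.length; i++)
            for (int j = 0; j < q.length; j++)
                r[i + j] += p[i] * q[j];
        return r;
    }

    // Divide-and-conquer multiplication, O(n^(log2 3)); n is a power of 2.
    static int[] dc(int[] x, int[] y) {
        int n = x.length;
        if (n <= 2) return school(x, y);
        int h = n / 2;
        int[] a = Arrays.copyOfRange(x, 0, h), b = Arrays.copyOfRange(x, h, n);
        int[] c = Arrays.copyOfRange(y, 0, h), d = Arrays.copyOfRange(y, h, n);
        int[] xp = new int[h], yp = new int[h];
        for (int i = 0; i < h; i++) { xp[i] = a[i] + b[i]; yp[i] = c[i] + d[i]; }
        int[] temp = dc(xp, yp), ac = dc(a, c), bd = dc(b, d);
        for (int i = 0; i < temp.length; i++) temp[i] -= ac[i] + bd[i];
        int[] r = new int[2 * n - 1];
        for (int i = 0; i < ac.length; i++) {
            r[i] += ac[i]; r[i + n] += bd[i]; r[i + h] += temp[i];
        }
        return r;
    }

    public static void main(String[] args) {
        Random rnd = new Random(42);
        for (int n = 4; n <= 128; n *= 2) {   // adjust the sizes to taste
            int[] p = new int[n], q = new int[n];
            for (int i = 0; i < n; i++) { p[i] = rnd.nextInt(10); q[i] = rnd.nextInt(10); }
            long t0 = System.nanoTime(); int[] r1 = school(p, q);
            long t1 = System.nanoTime(); int[] r2 = dc(p, q);
            long t2 = System.nanoTime();
            System.out.printf("n=%3d  school=%8d ns  d&c=%8d ns  agree=%b%n",
                n, t1 - t0, t2 - t1, Arrays.equals(r1, r2));
        }
    }
}
```

Expect noisy numbers at these small sizes (JIT warm-up and recursion overhead can mask the asymptotic advantage), which is part of what the "why or why not?" question is probing.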

EXTRA EXTRA EXTRA CREDIT: Can this be done any faster by dividing the polynomial into 3 parts? How many multiplications would be required? What is the general formula for the number of multiplications required given division into k parts? What does this say about the theoretical lower limit on the order of growth for multiplication?