Given all the roots of a polynomial, I have to figure out an algorithm that generates the coefficients faster than O(n^2). I'm having trouble approaching this problem.
kraskevich basically nailed it. Some details are missing, and they would be too long to fit into the comment field, so here they are.
Basically, you want to set this up as a polynomial multiplication problem. Your input is p1, ..., pN where pj(x) = (x - rj).
Here's pseudo-code:
function multiply2Poly(p1, p2)
    // FFT both inputs, multiply pointwise, then apply the IFFT

function multiplyPoly(p[1], ..., p[N])
    if (N == 1) return p[1]
    if (N == 2) return multiply2Poly(p[1], p[2])
    else
        return multiply2Poly(multiplyPoly(p[1], ..., p[N/2]),
                             multiplyPoly(p[1+N/2], ..., p[N]))

function getCoef(r[1], ..., r[N])
    return multiplyPoly((p[1] = x - r[1]), ..., (p[N] = x - r[N]))
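A runnable sketch of this pseudocode in Python, using only the standard library (the function names mirror the pseudocode; the radix-2 FFT and the rounding at the end are my own choices, not part of the answer):

```python
import cmath

def fft(a, invert=False):
    # Recursive radix-2 Cooley-Tukey FFT; len(a) must be a power of two.
    n = len(a)
    if n == 1:
        return list(a)
    even = fft(a[0::2], invert)
    odd = fft(a[1::2], invert)
    sign = 1 if invert else -1
    out = [0j] * n
    for k in range(n // 2):
        w = cmath.exp(sign * 2j * cmath.pi * k / n)
        out[k] = even[k] + w * odd[k]
        out[k + n // 2] = even[k] - w * odd[k]
    return out

def multiply2_poly(p1, p2):
    # Multiply two polynomials given as coefficient lists, lowest degree first.
    n = 1
    while n < len(p1) + len(p2) - 1:
        n *= 2  # pad to a power of two so the cyclic convolution is long enough
    f1 = fft([complex(c) for c in p1] + [0j] * (n - len(p1)))
    f2 = fft([complex(c) for c in p2] + [0j] * (n - len(p2)))
    prod = fft([x * y for x, y in zip(f1, f2)], invert=True)
    # divide by n to normalize the inverse transform
    return [c.real / n for c in prod[:len(p1) + len(p2) - 1]]

def multiply_poly(polys):
    # Divide and conquer: multiply the two halves recursively, then combine.
    if len(polys) == 1:
        return polys[0]
    mid = len(polys) // 2
    return multiply2_poly(multiply_poly(polys[:mid]), multiply_poly(polys[mid:]))

def get_coef(roots):
    # Coefficients of prod_j (x - r_j), lowest degree first.
    return multiply_poly([[-r, 1.0] for r in roots])
```

For example, `get_coef([1.0, 2.0, 3.0])` recovers the coefficients of (x-1)(x-2)(x-3) = x^3 - 6x^2 + 11x - 6, up to floating-point rounding.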
And for the FFT part:
Observe that if the two polynomials are:
p1 = a[0] + a[1] x + ... + a[n] x^n
p2 = b[0] + b[1] x + ... + b[n] x^n
then p1 * p2 = c[0] + c[1] x + ... + c[2n] x^(2n),
where C = A [x] B and [x] = convolution, with A = (a[0], ..., a[n]) and B = (b[0], ..., b[n]) zero-padded to length 2n+1, and C = (c[0], ..., c[2n]).
Then use FFT and the convolution theorem to speed this up.
C = A [x] B = IFFT{ FFT{A} * FFT{B} } where * here is just multiplication.
The runtime of the IFFT equals the runtime of the FFT, which is O(n log n).
The pointwise multiplication takes O(n), so the total runtime of multiply2Poly is O(n log n).
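To make the convolution-theorem step concrete, here is a small self-contained sketch. It uses a naive O(n^2) DFT purely for clarity (a real implementation would use a fast FFT as above); the example values are mine:

```python
import cmath

def dft(a, invert=False):
    # Naive O(n^2) DFT, enough to illustrate the convolution theorem.
    n = len(a)
    sign = 1 if invert else -1
    out = []
    for k in range(n):
        s = sum(a[j] * cmath.exp(sign * 2j * cmath.pi * j * k / n)
                for j in range(n))
        out.append(s / n if invert else s)
    return out

# Direct (linear) convolution of A and B.
A, B = [1.0, 2.0], [3.0, 4.0, 5.0]
direct = [0.0] * (len(A) + len(B) - 1)
for i, x in enumerate(A):
    for j, y in enumerate(B):
        direct[i + j] += x * y

# Same result via the convolution theorem: zero-pad, transform,
# multiply pointwise, transform back.
n = len(A) + len(B) - 1
fa = dft([complex(x) for x in A] + [0j] * (n - len(A)))
fb = dft([complex(x) for x in B] + [0j] * (n - len(B)))
via_fft = [c.real for c in dft([x * y for x, y in zip(fa, fb)], invert=True)]
```

Both `direct` and `via_fft` come out as [3, 10, 13, 10], i.e. the coefficients of (1 + 2x)(3 + 4x + 5x^2).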
The total runtime of multiplyPoly then satisfies:
T(N) = 2 T(N/2) + R(N)
where R(N) = O(N log N) is the runtime of the multiply2Poly call that combines the two halves, as described above. So
T(N) = 2 T(N/2) + O(N log N)
and the Master theorem gives O(N (log N)^2).
Here is an O(n * (log n)^2) solution:
The base case: for one root a, the answer is just x - a.
Let's assume that we have a list of more than one root. We can solve the problem recursively for the first and the second half of the list and then multiply the results using Fast Fourier Transform.
The time complexity is obtained from the equation T(n) = 2 * T(n / 2) + O(n log n), which is O(n * (log n)^2) according to the Master theorem.
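The recursion described here can be sketched compactly, assuming NumPy is available (the function names are my own, not from the answer):

```python
import numpy as np

def poly_multiply(a, b):
    # Multiply two coefficient arrays (lowest degree first) via FFT.
    n = len(a) + len(b) - 1
    fa = np.fft.fft(a, n)  # the second argument zero-pads to length n
    fb = np.fft.fft(b, n)
    # Coefficients are real, so keep only the real part of the inverse.
    return np.real(np.fft.ifft(fa * fb))

def get_coefficients(roots):
    # Base case: one root a gives the polynomial x - a.
    if len(roots) == 1:
        return np.array([-roots[0], 1.0])
    # Recurse on each half of the list, then combine with one FFT multiply.
    mid = len(roots) // 2
    return poly_multiply(get_coefficients(roots[:mid]),
                         get_coefficients(roots[mid:]))
```

For roots [1, 2, 3] this returns approximately [-6, 11, -6, 1], the coefficients of x^3 - 6x^2 + 11x - 6 in ascending order.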