Python: Approximating ln(x) using Taylor Series
**Question:** I'm trying to approximate ln(1.9) to ten digits of accuracy (0.6418538862). I'm using a simple function I've built from the series for ln[(1 + x)/(1 - x)]. Here is my code so far:

```python
# Taylor series for ln[(1 + x)/(1 - x)] = 2 * (x + x^3/3 + x^5/5 + ...)
def taylor_two(n):
    # (1 + x)/(1 - x) = 1.9  =>  x = 0.9 / 2.9
    x = 0.9 / 2.9
    i = 1
    taySum = 0.0
    while i <= n:
        taySum += pow(x, i) / i
        i += 2
    return 2 * taySum

print(taylor_two(12))
print(taylor_two(17))
```

What I need to do now is reformat this so that it tells me the number of terms needed to approximate ln(1.9) to that accuracy.
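One way to answer this is to keep adding odd-power terms until the running approximation agrees with `math.log(1.9)` to within the target tolerance, counting terms as you go. The sketch below is my own illustration, not from the original post; the function name `terms_for_accuracy` and the `tol` parameter are assumptions:

```python
import math

def terms_for_accuracy(value, tol=1e-10):
    """Count the series terms needed so that
    2 * sum(x**i / i for odd i) matches ln(value) within tol."""
    # ln(value) = ln((1 + x)/(1 - x))  =>  x = (value - 1) / (value + 1)
    x = (value - 1.0) / (value + 1.0)
    target = math.log(value)
    total = 0.0
    i = 1
    terms = 0
    while True:
        total += x**i / i   # add the next odd-power term
        terms += 1
        if abs(2 * total - target) < tol:
            return terms, 2 * total
        i += 2

terms, approx = terms_for_accuracy(1.9)
print(terms, approx)
```

Comparing against `math.log` is fine for an exercise like this; a self-contained alternative is to stop once the next term is smaller than the tolerance, since for 0 < x < 1 the omitted tail is bounded by a geometric series in x².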