Revision as of 08:40, 28 October 2008

We will start this from the beginning with the series:

$ 1+r+r^2+r^3+...+r^N=\frac{1}{1-r}-\frac{r^{N+1}}{1-r} $
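As a quick sanity check, we can verify this identity numerically for a sample value of $ r $ (the particular $ r $ and $ N $ below are arbitrary choices for illustration):

```python
# Check the finite geometric series identity
#   1 + r + r^2 + ... + r^N = 1/(1-r) - r^(N+1)/(1-r)
# for a sample r != 1 (the values here are arbitrary illustrations).
r, N = 0.7, 10
lhs = sum(r**k for k in range(N + 1))
rhs = 1 / (1 - r) - r**(N + 1) / (1 - r)
print(lhs, rhs)  # the two sides agree to floating-point precision
```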

From here we substitute $ r=-x^2 $ to get

$ 1-x^2+x^4-x^6+...+(-1)^Nx^{2N}=\frac{1}{1+x^2}-\frac{(-1)^{N+1}x^{2(N+1)}}{1+x^2} $

Now integrate both sides from 0 to 1 with respect to x.

$ \int_0^1\left(1-x^2+x^4-x^6+...+(-1)^Nx^{2N}\right)dx=\int_0^1\frac{1}{1+x^2}dx-\int_0^1\frac{(-1)^{N+1}x^{2(N+1)}}{1+x^2}dx $

$ 1-1/3+1/5-1/7+...+\frac{(-1)^N}{2N+1}=\frac{\pi}{4}+\int_0^1\frac{(-1)^{N}x^{2(N+1)}}{1+x^2}dx $

Notice that $ \lim_{N\to\infty}\int_0^1\frac{(-1)^{N}x^{2(N+1)}}{1+x^2}dx=0 $, so the more terms we use on the left side of the equation, the closer we get to $ \frac{\pi}{4} $. This integral can therefore be called the error term.
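We can check numerically that the gap between the partial sum and $ \frac{\pi}{4} $ really is this integral. This is a sketch in Python: the integral is approximated with a simple midpoint rule, and the step count is an arbitrary choice.

```python
import math

def S(N):
    """Partial sum 1 - 1/3 + 1/5 - ... + (-1)^N / (2N + 1)."""
    return sum((-1)**k / (2 * k + 1) for k in range(N + 1))

def E(N, steps=100_000):
    """Midpoint-rule approximation of the error integral
    (-1)^N * integral_0^1 x^(2(N+1)) / (1 + x^2) dx."""
    h = 1.0 / steps
    total = sum(((i + 0.5) * h) ** (2 * (N + 1)) / (1 + ((i + 0.5) * h) ** 2)
                for i in range(steps))
    return (-1) ** N * total * h

for N in (5, 10, 20):
    print(N, S(N) - math.pi / 4, E(N))  # the two columns should agree
```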

For the sake of space we will give labels for several parts.

$ S(N)=1-1/3+1/5-1/7+...+\frac{(-1)^N}{2N+1} $

$ E(N)=\int_0^1\frac{(-1)^{N}x^{2(N+1)}}{1+x^2}dx $

$ S(N)=\frac{\pi}{4}+E(N) $

We can call S(N) the sum and E(N) the error, since as you add more and more terms to the sum, the error becomes closer and closer to zero.

Just how quickly does this sum approach $ \frac{\pi}{4} $? To find out, we can estimate the error for any N by the following comparison:

for $ 0\le x\le 1 $

$ 0\le x^2\le 1 $

$ 1\le 1+x^2 \le 2 $

$ \frac{1}{2}\le \frac{1}{1+x^2}\le 1 $

so

$ \frac{x^{2(N+1)}}{1+x^2}\le x^{2(N+1)} $

And finally

$ |E(N)|=\left|\int_0^1\frac{(-1)^{N}x^{2(N+1)}}{1+x^2}dx\right|=\int_0^1\frac{x^{2(N+1)}}{1+x^2}dx\le\int_0^1x^{2(N+1)}dx $

Integrating the estimate gives us:

$ |E(N)|\le\frac{1}{2(N+1)+1}=\frac{1}{2N+3}\le\frac{1}{2N} $
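The bound is easy to verify numerically using $ |E(N)| = |S(N) - \frac{\pi}{4}| $. A sketch (the sample values of $ N $ are arbitrary):

```python
import math

def S(N):
    """Partial sum 1 - 1/3 + 1/5 - ... + (-1)^N / (2N + 1)."""
    return sum((-1)**k / (2 * k + 1) for k in range(N + 1))

for N in (1, 5, 10, 100):
    err = abs(S(N) - math.pi / 4)  # |E(N)|, since S(N) = pi/4 + E(N)
    bound = 1 / (2 * N)
    assert err <= bound
    print(N, err, bound)
```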

Wow, the sum of $ N $ terms can really only be guaranteed to be within $ \frac{1}{2N} $ of $ \frac{\pi}{4} $. That's a lot of terms we have to take to guarantee even two digits of accuracy. And it's even worse when we multiply everything by four to estimate $ \pi $.

We need a faster way. There are lots of faster ways. But let's say we're too lazy to find another method. We just want to work with this series. What can we do to make it converge to $ \frac{\pi}{4} $ faster?

First, let's notice that consecutive partial sums alternate: one is too great and the next is too little, so $ \frac{\pi}{4} $ always lies between two consecutive sums. Dr. Bell noticed this and posed the challenge to exploit this property in order to build a new estimate that converges more quickly. He said we should try taking the average of two consecutive sums. Since the target lies between the two sums, the average should yield a new estimate that is closer than both of the original sums. The challenge was to prove this true.

So we notice that for

$ N=\text{odd}, E(N)<0 $

$ N=\text{even}, E(N)>0 $

Now we add two consecutive sums together and divide by two to get the average:

$ \frac{S(N) + S(N+1)}{2} = \frac{2\frac{\pi}{4}+E(N)+E(N+1)}{2}=\frac{\pi}{4}+\frac{E(N)+E(N+1)}{2} $

So now we have a new error term, and it still goes to zero:

$ \lim_{N\to\infty}\frac{E(N)+E(N+1)}{2}=\lim_{N\to\infty}\frac{E(N)}{2}+\lim_{N\to\infty}\frac{E(N+1)}{2}=0+0=0 $

And so the average still goes to $ \frac{\pi}{4} $ as $ N $ goes to infinity because:

$ \lim_{N\to\infty}\frac{S(N) + S(N+1)}{2}=\lim_{N\to\infty}\left(\frac{\pi}{4}+\frac{E(N)+E(N+1)}{2}\right)=\frac{\pi}{4} + 0 = \frac{\pi}{4} $
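We can confirm numerically that averaging consecutive sums gives a much better estimate than either sum alone. A sketch ($ N = 50 $ is an arbitrary choice):

```python
import math

def S(N):
    """Partial sum 1 - 1/3 + 1/5 - ... + (-1)^N / (2N + 1)."""
    return sum((-1)**k / (2 * k + 1) for k in range(N + 1))

N = 50
plain = abs(S(N) - math.pi / 4)                   # error of a single sum
avged = abs((S(N) + S(N + 1)) / 2 - math.pi / 4)  # error of the average
print(plain, avged)  # the averaged error is far smaller
```

In fact, since $ \frac{E(N)+E(N+1)}{2}=\frac{(-1)^N}{2}\int_0^1\frac{x^{2(N+1)}(1-x^2)}{1+x^2}dx $, the same comparison as before bounds the averaged error by $ \frac{1}{(2N+3)(2N+5)} $ — order $ \frac{1}{N^2} $ instead of $ \frac{1}{2N} $.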
