
In this post, I'll describe the process of finding an algorithm to approximate the number π.

There are three main types of algorithms. The first type converges very quickly to the correct solution; we use algorithms of this type in computers and calculators. The second type converges slowly, but is interesting to think about; we take this type of algorithm to mathematics conferences. The third type is slow to converge, and is boring to think about; these are pretty much useless so we get rid of them.

This may be interesting to know, but I wasn't too worried about it when I wrote my Pi Day post last year. I was looking for some simple algorithms, that was all. I needed 5 good algorithms by Pi Day.

A good approach when solving a problem is to start with what you know. What do we know about π? It's the area of a circle with radius 1, the circumference of a circle with diameter 1, and the radian measure of a 180-degree angle.

Yes, π can be an angle. As I explained in my 2015 pi day post:

When most people talk about angles, they use degrees – as in 90° (a right angle), 180° (a U-turn), and 360° (1 full turn, or very dizzy). But mathematicians use radians, which involves looking at the circumference of a unit circle, so instead of a full circle being 360°, it's equal to 2π (the circumference of a unit circle). A U-turn, or 180°, is equal to π, and a right-angle (90°) is equal to π/2.
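The conversion above is just a scale factor. Here's a quick sketch in Python (using the standard library's `math.radians`, which multiplies by π/180):

```python
import math

# Each of the angles mentioned above, converted from degrees to radians:
# 90° -> π/2, 180° -> π, 360° -> 2π.
for degrees in (90, 180, 360):
    print(f"{degrees}° = {math.radians(degrees):.6f} rad")
```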

Now that we're looking at π as an angle, the sine and cosine functions come to mind. sin(π) is zero, and cos(π/2) is zero (see my blog post about sine and cosine). We could use these properties to approximate π. We may choose to go with cos(π/2) = 0 rather than sin(π) = 0, since sin(0) also equals zero, so a search starting from 0 would stop immediately at the wrong root.

An algorithm might look something like this:

1. Start with *a* = 0
2. Check whether cos(*a*/2) is close enough to zero
3. If not, try again with another value for *a*

One important thing to note is that, for a number *x* slightly greater than π/2, cos(*x*) < 0. For a number *x* slightly less than π/2, cos(*x*) > 0. Based on this knowledge, if cos(*a*/2) is greater than zero, we need to increase our approximation. If cos(*a*/2) is less than zero, we need to decrease our approximation. So we could change step 3 to the following (where *v* is a small number):

3. If cos(*a*/2) > 0, add *v* to *a*. Else add -*v* to *a*.

Notice that cos(*a*/2) has the same sign as the value we're adding. So instead of adding *v*, why not add cos(*a*/2)? Here's the final version of the algorithm:

1. Start with *a* = 0
2. Check whether cos(*a*/2) is close enough to zero
3. If it's not, add cos(*a*/2) to *a* and go back to step 2

Or, for math people, we could write it as a series:

*x*_{0} = 0

*x*_{n+1} = *x*_{n} + cos(*x*_{n})
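The series is straightforward to run. Here's a minimal Python sketch (the function name and the iteration count of 10 are my own choices; doubling the limit π/2 recovers π):

```python
import math

def approximate_half_pi(steps=10):
    """Iterate x_{n+1} = x_n + cos(x_n), which converges to π/2."""
    x = 0.0
    for _ in range(steps):
        x += math.cos(x)  # each step adds roughly the remaining error
    return x

# Double the result to approximate π itself.
print(2 * approximate_half_pi())
```

In practice only a handful of steps are needed; the iteration hits the limits of double precision very quickly.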

Now the important question is, does the algorithm converge? The answer is yes: it converges to π/2. This is because:

error = π/2 - *x*_{n} > sin(π/2 - *x*_{n}) = cos(*x*_{n})

error = π/2 - *x*_{n} ≈ cos(*x*_{n}), for *x*_{n} close to π/2

That is, each step adds less than the remaining error (so we never overshoot π/2), and close to π/2 each step cancels almost exactly the remaining error.

Or because:

It's a fixed-point iteration with g'(*r*) = 0 - namely, the algorithm is of the form *x*_{n+1} = g(*x*_{n}), which will converge to *r* = g(*r*) when |g'(*r*)| < 1. Here g(*x*) = *x* + cos(*x*), so g'(*x*) = 1 - sin(*x*), and g'(π/2) = 1 - sin(π/2) = 0.
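This can be checked numerically. In the sketch below (my own illustration, with g(*x*) = *x* + cos(*x*) taken from the series above), the error π/2 - *x*_{n} collapses rapidly, as expected when g'(*r*) = 0:

```python
import math

# Track the error π/2 - x_n of the fixed-point iteration x_{n+1} = x_n + cos(x_n).
x = 0.0
errors = []
for n in range(4):
    x = x + math.cos(x)
    errors.append(abs(math.pi / 2 - x))
    print(f"iteration {n + 1}: error = {errors[-1]:.2e}")
```

The error roughly cubes each step: writing *e* = π/2 - *x*_{n}, the new error is *e* - sin(*e*) ≈ *e*³/6, which is why convergence looks so fast.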

Now, of the three types of algorithms I mentioned earlier, which type is this? Well, it converges very quickly, but not efficiently. The function cos(*x*) is very costly to compute. We clearly wouldn't use this algorithm in a computer. Still, the algorithm is interesting because of its simplicity: it relies only on the properties of the cosine function. Hence, it fits pretty well into the second category.

...on the other hand, if you find math boring, you may feel that the algorithm fits the third category. In which case, today is just a regular Monday. No pi for you.

