Friday, February 12, 2010

Converge Slow, Homie

Since a string of digits recently asked me to mention something about math on here, I'll bring up a problem I am working on that should be easier than it is.

Hopefully, we are all familiar with convergence in the numerical sense, but if not, I'll try to hand-wave at it so that even an eighth grader can understand it. I've been told a good teacher can explain anything so that an eighth grader can understand it. It seems like an arbitrary line to me, but maybe a good enough one. Perhaps that is when people start displaying abstract thinking ability.

So, we can start with a sequence. A sequence is just a special kind of function, and for our purposes, we'll stick to sequences of real numbers. As to what a real number is, it's just about any kind of number you can think of that doesn't involve i somewhere. So whole numbers, 0, fractions, even irrational stuff, like 2^(1/2) or pi.

That said, a sequence is just a function of natural numbers, so something like

1, 4, 9, 16, 25, ... (here s(n) = n²). You can see how this sequence "goes to infinity," in that it just keeps getting bigger (I am purposely being vague about this concept). On the other hand, the sequence s(n) = 1/n, that is

1, 1/2, 1/3, 1/4, ... doesn't keep getting bigger; it keeps getting smaller. However, it doesn't "go to negative infinity." In fact, it demonstrates the central idea of calculus: convergence. In particular, it is said to converge to 0, or we say that the limit of s(n) as n approaches infinity is 0. What do I mean by that? I mean that we can think of this sequence as approximating 0, as if I didn't know what 0 was but was guessing at it, and each guess got closer. A sequence is said to converge to a number if it approximates that number to within any error. More formally,

A sequence s(n) is said to converge to a number L if and only if for all E > 0, there exists an index N such that if n > N, then |s(n) - L| < E.

If you think about it, it just says that the sequence gets as close to L as we would like and stays at least that close; that we can approximate L infinitely well with a big enough term of s(n).
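To make the definition concrete, here's a quick numerical illustration (a Python sketch of my own; the helper names are mine, nothing standard) using s(n) = 1/n and L = 0:

```python
# Illustrate the epsilon-N definition with s(n) = 1/n and L = 0:
# for each error tolerance E, find an index N past which |s(n) - L| < E.

def s(n):
    return 1 / n

L = 0
for E in [0.1, 0.01, 0.001]:
    # Since 1/n is strictly decreasing, once one term dips below E,
    # every later term stays below E, so checking s(N + 1) suffices.
    N = 1
    while abs(s(N + 1) - L) >= E:
        N += 1
    print(f"E = {E}: for every n > {N}, |s(n) - 0| < {E}")
```

Shrink E and the required N grows, but some N always works; that is exactly what the definition demands.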

A proof of convergence usually goes like this: Given E > 0 (we actually use epsilon, usually)

|s(n) - L |

(bunch of algebra with inequalities)

< E for n such and such

That is, we take E as given and find an expression for N in terms of E that suffices. In the simple case above, you can just set N = 1/E: if n > 1/E, then 1/n < E, and you're good.
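That recipe can be spot-checked mechanically (a small Python sketch, just confirming the algebra for a few tolerances):

```python
import math

# For s(n) = 1/n converging to L = 0, the recipe N = 1/E works:
# if n > 1/E, then 1/n < E. Spot-check a few tolerances.
for E in [0.5, 0.05, 0.005]:
    N = 1 / E
    n = math.floor(N) + 1  # the first integer index past N
    assert abs(1 / n - 0) < E  # the definition holds at n...
    assert all(1 / m < E for m in range(n, n + 1000))  # ...and stays below E after
print("N = 1/E suffices for s(n) = 1/n")
```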

Sometimes it isn't so easy, and that is what I'm dealing with at the moment. The sequence I'm looking at is

S(n) = 1, 1/2, (1/2)(3/4), (1/2)(3/4)(5/6), ...

And so on. Each denominator is the product of the first n even numbers and each numerator is the product of the first n odd numbers, each odd number one less than the even number below it. So every factor is less than 1, and each term is smaller than the last; but a decreasing sequence of positive numbers need not converge to 0 (consider 1 + 1/n, which decreases to 1), so that alone isn't enough. One idea might be a comparison test.
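The terms are easy to generate exactly (a quick sketch; I'm indexing so that the nth term is the product of (2k - 1)/(2k) for k = 1 to n, with S(0) = 1 as the empty product):

```python
from fractions import Fraction

def S(n):
    """Product of (2k - 1) / (2k) for k = 1..n; S(0) = 1 (empty product)."""
    term = Fraction(1)
    for k in range(1, n + 1):
        term *= Fraction(2 * k - 1, 2 * k)
    return term

# Each term is the previous one times a factor just under 1,
# so the sequence decreases, but more and more slowly.
for n in range(5):
    print(n, S(n), float(S(n)))
```

Using exact fractions avoids any worry that rounding error is faking the decrease.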

That is, if I can show that for any positive integer k, there's a positive integer N such that S(N) < 1/k, I can just compare it with the previous limit problem and say the limits must be the same. [It is easy enough to show that a sequence of positive numbers cannot converge to a negative number, so the new sequence must be "squeezed" between the old 1/n sequence and 0.]

Just working out some terms of the sequence explicitly, I've found that the limit must be less than 0.15, and I'm convinced that it is actually 0; that I can somehow show it is squeezed down by 1/n if we look far enough along the sequence. The problem is that this convergence is very slow. You'll note that each new factor (2n - 1)/(2n) is bigger than the last; in fact, the factors converge to 1. However, they are still less than 1, so each term is smaller than the one before, just by less and less. It is rather annoying, and it is making it hard for me to find the right expression or technique.
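Just to see how slow it is, here's the comparison-test search in code (a sketch of my own; `first_below` is my name for it): for each k, find the first n where the nth partial product drops below 1/k.

```python
def first_below(threshold):
    """Smallest n with (1/2)(3/4)...((2n-1)/(2n)) < threshold."""
    term, n = 1.0, 0
    while term >= threshold:
        n += 1
        term *= (2 * n - 1) / (2 * n)
    return n

for k in [2, 4, 8, 16]:
    print(f"S(n) < 1/{k} first happens at n = {first_below(1 / k)}")
```

Running this, the required n seems to grow roughly like k squared, which is exactly the sluggishness I'm fighting: halving the error bound costs about four times as many terms.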

Anyway, it is just part of a somewhat bigger problem related to a theorem of Tauber.
