Alternating series
In mathematics, an alternating series is an infinite series of the form

$$\sum_{n=0}^{\infty}(-1)^{n}a_{n}\quad\text{or}\quad\sum_{n=0}^{\infty}(-1)^{n+1}a_{n}$$

with $a_{n} > 0$ for all $n$. The signs of the general terms alternate between positive and negative. Like any series, an alternating series converges if and only if the associated sequence of partial sums converges.
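For concreteness, here is a minimal Python sketch (illustrative only; the function name `alternating_partial_sum` and the choice of terms are arbitrary) computing the partial sums whose convergence is at issue:

```python
def alternating_partial_sum(a, m):
    """Partial sum S_m = sum_{n=0}^{m} (-1)^n * a(n) of an alternating series.

    `a` is any callable returning the positive terms a_n.
    """
    return sum((-1) ** n * a(n) for n in range(m + 1))

# Example: a_n = 1/2^(n+1), i.e. 1/2 - 1/4 + 1/8 - 1/16 + ...
print(alternating_partial_sum(lambda n: 0.5 ** (n + 1), 50))  # ~0.3333 = 1/3
```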
Examples
The geometric series 1/2 − 1/4 + 1/8 − 1/16 + ⋯ sums to 1/3.
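This can be checked with the usual geometric-series formula (a routine verification, included here for clarity): the first term is $a = \tfrac{1}{2}$ and the ratio is $r = -\tfrac{1}{2}$, so

$$\frac{1}{2}-\frac{1}{4}+\frac{1}{8}-\frac{1}{16}+\cdots=\frac{a}{1-r}=\frac{1/2}{1-(-1/2)}=\frac{1}{3}.$$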
The alternating harmonic series has a finite sum but the harmonic series does not.
The Mercator series provides an analytic expression of the natural logarithm:

$$\ln(1+x)=\sum_{n=1}^{\infty}\frac{(-1)^{n+1}}{n}x^{n},\qquad -1 < x \le 1.$$
The functions sine and cosine used in trigonometry can be defined as alternating series in calculus, even though they are introduced in elementary algebra as ratios of sides of a right triangle. In fact,

$$\sin x=\sum_{n=0}^{\infty}\frac{(-1)^{n}}{(2n+1)!}x^{2n+1},\quad\text{and}\quad\cos x=\sum_{n=0}^{\infty}\frac{(-1)^{n}}{(2n)!}x^{2n}.$$

When the alternating factor $(-1)^{n}$ is removed from these series, one obtains the hyperbolic functions sinh and cosh used in calculus.
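As an illustration of how such a series is used numerically (a sketch assuming double-precision floats; `sin_series` is an invented name), truncating the sine series already gives machine-level accuracy for moderate arguments:

```python
import math

def sin_series(x, terms=20):
    """Approximate sin(x) by the truncated alternating series
    sum_{n=0}^{terms-1} (-1)^n x^(2n+1) / (2n+1)!."""
    return sum((-1) ** n * x ** (2 * n + 1) / math.factorial(2 * n + 1)
               for n in range(terms))

print(sin_series(1.0))   # 0.8414709848...
print(math.sin(1.0))     # 0.8414709848... (agreement to machine precision)
```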
For an integer or positive index $\alpha$, the Bessel function of the first kind may be defined with the alternating series

$$J_{\alpha}(x)=\sum_{m=0}^{\infty}\frac{(-1)^{m}}{m!\,\Gamma(m+\alpha+1)}\left(\frac{x}{2}\right)^{2m+\alpha},$$

where $\Gamma(z)$ is the gamma function.
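A direct numerical check of this definition needs nothing more than the gamma function (a hedged sketch; `bessel_J` is an invented helper, and the fixed truncation of 30 terms is an arbitrary choice):

```python
import math

def bessel_J(alpha, x, terms=30):
    """Truncated alternating series for the Bessel function J_alpha(x)."""
    return sum((-1) ** m / (math.factorial(m) * math.gamma(m + alpha + 1))
               * (x / 2) ** (2 * m + alpha) for m in range(terms))

print(bessel_J(0, 1.0))  # ~0.7651976866, the standard value of J_0(1)
```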
If $s$ is a complex number, the Dirichlet eta function is formed as an alternating series

$$\eta(s)=\sum_{n=1}^{\infty}\frac{(-1)^{n-1}}{n^{s}}=\frac{1}{1^{s}}-\frac{1}{2^{s}}+\frac{1}{3^{s}}-\frac{1}{4^{s}}+\cdots$$

that is used in analytic number theory.
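For example, at $s = 1$ the eta series reduces to the alternating harmonic series, giving the special value $\eta(1) = \ln 2$.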
Alternating series test
The theorem known as the Leibniz test, or the alternating series test, states that an alternating series converges if the terms $a_{n}$ decrease monotonically to 0.
Proof: Suppose the sequence $a_{n}$ converges to zero and is monotonically decreasing. If $m$ is odd and $m < n$, we obtain the estimate $S_{n} - S_{m} \le a_{m}$ via the following calculation:

$$\begin{aligned}
S_{n}-S_{m}&=\sum_{k=0}^{n}(-1)^{k}a_{k}-\sum_{k=0}^{m}(-1)^{k}a_{k}=\sum_{k=m+1}^{n}(-1)^{k}a_{k}\\
&=a_{m+1}-a_{m+2}+a_{m+3}-a_{m+4}+\cdots+a_{n}\\
&=a_{m+1}-(a_{m+2}-a_{m+3})-(a_{m+4}-a_{m+5})-\cdots-a_{n}\le a_{m+1}\le a_{m}.
\end{aligned}$$

Since $a_{n}$ is monotonically decreasing, the bracketed terms $a_{m+2}-a_{m+3},\ a_{m+4}-a_{m+5},\ldots$ are nonnegative, so subtracting them can only decrease the sum. Thus, we have the final inequality $S_{n}-S_{m}\le a_{m}$. Similarly, it can be shown that $-a_{m}\le S_{n}-S_{m}$, so $|S_{n}-S_{m}|\le a_{m}$. Since $a_{m}$ converges to 0, the partial sums $S_{m}$ form a Cauchy sequence (i.e., the series satisfies the Cauchy criterion) and therefore converge. The argument for even $m$ is similar.
Approximating sums
The estimate above does not depend on $n$. So, if $a_{n}$ approaches 0 monotonically, the estimate provides an error bound for approximating infinite sums by partial sums:

$$\left|\sum_{k=0}^{\infty}(-1)^{k}a_{k}-\sum_{k=0}^{m}(-1)^{k}a_{k}\right|\le|a_{m+1}|.$$
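The bound is easy to verify numerically. In the sketch below (illustrative only; the truncation point $m = 1000$ is arbitrary), the alternating harmonic series is compared against its known sum $\ln 2$:

```python
import math

# Alternating harmonic series: sum_{n>=1} (-1)^(n+1)/n = ln 2.
# The alternating series estimate bounds the truncation error by the
# first omitted term, a_{m+1} = 1/(m+1).
m = 1000
partial = sum((-1) ** (n + 1) / n for n in range(1, m + 1))
error = abs(math.log(2) - partial)
bound = 1 / (m + 1)
print(error <= bound)   # True
print(error, bound)     # error ~ 5.0e-4, bound ~ 9.99e-4
```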
Absolute convergence
A series $\sum a_{n}$ converges absolutely if the series $\sum |a_{n}|$ converges.
Theorem: Absolutely convergent series are convergent.
Proof: Suppose $\sum a_{n}$ is absolutely convergent. Then $\sum |a_{n}|$ is convergent, and it follows that $\sum 2|a_{n}|$ converges as well. Since $0 \le a_{n} + |a_{n}| \le 2|a_{n}|$, the series $\sum (a_{n} + |a_{n}|)$ converges by the comparison test. Therefore, the series $\sum a_{n}$ converges as the difference of two convergent series: $\sum a_{n} = \sum (a_{n} + |a_{n}|) - \sum |a_{n}|$.
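The decomposition in the last step can be illustrated numerically (a sketch with the arbitrary choice $a_{n} = (-1)^{n}/n^{2}$, which converges absolutely):

```python
# Check numerically that sum a_n = sum (a_n + |a_n|) - sum |a_n|
# for the absolutely convergent series a_n = (-1)^n / n^2.
N = 100_000
a = [(-1) ** n / n ** 2 for n in range(1, N + 1)]
lhs = sum(a)
rhs = sum(x + abs(x) for x in a) - sum(abs(x) for x in a)
print(lhs, rhs)  # both ~ -0.8224670 (= -pi^2/12)
```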
Conditional convergence
A series is conditionally convergent if it converges but does not converge absolutely.
For example, the harmonic series

$$\sum_{n=1}^{\infty}\frac{1}{n}$$

diverges, while the alternating version

$$\sum_{n=1}^{\infty}\frac{(-1)^{n+1}}{n}$$

converges by the alternating series test.
Rearrangements
For any series, we can create a new series by rearranging the order of summation. A series is unconditionally convergent if any rearrangement creates a series that converges to the same sum as the original series. Absolutely convergent series are unconditionally convergent. But the Riemann series theorem states that a conditionally convergent series can be rearranged to converge to any chosen value, or to diverge.[1] The general principle is that addition of infinite sums is only commutative for absolutely convergent series.
For example, one false proof that 1=0 exploits the failure of associativity for infinite sums.
As another example, by the Mercator series

$$\ln 2=\sum_{n=1}^{\infty}\frac{(-1)^{n+1}}{n}=1-\frac{1}{2}+\frac{1}{3}-\frac{1}{4}+\cdots.$$

But, since the series does not converge absolutely, we can rearrange the terms to obtain a series for $\tfrac{1}{2}\ln 2$:

$$\left(1-\frac{1}{2}\right)-\frac{1}{4}+\left(\frac{1}{3}-\frac{1}{6}\right)-\frac{1}{8}+\left(\frac{1}{5}-\frac{1}{10}\right)-\frac{1}{12}+\cdots=\frac{1}{2}-\frac{1}{4}+\frac{1}{6}-\frac{1}{8}+\cdots=\frac{1}{2}\ln 2.$$
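The rearrangement can be reproduced numerically (a sketch; the grouping follows the display above, taking for each $k$ the terms $+\tfrac{1}{2k-1}$, $-\tfrac{1}{2(2k-1)}$, $-\tfrac{1}{4k}$, so that every term of the original series is used exactly once):

```python
import math

# Rearranged alternating harmonic series converging to (1/2) ln 2.
total = 0.0
for k in range(1, 100_001):
    total += 1 / (2 * k - 1) - 1 / (2 * (2 * k - 1)) - 1 / (4 * k)
print(total)               # ~0.3465736
print(0.5 * math.log(2))   # ~0.3465736
```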
Series acceleration
In practice, the numerical summation of an alternating series may be sped up using any one of a variety of series acceleration techniques. One of the oldest techniques is that of Euler summation, and there are many modern techniques that can offer even more rapid convergence.
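As a concrete instance (a minimal sketch, assuming the standard form of Euler's transformation $\sum_{n\ge 0}(-1)^{n}a_{n}=\sum_{k\ge 0}(-1)^{k}\Delta^{k}a_{0}/2^{k+1}$; the function name and truncation depth are invented for this example), the transformation turns the slowly converging alternating harmonic series into a rapidly converging one:

```python
import math

def euler_transform_sum(a, k_max=20):
    """Sum an alternating series sum_{n>=0} (-1)^n a(n) via Euler's
    transformation: sum_{k>=0} (-1)^k (Delta^k a)_0 / 2^(k+1), where
    Delta is the forward-difference operator."""
    terms = [a(n) for n in range(k_max + 1)]
    total = 0.0
    for k in range(k_max + 1):
        total += (-1) ** k * terms[0] / 2 ** (k + 1)   # terms[0] == Delta^k a_0
        terms = [terms[i + 1] - terms[i] for i in range(len(terms) - 1)]
    return total

# 21 transformed terms reach ~1e-8 accuracy for ln 2, versus an error of
# roughly 0.024 when 21 terms of the series are summed directly.
print(euler_transform_sum(lambda n: 1 / (n + 1)))  # ~0.69314718
print(math.log(2))                                 # 0.69314718...
```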
Notes
- ^ Mallik, A. K. (2007). "Curious Consequences of Simple Sequences". Resonance. 12 (1): 23–37. doi:10.1007/s12045-007-0004-7.