# Calculus/Series

## Introduction

A series is the sum of a sequence of terms. An infinite series is the sum of an infinite number of terms (the actual sum of the series need not be infinite, as we will see below).

An arithmetic series is the sum of a sequence of terms with a common difference (the difference between consecutive terms). For example:

$1+4+7+10+13+\dots$

is an arithmetic series with common difference 3, since $a_2-a_1=3$, $a_3-a_2=3$, and so forth.

A geometric series is the sum of terms with a common ratio. For example, an interesting series which appears in many practical problems in science, engineering, and mathematics is the geometric series $r + r^2 + r^3 + r^4 + \dots$ where the $\dots$ indicates that the series continues indefinitely. A common way to study a particular series (following Cauchy) is to define a sequence consisting of the sum of the first $n$ terms. For example, to study the geometric series we can consider the sequence which adds together the first $n$ terms:

$S_n(r) = \sum_{i=1}^{n} r^i.$

Generally by studying the sequence of partial sums we can understand the behavior of the entire infinite series.

Two of the most important questions about a series are:

• Does it converge?
• If so, what does it converge to?

For example, it is fairly easy to see that for $r > 1$ the geometric series $S_n(r)$ will not converge to a finite number (i.e., it will diverge to infinity). To see this, note that in passing from $S_n(r)$ to $S_{n+1}(r)$ the sum increases by $r^{n+1}$, and $r^{n+1} > 1$ for all $n$ whenever $r > 1$. A sum that grows by more than one at every step must diverge.

Perhaps a more surprising and interesting fact is that for $|r| < 1$, $S_n(r)$ will converge to a finite value. Specifically, it is possible to show that

$\lim_{n \to \infty} S_n(r) = \frac{r}{1-r}.$

Indeed, consider the quantity

$(1-r)S_n(r) = (1-r)\sum_{i=1}^{n} r^i = \sum_{i=1}^{n} r^i - \sum_{i=2}^{n+1} r^i = r - r^{n+1}$

Since $r^{n+1}\to 0$ as $n\to \infty$ for $|r|<1$, this shows that $(1-r)S_n(r)\to r$ as $n\to \infty$. The quantity $1-r$ is non-zero and doesn't depend on $n$ so we can divide by it and arrive at the formula we want.
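As a quick numerical sanity check (a Python sketch, not part of the derivation), we can compare the partial sums $S_n(r)$ against the claimed limit $r/(1-r)$:

```python
# Compare the geometric partial sums S_n(r) = r + r^2 + ... + r^n
# with the claimed limit r / (1 - r) for |r| < 1.

def geometric_partial_sum(r, n):
    """Sum of r**i for i = 1..n."""
    return sum(r**i for i in range(1, n + 1))

r = 0.5
limit = r / (1 - r)  # 1.0 for r = 0.5
assert abs(geometric_partial_sum(r, 50) - limit) < 1e-12
```

By $n = 50$ the remaining error, $r^{n+1}/(1-r)$, is already far below floating-point noise.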

We'd like to be able to draw similar conclusions about any series.

Unfortunately, there is no simple way to sum an arbitrary series. In most cases the best we can do is determine whether it converges. Geometric and telescoping series are the only common types whose sums can easily be found.

## Convergence

It is obvious that for a series $\sum a_n$ to converge, the terms $a_n$ must tend to zero (if infinitely many terms exceeded some fixed positive number, the partial sums could never settle down to a limit), but even if the limit of the sequence is 0, this is not sufficient to guarantee convergence.

Consider the harmonic series, the sum of $1/n$, and group its terms:

$\begin{matrix} \sum_1^{2^m} \frac{1}{n} & = &1+\frac{1}{2} & + &\frac{1}{3}+\frac{1}{4} & + &\frac{1}{5}+\frac{1}{6}+\frac{1}{7}+\frac{1}{8} &+\ldots & + &\sum_{p=1+2^{m-1}}^{2^m} \frac{1}{p} \\ & > &\frac{3}{2} & + &\frac{1}{4}(2) & + &\frac{1}{8}(4) &+\ldots & + & \frac{1}{2^m}2^{m-1} \\ & = &\frac{3}{2} & + &\frac{1}{2} & + &\frac{1}{2} &+\ldots & + & \frac{1}{2} \quad (m-1 \mbox{ terms of } \tfrac{1}{2}) \end{matrix}$

As $m$ tends to infinity, so does this final sum; hence the series diverges.

We can also deduce something about how quickly it diverges. Using the same grouping of terms, we can put upper and lower bounds on the partial sums:

$1+\frac{m}{2} \le \sum_1^{2^m} \frac{1}{n} \le 1+m$

or

$1+\frac{\log_2 m}{2}\le \sum_1^m \frac{1}{n} \le 1+ \log_2 m$

and the partial sums grow like $\log m$, i.e., very slowly.
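A short Python sketch (illustrative only) checks the grouping bounds $1+\tfrac{m}{2} \le \sum_1^{2^m} \tfrac{1}{n} \le 1+m$ numerically:

```python
def harmonic_sum(N):
    """Partial sum 1 + 1/2 + ... + 1/N of the harmonic series."""
    return sum(1.0 / n for n in range(1, N + 1))

# For N = 2^m the grouping argument gives 1 + m/2 <= H_N <= 1 + m.
for m in range(1, 15):
    H = harmonic_sum(2**m)
    assert 1 + m / 2 <= H <= 1 + m
```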

### Comparison test

The argument above, based on considering upper and lower bounds on terms, can be modified to provide a general-purpose test for convergence and divergence called the comparison test (or direct comparison test). It can be applied to any series with nonnegative terms:

• If $\sum b_n$ converges and $0 \le a_n \le b_n$, then $\sum a_n$ converges.
• If $\sum b_n$ diverges and $0 \le b_n \le a_n$, then $\sum a_n$ diverges.

There are many such tests for convergence and divergence, the most important of which we will describe below.
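For instance, a small Python sketch (with the illustrative choice $a_n = 1/(n^2+n)$ and $b_n = 1/n^2$, not taken from the text) shows every partial sum of $\sum a_n$ dominated by the corresponding partial sum of a convergent series:

```python
def partial_sums(terms):
    """Running partial sums of a sequence of terms."""
    total, out = 0.0, []
    for t in terms:
        total += t
        out.append(total)
    return out

N = 1000
a = [1.0 / (n**2 + n) for n in range(1, N + 1)]  # 0 <= a_n <= b_n
b = [1.0 / n**2 for n in range(1, N + 1)]        # sum b_n converges

# Since 0 <= a_n <= b_n, each partial sum of a is bounded by that of b,
# so by the comparison test sum a_n converges as well.
for sa, sb in zip(partial_sums(a), partial_sums(b)):
    assert sa <= sb
```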

### Absolute convergence

Theorem: If the series of absolute values, $\sum_{n=1}^\infty \left| a_n \right|$, converges, then so does the series $\sum_{n=1}^\infty a_n$.

We say such a series converges absolutely.

Proof:

Let $\epsilon>0$.

By the Cauchy criterion for series convergence, there exists $N$ such that for all $m \ge n > N$:

$\sum_{k=n}^{m}|a_{k}|<\epsilon$

By the triangle inequality, we know that:

$\left|\sum_{k=n}^{m}a_{k}\right|\leq\sum_{k=n}^{m}|a_{k}|$

Combining the two inequalities gives:

$\left|\sum_{k=n}^{m}a_{k}\right|\leq\sum_{k=n}^{m}|a_{k}|<\epsilon$

so the partial sums of $\sum a_n$ satisfy the Cauchy criterion for series convergence as well.

Q.E.D.

The converse does not hold. The alternating harmonic series $1-\frac{1}{2}+\frac{1}{3}-\frac{1}{4}+\dots$ converges, even though the series of its absolute values, the harmonic series, diverges.

A series like this that converges, but not absolutely, is said to converge conditionally.

If a series converges absolutely, we can add terms in any order we like. The limit will still be the same.

If a series converges conditionally, rearranging the terms can change the limit. In fact, we can make the series converge to any limit we like by choosing a suitable rearrangement.

E.g., in the series $1-\frac{1}{2}+\frac{1}{3}-\frac{1}{4}+\dots$, we can add positive terms until the partial sum exceeds 100, then subtract $\frac{1}{2}$, then add positive terms until the partial sum again exceeds 100, then subtract $\frac{1}{4}$, and so on; this produces a rearrangement, with the same terms, that converges to 100.
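The same procedure can be sketched in Python. To keep the run short we use a target of 1.5 rather than 100 (reaching 100 would require astronomically many terms, since the positive part grows only logarithmically); the function name and step count are illustrative choices:

```python
# Rearrange 1 - 1/2 + 1/3 - 1/4 + ... toward a chosen target:
# add positive terms 1, 1/3, 1/5, ... while at or below the target,
# otherwise subtract the next negative term 1/2, 1/4, 1/6, ...
def rearranged_partial_sum(target, steps):
    pos, neg = 1, 2  # next odd (positive) and even (negative) denominators
    total = 0.0
    for _ in range(steps):
        if total <= target:
            total += 1.0 / pos
            pos += 2
        else:
            total -= 1.0 / neg
            neg += 2
    return total

assert abs(rearranged_partial_sum(1.5, 100000) - 1.5) < 1e-3
```

The overshoot at each sign change is bounded by the last term used, and those terms shrink to zero, which is why the rearranged partial sums settle on the target.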

This makes absolutely convergent series easier to work with. Thus, all but one of the convergence tests in this chapter will be for series whose terms are all positive; such a series either converges absolutely or diverges. Other series will be studied by considering the corresponding series of absolute values.

### Ratio test

For a series with terms $a_n$, if

$\lim_{n \to \infty } \left| \frac{a_{n+1}}{a_n} \right| = r$

then

• the series converges (absolutely) if r<1
• the series diverges if r>1 (or if r is infinity)
• the series could do either if r=1, so the test is not conclusive in this case.

E.g., suppose

$a_n=\frac{n!n!}{(2n)!}$

then

$\frac{a_{n+1}}{a_n}=\frac{(n+1)^2}{(2n+1)(2n+2)}=\frac{n+1}{4n+2} \to \frac{1}{4}$

so this series converges.
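A Python sketch can confirm numerically that the ratio approaches $\tfrac{1}{4}$:

```python
from math import factorial

def a(n):
    """a_n = (n!)^2 / (2n)!"""
    return factorial(n)**2 / factorial(2 * n)

# The ratios a_{n+1}/a_n = (n+1)/(4n+2) should approach 1/4.
ratios = [a(n + 1) / a(n) for n in range(1, 50)]
assert abs(ratios[-1] - 0.25) < 0.01
```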

### Integral test

If f(x) is a monotonically decreasing, always positive function, then the series

$\sum_{n=1}^\infty f(n)$

converges if and only if the integral

$\int_1^\infty f(x)dx$

converges.

E.g., consider $f(x)=1/x^p$, for a fixed $p$.

• If $p=1$ this is the harmonic series, which diverges.
• If $p<1$ each term is larger than the corresponding term of the harmonic series, so it diverges.
• If $p>1$ then
$\begin{matrix}\int_1^\infty x^{-p}dx & = & \lim_{s \to \infty}\int_1^s x^{-p}dx & \\ & = & \lim_{s \to \infty } \left. \frac{-1}{(p-1)x^{p-1}} \right|^s_1 & \\ & = & \lim_{s \to \infty } \left( \frac{1}{p-1}-\frac{1}{(p-1)s^{p-1}} \right) & =\frac{1}{p-1} \end{matrix}$

The integral converges, for p>1, so the series converges.

We can prove this test works by writing the integral as

$\int_1^\infty f(x)dx=\sum_{n=1}^\infty \int_n^{n+1} f(x)dx$

and comparing each of the integrals with rectangles, giving the inequalities

$f(n) \ge \int_n^{n+1} f(x)dx \ge f(n+1)$

Summing these inequalities over $n$ then shows that the series and the integral either both converge or both diverge.
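These rectangle inequalities can also be checked numerically. The sketch below uses the illustrative case $f(x)=1/x^2$, where $\int_a^\infty x^{-2}\,dx = 1/a$, to sandwich a tail of the series between two integrals:

```python
def tail(N, terms=200000):
    """Numerically approximate the sum of 1/n^2 over n > N (truncated)."""
    return sum(1.0 / n**2 for n in range(N + 1, N + 1 + terms))

N = 10
# The integral of x^(-2) from a to infinity equals 1/a, so the rectangle
# comparison gives 1/(N+1) <= sum_{n>N} 1/n^2 <= 1/N.
assert 1.0 / (N + 1) <= tail(N) <= 1.0 / N
```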

### Limit comparison test

Given an infinite series $\textstyle\sum a_n$ with positive terms only, if one can find another infinite series $\textstyle\sum b_n$ with positive terms for which

$\lim_{n \to \infty} \frac{a_n}{b_n} = L$

for a positive and finite L (i.e., the limit exists and is not zero), then the two series either both converge or both diverge. That is,

• $\textstyle\sum a_n$ converges if $\textstyle\sum b_n$ converges, and
• $\textstyle\sum a_n$ diverges if $\textstyle\sum b_n$ diverges.

Example:

$a_n = n^{-\frac{n+1}{n}}$

For large $n$, the terms of this series are similar to, but smaller than, those of the harmonic series, so we take $b_n=1/n$ and compute the limit of the ratio:

$\lim \frac{a_n}{b_n} = \lim \frac{n^{-\frac{n+1}{n}}}{1/n} = \lim \frac{n}{n^{\frac{n+1}{n}}} = \lim \frac {1}{n^{\frac {1}{n}}}=1>0$

so this series diverges.
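A brief numerical check of this limit (a Python sketch):

```python
def a(n):
    """a_n = n^(-(n+1)/n)"""
    return n ** (-(n + 1) / n)

# a_n / b_n with b_n = 1/n simplifies to n^(-1/n), which increases toward 1.
ratios = [a(n) * n for n in (10, 100, 10000)]
assert ratios[0] < ratios[1] < ratios[2] < 1.0
assert abs(ratios[-1] - 1.0) < 0.01
```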

### Alternating series

Given an infinite series $\sum a_n$, if the signs of the $a_n$ alternate, that is, if

$a_n=(-1)^n |a_n| \,$

for all n or

$a_n=(-1)^{n+1} |a_n| \,$

for all n, then we call it an alternating series.

The alternating series test states that such a series converges if

$\lim_{n \to \infty}a_n=0$

and

$\ |a_{n+1}| < |a_n|$

(that is, the magnitude of the terms is decreasing).

Note that this test cannot lead to the conclusion that the series diverges; if one cannot conclude that the series converges, this test is inconclusive, although other tests may, of course, be used to give a conclusion.

#### Estimating the sum of an alternating series

The absolute error that results from using a partial sum of an alternating series to estimate the sum of the infinite series is smaller than the magnitude of the first omitted term.

$\left| \sum_{n=1}^\infty a_n - \sum_{n=1}^m a_n \right| < |a_{m+1}|$
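A Python sketch, using the alternating harmonic series (whose sum is $\ln 2$), illustrates the error bound:

```python
from math import log

def alt_partial(m):
    """m-th partial sum of 1 - 1/2 + 1/3 - 1/4 + ..."""
    return sum((-1) ** (n + 1) / n for n in range(1, m + 1))

# The error of the m-th partial sum is smaller than the first
# omitted term, 1/(m+1).
for m in (5, 50, 500):
    error = abs(log(2) - alt_partial(m))
    assert error < 1.0 / (m + 1)
```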

## Geometric series

The geometric series can take either of the following forms

$\sum_{n=0}^\infty ar^n$ or $\sum_{n=1}^\infty ar^{n-1}$

As we saw at the start (where the first term was $a=r$), the sum of the geometric series is

$s=\lim_{n\to\infty} S_n =\lim_{n\to\infty}\frac{a(1-r^n)}{1-r}=\frac{a}{1-r}\quad\mbox{ for } |r| < 1$.
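The finite-sum formula $S_n = a(1-r^n)/(1-r)$ is easy to check against a direct summation (a Python sketch with arbitrary illustrative values of $a$ and $r$):

```python
def geometric_sum(a, r, n):
    """Sum of a*r^k for k = 0..n-1, via the closed form a(1-r^n)/(1-r)."""
    return a * (1 - r**n) / (1 - r)

# The closed form agrees with the direct sum...
direct = sum(3 * 0.5**k for k in range(10))
assert abs(geometric_sum(3, 0.5, 10) - direct) < 1e-12
# ...and approaches a/(1-r) = 6 as n grows.
assert abs(geometric_sum(3, 0.5, 60) - 6.0) < 1e-12
```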

## Telescoping series

A telescoping series is one whose consecutive terms cancel in pairs, such as

$\sum_{n=0}^\infty (b_n - b_{n+1})$

Expanding (or "telescoping") this type of series is informative. If we expand this series, we get:

$\sum_{n=0}^k (b_n - b_{n+1})= (b_0 - b_1) + (b_1 - b_2) + \dots + (b_k - b_{k+1})$

$\sum_{n=0}^k (b_n - b_{n+1})= b_0 - b_{k+1}$
$\sum_{n=0}^\infty (b_n - b_{n+1}) = \lim_{k \to \infty} \sum_{n=0}^k (b_n - b_{n+1}) = \lim_{k \to \infty} (b_0 - b_{k+1}) = b_0 - \lim_{k \to \infty} b_k$
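A Python sketch with the illustrative choice $b_n = 1/(n+1)$, so that $b_n - b_{n+1} = \frac{1}{(n+1)(n+2)}$ and the sum should be $b_0 - 0 = 1$:

```python
def telescoping_partial(k):
    """Partial sum of 1/((n+1)(n+2)) for n = 0..k."""
    return sum(1.0 / ((n + 1) * (n + 2)) for n in range(k + 1))

# The partial sum should equal b_0 - b_{k+1} = 1 - 1/(k+2)...
k = 10
assert abs(telescoping_partial(k) - (1 - 1.0 / (k + 2))) < 1e-12
# ...and approach b_0 = 1 as k grows.
assert abs(telescoping_partial(100000) - 1.0) < 1e-4
```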