# Calculus/Power series


The study of power series is aimed at investigating series which can approximate some function over a certain interval.

## Motivations

Elementary calculus (differentiation) is used to obtain information on a line which touches a curve at one point (i.e. a tangent). This is done by calculating the gradient, or slope of the curve, at a single point. However, this does not provide us with reliable information on the curve's actual value at given points in a wider interval. This is where the concept of power series becomes useful.

### An example

Consider the curve of ${\displaystyle y=\cos(x)}$ , about the point ${\displaystyle x=0}$ . A naïve approximation would be the line ${\displaystyle y=1}$ . However, for a more accurate approximation, observe that ${\displaystyle \cos(x)}$ looks like an inverted parabola around ${\displaystyle x=0}$ - therefore, we might think about which parabola could approximate the shape of ${\displaystyle \cos(x)}$ near this point. This curve might well come to mind:

${\displaystyle y=1-{\frac {x^{2}}{2}}}$

In fact, this is the best estimate for ${\displaystyle \cos(x)}$ which uses polynomials of degree 2 (i.e. a highest term of ${\displaystyle x^{2}}$) - but how do we know this is true? This is the study of power series: finding optimal approximations to functions using polynomials.
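As an illustrative sketch (Python, using only the standard `math` module; the helper name is invented for this example), we can check numerically how closely the parabola tracks ${\displaystyle \cos(x)}$ near zero:

```python
import math

# Compare cos(x) with the degree-2 approximation 1 - x^2/2 near x = 0.
def quadratic_approx(x):
    return 1 - x**2 / 2

for x in [0.0, 0.1, 0.5, 1.0]:
    error = abs(math.cos(x) - quadratic_approx(x))
    print(f"x = {x:4.1f}  cos(x) = {math.cos(x):.6f}  "
          f"approx = {quadratic_approx(x):.6f}  error = {error:.2e}")
```

The error shrinks rapidly as ${\displaystyle x\to 0}$ , roughly like ${\displaystyle x^{4}/24}$ , which is the next term of the Taylor series for ${\displaystyle \cos(x)}$ .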

## Definition

A power series (in one variable) is an infinite series of the form

${\displaystyle f(x)=a_{0}(x-c)^{0}+a_{1}(x-c)^{1}+a_{2}(x-c)^{2}+\cdots }$ (where ${\displaystyle c}$ is a constant)

or, equivalently,

${\displaystyle f(x)=\sum _{n=0}^{\infty }a_{n}(x-c)^{n}}$

When using a power series as an alternative method of calculating a function's value, the equation

${\displaystyle f(x)=\sum _{n=0}^{\infty }a_{n}(x-c)^{n}}$

can only be used to study ${\displaystyle f(x)}$ where the power series converges - this may happen for a finite range, or for all real numbers.

The distance from the centre to the edge of the interval in which the power series converges to the function is known as the radius of convergence.

### An example

${\displaystyle {\frac {1}{1-x}}=\sum _{n=0}^{\infty }x^{n}}$ (a geometric series)

this converges when ${\displaystyle |x|<1}$ , that is, over the range ${\displaystyle -1<x<1}$ , so the radius of convergence - centered at 0 - is 1. It should also be observed that at the extremities of this interval, that is where ${\displaystyle x=1}$ and ${\displaystyle x=-1}$ , the power series does not converge.
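A small numerical sketch (Python; the helper name is invented for this example) makes the behaviour inside the interval and at its endpoints visible:

```python
# Partial sums of the geometric series sum of x^n, compared with 1/(1-x).
def geometric_partial_sum(x, terms):
    """Sum of x^n for n = 0 .. terms-1."""
    return sum(x**n for n in range(terms))

# Inside the radius of convergence (|x| < 1) the partial sums approach 1/(1-x).
x = 0.5
print(geometric_partial_sum(x, 50), 1 / (1 - x))  # both very close to 2.0

# At the endpoint x = -1 the partial sums oscillate between 1 and 0 forever.
print([geometric_partial_sum(-1, k) for k in range(1, 6)])  # [1, 0, 1, 0, 1]
```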

### Another example

${\displaystyle e^{x}=\sum _{n=0}^{\infty }{\frac {x^{n}}{n!}}}$

Using the ratio test, this series converges when the limit of the absolute ratio of successive terms is less than one:

${\displaystyle \lim _{n\to \infty }\left|{\frac {x^{n+1}}{(n+1)!}}{\frac {n!}{x^{n}}}\right|<1}$
${\displaystyle =\lim _{n\to \infty }\left|{\frac {x^{n}x^{1}}{n!(n+1)}}{\frac {n!}{x^{n}}}\right|<1}$
${\displaystyle =\lim _{n\to \infty }\left|{\frac {x}{n+1}}\right|<1}$

which is always true - therefore, this power series has an infinite radius of convergence. In effect, this means that the power series can always be used as a valid alternative to the original function, ${\displaystyle e^{x}}$ .
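A short numerical sketch (Python, standard library only; the helper name is invented) shows the partial sums converging to ${\displaystyle e^{x}}$ even for ${\displaystyle x}$ far from the centre:

```python
import math

# Partial sums of sum of x^n / n!, building each term from the previous one
# so that no large factorial is ever computed directly.
def exp_series(x, terms=30):
    total, term = 0.0, 1.0  # term starts as x^0 / 0! = 1
    for n in range(terms):
        total += term
        term *= x / (n + 1)  # turns x^n / n! into x^(n+1) / (n+1)!
    return total

# The series converges for every x, not just near 0.
for x in [1.0, 5.0, -3.0]:
    print(x, exp_series(x), math.exp(x))
```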

### Abstraction

If we use the ratio test on an arbitrary power series ${\displaystyle \sum _{n=0}^{\infty }a_{n}x^{n}}$ (taking ${\displaystyle c=0}$ for simplicity), the common factors cancel and we find that it converges when

${\displaystyle \lim _{n\to \infty }{\frac {|a_{n+1}x|}{|a_{n}|}}<1}$

and diverges when

${\displaystyle \lim _{n\to \infty }{\frac {|a_{n+1}x|}{|a_{n}|}}>1}$

The radius of convergence is therefore

${\displaystyle r=\lim _{n\to \infty }{\frac {|a_{n}|}{|a_{n+1}|}}}$

If this limit diverges to infinity, the series has an infinite radius of convergence.
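This limit can be checked numerically; the following sketch (Python; the names are illustrative) applies the coefficient ratio to the two series above:

```python
import math

# Estimate the radius of convergence r = lim |a_n| / |a_{n+1}| by
# evaluating the ratio at one large n.
def radius_estimate(coeff, n=50):
    return abs(coeff(n)) / abs(coeff(n + 1))

# Geometric series: a_n = 1, so r = 1.
print(radius_estimate(lambda n: 1.0))  # 1.0

# Exponential series: a_n = 1/n!, so the ratio is n + 1 and grows without bound.
print(radius_estimate(lambda n: 1.0 / math.factorial(n)))  # ≈ 51.0 at n = 50
```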

## Differentiation and Integration

Within its radius of convergence, a power series can be differentiated and integrated term by term.

${\displaystyle {\frac {d}{dx}}\left[\sum _{n=0}^{\infty }a_{n}(x-c)^{n}\right]=\sum _{n=0}^{\infty }a_{n+1}(n+1)(x-c)^{n}}$
${\displaystyle \int \sum _{n=0}^{\infty }a_{n}(x-c)^{n}dx=\sum _{n=1}^{\infty }{\frac {a_{n-1}(x-c)^{n}}{n}}+k}$

Both the derivative and the integral have the same radius of convergence as the original series.

This allows us to evaluate certain power series exactly. For example,

${\displaystyle {\frac {1}{1+x}}=1-x+x^{2}-x^{3}\pm \cdots }$

This is a geometric series, which converges for ${\displaystyle |x|<1}$ . Integrating both sides, we get

${\displaystyle \ln(1+x)=x-{\frac {x^{2}}{2}}+{\frac {x^{3}}{3}}\pm \cdots }$

which will also converge for ${\displaystyle |x|<1}$ . When ${\displaystyle x=-1}$ the right-hand side is the negative of the harmonic series, which diverges; when ${\displaystyle x=1}$ it is an alternating series with diminishing terms, which converges to ${\displaystyle \ln(2)}$ - this is checking the behaviour at the extremities.
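As a quick numerical sketch (Python; the helper name is invented here), the endpoint behaviour at ${\displaystyle x=1}$ can be seen directly:

```python
import math

# Partial sums of x - x^2/2 + x^3/3 - ...
def ln1p_series(x, terms):
    return sum((-1)**(n + 1) * x**n / n for n in range(1, terms + 1))

# At x = 1 the alternating series converges to ln(2), but only slowly:
print(ln1p_series(1.0, 10), math.log(2))      # noticeably apart
print(ln1p_series(1.0, 10000), math.log(2))   # much closer
```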

It also lets us write series for integrals we cannot evaluate exactly, such as the error function:

${\displaystyle e^{-x^{2}}=\sum _{n=0}^{\infty }(-1)^{n}{\frac {x^{2n}}{n!}}}$

The left-hand side cannot be integrated in terms of elementary functions, but the right-hand side can be.

${\displaystyle \int \limits _{0}^{z}e^{-x^{2}}dx=\sum _{n=0}^{\infty }{\frac {(-1)^{n}z^{2n+1}}{(2n+1)n!}}}$

This gives us a series for the integral, which has an infinite radius of convergence, letting us approximate the integral as closely as we like.

Note that, as written, this is not in the standard power series form, since the power of ${\displaystyle z}$ is not the index ${\displaystyle n}$ ; it can, however, be regarded as a power series in ${\displaystyle z}$ in which the coefficients of the even powers are all zero.
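As a closing sketch, the series above can be summed numerically and checked against direct numerical integration (Python; the function names are invented for this example):

```python
import math

# Term-by-term sum of (-1)^n z^(2n+1) / ((2n+1) n!), compared against a
# crude midpoint-rule integration of e^(-x^2).
def integral_series(z, terms=30):
    return sum((-1)**n * z**(2 * n + 1) / ((2 * n + 1) * math.factorial(n))
               for n in range(terms))

def integral_numeric(z, steps=100_000):
    h = z / steps
    return h * sum(math.exp(-((i + 0.5) * h)**2) for i in range(steps))

print(integral_series(1.0), integral_numeric(1.0))  # both ≈ 0.746824
```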