# Analytic Combinatorics/Cauchy-Hadamard theorem and Cauchy's inequality

## Introduction

Two of the most basic means of estimating coefficients of generating functions are the Cauchy-Hadamard theorem and Cauchy's inequality.

We also include some background knowledge which will be useful for future chapters.

### Limit superior

One key concept in analysis is a sequence of numbers. In our case, the sequence of numbers could be the coefficients of the generating function we are interested in, written ${\displaystyle \{a_{n}\}}$.

A point of accumulation of a sequence is a number ${\displaystyle t}$ such that, given ${\displaystyle \epsilon >0}$, there are infinitely many ${\displaystyle n}$ such that[1]

${\displaystyle |a_{n}-t|<\epsilon }$.

For example, the sequence of coefficients of ${\displaystyle {\frac {1}{1-z}}}$ (${\displaystyle \{1,1,1,\cdots \}}$) has point of accumulation ${\displaystyle 1}$. That of ${\displaystyle e^{z}}$ (${\displaystyle \{1,1,{\frac {1}{2!}},{\frac {1}{3!}},\cdots \}}$) has point of accumulation ${\displaystyle 0}$. That of ${\displaystyle {\frac {1}{1-z^{2}}}}$ (${\displaystyle \{1,0,1,0,\cdots \}}$) has two points of accumulation, ${\displaystyle 0}$ and ${\displaystyle 1}$.

One useful property of a sequence of numbers is its limit superior, written ${\displaystyle \limsup a_{n}}$. This is the least upper bound of the set of points of accumulation of the sequence ${\displaystyle \{a_{n}\}}$[2].

In our above examples, these would be ${\displaystyle 1}$, ${\displaystyle 0}$ and ${\displaystyle 1}$ respectively.
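As a concrete illustration, the limit superior can be approximated numerically as the supremum of a long tail of the sequence, since ${\displaystyle \limsup a_{n}=\lim _{N\to \infty }\sup _{n\geq N}a_{n}}$. The sketch below (helper name is our own, for illustration only) checks the three examples above:

```python
import math

def tail_sup(seq, start=100):
    # limsup a_n = lim over N of sup_{n >= N} a_n; approximate it with the
    # supremum of a long tail of the sequence (hypothetical helper)
    return max(seq[start:])

# Coefficient sequences from the three examples above
geometric = [1.0] * 150                                      # 1/(1-z): 1, 1, 1, ...
exponential = [1.0 / math.factorial(n) for n in range(150)]  # e^z: 1, 1, 1/2!, ...
alternating = [1.0 - n % 2 for n in range(150)]              # 1/(1-z^2): 1, 0, 1, 0, ...

print(tail_sup(geometric))    # 1.0
print(tail_sup(exponential))  # ~0 (approaches the limsup, 0)
print(tail_sup(alternating))  # 1.0
```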

### Convergence

${\displaystyle f(z)}$ is said to converge if its series expansion ${\displaystyle \sum a_{n}z^{n}}$ sums to a finite value.

It may do so only for particular values of ${\displaystyle z}$. There are various tests for whether a series converges and for which values of ${\displaystyle z}$.

For example, ${\displaystyle {\frac {1}{1-z}}}$ has series expansion ${\displaystyle \sum z^{n}}$. We can test this series for convergence with D'Alembert's ratio test[3], which states that a series ${\displaystyle \sum b_{n}}$ converges if

${\displaystyle \lim _{n\to \infty }\left|{\frac {b_{n+1}}{b_{n}}}\right|<1}$

In our example, the ratio of consecutive terms is ${\displaystyle \left|{\frac {z^{n+1}}{z^{n}}}\right|=|z|}$, which is less than ${\displaystyle 1}$ only if ${\displaystyle |z|<1}$. Therefore, the series converges for ${\displaystyle |z|<1}$.

The radius of convergence of ${\displaystyle f(z)}$ is the value ${\displaystyle z_{0}}$ such that for ${\displaystyle |z|<z_{0}}$ the series expansion converges.

In our example, the radius of convergence of ${\displaystyle {\frac {1}{1-z}}}$ is ${\displaystyle 1}$.
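A quick numerical check (our own sketch, not from the text): inside the radius of convergence the partial sums of ${\displaystyle \sum z^{n}}$ settle on ${\displaystyle {\frac {1}{1-z}}}$.

```python
# Partial sums of sum z^n approach 1/(1-z) when |z| < 1
z = 0.5
partial = sum(z**n for n in range(200))
print(partial, 1 / (1 - z))  # both are 2.0 (to floating-point precision)
```

At ${\displaystyle z=1}$, by contrast, the partial sums grow without bound.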

It should be noted that the radius of convergence is equal to the distance from the origin to the singularity of the function nearest the origin.[4] We will read about singularities later.

### Theorem

If ${\displaystyle f(z)=\sum a_{n}z^{n}}$ and ${\displaystyle R}$ is its radius of convergence, then[5]:

${\displaystyle {\frac {1}{R}}=\limsup _{n\to \infty }|a_{n}|^{1/n}}$

One consequence of this theorem is[6]:

${\displaystyle |a_{n}|\leq \left({\frac {1}{R}}+\epsilon \right)^{n}}$ for all sufficiently large ${\displaystyle n}$, and ${\displaystyle |a_{n}|\geq \left({\frac {1}{R}}-\epsilon \right)^{n}}$ for infinitely many ${\displaystyle n}$ (for every ${\displaystyle \epsilon >0}$)
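The theorem can be checked numerically. For ${\displaystyle {\frac {1}{1-2z}}}$ the coefficients are ${\displaystyle a_{n}=2^{n}}$ and the radius of convergence is ${\displaystyle {\frac {1}{2}}}$, so ${\displaystyle |a_{n}|^{1/n}}$ should approach ${\displaystyle {\frac {1}{R}}=2}$ (an illustrative sketch, not from the text):

```python
# For f(z) = 1/(1 - 2z), a_n = 2^n and R = 1/2, so |a_n|^(1/n) -> 1/R = 2
a = [2.0**n for n in range(1, 60)]
roots = [abs(a_n)**(1.0 / n) for n, a_n in enumerate(a, start=1)]
print(roots[-1])  # 2.0
```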

### Proof

Proof due to Wilf[7] and Lang[8].

The radius of convergence ${\displaystyle R}$ of a function ${\displaystyle f(z)}$ means that if ${\displaystyle |z|<R}$ then ${\displaystyle f(z)}$ converges to a finite value[9].

Take ${\displaystyle f(z)=\sum a_{n}z^{n}}$, ${\displaystyle R}$ its radius of convergence and ${\displaystyle t=\limsup _{n\to \infty }|a_{n}|^{1/n}}$. By definition of ${\displaystyle \limsup }$[10], given any ${\displaystyle \epsilon >0}$, for all but a finite number of ${\displaystyle n}$

${\displaystyle |a_{n}|\leq (t+\epsilon )^{n}}$.

If ${\displaystyle |z|<{\frac {1}{t+\epsilon }}}$, then ${\displaystyle |a_{n}z^{n}|\leq ((t+\epsilon )|z|)^{n}}$ for all but finitely many ${\displaystyle n}$, and the right-hand side is a convergent geometric series, so ${\displaystyle f(z)}$ converges. Therefore ${\displaystyle R\geq {\frac {1}{t+\epsilon }}}$[11].

By definition of ${\displaystyle \limsup }$, there exist infinitely many ${\displaystyle n}$ such that

${\displaystyle |a_{n}|\geq (t-\epsilon )^{n}}$.

If ${\displaystyle |z|={\frac {1}{t-\epsilon }}}$ then ${\displaystyle |a_{n}z^{n}|\geq 1}$ for infinitely many ${\displaystyle n}$, so the terms of the series do not tend to ${\displaystyle 0}$ and ${\displaystyle f(z)}$ does not converge. Therefore ${\displaystyle R\leq {\frac {1}{t-\epsilon }}}$[12].

Since ${\displaystyle {\frac {1}{t+\epsilon }}\leq R\leq {\frac {1}{t-\epsilon }}}$ for every ${\displaystyle \epsilon >0}$, letting ${\displaystyle \epsilon \to 0}$ gives ${\displaystyle R={\frac {1}{t}}}$, that is, ${\displaystyle {\frac {1}{R}}=\limsup _{n\to \infty }|a_{n}|^{1/n}}$[13].

Now, we prove the consequence of the theorem.

Since ${\displaystyle {\frac {1}{R}}=\limsup _{n\to \infty }|a_{n}|^{1/n}}$, by definition of ${\displaystyle \limsup }$, for all but a finite number of ${\displaystyle n}$

${\displaystyle |a_{n}|\leq \left({\frac {1}{R}}+\epsilon \right)^{n}}$

and there exist infinitely many ${\displaystyle n}$ such that

${\displaystyle |a_{n}|\geq \left({\frac {1}{R}}-\epsilon \right)^{n}}$.

## Cauchy's inequality

### Complex numbers

A complex number is a number ${\displaystyle z=x+iy}$ where ${\displaystyle x}$ and ${\displaystyle y}$ are both real numbers and ${\displaystyle i}$ is the imaginary unit where ${\displaystyle i^{2}=-1}$. ${\displaystyle x}$ is called the real component and ${\displaystyle y}$ the imaginary component (even though ${\displaystyle y}$ is itself a real number).

### Contour integration

Because a complex number has two components, real and imaginary, complex integration involves integrating around a curve in the two-dimensional plane. This is called contour integration.

We denote this:

${\displaystyle \int _{C}f(z)dz}$

where ${\displaystyle C}$ denotes the contour.

It is not necessary to know how to compute contour integrals in order to understand the later material in this book.

### Analytic functions

A function ${\displaystyle f(z)}$ is analytic at a point ${\displaystyle z_{0}}$ if it is defined, single-valued and has a derivative at every point at and around ${\displaystyle z_{0}}$[14].

We say a function ${\displaystyle f(z)}$ is analytic on a set of points ${\displaystyle U}$ if it is analytic at every point of ${\displaystyle U}$[15].

One property of an analytic function is that when performing contour integration on a closed contour ${\displaystyle C}$ we can continuously deform the contour ${\displaystyle C}$ into another closed contour ${\displaystyle C'}$ without changing the value of the integral (as long as in deforming the contour we do not pass through any singularities)[16].

### Cauchy's integral formula

Cauchy's integral formula states:[17]

${\displaystyle f(z_{0})={\frac {1}{2\pi i}}\int _{C}f(z){\frac {dz}{z-z_{0}}}}$

where ${\displaystyle C}$ is a contour, ${\displaystyle z_{0}}$ is a point inside ${\displaystyle C}$ and ${\displaystyle f(z)}$ is analytic on and inside the contour.

Proof: Because ${\displaystyle f(z)}$ is analytic, we can deform the contour ${\displaystyle C}$ into a small circle ${\displaystyle \gamma }$ with centre ${\displaystyle z_{0}}$ and radius ${\displaystyle \rho }$ without changing the value of the integral

${\displaystyle {\frac {1}{2\pi i}}\int _{C}f(z){\frac {dz}{z-z_{0}}}={\frac {1}{2\pi i}}\int _{\gamma }f(z){\frac {dz}{z-z_{0}}}}$

As ${\displaystyle f(z)}$ is analytic it is also continuous. This means for any ${\displaystyle \epsilon >0}$ there exists a ${\displaystyle \delta >0}$ such that ${\displaystyle |z-z_{0}|<\delta \implies |f(z)-f(z_{0})|<\epsilon }$. We choose ${\displaystyle \rho \leq \delta }$ so that ${\displaystyle |f(z)-f(z_{0})|<\epsilon }$ everywhere on ${\displaystyle \gamma }$.

${\displaystyle \int _{\gamma }f(z){\frac {dz}{z-z_{0}}}=f(z_{0})\int _{\gamma }{\frac {dz}{z-z_{0}}}+\int _{\gamma }{\frac {f(z)-f(z_{0})}{z-z_{0}}}dz}$
${\displaystyle f(z_{0})\int _{\gamma }{\frac {dz}{z-z_{0}}}=2\pi if(z_{0})}$
${\displaystyle \left|\int _{\gamma }{\frac {f(z)-f(z_{0})}{z-z_{0}}}dz\right|\leq {\frac {\epsilon }{\rho }}2\pi \rho =2\pi \epsilon }$

Finally,

${\displaystyle {\frac {1}{2\pi i}}\int _{C}f(z){\frac {dz}{z-z_{0}}}={\frac {1}{2\pi i}}\int _{\gamma }f(z){\frac {dz}{z-z_{0}}}=f(z_{0})+{\frac {1}{2\pi i}}\int _{\gamma }{\frac {f(z)-f(z_{0})}{z-z_{0}}}dz}$, where the last term is bounded in modulus by ${\displaystyle \epsilon }$. Since ${\displaystyle \epsilon >0}$ was arbitrary, ${\displaystyle {\frac {1}{2\pi i}}\int _{C}f(z){\frac {dz}{z-z_{0}}}=f(z_{0})}$.
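The formula can also be verified numerically. Parametrizing the contour as ${\displaystyle z=re^{i\theta }}$ gives ${\displaystyle dz=iz\,d\theta }$, so the integral becomes an average of ${\displaystyle {\frac {f(z)z}{z-z_{0}}}}$ over the circle. A sketch (the helper name and the choice ${\displaystyle f(z)=e^{z}}$ are ours, for illustration):

```python
import cmath

def cauchy_value(f, z0, r=1.0, m=2000):
    # (1/2πi) ∮ f(z)/(z - z0) dz; with z = r e^{iθ}, dz = iz dθ,
    # the integral is the average of f(z)·z/(z - z0) over the circle |z| = r
    total = 0j
    for k in range(m):
        z = r * cmath.exp(2j * cmath.pi * k / m)
        total += f(z) * z / (z - z0)
    return total / m

print(cauchy_value(cmath.exp, 0.3))  # ≈ e^0.3 ≈ 1.3499
```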

### Taylor series

If ${\displaystyle f(z)}$ is analytic inside and on a contour ${\displaystyle C}$, the Taylor series expansion of ${\displaystyle f(z)}$ around the point ${\displaystyle z_{0}}$ inside ${\displaystyle C}$ is:

${\displaystyle f(z)=f(z_{0})+f'(z_{0})(z-z_{0})+{\frac {f''(z_{0})(z-z_{0})^{2}}{2!}}+\cdots }$

### Cauchy's coefficient formula

Cauchy's coefficient formula states that if ${\displaystyle f(z)=\sum a_{n}z^{n}}$ is analytic on and inside a contour ${\displaystyle C}$ encircling the origin, then:

${\displaystyle a_{n}={\frac {1}{2\pi i}}\int _{C}f(z){\frac {dz}{z^{n+1}}}}$

Proof: Cauchy's integral formula states:

${\displaystyle f(z_{0})={\frac {1}{2\pi i}}\int _{C}f(z){\frac {dz}{z-z_{0}}}}$

Differentiating both sides ${\displaystyle n}$ times with respect to ${\displaystyle z_{0}}$ and dividing by ${\displaystyle n!}$ gives:

${\displaystyle {\frac {f^{(n)}(z_{0})}{n!}}={\frac {1}{2\pi i}}\int _{C}f(z){\frac {dz}{(z-z_{0})^{n+1}}}}$

The Taylor series expansion of ${\displaystyle f(z)}$ around ${\displaystyle 0}$ is:

${\displaystyle f(z)=f(0)+f'(0)z+{\frac {f''(0)z^{2}}{2!}}+\cdots }$

Therefore, setting ${\displaystyle z_{0}=0}$:

${\displaystyle a_{n}={\frac {f^{(n)}(0)}{n!}}={\frac {1}{2\pi i}}\int _{C}f(z){\frac {dz}{z^{n+1}}}}$
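Cauchy's coefficient formula lends itself to direct numerical verification: with ${\displaystyle z=re^{i\theta }}$ and ${\displaystyle dz=iz\,d\theta }$, the integral becomes the average of ${\displaystyle f(z)z^{-n}}$ over the circle. A sketch (the helper name is our own):

```python
import cmath

def coefficient(f, n, r=0.5, m=4096):
    # a_n = (1/2πi) ∮ f(z) z^{-(n+1)} dz; with z = r e^{iθ}, dz = iz dθ,
    # this is the average of f(z)·z^{-n} over the circle |z| = r
    total = 0j
    for k in range(m):
        z = r * cmath.exp(2j * cmath.pi * k / m)
        total += f(z) / z**n
    return total / m

# 1/(1-z) = 1 + z + z^2 + ..., so every coefficient is 1
print(coefficient(lambda z: 1 / (1 - z), 5))  # ≈ 1 (plus a tiny imaginary part)
```

The radius ${\displaystyle r}$ must keep the circle inside the region of analyticity; here ${\displaystyle r=0.5}$ stays inside the singularity at ${\displaystyle z=1}$.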

### Theorem

Theorem due to Titchmarsh[18].

If ${\displaystyle R}$ is the radius of convergence of ${\displaystyle f(z)}$, then for all ${\displaystyle n\geq 0}$ and ${\displaystyle 0<r<R}$:

${\displaystyle |a_{n}|\leq {\frac {\max _{|z|=r}|f(z)|}{r^{n}}}}$

### Proof

Proof due to Titchmarsh[19].

By Cauchy's coefficient formula:

${\displaystyle a_{n}={\frac {1}{2\pi i}}\int _{|z|=r}f(z){\frac {dz}{z^{n+1}}}}$

We have[20]:

${\displaystyle |a_{n}|=\left|{\frac {1}{2\pi i}}\int _{|z|=r}f(z){\frac {dz}{z^{n+1}}}\right|\leq {\frac {1}{2\pi }}\int _{|z|=r}{\frac {|f(z)|}{r^{n+1}}}|dz|}$

and

${\displaystyle \int _{|z|=r}|f(z)||dz|\leq \max _{|z|=r}|f(z)|\cdot 2\pi r}$

Therefore:

${\displaystyle |a_{n}|\leq {\frac {1}{2\pi }}\int _{|z|=r}{\frac {|f(z)|}{r^{n+1}}}|dz|\leq {\frac {1}{2\pi }}\max _{|z|=r}|f(z)|{\frac {2\pi r}{r^{n+1}}}={\frac {\max _{|z|=r}|f(z)|}{r^{n}}}}$

Pictorially, we are estimating the contour integral by taking ${\displaystyle |f(z)|}$ at its maximum around the entire contour.
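The bound can be sanity-checked numerically for ${\displaystyle {\frac {1}{1-z}}}$, whose coefficients are all ${\displaystyle 1}$: on ${\displaystyle |z|=r}$ the maximum of ${\displaystyle |f(z)|}$ is ${\displaystyle {\frac {1}{1-r}}}$, attained at ${\displaystyle z=r}$ (an illustrative sketch, not from the text):

```python
import cmath

f = lambda z: 1 / (1 - z)
r = 0.5
# Approximate max_{|z|=r} |f(z)| by sampling the circle
M = max(abs(f(r * cmath.exp(2j * cmath.pi * k / 1000))) for k in range(1000))
print(M)  # 2.0, i.e. 1/(1-r), attained at z = r
for n in range(10):
    assert 1 <= M / r**n  # |a_n| = 1 satisfies Cauchy's inequality
```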

## Notes

1. Lang 1999, pp. 53-54.
2. Lang 1999, p. 54.
3. Stroud 2001, p. 765.
4. Stroud 2003, p. 916. Wilf 2006, pp. 50-51.
5. Lang 1999, p. 55.
6. Wilf 2006, p. 52.
7. Wilf 2006, pp. 48-52.
8. Lang 1999, pp. 55-56.
9. Stroud 2003, p. 914.
10. See Wilf 2006, p. 49.
11. Lang 1999, p. 55.
12. Lang 1999, p. 56.
13. This does not prove the cases ${\displaystyle t=0}$ and ${\displaystyle t=\infty }$.
14. Stroud 2003, p. 863.
15. Lang 1999, p. 69.
16. Lang 1999, pp. 116-117.
17. Titchmarsh 1939, pp. 80-81.
18. Titchmarsh 1939, p. 84.
19. Titchmarsh 1939, p. 84.
20. Titchmarsh 1939, p. 74.

## References

• Lang, Serge (1999). Complex Analysis (4th ed.). Springer Science+Business Media, LLC.
• Stroud, K. A. (2003). Advanced Engineering Mathematics (4th ed.). Palgrave Macmillan.
• Stroud, K. A. (2001). Engineering Mathematics (5th ed.). Palgrave Macmillan.
• Titchmarsh, E. C. (1939). The Theory of Functions (2nd ed.). Oxford University Press.
• Wilf, Herbert S. (2006). Generatingfunctionology (PDF) (3rd ed.). A K Peters, Ltd.