# Partial Differential Equations/The Malgrange-Ehrenpreis theorem


## Vandermonde's matrix

Definition 10.1:

Let $n\in \mathbb {N}$ and let $x_{1},\ldots ,x_{n}\in \mathbb {R}$ . Then the Vandermonde matrix associated to $x_{1},\ldots ,x_{n}$ is defined to be the matrix

${\begin{pmatrix}x_{1}&\cdots &x_{n}\\x_{1}^{2}&\cdots &x_{n}^{2}\\\vdots &\ddots &\vdots \\x_{1}^{n}&\cdots &x_{n}^{n}\end{pmatrix}}$ .
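For concreteness, this matrix is easy to generate programmatically. The following sketch (assuming NumPy is available; the helper name `vandermonde` is ours) builds it for given points. Note that the powers here run from $1$ to $n$, so this differs from the more common convention (and from `numpy.vander`), where the powers start at $0$:

```python
import numpy as np

def vandermonde(xs):
    """Vandermonde matrix as in definition 10.1: entry (j, m) is xs[m]**(j+1),
    i.e. row j carries the powers x**(j+1) for j = 0, ..., n-1
    (so the exponents are 1, ..., n, not 0, ..., n-1)."""
    xs = np.asarray(xs, dtype=float)
    n = len(xs)
    powers = np.arange(1, n + 1).reshape(-1, 1)  # column vector (1, ..., n)
    return xs.reshape(1, -1) ** powers           # broadcast: row j-1 holds xs**j

A = vandermonde([1.0, 2.0, 3.0])
# A = [[1, 2, 3], [1, 4, 9], [1, 8, 27]]
```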

For $x_{1},\ldots ,x_{n}$ pairwise different (i.e. $x_{k}\neq x_{m}$ for $k\neq m$) and nonzero, this matrix is invertible, as the following theorem shows:

Theorem 10.2:

Let $\mathbf {A}$ be the Vandermonde matrix associated to the pairwise different, nonzero points $x_{1},\ldots ,x_{n}$ . Then the matrix $\mathbf {B}$ whose $k,m$ -th entry is given by

$\mathbf {b} _{k,m}:={\begin{cases}{\dfrac {(-1)^{n-m}\sum _{1\leq l_{1}<\cdots <l_{n-m}\leq n \atop l_{1},\ldots ,l_{n-m}\neq k}x_{l_{1}}\cdots x_{l_{n-m}}}{x_{k}\prod _{1\leq l\leq n \atop l\neq k}(x_{k}-x_{l})}}&m<n\\{\dfrac {1}{x_{k}\prod _{1\leq l\leq n \atop l\neq k}(x_{k}-x_{l})}}&m=n\end{cases}}$

is the inverse matrix of $\mathbf {A}$ .
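Before turning to the proof, the formula admits a quick numerical sanity check. The following sketch (assuming NumPy; the helper name `inverse_vandermonde` is ours) builds $\mathbf {B}$ entry by entry, with the numerator for $m<n$ computed as the signed elementary symmetric sum over the points with $x_{k}$ left out, and verifies $\mathbf {B} \mathbf {A} =\mathbf {I} _{n}$ for sample points:

```python
import numpy as np
from itertools import combinations
from math import prod

def inverse_vandermonde(xs):
    """Entries b_{k,m} as in theorem 10.2 (0-based indices here; the
    theorem's k, m run from 1 to n). For m < n the numerator is the
    signed elementary symmetric sum over the points other than x_k;
    for m = n it is 1. The points must be pairwise different and nonzero."""
    n = len(xs)
    B = np.empty((n, n))
    for k in range(n):
        others = [xs[l] for l in range(n) if l != k]
        denom = xs[k] * prod(xs[k] - xl for xl in others)
        for m in range(1, n + 1):
            if m < n:
                esum = sum(prod(c) for c in combinations(others, n - m))
                B[k, m - 1] = (-1) ** (n - m) * esum / denom
            else:
                B[k, m - 1] = 1.0 / denom
    return B

xs = [1.0, 2.0, 4.0]
A = np.array([[x ** j for x in xs] for j in range(1, len(xs) + 1)])
B = inverse_vandermonde(xs)
# B @ A is, up to floating-point error, the 3x3 identity matrix
```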

Proof:

We prove that $\mathbf {B} \mathbf {A} =\mathbf {I} _{n}$ , where $\mathbf {I} _{n}$ is the $n\times n$ identity matrix.

Let $1\leq k,m\leq n$ . We first note that, by direct multiplication,

$x_{m}\prod _{1\leq l\leq n \atop l\neq k}(x_{m}-x_{l})=\sum _{j=1}^{n}x_{m}^{j}{\begin{cases}(-1)^{n-j}\sum _{1\leq l_{1}<\cdots <l_{n-j}\leq n \atop l_{1},\ldots ,l_{n-j}\neq k}x_{l_{1}}\cdots x_{l_{n-j}}&j<n\\1&j=n\end{cases}}$ .

Therefore, if $\mathbf {c} _{k,m}$ is the $k,m$ -th entry of the matrix $\mathbf {B} \mathbf {A}$ , then by the definition of matrix multiplication

$\mathbf {c} _{k,m}=\sum _{j=1}^{n}{\frac {x_{m}^{j}{\begin{cases}(-1)^{n-j}\sum _{1\leq l_{1}<\cdots <l_{n-j}\leq n \atop l_{1},\ldots ,l_{n-j}\neq k}x_{l_{1}}\cdots x_{l_{n-j}}&j<n\\1&j=n\end{cases}}}{x_{k}\prod _{1\leq l\leq n \atop l\neq k}(x_{k}-x_{l})}}={\frac {x_{m}\prod _{1\leq l\leq n \atop l\neq k}(x_{m}-x_{l})}{x_{k}\prod _{1\leq l\leq n \atop l\neq k}(x_{k}-x_{l})}}={\begin{cases}1&k=m\\0&k\neq m\end{cases}}$ ,

since for $m\neq k$ the product in the numerator contains the factor $x_{m}-x_{m}=0$ , while for $m=k$ numerator and denominator coincide.$\Box$

## The Malgrange-Ehrenpreis theorem

Lemma 10.3:

Let $x_{1},\ldots ,x_{n}\in \mathbb {R}$ be pairwise different and nonzero. The solution to the equation

${\begin{pmatrix}x_{1}&\cdots &x_{n}\\x_{1}^{2}&\cdots &x_{n}^{2}\\\vdots &\ddots &\vdots \\x_{1}^{n}&\cdots &x_{n}^{n}\end{pmatrix}}{\begin{pmatrix}y_{1}\\\vdots \\y_{n}\end{pmatrix}}={\begin{pmatrix}0\\\vdots \\0\\1\end{pmatrix}}$ is given by

$y_{k}={\frac {1}{x_{k}\prod _{1\leq l\leq n \atop l\neq k}(x_{k}-x_{l})}}$ , $k\in \{1,\ldots ,n\}$ .

Proof:

We multiply both sides of the equation from the left by the matrix $\mathbf {B}$ of theorem 10.2. Since $\mathbf {B}$ is the inverse of

${\begin{pmatrix}x_{1}&\cdots &x_{n}\\x_{1}^{2}&\cdots &x_{n}^{2}\\\vdots &\ddots &\vdots \\x_{1}^{n}&\cdots &x_{n}^{n}\end{pmatrix}}$ ,

we end up with the equation

${\begin{pmatrix}y_{1}\\\vdots \\y_{n}\end{pmatrix}}=\mathbf {B} {\begin{pmatrix}0\\\vdots \\0\\1\end{pmatrix}}$ .

Since the $k$ -th entry of $\mathbf {B} (0,\ldots ,0,1)^{T}$ is just the last-column entry $\mathbf {b} _{k,n}$ from theorem 10.2, calculating the last expression directly leads to the desired formula.$\Box$
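Lemma 10.3 also lends itself to a direct numerical check. The following sketch (again assuming NumPy; the sample points are arbitrarily chosen pairwise different, nonzero values) verifies that the stated $y_{k}$ solve the system:

```python
import numpy as np
from math import prod

# Numerical check of lemma 10.3: with pairwise different, nonzero points,
# y_k = 1 / (x_k * prod_{l != k} (x_k - x_l)) should solve A y = (0, ..., 0, 1)^T.
xs = [0.5, 2.0, -1.0, 3.0]
n = len(xs)
A = np.array([[x ** j for x in xs] for j in range(1, n + 1)])
y = np.array([
    1.0 / (xs[k] * prod(xs[k] - xs[l] for l in range(n) if l != k))
    for k in range(n)
])
e_n = np.zeros(n)
e_n[-1] = 1.0
# A @ y agrees with e_n up to floating-point error
```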