Engineering Analysis/Matrix Exponentials

Matrix Exponentials

If we have a square matrix A, we can raise e to the power of that matrix, written as:

e^{A}

It is important to note that this is not generally equal to the matrix obtained by raising e to the power of each individual element of A. Using the Taylor series expansion of the exponential function, we can show that:

e^{A} = I + A + \frac{1}{2}A^2 + \frac{1}{6}A^3 + \cdots = \sum_{k=0}^\infty \frac{1}{k!}A^k.

In other words, the matrix exponential can be reduced to a sum of powers of the matrix. This follows from the Taylor series expansion of the exponential function; the Cayley-Hamilton Theorem discussed previously will let us reduce the sum further below.
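As a minimal sketch (the matrix A and the truncation order are arbitrary choices for illustration, not taken from the text), the following Python snippet evaluates a partial sum of this series and compares it against SciPy's built-in expm:

import numpy as np
from scipy.linalg import expm

A = np.array([[0.0, 1.0],
              [-2.0, -3.0]])             # illustrative matrix, assumed for this example

def series_expm(A, N=20):
    # Partial sum of sum_{k=0}^{N} A^k / k!
    result = np.zeros_like(A)
    term = np.eye(A.shape[0])            # k = 0 term: A^0 / 0! = I
    for k in range(N + 1):
        result = result + term
        term = term @ A / (k + 1)        # next term: A^{k+1} / (k+1)!
    return result

print(series_expm(A, N=20))
print(expm(A))                           # reference value from SciPy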

However, this infinite sum is expensive to compute, and because the series is infinite, there is no obvious cut-off point where we can stop computing terms and call the answer a "good approximation". To alleviate this problem, we can turn to the Cayley-Hamilton Theorem. Solving the theorem for A^n (where the c_i below are the coefficients of the characteristic polynomial of A), we get:

A^n = -c_{n-1}A^{n-1} - c_{n-2}A^{n-2} - \cdots - c_1A - c_0I

Multiplying both sides of the equation by A, we get:

A^{n+1} = -c_{n-1}A^n - c_{n-2}A^{n-1} - \cdots - c_1A^2 - c_0A

We can substitute the first equation into the second equation, and the result expresses A^{n+1} in terms of the powers I, A, ..., A^{n-1}. In fact, we can repeat this process so that A^m, for an arbitrarily high power m, can be expressed as a linear combination of I, A, ..., A^{n-1}. Applying this result to our exponential problem:

e^A = \alpha_0I + \alpha_1A + \cdots + \alpha_{n-1}A^{n-1}

where we can solve for the α coefficients, leaving a finite polynomial that expresses the exponential.
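As a sketch of one common way to find the α coefficients (an assumed approach, not spelled out in the text): when A has distinct eigenvalues λ_i, the scalar relation e^{λ_i} = α_0 + α_1 λ_i + ... + α_{n-1} λ_i^{n-1} holds for each eigenvalue, giving a Vandermonde system that can be solved numerically. The matrix below is an illustrative example with distinct eigenvalues -1 and -2:

import numpy as np
from scipy.linalg import expm

A = np.array([[0.0, 1.0],
              [-2.0, -3.0]])             # assumed example matrix
n = A.shape[0]

lam = np.linalg.eigvals(A)
V = np.vander(lam, n, increasing=True)   # rows: [1, lambda_i, lambda_i^2, ...]
alpha = np.linalg.solve(V, np.exp(lam))  # one equation e^{lambda_i} = p(lambda_i) per eigenvalue

# Rebuild e^A as the finite polynomial alpha_0 I + alpha_1 A + ... + alpha_{n-1} A^{n-1}
eA = sum(alpha[k] * np.linalg.matrix_power(A, k) for k in range(n))

print(eA)
print(expm(A))                           # agrees with the direct computation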

Inverse

The inverse of a matrix exponential is given by:

(e^{A})^{-1} = e^{-A}
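A quick numerical check (the matrix is an assumed illustrative value): multiplying e^A by e^{-A} should return the identity matrix, up to floating-point error.

import numpy as np
from scipy.linalg import expm

A = np.array([[0.0, 1.0],
              [-2.0, -3.0]])

print(expm(A) @ expm(-A))   # approximately the identity matrix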

Derivative

The derivative of a matrix exponential is:

\frac{d}{dx}e^{Ax} = Ae^{Ax} = e^{Ax}A

Notice that the matrix exponential e^{Ax} commutes with the matrix A. This is not necessarily the case for other matrix functions.
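A minimal sketch of this property (the matrix, evaluation point, and step size are assumptions): a central finite difference approximates the derivative, which matches both A e^{Ax} and e^{Ax} A.

import numpy as np
from scipy.linalg import expm

A = np.array([[0.0, 1.0],
              [-2.0, -3.0]])
x, h = 0.7, 1e-6

numeric = (expm(A * (x + h)) - expm(A * (x - h))) / (2 * h)
print(numeric)              # finite-difference estimate of d/dx e^{Ax}
print(A @ expm(A * x))      # A e^{Ax}
print(expm(A * x) @ A)      # e^{Ax} A -- the same matrix, showing the commutation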

Sum of Matrices

If we have a sum of matrices in the exponent, we cannot in general separate them:

e^{(A+B)x} \ne e^{Ax}e^{Bx}
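The equality does hold in the special case where A and B commute (AB = BA). The matrices below are assumed for illustration; they do not commute, and the two sides differ.

import numpy as np
from scipy.linalg import expm

A = np.array([[0.0, 1.0],
              [0.0, 0.0]])
B = np.array([[0.0, 0.0],
              [1.0, 0.0]])

print(expm(A + B))
print(expm(A) @ expm(B))    # not equal, since AB != BA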

Differential Equations

If we have a first-order linear differential equation of the following form:

x'(t) = Ax(t) + f(t)

With the initial condition

x(t_0) = c

Then the solution to that equation is given in terms of the matrix exponential:

x(t) = e^{A(t - t_0)}c + \int_{t_0}^t e^{A(t - \tau)}f(\tau)d\tau

This equation shows up frequently in control engineering.
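A minimal sketch of this formula in use (the system matrix, forcing term, and initial condition are all assumptions chosen for illustration): the matrix-exponential solution is evaluated directly, with the convolution integral computed numerically, and compared against a general-purpose ODE solver.

import numpy as np
from scipy.linalg import expm
from scipy.integrate import solve_ivp, quad_vec

A = np.array([[0.0, 1.0],
              [-2.0, -3.0]])
c = np.array([1.0, 0.0])                    # x(t0) = c, with t0 = 0
f = lambda t: np.array([0.0, np.sin(t)])    # assumed forcing term f(t)

def x_exact(t):
    homogeneous = expm(A * t) @ c
    particular, _ = quad_vec(lambda tau: expm(A * (t - tau)) @ f(tau), 0.0, t)
    return homogeneous + particular

t_end = 2.0
sol = solve_ivp(lambda t, x: A @ x + f(t), (0.0, t_end), c, rtol=1e-9, atol=1e-9)

print(x_exact(t_end))
print(sol.y[:, -1])                         # the two agree to solver tolerance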

Laplace Transform

As a matter of some interest, we will show the Laplace Transform of a matrix exponential function:

\mathcal{L}[e^{At}] = (sI - A)^{-1}

We will not use this result any further in this book, although other books on engineering might make use of it.
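As an illustrative numerical check (the matrix and the value of s are assumptions): the Laplace integral of e^{At}, truncated at a large upper limit, should match (sI - A)^{-1} entrywise, provided Re(s) is large enough for the integral to converge.

import numpy as np
from scipy.linalg import expm
from scipy.integrate import quad_vec

A = np.array([[0.0, 1.0],
              [-2.0, -3.0]])
s = 2.0
I = np.eye(2)

# Truncated Laplace integral: integral_0^50 e^{-st} e^{At} dt
integral, _ = quad_vec(lambda t: np.exp(-s * t) * expm(A * t), 0.0, 50.0)

print(integral)
print(np.linalg.inv(s * I - A))    # (sI - A)^{-1}, matching the integral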