# Control Systems/Noise Driven Systems

## Noise-Driven Systems

Systems frequently have to deal with not only the control input u, but also a random noise input v. In some disciplines, such as the study of electrical communication systems, the noise and the data signal can be added together into a composite input r = u + v. However, in studying control systems, we cannot combine these inputs together, for a variety of reasons:

1. The control input works to stabilize the system, and the noise input works to destabilize the system.
2. The two inputs are independent random variables.
3. The two inputs may act on the system in completely different ways.

As we will show in the next example, it is frequently a good idea to consider the noise and the control inputs separately:

Example: Consider a moving automobile. The control signals for the automobile consist of acceleration (gas pedal) and deceleration (brake pedal) inputs acting on the wheels of the vehicle, and working to create forward motion. The noise inputs to the system can consist of wind pushing against the vertical faces of the automobile, rough pavement (or even dirt) under the tires, bugs and debris hitting the front windshield, etc. As we can see, the control inputs act on the wheels of the vehicle, while the noise inputs can act on multiple sides of the vehicle, in different ways.

## Probability Refresher

We are going to have a brief refresher here for calculus-based probability, specifically focusing on the topics that we will use in the rest of this chapter.

### Expectation

The expectation operator, E, is used to find the expected, or mean, value of a given random variable. The expectation operator is defined as:

$E[x] = \int_{-\infty}^\infty x f_x(x)dx$

If we have two variables that are independent of one another, the expectation of their product is the product of their expectations. In particular, if either variable is zero-mean, the expectation of their product is zero.
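These two facts can be checked numerically. The sketch below (a hypothetical example, not from the text) draws independent zero-mean samples and estimates E[x] and E[xy] by averaging:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000
x = rng.normal(0.0, 1.0, n)    # zero-mean Gaussian samples
y = rng.uniform(-1.0, 1.0, n)  # independent, zero-mean uniform samples

mean_x = x.mean()         # sample estimate of E[x], should be near 0
mean_xy = (x * y).mean()  # sample estimate of E[xy] = E[x]E[y], also near 0

print(mean_x, mean_xy)
```

Both printed values should be close to zero, with the residual error shrinking as the sample count grows.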

### Covariance

The covariance matrix, Q, is the expectation of a random vector times its transpose:

$E[x(t)x'(t)] = Q(t)$

If we evaluate the transpose at a different point in time s, then for a white (delta-correlated) process the covariance becomes:

$E[x(t)x'(s)] = Q(t)\delta(t-s)$

Where δ is the impulse function.
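As a quick numerical illustration of the covariance definition above (a hypothetical example: the matrix Q_true and the Cholesky construction are chosen for the demo), we can generate zero-mean vector samples with a known covariance and recover it by averaging the outer products:

```python
import numpy as np

rng = np.random.default_rng(1)
Q_true = np.array([[2.0, 0.5],
                   [0.5, 1.0]])          # target covariance matrix
L = np.linalg.cholesky(Q_true)           # Q_true = L L'

n = 100_000
v = L @ rng.normal(size=(2, n))          # zero-mean samples with covariance Q_true
Q_est = (v @ v.T) / n                    # sample estimate of E[v v']

print(Q_est)
```

The estimate converges to Q_true as the number of samples grows, matching the definition E[x x'] = Q.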

## Noise-Driven System Description

We can define the state equation to a system incorporating a noise vector v:

$x'(t) = A(t)x(t) + H(t)u(t) + B(t)v(t)$

For generality, we will discuss the case of a time-variant system. Time-invariant system results will then be a simplification of the time-variant case. Also, we will assume that v is a Gaussian random process. We do this because physical systems frequently approximate Gaussian processes, and because there is a large body of mathematical tools that we can use to work with these processes. We will assume our Gaussian process has zero mean.

## Mean System Response

We would like to find out how our system will respond to the new noisy input. Every system iteration will have a different response that varies with the noise input, but the average of all these iterations should converge to a single value.

For the system with zero control input, we have:

$x'(t) = A(t)x(t) + B(t)v(t)$

For which we know our general solution is given as:

$x(t) = \phi(t, t_0)x_0 + \int_{t_0}^t \phi(t, \tau)B(\tau)v(\tau)d\tau$

If we take the expected value of this function, it should give us the expected value of the output of the system. In other words, we would like to determine the expected output of our system when the noise input is included.

$E[x(t)] = E[\phi(t, t_0)x_0] + E[\int_{t_0}^t \phi(t, \tau)B(\tau)v(\tau)d\tau]$

In the second term of this equation, neither φ nor B is a random variable, and therefore they can come outside of the expectation operation. Since v is zero-mean, its expectation is zero, and therefore the second term vanishes. In the first term, φ is not a random variable, but the initial condition x0 is, so we need to take its expectation. This means that:

$E[x(t)] = \phi(t, t_0)E[x_0]$

In other words, the expected output of the system is, on average, the value that the output would be if there were no noise. Notice that if our noise vector v were not zero-mean, this result would not hold; by linearity of expectation, the Gaussian assumption is not needed for this mean result, although it will be used in later analysis.
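A minimal Monte-Carlo check of this result, using a hypothetical scalar example: for x'(t) = −x(t) + v(t) with x(0) = 1 and zero-mean white noise v, the average of many noisy runs should approach the noiseless solution e^(−t). The simulation below uses a simple Euler discretization with Wiener increments of variance dt:

```python
import numpy as np

rng = np.random.default_rng(2)
dt, steps, runs = 0.001, 1000, 5000  # simulate t in [0, 1] for 5000 runs
x = np.ones(runs)                    # every run starts at x0 = 1

for _ in range(steps):
    dw = np.sqrt(dt) * rng.normal(size=runs)  # Wiener increments, variance dt
    x = x + (-x) * dt + dw                    # Euler step of x' = -x + v

print(x.mean())  # close to the noiseless value exp(-1)
```

Each individual run wanders substantially, but the ensemble average stays near the deterministic trajectory, just as E[x(t)] = φ(t, t0)E[x0] predicts.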

## System Covariance

We are now going to analyze the covariance of the system with a noisy input. We multiply our system solution by its transpose, and take the expectation:

$E[x(t)x'(t)] = E\left[\left(\phi(t, t_0)x_0 + \int_{t_0}^t\phi(t, \tau)B(\tau)v(\tau)d\tau\right)\left(\phi(t, t_0)x_0 + \int_{t_0}^t\phi(t, \tau)B(\tau)v(\tau)d\tau\right)'\right]$

If we multiply this out term by term, the cross terms vanish because v is zero-mean and independent of x0. Using the white-noise property E[v(τ)v'(σ)] = Q(τ)δ(τ − σ), we get the following result:

$E[x(t)x'(t)] = \phi(t, t_0)E[x_0x_0']\phi'(t, t_0) + \int_{t_0}^t \phi(t, \tau)B(\tau)Q(\tau)B'(\tau)\phi'(t, \tau)d\tau = P(t)$

We call this result P(t). Using the fact that the state-transition matrix satisfies $\frac{\partial}{\partial t}\phi(t, \tau) = A(t)\phi(t, \tau)$, we can differentiate P(t) with the product rule and the Leibniz integral rule:

$P'(t) = A(t)\phi(t, t_0)P_0\phi'(t, t_0) + \phi(t, t_0)P_0\phi'(t, t_0)A'(t) + B(t)Q(t)B'(t) + \int_{t_0}^t \left[A(t)\phi(t, \tau)B(\tau)Q(\tau)B'(\tau)\phi'(t, \tau) + \phi(t, \tau)B(\tau)Q(\tau)B'(\tau)\phi'(t, \tau)A'(t)\right]d\tau$

Where

$P_0 = E[x_0x_0']$

Collecting the terms multiplied by A(t) on the left and by A'(t) on the right, and recognizing P(t) in each group, we can reduce this to:

$P'(t) = A(t)P(t) + P(t)A'(t) + B(t)Q(t)B'(t)$

In other words, we can analyze the system without needing to calculate the state-transition matrix. This is a good thing, because it can often be very difficult to calculate the state-transition matrix.
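The covariance equation can be integrated directly without the state-transition matrix. The sketch below uses a hypothetical scalar example, A = −1, B = 1, Q = 1, and P0 = 0 (a known initial state), for which P'(t) = −2P + 1 has the closed-form solution P(t) = (1 − e^(−2t)) / 2:

```python
import numpy as np

A, B, Q = -1.0, 1.0, 1.0
dt, steps = 0.0001, 10_000  # integrate t from 0 to 1
P = 0.0                     # P0 = E[x0 x0'] = 0

for _ in range(steps):
    # Euler step of P' = A P + P A' + B Q B' (all scalars here)
    P = P + (A * P + P * A + B * Q * B) * dt

print(P)  # close to the closed-form value (1 - exp(-2)) / 2
```

Note that no state-transition matrix was computed at any point; this is exactly the advantage the text describes.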

## Alternate Analysis

Let us look again at our general solution:

$x(t) = \phi(t, t_0)x(t_0) + \int_{t_0}^t \phi(t, \tau)B(\tau)v(\tau)d\tau$

We can run into a problem: white Gaussian noise has infinite variance, and its sample values can momentarily become arbitrarily large. This means the value of v, and therefore the value of x, may not be well defined at certain points. This is unacceptable, and makes further analysis of this problem difficult. Let us look again at our original equation, with zero control input:

$x'(t) = A(t)x(t)+B(t)v(t)$

We can multiply both sides by dt, and get the following result:

$dx = A(t)x(t)dt + B(t)v(t)dt$

We can define a new differential, dw(t), which is an infinitesimal function of time, as:

$dw(t) = v(t)dt$

This new term, dw, is the increment of a random process known as a Wiener process, which is the result of transforming a Gaussian white-noise process in this manner.

Now, we can integrate both sides of this equation:

$x(t) = x(t_0) + \int_{t_0}^t A(\tau)x(\tau)d\tau + \int_{t_0}^tB(\tau)dw(\tau)$

However, this leads us to an unusual place, and one for which we are (probably) not prepared to continue further: in the third term on the right-hand side, we are attempting to integrate with respect to a function, not a variable. In this instance, the standard Riemann integrals that we are all familiar with cannot solve this equation. There are advanced techniques, known as Itō calculus, that can solve this equation, but these methods are currently outside the scope of this book.
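Although the full Itō machinery is out of scope, the Wiener increments themselves are easy to work with numerically. In the standard discrete approximation (a hypothetical sketch, with unit noise intensity), each increment dw over a step dt is an independent zero-mean Gaussian with variance dt, so the accumulated process w(t) has variance t:

```python
import numpy as np

rng = np.random.default_rng(3)
dt, steps, paths = 0.01, 100, 50_000          # t runs from 0 to 1
dw = np.sqrt(dt) * rng.normal(size=(paths, steps))  # increments, variance dt each
w = dw.sum(axis=1)                            # w(1) for each sample path

print(w.mean(), w.var())  # mean near 0, variance near t = 1
```

This variance-proportional-to-time behavior is the defining property that makes the Wiener process the right object to integrate against, even though v itself is not a well-behaved function.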