# Introduction to Numerical Methods/Numerical Differentiation

Objectives:

- explain the definitions of forward, backward, and center divided methods for numerical differentiation
- find approximate values of the first derivative of continuous functions
- reason about the accuracy of the approximations
- find approximate values of the first derivative of discrete functions (given at discrete data points)


## Forward Divided Difference Method


The forward divided difference approximates the first derivative from the function value at x and at one step forward: f'(x) ≈ (f(x + h) − f(x)) / h. Its truncation error is proportional to h. The following code implements this method:

```
from math import exp

def forward_diff(f, x, h=0.0001):
    # Forward divided difference: step one h ahead of x
    df = (f(x + h) - f(x)) / h
    return df

x = 0.5
df = forward_diff(exp, x)
print('first derivative =', df)
# d/dx exp(x) = exp(x), so the exact value is exp(0.5)
print('exp(', x, ') =', exp(x))
```

## Backward Divided Difference Method


The backward divided difference uses the point one step behind x instead: f'(x) ≈ (f(x) − f(x − h)) / h, also with truncation error proportional to h. The following code implements this method:

```
from math import exp

def backward_diff(f, x, h=0.0001):
    # Backward divided difference: step one h behind x
    df = (f(x) - f(x - h)) / h
    return df

x = 0.5
df = backward_diff(exp, x)
print('first derivative =', df)
print('exp(', x, ') =', exp(x))
```

## Center Divided Difference Method


The center divided difference uses points on both sides of x: f'(x) ≈ (f(x + h) − f(x − h)) / (2h). Its truncation error is proportional to h², so it is generally more accurate than the one-sided methods for the same step size. The following code implements this method:

```
from math import exp

def center_diff(f, x, h=0.0002):
    # Center divided difference: symmetric steps on both sides of x
    df = (f(x + h) - f(x - h)) / (2.0 * h)
    return df

x = 0.5
df = center_diff(exp, x)
print('first derivative =', df)
print('exp(', x, ') =', exp(x))
```
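Because the derivative of exp is exp itself, the error of each method is easy to measure exactly. The comparison script below (not part of the original text, reusing the three functions defined above) prints the absolute error of each approximation at x = 0.5; the center method's error is several orders of magnitude smaller, reflecting its O(h²) truncation error:

```python
from math import exp

def forward_diff(f, x, h=0.0001):
    return (f(x + h) - f(x)) / h

def backward_diff(f, x, h=0.0001):
    return (f(x) - f(x - h)) / h

def center_diff(f, x, h=0.0001):
    return (f(x + h) - f(x - h)) / (2.0 * h)

x = 0.5
exact = exp(x)  # d/dx exp(x) = exp(x)
for name, method in [('forward', forward_diff),
                     ('backward', backward_diff),
                     ('center', center_diff)]:
    err = abs(method(exp, x) - exact)
    print(name, 'error =', err)
```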

## Second Derivative

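The second derivative can be approximated with a central divided difference over three points, f''(x) ≈ (f(x + h) − 2f(x) + f(x − h)) / h², which has truncation error proportional to h². A minimal sketch in the style of the examples above (the function name second_diff is an assumption, not from the original text):

```python
from math import exp

def second_diff(f, x, h=0.001):
    # Central divided difference for the second derivative:
    # (f(x+h) - 2*f(x) + f(x-h)) / h**2
    return (f(x + h) - 2.0 * f(x) + f(x - h)) / h**2

x = 0.5
print('second derivative =', second_diff(exp, x))
# All derivatives of exp(x) equal exp(x), so compare with the exact value
print('exp(', x, ') =', exp(x))
```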

## Taylor Series

A Taylor series expands a function into an infinite sum of terms computed from its derivatives at a single point. If a function is infinitely differentiable at a point, a truncated Taylor series can approximate the function near that point. The forward, backward, and center divided difference methods can all be derived from Taylor series, and the derivation also gives a quantitative estimate of the error in each approximation.

For instance, the value f(x + h) can be approximated by the truncated Taylor series f(x + h) ≈ f(x) + h f'(x) + (h²/2) f''(x). Solving for f'(x) and dropping the higher-order term recovers the forward divided difference formula, with an error proportional to h.
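Subtracting the expansion of f(x − h) from that of f(x + h) cancels the even-order terms, which is why the center divided difference is more accurate. A sketch of the derivation:

```latex
\begin{aligned}
f(x+h) &= f(x) + h f'(x) + \tfrac{h^2}{2} f''(x) + \tfrac{h^3}{6} f'''(x) + \cdots \\
f(x-h) &= f(x) - h f'(x) + \tfrac{h^2}{2} f''(x) - \tfrac{h^3}{6} f'''(x) + \cdots \\
f(x+h) - f(x-h) &= 2h f'(x) + \tfrac{h^3}{3} f'''(x) + \cdots \\
f'(x) &= \frac{f(x+h) - f(x-h)}{2h} - \frac{h^2}{6} f'''(x) - \cdots
\end{aligned}
```

The leading error term is proportional to h², compared with h for the one-sided formulas.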

## Effect of Step Size

The following program calculates the first derivative of exp(x) at x = 0.5 with the center divided difference formula using different values of h. The result shows that the approximation becomes more accurate (more correct significant digits) as the step size shrinks, but when the step size becomes too small, round-off error dominates and the accuracy degrades again.

```
from math import exp

def center_diff(f, x, h=0.0001):
    df = (f(x + h) - f(x - h)) / (2.0 * h)
    return df

x = 0.5
df = center_diff(exp, x)
print('first derivative =', df)
print('exp(', x, ') =', exp(x))

# Halve the step size 50 times: truncation error shrinks at first,
# then round-off error takes over for very small h.
h = 0.125
for i in range(50):
    df = center_diff(exp, x, h)
    print('h=', h, 'df=', df)
    h = h / 2.0
```