# Calculus/Multivariable Calculus/Directional Derivative and the Gradient Vector

### Directional derivatives

Normally, a partial derivative of a function with respect to one of its variables, say *x*_{j}, takes the derivative along the "slice" of the function parallel to the *x*_{j}'th axis.

More precisely, we can think of cutting a function *f*(*x*_{1}, ..., *x*_{n}) in space along the *x*_{j}'th axis, keeping everything but the *x*_{j} variable constant.

From the definition, we have the partial derivative at a point **p** of the function along this slice as

$$\frac{\partial f}{\partial x_j}(\mathbf{p}) = \lim_{h\to 0}\frac{f(\mathbf{p}+h\mathbf{e}_j)-f(\mathbf{p})}{h},$$

where **e**_{j} is the *j*'th basis vector, provided this limit exists.
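As a quick numerical sketch of the limit definition (not part of the original text), we can approximate a partial derivative with a small finite *h*; the sample function *f*(*x*, *y*) = *x*^{2}*y* and the step size are illustrative choices.

```python
import numpy as np

def partial_derivative(f, p, j, h=1e-6):
    """Approximate the partial derivative of f at p along the x_j axis,
    using a central difference (finite h standing in for the limit)."""
    e_j = np.zeros(len(p))
    e_j[j] = 1.0
    return (f(p + h * e_j) - f(p - h * e_j)) / (2 * h)

# Sample function: f(x, y) = x**2 * y, so df/dx = 2xy and df/dy = x**2.
f = lambda v: v[0] ** 2 * v[1]
p = np.array([3.0, 2.0])
print(partial_derivative(f, p, 0))  # ≈ 12.0  (= 2 * 3 * 2)
print(partial_derivative(f, p, 1))  # ≈ 9.0   (= 3**2)
```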

Instead of the basis vector, which corresponds to taking the derivative along that axis, we can pick a vector in any direction (which we usually take as being a unit vector), and we take the *directional derivative* of a function as

$$D_{\mathbf{d}}f(\mathbf{p}) = \lim_{h\to 0}\frac{f(\mathbf{p}+h\mathbf{d})-f(\mathbf{p})}{h},$$

where **d** is the direction vector.

If we want to calculate directional derivatives, calculating them from the limit definition is rather painful, but we have the following: if *f* : **R**^{n} → **R** is differentiable at a point **p** and |**d**| = 1, then

$$D_{\mathbf{d}}f(\mathbf{p}) = \sum_{i=1}^{n}\frac{\partial f}{\partial x_i}(\mathbf{p})\,d_i.$$

There is a closely related formulation which we'll look at in the next section.
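We can check numerically that the limit definition agrees with the sum of partial derivatives weighted by the components of **d**. This is a minimal sketch assuming NumPy; the sample function *f*(*x*, *y*) = sin(*x*)·*y* and the chosen direction are illustrative.

```python
import numpy as np

def directional_derivative_limit(f, p, d, h=1e-6):
    """Directional derivative from the limit definition (finite h)."""
    return (f(p + h * d) - f(p - h * d)) / (2 * h)

def directional_derivative_sum(f, p, d, h=1e-6):
    """Directional derivative as the sum of partials times components of d."""
    n = len(p)
    total = 0.0
    for i in range(n):
        e = np.zeros(n)
        e[i] = 1.0
        total += (f(p + h * e) - f(p - h * e)) / (2 * h) * d[i]
    return total

f = lambda v: np.sin(v[0]) * v[1]
p = np.array([0.5, 2.0])
d = np.array([1.0, 1.0]) / np.sqrt(2)  # unit direction vector

print(directional_derivative_limit(f, p, d))
print(directional_derivative_sum(f, p, d))  # the two values agree
```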

### Gradient vectors

The partial derivatives of a scalar tell us how much it changes if we move along one of the axes. What if we move in a different direction?

We'll call the scalar *f*, and consider what happens if we move an infinitesimal displacement **dr** = (*dx*, *dy*, *dz*). Using the chain rule,

$$df = \frac{\partial f}{\partial x}\,dx + \frac{\partial f}{\partial y}\,dy + \frac{\partial f}{\partial z}\,dz.$$

This is the dot product of **dr** with a vector whose components are the partial derivatives of *f*, called the gradient of *f*:

$$\operatorname{grad} f = \left(\frac{\partial f}{\partial x}, \frac{\partial f}{\partial y}, \frac{\partial f}{\partial z}\right), \qquad df = \operatorname{grad} f \cdot \mathbf{dr}.$$

We can then form the directional derivative at a point **p**, in the direction **d**, by taking the dot product of the gradient with **d**:

$$D_{\mathbf{d}}f(\mathbf{p}) = \operatorname{grad} f(\mathbf{p}) \cdot \mathbf{d}.$$

Notice that grad *f* looks like a vector multiplied by a scalar. This particular combination of partial derivatives is commonplace, so we abbreviate it to ∇*f*.

We can write the action of taking the gradient vector by writing this as an *operator*. Recall that in the one-variable case we can write *d*/*dx* for the action of taking the derivative with respect to *x*. This case is similar, but **∇** acts like a vector.

We can also write the action of taking the gradient vector as:

$$\nabla = \left(\frac{\partial}{\partial x}, \frac{\partial}{\partial y}, \frac{\partial}{\partial z}\right)$$

#### Properties of the gradient vector

##### Geometry

- grad *f*(**p**) is a vector pointing in the direction of steepest slope of *f* at **p**. |grad *f*(**p**)| is the rate of change of that slope at that point.

For example, consider *h*(*x*, *y*) = *x*^{2} + *y*^{2}. The level sets of *h* are concentric circles, centred on the origin, and since grad *h* = (2*x*, 2*y*) = 2(*x*, *y*), grad *h* points directly away from the origin, at right angles to the contours.
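A small numerical illustration of this example: at any point **p**, grad *h* = 2**p** is parallel to the radial direction and perpendicular to the tangent of the circular contour through **p**. This sketch assumes NumPy; the point (3, 4) is an arbitrary choice.

```python
import numpy as np

def grad_h(p):
    """Gradient of h(x, y) = x**2 + y**2, computed by hand: (2x, 2y)."""
    return 2 * p

p = np.array([3.0, 4.0])
g = grad_h(p)

radial = p / np.linalg.norm(p)            # unit vector pointing away from origin
tangent = np.array([-radial[1], radial[0]])  # tangent to the circular contour at p

print(np.dot(g, tangent))  # ≈ 0: the gradient is perpendicular to the contour
print(np.dot(g, radial))   # = |grad h|: the gradient is purely radial
```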

- Along a level set, (∇*f*)(**p**) is perpendicular to the level set {**x** | *f*(**x**) = *f*(**p**)} at **x** = **p**.

If **dr** points along the contours of *f*, where the function is constant, then *df* will be zero. Since *df* is a dot product, that means that the two vectors, **dr** and grad *f*, must be at right angles, i.e. the gradient is at right angles to the contours.

##### Algebraic properties

Like *d*/*dx*, ∇ is linear. For any pair of constants, *a* and *b*, and any pair of scalar functions, *f* and *g*,

$$\nabla(af + bg) = a\nabla f + b\nabla g.$$
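Linearity can be checked numerically: the gradient of a*f* + b*g* equals a times the gradient of *f* plus b times the gradient of *g*. A minimal sketch assuming NumPy; the sample functions and constants are illustrative.

```python
import numpy as np

def numerical_grad(f, p, h=1e-6):
    """Central-difference approximation to grad f at the point p."""
    n = len(p)
    g = np.zeros(n)
    for i in range(n):
        e = np.zeros(n)
        e[i] = 1.0
        g[i] = (f(p + h * e) - f(p - h * e)) / (2 * h)
    return g

f = lambda v: v[0] ** 2 + v[1]
g_fn = lambda v: np.sin(v[0]) * v[1]
a, b = 3.0, -2.0
p = np.array([0.7, 1.3])

lhs = numerical_grad(lambda v: a * f(v) + b * g_fn(v), p)  # grad(a f + b g)
rhs = a * numerical_grad(f, p) + b * numerical_grad(g_fn, p)  # a grad f + b grad g
print(np.allclose(lhs, rhs))  # True
```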

Since it's a vector, we can try taking its dot and cross product with other vectors, and with itself.