Mathematics for Chemistry/Integration

Free Web Based Material from HEFCE[edit]

There is a DVD on integration at Math Tutor.

The basic polynomial[edit]

f (x) = x^n

\int { f(x) }dx  = \frac {x^{n+1}}{n+1} + c

This works fine for all powers except -1; for instance, the integral of

\frac {1} {x^7}

is just

- \frac {1} {6 x^6} + c

The power -1 is clearly going to be a special case because the function has an infinity at x = 0 and rises to a steep spike as x gets small. As you have learned earlier, this integral is the natural logarithm of x; the infinity exists because the log of zero is minus infinity and the log of negative numbers is undefined.
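A quick way to check the power rule, and to see why n = -1 is special, is to ask a computer algebra system. This is a minimal sketch, assuming the SymPy library is available (the exact printed form may differ between versions):

  import sympy as sp

  x, n = sp.symbols('x n')

  # General power rule: for symbolic n SymPy returns a piecewise answer,
  # with log(x) appearing for the special case n = -1.
  print(sp.integrate(x**n, x))

  # The example from the text: the integral of 1/x**7 is -1/(6*x**6).
  print(sp.integrate(1/x**7, x))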

The integration and differentiation of positive and negative powers[edit]

   >>>>>>>>>>>>>>>>>>>>>>  Differentiation  >>>>>>>>>>>>>>>>>>>>>>

   (1/3)x^3    x^2      2x       2        0       0       0       0
   (1/3)x^3    x^2      2x       2        ?       ?       ?       ?

   <<<<<<<<<<<<<<<<<<<<<<   Integration   <<<<<<<<<<<<<<<<<<<<<<<<

   I(x)        H(x)     G(x)     F(x)     ln x    1/x     -1/x^2

Here I, H, G and F are more complicated functions involving \ln x.

You will be able to work them out easily when you have done more integration. The thing to notice is that the calculus of negative and positive powers is not symmetrical; this is essentially caused by the pole, or singularity, of \frac 1 {x^n} at x=0.

Logarithms[edit]

Logarithms were invented by Napier, a Scottish laird, in the 17th century. He made many inventions, but his most enduring came from the necessity of doing the many long divisions and trigonometric calculations used in astronomy. In later years the Royal Navy devoted great time and expense to developing logarithm technology for the purposes of navigation, leading to the commissioning of the first mechanical stored-program computer, designed by Charles Babbage between 1833 and 1871; the design was theoretically sound but the machine could not be built at the time.

The heart of the system is the observation of the properties of powers:

\frac {e^a} {e^b} = e^{a-b}

This means that if we have the inverse function of e^x we can change a long division into a subtraction by looking up the exponents in a set of tables. Before the advent of calculators this was the way many calculations were done.
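To see the "division becomes subtraction" idea numerically, here is a small Python sketch (the numbers are arbitrary examples): divide two numbers by subtracting their logs and taking the antilog of the result.

  import math

  a, b = 987654.0, 321.0

  # A long division replaced by a subtraction of logarithms:
  log_quotient = math.log(a) - math.log(b)   # look up the two logs and subtract
  quotient = math.exp(log_quotient)          # look up the antilog

  print(quotient)   # agrees with a / b to rounding error
  print(a / b)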

 y = e^x

\ln y = x

Napier initially used logs to the base e for his calculations, but after a year or so he was visited by Briggs, who suggested it would be more practical to use base 10. However, base e is necessary for the purposes of calculus and thermodynamics.

Integrating 1/x[edit]

x = e^y~~~~~~~~~\frac {{\rm d} x}  { {\rm d} y} = e^y~~~~~ {\rm also} = x

This is true because e^y is the function that grows (or decays) at a rate equal to its own value.

y = \ln x

This is our definition of a logarithm.

\frac {{\rm d} x}  { {\rm d} y}  = \frac 1 { \frac {{\rm d} y}  { {\rm d} x}  } ~~~~~~ {\rm also} = x

therefore

\frac {{\rm d} y}  { {\rm d} x}  = \frac 1 x

\int {\frac{dy}{dx}} dx = y

therefore

\int { \frac{1}{x}} dx = \ln x
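A crude numerical check of this result needs nothing more than a trapezium-rule sum; the helper function and the number of strips below are arbitrary choices for the sketch.

  import math

  def trapezium(f, a, b, n=100000):
      # Approximate the definite integral of f from a to b with n strips.
      h = (b - a) / n
      total = 0.5 * (f(a) + f(b)) + sum(f(a + i * h) for i in range(1, n))
      return total * h

  # The area under 1/x from 1 to 5 should equal ln 5 - ln 1 = ln 5.
  print(trapezium(lambda x: 1.0 / x, 1.0, 5.0))
  print(math.log(5.0))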

Integrating 1/x like things[edit]

Just as

 \frac { \rm d}  {  {\rm d} x}  \ln x = \frac 1 x

so

\frac { \rm d}  {  {\rm d} x}  \ln u = \frac 1 u \frac { { \rm d} u}  {  {\rm d} x}

therefore by the chain rule

\frac { { \rm d}}  {  {\rm d} x}  \ln f(x) =  \frac { f^{'} (x)}  {   f (x)}

therefore

\int { \frac { f^{'} (x)}  {   f (x)} } {\rm d}x = \ln | f (x) | + c

Examples of this are:

\int { \frac { \cos \phi}  {   1 + \sin \phi  } }  {\rm d}\phi = \ln (  1 + \sin \phi ) + c

or

\int { \frac { \sin \phi}  {   1 + \cos \phi  } }  {\rm d}\phi = - \ln (  1 + \cos \phi ) + c

and

\int { \frac { e^y}  {   e^y + 2  } }  {\rm d}y = \ln (  e^y + 2 ) + c

Just as the integral of 1/x is \ln x, so the derivative of \ln x is 1/x. Using the laws of logs,

\ln 5 x^3 = \ln 5 + \ln x^3 = \ln 5 + 3 \ln x

\ln 5 is just a constant, so its derivative is zero, and therefore

\frac { \rm d}  {  {\rm d} x} \ln 5 x^3 = \frac 3 x

This can also be done by the chain rule

\frac { \rm d}  {  {\rm d} x} \ln 5 x^3 = \frac 1 {5 x^3} .\frac { { \rm d}(5x^3) }  {  {\rm d} x}= \frac { 15x^2 }  {  5x^3 } = \frac 3 x

What is interesting here is that the 5 has disappeared completely. The gradient of the log function is unaffected by a multiplier! This is a fundamental property of logs.
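The "multiplier disappears" property is easy to see numerically: a finite-difference gradient of \ln (5x^3) and of \ln (x^3) give the same value, 3/x. A minimal sketch (the helper gradient and the step size h are illustrative choices):

  import math

  def gradient(f, x, h=1.0e-6):
      # Central-difference estimate of the derivative of f at x.
      return (f(x + h) - f(x - h)) / (2 * h)

  x0 = 2.0
  print(gradient(lambda x: math.log(5 * x**3), x0))   # gradient of ln(5 x^3)
  print(gradient(lambda x: math.log(x**3), x0))       # gradient of ln(x^3): the same
  print(3 / x0)                                       # both match 3/x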

Some observations on infinity[edit]

As x approaches zero, \frac{1}{x} tends to \infty.

\frac 0 0 and \frac \infty \infty are undefined, but a ratio of two quantities that both become large (or both become small) can still have a perfectly definite value. An example is the \sin of 90 degrees: the opposite and the hypotenuse are both large, but in the limit of an infinitesimally thin triangle they become equal, so the \sin is 1.

Definite integrals (limits)[edit]

Remember how we do a definite integral

\int_0^3 { f (x) } \,{\rm d}x = {\left[ F (x)  \right] }_0^3 =  F (3)  -  F (0)

where F is the indefinite integral of f.

Here is an example where limits are used to calculate the 3 areas cut out by a quartic equation:

x^4 -2x^3-x^2+2x.

We see that x=1 is a solution so we can do a polynomial division:


          x^3 -  x^2        - 2x
        --------------------------
 x - 1 )  x^4 - 2x^3 - x^2 + 2x
          x^4 -  x^3
          -----------
               -  x^3 - x^2
               -  x^3 + x^2
               ------------
                      - 2x^2 + 2x
                      - 2x^2 + 2x
                      -----------
                                0

So the polynomial is x(x^2-x-2)(x-1), which factorises to

x(x-2)(x+1)(x-1).

\int_a^b { (x^4 -2x^3-x^2+2x) } \,{\rm d}x  = {\left[ x^5/5 -x^4/2-x^3/3+ x^2\right] }_a^b

where the limits a and b are taken between successive roots, x = -1, 0, 1 and 2.
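Here is a sketch of the arithmetic in Python, using the antiderivative above to evaluate the lobes between successive roots; taking absolute values turns the signed integrals into areas.

  # F is the indefinite integral worked out above.
  def F(x):
      return x**5 / 5 - x**4 / 2 - x**3 / 3 + x**2

  roots = [-1, 0, 1, 2]

  # Signed integral over each interval between consecutive roots;
  # the absolute value gives the geometric area of each lobe.
  for a, b in zip(roots, roots[1:]):
      signed = F(b) - F(a)
      print(a, b, signed, abs(signed))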

Integration by substitution[edit]

\int \sin \theta \cos \theta  {\rm d} \theta =\int u \frac  { {\rm d}  u } { {\rm d}  \theta  } {\rm d} \theta

where  u = \sin \theta.

\int \sin \theta \cos \theta  {\rm d} \theta =\frac  { \sin^2 \theta } { 2  } + c

\int \sin^9 \theta \cos \theta  {\rm d} \theta ~~~~~~~{\rm similarly}~~~~~~~~ =\frac  { \sin^{10} \theta } { 10  } + c
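If SymPy is available you can check both substitution results; SymPy drops the integration constant and may print a trigonometrically equivalent form.

  import sympy as sp

  theta = sp.symbols('theta')

  # Both integrals follow the u = sin(theta) substitution pattern.
  print(sp.integrate(sp.sin(theta) * sp.cos(theta), theta))      # expect sin(theta)**2/2
  print(sp.integrate(sp.sin(theta)**9 * sp.cos(theta), theta))   # expect sin(theta)**10/10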

Simple integration of trigonometric and exponential Functions[edit]

\int { 4e^{2x}} dx

\int { -9e^{-3x}} dx

\int { \sin 2 \theta } d \theta

\int { \sin^2 \theta + \cos^2 \theta } d \theta

\int {  -\cos \phi } d \phi

Answers[edit]

2e^{2x} + c

3e^{-3x} + c

- \frac 1 2 \cos 2 \theta + c

\theta + c

i.e. the integral of  1 d \theta.

- \sin \phi + c

Integration by parts[edit]

This is done in many textbooks and on Wikipedia. Their notation might be different from the one used here, which is hopefully the clearest. You derive the expression by taking the product rule, integrating it, and rearranging; writing the factor to be integrated as V gives the expression below.

\int {UV} = U \int {V}  - \int {\left( U^{'} \int { V} \right) }

(all integration with respect to x). Remember it as

\int {UV} = U [ {\rm int} ] - \int { [ {\rm diff} ] [ {\rm int} ]  }

where U is the factor that gets differentiated.

The important thing is that you have to integrate one expression of the product and differentiate the other. In the basic scheme you integrate the most complicated expression and differentiate the simplest. Each time you do this you generate a new term, but eventually the function being differentiated goes to zero and the integral is solved. The expression which goes to zero is U.

The other common scheme is where the parts formula generates the expression you want on the right of the equals sign and there are no other integral signs. Then you can rearrange the equation and the integral is solved. This is obviously very useful for trig functions, where differentiation cycles \sin \rightarrow \cos \rightarrow -\sin \rightarrow -\cos \rightarrow \sin ad infinitum.

e^x also generates itself and is susceptible to the same treatment.

\int { e^{-x} \sin x }~ dx = ( -e^{-x} ) \sin x - \int { (-e^{-x}) \cos x} ~ dx

 =  -e^{-x}  \sin x + \int { e^{-x} \cos x } ~ dx

Applying parts a second time to the remaining integral gives

 =  -e^{-x}  (\sin x +  \cos x ) - \int { e^{-x} \sin x } ~ dx + c

We now have our required integral on both sides of the equation so

\int { e^{-x} \sin x }~ dx = - \frac 1 2 e^{-x} ( \sin x + \cos x ) + c
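One way to convince yourself of the result is to differentiate it back, by hand or numerically. A minimal finite-difference sketch (the point x0 and step h are arbitrary):

  import math

  def answer(x):
      # the result obtained by parts (integration constant omitted)
      return -0.5 * math.exp(-x) * (math.sin(x) + math.cos(x))

  def integrand(x):
      return math.exp(-x) * math.sin(x)

  x0, h = 1.3, 1.0e-6
  print((answer(x0 + h) - answer(x0 - h)) / (2 * h))   # numerical derivative of the answer
  print(integrand(x0))                                 # should agree with the integrand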

Integration Problems[edit]

Integrate the following by parts with respect to x.

x e^x

x^2 \sin x

x \cos x

x^2  e^x

x^2  \ln x

e^{x} \sin x

e^{2x} \cos x

x {( 1 + x )}^7

Actually this one can be done quite elegantly by parts, to give a two-term expression. Work this one out. Expanding the original integrand by Pascal's triangle gives:

x + 7 x^2 + 21 x^3 + 35 x^4 + 35 x^5 + 21 x^6 + 7 x^7 + x^8


The two term integral expands to

\frac 1 2 x^2 + \frac 7 3 x^3 + \frac {21} 4 x^4 + 7 x^5 + \frac {35} 6 x^6 + 3 x^7 + \frac 7 8 x^8 + \frac 1 9 x^9 - \frac 1 {72}

So one can see it is correct on a term by term basis.

3 x \sin x

2 x^2  \cos 2 x

If you integrate x^7 \sin x you will have to apply parts 7 times, reducing the power of x to zero and thereby generating 8 terms:



-x^7 \cos x + 7 x^6 \sin x + 42 x^5 \cos x - 210 x^4 \sin x - 840 x^3 \cos x + 2520 x^2 \sin x + 5040 x \cos x - 5040 \sin x + c


(Output from Maple.) 

Though it looks nasty there is quite a pattern to this: the coefficients run 7, 7 \times 6, 7 \times 6 \times 5, \ldots, 7!, and the trig functions cycle \sin, \cos, -\sin, -\cos, \sin, \cos, etc., so it can quite easily be done by hand.
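If SymPy is available you can reproduce the Maple result and confirm it by differentiating back (the printed arrangement of terms may differ):

  import sympy as sp

  x = sp.symbols('x')

  result = sp.integrate(x**7 * sp.sin(x), x)
  print(result)
  # Differentiating the result should recover the integrand exactly.
  print(sp.simplify(sp.diff(result, x) - x**7 * sp.sin(x)))   # expect 0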

Differential equations[edit]

First order differential equations are covered in many textbooks. They are solved by integration. (First order equations involve \frac{dy}{dx}; second order equations also involve \frac{d^2 y}{dx^2}.)

The arbitrary constant means another piece of information is needed for complete solution, as with the Newton's Law of Cooling and Half Life examples.

Provided all the x terms can be collected on one side and the y terms on the other, the equation is separable.

 2 y \frac { {\rm d} y }  { {\rm d} x}  = 6x^2

\int 2 y  {\rm d} y  = \int 6x^2 {\rm d} x

y^2 = 2x^3 + c

This is the general solution.

Typical examples are:

y = x  \frac { {\rm d} y }  { {\rm d} x} ~~~~~~~~~~~~~~~~~~~~~~~~~\int \frac { {\rm d} y }  { y} = \int \frac { {\rm d} x }  { x}

\ln y = \ln x + \ln A ~~~~~~(constant)~~~~ i.e.~~~~~ y = A x

by definition of logs.

\frac { {\rm d} y }  { {\rm d} x} = ky~~~~~~~~~~~~~~ (1)

which separates in the same way to give \ln y = kx + A, i.e. y = B e^{kx}. A related example is

\frac { {\rm d} y }  { {\rm d} x} = k x y

 \int {\frac  {{\rm d} y }  { y}  } = \int  kx {\rm d} x   ~~~~~~~~~~~~~~ (2)

~~~~~~~~\ln y = \frac 1 2 kx^2 + A ~~~~~~~ (3)

This corresponds to:

y = B e ^{\frac 1 2 kx^2 }
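Separable equations like this can also be checked with SymPy's dsolve; a sketch for dy/dx = kxy, where C1 is SymPy's name for the arbitrary constant:

  import sympy as sp

  x, k = sp.symbols('x k')
  y = sp.Function('y')

  # dy/dx = k x y, the separable equation worked through above.
  ode = sp.Eq(y(x).diff(x), k * x * y(x))
  print(sp.dsolve(ode, y(x)))   # expect y(x) = C1*exp(k*x**2/2)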

The Schrödinger equation is a 2nd order differential equation e.g. for the particle in a box

 \frac {\hbar^2 } {2m} \frac {{\rm d}^2 \Psi } { {\rm d}x^2 }  + E \Psi = 0

It has taken many decades of work to produce computationally efficient solutions of this equation for polyatomic molecules. Essentially one expands the wavefunction in a basis of atomic orbitals, and integration turns the differential equation into a matrix of numbers (the integrals). Matrix algebra then finishes the job, finding a solution by solving the resulting simultaneous equations.

The calculus of trigonometric functions[edit]

There are many different ways of expressing the same thing in trig functions and very often successful integration depends on recognising a trig identity.

\int { \sin 2x dx} = -\frac 1 2 \cos 2x + c

but could also be

\sin^2 x ~~{\rm or} ~~ -\cos^2 x

(each with an integration constant!).

When applying calculus to these functions it is necessary to spot which is the simplest form for the manipulation in hand. For integration the most convenient form often contains a product of a function with its own derivative, such as 2 \sin x \cos x, so that integration by substitution is possible.

Where the numerator is the derivative of the denominator, the integral is a \ln function. This is how we integrate \tan:

\int { \tan x dx} = \int { \frac {\sin x} { \cos x} } dx =- \ln~ ( \cos x ) + c

We can see this function goes to infinity at \pi / 2 as it should do.

Integration by rearrangement[edit]

Take for example:

\int { \cos^2 x dx}

Here there is no \sin function multiplied in with the \cos powers, so we cannot use substitution. However there are the two trig identities

\cos^2 \theta + \sin^2 \theta = 1

and

\cos^2 \theta - \sin^2 \theta = \cos 2 \theta

Using these we have

\int { \cos^2 x dx}  = \int { \frac 1 2 { (1 + \cos 2 x ) } } dx

so we have two simple terms which we can integrate, giving \frac x 2 + \frac {\sin 2 x} 4 + c.
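A quick SymPy check, if it is available; SymPy may present the equivalent form x/2 + sin(x)cos(x)/2, so the second line confirms the two answers agree.

  import sympy as sp

  x = sp.symbols('x')

  result = sp.integrate(sp.cos(x)**2, x)
  print(result)                                        # SymPy's preferred form
  print(sp.simplify(result - (x/2 + sp.sin(2*x)/4)))   # expect 0: the forms agree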

The Maclaurin series[edit]

We begin by making the assumption that a function can be approximated by an infinite power series in x:

f(x) = a_0 + a_1 x + a_2 x^2 + a_3 x^3 + a_4 x^4 + \dots

By repeatedly differentiating and setting x = 0 one finds the coefficients, giving

f(x) = f(0)  + f^{'}(0) x + \frac {f^{''}(0)} {2!} x^2 + \frac {f^{'''}(0)} {3!} x^3 + \frac {f^{''''}(0)} {4!} x^4 + \ldots

\sin x, \cos x and e^x can be expressed by this series approximation:

\sin x = x - \frac {x^3} {3!} + \frac {x^5} {5!} - \frac {x^7} {7!} \dots

\cos x = 1 - \frac {x^2} {2!} + \frac {x^4} {4!} - \frac {x^6} {6!} \dots

e^x = 1 + x + \frac {x^2} {2!} + \frac {x^3} {3!} + \frac {x^4} {4!} \dots

Notice e^x also works for negative x.

When differentiated or integrated e^x generates itself!

When differentiated \sin x generates \cos x.

By using series we can convert a complex function into a polynomial, and can use \sin x = x - x^3 / 6 for small x.
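A quick illustration of how good the short series is for small x (all angles in radians; the sample points are arbitrary):

  import math

  # Compare the two-term series sin x ~ x - x^3/6 with the true value.
  for x in [0.05, 0.2, 0.5, 1.0]:
      series = x - x**3 / 6
      print(x, series, math.sin(x))   # close for small x, drifting apart as x grows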

In actual fact the kind of approximation used inside computer programs is more like: \frac {a x^2 + b x + c } {A x^2 + B x + C }

These have greater range but are much harder to develop and a bit fiddly on the calculator or to estimate by raw brain power.

We cannot expand \ln x this way because \ln 0 is -\infty. However \ln (1 + x) can be expanded.

Work out the series for \ln (1 + x).

Factorials[edit]

The factorials you have seen in series come from repeated differentiation. n! also has a statistical meaning as it is the number of unique ways you can arrange n objects.

0! is 1 by definition, i.e. the number of different ways you can arrange 0 objects is 1.

In statistical thermodynamics you will come across many factorials in expressions such as: W = \frac {N!}{n_0! n_1! n_2!...}

Factorials rapidly get unreasonably large: 6! = 720, 8! = 40320 but 12! = 479001600, so we need to divide them out into reasonable numbers where possible, for example 8!/6! = 7 \times 8 = 56.

Stirling's approximation[edit]

Also in statistical thermodynamics you will find Stirling's approximation:

\ln x! \approx x \ln x -x

This is proved and discussed in Atkins' Physical Chemistry.
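You can see how good the approximation is with Python's math.lgamma, which returns \ln \Gamma(x+1) = \ln (x!); the sample values below are arbitrary.

  import math

  # math.lgamma(x + 1) gives ln(x!) even when x! itself would overflow.
  for x in [5, 10, 50, 100]:
      exact = math.lgamma(x + 1)
      stirling = x * math.log(x) - x
      print(x, exact, stirling)   # the relative error shrinks as x grows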

How can you use series to estimate \ln 2? Notice that the series for \ln (1+x) converges extremely slowly; the series for e^x converges much faster because the n! denominator becomes large quickly.
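Setting x = 1 in the series for \ln (1+x) (the series you are asked to derive above) gives 1 - 1/2 + 1/3 - 1/4 + \ldots, which does converge to \ln 2, but painfully slowly; a sketch:

  import math

  # Partial sums of the series for ln(1 + x) evaluated at x = 1.
  for terms in [10, 100, 1000, 10000]:
      partial = sum((-1)**(n + 1) / n for n in range(1, terms + 1))
      print(terms, partial, math.log(2))   # convergence is painfully slow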

Trigonometric power series[edit]

Remember that when you use \sin x = x and \cos x = 1 - x^2 / 2, x must be in radians.

Calculus revision[edit]

Problems[edit]


  1. Differentiate e^{-t} \cos 2t, with respect to t. (Hint - use the product rule and the chain rule.)
  2. Differentiate x^{2} { ( 3x + 1 ) } ^4. (Chain rule and product rule here.)
  3. Differentiate \ln~ (7x^4). (Hint - split it into a sum of logs first.)
  4. Integrate \ln~ x. (Hint - use integration by parts and take the expression to be differentiated as 1.)

Answers[edit]

  1. It is -2 e^{-t} \sin 2t - e^{-t} \cos 2t. Bring -e^{-t} out of each term to simplify to -e^{-t} ( 2 \sin 2t + \cos 2t ).
  2. 2x { ( 9x + 1) } { ( 3x + 1 ) } ^3.
  3. \ln~ 7 + 4 \ln~ x, therefore the derivative is 4 times the derivative of \ln~ x, i.e. \frac 4 x.
  4. You should get x \ln~ x - x by one application of parts.