# History of Mathematics/Rebirth

## The Invention of Calculus

Despite currently being a near-universal component of mathematics, computer science, natural science and engineering education, the basic ideas of calculus only came into being in the 17th century in Europe. This work was predominantly done by two men: Sir Isaac Newton[1], an English scientist most famous for his three laws of motion and law of gravity, and Gottfried Wilhelm Leibniz, a German who is far less famous in the non-scientific community. They developed their ideas independently, although this has been disputed by parties from both countries, as both countries wished to take credit for the discovery. However, many textbooks today credit both men.

Newton invented his version of calculus in 1666[2], while Cambridge University was closed during the Plague. Despite entering the university with little knowledge of mathematics, he attended the lectures of Isaac Barrow, where he learned, among other things, of the methods Barrow used to determine tangent lines. These methods proved influential on Newton, who later developed his own techniques for finding tangents as part of his differential calculus. During this break from Cambridge, Newton continued his research, making many of his notable discoveries in that same year.[3] These include his three laws of motion, the representation of functions as sums of infinite series, the universal law of gravitation (now superseded by Albert Einstein's general theory of relativity), and an explanation of the dispersion of light into a rainbow. Newton applied his calculus extensively in developing his laws of motion, although they may be reformulated without it.

Newton's second law of motion, for example, was originally written F = dp/dt (in modern notation), where p stands for the momentum of an object (which Newton called "the quantity of motion"). This was the form Newton used to solve his problems, including a proof of Kepler's laws of planetary motion. The formula may be simplified to the better-known form F = ma (more commonly used in elementary mechanics) by the following argument.

Suppose a point particle of mass m is pushed by a net force F in a given direction. Then F = dp/dt. Since the mass is constant, it is unaffected by the motion of the object. Assuming relativistic effects are negligible, p = mv, so dp = d(mv) = m dv. Factoring out the m, dp/dt = m(dv/dt). But dv/dt is simply the instantaneous acceleration a, so F = ma. Newton never wrote his equation in this form, leaving force as the time derivative of momentum instead.
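The argument above can be written compactly in modern Leibniz notation (a sketch of the standard textbook derivation, not Newton's own presentation):

```latex
F = \frac{dp}{dt}
  = \frac{d(mv)}{dt}
  = m\,\frac{dv}{dt}
  = ma
\qquad \text{(assuming $m$ constant and speeds far below $c$)}
```

Each step uses one assumption stated in the text: constancy of m justifies pulling it outside the derivative, and the non-relativistic condition justifies p = mv.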

Newton's notation was significantly different from that in common use today, as was his name for the science he invented: "the science of fluxions." Instead of the now-common f'(x) notation for the derivative of f with respect to x, Newton placed dots over a variable to indicate that it was to be differentiated. This notation has some notable flaws: it can conflict with the arrow notation for vectors, and it gives no indication of which variable the differentiation is taken with respect to. Newton, however, reserved this notation for derivatives with respect to time.

\dot{x} was the symbol Newton used to denote the derivative of x with respect to time. The second derivative, which occurs, for example, in the definition of acceleration, was written \ddot{x}. Though the notation extends to an arbitrary number of derivatives, its use has fallen out of style outside of mechanics, and both Lagrange's f'(x) notation and Leibniz's df/dt notation are more frequently used today in most branches of mathematics. Newton's notation was not wholly consistent; his "dot" notation was developed after he wrote the Principia Mathematica, his best-known work, in which he used geometrical arguments, and none of his dot notation, to explain the planetary motions he described therein.[4]
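For a concrete comparison, the three notations mentioned above express the same derivatives of a position function x(t) as follows (modern typeset forms; Newton's own manuscript usage varied):

```latex
\dot{x} = x'(t) = \frac{dx}{dt},
\qquad
\ddot{x} = x''(t) = \frac{d^2x}{dt^2}
```

Here the leftmost form is Newton's, the middle is Lagrange's, and the rightmost is Leibniz's; only Leibniz's form names the variable of differentiation explicitly.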

Newton's work has come under a considerable amount of criticism. Notably, his first law of motion, the law of inertia, has been attacked as having been taken from the work of Galileo; his corpuscle (particle) theory of light was challenged first by Christiaan Huygens' wave theory and later by quantum mechanics; and, as previously mentioned, his law of gravitation has been superseded by the theory of relativity. Newton's calculus, in the eyes of some pure mathematicians, was also riddled with errors. At the time, much of this controversy centered on Newton's use of infinitesimals. [Additional work needed on clarification of Newton's use of infinitesimals.]

## References

1. Stewart, James. Calculus: Early Transcendentals, 7th ed.
2. Stewart, James. Calculus: Early Transcendentals, 7th ed.
3. Stewart, James. Calculus: Early Transcendentals, 7th ed.