
Calculus Primer Part III- Newton-Raphson Method, Theorems & Continuity

#1 macosxnerd101   User is offline

  • Games, Graphs, and Auctions
  • member icon

Reputation: 12680
  • View blog
  • Posts: 45,863
  • Joined: 27-December 08

Posted 28 April 2011 - 11:02 AM

In this tutorial, we will tie derivatives to the Newton-Raphson method, as well as introduce our major theorems: the Intermediate Value Theorem, Mean Value Theorem, and Extreme Value Theorem. Lastly, I will discuss continuity of functions.

Newton-Raphson Method
The Newton-Raphson method is an iterative way to find the roots of a function. The formula for this method is as follows: x[i+1] = x[i] - f(x[i])/f'(x[i]). Since the derivative is in the denominator, an f' value of 0 will throw us off, as will a discontinuous or non-existent derivative at a root. Our last major condition to take into account is that the iteration may not converge to a root at all. In that case, you will want to cap the number of iterations.

One common use of the Newton-Raphson method is to find the square roots of numbers. This is quite easy to do by setting up a quadratic: x^2 - n = 0, where n is the number whose square root we want. For example, let's say n = 2. Our derivative is 2x, so we set up our equation: x[i+1] = x[i] - (x[i]^2 - 2)/(2x[i]). Now we just need a good initial estimate. Since we know sqrt(2) is between 1 and 2, 1.75 is a good estimate. Try to guesstimate rather than picking arbitrary numbers. Guesstimation can improve accuracy and reduce the number of iterations needed for convergence.

So for our iterations, we get:
x[1] = 1.75 - (1.75^2 - 2)/(2 * 1.75) = 1.75 - 1.0625/3.5 ≈ 1.45
x[2] = 1.45 - (1.45^2 - 2)/2.9 ≈ 1.41
x[3] = 1.41 - (1.41^2 - 2)/2.82 ≈ 1.4142

After only a few iterations, we see x converge to sqrt(2), which is approximately 1.4142.
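The iteration above can be sketched in a few lines of Python. This is a minimal sketch; the function name and the iteration cap are my own choices, not from the tutorial.

```python
def newton_sqrt(n, x, iterations=10):
    """Approximate sqrt(n) via Newton-Raphson on f(x) = x^2 - n."""
    for _ in range(iterations):
        # x[i+1] = x[i] - f(x[i]) / f'(x[i]), with f'(x) = 2x
        x = x - (x * x - n) / (2 * x)
    return x

print(newton_sqrt(2, 1.75))  # converges toward 1.41421...
```

With a reasonable starting guess like 1.75, the precision roughly doubles each step, so ten iterations is far more than enough here.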

There are some functions, though, for which the Newton-Raphson method is not effective. One such function is f(x) = x^2 + 3. If we try to solve this using the quadratic formula, x = ±sqrt(0 - 12)/2 = ±i*sqrt(3), we get imaginary roots. So when we apply the Newton-Raphson method here, it will not converge.

A good initial guess here is x = 1. So applying that, we get:
x[1] = 1 - 4/2 = -1
x[2] = -1 + 4/2 = 1

As we see, the Newton-Raphson method will return a list of oscillating values.
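We can watch that oscillation happen directly. A small sketch (the helper name is mine):

```python
def newton_step(x):
    # One Newton-Raphson step for f(x) = x^2 + 3, with f'(x) = 2x
    return x - (x * x + 3) / (2 * x)

x, seen = 1.0, []
for _ in range(6):
    seen.append(x)
    x = newton_step(x)
print(seen)  # [1.0, -1.0, 1.0, -1.0, 1.0, -1.0]
```

Since f has no real roots, the iterates bounce between 1 and -1 forever, which is why capping the iteration count matters.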

Intermediate Value Theorem
The intermediate value theorem is pretty simple. If we think about driving on the highway, we can apply the intermediate value theorem: to go from 60 mph to 65 mph, we have to pass through every speed in between, including 61-64 mph. Generalizing back to math, the formal definition is: given a function continuous on an interval [a,b], for any number c between f(a) and f(b) inclusive, there is some x in [a,b] such that f(x) = c.
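One practical payoff of the theorem: if a continuous function changes sign on [a,b], the IVT guarantees a root in between, which is exactly what the bisection method exploits. A minimal sketch, with the function name and tolerance being my own choices:

```python
def bisect(f, a, b, tol=1e-6):
    """Find a root of a continuous f on [a, b], given a sign change."""
    assert f(a) * f(b) < 0  # IVT then guarantees a root in (a, b)
    while b - a > tol:
        m = (a + b) / 2
        if f(a) * f(m) <= 0:
            b = m  # sign change is in the left half
        else:
            a = m  # sign change is in the right half
    return (a + b) / 2

print(bisect(lambda x: x * x - 2, 1, 2))  # approaches sqrt(2) ≈ 1.414214
```

Unlike Newton-Raphson, bisection never diverges or oscillates; it trades speed for that guarantee.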

The squeeze theorem is a companion result about limits. It states that given functions f(x), g(x), and h(x) such that f(x) <= g(x) <= h(x) near a, if lim(x-->a)f(x) and lim(x-->a)h(x) converge on the same value, then lim(x-->a)g(x) converges on that same value. Essentially, g(x) is being squeezed between f(x) and h(x). The following image demonstrates graphically how this appears.
[Image: g(x) squeezed between f(x) and h(x)]
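We can sanity-check a squeeze numerically. A classic example (chosen by me; not necessarily the exact graph pictured) is g(x) = x^2*sin(1/x), squeezed between -x^2 and x^2 near 0:

```python
import math

# Check -x^2 <= x^2*sin(1/x) <= x^2 for x approaching 0
for x in [0.5, 0.1, 0.01, 0.001]:
    g = x * x * math.sin(1 / x)
    assert -x * x <= g <= x * x  # the squeeze holds at every sample
    print(x, g)  # g is pinned ever closer to 0 as x shrinks
```

Since both bounds go to 0 as x --> 0, the squeeze theorem forces lim(x-->0) g(x) = 0, even though sin(1/x) itself has no limit there.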

Mean Value Theorem
The mean value theorem states that for a function continuous on [a,b] and differentiable on (a,b), there exists a point c in the interval where the instantaneous rate of change is equivalent to the average rate of change. Recall that the average rate of change is simply the slope formula: deltaY/deltaX. So f'(c) = (f(b) - f(a))/(b - a). If we know f'(x) or can derive it, solving for c becomes relatively straightforward.
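As a worked example, take f(x) = x^2 on [a,b]: the average slope is (b^2 - a^2)/(b - a) = a + b, and f'(c) = 2c, so c = (a + b)/2. A small sketch of that computation (the function name is my own):

```python
def mvt_c_for_square(a, b):
    """For f(x) = x^2, solve f'(c) = (f(b) - f(a)) / (b - a)."""
    avg_slope = (b * b - a * a) / (b - a)
    return avg_slope / 2  # f'(c) = 2c  =>  c = avg_slope / 2

print(mvt_c_for_square(0, 3))  # 1.5, the midpoint, as expected for a parabola
```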

Extreme Value Theorem
As the name implies, the extreme value theorem deals with the extrema (maxima and minima) of a function on a closed interval [a,b]. It states that on such an interval, the function takes on a maximum value at some point, as well as a minimum value at another point. It is this theorem that tells us when examining an interval to check our endpoints as well as critical points (points where the derivative = 0) to find the local extrema.
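For instance, here is a sketch of that endpoint-plus-critical-point check for f(x) = x^3 - 3x on [0, 2] (an example function I chose; it is not from the tutorial):

```python
# Extrema of f(x) = x^3 - 3x on [0, 2]: check endpoints and critical points.
# f'(x) = 3x^2 - 3 = 0  =>  x = ±1; only x = 1 lies inside [0, 2].
f = lambda x: x ** 3 - 3 * x
candidates = [0, 1, 2]  # the two endpoints plus the in-interval critical point
values = {x: f(x) for x in candidates}
print(min(values, key=values.get), max(values, key=values.get))  # 1 2
```

Here the minimum value f(1) = -2 occurs at the interior critical point, while the maximum value f(2) = 2 occurs at an endpoint, so skipping either kind of candidate would miss an extremum.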

Continuity
The one factor that makes Calculus possible is continuity. We can think of this graphically as putting pen to paper and tracing over a graph without having to lift the pen up. That is, the function has no gaps or breaks in it. We can also evaluate continuity on an interval [a,b].

So why do we need continuity? We know that derivatives are slopes, and slopes require two points to calculate. So if we were limited to a discrete point, there would be no f(x+h) to evaluate. Note that continuity does not necessarily imply differentiability. As we saw with absolute value functions in my first tutorial, they are not differentiable at the vertex/cusp. This applies to all functions, not just absolute value functions. No function is differentiable at a cusp/vertex/sharp turn. As we will also see with integrals, which are used to find area under curves, there is no area under a discrete point. Without a continuous function, we cannot find the area.

So what exactly is continuity? Let's use game and graphics programming as an analogy. When animating an object programmatically, we deal in frames per second. Each second we might render 45 frames, and on each frame we might move the object 3 pixels. These are discrete movements that appear continuous. If an object starts at (0,0) and moves to (3,3), it didn't actually pass through (1,1), (1.5, 1.5), etc. What if we make our step size 1 instead of 3? It is still discrete: we never pass through (.5, .5) on the way to (2,2). So how is this relevant? Informally, the more densely the points along a function fill in, the closer it looks to continuous. In other words, the discrete motion approaches continuity in the limit as numPoints --> infinity.

This tutorial provides a lot of the background and theorems to support the Calculus covered up to this point. In my next tutorial, I will introduce integration and the fundamental theorem of calculus.

Replies To: Calculus Primer Part III- Newton-Raphson Method, Theorems & Continuity

#2 elgose   User is offline

  • D.I.C Head

Reputation: 102
  • View blog
  • Posts: 228
  • Joined: 03-December 09

Posted 02 May 2011 - 02:12 AM

Nice overview! Whole books can be (and have been) written on the "simplest" of concepts in calculus, so props on getting what you have into this post!

Some things I found interesting/was reminded of when reading this:

Newton's Method, when really examined, is a truly amazing method of approximation. In general, you tend to double your precision with each iteration. It does have obvious issues, though, such as "bad" guesses possibly DIVERGING from an x-intercept, and the obvious example you gave where there was no x-intercept to begin with (f(x) = x^2 + 3). I think it's important to note that when we extend this to the complex plane, plotting which root each initial guess converges to produces great examples of fractals.


Quote: No function is differentiable at a cusp/vertex/sharp turn.

Be careful with your terminology. Granted, a sharp turn would not be differentiable; however, a vertex often is.

Take, for example, a standard parabola y = a (x - h) ^ 2 + k. The vertex of a parabola is point (h, k). If we check a basic parabola, y = x^2 (therefore, a = 1, h = 0, k = 0), we find a vertex at (0, 0). y'(x) = dy/dx = 2x, and y'(0) = 0. Here we find that the derivative exists at a vertex.

What you're referring to here (the sharp cusps, turns, etc.) are usually traits of piecewise functions, or can be expanded to mean functions with asymptotes. In order for a function to be differentiable at a point, there's a general 2-step check we can apply:

- Is it continuous at our point? That is, does lim(x->a+) f(x) = lim(x->a-) f(x) = f(a)? (All three expressions must exist and be finite.)
- Do the left- and right-hand limits of the difference quotient agree? That is, does lim(h->0+) (f(x+h)-f(x))/h = lim(h->0-) (f(x+h)-f(x))/h? (Both must exist and be finite, i.e. not infinity.)

Taking f(x) = |x|, your example, we find that |x| is really a piecewise function:
f(x) = {x < 0, -x}, {x >= 0, +x}

Either by looking at the graph or by taking the left and right hand limits of f(x) as x->0, we see that f(x)=|x| is continuous and can go to the next step.

For the left-hand limit, lim(h->0-) (|0+h| - |0|)/h, h is negative, so 0 + h is always < 0; the (x < 0, -x) portion of the piecewise function tells us we can use:
lim(h->0-) (-(0+h) - 0)/h
(Applying signs) = lim(h->0-) (-h)/h
(Rearranging signs) = lim(h->0-) -(h/h)
(Simplifying fraction) = lim(h->0-) -1
(Applying limit) = -1
So far, so good - the value exists and is finite!
So far, so good - the value exists and is finite!

Looking at the right-hand limit, lim(h->0+), we know 0 + h will always be > 0, so {x >= 0, +x} applies and we can rewrite our difference quotient:
lim(h->0+) (|0+h| - |0|)/h
(Applying the x >= 0 replacement) = lim(h->0+) ((0+h) - 0)/h
(Simplifying) = lim(h->0+) h/h
(Simplifying fraction) = lim(h->0+) 1
(Applying limit) = 1
Since -1 != 1, we now know that f(x) = |x| is not differentiable at the point x = 0, i.e. f'(0) D.N.E.
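That one-sided disagreement is easy to see numerically as well. Shrinking h from either side, the difference quotients of |x| at 0 stay pinned at -1 and 1 (a small sketch of my own):

```python
# One-sided difference quotients of f(x) = |x| at x = 0
f = abs
for h in [0.1, 0.01, 0.001]:
    left = (f(0 - h) - f(0)) / -h   # h -> 0 from the left
    right = (f(0 + h) - f(0)) / h   # h -> 0 from the right
    print(h, left, right)  # left stays -1.0, right stays 1.0: f'(0) D.N.E.
```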

A similar process can be done with functions with asymptotes, even if the function increases in the same direction on both sides of the asymptote. For example, f(x) = 1/(x^2) has a vertical asymptote at x = 0, such that lim(x->0+) f(x) = lim(x->0-) f(x) = infinity, but f(0) D.N.E. and therefore f'(0) D.N.E.

In general, I think it's safe to warn people to be careful with piecewise functions!

Squeeze theorem is another remarkable tool in calculus. It turns limits which are usually very difficult to evaluate by most standard means (e.g. lim(x->0) sin(x)/x, where L'Hospital's Rule can't be justified, as we discussed in Part 2, and what appears to be the graph in your post, lim(x->0) x^2*sin(1/x), since lim(x->0) sin(1/x) D.N.E.) into fairly simple games of algebra involving inequalities. Very powerful stuff.

Quote: ...as well as critical points (points where the derivative = 0) to find the local extrema.
Just a note: on a closed interval, the extrema are considered absolute, not just local. A cool thing about this theorem is that it states on the interval [a, b], the function achieves a max and min value, but there may be countless extrema each with points (x, max) or (x, min).

Very interesting read, and a lot that can be done with it. I saw you have Part 4 up so I'll be sure to check it out when I get some time. Keep it up!

#3 macosxnerd101   User is offline

  • Games, Graphs, and Auctions
  • member icon

Reputation: 12680
  • View blog
  • Posts: 45,863
  • Joined: 27-December 08

Posted 02 May 2011 - 07:21 AM



Quote: No function is differentiable at a cusp/vertex/sharp turn.

Quote: Be careful with your terminology. Granted, a sharp turn would not be differentiable, however a vertex often is.

Yep- that's what I meant. Perhaps "sharp vertex" would have been more descriptive to go with cusp.

Glad you've enjoyed my tutorials! :)
