Talk:Fundamental theorem of calculus

From Wikipedia, the free encyclopedia
WikiProject Mathematics (Rated B-class, Top-importance; Field: Analysis)
This article is within the scope of WikiProject Mathematics, a collaborative effort to improve the coverage of Mathematics on Wikipedia. If you would like to participate, please visit the project page, where you can join the discussion and see a list of open tasks.
One of the 500 most frequently viewed mathematics articles.

State formula on top[edit]

Maybe have the formula actually stated nearer the top, and not just explained in text? — Preceding unsigned comment added by (talk) 10:48, 15 February 2012 (UTC)
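For reference, here is the two-part statement being discussed (editor's summary in the article's usual notation, with f the integrand and F an antiderivative):

```latex
% Part I: for f continuous on [a, b], the integral with variable upper limit
% is an antiderivative of f:
F(x) = \int_a^x f(t)\,dt \quad\Longrightarrow\quad F'(x) = f(x)

% Part II: for any antiderivative F of f on [a, b]:
\int_a^b f(x)\,dx = F(b) - F(a)
```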

Antiderivative needs to be continuous throughout the interval[edit]

I found

a subtle note. Please consider this.


Isaac Barrow discovered the fundamental theorem of calculus. He's not mentioned in the article. This is despicable. —Preceding unsigned comment added by (talk) 23:20, 17 November 2007 (UTC)

It's hardly despicable. Chill out! (talk) 19:59, 18 March 2008 (UTC)

Have any of you actually read Barrow's proof? It's not a "completely general" version of the theorem at all. Barrow's theorem proves it only for constantly increasing (resp. constantly decreasing) functions where f(x), the function whose graph bounds the area being evaluated, is always positive. —Preceding unsigned comment added by (talk) 23:38, 8 February 2011 (UTC)

Also, the functions Barrow deals with have to be both continuous and differentiable everywhere.

Redirection needed[edit]

Fundamental Theorem of the calculus and Fundamental Theorem of the Calculus need to redirect to this article.
People who know about Fundamental Theorems tend to capitalize their names: it is just like capitalizing "The Bible", "The Torah", "The Koran", "Ancient Greece", and so forth. Thank goodness for Archimedes. (talk) 05:01, 1 August 2010 (UTC)

Comment by Michael Hardy[edit]

These articles on calculus seem unable to get through a sentence without two or three inappropriate uses of the word "you". Instead of saying "two plus three equals five", the author of these articles would write "Suppose you want to know what two plus three is. You will find that it is five." -- Mike Hardy

I agree with this assessment. The use of the first person plural is an almost universal convention in mathematics writing. This is primarily because the use of the second person ("you") often tends to assume a condescending attitude toward the reader, as in "I already understand this, but you're still struggling with it." Using "we" gives the impression that the author and reader are "on the same side". (Although, anyone who has read an advanced math book knows this is little consolation when he or she has read "we find", "we see", "we observe", etc. for the 100th time and doesn't understand.)

I must admit I find it grating too. I prefer to write maths with "we", but sparingly. -- Tarquin 20:46 Jan 6, 2003 (UTC) (but look at the TeX equations! the soft curves of the integral sign! the variables leaping from tree to tree, as they float down the mighty rivers of British Columbia. ! The Giant Redwood. The Larch. The Fir! The mighty Scots Pine! ... Sing! Sing! [singing] I'ma lumberjack, and I'm okay. ... (men in white coats enter and carry Tarquin offstage)

I, personally, find "one" to be the best, although this tends not to be fitting in most mathematics articles. "One might find that three plus two equals five." doesn't quite carry the same... power? He Who Is 22:42, 7 June 2006 (UTC)


May I suggest using a simpler example for those of us who are not engineers. Perhaps something like: in short, the distance traveled is equal to the speed in miles per hour multiplied by the time spent traveling?

I have been having trouble finding a rigorous, but still easy to understand, proof of the fundamental theorem of calculus. All theorems should have proofs (the definition of a theorem is basically something that can be proved), including this one. 05:38, 23 Sep 2004 (UTC)

Can we add something perhaps on Taylor expansions etc? —Anonymous?

(I trimmed the redundant proofs out. They're available on my user page, and of course the history.)

The proof I presented on the article is a little messy. I believe it needs wikifying. It also doesn't lead from the previous section quite right, but the intuition section is so well written I hardly want to touch it. There seems to be a community in the math wiki pages, but I'm having trouble tracking it down. Anyhow, I hope this meets with their approval.

Daelin 09:19, 29 Nov 2004 (UTC)

The proof I used appears to be the Riemann Integral, however I use plainer notation than our article on that (\sum_{i=1}^n f(c_i)(x_i-x_{i-1}) vs \sum_{i=1}^{n-1} f(t_i)(x_{i+1}-x_{i}) for instance). The Riemann Integral also includes far more discussion than mathematical expression. I'm not certain it's exactly the Riemann Integral, however. Verification, anyone? —Daelin 18:31, 2 Dec 2004 (UTC)
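As an editor's numeric sketch (not part of the original discussion), the Riemann sum in the first notation above, \sum_{i=1}^n f(c_i)(x_i-x_{i-1}), can be checked against the value the FTC predicts; here the c_i are taken as left endpoints of equal subintervals, and the example function f(x) = x^2 is hypothetical:

```python
def riemann_sum(f, a, b, n):
    """Left-endpoint Riemann sum of f over [a, b] using n equal subintervals:
    sum of f(c_i) * (x_i - x_{i-1}) with c_i = x_{i-1}."""
    dx = (b - a) / n
    return sum(f(a + i * dx) * dx for i in range(n))

# For f(x) = x^2 on [0, 1], the FTC gives F(1) - F(0) = 1/3 with F(x) = x^3/3.
approx = riemann_sum(lambda x: x * x, 0.0, 1.0, 100_000)
print(approx)  # close to 0.3333
```

As n grows, the sum converges to the definite integral, which is exactly what Part II of the theorem lets one evaluate without any limit process.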

I must add that it does state explicitly in WP:MSM that proofs should be excluded unless they directly add to the content (e.g. act as a bridge to help explain other points). He Who Is 22:46, 7 June 2006 (UTC)

The first proof cites the 'mean value theorem for integration', whose result, although not dependent on the FTC, is not even a consequence of the mean value theorem, and it has subtle details that may not require a formal proof but deserve at least some lip service as to why they hold. The 'mean value theorem for integrals' really relies on the crucial result of the intermediate value theorem rather than the mean value theorem, which is what is linked. I imagine a curious reader wanting verification of this step, and then being led down a blind alley by checking the mean value theorem. So rather than citing the 'mean value theorem for integration', maybe put a small explanation like this: because f is continuous, it attains its minimum and maximum on [x_1, x_1 + Δx], say at m and M, by the extreme value theorem. By the definition of the integral,

\int_{x_1}^{x_1 + \Delta x} f(t) dt is bounded below by Δx·f(m) and above by Δx·f(M). Applying the intermediate value theorem to this inequality, one arrives at the desired result. -Kevin.t.joyce 10:20, 19 Oct 2006 (UTC)
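The suggested squeeze can be written out as one display (editor's sketch; the existence of the minimizing and maximizing points uses the extreme value theorem, and the final step uses the intermediate value theorem applied to f):

```latex
f(m)\,\Delta x \;\le\; \int_{x_1}^{x_1+\Delta x} f(t)\,dt \;\le\; f(M)\,\Delta x
\quad\Longrightarrow\quad
f(c) \;=\; \frac{1}{\Delta x}\int_{x_1}^{x_1+\Delta x} f(t)\,dt
\quad\text{for some } c \in [x_1,\, x_1+\Delta x]
```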

Example of function which is integrable but has no antiderivative?[edit]

in the article it states "Part II of the theorem is true for any Lebesgue integrable function f which has an antiderivative F (not all integrable functions do, though)". I would like to know an example of a function which is integrable but doesn't have an antiderivative. That is, it seems to me that if the Lebesgue integral exists, then \int^xf(t)dt should be an antiderivative almost everywhere, which I guess would only be true when the FTC holds? -Lethe | Talk

I've dug around a bit, and it seems that Cantor function is an example of a function which doesn't obey the fundamental theorem of calculus, because it is not absolutely continuous which is apparently a requirement for the FTC? why doesn't it mention that in the article? So, does this mean that the Cantor function has no antiderivative? or that its antiderivative just doesn't obey the FTC? -Lethe | Talk
How about my favorite, exp(-x^2)? My teacher once showed how to integrate it, but it has no antiderivative. -- Taku 05:36, Dec 6, 2004 (UTC)
That function has an antiderivative. It's called the error function -Lethe | Talk
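A quick numeric check of this reply (editor's sketch; `integrate` is a hypothetical midpoint-rule helper, not from the discussion): the antiderivative of exp(-x^2) vanishing at 0 is (sqrt(pi)/2)·erf(x), so a direct numerical integral should match that closed form.

```python
import math

def integrate(f, a, b, n=10_000):
    """Midpoint-rule approximation of the integral of f over [a, b]."""
    dx = (b - a) / n
    return sum(f(a + (i + 0.5) * dx) * dx for i in range(n))

# Integral of exp(-t^2) from 0 to 1 vs. its error-function antiderivative.
numeric = integrate(lambda t: math.exp(-t * t), 0.0, 1.0)
closed_form = math.sqrt(math.pi) / 2 * math.erf(1.0)
print(abs(numeric - closed_form))  # very small
```

The point of the thread stands: exp(-x^2) has an antiderivative (every continuous function does, by Part I); it just isn't expressible in elementary functions.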

A step function H(x) that is 0 for x<=0 and 1 for x>0 is integrable on [-1,1], but it is not a derivative because derivatives have intermediate value property, as can be seen from applying the intermediate value theorem to the difference quotient (f(x+h)-f(x))/h. So we see that H(x) does not have an antiderivative. Any piecewise-continuous function with simple jumps will do as an example. Counterexamples in Analysis By Bernard R. Gelbaum, John M.H. Olmsted Dover June 2003, ISBN 0486428753 is a good reference.

I'm having a hard time believing your claim. Why isn't H(x) the derivative of the function f(x)=0 for x<=0, f(x)=x for x>0? -Lethe | Talk 01:00, 16 September 2005 (UTC)
Because your f is not differentiable at 0, that's why. michaelliv 15:43, 26 May 2006 (UTC)
Differentiability does not imply continuity of the derivative. In the above example, H(x) has an antiderivative at all points except the jump discontinuity, and that antiderivative is ∫H(x)dx = C for x < 0 and x + C for x > 0. In fact, the indefinite integral is literally defined as the antiderivative. Here the function is continuous and differentiable on all real numbers except 0, and so is its antiderivative. The only possible semantic argument is that the integral is defined at x = 0 whereas the antiderivative is not, but I don't think there's a whole lot of merit to that. In general, any function that is continuous on an interval except at a finite number of jump and removable discontinuities is integrable at all points in the interval except those discontinuities. I don't have the proof, but I have seen the theorem.
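An editor's numeric sketch of the point made two comments up: F(x) = max(x, 0) is the natural candidate antiderivative of the step function H, but its one-sided difference quotients at 0 disagree, so F'(0) does not exist and H (which fails the intermediate value property at the jump) has no antiderivative on any interval containing 0.

```python
def F(x):
    """Candidate antiderivative of the step function H (0 for x<=0, 1 for x>0)."""
    return max(x, 0.0)

h = 1e-8
right = (F(0.0 + h) - F(0.0)) / h     # right-hand difference quotient -> 1.0
left = (F(0.0 - h) - F(0.0)) / (-h)   # left-hand difference quotient  -> 0.0
print(right, left)  # the two one-sided limits differ, so F'(0) does not exist
```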

An example of a function that is not integrable on any interval is f(x)=1 if x is rational, 0 if x is irrational, since this has an infinite number of jump discontinuities in every interval. Infinite discontinuities are also not (Riemann) integrable. Eebster the Great (talk) 03:47, 11 November 2008 (UTC)

This is a great, simple, and classic example of a function that has no antiderivative, is not continuous at any point, and hence is not uniformly continuous on any closed interval, and hence is not Riemann integrable on any interval. This one is often used as an example in graduate courses in mathematics.

On the other hand, this function is Lebesgue integrable on any closed interval that you care to name, and the value of that integral is always exactly zero. This is because the set on which this function is nonzero has measure zero.
Yikes! Now we are getting a little bit into Measure Theory, which for a lot of people (including me) leads to unhappy feelings. (talk) 05:15, 1 August 2010 (UTC)
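Stated compactly (editor's summary), the function under discussion is the Dirichlet function, and its Lebesgue integral vanishes because the rationals have measure zero:

```latex
f(x) = \begin{cases} 1, & x \in \mathbb{Q} \\ 0, & x \notin \mathbb{Q} \end{cases}
\qquad
\int_{[a,b]} f \,d\mu \;=\; 1 \cdot \mu\bigl([a,b] \cap \mathbb{Q}\bigr) \;=\; 0
```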

Misprints corrected[edit]

Today I made changes to Part II, Corollary, and the proof of Part II. Part II and the Corollary assumed F only to be continuous and nevertheless dealt with F'. It looks like a simple interchange of f and F had happened.

All elementary (if not all) formulations of the fundamental theorem of calculus suffer from the inability to give a simple characterization of the regularity properties of F (in addition to differentiability) that imply F(b) - F(a) = integral from a to b of F'(x) dx. The problem is complicated since one could fine-tune the notion of the integral, or even that of the derivative, to obtain a simple and sufficiently general formulation of the fundamental theorem of calculus.

One interesting question results from this, and I would be glad to get answers, hints, or opinions about it. Consider a simple setting: a finite closed interval [a,b], a<b, and a real-valued differentiable function f defined on that interval (it is obvious what this means even at the end points of the interval). What do we know about the real-valued function f'? Can we characterize these properties ('to be a derivative') in terms of this function alone, without mentioning the function from which it can be obtained by differentiation? ulrich 07:03, 10 Jun 2005 (UTC)

If I understand you correctly then I see no answer to your question. What does it mean to be a derivative, other than to imply the existence of an antiderivative? That is the only meaningful interpretation of your question that I can consider. If you accept that, then any proof would require the construction of an antiderivative, and then the mean value theorem would ensure that it is basically the usual F in the FTOC.

Various Comments[edit]

The 'Intuition' section is not intuitive. The FTOC simply states that the displacement of a moving object is equal to the net area under the velocity graph. I said that without any symbols or any mention of derivatives! This is most easily seen in the case of "distance = rate × time". Thus, the fundamental theorem is an attempt to generalise this formula for non-constant velocity. "We've just learned how to differentiate. Now what?"

The way I came to understand the relationship between the derivative and the antiderivative (I think I understand it intuitively) was to see the derivative function as a "quotient function". More particularly, the quotient consists of the value on the vertical axis of the antiderivative function divided by the value on the horizontal axis of the antiderivative function, which is the same as the horizontal axis of the derivative function. Therefore, multiplying the vertical-axis value of the derivative function by its horizontal-axis value (the area under the curve) is basically undoing the same thing: dy/dx times dx = the vertical value of the antiderivative function. This is crudely stated, but it's the only way I can intuitively understand why the area under the curve of a function is related to the value of an antiderivative function. Or, more briefly, dy/dx of the antiderivative function is the value of the derivative function, and multiplying it by the dx of the derivative function gives us dy/dx times dx (area) = dy (value of the antiderivative). —Preceding unsigned comment added by (talk) 01:42, 13 August 2010 (UTC)

I agree, the intuitive section was not intuitive at all, in fact it confused me. Maybe something like 'the velocity at some time multiplied by a tiny amount of time gives us a tiny displacement. If we sum up all of those tiny displacements, we get the total displacement'. It may also be helpful when saying these things to refer to what theorem you are trying to explain Taras 96 (talk) 00:50, 29 November 2007 (UTC)
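The "tiny displacements" picture in the comment above can be sketched numerically (editor's illustration; the velocity v(t) = 3t^2, with position x(t) = t^3, is a hypothetical example, not from the article):

```python
def displacement(v, a, b, n=100_000):
    """Sum v(t_i) * dt over small time steps: each term is a tiny displacement,
    and the total approximates x(b) - x(a) when x'(t) = v(t)."""
    dt = (b - a) / n
    return sum(v(a + i * dt) * dt for i in range(n))

# v(t) = 3t^2 has position x(t) = t^3, so the displacement over [0, 2] is 8.
total = displacement(lambda t: 3 * t * t, 0.0, 2.0)
print(total)  # close to 8
```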

I agree with the two comments above, but don't like the fact that distance, time, and velocity are brought into the discussion at all! We all learn using x for the horizontal axis and y for the vertical, and f(x). Plus, we learn slope as Δy/Δx and the derivative as dy/dx. This is what math students are most comfortable with. Why, then, bring in physics? Makes no sense to me, and I had to translate everything written away from the whole velocity discussion. I still don't intuitively understand the FTOC. As for comments below, NO, don't delete the intuition section. That's the main thing I came to this article for. And, NO, you don't lose students when you use the word "infinitesimal". In fact, that is something I understand intuitively. If students are lost by this, then they shouldn't be in a calculus course. —Preceding unsigned comment added by (talk) 14:24, 12 August 2010 (UTC)

I also agree. The physical intuition section is filled with many true statements, and I think its author is to be commended for at least attempting to provide an intuitive explanation. But the physical intuition section as written doesn't work. And every time I try my hand at re-writing it, I can't make it work. I recommend we omit that section from the article. (talk) 16:00, 20 August 2008 (UTC)

If you use the word "infinitesimal" in a calculus course you have lost your students. This is really a shame. Most modern US textbooks make some attempt at discussing differentials, but I've yet to see a single one connect this to the fundamental theorem. It's beyond me what their point is. It is differentials we integrate. What's worse is when you treat substitution: you end up saying vague things, like everything really is a differential... sigh.

The FTOC is stunningly beautiful (thanks to Leibniz) when we realise that it simply says the "dx's cancel" in dF/dx dx and we end up integrating (summing) dF instead, thus arriving at the net change of F!
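One way to make the "dx's cancel" picture precise (editor's sketch, applying the mean value theorem on each subinterval of a partition a = x_0 < x_1 < ... < x_n = b):

```latex
F(b) - F(a)
\;=\; \sum_{i=1}^{n} \bigl( F(x_i) - F(x_{i-1}) \bigr)
\;=\; \sum_{i=1}^{n} F'(c_i)\,(x_i - x_{i-1}),
\qquad c_i \in (x_{i-1},\, x_i)
```

The first equality is a telescoping sum (summing the dF's), and the right-hand side is a Riemann sum converging to \int_a^b F'(x)\,dx.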

The reason for constructing the function F should be phrased in English: every continuous function has an antiderivative. Some sense should be given of how remarkable this is, or perhaps how successful the calculus is in "clearing things up". The class of differentiable functions is so much smaller than the class of continuous functions -- the fundamental theorem requires reflection on this fact.

In the Wikipedia page for differentiation they say that a function is differentiable on an interval if it is differentiable at every point of the interval. Since there is no mention of left- or right-hand derivatives, it is impossible for any function defined on a closed interval to be differentiable at the endpoints -- the two-sided limit defining the derivative does not exist there. Thus F'(x) = f(x) only for x in (a,b) and not for all x in [a,b].

The intermediate value theorem is very subtle and totally irrelevant to the fundamental theorem; it's a shame that lots of authors drag it into the proof. All that is needed for a proof is continuity of the integrand and positivity of the integral (that is, the fact that the integral of a positive function is positive). Can you see how to clean up the proof? Take it as an exercise!

I agree 100% with your last remark; the numerous authors of the numerous calculus books have been copying from each other for many decades without giving enough thought to the subject. As a result, calculus is in very bad need of renovation today. You can visit my web page at to see some ideas on how to proceed. michaelliv 16:46, 26 May 2006 (UTC)

F'(x) = f(x) = G(x)[edit]

Is there a legitimate, compelling reason why f(x) is used for F'(x), instead of (e.g.) G(x)—or even just use F'(x), itself? I find F(b) - F(a) = \int_{a}^{b} f(x)\,dx can be quite confusing to an unsure reader, especially since above it is given f'(c) = \frac{f(b) - f(a)}{b - a}. It should be either F(b) - F(a) = \int_{a}^{b} G(x)\,dx or, preferably, F(b) - F(a) = \int_{a}^{b} F'(x)\,dx. ~Kaimbridge~ 23:10, 25 November 2005 (UTC)

I find the current notation very acceptable. It suggests that f and F are somehow related, in this case F'=f. This is even more useful when you have two functions, f and g. Calling their antiderivatives F and G is more helpful than calling them H and L. No? :) Oleg Alexandrov (talk) 23:25, 25 November 2005 (UTC)

I'm not saying it should be some other random letter—but Cos(x) = cos(x) = sin'(x) = Sin'(x), so someone just first attempting to understand all this may think F'=f means F'=F, especially since f did equal F earlier in the article (f'(c) = \frac{f(b) - f(a)}{b - a}). I just think there should be a uniform F/F' assignment throughout the whole article—if you really want to identify an integrand as a different function (and personally I think the integrand should stay identified as a derivative), then let F'=G, not f (or, at the very least, change it to G'=g, to at least eliminate confusion with the earlier assignment of f to F and f' to F'). ~Kaimbridge~ 01:25, 26 November 2005 (UTC)

It is false that Cos(x)=cos(x)! (Just kidding. :) In math nobody uses Cos with big C, only mathematica does. I guess you need to get used to math notation. :) Oleg Alexandrov (talk) 02:26, 26 November 2005 (UTC)

Sure they do, whenever it is presented at the beginning of a sentence! P=) ~Kaimbridge~ 13:48, 26 November 2005 (UTC)

I think everywhere in the article f was a function, and F its antiderivative. No? Oleg Alexandrov (talk) 02:29, 26 November 2005 (UTC)

Ah, okay, but as I understand it F(x)'s derivative is F'(x) or G(x) (no, I'm not fixated on G, just that it is the letter after F. P=), and the antiderivative of F'(x) is \int G(x)\,dx=\int F'(x)\,dx=F(x)+C or, if you want to use f(x) as the function, \int f(x)\,dx=\int e'(x)\,dx=e(x)+C. I'm not saying it is wrong as given in the article (in fact, most articles/papers do present it this way), it's just that, IMHO, it creates a lot of unnecessary ambiguity (no, not to someone who already understands it—in which case it would be just preaching to the choir—but to someone who is first attempting to understand it....I'm saying that from previous, personal experience! P=) ~Kaimbridge~ 13:48, 26 November 2005 (UTC)

I don't understand you. What is that G and e and all? I find this article perfectly clear notation-wise, and I think your suggestions are going to make it less so. Maybe you should get more familiar with usual math notation. :) Oleg Alexandrov (talk) 23:41, 26 November 2005 (UTC)

All I'm suggesting is that the article provide a consistent, generalized notation flow: F'(x) = G(x), F''(x) = G'(x) = H(x), etc.; and f'(x) = g(x), f''(x) = g'(x) = h(x), but there is nothing that says f has to be related to F, g to G, or h to H, so why introduce that ambiguity in this type of elementary, concept-explaining article—I find it particularly ambiguous and potentially confusing that f starts out as F' (in the Formal statements), continues on down through Part I of Proof, where F'(x_1) = \lim_{c \to x_1} f(c) is introduced, then—at the beginning of Part II—F(b) - F(a)\,\! is introduced, followed down a little further by the statements f'(c) = \frac{f(b) - f(a)}{b - a} and f'(c)(b - a) = f(b) - f(a) \,\!:

So F(b) - F(a)\,\! and f(b) - f(a)\,\! are commingled together. If it is to be consistent, then it should be f''(c)=F'(c)=\frac{F(b)-F(a)}{b-a}\,\!, though f''(c) is misleading since it suggests \frac{f'(b)-f'(a)}{b-a}\,\!, when it is actually \frac{f'(c_b)-f'(c_a)}{\frac{1}{2}(b-a)}\,\! (\mbox{where }m=\frac{a+b}{2}, \ \mathbf{F(b)-F(m)=f'(c_b)[b-m]}\,\! \mbox{ and }\mathbf{F(m)-F(a)=f'(c_a)[m-a]})\,\! (I think I have that set up right! P=), which isn't very helpful as an introduction.

It should just be F'(c) = \frac{F(b) - F(a)}{b - a}

I do understand what is meant, I'm just playing devil's advocate, approaching it from the view of someone who doesn't and is attempting it from scratch—though, don't worry, I'm not looking to mess with the article (at least now), as I have other projects in progress. P=) ~Kaimbridge~ 15:16, 27 November 2005 (UTC)

I fixed the occurence you mentioned. Is there any inconsistency anywhere else? Thanks. Oleg Alexandrov (talk) 21:10, 27 November 2005 (UTC)

A definite improvement! P=) ~Kaimbridge~ 15:26, 28 November 2005 (UTC)

Using F'(x)=f(x) is simply commonly accepted notation amongst the mathematical community, akin to arcsin(x) taking values in [-π/2, π/2]. --Tiberious726 01:04, 20 January 2006 (UTC)


What is "t"?[edit]

In the first equation of the "formal statements" section, the variable "t" appears out of nowhere; it is neither defined nor used in any further line. To me it appears that the variable should be "x", and that the equation works that way... either way it is very unclear.

Sure it is—"t" is defined as "time" in the previous "intuition" section and is used in the next subsection ("Corollary"), and it is the variable ranging between a and x. However, in the "intuition" section, shouldn't "v(t)" be "x(t)"? ~Kaimbridge~ 14:51, 15 December 2005 (UTC)
There is both v(t) and x(t) there from what I saw. And the derivative of position, x(t), is the velocity, v(t). So everything looks right to me, I hope. Oleg Alexandrov (talk) 19:41, 15 December 2005 (UTC)

I'm with the initial question. I took calculus 7 years ago and never used it. Now I'm in economics and find many equations referring to derivatives. I have looked all over to try to find something to re-explain the basics of how to find a derivative, and everything I have found assumes I already know. I found it very easy to learn originally but time (and a head injury during that semester) has erased it. I desperately need a refresher. I really just need a very basic explanation - explained like I'm 6. Any help - please? Diane

You'll probably be able to understand this, but a 6-year-old won't, because there's no easy way to explain it. A derivative is the slope of the tangent line at a point. A tangent line is a line touching the curve at exactly that one point, though it can hit other points when the function is similar to a wave. The derivative is given by the limit of [f(x + Δx) - f(x)]/Δx as Δx approaches 0. You should know this type of thing. Here's the easier way to find a derivative: dy/dx of c·x^n is c·n·x^(n-1).
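The two rules in the reply above can be checked against each other numerically (editor's sketch; the function f(x) = 4x^3 is a hypothetical example): the difference quotient with a small Δx should approach what the power rule predicts.

```python
def difference_quotient(f, x, dx=1e-6):
    """Approximate derivative of f at x via [f(x + dx) - f(x)] / dx."""
    return (f(x + dx) - f(x)) / dx

# f(x) = 4x^3: the power rule gives f'(x) = 12x^2, so f'(2) = 48.
approx = difference_quotient(lambda x: 4 * x ** 3, 2.0)
print(approx)  # close to 48
```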


This page states that the fundamental theorem was first proved by James Gregory. As I recall, he only proved a special case of the second fundamental theorem, and doesn't mention the first at all. While Barrow certainly influenced Newton's thinking on this matter, I think we should certainly mention Newton and Leibniz in this article. Grokmoo 15:59, 20 December 2005 (UTC)

I don't know the history, but I've always seen this theorem called the Leibniz-Newton theorem. I think there should be a mention of them, but I don't know enough to put it in myself AdamSmithee 22:59, 18 February 2006 (UTC)
I agree. In my business calculus class, we are learning that Isaac Newton and Gottfried Wilhelm Leibniz developed the theorem independently.

James Stewart[edit]

An authoritative author? Are you sure?

One should cite the relevant work by Barrow, and not that of Stewart. But I couldn't find it. Does Stewart give a reference? I don't have access to his book.

"detrimental theorem of calculus" ?[edit]

I've never heard of anything about people arguing to call something a "more-apt" "detrimental theorem of calculus"... Is this some kind of vandalism or other such thing? A google search for this text results in only the Wikipedia match, so it sounds extremely fishy to me. 03:25, 7 February 2006 (UTC)

Weird sentence indeed. I cut it from the article. Oleg Alexandrov (talk) 04:07, 7 February 2006 (UTC)

FORMULA needed[edit]

Shouldn't the formula just be stated outright before the proofs like on the top?

About the second theorem[edit]

For the books in Hong Kong, the second fundamental theorem, which is stated here as saying that the process of antidifferentiation can be used to calculate definite integrals, is simply called the "fundamental theorem of calculus". Should a remark be added? --Deryck C. 09:07, 16 January 2007 (UTC)

Someday, science students will learn that the -DERIVATIVE- should have been designated as the "fundamental theorem" of calculus.

The connection atop the article: "differentiation and integration are inverse operations"[edit]

It's unwise to exaggerate the relationship of the derivative with the integral. The Fundamental Theorem (in all its forms) is the beginning and the end of their relationship; everything else is intuition. --VKokielov 19:29, 27 July 2005 (UTC)

I agree. Differentiation and integration are in no way inverses. Differentiation and antidifferentiation are inverses. He Who Is 23:51, 7 June 2006 (UTC)
"Differentiation and antidifferentiation are inverses" That's a tautology, isn't it? Hardly worthy of being called a theorem. I think the point of the FTC is that it shows that antidifferentation and integration are closely related, not that differentiation and antidifferentiation are closely related. (The later statement is just obvious, even linguistically; it doesn't need a theorem.) That's the whole point of the FTC. People knew how to calculate derivatives for centuries before the FTC. And people knew how to calculate integrals for centuries before the FTC. But before the FTC, nobody knew that the two were "inverses" of one another. (talk) 20:06, 28 August 2008 (UTC)

Only an integral with variable upper limit is invertible (with some restrictions)[edit]

I don't completely agree with He Who Is. If we define antidifferentiation as the process of computing and arbitrarily selecting one of the possible antiderivatives of a function, differentiation and antidifferentiation are not inverses. They are quasi-inverses. Neither is differentiation the inverse of antidifferentiation, nor vice versa. For one of them to be the inverse of the other, they should be able to reverse each other (and for that, they need to be bijective, which they are not).

Four important notes:

  1. Antidifferentiation cannot be defined differently, because antidifferentiation and antiderivative are strictly related concepts (as well as differentiation and derivative, they represent, respectively, the operation and its result), and all authors agree about the definition of antiderivative.
  2. According to most authors, antidifferentiation (as defined above) is a synonym of indefinite integration. The article about antidifferentiation currently espouses this approach. Other authors say that indefinite integration is a process which yields an infinite set of antiderivatives, rather than an arbitrarily selected element of that set.
  3. The definite integral with variable upper limit, used in the first part of the Fundamental Theorem, non-arbitrarily selects a specific antiderivative (the antiderivative with value zero at t = a). Thus, it is something more than an indefinite integral. Due to the fixed lower limit (a), it yields a specific antiderivative, rather than any antiderivative, and as a consequence it might be regarded as bijective and invertible, provided that its codomain is restricted to its range R (int: X → R), and of course the domain of the derivative is restricted to R (der: R → X); otherwise the integral is not surjective and as a consequence cannot reverse the derivative.
  4. Even differentiation is not invertible! Since it loses information about the constant term of its operand, it is not injective (information-preserving), and as a consequence it is neither bijective nor invertible.

Paolo.dL 16:38, 8 August 2007 (UTC)

Arguing about whether they are inverses or quasi-inverses is splitting hairs, and especially, in the lead. I personally would prefer "inverses", and as well as more intuitive explanation in the lead. This is not a textbook! Proofs can be safely omitted from this article, but the meaning of the theorem, call it "intuition" if you will, has to be explained. Arcfrk 01:49, 9 August 2007 (UTC)

I agree that we should focus on the explanation of the theorem, and that the difference between quasi-inverse and inverse is not important in this context. And I so wholeheartedly agree that, even before reading your comment (which I had not noticed yet because you originally inserted it above my 8 August comment), I removed the note about inverses and quasi-inverses, simplified the introduction (see next section), and wrote the following comment. However, although this doesn't matter anymore, I don't agree about using incorrect mathematical terminology in an article about mathematics. Paolo.dL 12:47, 9 August 2007 (UTC)

I don't think that this topic should be discussed in the article, simply because the theorem does not deal with it. It just says that the indefinite integral is reversible (can be undone) by differentiation. Reversible (i.e. injective) is not synonymous with invertible (i.e. injective and surjective, i.e. bijective).

As explained above, neither the indefinite integration nor the differentiation comply with the formal definition of inverse operation or function, with respect to each other. Paolo.dL 12:06, 9 August 2007 (UTC)

Differentiation, Antidifferentiation, and definite integration[edit]

I believe that the theorem (in its two parts) actually describes the relationship between three concepts: differentiation, antidifferentiation (i.e. definite integration with variable domain), and definite integration with fixed domain:

It is important to realize that the theorem uses two extremely different kinds of integral (variable domain and fixed domain). The integral used in the first part of the theorem (definite with variable upper limit) is more similar to an indefinite integral than to a "standard" (i.e. fixed domain) definite integral. Please see note 1 in the article, and my 8 August comment in the previous section.

I rearranged the introduction, according to this rationale, but respecting as much as I could the original text. Now, I believe it appears simpler to understand, and at the same time less approximative. Please, let me know if you like it.

Paolo.dL 11:28, 9 August 2007 (UTC)

Donald Duck Image[edit]

I removed this from the article.

Image:Donald_Duck_Fundamental_Theorem_Calculus.jpg The theorem is perhaps the most well known Fundamental theorem in popular culture. To impress the judges in a contest, Donald Duck figure skates the theorem on ice. The theorem is written differently and is slightly incorrect: one of the equal signs should have been a minus sign. Donald used f'(x) instead of f'(x)dx, which is usually frowned upon, but not improper. Nothing in the notation indicates the vector analysis version either. The correct version for real functions should have been:  \int_b^a f'(x) \mathrm d x = f(a)-f(b)

Here is my reasoning. The main issue is that this is a nonfree image, and nonfree images are only permitted in Wikipedia articles when they make a significant addition to the article. In this case, the image is in a section on the statement of the fundamental theorem - which the image does not illustrate, and which can be conveyed perfectly via words alone. So the image doesn't make a significant addition to that section. The image might be appropriate for a section on "fundamental theorem in popular culture" except that such sections are deprecated, for good reason in my opinion, and the gist of the cartoon can be conveyed fine by a sentence.

Basically, as I see it, this image is a humorous sidebar for the article. It isn't bad for that role, and if the image were free I wouldn't worry about it (although the caption is a little too informal for my taste). But the goal of Wikipedia is to build a freely reusable encyclopedia, and so the policy doesn't permit nonfree images to be used in this way. — Carl (CBM · talk) 14:10, 20 August 2007 (UTC)

Having added the picture originally, my views are hardly neutral, and I think it would be a shame to exclude it. If it might be appropriate for a section on "fundamental theorem in popular culture" it ought to be appropriate here as well.
The picture illustrates two important things 1) that the theorem is known in popular culture even though 2) it isn't written as it should. There is a knowledge of existence, but not a knowledge of exact content.
Simply mentioning that the theorem exists in popular culture is a poor substitute; it wouldn't illustrate it properly. Should the picture of Bart Simpson be removed from the article and replaced with the text "yellow cartoon boy"? Aastrup 03:42, 23 August 2007 (UTC)
Putting a picture of Bart Simpson on the chalkboard article would have the same problem as this. — Carl (CBM · talk) —The preceding signed but undated comment was added at 03:46, August 23, 2007 (UTC).
And you think it would be appropriate in the "chalkboards in popular culture" article?
I don't agree with the comparison. What's written on a board (e.g. Bart's chalkboard jokes) seldom has anything to do with the board itself, but Donald Duck skating a particular theorem, and skating it incorrectly, provides a useful illustration of how well the theorem is known, and the Fundamental_theorem_of_calculus must be one of the only mathematical theorems in the world that ever appeared in a mainstream comic. Being a single-panel low-resolution picture, it can't harm the sale of the original comic. Aastrup 04:27, 23 August 2007 (UTC)
Lots of math shows up in comics, in my memory. The fact that he states it wrong is not exactly evidence that the theorem is well known, and in any case we would need an actual published source to make that inference. I appreciate that this is a cute image, but our goal is to produce an encyclopedia with as little nonfree content as possible, and this image is not needed to understand the fundamental theorem or its notability. — Carl (CBM · talk) 05:12, 23 August 2007 (UTC)
A pity, I really liked that picture. But liking it isn't a valid reason for why it should stay. I'd love to hear some comments from others about this; am I the only one who thinks the picture should stay? —The preceding unsigned comment was added by Aastrup (talkcontribs) 05:47, August 23, 2007 (UTC).
That's a good idea. You could ask people who know math at Wikipedia talk:WikiProject Mathematics or people who are familiar about our image policies at this talk page. I don't know how much overlap there is between the two groups. — Carl (CBM · talk) 13:25, 23 August 2007 (UTC)


Isn't the last corollary exactly the same as the first theorem, except written slightly differently?

The first theorem says

'if we take the integral from a to x of f, then differentiate it with respect to x, then we get the original function f(x)'

This is exactly what the last corollary listed states, except it does it in one line as opposed to two. Taras 96 (talk) 00:54, 29 November 2007 (UTC)

I agree, the corollary is beyond pointless. The article defines F(x) = \int_a^x f(t)\,dt, and has FTC 1 state F'(x) = f(x)\, (or in Leibniz notation, \frac{d}{dx} F(x) = f(x)). The corollary then takes the profound leap of substituting in F, to obtain \frac{d}{dx} \int_a^x f(t)\,dt = f(x). Eebster the Great (talk) 01:08, 14 January 2009 (UTC)
The corollary is quite important: it says that every primitive of f(x) is given as an indefinite integral plus a constant. This is not strictly contained in either of the two parts of the theorem. siℓℓy rabbit (talk) 03:53, 15 January 2009 (UTC)
No, it does not say this. It says that the derivative of a function's accumulation function is that function. It does not say that every antiderivative is said accumulation function, plus a constant. A proof of this fact is more easily stated as "all antiderivatives of a function differ only by a constant," and requires absolutely no use of the FTC; rather it uses the mean value theorem and basic algebra and facts about antiderivatives. If you could restate the corollary so it says what you are claiming, it would be useful. But please, look at it, and tell me where it says anything like what you are claiming. Eebster the Great (talk) 22:16, 16 January 2009 (UTC)
It says exactly this. That every antiderivative F(x) of f(x) is given by an indefinite integral (as you call it an "accumulation function", though this term is strange to me) plus a constant. Indeed, it is a corollary to either the first part of the theorem (by invoking the mean value theorem, as you suggest) or the second part of the theorem, by setting b=x. siℓℓy rabbit (talk) 22:31, 16 January 2009 (UTC)
I have replaced the statement of the Second fundamental theorem with the version in the cited source (which was originally the statement of the corollary). I have moved the old statement into the corollary section. siℓℓy rabbit (talk) 22:40, 16 January 2009 (UTC)
Thank you, this corollary is much better. I no longer have any problem with it. Eebster the Great (talk) 07:09, 18 January 2009 (UTC)

This corollary should be removed, or it should at least be placed after the second part of the theorem. A function that is continuous on [a,b] is automatically integrable on [a,b], so it's rather redundant for this to follow the first part when it follows immediately from the second part. Additionally, its proof is not worded very well. The fact that two antiderivatives differ by a constant follows from the Mean Value Theorem. This is by no means obvious and requires justification. —Preceding unsigned comment added by (talk) 11:25, 7 November 2010 (UTC)
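For what it's worth, the "two antiderivatives differ by a constant" fact discussed above is easy to sanity-check numerically. In this sketch of mine (hypothetical, not from the article), g = sin is a known antiderivative of cos, and g(x) minus the accumulation integral from 0 should come out to the same constant (here g(0) = 0) at every sample point:

```python
# Hypothetical sketch: an antiderivative minus the accumulation integral
# should be the same constant at every point (here the constant is 0,
# since sin(0) = 0).
import math

def f(t):
    return math.cos(t)

def accum(x, n=200000):
    """Left-Riemann-sum approximation of the integral of f from 0 to x."""
    h = x / n
    return sum(f(i * h) for i in range(n)) * h

g = math.sin  # a known antiderivative of cos

consts = [g(x) - accum(x) for x in (0.5, 1.0, 2.0)]
print(consts)  # all entries near 0.0
```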

Numbering of the theorems[edit]

Most pages I've seen have numbered the theorems opposite to how they're numbered in this Wiki article, most notably Wolfram. Is there an accepted labelling scheme?Taras 96 (talk) 00:56, 29 November 2007 (UTC)

Discoverer of FTC[edit]

It is agreed by historians that Barrow discovered the FTC, but elementary mathematics books are not written by historians and so the history is simplified. Instead of acknowledging the less glorified figures, the main developers of calculus (Newton and Leibniz) are given credit for everything, including the fundamental theorem. This is a historical fib, which can be easily corrected on Wikipedia, since an article is not confined to the domain of expertise of any one person. It is perfectly possible for a person who knows the history to edit the math pages for historical accuracy, just as for a person who knows mathematics to edit a history page for mathematical accuracy.

The fact that Barrow discovered the FTC does not diminish Newton's or Leibniz's claim to the discovery of the calculus. They were the ones who extended Barrow's limited results to a full system capable of arbitrary generalization. That is why the bulk of the credit goes justly to them. But there is no need to lie about it, even if that lie is in many mathematics books (but in no serious work on the history of mathematics). Likebox (talk) 19:49, 16 February 2008 (UTC)


First Fundamental Theorem of Calculus.

Let f be a continuous real-valued function defined on a closed interval [a, b]. Let F be the function defined, for all x in [a, b], by

F(x) = \int_a^x f(t)\, \mathrm dt

Then, F is differentiable on [a, b], and for every x in [a, b],

F'(x) = f(x)\,.

Here you work with closed intervals, but mathworld [1] works with open intervals. I think there is a mistake. Randomblue (talk) 04:35, 17 April 2008 (UTC)

I don't think there is any mistake. The theorem is true both with open and with closed intervals.
I think the version with closed intervals is a bit more general, since any open interval can be written as a union of an increasing sequence of closed intervals, and by applying this theorem to any closed interval you end up with the antiderivative on the entire open interval. Oleg Alexandrov (talk) 05:03, 17 April 2008 (UTC)
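A quick numerical illustration (my own sketch, not a proof) of the theorem as stated: with f(t) = t^2 on [0, 1], a difference quotient of F(x) = \int_0^x f(t)\,dt should reproduce f(x) at interior points of the interval.

```python
# My own numerical sketch (an illustration, not a proof) of the first
# part: with f(t) = t**2, a central difference quotient of
# F(x) = integral of f from 0 to x should reproduce f(x) inside (0, 1).

def f(t):
    return t * t

def F(x, n=100000):
    """Midpoint-rule approximation of the integral of f from 0 to x."""
    h = x / n
    return sum(f((i + 0.5) * h) for i in range(n)) * h

def F_prime(x, eps=1e-4):
    return (F(x + eps) - F(x - eps)) / (2 * eps)

for x in (0.25, 0.5, 0.9):
    print(F_prime(x), f(x))  # each pair should agree closely
```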

FTC Pt 1 in words?[edit]

Copied from the article:

In words, the value of the definite integral \int_a^x f(t)\, dt, viewed as a function of x, is an antiderivative of f.

Is it really a definite integral? It also leaves out the conditions of the start of the theorem. Does it really make the statement above it any clearer? Cheers, Ben (talk) 04:06, 12 June 2008 (UTC).

First/Second Fundamental Theorem[edit]

Mathworld seems to invert the names of the two theorems. The First Fundamental Theorem should be the one that allows one to calculate definite integrals by using antidifferentiation.

Also, the course notes for the MIT single-variable Calculus course (18.01, by Prof. Ben Brubaker) seem to agree with Mathworld. Unfortunately, I cannot find other resources for double-checking this.

Could someone please verify that the names given on Wikipedia are correct?

--cloudguitar (talk) 01:13, 19 October 2008 (UTC)

I can verify that the names are correct, based on Apostol's book (a standard text from when I was a lad). However, there doesn't seem to be a standard order. For some authors, the first theorem is our second, and vice-versa. For other authors, there is only one fundamental theorem of calculus, which subsumes both of ours. The article should probably point these issues out. siℓℓy rabbit (talk) 04:26, 19 October 2008 (UTC)
I just checked a few books too. The same as this article: Stewart, Thomas. In the opposite order: Anton, Grossman. Rudin doesn't give the first part (in this article) a name, and just calls the second part the Fundamental Theorem of Calculus. Ben (talk) 04:46, 19 October 2008 (UTC)

Proof of the First Part[edit]

Hi, from my amateur perspective this proof appears to be a circular argument. At one stage it offers the Intermediate Value Theorem for Integration as justification, but does the proof of this theorem not use the Fundamental Theorem itself? I'm not suggesting that the theorem itself is wrong! But I have been unable to find a proof of the Intermediate Value Theorem for Integration which does not use the Fundamental Theorem of Calculus. Could someone please help me with this one? Thanks. Lyndona6 (talk) 18:04, 2 November 2008 (UTC)

Terminological confusion[edit]

In the intro, the article states " . . . the first fundamental theorem of calculus, shows that an indefinite integration can be reversed by a differentiation." Not only is this poorly worded (there is no reason to use the articles "an" and "a"), but it actually misses the theorem. "Indefinite integral" redirects to "Antiderivative," and the page states that they are synonyms. Therefore this first line literally states that FTC1 shows that antidifferentiation and differentiation are inverses. What FTC1 actually states is that antidifferentiation can be reversed by DEFINITE integration with a variable upper limit. There is a huge difference.

On a separate note, the statement of FTC1 says:

"Let f be a continuous real-valued function defined on a closed interval [a, b]. Let F be the function defined, for all x in [a, b], by F(x) = \int_a^x f(t)\, dt\,. Then, F is continuous on [a, b], differentiable on the open interval (a, b), and F'(x) = f(x)\, for all x in (a, b)."

It is obvious that F is differentiable on (a,b) if F'(x)=f(x) on (a,b). The fact that they are equal implies that they are defined, and therefore by definition F is differentiable. Furthermore, the fact that it is differentiable already implies it is also continuous, so that line can be removed as well. "Continuous real-valued" is redundant, because the definition of continuity implies that the function is real-valued on [a,b]. The line "Let F be the function defined, for all x in [a,b], by . . . " can be reduced to "Let F = . . . ;" the interval is irrelevant, since we are only considering that interval anyway. Also, the first line should say "Let f be a function continuous on a closed interval [a,b]," since as it is stated it must be continuous everywhere and only defined on [a,b], which is nonsensical.

My proposal:

Let f be a function continuous on a closed interval [a, b], and F(x) = \int_a^x f(t)\, dt\,.
Then, F'(x) = f(x) \forall x \in (a,b)\,.

Obviously it could be made more or less symbolic if necessary.

In my mind, using Leibniz notation is even more streamlined, since you can avoid defining F as a function altogether, and combine the derivative and the integral into a single equation: f continuous \forall x \in [a,b] \to \frac{d}{dx} \int_a^x f(t)\, dt\, = f(x)\, \forall x \in (a,b). Truly, the only hypothesis to FTC is continuity on some interval, and this equation makes that clear (Let f be continuous, then conclusion). What does everybody else think? Eebster the Great (talk) 04:14, 11 November 2008 (UTC)

It has been two months; is there any particular convention to follow or reason to use a particular definition? I think my objections to the current formulation are legitimate, and if they are, should I change the page myself? I would like some feedback before I change the actual statement of the theorem itself. Eebster the Great (talk) 06:10, 12 January 2009 (UTC)
In regards to your objections, first of all, there is a difference between indefinite integrals and antiderivatives, and I think I shall have a look shortly at that article to see if this distinction can be cleared up. But the shortcomings of that article are no excuse to perpetuate the same misinformation here. As for your proposal, "real-valued" is important here, and the hypothesis should not be omitted: the theorem is about real-valued functions on [a,b]. Although there are generalizations for finite-dimensional vector-valued functions as well, it is certainly not true of all "continuous functions" on [a,b] that one could dream up. For instance, for functions with values in a Banach space, the meaning of the integral even becomes ambiguous. As for omitting the conclusion that F is differentiable, most mathematical convention is to state a conclusion rather than allow it to be intuited from an identity. That is, it is proper mathematical style to first state that F is differentiable before using the derivative. So I'm sorry to say that I disagree with the changes you are proposing. siℓℓy rabbit (talk) 04:03, 15 January 2009 (UTC)
"Indefinite Integral" redirects to "Antiderivative," and the article defines them as synonyms. Stewart defines them as synonyms, as does, I believe, Leithold, whose proof is used in this article. Webster's dictionary defines "indefinite integral" as "any function whose derivative is a given function" and "antiderivative" as "INDEFINITE INTEGRAL." I really don't know what distinction could be made; one side of the FTC clearly deals with definite integrals, and the other with antiderivatives, a.k.a. indefinite integrals.
As for "real-valued," I suppose I was making the assumption that [a,b] implied a real interval, which perhaps isn't true, so you can leave that in. However, stating "F is differentiable and its derivative is equal to . . . " just seems unnecessarily wordy. I don't think allowing its existence to be "intuited from an identity" is unreasonable. For example, when I state ax^2 + bx + c = 0 \to x = \frac{-b \pm \sqrt{b^2 - 4ac}}{2a}, I don't say "and x exists." Generally when you state an equality, the existence certainly is inferred, not stated. I wouldn't say "Let f(x) = \frac{\sin x}{x}, then \lim_{x \to 0} f(x)\, exists, and \lim_{x \to 0} f(x) = 1\,", but that's exactly how you have phrased this. It makes the theorem look overly complex, and harder to instantly grasp than a simple formula, while adding no additional information. Eebster the Great (talk) 22:36, 16 January 2009 (UTC)
[a,b] is the domain of f, and the set of reals is the codomain of f. These are different things. Also, by "indefinite integral" is often meant the function of x produced by integrating another function only up to x. See, for instance, the text by Tom Apostol (which is also referenced in the article). At any rate, it should have been clear from the context what my meaning was, and I still believe that this potential confusion should be sorted out in the antiderivative article. siℓℓy rabbit (talk) 23:10, 16 January 2009 (UTC)

f only needs to be continuous at c, not the whole interval for F'(c) to equal f(c). —Preceding unsigned comment added by Adrionwells (talkcontribs) 23:43, 24 February 2009 (UTC)

In which case, we would need to add the hypothesis of "integrable" (whatever that means). This issue bothers me as well, but it should be approached with caution. In particular, the notion of "integrable" generally means different things depending on the background of the reader. For elementary school students, it means continuous. For high school students, it means continuous a.e. For college students, it means Lebesgue integrable. So there is a hierarchy of theorems: one for (everywhere) continuous functions, one for a.e. continuous functions (at a point of continuity), and another for Lebesgue points of integrable functions. Sławomir Biała (talk) 04:31, 25 February 2009 (UTC)
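The pointwise version mentioned above (F'(c) = f(c) wherever f is continuous at c, even if f is discontinuous elsewhere) is easy to illustrate numerically. A hypothetical sketch of my own, with a step function that jumps at t = 0.5:

```python
# Hypothetical illustration: f jumps at t = 0.5 but is continuous
# elsewhere on [0, 1]; the difference quotient of its accumulation
# function still recovers f at points of continuity.

def f(t):
    return 0.0 if t < 0.5 else 1.0

def F(x, n=100000):
    """Midpoint-rule approximation of the integral of f from 0 to x."""
    h = x / n
    return sum(f((i + 0.5) * h) for i in range(n)) * h

def F_prime(c, eps=1e-3):
    return (F(c + eps) - F(c - eps)) / (2 * eps)

print(F_prime(0.25), f(0.25))  # both near 0.0
print(F_prime(0.75), f(0.75))  # both near 1.0
```

At c = 0.5 itself, where f jumps, the accumulation function has different one-sided difference quotients, matching the failure of differentiability there.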

Logical relationship between first and second parts and the corollary[edit]

The second fundamental theorem as stated in this article follows trivially from the first, so the complicated proof is not necessary. If F'(x) = f(x) in the interval [a,b] then F(x) differs from the integral of f between a and x by a constant. It is not hard to see that the constant is F(a). The corollary is practically the same as the second part, except that x is replaced with b and the term F(b) is moved to the left side.

A stronger form of the first part says that if f is merely integrable on [a,b] and continuous at c then F'(c) = f(c).

A stronger form of the second part says that if f is integrable on [a,b], not necessarily continuous, then if g is an antiderivative of f, the integral of f on [a,b] is g(b) - g(a). This does not follow from the first part. And the first part does not follow from the second, because you only know that continuous functions have antiderivatives because of the first theorem.

I think the first theorem should be strengthened, and the statement that is referred to as the second part in this article should be called a corollary. The second fundamental theorem should be strengthened as well, and it should be mentioned that neither one follows from the other. —Preceding unsigned comment added by Adrionwells (talkcontribs) 04:15, 25 February 2009 (UTC)

This is indeed a missing ingredient in the page. Settling on an appropriate level of generality seems to be the main difficulty. In any event, we should at least be specific about what "integrable" means in context. Sławomir Biała (talk) 04:35, 25 February 2009 (UTC)

"Integrable" should probably mean Riemann integrable.

In the proof of the second part continuity is not essential. The hypothesis can be relaxed with no effect on the proof.

If f is assumed to be continuous, a much simpler proof is possible.

Define G(x) = \int_a^x f(t)\, dt. By the fundamental theorem, G'(x) = f(x), and therefore there is a number c such that G(x) + c = F(x) for all x in [a,b]. Letting x = a gives c = F(a), which means F(x) = F(a) + \int_a^x f(t)\, dt. —Preceding unsigned comment added by Adrionwells (talkcontribs) 01:14, 27 February 2009 (UTC)
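That identity, F(x) = F(a) + \int_a^x f(t)\, dt, can be sanity-checked numerically. A sketch of my own, with the hypothetical concrete choices f = cos, F = sin, a = 1:

```python
# Hypothetical sanity check of F(x) = F(a) + integral of f from a to x,
# with the concrete choices f = cos, F = sin, a = 1.
import math

def integral(a, b, n=200000):
    """Midpoint-rule approximation of the integral of cos over [a, b]."""
    h = (b - a) / n
    return sum(math.cos(a + (i + 0.5) * h) for i in range(n)) * h

a = 1.0
for x in (1.5, 2.0, 3.0):
    lhs = math.sin(x)
    rhs = math.sin(a) + integral(a, x)
    print(lhs - rhs)  # each difference near 0
```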

Some sections sound informal[edit]

From a purely grammatical perspective, some sections of this page seem too informal for an encyclopedic article. For example, lines like "We don't need to assume continuity" should be rephrased in a more mathematically precise manner, perhaps in the third person impersonal. However, the rest of the article is great! Wikiisawesome (talk) 17:31, 10 March 2009 (UTC)

I agree; WP:NOT#HOWTO. --Andrewlp1991 (talk) 00:43, 25 April 2009 (UTC)

Confusion About Continuity Requirement and Question About Use of Mean Value Theorem[edit]

In the general discussion, just before the proofs begin, there is the statement, "Notice that the Second part is somewhat stronger than the Corollary because it does not assume that ƒ is continuous." However, at the beginning of the proof of the second part, it says, "Let f be continuous on the interval [a, b], and let F be an antiderivative of f." So, the proof of the second part assumes the continuity of f in the hypothesis while the general discussion indicates this is not required. Upon reading the proof, I wasn't sure if this was required. (I haven't taken a class that uses calculus in nearly 20 years. So, I don't remember the exact definition of continuous.)

Also, the proof of the second part uses the mean value theorem.

F'(c) = \frac{F(b) - F(a)}{b - a}\,.

Since b - a is in the denominator, it seems that this proof does not apply to the case where a = b in the integral. Is this correct or am I missing something?

You are not wrong, but this statement is usually phrased in books as: there is c such that a < c < b and... So there is no problem actually. --Bdmy (talk) 08:59, 25 April 2009 (UTC)
Actually, now that I think of it, it follows directly from the definition of a definite integral that when a = b, the integral is zero. So, there is no reason to relate the antiderivative to the definite integral when a = b and, therefore, no reason to prove the FTC for the case where a = b. I also notice that the conclusion of the first theorem includes the statement, "Then, F is continuous on [a, b], differentiable on the open interval (a, b)...". So, if a = b, there is no open interval (a, b) on which F could be continuous or differentiable. I guess this confirms that the FTC does not apply to the case where a = b. -MEM, the original author of this question. —Preceding unsigned comment added by (talk) 17:52, 25 April 2009 (UTC)

What red stripes?[edit]

It says in a caption, The area shaded in light red stripes can be computed as h times ƒ(x). I don't see any red stripes. What gives? —Preceding unsigned comment added by (talk) 19:08, 1 June 2009 (UTC)

It's this color:    . It's nearly invisible. The striped rectangle is bounded by the two vertical lines labelled "x" and "x+h", the x axis, and the horizontal line next to "y=f(x)". — Emil J. 10:37, 2 June 2009 (UTC)
I made the colours darker, so now it should be clearly visible. — Emil J. 13:14, 2 June 2009 (UTC)

Derivative of an antiderivative[edit]

The article presently says:

Loosely put, the first part deals with the derivative of an antiderivative

This is a little too loosely put for my taste. Yes, F as defined in the following equation is an antiderivative of f, but that is the very point of the theorem. It is not something we know a priori. It is a bit like saying of the theorem on the sum of the angles of a plane triangle that "this theorem deals with straight angles".

On the other hand, I'm not completely sure what to say instead. I could accept "... deals with the derivative of an indefinite integral", but that will sound meaningless to readers who consider "indefinite integral" to be just a synonym for antiderivative (as does our own antiderivative article). (talk) 00:31, 1 December 2009 (UTC)

I luv you guys.[edit]

I love you guys. This page is so not interesting that nobody notices it has been vandalized. Thank you for the lulz. (talk) 12:20, 1 June 2010 (UTC) —Preceding unsigned comment added by (talk)

Article structure[edit]

The biggest problem with this article is that clear statements of the Fundamental theorem of calculus are hard to find, buried deep in a section that is overly bloated with proofs. I think that at least the two parts of the fundamental theorem should be moved to a new first section, followed immediately by examples, and then by the sections on physical and geometric intuition. The proofs and generalizations should be last. Sławomir Biały (talk) 15:05, 31 October 2010 (UTC)

I changed the word integrable to "Riemann integrable" in the second part, to distinguish from Lebesgue integration. (talk) 23:55, 9 December 2010 (UTC) —particle25

And why did you do that, exactly? It holds for Lebesgue integration all the same, in fact, it also holds for the Henstock–Kurzweil integral, in which case every function that admits an antiderivative is automatically integrable.—Emil J. 11:01, 10 December 2010 (UTC)

resolving the Newton-Leibniz priority dispute in favor of... Barrow[edit]

Childs' book is a fine source, but not every source in the literature believes Barrow was the first to give a complete proof. We might want to make a note of that. Tkuvho (talk) 18:17, 15 December 2010 (UTC)

notation for increments[edit]

I find the notation \Delta x for an increment of the variable x very unsatisfactory, compared with a simple symbol (h or \epsilon or whatever). It is misleading, because it presents the increment as the result of an operator \Delta acting on x, which it is not. Even worse, in mathematics \Delta is universally used to denote the Laplacian operator, which makes the horrible notation \Delta f(x)=f(x+\Delta x) - f(x) even more confusing. --pma 08:06, 1 July 2011 (UTC)

Recent edits of the "Geometric intuition" section[edit]

A few paragraphs were added by Chretienorthodox1 in the "Geometric intution" section, starting with "We can think by another way..." and ending with a note [5] that says "Paragraph "We can think... A'(x) = f(x)." has been written by George Theodosiou."

I don't think this adds anything meaningful to the article. It also doesn't cite any sources. Should this be deleted?

Rachelopolis (talk) 18:38, 4 October 2011 (UTC)

Deleted it.
Rachelopolis (talk) 03:52, 11 October 2011 (UTC)
I don't love the statement "As h approaches 0 in the limit, the last fraction goes to zero, because the height of the excess area goes to zero." Why? Just because the numerator goes to zero doesn't mean the fraction goes to zero, since the denominator is also going to zero. I think the previous "geometric intuition" section was clearer because it avoided that confusion. I'm trying to figure out a way to fix this problem in this version, and I'm drawing a blank. We need to figure out an intuitively appealing way to convince the reader that the "Excess" area really does go to zero, without saying it in a way that's mathematically untrue. — Preceding unsigned comment added by (talk) 17:48, 3 October 2013 (UTC)
Hi, I will try to improve the phrasing. The paragraph now includes this: "In other words, in the diagram, the area of the 'sliver' and the area of the 'rectangle' approach each other as h approaches zero." This quoted statement is not really very helpful, because it's obvious that the area of the sliver approaches zero, and the area of the rectangle also approaches zero, so their areas approach each other. That fact really does not tell us anything about the fraction "Excess/h". I'll see if the language can be improved.Anythingyouwant (talk) 23:41, 3 October 2013 (UTC)
It's getting better. Just made an attempt to clarify it even a bit more, but I'm starting to like it now. Truejim (talk) 16:01, 9 October 2013 (UTC)
I'm glad you like it. I rephrased slightly.Anythingyouwant (talk) 16:45, 9 October 2013 (UTC)
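One way to see that the "Excess/h" fraction really vanishes: compute Excess(h) = \int_x^{x+h} f(t)\,dt - h f(x) numerically and watch the ratio shrink with h. A sketch of my own, with the hypothetical choices f = exp and x = 1:

```python
# My own sketch, with the hypothetical choices f = exp and x = 1:
# the excess of the thin sliver over the rectangle of height f(x)
# vanishes faster than h, so Excess(h)/h goes to 0.
import math

f = math.exp
x = 1.0

def area(a, b, n=20000):
    """Midpoint-rule approximation of the area under f over [a, b]."""
    h = (b - a) / n
    return sum(f(a + (i + 0.5) * h) for i in range(n)) * h

for h in (0.1, 0.01, 0.001):
    excess = area(x, x + h) - h * f(x)
    print(h, excess / h)  # the ratio shrinks roughly in proportion to h
```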

Geometric Intuition[edit]

The geometric intuition section from several months back was much easier to understand, and simpler. Why was it changed? JDiala (talk) 05:53, 13 January 2014 (UTC)

Hello JDiala. In the immediately preceding section above, you can see some discussion about it between myself and another editor. Notice that the title of the section was also changed from "intuition" to "meaning". That's because the fundamental theorem of calculus can actually be proved this way; it's not just an intuitive suggestion.Anythingyouwant (talk) 06:02, 13 January 2014 (UTC)
The previous one wasn't a proof? The result achieved from that was also f(x) = \lim_{h \to 0} \frac{A(x+h)-A(x)}{h}, and it seemed logically coherent to me. Go to the version of this page from April-May and see it for yourself. JDiala (talk) 06:20, 13 January 2014 (UTC)
For a proof, we need the little square at the top of the red column, and we need to show that the square's area divided by "h" approaches zero as "h" approaches zero. That's what the section now tries to do. I think it improves upon the very good version of April-May, and that's why the section has been moved up in the article.Anythingyouwant (talk) 06:46, 13 January 2014 (UTC)

Alternative geometric approach[edit]

Per this recent revert, I'm wondering if we should add a section or subsection titled "alternative geometric approach".Anythingyouwant (talk) 15:33, 20 February 2014 (UTC)

Failed to parse[edit]

I don't know if it's just me or there is an error under "Geometric meaning" section. Firefox says that 2nd and 3rd formulas failed to parse. I don't know what's the problem anyway. Please correct. — Preceding unsigned comment added by (talk) 19:42, 24 March 2014 (UTC)

Thanks, I have reverted the edit that did that.Anythingyouwant (talk) 19:49, 24 March 2014 (UTC)