# Talk:Analytic continuation

WikiProject Mathematics (Rated Start-class, Mid-importance; Field: Analysis)
This article is within the scope of WikiProject Mathematics, a collaborative effort to improve the coverage of Mathematics on Wikipedia. If you would like to participate, please visit the project page, where you can join the discussion and see a list of open tasks.

Ughh. Someone should copy the bottom half of the article on Riemann surfaces into this article. The article on Riemann surfaces should be trimmed to remove the parts about analytic continuation, and just point to this article instead. Who is up for that? linas 19:43, 22 Jan 2005 (UTC)

There should also be a mention of the generalization to meromorphic continuation, since that link currently points to this article. - Gauge 05:16, 20 May 2005 (UTC)

## extending a relation by transitivity

Can someone explain what is meant by "extending by transitivity"? Does it mean transitive closure? Do we also have to do some kind of "symmetric closure" to make this relation symmetric, as it must if we're to make an equivalence relation? -Lethe | Talk 22:33, May 31, 2005 (UTC)

Hmmm - 'not symmetric' is true, as defined. But the transitive closure is symmetric. This may really need a picture. As defined, g ≥ h can happen and the radius of convergence of h can be much smaller than that of g: so the power series for h can't 'reach' far enough. But by taking enough small steps with other functions, one can 'reach' the point about which g is defined. Say for example g is defined at 0, h at 1, and they both represent a function that has a singularity at 1.01. Then the radius of convergence of h is (at most) 0.01. You can expand h about a point like 0.99, and get a slightly larger radius of convergence, 0.015. Expand that about 0.98, say, with radius of convergence 0.02. Continuing this way, one gets the relation h R g where R is the transitive closure of ≥. Charles Matthews 06:20, 1 Jun 2005 (UTC)
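The stepping picture above can be sketched numerically. This is only a toy illustration of the geometry (the single singularity at 1.01 and the choice to step 90% of each radius are my own assumptions, not from the article): each re-expansion center lies inside the previous disc of convergence, yet the chain reaches 0 in finitely many steps because the radius grows geometrically as we move away from the singularity.

```python
# Toy model of the chain of re-expansions described above: a germ centered
# at 1 for a function whose only singularity is at s = 1.01, re-expanded
# repeatedly toward 0.  Each disc's radius is the distance to s, and each
# step moves 90% of that radius, so the new center stays inside the
# previous disc of convergence.
s = 1.01
center = 1.0
steps = 0
while center > 0.0:
    radius = s - center                       # distance to the singularity
    center = max(0.0, center - 0.9 * radius)  # step left, but stop at 0
    steps += 1
print(steps)   # 8 steps for this configuration
```

The distance to the singularity is multiplied by 1.9 at every step, which is why only a handful of steps are needed even though the first radius is a tiny 0.01.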

Hmm. Let me see if I have it. Transitivity fails because b can be in a's radius of convergence, and c can be in b's, but c need not be in a's. Our transitive closure will have a ≥ c when there is an intermediate chain of germs that arrives at c. And I can sort of imagine how we can use a similar chain to see that symmetry holds too, once we have transitivity: a chain of germs getting bigger until they reach a's center. Or something. I'm going to add some words to that sentence. -Lethe | Talk 11:09, Jun 1, 2005 (UTC)
Hmm. I would believe your assertion that I can go back from h to g in some finite number of steps, because the picture you give feels somehow right. However, this fact doesn't look easy to prove, so it is not at all trivial, and I think the sentence should be changed accordingly. "If we extend the relation by transitivity, we obtain a symmetric relation" conveys the impression that merely taking the transitive closure also yields symmetry, which isn't the case in general. Ezander (talk) 10:55, 17 March 2011 (UTC)

## Examples of analytic continuation

The formula

$L(z-1) = \sum_{k=1}^\infty \frac{(-1)^{k+1}}{k}(z-1)^k$

is weird. Perhaps the author wanted to write

$L(z) = \sum_{k=1}^\infty \frac{(-1)^{k+1}}{k}(z-1)^k$

Bo Jacoby 13:47, 10 October 2005 (UTC)

OTOH perhaps the author meant log(z) = L(z-1) = ... and "L(z-1)" is only there to emphasize the fact that z0=1. 194.78.219.157 18:28, 10 October 2005 (UTC)

Perhaps, but that doesn't agree with the notation used in Analytic_continuation#Formal_definition. I dare to correct it. Bo Jacoby 06:39, 12 October 2005 (UTC)

And it's not really an example of analytic continuation, is it (it's a power series with a statement of what its germ is)? It's a concept that I am just trying to understand, and there are surprisingly few examples of it being done. Instead one reads lots of earnest accounts of how an analytic continuation (if it exists) is unique in certain circumstances, and how easy it is to do (or is it?). Nowhere is it said (as far as I can see) under what conditions such a thing exists or how to find one (although it's implied that this is so easy it should be obvious). Fine for a typical maths textbook, but not (surely) an encyclopedia. How about an *example*? Francis Davey 19:02, 11 May 2006 (UTC)

## Global obstructions

The reference to "global obstructions" is not immediately clear to me. The article on "Sheaf Cohomology" doesn't help either. Please be more specific. 9:05, 20 May 2006

I've written this out in a longer explanation. Charles Matthews 11:17, 20 May 2006 (UTC)

## this sentence

"That is because the difference is an analytic function vanishing on a non-empty open set."

This needs further comments, because i don't get it.

I gave it a try. How is it now? -lethe talk + 16:12, 11 June 2006 (UTC)

If by "difference" you mean the "difference of two analytic functions" and by "vanish" you mean "[the value of the difference] approach zero".

Not "approaches zero", but rather "is equal to zero at every point of the domain". -lethe talk + 06:14, 12 June 2006 (UTC)

## Monodromy theorem

I added the monodromy theorem to this page. I tried to state it also within the language of sheaves, but I am not really familiar with those, so maybe some correction is needed. This unsigned comment was added by KennyDC at 00:42, 16 Sep 2006.

## History of analytic continuation

I'm a little curious about something, and I'm wondering if anyone else agrees with me, or has a different perspective. Whichever – I'd like to start a discussion here.

I think that the idea of "analytic continuation" has its roots in real analysis. A good example is the gamma function, Γ(z). The factorials were well understood by guys like Pascal and Newton. Euler developed an integral formula

$\Gamma(z) = \int_0^\infty e^{-t}t^{z-1} \mathrm{d}t\,$

which, in the first instance, can be regarded as an extension of the domain of the factorial function from the non-negative integers into the real numbers > −1, since the improper integral exists if z > 0 is real. Euler's approach can be extended into the half-plane where ℜ(z) > 0, and Euler himself gave a product formula that defines Γ(z) everywhere in ℂ where the function is analytic (that is, everywhere except at the poles of Γ). I'm pretty sure I can come up with some other examples ... the sequence of perfect squares was almost certainly the motivation for thinking about the real-valued function y = x2 a long time ago, for instance.
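That extension past the integral's domain can be sketched in a few lines. This is only an illustrative sketch: it assumes nothing beyond the recurrence Γ(z+1) = zΓ(z), and it uses the standard library's gamma as a stand-in for Euler's integral on the region z > 0 where the integral converges.

```python
import math

def gamma_ext(z):
    """Extend Γ to negative non-integer reals via Γ(z) = Γ(z+1)/z.

    For z > 0 we are in the region where Euler's integral converges,
    so the library function stands in for the integral there."""
    if z > 0:
        return math.gamma(z)
    return gamma_ext(z + 1) / z

# Γ(-0.5) = Γ(0.5)/(-0.5) = -2*sqrt(pi)
print(gamma_ext(-0.5), -2 * math.sqrt(math.pi))
print(gamma_ext(-2.5), math.gamma(-2.5))   # agrees with the library value
```

Of course this recurrence trick is one of the "one fell swoop" extensions mentioned below; the Weierstrassian power-series machinery is what works when no such functional equation is available.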

Another good example is the exponential function, and the natural logarithm ln(x). The natural log, in particular, was well understood as an exclusively real-valued function long before Euler extended the definition of ex into the complex plane.

I've got a book (Edwards, on Riemann's Zeta Function) that talks about some of this, particularly the difference between "analytic continuation" as Riemann understood it, and the essentially Weierstrassian approach to analytic continuation that is the exclusive focus of this article. So anyway, here's my question. Would this article benefit by adding some of these ideas, either in a "Motivation" or "History" section? Or should this article restrict itself to the (quite modern, and very narrow) generally accepted definition of "analytic continuation"? DavidCBryant 17:13, 17 January 2007 (UTC)

Maybe that would be a nice touch. Sometimes, as with the zeta function and its functional equation, a function's domain can be extended in one fell swoop by a nice trick. But this is by far the exception, not the rule.
On the other hand, the Weierstrassian approach of using a sequence of power series centered at various points along a path, so that their regions of convergence successively overlap, works in all cases where analytic continuation is possible.Daqu (talk) 10:07, 23 May 2008 (UTC)

## The example of log(z) seems wrong

The article states:

"Sum_{1 <= k < oo} (-1)^(k+1) (z-1)^k / k is a power series corresponding to the natural logarithm near z = 1. This power series can be turned into a germ g = (1, 0, 1, −1, 1, −1, 1, −1, ...)."

Shouldn't the germ be g = (1, 0, 1, -1/2, 1/3, -1/4, ...) (using the notation for germs described in the paragraph above this statement) ???Daqu (talk) 09:57, 23 May 2008 (UTC)

agreed. done. Bo Jacoby (talk) 12:27, 23 May 2008 (UTC).
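For the record, the corrected germ can be checked mechanically. A quick sketch using exact rationals (the germ notation (z0, a0, a1, ...) is the one from the article's preceding paragraph):

```python
import math
from fractions import Fraction

# Taylor coefficients of log(z) about z0 = 1:
# log(1 + w) = sum_{k>=1} (-1)^(k+1) w^k / k,
# so the germ is (z0, a0, a1, ...) = (1, 0, 1, -1/2, 1/3, -1/4, ...).
coeffs = [Fraction(0)] + [Fraction((-1) ** (k + 1), k) for k in range(1, 6)]
print((1, *coeffs))

# Sanity check: the truncated series approximates log near z = 1.
z = 1.2
approx = sum(float(a) * (z - 1) ** n for n, a in enumerate(coeffs))
print(approx, math.log(z))
```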

## Natural barrier

Whoever pre-empted the term "Natural barrier" for this mathematical usage got it hugely wrong. In normal usage the term is geographical and refers to obstacles to movement, especially of people and especially at modest technological levels. I'm removing the redirect for Natural barrier immediately. I suggest anyone who wants to see the mathematical usage represented should create a disambig page. Philcha (talk) 10:44, 24 May 2008 (UTC)

Philcha, the fact that a term has one established meaning does not suggest whether or not it also has another meaning, especially in a different field. As it happens, "natural barrier" is used in complex analysis to mean a set -- on the boundary of the domain of definition of an analytic function -- beyond which the function cannot be analytically continued.
For better or worse, mathematics has adopted a large number of common terms and assigned technical meanings to them within mathematics.
(It is also true that within math, the term "natural boundary" is considerably more common than "natural barrier", and is used to mean exactly the same thing.)Daqu (talk) 16:36, 10 June 2008 (UTC)

## Significant omission: the connection between analytic continuation and the Riemann surface of a function

There is an intimate connection between analytic continuation of an analytic function, and the Riemann surface of the function. This is not discussed in the article, which strikes me as a significant omission.

The article does include the briefest mention of Riemann surfaces, in the following passage:

"The concept of a universal cover was first developed to define a natural domain for the analytic continuation of an analytic function. The idea of finding the maximal analytic continuation of a function in turn led to the development of the idea of Riemann surfaces."

But an article on analytic continuation requires more than merely the briefest mention of Riemann surfaces. In addition, the first quoted sentence is confusing, since, for example, the natural domain for the sqrt(z) function is the double cover of C* (the complex plane with the origin removed) -- and not the universal cover of C*.Daqu (talk) 02:36, 11 June 2008 (UTC)

## Confusing sentence: Parts of a germ

This sentence: "The base g0 of g is z0, the stem of g is (α0, α1, α2, ...) and the top g1 of g is α0. The top of g is the value of f at z0, the bottom of g." I couldn't find this terminology by a Google search. Also, z0 is referred to as both the 'base' and the 'bottom' of g. Lateralrust (talk) 21:36, 9 March 2011 (UTC)

Seconded. This sentence in particular is confusing, as it doesn't even read like a complete sentence: "The top of g is the value of f at z0, the bottom of g." —Preceding unsigned comment added by 128.163.128.193 (talk) 04:53, 30 March 2011 (UTC)

## Typo?

"This is because F1 − F2 is an analytic function which vanishes on the open, connected domain U of f and hence must vanish on its entire domain. This follows directly from the identity theorem for holomorphic functions."

Is this correct? F1 − F2? It seems that would be zero, which is only an analytic continuation of 0. Was this meant to imply something more along the lines of F1 "slash" F2? Using "and" seems like it would be better and avoid extremely likely confusion. 67.194.8.120 (talk) 00:54, 28 March 2011 (UTC)
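If it helps: F1 − F2 really does mean the pointwise difference of the two continuations, and the theorem's content is precisely that this difference is identically zero, so F1 = F2. A quick numeric sanity check with my own toy example of two different-looking continuations of f(z) = Σ z^k:

```python
# Two formulas that both continue f(z) = sum_{k>=0} z^k beyond |z| < 1:
def F1(z):
    return 1 / (1 - z)

def F2(z):
    return (1 + z) / (1 - z * z)   # algebraically the same function

# Their difference F1 - F2 vanishes wherever both are defined,
# in line with the identity theorem.
samples = [2 + 3j, -5 + 0.5j, 0.25j, 10 - 7j]
print(max(abs(F1(z) - F2(z)) for z in samples))   # ≈ 0
```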