
Wikipedia:Reference desk/Archives/Mathematics/2009 July 17



July 17

Asking people for their names

OK, this is going to be a really odd question.

Say you've got a person whose name you don't know, but you've got a list of names it might be, and you're allowed to ask yes-or-no questions. What is the number of questions you've got to ask to be guaranteed of knowing their name?

For example, say their name might be Mario, Luigi, Peach or Toad. You can ask them whether they're each name in turn, and be sure of their name after the third question ("Are you Mario?" "No." "Are you Luigi?" "No." "Are you Peach?" "No." So they must be Toad). But there is a quicker way:

"Are you Mario or Luigi?"

If YES, you can simply ask them if they're Mario, and if NO you can ask them if they're Peach. Either way you're guaranteed to know their name in only two questions.

If you have seven names, Mario, Luigi, Peach, Toad, Bowser, Wario and Daisy, you can ask them:

"Are you Mario, Luigi, or Peach?"

If they say YES, you can ask them each name in turn and so you're guaranteed to know their name in three questions (Are you Mario, Luigi, or Peach? Are you Mario? Are you Luigi?). If they say NO, you simply do what you did when there were only four names, and so you're also guaranteed to know their name in three questions. So the minimum number of questions you've got to ask to be guaranteed of knowing their name is 3. As far as I've figured it out, the sequence for the number of questions for each number of names is 0 for 1 (if there's only one name you don't have to ask any questions!), 1 for 2, 2 for 3, 2 for 4, 3 for 5, 3 for 6, 3 for 7, 3 for 8, and I think 4 for 9 and 4 for 10, but my brain is totally melting so I can't be sure. What I want to know is: is there an easier way to figure this out, and does this fit with any mathematical sequences already discovered? Vitriol (talk) 11:30, 17 July 2009 (UTC)[reply]
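For concreteness, the worst case of that halving strategy is easy to tabulate; here is a minimal Python sketch (the function name is just illustrative) that reproduces the sequence 0, 1, 2, 2, 3, 3, 3, 3, 4, 4 for 1 to 10 names:

    def worst_case_questions(n):
        # Ask about (roughly) half of the remaining candidate names each time;
        # in the worst case the answer leaves the larger half.
        count = 0
        while n > 1:
            n = (n + 1) // 2
            count += 1
        return count

    print([worst_case_questions(n) for n in range(1, 11)])
    # [0, 1, 2, 2, 3, 3, 3, 3, 4, 4]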

Put it this way: if you are allowed to ask n questions, how many different n-tuples of answers can you get? (from Y,Y,..,Y to N,N,..,N). It's 2^n, right? So no way to detect the name if there are more than 2^n names. But if the names are 2^n, divide them into two equal groups and ask which one has the name, as you were saying, and in n steps you have it (and a fortiori if they were fewer than 2^n). So the answer is log_2(n), rounded (with n now the number of names). It's the base of information theory...--pma (talk) 11:47, 17 July 2009 (UTC)[reply]
Rounded up. BTW, for reference, the algorithm that you are implementing here is known in computer science as a binary search, and it is the most efficient algorithm possible for this scenario, as pma demonstrated. --COVIZAPIBETEFOKY (talk) 11:55, 17 July 2009 (UTC)[reply]
I think "binary search" refers specifically to the case of searching a sorted list. What we have here is the more general dichotomic search. -- Meni Rosenfeld (talk) 15:14, 17 July 2009 (UTC)[reply]
Aw, nobody told me we'd have to learn Greek! —Tamfang (talk) 17:04, 20 July 2009 (UTC)[reply]

Residue of exp(1/sin(z)) at z=0.

Is there a simpler way to find the residue of $e^{1/\sin(z)}$ at z=0 than plugging $1/\sin(z)$ into the power series for e and then doing long division on the first term, as no other term should include a $1/z$ term? It's not too hard or anything but I am just wondering. Thanks StatisticsMan (talk) 17:43, 17 July 2009 (UTC)[reply]

There's the residue formula: $\operatorname{Res}_{z_0} f = \frac{1}{2\pi i}\oint_C f(z)\,dz$, where C is a closed curve around $z_0$ and f is holomorphic everywhere else in the curve. Or if you know the order of the pole is n you can use $\operatorname{Res}_{z_0} f = \frac{1}{(n-1)!}\lim_{z\to z_0}\frac{d^{n-1}}{dz^{n-1}}\bigl((z-z_0)^n f(z)\bigr)$, which is on the residue page.
However, it looks like the singularity you're asking about is an essential singularity, so no residue. Rckrone (talk) 18:15, 17 July 2009 (UTC)[reply]
What? Essential singularities have residues, just like any other isolated singularity. Algebraist 18:19, 17 July 2009 (UTC)[reply]
Oh sorry. I guess I was confused about that. I've only ever seen residues discussed in the context of poles. Rckrone (talk) 18:22, 17 July 2009 (UTC)[reply]
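As a small illustration of the two formulas quoted above, here is a SymPy sketch (assuming SymPy is available):

    import sympy as sp

    z = sp.symbols('z')

    # Simple pole: res_0(1/sin z) via multiplying by z and letting z -> 0
    print(sp.limit(z / sp.sin(z), z, 0))      # 1
    print(sp.residue(1 / sp.sin(z), z, 0))    # 1, SymPy's built-in residue

    # Pole of order n: the formula with the (n-1)-th derivative, here for the
    # double pole of 1/sin(z)**2
    n = 2
    f = 1 / sp.sin(z)**2
    print(sp.limit(sp.diff(z**n * f, z, n - 1), z, 0) / sp.factorial(n - 1))   # 0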
I will answer my own question. Consider the power series expansion of $\frac{1}{\sin(z)^n}$ at z = 0. Since $\sin(z)$'s expansion has a z term but no constant term, only for n = 1 will this power series expansion have a $\frac{1}{z}$ term. So, the coefficient in front of 1/z in the expansion for $e^{1/\sin(z)}$ is just the coefficient for 1/z in the expansion for $\frac{1}{\sin(z)}$. And, now we're just dealing with a simple pole so you multiply by z and take the limit to get 1. This, I consider to be simpler than actually doing the long division on the $\frac{1}{\sin(z)}$ term, though you don't have to go far. StatisticsMan (talk) 19:26, 17 July 2009 (UTC)[reply]


It's not that simple. Say that your argument is OK with $\operatorname{res}_0(1/\sin z)=1$ (it's equivalent to $\lim \sin(z)/z = 1$ as z→0). But I fear that after that, you are confusing $1/\sin(z)$ with $\sin(1/z)$. Indeed, $\operatorname{res}_0(1/\sin(z)^n)$ is 0 for even n, but it is not for odd n. You can easily compute it by means of the Lagrange inversion theorem, using the power series expansion of $\arcsin(z)$. You should find
$$\operatorname{res}_0\!\left(\frac{1}{\sin(z)^{2k+1}}\right)=\frac{1}{4^k}\binom{2k}{k}.$$
From this you get, expanding the exponential in a series of powers of $1/\sin(z)$:
$$\operatorname{res}_0\!\left(e^{1/\sin(z)}\right)=\sum_{k=0}^{\infty}\frac{1}{(2k+1)!\,4^k}\binom{2k}{k}=1.086521097\ldots$$
There are other ways of course; maybe somebody has a more elegant solution.--pma (talk) 20:57, 17 July 2009 (UTC)[reply]
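For what it's worth, that series and the defining contour integral can be compared numerically with a few lines of Python (standard library only; math.comb needs Python 3.8+, and the number of sample points N is an arbitrary choice):

    import cmath, math

    # The series above: sum_k C(2k, k) / (4^k (2k+1)!)
    series = sum(math.comb(2*k, k) / (4**k * math.factorial(2*k + 1)) for k in range(25))

    # The residue as (1/(2*pi*i)) times the integral of exp(1/sin z) over the unit circle,
    # approximated by a plain Riemann sum in the angle theta
    N = 4000
    total = 0j
    for m in range(N):
        z = cmath.exp(2j * math.pi * m / N)
        total += cmath.exp(1 / cmath.sin(z)) * (1j * z) * (2 * math.pi / N)

    print(series)                          # 1.0865210970...
    print((total / (2j * math.pi)).real)   # agrees to many digits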
Note. I tried googling the first digits of the decimal expansion 086521097, and as a matter of fact it corresponds to a telephone number. Actually they say they just moved there and do not have much information about that exp(1/sin(z)), so please do not bother them further. --pma (talk) 08:39, 18 July 2009 (UTC)[reply]
If you wanted to find a simpler representation of this number, surely Plouffe's Inverter would be more effective than Google (though I got no hit in this case). -- Meni Rosenfeld (talk) 20:04, 18 July 2009 (UTC)[reply]
Thanks, I'll bookmark it immediately :-) pma (talk) 21:30, 19 July 2009 (UTC)[reply]
Good work with the series. This is a hypergeometric series, $\,_1F_2\!\left(\tfrac12;\,1,\tfrac32;\,\tfrac14\right)$. Using an identity, this in turn can be rewritten using the modified Bessel function and the modified Struve function as
$$\frac{\pi}{2}\Bigl(I_0(1)\,\mathbf{L}_{-1}(1)-I_1(1)\,\mathbf{L}_0(1)\Bigr)$$
which can be checked to agree numerically with the integral. The hypergeometric form is the simplest form yet of the solution, though, I think. Fredrik Johansson 08:58, 18 July 2009 (UTC)[reply]
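Both closed forms can be checked to high precision with mpmath (assuming it is installed):

    from mpmath import mp, hyper, besseli, struvel, pi

    mp.dps = 30
    print(hyper([mp.mpf(1)/2], [1, mp.mpf(3)/2], mp.mpf(1)/4))
    print(pi/2 * (besseli(0, 1)*struvel(-1, 1) - besseli(1, 1)*struvel(0, 1)))
    # both: 1.0865210970...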
Agree. I see now Maple also returns the hypergeometric form for the series. Talking about a residue of the form $\operatorname{res}_0 f\!\left(\frac{1}{g(z)}\right)$, the above computation admits the following generalization (as I write it, it's not exactly a generalization, but I liked the symmetric form...).
Let $f$ and $g$ be entire functions, with $f(0)=g(0)=0$ and $f'(0)=g'(0)=1$. So $f$ and $g$ have local inverses at 0, respectively $F$ and $G$. Then we have:
$$\operatorname{res}_0\, f\!\left(\frac{1}{g(z)}\right)=\sum_{n\ge 1}\frac{f^{(n)}(0)\,G^{(n)}(0)}{n!\,(n-1)!}.$$
Indeed,
$$\operatorname{res}_0\, f\!\left(\frac{1}{g(z)}\right)=\sum_{n\ge 1}\frac{f^{(n)}(0)}{n!}\,\operatorname{res}_0\!\left(\frac{1}{g(z)^n}\right)$$
because the first series converges uniformly on any small circle around 0. Using Lagrange's
$$\operatorname{res}_0\!\left(\frac{1}{g(z)^n}\right)=[w^{n-1}]\,G'(w)=\frac{G^{(n)}(0)}{(n-1)!},$$
we get the above symmetric expression in $f$ and $G$.
Since it seems quite a standard computation we could maybe mention it somewhere in residue (complex analysis)#Series methods. BTW, is there a direct way of seeing that $1/\sin(z)^{2k}$ is the derivative of some function meromorphic at 0? --pma (talk) 09:50, 18 July 2009 (UTC)[reply]
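As a quick sanity check of that formula on the original example, here is a SymPy sketch (assuming SymPy is available; with f(w) = e^w − 1 every f^(n)(0) = 1, and G = arcsin):

    import sympy as sp

    w = sp.symbols('w')
    # sum over n of f^(n)(0) * G^(n)(0) / (n! (n-1)!), with all f^(n)(0) = 1,
    # using G^(n)(0)/n! = [w^n] arcsin(w)
    terms = 16
    G_series = sp.series(sp.asin(w), w, 0, terms).removeO()
    total = sum(G_series.coeff(w, n) / sp.factorial(n - 1) for n in range(1, terms))
    print(sp.N(total, 12))   # 1.08652109702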
I just presented my solution today and was told it was wrong by another student. I had not looked back here since I thought I had it figured out. We decided today that the professors probably meant it to be done the way I did it but also overlooked the fact that it doesn't work. Otherwise, from your solutions, it looks harder than anything they would actually expect us to do. For example, I've never heard of the Lagrange inversion theorem. So, either that, or they expect us to figure out it's too hard and then not do the problem because of that, and that is how they are using it to test us. Is there some other way to do this (the original problem is the integral of this function around the unit circle) that they could possibly expect us to know? StatisticsMan (talk) 20:26, 22 July 2009 (UTC)[reply]
I do not see a simpler way to get the result. You can express the result with the series, or with the Bessel functions as shown by FJ, or also as $\int_0^1 I_0(t)\,dt$. This seems to show that any method you follow to compute that residue could not be too elementary, because it will use more or less explicitly the modified Bessel functions. By the way, the Lagrange inversion theorem is a two-line formal computation; unfortunately our articles on the topic are not very satisfactory. --pma (talk) 15:23, 24 July 2009 (UTC)[reply]
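That integral form is also easy to confirm numerically, again with mpmath (assuming it is available):

    from mpmath import mp, quad, besseli

    mp.dps = 25
    print(quad(lambda t: besseli(0, t), [0, 1]))   # 1.0865210970...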