Wikipedia:Reference desk/Archives/Mathematics/2010 January 29

From Wikipedia, the free encyclopedia
Mathematics desk
Welcome to the Wikipedia Mathematics Reference Desk Archives
The page you are currently viewing is an archive page. While you can leave answers for any questions shown below, please ask new questions on one of the current reference desk pages.


January 29

Why is the perimeter of a spherical triangle less than 2 pi?

Hi all - my question is pretty much as in the title: why is it that, assuming we take the shorter of the two arc lengths between any two points on S^2, the perimeter of any given spherical triangle is strictly bounded above by 2π? I tried using Gauss-Bonnet (overkill?) but to no avail. I've managed to prove the triangle inequality, which is fairly trivial, if that's any use!

Thanks very much, Delaypoems101 (talk) 00:51, 29 January 2010 (UTC)

perhaps I'm misreading what you're asking, but it seems to me that (on a sphere with radius 1), the largest possible triangle would be where all three vertices line up along a great circle of the sphere - the triangle turns into a circle, and would thus have a perimeter of 2π. --Ludwigs2 01:06, 29 January 2010 (UTC)

I guess my point is that I'm wondering how to actually show that that constitutes the largest possible triangle - for example, a triangle with one point 'A' at the north pole and 2 antipodal points on the equator would also have perimeter length π+π/2+π/2=2π, though I would consider these degenerate cases, since 'A' isn't really a vertex and I'm not sure 3 collinear points is considered a triangle on the sphere. How would I go about showing that for any side lengths a, b, c of a non-degenerate spherical triangle, we have a+b+c < 2π? (Rather than saying 'this case looks like it should be a maximum, and has perimeter 2π, hence any other triangle probably has a smaller perimeter than that'?) Thanks for all responses, Delaypoems101 (talk) 04:29, 29 January 2010 (UTC)

Here's one way with coordinates that is not pretty. The distance d between points A and B on the sphere satisfies cos(d) = A·B, and since we are requiring 0 ≤ d ≤ π, where cosine is strictly decreasing, a smaller dot product means a larger distance. For arbitrary points A and B, we can let A be (1, 0, 0) and B be (x, y, 0) with y ≥ 0 without loss of generality. For any point C = (u, v, w), let C' = (u, −(v²+w²)^1/2, 0). Then A·C = A·C' and B·C ≥ B·C', so AC = AC' and BC ≤ BC'. So the perimeter of ABC is ≤ that of ABC', and ABC' is a triangle with points on a great circle so has perimeter ≤ 2π. Rckrone (talk) 05:03, 29 January 2010 (UTC)
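The 2π bound is also easy to sanity-check numerically. A Python sketch (my own, not a proof - it just samples random triangles on the unit sphere and records the largest perimeter seen):

```python
import math
import random

def arc(p, q):
    """Shorter great-circle distance between unit vectors p and q."""
    dot = max(-1.0, min(1.0, sum(a * b for a, b in zip(p, q))))
    return math.acos(dot)

def random_unit_vector(rng):
    """Uniform point on the unit sphere via normalized Gaussians."""
    while True:
        v = [rng.gauss(0, 1) for _ in range(3)]
        n = math.sqrt(sum(c * c for c in v))
        if n > 1e-12:
            return [c / n for c in v]

rng = random.Random(0)
worst = 0.0
for _ in range(10000):
    a, b, c = (random_unit_vector(rng) for _ in range(3))
    worst = max(worst, arc(a, b) + arc(b, c) + arc(c, a))
print(worst < 2 * math.pi)  # the perimeter never reaches 2π
```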
Spherical triangles satisfy the triangle inequality because their sides are geodesics. Now for a spherical triangle ABC, consider the triangle A'BC where A' is the antipodal point to A. Then BC <= A'B + A'C = (π - AB) + (π - AC) ... Gandalf61 (talk) 17:20, 29 January 2010 (UTC)
That is pretty. Rckrone (talk) 20:39, 29 January 2010 (UTC)

Unique bid auctions

Is there a (in some sense) mathematically optimal strategy for unique bid auctions, e.g. a Nash equilibrium strategy? Has any work been done on optimal strategies in real-life unique bid auctions? -- The Anome (talk) 01:02, 29 January 2010 (UTC)

Is it likely you will participate in one in the near future? I found one reference for this type of auction, a Danish company doing an iPod promotion.--RDBury (talk) 01:51, 29 January 2010 (UTC)
No, my interest is currently entirely hypothetical; it just seems like a perfect subject for auction theory. There certainly seem to be a lot of these things: see http://www.google.com/search?q=unique+bid+auction -- The Anome (talk) 01:59, 29 January 2010 (UTC)

Update: An earlier version of the article suggests that there can't be any deterministic optimal strategy: if more than one player used it, they would all select the same bid, and thus all lose, contradicting the premise. However, I can't see any reason for a probabilistic strategy not to exist. -- The Anome (talk) 02:16, 29 January 2010 (UTC)
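As a concrete illustration of the winner rule and of why randomization is needed, here is a Python sketch; the bid range and the weights of the toy mixed strategy are invented, not taken from any of the literature:

```python
import random
from collections import Counter

def lowest_unique_bid(bids):
    """Winner of a lowest-unique-bid auction: the smallest bid placed
    by exactly one bidder, or None if no bid is unique."""
    counts = Counter(bids)
    unique = [b for b, n in counts.items() if n == 1]
    return min(unique) if unique else None

print(lowest_unique_bid([1, 1, 2, 3]))  # 2
print(lowest_unique_bid([1, 1, 2, 2]))  # None

# If every player used the same deterministic strategy, all bids would be
# identical and nobody would win, so a symmetric equilibrium must randomize.
rng = random.Random(0)
weights = [5, 4, 3, 2, 1]  # invented bias toward low bids
bids = [rng.choices(range(1, 6), weights=weights)[0] for _ in range(5)]
print(lowest_unique_bid(bids))
```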

OK, there does seem to be some academic literature on this topic: see for example http://ideas.repec.org/p/cca/wpaper/112.html -- The Anome (talk) 02:20, 29 January 2010 (UTC)
and this: http://mpra.ub.uni-muenchen.de/4185/ and http://zs.thulb.uni-jena.de/receive/jportal_jparticle_00141919?lang=en, which look like they pretty much answer my question. -- The Anome (talk) 02:28, 29 January 2010 (UTC)
Thanks for pointing it out; looks to me like a subject that could be developed into quite an interesting article if someone wanted to give it a go. Dmcq (talk) 11:19, 29 January 2010 (UTC)

Cardinality

I'm taking a computer science course in Discrete Structures. One of the questions asks about cardinality, which is defined on Wikipedia and in my textbook as the number of elements in a set. Therefore:
|{x}| = 1
|{{x}}| = 1
|{x, {x}}| = 2
|{x, {x}, {x, {x}}}| = 3 - but why is this true? Shouldn't the answer be 4?
and this one:
|{2, {3, 4, 10, {4, 0}, 6, 3}, 6, 12}| - the answer is listed as 4... but why? That makes no sense.

-- penubag  (talk) 08:51, 29 January 2010 (UTC)

I think what you're really confused about is the notion of "an element of a set". Given some set A and an object x, either x ∈ A (x is an element of A) or x ∉ A. The objects in question can themselves be sets - given sets A and B, it is meaningful to ask whether A ∈ B.
The relation ∈ is not transitive. From x ∈ A and A ∈ B it does not follow that x ∈ B.
You can specify a set by describing exactly which objects are its elements. For example, there is a unique set which includes 3, includes 5 and does not include any other object. This set is denoted {3,5}.
Similarly, there is a unique set which includes {3,5} but does not include any other object. This set is denoted { {3,5} }. Note that 3 ∈ {3,5} and {3,5} ∈ { {3,5} }, but 3 ∉ { {3,5} } - when I defined { {3,5} } I explicitly said that any object which is not {3,5} is not an element of { {3,5} }. Obviously 3 is not the same as {3,5}, so 3 is not an element of { {3,5} }.
Now we can go back to cardinality, which is how many different objects are elements of a set. There is exactly one object which is an element of { {3,5} } - this object is {3,5}. Thus, the cardinality of { {3,5} } is 1. Likewise, there are exactly 3 objects in {x, {x}, {x, {x}}}, which are x, {x} and {x,{x}}. There are exactly 4 objects in {2, {3, 4, 10, {4, 0}, 6, 3}, 6, 12} - one of them is 2, another one is {3, 4, 10, {4, 0}, 6, 3}, another is 6 and another is 12. -- Meni Rosenfeld (talk) 09:13, 29 January 2010 (UTC)
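The same counting can be illustrated mechanically. In Python, nested sets can be modelled with frozensets (hashable, so they can themselves be elements); using the empty frozenset as a stand-in for x is my own choice:

```python
# Model the nested sets with frozensets, which are hashable and so can
# themselves be elements of sets. The empty frozenset stands in for x.
x = frozenset()
s1 = frozenset({x})          # {x}
s2 = frozenset({s1})         # {{x}}
s3 = frozenset({x, s1})      # {x, {x}}
s4 = frozenset({x, s1, s3})  # {x, {x}, {x, {x}}}
print(len(s1), len(s2), len(s3), len(s4))  # 1 1 2 3

inner = frozenset({3, 4, 10, frozenset({4, 0}), 6, 3})  # the repeated 3 collapses
outer = frozenset({2, inner, 6, 12})
print(len(outer))  # 4
```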
I've been reading up on cardinality for an hour but you explained it very concisely! Thanks very much, I understand it now. -- penubag  (talk) 09:22, 29 January 2010 (UTC)

ratio inequality

At: http://en.wikipedia.org/wiki/Demographics_of_Australia#Population_growth_rate There appears to be a numerical error in the following section:


As of the end of June 2009 the population growth rate was 2.1%.[7] This rate was based on estimates of:[8]

   * one birth every 1 minute and 45 seconds,
   * one death every 3 minutes and 40 seconds,
   * a net gain of one international migrant every 1 minute and 51 seconds leading to
   * an overall total population increase of one person every 1 minute and 11 seconds.

In 2009 the estimated rates were:

   * Birth rate - 12.47 births/1,000 population (Rank 164)
   * Mortality rate - 6.68 deaths/1,000 population (Rank 146)
   * Net migration rate - 6.23 migrant(s)/1,000 population. (Rank 15)

The ratio between: one birth every 1 minute and 45 seconds and a net gain of one international migrant every 1 minute and 51 seconds is approximately 1:1.06

whereas the ratio between: Birth rate - 12.47 births/1,000 and Net migration rate - 6.23 migrant(s)/1,000 is approximately 1:0.5

Should these ratios not be equal?

I am unable to discern which figures are correct and which is wrong, so am in no position to edit the article, but would like an answer to satisfy my interpretation of the figures or show why I am wrong to expect actual or near equality between the two ratios. —Preceding unsigned comment added by Briandcjones (talkcontribs) 10:55, 29 January 2010 (UTC)
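The arithmetic behind the question can be made explicit. A Python sketch (the conversion to annual rates, using a 365.25-day year, is my own) confirms that the two sets of figures really are inconsistent: by the per-interval estimates migration is nearly equal to births, while by the per-1,000 rates it is about half:

```python
# Convert the per-interval estimates to events per year, then compare
# the migrant-to-birth ratio with the same ratio from the per-1,000 rates.
SECONDS_PER_YEAR = 365.25 * 24 * 3600

births_per_year = SECONDS_PER_YEAR / 105    # one birth per 1 min 45 s
migrants_per_year = SECONDS_PER_YEAR / 111  # one migrant per 1 min 51 s

ratio_intervals = migrants_per_year / births_per_year  # from the interval figures
ratio_rates = 6.23 / 12.47                             # from the per-1,000 figures

print(round(ratio_intervals, 3), round(ratio_rates, 3))  # 0.946 0.5
```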

I think you're right, the figures don't seem consistent. One set is coming from the Australian government and the other is coming from the CIA, so maybe there is a difference in the way each organization defines its terms. Reporting numbers from different sources side by side as is done in the article is bound to lead to confusion. Having worked on database reports for many years, I can say from experience that there are many ways that figures can be right but appear wrong because of differences in terminology.--RDBury (talk) 16:17, 29 January 2010 (UTC)
Note that in the first case, the rate of migration is roughly equal to the rate of birth (1:51 and 1:45), whereas in the second case the net migration rate is roughly half of the birth rate (6.23 to 12.47). this is probably due to different forms or criteria of measurement - estimations of illegals, types of immigration papers, accommodations for exit and re-entry, etc. you'd have to look into the methodology behind the statistics more closely to figure out the difference. --Ludwigs2 16:31, 29 January 2010 (UTC)

The page Talk:Demographics_of_Australia is the proper place for your question, I think. Bo Jacoby (talk) 16:41, 29 January 2010 (UTC).

What algorithm?

Basically, say I had a list of characters in a production, and knew which scenes each character appeared in. Then say I needed to assign actors to characters, with each character only needing one actor, and each actor being able to play any number of characters as long as none of their characters share scenes with each other. Assuming none of the actors need any specific requirements to play each character, how would I find the minimum number of actors required? (Note: I only know discrete maths at A-Level standard - I tried looking at stuff about P and NP and didn't understand, so if you could make your explanation as simple as possible, I'd appreciate it.) Thanks! Anthrcer (click to talk to me) 12:17, 29 January 2010 (UTC)

Would graph coloring do what you want, with characters at each vertex and 'appears in the same scene' as the edges? If so, there's a section on algorithms.--JohnBlackburnewordsdeeds 12:54, 29 January 2010 (UTC)
Graph coloring does seem to be the simplest way to solve this. Look into chromatic polynomial, which can be computed by hand for smallish graphs once you learn the recurrence relation for it. — Carl (CBM · talk) 13:00, 29 January 2010 (UTC)

I see that this method would be what I need, but I really don't understand the articles, and the Simple English Wikipedia doesn't have an article specifically about graph colouring. Can the method be explained in words I can understand? Or can an appropriate website be linked? Anthrcer (click to talk to me) 20:27, 29 January 2010 (UTC) (edited by Anthrcer (click to talk to me) 22:08, 29 January 2010 (UTC))

Draw a bunch of circles on a page, each circle representing a character from the play. Then for every pair of characters that appears on stage at the same time, draw a line or arc between the circles corresponding to those characters. That is a graph (mathematics). Next, color the circles so that no two connected circles are the same color. That is graph coloring. Each color corresponds to an actor. A trivial coloring is to make each circle a different color, but you want to do it in fewer colors. The graph coloring problem (to decide whether a graph is colorable with k colors for arbitrary k) is NP-complete, which means you can solve it by brute force (enumerate all possible colorings and see whether any fit the requirement), but no computationally tractable method is known. For an actual play with not too many characters you might be able to use that approach. Otherwise greedy coloring is a reasonable approximation, though it might not always get you the absolute best solution. A book like CLRS will explain this stuff in detail. 66.127.55.192 (talk) 22:10, 29 January 2010 (UTC)
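The greedy approach described above can be sketched in Python; the character names and scenes below are invented purely for illustration:

```python
# Greedy coloring sketch: characters are vertices, sharing a scene is an
# edge, and each color is one actor. Names and scenes are invented.
def greedy_coloring(conflicts):
    """conflicts maps each character to the set of characters it shares
    a scene with; returns a character -> color (actor number) map."""
    coloring = {}
    for ch in sorted(conflicts):  # fixed order makes the result deterministic
        used = {coloring[n] for n in conflicts[ch] if n in coloring}
        color = 0
        while color in used:
            color += 1
        coloring[ch] = color
    return coloring

scenes = [{"Alice", "Bob"}, {"Bob", "Carol"}, {"Alice", "Dave"}]
conflicts = {c: set() for s in scenes for c in s}
for s in scenes:
    for c in s:
        conflicts[c] |= s - {c}

colors = greedy_coloring(conflicts)
actors_needed = len(set(colors.values()))
print(actors_needed)  # 2
```

Here two actors suffice: one plays Alice and Carol, the other plays Bob and Dave, since those pairs never share a scene.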

I see. Thanks for helping! Anthrcer (click to talk to me) 13:08, 30 January 2010 (UTC)

Proof

If you were asked to prove that the square root of 2 is an irrational number, how would it be done? 198.188.150.134 (talk) 13:02, 29 January 2010 (UTC)

Probably it would look like one of Square root of 2#Proofs of irrationality... --CiaPan (talk) 13:07, 29 January 2010 (UTC)
Assume that there exist two positive coprime integers, a and b, such that a² = 2b². Now compute modulo 3. Then either a ≡ 0 or a ≡ 1 or a ≡ 2, and consequently a² ≡ 0² ≡ 0 or a² ≡ 1² ≡ 1 or a² ≡ 2² ≡ 4 ≡ 1, but a² is never congruent to 2. Similarly 2b² is never congruent to 1. So the only possibility is that b ≡ a ≡ 0 (mod 3), which is incompatible with the assumption that a and b are coprime. Bo Jacoby (talk) 15:00, 29 January 2010 (UTC).
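The modulo-3 observation can be verified exhaustively over the three residues; a small Python check (my own, for illustration):

```python
# Exhaustive check of the residues mod 3: a² is always 0 or 1, while
# 2b² is always 0 or 2, so a² = 2b² forces a ≡ b ≡ 0 (mod 3).
squares = {a * a % 3 for a in range(3)}
doubled = {2 * b * b % 3 for b in range(3)}
print(sorted(squares), sorted(doubled))  # [0, 1] [0, 2]
assert squares & doubled == {0}  # only possible if 3 divides both a and b
```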
Strictly speaking, most of the standard proofs only show that there is no rational number whose square is 2. Some more work is needed to construct the reals and show that they contain such a number. AndrewWTaylor (talk) 18:06, 29 January 2010 (UTC)
If you mean to ask how one would prove that the square root of 2 is irrational (i.e. the technique), I would say that proof by contradiction is typically employed for such a proof. Thus the first step would be to assume that √2 = a/b for coprime integers a and b (if √2 can be written as a fraction, it can of course be written as a fraction in "lowest terms"). Instinct and intuition then suggest to square both sides of the equation and "somehow end up with a contradiction"; because such a contradiction is encountered, the initial hypothesis, namely that √2 is rational, must be false. Thus, the square root of 2 is irrational, as desired. --PST 12:23, 30 January 2010 (UTC)

Derivative question

Hi, I'm having a little trouble finishing this one differentiation problem I'm trying to do. I have right now:

y² = √(b² − b²x/a²)

for positive y values only. Thanks, --Fbv65edeltc // 20:00, 29 January 2010 (UTC)

that's y² = (b² − b²x/a²)^1/2
one way is to differentiate both sides wrt x to give 2y·dy/dx = b²/(2a²y²), I think

--JohnBlackburnewordsdeeds 20:12, 29 January 2010 (UTC)

You lost a minus sign: 2y·dy/dx = −b²/(2a²y²). Algebraist 20:26, 29 January 2010 (UTC)

The function y is implicitly defined anyway, so why not get rid of the square root and the fraction: a²y⁴ = a²b² − b²x, and differentiate: 4a²y³·dy/dx = −b²? Bo Jacoby (talk) 20:47, 29 January 2010 (UTC).

Oops, yes, and I agree that's even neater. --JohnBlackburnewordsdeeds 20:51, 29 January 2010 (UTC)
Or alternatively, since you are assuming y is positive, simply note y = (b² − b²x/a²)^1/4 and differentiate directly. No need for implicit differentiation. Nm420 (talk) 20:56, 29 January 2010 (UTC)

Solving equations by algebra alone is often hard and sometimes impossible. Differentiation of polynomials is easy. So if you are not explicitly requested to provide an explicit expression, implicit differentiation is sufficient. For example, the function y = f(x), defined implicitly by the equation y⁵ + y = x, cannot be expressed explicitly, but the equation can easily be differentiated. Bo Jacoby (talk) 16:36, 30 January 2010 (UTC)
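The y⁵ + y = x example can be checked numerically. Differentiating implicitly gives (5y⁴ + 1)·dy/dx = 1, so dy/dx = 1/(5y⁴ + 1); this Python sketch (the bisection solver and the test point x = 2 are my own choices) compares that formula with a finite-difference slope:

```python
# Check implicit differentiation on y^5 + y = x: differentiating gives
# (5y^4 + 1)·dy/dx = 1, so dy/dx = 1 / (5y^4 + 1). Compare that with a
# finite-difference slope of the numerically inverted function.
def solve_y(x, lo=-10.0, hi=10.0):
    """Solve y^5 + y = x by bisection; the left side is strictly increasing."""
    for _ in range(200):
        mid = (lo + hi) / 2
        if mid ** 5 + mid < x:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

x = 2.0  # at x = 2 the solution is y = 1
y = solve_y(x)
h = 1e-6
numeric = (solve_y(x + h) - solve_y(x - h)) / (2 * h)
implicit = 1 / (5 * y ** 4 + 1)
print(abs(numeric - implicit) < 1e-4)  # True: the two slopes agree
```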

Great, thanks for all your help! I appreciate it. --Fbv65edeltc // 06:17, 31 January 2010 (UTC)

Monty Hall paradox

We know the probability is 1/3 if you don't switch your guess and 50% if you do, so what's wrong with this reasoning: let's look at what happens from the perspective of what was under your original guess. In 1/3 of the cases there was a car under it; if you switch your guess when Monty has removed one goat from the possibilities, you are guaranteed to switch to the other goat. But in 2/3 of the cases there is a goat under your original door: in these cases, Monty removed the other one, leaving you guaranteed to switch to a car. Therefore, from this perspective, 2/3 of the time you will get a car with the switching strategy. (But this is false; in fact it is only 50% of the time, see Monty Hall problem.) So, what's wrong with the line of reasoning I outlined above? Where is the logical error? Thank you. 80.187.105.225 (talk) 22:54, 29 January 2010 (UTC)

No, the probability of getting the car if you switch is 2/3 - 50% is the popular-but-incorrect solution. Read the article page more carefully; they do a pretty good job with it. --Ludwigs2 23:06, 29 January 2010 (UTC)
The easy way to understand why switching is 2/3 is this: If you don't switch, you are stuck with 1 in 3 doors. If you do switch, you get both of the other doors. Just because you know what is behind one of them doesn't mean you don't get it. So, switching gives you 2 in 3 odds of winning. -- kainaw 00:02, 30 January 2010 (UTC)
Hey, clever — nice one, kainaw. That's easier to understand than any explanation I've heard. Comet Tuttle (talk) 01:01, 30 January 2010 (UTC)
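The 2/3 figure is also easy to confirm by simulation; a Python sketch (seed and trial count arbitrary):

```python
import random

# Monte Carlo check that the switching strategy wins about 2/3 of the time.
rng = random.Random(0)
trials = 100000
wins_switch = 0
for _ in range(trials):
    car = rng.randrange(3)
    pick = rng.randrange(3)
    # Monty opens a goat door that is neither the pick nor the car
    # (which one he opens when he has a choice doesn't affect the rate).
    opened = next(d for d in range(3) if d != pick and d != car)
    switched = next(d for d in range(3) if d != pick and d != opened)
    wins_switch += switched == car
print(wins_switch / trials)  # close to 2/3
```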
I feel the need to point out how dated this problem is. Given the current economy and the price of gas, many people nowadays would be aiming to get the goat. --Ludwigs2 02:37, 30 January 2010 (UTC)
LOL! Baccyak4H (Yak!) 04:53, 30 January 2010 (UTC)
The 50:50 paradox isn't so confusing if you imagine the guest choosing between "one door" and "two doors". A pair of two doors has double the chance: it always contains at least one goat, but it contains a second goat only 1/3 of the time, and it contains the car a considerable 2/3 of the time - really privileged! Of course the car is never guaranteed, but the pair carries a significant double chance of 2/3. In effect, that's the foundation of the game.
The confusion results purely from not opening those two doors simultaneously. That's it.
Everyone knows that every pair of two doors inevitably contains at least one goat, yet the pair as a whole still has a double chance of 2/3. Opening both doors simultaneously and removing the guaranteed goat would leave the second goat only 1/3 of the time, but in a sizeable 2/3 of cases it would leave the one and only car. Clearly arranged. But opening the two doors one at a time, showing only the unavoidable goat while the "privileged partner-door" stays closed, gives birth to confusion. This confusion could easily be cleared up if the facts were presented clearly - and if the maths would stop obscuring them :-)   Kind regards   Gerhardvalentin (talk) 11:38, 30 January 2010 (UTC)
The even easier way to understand it (for me) is that the host has to pick a goat. Therefore the likelier choice (you pick a goat, 2/3) reduces to 50:50 as the host has picked the other goat for you. So in the likelier case, the poor host eliminates all uncertainty for you. If you're "unlucky" enough to pick the car, of course, it doesn't really help. x42bn6 Talk Mess 16:18, 4 February 2010 (UTC)
You are right: each door carries a risk of 2/3, so the two refused doors must contain at least one unavoidable goat between them. A second goat there is never "unavoidably given" - the risk of a second goat is only 1/3. And you do not know which of those two doors contains the "given" goat (risk 3/3, chance 0/3) and which is the "privileged partner-door" with a risk of only 1/3 but a chance of 2/3. Gerhardvalentin (talk) 19:58, 4 February 2010 (UTC)