Talk:Pirate game

From Wikipedia, the free encyclopedia
WikiProject Game theory (Rated Start-class, Mid-importance)

overly simple[edit]

This analysis doesn't mention the fact that A gets nothing if E decides that his vote is worth more than 1 coin, for example.

The game assumes that each pirate is entirely rational. No other pirate will offer him more, and some would offer him less, so E has no motivation to refuse A's offer. If he did, he would end up with nothing when B and D outvoted C and E to split it 99-0-1-0. Even if he got lucky and B went overboard and C made an offer, E would once again have to accept a mere 1 coin (gaining nothing over A's offer) or C would go overboard and D would keep everything for himself. --Icarus (Hi!) 21:38, 11 July 2006 (UTC)
If this were a real scenario with real people, the game would be closely related to the Ultimatum game. In that case, if A takes the vast majority of the gold, the rest are willing to gain nothing (loss of a small amount of gold < punishing distributor A) in order to punish A for greediness. In reality, assuming greedy yet real pirates, a solution like 50-30-20-0-0 would be much more likely, with D & E, as a minority, left without a dime. :-)
Absolutely! In the three pirate game (C, D, E), the logic that C can offer E just a single coin, and can be confident that E will accept it, seems flawed. It is in E's interest to be vindictive, but not too vindictive, in order that C doesn't think that an offer of 1 to E is acceptable. If I were E, I would be thinking (because E can't propose offers, only vote) that C had better offer me something good (like C:50 D:0 E:50), or else I'd vote to kill him. If E is steadfast in this attitude, then they must call C's bluff of a single-coin offer and vote to kill. E risks losing 50 coins by adopting this strategy, but C risks losing their life! C's argument of "E, vote for me or you get nothing" doesn't seem as persuasive as E's (implicit) counter-argument of "C, give me X coins or you die!". The best strategy, i.e. quantifying X, requires knowledge of the coin value of each pirate's life to themselves.
The whole problem with the pirate game is that the logic is somewhat circular: "The best strategy for C is to offer E just 1 coin, because E *knows* that the best strategy for E is to maximize their coinage, and they will take 1 over 0". However, if the best strategy for E is to reject offers below 50, then, because C *knows* this, the best strategy for C is to offer E 50 and stay alive! So if indeed the best strategy for E is to accept a paltry offer, E gets left with 1. If the best strategy for E is to reject offers below 50, then E gets left with 50! So accepting 1 does not sound like the 'best' strategy for E, because there exists a strategy that will get them more. That is the problem with trying to answer semantic questions like 'best' using something like Nash equilibria. Nash equilibria do NOT always give the 'best' strategies for intelligent agents. In answering the pirate game, you need to quantify what is meant by 'best'; i.e., the strategy that maximizes some utility function. In particular, the pirate game does a poor job of this by assuming that a pirate thinks that getting 0 coins is the same as losing their life!
I agree that this game is based more in thought experiment than in what people would do in real life. Comparing the "logical" strategy in this game to the natural human process of bargaining that would most likely occur is similar to the unexpected hanging paradox, where a prisoner works backwards from the last day of the week to conclude that they cannot be hanged on that day, because they would expect it, it being the last "possible" day for them to be hanged by surprise. But, by doing this, they forget that they have made themselves vulnerable to being surprised by being hanged on any day of the week (which is exactly what happens). The strategy deduced in this experiment doesn't take into account the fact that, when faced with the difference between getting 1 coin, no coins, or an amount of their choosing, most people couldn't care less if a bold move or demand cost them that one coin as long as they don't die, as many find the chance of a higher profit more attractive than a guaranteed small one (very akin to how some find a lottery ticket valuable enough to exchange their money (a guaranteed sum if they choose not to buy the ticket) for it). (talk) 22:01, 5 January 2011 (UTC)

Assumptions Missing[edit]

Perfect knowledge of the starting conditions by all pirates. With this the stated solution holds, without it or a similar knowledge assumption the actions of the pirates are unpredictable, and the problem is unsolvable. After all, it is sometimes rational to bet on people being irrational.

Conclusion is suspect[edit]

I'm with you all the way up until you get to B's decision. Rather than offering one gold piece to E, he should offer it to D, who would otherwise get 0 if B's proposal fails. That means B's proposal should be (B,C,D,E) as (99,0,1,0).

With that in mind, working back to A, he would need 2 more votes, which he can buy from C and E for 1 gold each, as they would get 0 gold if A's proposal fails. So the solution I see is (A,B,C,D,E) as (98,0,1,0,1).
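The backward-induction argument above can be sketched in code. This is a minimal sketch under the usual assumptions (ties go to the proposer; a pirate strictly prefers more coins and, when indifferent in coins, votes against the proposer, so each bribed pirate needs one coin more than he would get in the next round); the function name is just illustrative:

```python
def pirate_split(n_pirates, coins=100):
    """Backward induction for the pirate game.

    Assumes a tied vote goes to the proposer, and that an indifferent
    pirate votes against, so bribes must beat the next round by 1 coin.
    Returns the senior pirate's proposal, most senior first."""
    alloc = [coins]  # one pirate left: he keeps everything
    for n in range(2, n_pirates + 1):
        prev = alloc  # payoffs for the n-1 juniors if the proposer dies
        needed = (n - 1) // 2  # yes-votes needed besides the proposer's own
        # Bribe the voters who would do worst in the next round.
        order = sorted(range(n - 1), key=lambda i: prev[i])
        new = [0] * (n - 1)
        for i in order[:needed]:
            new[i] = prev[i] + 1
        alloc = [coins - sum(new)] + new
    return alloc
```

For five pirates this gives the 98, 0, 1, 0, 1 split described above, and the intermediate rounds match the 99-0-1 and 99-0-1-0 sub-cases.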

Yup. Fixed. EdC 10:57, 25 June 2006 (UTC)
Whoops! Back to Core Micro for me...wrote the original article in something of a hurry, thanks for the heads up. Oxymoron 21:06, 26 June 2006 (UTC)
DN: No, you are wrong! D always says no, because he wants to get 100 when only D and E are there. Therefore the previous solution (that I already edited) is correct.
No, because D knows if B goes overboard, C will offer one coin to E and nothing to D. Therefore he will accept one coin from B. Right? 16:06, 29 June 2006 (UTC)
That's how I see it. Fermium 23:55, 29 June 2006 (UTC)
I don't see the advantage of giving D the 1 coin instead of E... The game works either way and the outcome is the same... (DN)
Because if B offers 1 coin to E, E knows that that is also what C would offer him. E might then decide that he might as well accept B's offer because no one will offer more, or he might decide that he'll vote to throw B overboard just out of spite because C's offer won't be less. So B's survival is still uncertain. The only way he can guarantee his own survival is to offer the 1 coin to D, who knows that he must accept B's offer because if B dies and C offers, he'll end up with nothing. ----Icarus (Hi!) 10:56, 7 July 2006 (UTC)
This might be true if you don't apply the homo economicus model, which we do in this case (see top of article). And according to this model each individual only acts to maximise its own benefit and does NOT do anything to harm others "out of spite"... So it does make no difference if B offers the 1 coin to D or E! (DN)

I don't think that is quite right, DN. Each individual acts to maximize its own benefit, yes. But faced with equally beneficial options, there are NO requirements on its behaviour. It is not required to refrain from harming others. 13:49, 12 July 2006 (UTC)

Extension of game[edit]

If you begin with more than 200 pirates and cannot divide the individual gold pieces, an interesting pattern develops. Here it becomes important whether a pirate would take 0 gold rather than being thrown overboard. (For the sake of discussion, I assume they would.)

The proposers start allocating 0 coins to themselves, with the 100 coins going to either the even- or odd-"numbered" pirates at the end of the list. Still, many will get thrown overboard no matter what. A pattern develops in which 100 of the first 200 pirates get 1 coin (alternating between odds and evens as you add pirates), then the pirates up to the highest available power of 2 get 0 coins, and anyone beyond that gets thrown overboard no matter what.

For example, the solution for 500 pirates is 44 getting thrown overboard, the next 256 (2^8) getting 0 coins, and the remaining "even" numbered pirates each getting 1 gold piece.
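The arithmetic of that example can be checked with a short sketch. This just encodes the claimed pattern (beyond rank 2G, with G = 100 coins, only proposers at rank 2G + 2^k survive); it is not an independent derivation, and the function name is illustrative:

```python
def overboard_count(n_pirates, coins=100):
    """Number of senior pirates thrown overboard, under the pattern
    described above: beyond rank 2*coins (counted from the junior end),
    only proposers at rank 2*coins + 2**k can survive."""
    threshold = 2 * coins
    if n_pirates <= threshold:
        return 0  # every proposer can buy enough votes
    p = 1  # find the largest 2**k with threshold + 2**k <= n_pirates
    while threshold + 2 * p <= n_pirates:
        p *= 2
    return n_pirates - (threshold + p)
```

For 500 pirates the largest surviving proposer is at rank 200 + 256 = 456, so this gives 44 thrown overboard, matching the figure above.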

This is evidently mentioned in Scientific American in May, 1999 but I cannot confirm that. Fermium 01:35, 24 June 2006 (UTC)

I am not so sure about that. The rules of homo economicus fail to determine what happens after 201 pirates. Try it with one indivisible prize and five pirates:
  • With A, B, and C overboard, D takes the prize.
  • Therefore with A and B overboard, C offers E the prize. C votes yes to save his life, and E votes yes to get the prize.
  • Therefore with A overboard, B... what? He can offer the prize to C or D. B votes yes to save his life, and C or D votes yes to get the prize. Nothing determines who B will offer the prize to. Also, nothing determines which way the one who doesn't get the prize votes, but the vote doesn't matter, it passes.
  • Therefore A ... what? If he goes overboard, B, E, and one of C or D will get nothing (but survive). A can only offer the prize to one of them. The other two will still get nothing (but survive). Again, their behaviour is not determined, and this time, it affects the vote outcome.
If we assume that pirates will, if offered nothing, vote to throw someone overboard, then A goes overboard. In the 100 coin case, the 203rd pirate goes overboard. And it's still not determined who the 202nd pirate offers the coins to. If, on the other hand, we assume that pirates will, if offered nothing either way, vote not to throw someone overboard, then nobody ever goes overboard. I'm not sure what assumption generates the pattern you mention. 16:49, 29 June 2006 (UTC)
One prize 5 pirates is easy...You work it backwards:
  • With D & E remaining D gets the prize.
  • With C, D & E remaining either D or E gets the prize as then C's decision has a majority.
  • With B, C, D & E remaining C, E (equilibrium) or D (majority as C knows he can't get the prize even if he objects) gets the prize.
  • With all A, B, C, D & E present C, D or E gets the prize as B knows there's no way he can get the prize and survive creating 3 (A,B & prize taker) - 2 majority.
-G3, 12:00, 28 November 2006 (UTC)

Here's the article; couldn't find it before: [1] Fermium 23:55, 29 June 2006 (UTC)
Hmmm. The solution is flawed. Under the assumptions, a bribe is effective if the pirate being bribed was not going to receive a bribe under the next successful proposal. Yet, as the article admits, at a certain point, the proposals are not uniquely determined, so at some point there will be some bribed pirates who don't know whether they would be bribed anyway. Their votes cannot be counted on. 14:19, 30 June 2006 (UTC)
This comes a bit late, but I think that the missing piece is that the pirates, in the Sci Am article, like throwing people overboard. In the five pirate example this means that A can only bribe one of them, and the others will want to see him thrown overboard whether or not they can get another chance at the gold with B's proposal. Adam Faanes 03:36, 2 September 2006 (UTC)

Pirates like throwing pirates overboard[edit]

This problem is a little undecidable. If there are 3 pirates and the first one offers 100 coins to himself and 0 to the other two, the last pirate cannot decide how to vote (he would get 0 coins either way). If he likes throwing pirates overboard, he would say no, and if he dislikes it he would say yes. --Petter 23:42, 8 March 2007 (UTC)

You have missed something. If the two vote to throw the proposer overboard, this will mean that only 2 pirates are left, thus the higher-ranked pirate will allocate 100 - 0 and always win (since he has the deciding vote). Thus it is in the best interest of the pirate of least rank not to throw the proposer overboard.--Dacium 03:15, 26 July 2007 (UTC)
He'd get zero gold coins either way. That's why the third condition is needed. --Petter 04:10, 27 August 2007 (UTC)

Pirate loot problem[edit]

Messy Thinking 02:29, 7 April 2007 (UTC) Does this connect in any way to the pirate loot problem? Seems to me that, especially as the loot was used to pay certain fees, this was an issue in genuine pirate culture.

Yes, they are the same. They should be merged. Mlewan 21:11, 26 June 2007 (UTC)


You do realize of course that right after the image in question the entire crew were marooned/killed...

Pulled references[edit]

I deleted two entries from the references list that seemed doubtful. One ( simply stated the problem for 10 pirates and asked for answers. No answer was given, no research or theory presented. As such it's useless as a reference. The other ( looks like a book of game theory. I downloaded all of the TeX source files and found no references to pirates. Maybe there is valuable information, but it's not reasonably found, so it's gone. (If someone finds it and re-adds the citation, please give enough information to find the reference, ideally the chapter and maybe a very, very brief snippet to search for.) Finally, I didn't delete but I'm suspicious of the link. It just looks like some guy's blog in which he poses the game and other people reply with analysis. It doesn't seem like much of a resource, especially compared to the article from Scientific American. — Alan De Smet | Talk 03:13, 10 August 2007 (UTC)

Possible sources for citations[edit]

Some possible sources for more citations. Please add more if you find some, but don't have the time to integrate them into the article. If someone takes these and adds more to the article, 1: thanks!, 2: please delete the ones you added from this list. — Alan De Smet | Talk 04:36, 10 August 2007 (UTC)

  • "Game Theory spring 2004 Problem Set 2 - solutions" by Eli Berger [2] page 2. (You use Google to read it if you can't handle PostScript format files [3]. Backup link to WebCite, since the author's home page has a warning that it may go away: [4]) - States the problem and offers some analysis. The author was an instructor (probably a professor?) at Princeton University.

B's possible strategies[edit]

When A offers 98 0 1 0 1, B will be incensed at the distribution and will try to think up strategies. A possible strategy for B is to announce that if A is voted off, he (B) will announce a distribution plan of perhaps 25 25 25 25 or perhaps 49 0 2 49. Now E has to decide whether to trust B but, B being a pirate, trust has to be unlikely: B is likely to go back on his word and offer 99 0 1 0 when he gets in. C is unlikely to be able to persuade D and E to punish B for going back on his word, as C could offer 99 0 1 with impunity. E won't vote with D to get rid of C because E will then get none. If E thinks there is a 1 in 25 (or 49) chance that B will keep his word, or that there is a good chance B will actually offer something like 95 0 3 2 to stave off punishment action, then E might be persuaded to vote off A. If E assesses a 0% chance of B offering anything other than 99 0 1 0, then E would not be persuaded to join B and D in voting off A.

The logic and solution (subject to some reasonable probability assessments) therefore survives this B strategy when there are 5 or fewer pirates. More than 5 pirates and such a B strategy might just work as the punishment action for breaking his word becomes more credible. Even with lots of pirates this B strategy would not work with strictly 'game theory rational' pirates. The article isn't clear whether the 5 rational pirates are 'game theory rational' or 'real world rational'.

Another strategy is for B to bribe E with a gold coin for his vote. E cannot then lose out as a result of voting off A. Therefore, if A is afraid of this happening, A may be sensible to offer more to E. If this is not allowed, or if E could take the bribe and then break his word and vote to accept A's offer, the logic and solution will again survive.

Hope that makes some sort of sense even if the obvious answer is we cannot add original research. Nevertheless, is the article complete without explaining this? crandles 19:12, 10 August 2007 (UTC)

Given that it's a game theory problem, it's reasonable to assume that perfectly rational (that is "game theory rational") pirates are intended. Thus the above analysis isn't really relevant as the pirates won't trust each other in the slightest as upholding a promise has zero value within the realm of the puzzle. — Alan De Smet | Talk 22:52, 10 August 2007 (UTC)
Isn't the most interesting part of game theory the normative part of seeing whether the solution really applies? crandles 09:20, 11 August 2007 (UTC)

C Strategy[edit]

What happens if C or E rather oddly votes with B and D to get rid of A? Would this set a precedent? Would this unsettle B and make him fear he will be voted off even if he makes a 'game theory rational' offer? Might this persuade B into making a better offer? C in this case is risking 1 in the hope of making different things happen. Could that ultimately be 'real world rational'? Should A be afraid of this and offer C and E more? crandles 19:12, 10 August 2007 (UTC)

Intuitive result[edit]

Article currently says

It might be expected intuitively that Pirate A will propose that the allocation shall be 20, 20, 20, 20, 20. However, this is not the theoretical result.

I don't think 20 20 20 20 20 is an intuitive result because this would imply that B,C,D and E should vote off A as there would then be fewer pirates to share the loot and judging by that allocation they may then get 25. Therefore the intuitive result may be that A can allocate little if any to himself. Should it be changed to:

It might be expected intuitively that Pirate A will have to allocate little if any to himself for fear of being voted off so that there are fewer pirates to share between. However, this is as far from the theoretical result as is possible.

crandles 17:04, 27 August 2007 (UTC)

more for pirate E[edit]

Logically, Pirate A would need to offer 97, 0, 1, 0, 2 in the example given.

Pirate E will need more money from Pirate A, as he knows logically that Pirate C will also offer him 1 coin, and he is also aware that Pirate B will offer him nothing.

Now remember that these particular pirates' tertiary motivation is making pirates walk the plank.

Pirate A offers 1 gold and no plank-walkings; Pirate C would also offer 1 gold but also 2 plank-walkings.

Knowing this, Pirate A would have to offer more gold than Pirate C would, so Pirate A offers Pirate E 2 gold, whereas Pirate C would only logically need to offer him 1 gold. —Preceding unsigned comment added by (talk) 15:34, August 29, 2007 (UTC)

Sorry but pirate B will make a rational offer sufficient that B (with rational votes) will survive so E needs to compare the A offer with the B offer. Comparing the A offer to the C offer (which will not be offered) is not rational. crandles 17:59, 29 August 2007 (UTC)

Just a reminder, since someone attempted to change the article in this way: the current numbers are supported by a citation. If you can find another source proposing different answers, feel free to add it and add a citation for it, but don't replace the existing numbers. — Alan De Smet | Talk 17:42, 17 April 2008 (UTC)

Just because there is a 'citation' does not mean it is a 'logical' answer. I would expect E to refuse the offer of 1 coin. It is rational for E to assume that A would prefer to lose at least one more gold coin rather than die. So, if E refuses the offer, there is a very strong chance he will be offered one more coin. (If he refuses again, then D may accept 2 coins rather than the 1 he would get from B). — Preceding unsigned comment added by (talk) 11:34, 20 December 2012 (UTC)

That isn't quite right. E doesn't need to make any assumptions about A, because A's only action comes before E's vote. However, it is rational for E to refuse the stated initial offer because it gives him a reputation that is likely to result in him getting more than one coin. Only if all the pirates know all the pirates are rational is reputation worthless and the stated solution correct.

Wikipedia's job is to report on what others have said, not to engage in original research. If you're quite confident that you have something to add, write up a paper on it and get it published. Indeed, if a reliable source were to publish your paper on the pirate game, we would be remiss in not including it in the article. — Alan De Smet | Talk 05:33, 22 December 2012 (UTC)

A is actually in a bad position[edit]

Actually I don't see why A should get the most coins as A is the pirate first to be at risk of dying. E doesn't risk dying at all. Remember the rule that everything being equal pirates would prefer that another pirate dies.

The official result to me appears to be just a preliminary result after an initial iteration. Taking it as a final result is incorrect.

Basically if B is willing to offer more to 2 pirates than A is, A will die.

If A attempts "99 0 1 0 1", B could propose "dead 33 33 33 0" or even "dead 0 50 0 50" or "dead 0 33 33 33", after all B would otherwise be getting zero anyway and in these proposals a pirate dies (pirate A). Assuming the pirates are Perfectly Rational Beings they will all know that too.

So in order not to die, A might propose "0 50 50 0 0". B won't object, since he's getting more than the zero in the 1st iteration. If A dies and it's B's turn (to risk death as well), B might have to propose "dead 50 50 0 0" anyway to prevent C from proposing "dead dead 50 50 0".

Thus I suggest that "0 50 50 0 0" is a stable offer, but I am uncertain that this is indeed the best proposal for A.

While I'm not sure what the final result is (or if there's a single answer), I'm pretty sure it's not "99 0 1 0 1" :). (talk) 17:13, 24 January 2008 (UTC) (Link).

This is a game-theory problem, not a realistic problem. As described, B doesn't get to make proposals at all until A is dead. If A dies, it's identical to the 4-pirate case and doesn't change anything. Ultimately, however, our theories aren't really relevant. Remember that Wikipedia's job is to report on what others have said, not to engage in original research. If you're quite confident that you have something to add, write up a paper on it and get it published. Indeed, if a reliable source were to publish your paper on the pirate game, we would be remiss in not including it in the article. — Alan De Smet | Talk 03:11, 25 January 2008 (UTC)


"Ian Stewart extended it to an arbitrary number of pirates in the May 1999 edition of Scientific American, with further interesting results.[1]" Anyone care to elaborate? As it currently stands, the reader is left wondering what the "further interesting results" are, and is forced to go to the references, download the article, and read through it themselves. Loggie (talk) 21:44, 3 February 2008 (UTC)

Ties are screwed up[edit]

In the SciAm version (referenced), ties go to the proposer. The bit in the description about the "casting vote" is nonsense.

In a less-used version, ties go against the proposer. In those cases the solution is similar, but not the same. Several people on this board have been bothered by the solution not being 97-0-1-2-0; this is because they have the solution for the "tie loses" variant.

The article should reflect both variants, and the reference to casting vote should be deleted. Jd2718 (talk) 14:27, 3 July 2009 (UTC)
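For comparison, the "tie loses" variant mentioned above can also be sketched in code. This is a sketch under the stated assumptions: with two pirates the proposer dies whatever he offers, so with three pirates the middle pirate votes yes for free to save his own life; ties among equally cheap bribes are broken here in favour of the more senior pirate (an arbitrary choice), which yields 97-0-1-2-0:

```python
def pirate_split_tie_fails(n_pirates, coins=100):
    """'Tie loses' variant: a tied vote throws the proposer overboard.

    With two pirates the proposer dies whatever he offers (the junior
    pirate rejects and takes everything), so with three pirates the
    middle pirate votes yes for free to save his own life."""
    if n_pirates == 1:
        return [coins]
    if n_pirates == 2:
        return None  # proposer dies regardless of his offer
    alloc = [coins, 0, 0]  # three pirates: proposer keeps everything
    for n in range(4, n_pirates + 1):
        prev = alloc  # payoffs for the juniors if the proposer dies
        needed = n // 2  # strict majority minus the proposer's own vote
        # Bribe the cheapest voters; stable sort favours seniors on ties.
        order = sorted(range(n - 1), key=lambda i: prev[i])
        new = [0] * (n - 1)
        for i in order[:needed]:
            new[i] = prev[i] + 1
        alloc = [coins - sum(new)] + new
    return alloc
```

For five pirates this gives 97-0-1-2-0, the allocation several commenters arrived at.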

I agree, "casting vote" is wrong and needs to be deleted[edit]

A casting vote is made after all other votes, and only when those votes are tied. For the case of 2 pirates, D and E, the initial vote has E voting by himself. Only if the initial votes are tied (which is impossible with one voter!) is D's vote counted. So, for two pirates, E should vote down any proposal by D and take all the loot.

I agree with Jd2718 that "the proposer wins ties" is the correct way out of this. The part about "casting vote" should be removed and replaced with "proposer wins ties."

Mdnahas (talk) 21:47, 31 July 2014 (UTC)

Possible answer[edit]

A does 0-25-25-25-25 so no one wants to vote him off. If they vote him off, then B either offers the same or something less (if he gives himself more than the others, they will vote to throw him overboard, and if he offers less, the others get less), so A should do 0-25-25-25-25. (talk) 00:51, 22 June 2010 (UTC)

It's a game theory puzzle. In game theory, you generally assume that all the other players play according to theoretically perfect strategy; hence, their moves are predictable. The article has a mathematical proof as to why 98-0-1-0-1 is the best strategy. So if you're going to argue with it, you need to find a reason why the proof isn't adequate. - furrykef (Talk at me) 07:43, 27 November 2010 (UTC)

Ideal solution[edit]

97-0-1-0-2, and here is why. E knows he can never be offered more than 1 by C or D, as is shown in the article's proof, so he votes with anyone who gives him anything. D can get 100, so he will never vote with anyone, nor does it matter if he does. C has to offer 1 to E and nothing to D. B, on the other hand, knows he can always get 98 by giving 2 to E and accepting a tie. E won't vote against him because this is better than getting 1, which is his only other alternative. E knows he can get 2 from B; B can get 98 from A, but has no reason to keep A alive due to the pirates-prefer-killing rule. However, A can offer C 1 coin to vote for him, since C knows how B will play and that he can never get more than 1. - DampeS8N

no... D knows that if C gets to offer, C will give him zero, and C is guaranteed to win. So D would rather accept 1 from B than let C make the offer. (talk) 03:06, 4 November 2013 (UTC)
I agree with the (97-0-1-0-2) offer.
The A perspective will be as follows:
- B wants him dead no matter what (B can get 99 gold and give 1 to D, as he knows that D won't get anything if B dies. This is a much better option for B than giving 2 to E and getting only 98).
- C can be bribed by A (at least 1 gold), assuming that C will have difficulty throwing B overboard since D will vote for B to get at least 1 gold.
- D wants A dead because D can get the same from B (at least 1 gold).
- E can only get at least 1 gold from C and will get nothing from B and D.
So if A gives E 2 gold, that would be a better choice for E than getting only 1 gold from C (if A gives E only 1 gold, E would still prefer to get that 1 gold from C and at the same time kill A).--Niutoniumx (talk) 12:57, 29 November 2013 (UTC) Niutoniumx

Frustrations with the problem[edit]

Probably should have put this in here first.

The rule of rationality is always going to be a bit skewed, because rationality in this case means maximizing your potential with perfect knowledge. In this game, you maximize your potential with a coalition of 3 at the start, and it is available to every pirate. However, the game rules take away that very potential at the beginning by saying the pirates do not trust each other. If they do not trust each other, they cannot be rational actors, because they don't have perfect information. Because trusting each other actually yields a higher individual result than not working together, that is the rational course of action for the three. Taking away the ideal rational action and calling the puzzle rational is insanity.

Then you have the valuation of a life. The pirates want to kill, but it apparently has no monetary value. That seems rather random, especially since the pirates value their life more than money according to the rules. So, if life is worth more than money, why is killing someone not worth anything at all aside from a lark? That's contradictory if they are in competition for the most resources and life is both cheap and worth everything. This would mean that the person offering would value their life more than the coins presented. The fact that the other pirates apparently cannot use this information in bargaining, despite it being a rule, is somewhat perplexing to me. If my life is worth more than 100 coins, as established in the rules, then the first pirate giving himself 0 coins is actually a valuable outcome, since he traded his life (Worth over 100 coins) for something worth 100 coins.

Within its own logic and definition, this problem fails. It requires further clarification on every single factor. The life cannot have value, or else it will need to be considered by each proposer. The death cannot have value, or it changes the equation. The parties must not trust each other, but they must be perfectly rational, which requires trust and perfect information, as coalitions can produce more than individuals alone could, and a rational actor would use the coalition to boost their power. It's frankly annoying that this is considered a simple or smart puzzle, when it is neither. — Preceding unsigned comment added by (talk) 05:10, 1 January 2014 (UTC)

There is no such guaranteed coalition for the bottom 3 pirates. The 2nd-lowest pirate would never cooperate with the 3rd-lowest pirate because it would lessen his overall payout: once there are only three pirates remaining, the 3rd-lowest pirate would pay the lowest pirate 1 coin so that he would accept his proposal, and take the rest for himself, leaving none for the second-lowest. If you follow the logic from the article, you'll see that the lowest pirate is guaranteed to accept this proposal, because if he kills the third-lowest, then the second-lowest pirate will not give him anything.
The rule for pirates killing is that they will always vote to kill unless it lowers their payoff. However, a pirate's valuation of killing and staying alive are two completely different concepts which cannot be conflated. A pirate's priorities are: 1. Do not die, under any circumstances. 2. Get as large of a cut of the money as possible without being killed. 3. If you can kill someone without eventually being killed yourself, and without losing any money, then kill them.
The pirates do use each others' valuation of their own life in "bargaining". This is the reason that the lower pirates get any coins at all. However, the "bargaining" simply takes place through the highest-ranked pirate's reasoning on what the lower pirates prefer. The highest-ranked pirate can get away with not giving the lower pirates very many coins because if he dies, those pirates cannot guarantee that they will receive any coins at all. So, even receiving a single coin is enough incentive to keep him alive.
Rational agents not cooperating to achieve the "best" outcome is very common in Game Theory problems, most famously in the Prisoner's Dilemma, however, in this case, there's not really any basis for claiming that a lower-pirate coalition would be the best outcome. Not trusting one another is actually one of the defining traits of "rational agents," because rational agents understand that other rational agents would betray them for even the slightest benefit or safety.
I am going to remove your section from the main article, but I'll check in this talk page again if you'd like to go over the puzzle a bit more. It is a bit of an unintuitive result. (talk) 05:38, 5 January 2014 (UTC)

I appreciate you removing my section. Was poorly worded, and slightly alcohol infused. Also, late reply.

But basically, by your reasoning, a rational actor will betray another rational actor, despite both partners knowing that their utility is increased without betrayal. That is not rational. I maximize my own utility by making the most coins. He maximizes his utility by making the most coins. By not working together, we get 0/1/2. By working together, they'd get 34/33/33 or 33/33/33. Therefore, to claim the ones who accept 0, when they could receive 33 are acting more rationally, does not make sense. They are not maximizing their utility. Betrayal does not maximize utility in this problem, therefore it is not a rational behaviour.

Again, just to be clear: You are stating that it is more rational to accept 0 coins than 33. To have 0 utility over 33. To know you have no chance at all of gaining more than 0 coins because you want to betray, but if you were rational and worked together, you'd have 33. This isn't even a case of share the wealth. This is 6>7. It has no basis in logic. It is a claim that to be rational is to betray everyone, despite having a loss for doing so.

There is also no claim it will be the lowest pirates who make that trio. Any pirates could make that trio. That is another reason people may dislike this complaint, because there is no clear solution after that information is provided. Any 3 pirates may end up forming that group. Maybe a 33/33/33/1 split.

The killing definition also makes no sense. Again, a rational actor will have perfect information. They will know the NPV of killing another pirate. Since you are so insistent that they will betray each other for a single coin, why would they not kill one for a return in the future? The net present value of a murder with future earnings requires us to assume these pirates never ever split a single coin again.

Again, that's just a valuation thing for the pirate on his own life. They do not want to die. It is preferable to have 0 coins and be alive than to be dead. However, only the first pirate needs to face that dilemma. The first pirate, passing any split, gets a value of 101 (amount of coins + 1). Which may mean he wants as many coins as possible (198 coins of value sounds nice), but even by giving away every single coin, he has a gain greater than the other pirates (in that he hasn't lost what only he could have lost).

The classic prisoner's dilemma actually has a gain for betrayal (as well as a possible loss), which is what makes it a dilemma. This puzzle only has a loss for betrayal. There is no gain from betraying the other pirates. Therefore, betrayal is not a rational behaviour; it only provides a loss for yourself and your possible partners. Nothing is gained by it. It voids the basic premise of this puzzle.

 — Preceding unsigned comment added by (talk) 10:21, 26 February 2014 (UTC) 
It's certainly not rational to accept 0 coins, but it certainly is rational to accept 1 coin if you would otherwise not get any. The one who is betraying the others is the highest-ranking pirate in the group-- not all of the pirates-- because the highest-ranking pirate benefits greatly from not giving the pirates below him a fair deal.
The coalition of three pirates will always fall apart when it comes time to actually propose the split. The highest ranking pirate in a group will not benefit by offering more than what is strictly necessary to win the votes of those below him. No matter which three band together, one of the three will eventually be the highest ranking through killing off the other pirates, and at that point, that pirate will betray his comrades. To not betray his comrades would not be rational. You need to follow the process through to its conclusion. Yes, three pirates can conspire to kill off other pirates who give them bad deals; however, this does not lead to them splitting the gold evenly. They have no obligation to honor their coalition once one of them gets his turn to propose the split. He'll offer a 99/0/1 split, because that's the split that benefits him most, and he knows that his comrades have to accept it.
So the highest of the trio has betrayed them, right? So they'll just form a coalition against him, and split it 50/50. But, that doesn't work out either though. Once it's down to just two, the higher of the two can offer a 100/0 split and tie the vote, to guarantee he gets it. So, the lowest ranking pirate is required to vote in favor of the bad deal (99/0/1) offered by the highest pirate of their coalition otherwise he won't get anything. The highest ranking pirate in the coalition will know that this is how things would otherwise work out, so he will always betray his comrades, because he knows the lowest pirate will vote in favor of his split. Thus, there's really no incentive to form a coalition to begin with, because it will always lead to betrayal.
The exact details will differ depending on which three pirates form a coalition, but the highest-ranking pirate will always betray them. That is his gain for betrayal-- 66 coins, not too shabby.
You also cannot fixate on life as having a numerical value. If you try to assign it the value of all the coins, then what happens when they suddenly need to make a split over more or less coins? Does the value of life change to accommodate this? Valuing life most is just a rule, and isn't expressed as a particular numerical value. (talk) 03:01, 25 March 2014 (UTC)

Here's the thing. In the case of the 3-pirate split, what is more rational: trying to get the most coins, or accepting a single coin? In the end, living is 100% preferable to having coins. In that final situation, pirate C cannot trust pirate D with a single coin, or 100 coins, because pirate D will always betray him, correct? Therefore, if pirate E, acting perfectly rationally, says "I will not vote for you unless you give me 100 coins", what happens? Doesn't pirate C's rationality say "Well, is 0 coins and life more valuable than 99 coins and potentially dying? He's not lying."? Pirate E has 100% of the power there. Pirate C has none. There is also no possibility for deception, as noted. By being willing to kill, pirate E gains 100 coins. That is perfectly rational, unless the pirates are willing to gamble and die. Since the rules of the game do not allow for that, in the third split, pirate E should actually get 100 coins. Because pirate C doesn't want to die. If I were E, I'd certainly go with 100 coins or kill him. In reality, there's gambling. In this problem, there is none, so unless your life has no value, which breaks the rules of the puzzle, pirate C gets screwed over. D will always betray, because he gets 100 coins and a possible kill. I'd argue that it is far more rational of E to force the issue to maximize his return than to accept a single coin. If C were a gambler, then yes, things would be different. But C cannot gamble, and E cannot lie. Perfect information and all that. If you have the advantage, it's rational to use it. Hurt feelings and revenge do not factor in the slightest. Just your life and your coins.

What makes the coalition more fun is the fact that they don't need to propose it at all. A has to propose the split, and logically, needs to vote yes for his own split, even if he gets 0 coins for it. Now, if you follow the same logic that the puzzle wants you to and work backwards, you can come to an answer. I did, allowing coalitions, with only one odd logic bubble (an equally viable choice in one split, which oddly has an impact on future splits). Give it a shot. This is a terrible puzzle that fails within its own rules: terribly if you use "rational", and only moderately if you use "logical". Because no pirate wants to die, the pirates should never base their betrayal two steps ahead. One death should not happen, let alone two. Therefore, a coalition of C/D/E would be quite strong, since A and B don't want to die, and only after they die do C/D/E get to betray each other. Then you have the above question. If a person is gambling his life versus coins, in a situation with no gambling, why is it not rational to demand 100 coins? It becomes a hostage situation; there is no viable other option.

Again, this puzzle is terrible. The more you examine it, the worse the flaws become. It's not logical. It's not rational. It just sets its rules, and then doesn't follow them very well. After all, if you allow gambling, then the puzzle has no answer. If you accept that everyone wants the most coins, but can't work together, then that violates the rule that they want the most coins. There are situations where you only have power if it fails, granted. But since it should never fail, you can only plan at most one step ahead, and if that step is not advantageous, why take it?

Honestly. Solve the puzzle the same way the answer wants you to, however accept two things: People know they want you dead, and are going to leverage that for their own maximum return, and failing that, they will form a coalition of votes to demand a vote (or kill you for failing to adjust to their demands). In addition, other people will form other coalitions to ensure they receive their maximum, thus making all coalitions compete. I found an answer following those rules. It looks a fair bit different at the end. — Preceding unsigned comment added by (talk) 04:02, 30 March 2014 (UTC)

You're just not following here. Pirate E doesn't have a leg to stand on. If E says "Give me 100 coins, or I'll kill you." Pirate C will say "If you kill me, then Pirate D won't give you anything. I'll give you one coin, and you'll like it. And you have to accept it because it's more than D will give you if you kill me." (talk) 01:28, 25 April 2014 (UTC)

To that I'd say "You don't want to die, you don't have a leg to stand on either. If you don't give me 100 coins, I'll kill you". Your response? Is giving me 100 coins or breaking the rules of the puzzle. You can't gamble with a single coin. Or 20. Or 50. You take the offer.

Oh, I'm following it completely, but what you are presenting is not rational in the slightest. What you are telling me is that you are conveniently ignoring the rule that pirate C does not want to die. Pirate D has 0 impact; we agree on this. But for pirate E to ignore the power of his vote is not rational in the slightest. I am acting in my own, perfectly rational, self-interest. He needs my vote. To not use it to maximize my potential is not rational in the slightest.

Let's say it's a different situation, with no life or death on the line. Three senators are voting on a monetary package. Two votes are needed. One abstains for reasons involving cocaine, orgies, and a shocking amount of press attention. As the junior voting senator, is it rational for me to accept that, since Senator A, who proposes the bill, gets to determine its contents, I must accept everything he gives, or is it rational to say "You only get my vote if this is included"? According to your reasoning, Senator A is just going to give me a penny and say "Screw off, Junior". Whereas, according to my reasoning, Senator C is going to say "If this passes, we're getting this". Which is more rational, and benefits what Senator C wants? Your answer, or mine?

Or say there are 2 janitors who find 100 bucks in 1's after hours in a locked room. There is a third janitor who has the key to that room, but he is quitting tomorrow, and the second janitor will get the keys then. According to your rules, the first janitor can say "I'll give you a buck" and the second will take it, despite knowing the money is there after being told about it. According to my sense of rationality, he'd be able to use his power ("If I don't get enough, you won't get any"), and if he were truly rational, he'd push for as much as he could.

This takes out the rule of death. This just frames your solution in a different light. Rationality involves using your power, not ignoring it. I appreciate that you're holding on to the answer, but I don't understand how a rational person can. Honestly. Just because you're told it's right doesn't mean it is. Why would one person accept 1 coin when he can get 100, just as why would a group of 3 accept 3 coins when they could get 100 to be split between them? A rational person wouldn't. A logical person wouldn't. But this puzzle and its defenders swear that they would.

Offering a single coin turns it into a gamble, because the final pirate has all the power, and pirate C, above all else, does not want to die. Because there is no gambling, the final pirate gets to have the power, period. And if he's rational, he's going to use that power. — Preceding unsigned comment added by (talk) 09:09, 20 May 2014 (UTC)

You aren't thinking about what happens after he kills C, and by doing so, you're breaking the rules. A pirate will vote to kill only as long as it does not reduce their maximum payoff. If E votes to kill C, he is not getting anything. So, as long as C offers at least one coin, E is not going to vote to kill him, otherwise he'd be losing that one coin for no reason. And as long as C knows that he cannot refuse one coin, he will not give him more than one coin. E can threaten all he wants, but it's not a gamble in any way whatsoever. He cannot kill C as long as C offers him at least one coin.
You're only considering the immediate consequences of a vote, when the whole point of the puzzle is that you have to think about the long-term consequences of each vote to see what will eventually happen; there are consequences for E if C's proposal fails to go through that you are not taking into account. Your examples show this; your senator example doesn't even have Senator B make an additional proposal if the first one fails-- there's no consequence for the third senator at all-- so you're really just considering a completely different situation. (talk) 13:35, 25 May 2014 (UTC)

I understand your complaint. If E will get 0 coins, and he wants to maximize coins, he would accept 1, because it is more than 0. However, then E is not acting rationally or logically, or greedily, or anything but stupid. In the situation where he would accept 1, he can also get 100. Because C cannot die. The power of their votes is a huge problem that is completely overlooked. In each situations, each pirate would look to maximize their return. They would not follow some poor, failing logic, if they were rational or logical. At no point have I broken the rules of the puzzle. I have just pointed out that the rules clash in this situation, and if they were all greedy, and gambling is not allowed, then pirate E would come out on top. If gambling were allowed (Goodbye logic, rationality, reason, and a sane puzzle) then E could certainly give the 1 coin over. That is an assured vote, following the normal rules of the puzzle, but then it breaks logic or rationality.

I agree, you can't look 2 proposals into the future, but you can look 1. I am just following the rules of the puzzle. A pirate cannot die. He must propose something that will pass. Each pirate is also attempting to maximize his returns. Because of this, pirate E would accept the single coin if he had no power to take away something else of value: pirate C's life. But he does control that life. And a rational person will use every bit of his power to maximize his return. If you had the option of 1 coin or 100 coins, you would take the 100. Period. Just because you can accept 1 coin doesn't mean you shouldn't take 100. If the puzzle's logic falls apart, it's not a very good puzzle, hence my problem.

Let us start from the back, looking at vote power, and see how the coins would be split, expanding the reasoning:
D/E - D has the power. He doesn't need any votes and can do what he wants. (100/0)
C/D/E - E has the power. C needs a single vote, or he dies. Since he cannot die, E's power is the most important. D cannot be trusted, since if the proposal fails, he gets the power. (0/0/100)
B/C/D/E - C/D have the most power. E cannot be trusted due to his power in the next vote. B only needs to convince one of C/D. There are 2 equally valid solutions to this problem (C/D would work together to maximize their returns***). (49/51/0/0 or 49/0/51/0)
A/B/C/D/E - The power is split, heavily. Any group of 3 that can get more by going with each other over A will do so, because they aren't stupid, unless you follow the rules of this puzzle, which make no sense. A still will need to maximize his return while enticing 2 others to leave their groups (of the many possibilities), making sure he compensates them enough to make leaving their group, or not deliberately betraying him, more valuable. (If you can differentiate between C/D in the previous split: 32/0/34or0/34or0/34; if you cannot: 16/50/0/0/34)

      • C/D would also betray each other to maximize their returns, since only 1 of them can receive it, and because their votes are equally valuable, giving 1 of them 51 is just the easiest solution. It is conceivable that they would begin to bid each other down, since B only needs one of them, but the moment they hit below 50 coins, they should realize that by working together they get 50 coins, and stop bidding each other down. Unless they are stupid and not rational. The issue, of course, is that if you want a logical answer, having both of them be equally likely, and impact the next answer, isn't clean-cut. But I didn't design the problem. I just think it is a terribly stupid one. — Preceding unsigned comment added by (talk) 03:31, 12 June 2014 (UTC)

When it comes down to it, pirate C is the only one who gets to decide what the split is, and E has to make his choice AFTER C proposes the split. C will always propose a 99:0:1 split, because that's his best deal. Is E just going to vote against him on principle?
"Argh. You didn't propose the split I wanted, so now I'm going to take NO MONEY out of spite!" -Your Hypothetical Pirate E
Once C makes his proposal, E has a choice between 1 and 0. No matter how complicated you try to make it, that's what it comes down to. E's only choice is 1 coin or nothing. And as long as E would always prefer one coin over nothing, there will never be any risk in proposing the 99:0:1 split. There is no gambling involved. (talk) 02:17, 13 June 2014 (UTC)

Again, I get that, but there is no spite involved. None whatsoever, in fact. What is happening is that, in your answer, C is maximizing his utility and power and E is minimizing his utility and power.

If C gets 0 coins, lives, and E gets 100 coins, then C is still maximizing his utility (He's still getting the most out of the situation that he can and not die) and E is actually maximizing his utility instead of just accepting it like an idiot would.

Once again, I can accept that C can propose whatever split he wants. But if E sets the rules saying "You give me 100 coins, or you die" which is 100% rational for him since it maximizes his utility, it maximizes his power, and C does not do that, C is gambling. E very well may be rational in taking that 1 coin, but he is failing to maximize his utility at that point, whereas C is maximizing his utility, but is breaking the rules to do so. He cannot propose what has a risk of dying. A pirate E not maximizing his utility would accept that, but then he's not being rational.

I understand that it's a contradiction; I'm deliberately pointing out that there's a logical contradiction in this problem. A rational pirate E would maximize his power to maximize his utility. A rational pirate C would then need to make the choice: does he accept E's power, or does he gamble? But a rational pirate cannot gamble. And a rational E wouldn't even need to state the rule: if E is trying to make the most, C would already know that E wants 100 coins or he's dead. This creates the contradiction, and the reason I view the accepted answer as wrong is that, as above, while C is trying to maximize his return in both answers (99 and 0), E is not. E is only trying to maximize his return in my answer, and is accepting his fate in the common answer. Which is far from rational. If you are rational, you will try to maximize it, and if you can maximize it by pointing out that not giving everything to you will result in a death, you have created the logical wall where both pirates face a situation where they break the rules: C by proposing, because he cannot die; E by accepting, because he's not maximizing his utility unless C proposes a 0-0-100 split.

If rationality implies that he is going to maximize his utility, maybe spite is a rational action. If C understands that E will kill him, even if he proposes 1-0-99 coins, then E is still being rational. E is maximizing his utility. He is using his power to the full extent to get everything he can out of the situation. And so is C, because C wants to live above all else. It's what happens when the logic isn't actually reasonable. By playing the game differently, E wins that situation, but only because C cannot die. If C can gamble, he's not logical, but would win in that situation.

Let's put it this way. You and I are in this situation. If I was E, I'd kill you for not giving me 100 coins. Period. I'm maximizing my utility. The moment you do not adhere to my demands, which are very much a hostage situation, you're dead. It's in my benefit to do this, because if I followed really stupid logic rules, I'd end up with one coin at most. Because you can't die. You, on the other hand, in the hostage situation, would settle for a coin I dropped then let me go on my merry way, and pretend you maximized your utility. I don't need to pretend anything. I got exactly what I wanted. You can be satisfied with a fake answer and a gamble. — Preceding unsigned comment added by (talk) 21:57, 11 July 2014 (UTC)

You're still making the same mistake. You're saying that E gets to make his decision first. E does not get the chance to make threats before C proposes his split. The situation starts, then C says "99:0:1". By that point, it's already too late for E to make any threats. The proposal is set, and E only gets to choose whether to accept the proposal or not. You are essentially attempting to turn this game into "The bottom ranked pirate gets to propose the split", which is completely the opposite of what it is.
The problem explicitly states that "The pirates do not trust each other, and will neither make nor honor any promises between pirates apart from the main proposal", which means that even in the event that E did make such a threat, C would know that he could not follow up on it. He's not allowed to make a binding promise, so, given the choice between 1 coin or 0, he would break his promise and vote in favor of C's proposal anyways. It's completely ridiculous to say that he'd be maximizing his utility by killing C on the 99:0:1 split. He'd be getting nothing instead of 1 coin.
The only way that what you are proposing works is if both of the following are true:
1. The pirates are allowed to make promises.
2. Pirate E gets to make a promise before any other pirate gets to make a promise, and before C gets to propose the split.
The first one is explicitly forbidden, and there's no basis for the second one; it makes no sense that anyone besides the current leader (pirate C) would get to speak first. (talk) 02:39, 13 July 2014 (UTC)
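The rebuttal above reduces E's situation to a choice between 1 coin and 0 once C has spoken. That can be spelled out in a tiny sketch (my own framing, assuming the puzzle's rules: after C's death, D wins the 1-on-1 vote with his casting vote and keeps everything, so E's continuation value is 0):

```python
# Three pirates remain: C proposes, then D and E vote.  If C dies,
# D proposes 100:0 and wins on his own casting vote, so E's
# continuation value is 0 coins.
E_CONTINUATION = 0

def proposal_passes(e_share):
    # E votes yes only if his share strictly beats what he would get
    # after C's death; C's own vote then makes it 2 of 3 votes.
    return e_share > E_CONTINUATION

# C keeps the most gold over all proposals that still pass:
best_for_c = max(100 - e for e in range(101) if proposal_passes(e))
print(best_for_c)  # 99: one coin to E is the cheapest winning bribe
```

Under these assumptions any threat by E is empty, because at the moment E actually votes, refusing the coin only moves him from 1 to 0.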

Missing solution for high number of pirates[edit]

I think a solution might be missing. If you have a very high number of pirates, for which at least half of them will die regardless of their strategy (as exposed in Extension of the game), then the first pirate may very well allocate all the gold to himself, as all the others who would have died will vote for him so that they actually don't die. I'm not too keen on calculations, so if anyone wants to establish the formula... Alleria86 (talk) 09:22, 16 April 2014 (UTC)

Establishing a formula would probably be original research [original research?] which couldn't be added. Once you get some pirates dying, you have to start making assumptions about life expectancy, inheritance rules, time to carry out a vote, time to distribute booty and probably more. Too many variables for me. I don't think you have to have half dying before the solution changes. Once there are some unexpected deaths, pirates won't know if they are of odd or even rank, so the normal solution fails. Low ranks may as well vote lots out to have a chance of getting a higher distribution. Suppose 10,000 is the most that can be efficiently voted on and distributed without expectation of deaths playing a role, but you have 100,000 pirates and 100,000 gold. I can't think of anything much better than offering 6 gold to ranks 50003 through 66666 and 2 gold to ranks 1 to 4, with the 12 remaining gold given at random amongst those allocated 6. Why? Low ranks cannot be bribed, as they can enjoy some killing until the numbers are lower, for more chance of being offered a higher payout when there are fewer pirates. If the top third cannot think of a better distribution, they will suffer the same fate, so they should agree just to survive without being bribed. The trouble with trying to bribe ranks 49999 through 50001 is that unexpected deaths may mean they do not get a payout, so they prefer to vote out a few to make sure. While 50005 could enjoy a few deaths and is safe from losing gold for quite a few rounds of votes, if he votes against, many rounds of voting out may occur. This leaves a few votes short. I chose 1-4 as targets to bribe as these seem least likely to end up as rank 10000 or gain a rank above 50% to get some gold. If the previous lead pirate failed when trying to bribe 1-4 with 2 gold, then these low ranks should be varied. That way they should take a bribe when it is offered. Not sure this is the right solution; there could be better solutions, or many of a similar level of sense.
If ranks just above 66666 voted against, a subsequent lead pirate might spread the gold over a larger number of pirates above the 50% ranking, but then those just above 50% might vote against in the hope the gold is spread to fewer pirates. Such swings could mean there may not be a proposal that works until you get down to somewhere near 10,000 pirates. Tapering the gold downwards above 50% rankings may discourage having lots of rounds of voting against. Such tapers might work better with much more gold: for example, 100,000 pirates and 1,250,000,000 gold allows you to bribe the 3rd-highest rank with 1 gold, the 4th-highest with 2 gold, the 5th-highest with 3 gold, and so on. Anyway, probably [original research?] crandles (talk) 22:01, 14 April 2015 (UTC)

Thinking of "correcting" the answer?[edit]

Not intended for a section for much discussion, but please post here if you've decided the answer & solution are wrong and you'll edit them. I've just reverted back an anonymous edit by someone who had obviously read a slightly different version of the puzzle, so they missed that the different answer here was actually correct. Their solution contradicted the rules given here. - Zuytdorp Survivor (talk) 23:38, 17 March 2015 (UTC)

I missed this and have already corrected the article. Oops. Alexshipp (talk) 09:25, 13 October 2015 (UTC)

I see someone removed my correction. However, the 'Extension' piece is factually incorrect and does not follow the Ian Stewart logic (here [1])

Pirate #202 does not have to offer gold only to even-numbered pirates. As Ian Stewart points out, he can also offer gold to pirate #201, as pirate #201 got 0 gold the previous round. Pirate #202 can therefore offer 100 gold among 101 pirates, each of whom therefore has a 100/101 chance of getting gold.

This means #204 will not make the same offer as #202. #204 can now offer the gold to any of the first 202 pirates, each of whom gets a 100/202 chance of gold. Pirates #203 and #204 get nothing.

Similarly, #208 will not make the same offer as #204. #208 can offer gold to any of the first 204 pirates, each of whom has a 100/204 chance of gold. Pirates #205 to #208 get nothing.

This means the conclusion "no pirate whose number exceeds 2G can expect any gold" is wrong.

So is this one: "The gold will go to those of the first 2G pirates whose numbers have the opposite parity to log2 M."

I therefore propose to correct these errors and put the explanation back Alexshipp (talk) 06:55, 14 October 2015 (UTC)

Sorry, yes. I have tried to correct the extension section, but there may be other, clearer ways of explaining it. I guess a certain offer of 1 is better than a 100/101 chance of being offered a gold coin, but giving much consideration to this might be [original research?]. The source does say the solution is not unique, so I hope what little I said about this is OK. crandles (talk) 15:17, 15 October 2015 (UTC)

Part of a gold coin distributions[edit]

Does the problem as described adequately rule out part-gold-coin distributions via some random procedure in the distribution proposal? E.g. A sets a distribution so that C and E each roll a die and, if they roll a 6, they get a gold coin, otherwise 0 gold coins. I.e. each is effectively bribed with 1/6 of a gold coin, so the expected gold for A is 99.666? Then a better solution is to roll n dice and get the gold coin only if the total is 6n, with n being a high number. How best to edit in that restriction? crandles (talk) 10:05, 15 April 2015 (UTC)
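The expected value quoted above can be checked with exact arithmetic (a quick sketch; it just assumes each bribed pirate values the dice lottery at its expected number of coins):

```python
from fractions import Fraction

p_six = Fraction(1, 6)            # chance a single die shows a 6
expected_bribe = 2 * p_six * 1    # C and E each stand to win 1 coin
expected_for_A = 100 - expected_bribe
print(expected_for_A)             # 299/3, i.e. about 99.667
```

So A's expected haul is 299/3 ≈ 99.667, matching the 99.666 figure in the comment above up to rounding.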

Changed from "apart from the main proposal" to "apart from a proposed distribution plan that gives a whole number of gold coins to each pirate" crandles (talk) 10:12, 15 April 2015 (UTC)