Talk:Tit for tat

WikiProject Game theory (Rated B-class, Mid-importance)
This article is part of WikiProject Game theory, an attempt to improve, grow, and standardize Wikipedia's articles related to Game theory. We need your help!
This article has been rated as B-Class on the quality scale and as Mid-importance on the importance scale.

WikiProject Economics (Rated B-class, Low-importance)
This article is within the scope of WikiProject Economics, a collaborative effort to improve the coverage of Economics on Wikipedia. If you would like to participate, please visit the project page, where you can join the discussion and see a list of open tasks.
This article has been rated as B-Class on the project's quality scale and as Low-importance on the project's importance scale.


Axelrod's Competition

I changed the 1984 date that had been previously noted for Axelrod's competition. It was actually two competitions, and at least one of them had to have been held prior to Axelrod's 1981 publication of the original paper in the journal Science. There appear to be nice references in the Wikipedia article "Evolution of Cooperation", but I don't know how to do a link. I'm not exactly sure why the fourth condition applies, so if anyone finds an explanation, please post. I think the fourth condition is necessary because the retaliation only works if the agent plays the same opponent twice in a row. Herman - hke AT home.nl

"Downfall"[edit]

I think this section either needs to be removed or renamed/rewritten. I can only interpret the existence of this section as a misunderstanding of the workings of IPD and Tit for Tat. The section says that Tit for Tat was "beaten for the first time" in the year 2004, which would be incorrect. The success of a given strategy competing in the IPD is entirely dependent on the environment. Robert Axelrod outlined this in his book The Evolution of Cooperation, where he also tested a population where Tit for Tat didn't win. The situation in that population was similar to the one in the "Downfall" section - one category of strategies was induced to provide many points to another (although they were not deliberately designed for this purpose), which led to Tit for Tat losing the top position. The lasting point is that Tit for Tat is robust in a wide variety of environments - including those where the populations of strategies are allowed to increase/decline over a number of generations according to their scores. [signed by /dkristoffersson]

The naive interpretation of "winning" doesn't include the sort of "winning" that resulted in one member outscoring Tit for Tat - the article should be more clear about the precise definition of "winning" used in this case. [unsigned]

Requested move

The term "Tat" in this article shouldn't be capitalized. According to the naming convention:

Do not capitalize second and subsequent words unless the title is a proper noun (such as a name) or is otherwise almost always capitalized (for example: John Wayne, but not Computer game).

--goethean 19:41, 3 October 2005 (UTC)


Add *Support or *Oppose followed by an optional one sentence explanation, then sign your vote with ~~~~
  • Oppose Tit for Tat is a proper noun, it is the name of a specific strategy. --best, kevin ···Kzollman | Talk··· 01:11, 4 October 2005 (UTC)

Discussion

Add any additional comments
  • Support for the reason given. (And because "a specific strategy" does not make a proper noun, contrary to our comma-splicer's reasoning.) Stephan Leeds 06:00, 6 October 2005 (UTC)
  • Oppose I agree with Kzollman for the reason specified: Proper Noun. The decision should depend on the etymology of the expression. --TimeHorse (talk) 11:33, 26 July 2010 (UTC)

Decision

Page moved per request and double redirects fixed. Ryan Norton T | @ | C 07:03, 17 October 2005 (UTC)

What tournament?

What tournament is this article referring to? It just mentions a tournament without any explanation. - idiotoff 07:57, 24 April 2006 (UTC)

Temporal confusion

In the introductory paragraph, "It was first introduced by Anatol Rapoport in Robert Axelrod's 1984 tournament." In the fourth paragraph of the Overview section, "For several decades Tit-for-Tat was the most effective strategy…" So... am I missing a few decades? Theory.Of.Eli 20:59, 26 April 2006 (UTC)

I removed the decades and reworded the sentence. It is not perfect, but now more accurate. Marc Harper 14:31, 11 August 2006 (UTC)

Practical Applications

Is there any room on this page for practical applications of Tit for tat? Examples: BitTorrent, stop lights. Swerty 20:05, 2 May 2006 (UTC)

Or a lot of cooperation in nature (between unrelated organisms). Yeah, examples are good; I came to this article to see if it mentioned BitTorrent. BrokenSegue 03:32, 8 August 2006 (UTC)

What about Tit for two tats

Tit for two tats is sometimes (but not always) more successful than tit for tat. It certainly deserves a mention. It retaliates with defection only after two consecutive defections by the opponent. This avoids some lock-ins to a mutual retaliation loop. [unsigned posting]
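
For readers unfamiliar with the rule, here is a minimal sketch in Python (the function name is illustrative, not from any source) of the decision rule described above: cooperate unless the opponent's last two moves were both defections.

# Minimal sketch of "tit for two tats": defect only after the opponent
# has defected on two consecutive moves; otherwise cooperate.

def tit_for_two_tats(opponent_history: list) -> str:
    """Return 'C' (cooperate) or 'D' (defect) given the opponent's past moves."""
    if len(opponent_history) >= 2 and opponent_history[-2:] == ["D", "D"]:
        return "D"
    return "C"

# Example: a single defection is forgiven, a second in a row is answered.
print(tit_for_two_tats(["C", "D"]))       # C
print(tit_for_two_tats(["C", "D", "D"]))  # D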

Logical Fallacy

The article states that "A fifth condition applies to make the competition meaningful: if an agent knows that the next play will be the last, it should naturally defect for a higher score." There is a logical fallacy in the reasoning that defection is best for the last move. While it is true that, whether the other player chooses to cooperate or defect, choosing to defect will always gain a higher score, this is a simplistic view. Generally, both players cooperating is more beneficial for both sides than both defecting. So, if both defect because they are aware it is the last play, then they both get fewer points than they would if they had both cooperated. [unsigned comment]

There is no fallacy. My best option is to defect if I know you will have no recourse to exact revenge. Your statement, "Generally, both players cooperating is more beneficial for both sides than both defecting," is true only if the game is iterated. In the non-iterated Prisoner's dilemma, you must agree, defect is the best option. When there is only one move left, it is effectively a single run of the non-iterated scenario. Thus, people should always defect on their last turn (which is why the rules state that they must never know when the game will end). BrokenSegue 04:39, 17 November 2006 (UTC)
To both of which I would say, you are as believable as your sources. What are they? Leprof 7272 (talk) 17:29, 14 April 2015 (UTC)
I believe the reference was to the payoff matrix, e.g. both cooperating gives 3 points each, both defecting gives 1 each. In this case, even if the game is not iterated, if both defect on the last move they will have fewer points than if both cooperate. - AlKing464
True (and I just added the payoff matrix to clarify the situation); however, there still is no fallacy. In fact, the matrix is correct and consistent with other texts. The issue is that even though they would be better off both cooperating instead of both defecting, they will still both defect. This is because no matter what their opponent does, they are better off defecting (and they have no way to enforce cooperation agreements). Thus, in a non-iterated round, they will both defect. BrokenSegue 14:17, 22 November 2006 (UTC)
If I am not mistaken, in the Tit for Tat strategy, an agent will defect when playing a partner who defected last time, and cooperate when playing a partner who cooperated last time. It doesn't matter if it's the final iteration; Tit for Tat agents only consider the previous iteration... — Preceding unsigned comment added by 99.248.99.157 (talk) 01:17, 18 August 2011 (UTC)
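
For anyone following this thread, here is a minimal single-round check in Python (a sketch assuming the standard payoff values T=5, R=3, P=1, S=0 discussed above) of the dominance argument: defecting is the better reply to either move by the opponent, even though mutual cooperation pays both players more than mutual defection.

PAYOFF = {("C", "C"): 3, ("C", "D"): 0,   # (my move, opponent's move) -> my payoff
          ("D", "C"): 5, ("D", "D"): 1}

for opponent_move in ("C", "D"):
    best = max(("C", "D"), key=lambda me: PAYOFF[(me, opponent_move)])
    print(f"Against {opponent_move}, the best single-round reply is {best}")
# Prints D in both cases, even though (C, C) gives each player 3 while (D, D) gives only 1.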

Evolutionarily stable strategy

The strategy is not in itself an evolutionarily stable strategy, by definition, as is stated at the end of the first paragraph. While it prevents invasion by defectors, it is possible to invade a Tit for Tat environment with cooperators. TechnoBone 16:41, 2 December 2006 (UTC)

Source, so we can use this observation? Leprof 7272 (talk) 17:29, 14 April 2015 (UTC)

Prisoner's dilemma article information

There is a lot of information in the Prisoner's dilemma article that could be incorporated into this more specific one. --165.230.46.148 22:02, 11 December 2006 (UTC)

Please, the game theory content needs not more cut-and-paste material, but rather sourced, more reliable, verifiable material. No more essaying, please. Leprof 7272 (talk) 17:29, 14 April 2015 (UTC)

What about unequally powerful players?

I tried the tit for two tats strategy in the Utopia web-game and failed within days. The reason was, I guess, mostly that my opponents were more powerful than me. They got a full payoff when they defected. When I defected in response, I barely affected them. So "tit for tat" and "tit for two tats" seem applicable only in scenarios with players of equal power. Can anyone give tips on strategies and research concerning unequally powerful players?

In Robert Axelrod's book The Complexity of Cooperation he first mentions the problem of unequally powerful players on page 127. Tommy 20:41, 29 October 2007 (UTC)
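
As a rough illustration of the asymmetry described above (all payoff numbers here are invented for the sketch, not taken from any source), the weak player's retaliation barely dents the strong player's per-round payoff, so the usual tit-for-tat logic loses its teeth:

# Each side scores from its own (asymmetric) payoff table: (my move, their move) -> my payoff.
WEAK   = {("C", "C"): 3, ("C", "D"): -5, ("D", "C"): 4, ("D", "D"): -4}
STRONG = {("C", "C"): 3, ("C", "D"):  2, ("D", "C"): 8, ("D", "D"):  7}

def tit_for_tat(opponent_history):
    return "C" if not opponent_history else opponent_history[-1]

weak_history, strong_history = [], []
weak_score, strong_score = 0, 0
for _ in range(10):
    weak_move = tit_for_tat(strong_history)
    strong_move = "D"                      # the strong player simply always defects
    weak_score += WEAK[(weak_move, strong_move)]
    strong_score += STRONG[(strong_move, weak_move)]
    weak_history.append(weak_move)
    strong_history.append(strong_move)

print(weak_score, strong_score)            # -41 71 with these made-up numbers
# Mutual defection costs the strong player only 1 point per round relative to
# exploiting a cooperator (7 vs 8), so the weak player's "tat" is no deterrent.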

Offensive statement in Implications?

There is a possibly offensive statement in the Implications section, in the following quote:

Also the theory can give insight in how technological innovation have taken place in history, and in particular, why the modern age evolved in the many competing kingdoms of Europe, but not for example in China.

I can't tell if this information is derived from Axelrod's book or not; could we reference that as the book's idea, or at least flesh out how Tit for Tat relates to the Enlightenment happening in Europe, not China? I'm not trying to be excessively PC, but there were a lot of environmental factors involved in how different regions developed. And besides, the exact point is unclear -- are we saying that China's relative isolation made them less internally cooperative? It seems a bit sketchy to me. Aletheion 18:08, 19 August 2007 (UTC)

It looks like nonsense to me. It's been deleted. --best, kevin [kzollman][talk] 03:16, 30 October 2007 (UTC)

...and so on. Really?

The overview claims, "Similarly if it knows that the next two plays will be the last, it should defect twice, and so on." [Emphasis mine.] Clearly(?), this does not inductively extend to: "If it knows the competition will be limited to a finite number of plays, it should always defect." Why not?

Or does it? And Tit for Tat should only be used when the end is determined randomly, after each round, so that an infinite game could be possible. Furthermore, only assuming a universe that manages to avoid a big crunch or heat death. (:  — gogobera (talk) 20:57, 27 December 2007 (UTC)

I gave you a comment on your talk page. Tommy (talk) 21:24, 30 January 2008 (UTC)
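
To gogobera's question: under the standard backward-induction argument, yes, the reasoning does extend to any commonly known finite horizon. Here is a minimal sketch in Python (assumed standard payoffs T=5, R=3, P=1, S=0; an illustration of the textbook argument, not a general game solver): because the equilibrium continuation after any history is the same, the stage-game dominance of defection carries back from the last round to the first.

PAYOFF = {("C", "C"): 3, ("C", "D"): 0,   # (my move, opponent's move) -> my payoff
          ("D", "C"): 5, ("D", "D"): 1}

def backward_induction(rounds):
    """Work from the final round backwards; the opponent is assumed to play its
    own equilibrium move (defect) in every round, and the continuation value is
    the same whatever I do now, so it never changes the comparison."""
    continuation = 0
    plan = []
    for _ in range(rounds):
        best = max(("C", "D"), key=lambda me: PAYOFF[(me, "D")] + continuation)
        plan.append(best)
        continuation += PAYOFF[(best, "D")]
    return list(reversed(plan))

print(backward_induction(6))   # ['D', 'D', 'D', 'D', 'D', 'D']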

Remarks [Generality of Article, Etymology]

K, this article is great and all, but Tit for Tat is an adage or saying, and the general populace does not think of it in relation to the prisoner's dilemma. This article should either be a subsection in an article about the actual saying or its own article, such as Tit for Tat (Prisoner's Dilemma Strategy). —Preceding unsigned comment added by 24.4.200.52 (talk) 02:32, 8 February 2008 (UTC)

Seconded! Specifically, what is the etymology of the expression? --TimeHorse (talk) 11:24, 26 July 2010 (UTC)
✓ Done in part, in the edit today. I began the generalization of the article today. Please, @User:TimeHorse and others, continue the process? Leprof 7272 (talk) 17:29, 14 April 2015 (UTC)

Has Nash Equilibrium been proven?

Has it been proved (under appropriate conditions) that Tit for Tat is a Nash Equilibrium strategy for Infinitely Repeated Prisoners' Dilemma? 146.115.34.7 (talk) 20:28, 8 January 2009 (UTC)

Not just in game theory

Tit for tat is not only a game theory concept. It has other, informal meanings that ought to be explained at the start, before the game theory kicks in. 89.242.93.56 (talk) 13:11, 4 October 2009 (UTC)

✓ Done in the edit today. Began the process of generalizing the article today. Leprof 7272 (talk) 17:29, 14 April 2015 (UTC)

Tit for tat is not optimal

My recollection is that tit for tat is not optimal; that the testing in the 1980s was (inadvertently) rigged in its favor. Does anyone have more concrete information along these lines? --Quantling (talk) 17:00, 1 May 2010 (UTC)

Not so impractical for real life

This parenthetical in the Overview section:

(Thus, the game-theory measure of effectiveness is impractical in many real life situations where players do have a vested interest in, or an altruistic compassion towards, other players.)

I have to disagree. If the players do have "a vested interest in, or altruistic compassion toward, other players", then that should be reflected in their own payoff matrix. It does not make game theory impractical; it just means you have to incorporate it into the model. Should we remove the parenthetical, add a discussion of this, or leave it as it is (in case someone disagrees with me)? Luqui (talk) 12:58, 12 June 2010 (UTC)

It seems the preceding sentence is at the root of your objection, as it generally rules out a payoff matrix which takes into account benefits to anyone other than the player. This would be incorrect if, for example, the players were genetically related or had some other common interest, e.g. Formula 1 racers from the same team. Stephen B Streater (talk) 15:03, 12 June 2010 (UTC)
I agree with Luqui; the parenthetical statement is misleading. A given model that does not include altruistic compassion in the payoffs isn't a shortcoming or weakness of game theory; rather, it indicates that the specific model is incorrect because it does not account for the "universe" being modeled. Further, altruism can arise simply because the game is repeated, as agents have a vested interest in their future transactions, even if the agents don't have altruistic compassion towards others. Many (most?) real-life interactions do not have strictly independent actions. Agents typically will either interact with the same agent sometime in the future or incur some effects due to communication between agents (e.g., if you rob a store, that agent will report your actions to the police, and you'll be arrested if caught). Pure one-shot games are somewhat rare. Information on this is pretty easy to find, such as http://plato.stanford.edu/entries/game-theory/ and much of this whole book: http://www.amazon.com/Repeated-Games-Reputations-Long-Run-Relationships/dp/0195300793 Halcyonhazard (talk) 08:05, 14 June 2010 (UTC)
I agree that one could redefine the payoff matrix so that it included altruistic compassion and/or a vested interest in the fate of the other players. But then we would have no guarantee that the payoff matrix would continue to have the prisoner's dilemma feature (where there is an advantage to the selfishness of ratting when the other player does not), and thus we could easily find ourselves in a situation where the tit for tat strategy is not applicable. Or, to say it another way, there are many situations where the tit for tat strategy is applicable (and more-or-less optimal) only if those situations are "dumbed down" by eliminating real-world considerations. I know people who see the phrase "tit for tat is optimal" and don't realize that the applicability of that statement has limits in the real world. I am hopeful that some note of this "limited applicability" can be added to the article. Quantling (talk) 13:45, 14 June 2010 (UTC)
Modeling complex real-world interactions in social settings as simple repeating normal form games is analogous to modeling momentum using spherical cows gliding on frictionless surfaces in introductory physics classes; it's useful for instruction and may even be a good approximation for certain (many) real-world settings, but it is still a very simplified model. Tit for tat is an important strategy that is most easily explained in such a simple environment, so I agree with your assessment that it should be qualified as such. Halcyonhazard (talk) 20:44, 14 June 2010 (UTC)
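
As a concrete illustration of Quantling's point, here is a sketch in Python (the payoff values and the altruism weight are illustrative assumptions, not from the article) showing that once each player's effective payoff includes a share of the other's, the matrix can stop satisfying the prisoner's dilemma ordering T > R > P > S, at which point "tit for tat is optimal" no longer even applies:

BASE = {("C", "C"): (3, 3), ("C", "D"): (0, 5),
        ("D", "C"): (5, 0), ("D", "D"): (1, 1)}

def altruistic_payoffs(w):
    """Each player's effective payoff is their own payoff plus w times the other's."""
    return {moves: (a + w * b, b + w * a) for moves, (a, b) in BASE.items()}

def is_prisoners_dilemma(p):
    T, R = p[("D", "C")][0], p[("C", "C")][0]
    P, S = p[("D", "D")][0], p[("C", "D")][0]
    return T > R > P > S

for w in (0.0, 0.3, 0.7):
    print(w, is_prisoners_dilemma(altruistic_payoffs(w)))
# 0.0 True, 0.3 False, 0.7 False: for w of 0.25 or more the "sucker" outcome
# (0 + w*5) already matches or beats mutual defection (1 + w*1), so the dilemma,
# and with it the usual tit-for-tat analysis, disappears.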

I for one believe all of the above arguments to the extent that sources are given in support. We editors are not the experts; we are editors. This is not a blog, it is an encyclopedia. Make any changes desired, but based on sources please, not based on who wins arguments here, or because one editor assents to the request of another. On matters of WP policy, assent. On matters of content, source. Le Prof Leprof 7272 (talk) 17:29, 14 April 2015 (UTC)

Etymology

A good source for the Etymology of the expression is the Online Etymology Dictionary: probably a corruption of "Tip for Tap", as in I hit you so hard I tip you over and you hit me back with just a tap; this being unequal use of force. --TimeHorse (talk) 11:42, 26 July 2010 (UTC)

✓ Done in the edit today. Made this edit in the lede rewrite and first section introduction today, using the M-W dictionary as a source for the same information. Leprof 7272 (talk) 16:58, 14 April 2015 (UTC)

Christian tradition is not an eye for an eye

Christian tradition is not an "eye for an eye" tradition. This should be corrected, imho. --LarryJ 19:51, 7 February 2011 (UTC) — Preceding unsigned comment added by Larjohn (talkcontribs)

✗ Not done because not found. I could find no evidence of this connection. "Eye for an eye" appears only in the links at the end, so this request is not germane to the article as it stands (as far as I can see). Leprof 7272 (talk) 17:29, 14 April 2015 (UTC)

Example

Note that if one of the two tit-for-tat strategists were to switch to defecting in the last game, as that wouldn't produce any disadvantage for him in the future, he would win with 30 points while the other tit-for-tat strategist would lose with only 25. And if both were to defect on the last move, the game would end in a 26-point draw. It's clear that if the prisoners aren't highly altruistic (as is usually the case with bandits) they will all defect in the end, because doing otherwise can only disadvantage them. This is, of course, only if they know that the 6th game is the last one, and this is not part of the tit-for-tat strategy. — Preceding unsigned comment added by 62.221.56.166 (talk) 20:05, 8 October 2011 (UTC)

Source of opinion? Le Prof Leprof 7272 (talk) 17:29, 14 April 2015 (UTC)
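
For what it is worth, here is a minimal simulation sketch in Python of the scenario described in the comment above, assuming the standard payoff values T=5, R=3, P=1, S=0; the article's example evidently uses different numbers, so the totals below will not match the 30/25/26 figures, though the qualitative point is the same.

PAYOFF = {("C", "C"): (3, 3), ("C", "D"): (0, 5),
          ("D", "C"): (5, 0), ("D", "D"): (1, 1)}

def tit_for_tat(my_history, opponent_history):
    """Cooperate first, then copy the opponent's previous move."""
    return "C" if not opponent_history else opponent_history[-1]

def defect_on_last(total_rounds):
    """Tit for tat, except defect on the known final round."""
    def strategy(my_history, opponent_history):
        if len(my_history) == total_rounds - 1:
            return "D"
        return tit_for_tat(my_history, opponent_history)
    return strategy

def play(strategy_a, strategy_b, rounds=6):
    history_a, history_b, score_a, score_b = [], [], 0, 0
    for _ in range(rounds):
        a = strategy_a(history_a, history_b)
        b = strategy_b(history_b, history_a)
        pa, pb = PAYOFF[(a, b)]
        history_a.append(a); history_b.append(b)
        score_a += pa; score_b += pb
    return score_a, score_b

print(play(tit_for_tat, tit_for_tat))               # (18, 18): mutual cooperation throughout
print(play(defect_on_last(6), tit_for_tat))          # (20, 15): the last-move defector comes out ahead
print(play(defect_on_last(6), defect_on_last(6)))    # (16, 16): both defect at the end and both lose a little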

Etymology, repeat

Two years ago there was an etymology discussion, but no one ever performed the edit. If someone can, please do so. It is very much needed; the game theory is annoying to someone who simply needs the long-term history. [ unsigned, presumed from 24.4.200.52 (talk) ]

hopiakuta Please do sign your communiqué .~~Thank You, DonFphrnqTaub Persina. 00:00, 27 February 2012 (UTC)
✓ Done in the edit today. I accomplished this twice-requested general improvement to the article, which underpins all other usages (meanings) appearing in the article, whether game theoretic, biologic, or social psych. Unclear why the preceding editor focused on the lack of signature, and not on making the edit. Leprof 7272 (talk) 17:02, 14 April 2015 (UTC)

This Article is the %$#&!!!

Keep up the good work. [unsigned comment, 07:35, 23 March 2012 (UTC)]

Not really a useful contribution. But at least no one can revert you for taking the initiative. Leprof 7272 (talk) 17:00, 14 April 2015 (UTC)

Picture confusion

This picture really does not capture the idea of this concept. — Preceding unsigned comment added by 71.253.123.211 (talk) 01:01, 16 July 2012 (UTC)

✓ Done in the edit today. I concurred, and removed the image. Leprof 7272 (talk) 16:59, 14 April 2015 (UTC)

Reference sorely needed

This phrase seems to scream out for a reference: "The team which recently won over a pure tit for tat team outperformed it with some of their algorithms because they submitted multiple algorithms which would recognize each other and assume a master and slave relationship (one algorithm would "sacrifice" itself and obtain a very poor result for the other algorithm to be able to outperform tit for tat on an individual basis, but not as a pair or group)."

That was a very pertinent and current comment at one point in time, and was perhaps independently recognizable as such, but that time is long gone; the statement now needs a reference and perhaps rephrasing to indicate that it is no longer current. [unsourced comment]

✓ Done by someone earlier. Offending statement no longer seems to appear. Leprof 7272 (talk) 16:54, 14 April 2015 (UTC)

Key Concepts of "Cooperate" and "Retaliate" Never Defined or Explained

Adding an example near the top of the article of what cooperation and retaliation mean, perhaps in relation to a prisoner's dilemma scenario, would be very helpful. In the absence of any explanation, I can speculate that cooperation could mean silence and retaliation means blaming/betrayal, but there should be no need to speculate on concepts so fundamental to the article topic. — Preceding unsigned comment added by 98.100.23.77 (talk) 20:16, 29 January 2014 (UTC)

✗ Not done, but I concur it needs to be. The issue remains a problem in the business, social psych, and game theory sections. I concur with the need for clarity, esp. in the game theory and social psych areas, where the definitions exist. Leprof 7272 (talk) 16:55, 14 April 2015 (UTC)

"In popular culture"[edit]

The section heading is misleading -- the section goes on to talk about similar concepts in the Codex of Hammurabi and the Abrahamic religions. Are those "popular culture" now? Maybe the heading should be "Similar concepts" or something (or the section could be removed entirely -- the first link under "See also" does the job just as well IMO). 178.115.129.150 (talk) 09:35, 2 October 2014 (UTC)

I agree. The "See also" link pretty much covers it. I will make the edit. Leegrc (talk) 15:09, 3 October 2014 (UTC)

Inaccurately narrow lede, and WP:VERIFY, POV, OR, WEASEL, EDITORIAL, and PEACOCK violations

Before I begin to explain today's edit, please note that the refimprove tag has been in place since 2007. The issues with the tone and lack of sourcing are, based on a review of the edit history, longstanding issues with the article. Well, today, in order to make the article usable as a wiki-reference, they became a priority.

On coming here today to link to this article, I simply could not reference it as a general article on this subject the way it stood. Not only was the lede filled with game theoretic references indecipherable to a lay audience, but that specialist lede content appeared to the exclusion of the various other uses of the term in general and scholarly venues; moreover, the lede simply did not substantively reflect the existing article content. So I edited the lede to make it a general statement of the meanings and uses of the title term.

In order to do this, and to separate the game theoretic content from other areas, I created sections in the article, including one providing the first sourced definition and etymology of the title term. A second effect of the sectioning was to create sections that I and others can then populate in our specialty areas (leaving the game theoretic sections for the true specialists in those areas).

I also began to generally mark where in the article specific WP:VERIFY, WP:POV, WP:OR, WP:WEASEL, WP:EDITORIAL, and WP:PEACOCK violations were occurring. To do this inline would have meant nearly solid tagging in some sections, and so, sadly, I had to resort to section tags; I then summarized these in one multiple-issues tag below the lede (because the lede is the one part now largely fixed). The tags at top and in sections can be removed, quickly, as people attend to these real, substantive issues. Sentence after sentence in the social psychology and game theory sections cannot be left to express the founding editors' personal intellectual opinions. These statements need to be justified (made verifiable), line by line, or, better, replaced by new scholarly text that is sourced, more general, less opinionated, and up-to-date.

Given that this article had evolved into largely a specialist tome for game theorists, and not the general article it was intended to be, I expect that bias may result in a reversion, rather than acceptance of this prof's guidance toward a good, general article. I would simply ask you to use this effort on my part to see the article for what it is: a largely unfinished (unsourced) student opinion essay, limiting the general article to the narrow areas of game theory that its author knew about at the time of writing. Now is the time to make the article into a general article about the title subject. If reverting, please don't throw the baby out with the bathwater. Remove the things you think are wrong, not the whole. Cheers. Le Prof Leprof 7272 (talk) 16:53, 14 April 2015 (UTC)

✓ Done. I have noted the several requests made by others over the preceding history of this article that were in fact addressed by today's edit. Le Prof Leprof 7272 (talk)