Goodhart's law: Difference between revisions
Revision as of 07:13, 22 April 2017
Goodhart's law is an adage named after economist Charles Goodhart that states: "When a measure becomes a target, it ceases to be a good measure." The law follows from individuals anticipating the effect of a policy and then taking actions that alter its outcome.
Formulation
Goodhart first advanced the idea in a 1975 paper; the formulation was later used popularly to criticize the United Kingdom government of Margaret Thatcher for trying to conduct monetary policy on the basis of targets for broad and narrow money.[clarification needed] However, the concept is considerably older,[1] and closely related ideas are known under different names, including the subsequently formulated Campbell's law (1976) and the Lucas critique (1976).
The law is implicit in the economic idea of rational expectations.[further explanation needed] While it originated in the context of market responses, the law has profound implications for the selection of high-level targets in organizations.[2]
Expressions
Any observed statistical regularity will tend to collapse once pressure is placed upon it for control purposes.
- Goodhart's original 1975 formulation, reprinted on page 116 in Goodhart 1981[3]
A risk model breaks down when used for regulatory purposes.
- Jón Daníelsson's corollary of Goodhart's law, for financial risk modelling[4]
All metrics of scientific evaluation are bound to be abused. Goodhart’s law (named after the British economist who may have been the first to announce it) states that when a feature of the economy is picked as an indicator of the economy, then it inexorably ceases to function as that indicator because people start to game it.
- Mario Biagioli[5]
References
- ^ "Overpowered Metrics Eat Underspecified Goals". Ribbonfarm. Accessed 26 January 2017.
- ^ Goodhart, C.A.E. (1975). "Problems of Monetary Management: The U.K. Experience". Papers in Monetary Economics. I. Reserve Bank of Australia.
- ^ Goodhart, Charles (1981). "Problems of Monetary Management: The U.K. Experience". In Anthony S. Courakis (ed.), Inflation, Depression, and Economic Policy in the West. Rowman & Littlefield: 111–146.
- ^ Daníelsson, Jón (July 2002). "The Emperor Has No Clothes: Limits to Risk Modelling". Journal of Banking & Finance. 26 (7): 1273–96. doi:10.1016/S0378-4266(02)00263-7. – via ScienceDirect (Subscription required.)
- ^ Biagioli, Mario (12 July 2016). "Watch out for cheats in citation game". Nature. 535 (7611): 201. doi:10.1038/535201a. PMID 27411599.
Further reading
- K. Alec Chrystal and Paul D. Mizen, Goodhart's Law: Its Origins, Meaning and Implications for Monetary Policy, 12 November 2001.