A superforecaster is a person who makes forecasts that are aggregated and scored, and who proves to be consistently more accurate than the general public or experts. Superforecasters sometimes use modern analytical and statistical methods to augment estimates of the base rates of events. Research finds that they are typically more accurate than experts in the field who do not use those techniques.
The term is attributed to Philip E. Tetlock, arising from the results of The Good Judgment Project and his subsequent book with Dan Gardner, Superforecasting: The Art and Science of Prediction.
In December 2019, a Central Intelligence Agency analyst writing under the pseudonym "Bobby W." suggested that the intelligence community should study superforecaster research into how certain individuals with "particular traits" make better forecasts, and how such individuals could be leveraged.
Superforecasters estimate the probability of an occurrence and revise the estimate when the circumstances underlying it change. Estimates draw on personal impressions, public data, and input from other superforecasters, while attempting to remove bias. In The Good Judgment Project, one set of forecasters was given training on how to translate their understanding into a probabilistic forecast, summarised in the acronym "CHAMP": Comparisons, Historical trends, Average opinions, Mathematical models, and Predictable biases.
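The scoring of probabilistic forecasts mentioned above is commonly done with the Brier score, the accuracy measure used in The Good Judgment Project. The sketch below is a minimal illustration of that idea; the forecast probabilities and outcomes are made-up values, not data from the project.

```python
# Brier score: mean squared difference between the forecast probability
# and the outcome (1 if the event occurred, 0 if it did not).
# Lower is better: 0.0 is a perfect forecast, and always guessing
# 50% yields 0.25.

def brier_score(forecasts, outcomes):
    """Average Brier score over a set of binary-event forecasts."""
    assert len(forecasts) == len(outcomes)
    return sum((p - o) ** 2 for p, o in zip(forecasts, outcomes)) / len(forecasts)

# Hypothetical example: probabilities assigned to five events,
# followed by what actually happened (1 = occurred, 0 = did not).
probs    = [0.9, 0.2, 0.7, 0.1, 0.6]
outcomes = [1,   0,   1,   0,   1]

print(round(brier_score(probs, outcomes), 3))
```

A forecaster who revises an estimate as circumstances change is scored on each revised probability, so being well calibrated over many questions, rather than occasionally right, is what drives a low average score.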
The group given "CHAMP" training was claimed to show increased forecasting accuracy. Superforecasters appear to have varying degrees of success: Bloomberg noted a poor prediction of 23% for a Leave vote in the month of the June 2016 Brexit referendum, while the BBC notes that collectively superforecasters may have accurately predicted Donald Trump's success in the 2016 primaries. Aid agencies are using superforecasting to estimate the probability of droughts becoming famines.
One of Tetlock's findings from The Good Judgment Project was that superforecasters' personality traits, rather than specialised knowledge, allowed them to predict the outcomes of various world events typically more accurately than intelligence agencies.
- Elaine Rich, a "certified" superforecaster who participated in The Good Judgment Project.
- Andrew Sabisky, a self-proclaimed superforecaster, resigned from his position as an advisor to the United Kingdom government at Downing Street, with chief advisor Dominic Cummings telling journalists to "read Philip Tetlock's Superforecasters, instead of political pundits who don't know what they're talking about".
- Adonis, Andrew (20 February 2020). "In praise of (some) superforecasters". The New European. Archived from the original on 20 February 2020. Retrieved 20 February 2020.
- BBC News (18 February 2020). "Andrew Sabisky: What is superforecasting?". BBC News. Archived from the original on 18 February 2020. Retrieved 18 February 2020.
- Bobby W. (December 2019). "The Limits of Prediction—or, How I Learned to Stop Worrying About Black Swans and Love Analysis" (PDF). Studies in Intelligence. 63 (4). Archived (PDF) from the original on 19 February 2020.
- Burton, Tara Isabella (20 January 2015). "Could you be a 'super-forecaster'?". BBC. Archived from the original on 19 February 2020. Retrieved 19 February 2020.
- Harford, Tim (5 September 2014). "How to see into the future". Financial Times Magazine. FT Group. Archived from the original on 23 September 2019. Retrieved 18 February 2020.
- Nilaya, Josh (29 September 2015). "U.S. Intelligence Dabbles in Forecasting the Future". Connecticut Public Radio Home. Archived from the original on 11 January 2019. Retrieved 21 February 2020.