Initial release: November 2, 2008
Stable release: Stockfish 9 / January 31, 2018
Stockfish is a free and open-source UCI chess engine, available for various desktop and mobile platforms. It is developed by Marco Costalba, Joona Kiiski, Gary Linscott and Tord Romstad, with many contributions from a community of open-source developers.
Stockfish is consistently ranked first or near the top of most chess-engine rating lists and is the strongest open-source chess engine in the world. It won the unofficial world computer chess championships in season 6 (2014), season 9 (2016), season 11 (2018) and season 12 (2018). It finished runner-up in season 5 (2013), season 7 (2014) and season 8 (2015). Stockfish is derived from Glaurung, an open-source engine by Romstad.
Stockfish can use up to 512 CPU threads on multiprocessor systems. The maximum size of its transposition table is 1 TB. Stockfish implements an advanced alpha–beta search and uses bitboards. Compared to other engines, it is characterized by its great search depth, due in part to more aggressive pruning and late move reductions.
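The interaction of alpha–beta pruning and late move reductions can be sketched as follows. This is an illustrative toy, not Stockfish's implementation: the real engine adds a transposition table, null-move pruning, staged move ordering and many other heuristics, and the tree representation, reduction threshold and depths here are hypothetical.

```python
# Minimal negamax search with alpha-beta pruning and a toy late-move
# reduction (LMR): moves late in the ordering are first searched one
# ply shallower, and re-searched at full depth only if they surprise.

INF = float("inf")

def search(node, depth, alpha, beta, evaluate, children):
    """Negamax with alpha-beta cutoffs and a simple LMR scheme."""
    moves = children(node)
    if depth == 0 or not moves:
        return evaluate(node)
    best = -INF
    for i, child in enumerate(moves):
        # Late move reduction: trust the move ordering and search moves
        # beyond the first two at reduced depth (hypothetical threshold).
        reduced = depth - 1 - (1 if i >= 2 and depth >= 3 else 0)
        score = -search(child, reduced, -beta, -alpha, evaluate, children)
        if reduced < depth - 1 and score > alpha:
            # The reduced search beat alpha: verify at full depth.
            score = -search(child, depth - 1, -beta, -alpha, evaluate, children)
        best = max(best, score)
        alpha = max(alpha, score)
        if alpha >= beta:  # beta cutoff: the opponent avoids this line
            break
    return best
```

The pruning is what makes the deep searches affordable: once a move refutes the line (`alpha >= beta`), the remaining siblings are never examined, and reductions shrink the effective depth of moves the ordering considers unpromising.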
Stockfish supports Chess960, one of the features inherited from Glaurung. Syzygy tablebase support, previously available in a fork maintained by Ronald de Man, was integrated into Stockfish in 2014.
The program originated from Glaurung, an open-source chess engine created by Romstad and first released in 2004. Four years later, Costalba, inspired by the strong open-source engine, decided to fork the project. He named it Stockfish because it was "produced in Norway and cooked in Italy" (Costalba is Italian and Romstad is Norwegian). The first version, Stockfish 1.0, was released in November 2008. For a while, new ideas and code changes flowed between the two programs in both directions, until Romstad decided to discontinue Glaurung in favor of Stockfish, by then the more advanced engine. The last Glaurung release (version 2.2) came in December 2008.
Around 2011, Romstad stepped away from Stockfish development, preferring to spend his time on his new iOS chess app.
On 18 June 2014, Marco Costalba announced that he had "decided to step down as Stockfish maintainer" and asked the community to fork the current version and continue its development. An official repository, managed by a volunteer group of core Stockfish developers, was created soon after and now hosts the project's development.
Changes to game-playing code are accepted or rejected based on the results of tens of thousands of games played on Fishtest, the project's distributed testing framework, against an older "reference" version of the program, using a sequential probability ratio test. Results are verified with a chi-squared test, and only if they are statistically significant are they deemed reliable and used to revise the code.
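The core of a sequential probability ratio test can be sketched as follows. This is a deliberately simplified toy, not Fishtest's actual implementation (which models draws and works in Elo terms): it tests the hypothesis that the new version wins with probability p1 against the null hypothesis p0, over a stream of win/loss results, and the specific p0, p1, alpha and beta values are hypothetical.

```python
# Toy SPRT over win/loss results, in the spirit of Fishtest's
# acceptance testing. Draws and Elo modelling are omitted.
import math

def sprt(results, p0=0.50, p1=0.55, alpha=0.05, beta=0.05):
    """results: iterable of 1 (new version wins) or 0 (it loses).
    Returns "H1" (accept change), "H0" (reject), or "continue"."""
    lower = math.log(beta / (1 - alpha))   # accept H0 at or below this
    upper = math.log((1 - beta) / alpha)   # accept H1 at or above this
    llr = 0.0
    for won in results:
        # Log-likelihood ratio contribution of one game.
        llr += math.log((p1 if won else 1 - p1) /
                        (p0 if won else 1 - p0))
        if llr >= upper:
            return "H1"
        if llr <= lower:
            return "H0"
    return "continue"
```

The sequential aspect is the point: the test stops as soon as the accumulated evidence crosses either bound, so clearly good or clearly bad patches are decided after far fewer games than a fixed-length match would need.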
As of June 2018, the framework has used a total of more than 1200 years of CPU time to play more than 840 million chess games. In the twelve months after Fishtest's introduction, Stockfish gained 120 Elo points, propelling it to the top of all major rating lists. In Stockfish 7, Fishtest author Gary Linscott was added to the official list of authors in acknowledgement of his contribution to Stockfish's strength.
Participation in TCEC
In 2013, Stockfish finished runner-up in both TCEC Seasons 4 and 5, with Superfinal scores of 23–25, first against Houdini 3 and then against Komodo 1142. Season 5 was notable in that the winning Komodo team accepted the award posthumously on behalf of the program's creator, Don Dailey, who died of an illness during the final stage of the event. In his honor, the version of Stockfish released shortly after that season was named "Stockfish DD".
On 30 May 2014, Stockfish 170514 (a development version of Stockfish 5 with tablebase support) convincingly won TCEC Season 6, scoring 35.5–28.5 against Komodo 7x in the Superfinal. Stockfish 5 was released the following day. In TCEC Season 7, Stockfish again reached the Superfinal, but lost to Komodo 30.5–33.5. In TCEC Season 8, despite losses on time caused by buggy code, Stockfish nevertheless qualified once more for the Superfinal, but lost the ensuing 100-game match 46.5–53.5 to Komodo.
Stockfish 8 won TCEC Season 9 in 2016, defeating Houdini 5 in the Superfinal 54.5–45.5. Stockfish finished third in Season 10 and won Seasons 11 and 12 convincingly.
Stockfish versus Nakamura
Stockfish's strength relative to the best human chess players was most apparent in a handicap match with grandmaster Hikaru Nakamura (rated 2798) in August 2014. In the first two games of the match, Nakamura had the assistance of an older version of Rybka, and in the next two games, he received White with pawn odds but no assistance. Nakamura was the world's fifth-best human chess player at the time of the match, while Stockfish was denied the use of its opening book as well as its endgame tablebases. Stockfish won each half of the match 1.5–0.5. Both of Stockfish's wins arose from positions in which Nakamura, as is typical for his playing style, pressed for a win instead of acquiescing to a draw.
An artificial-intelligence approach designed by Jean-Marc Alliot of the Institut de recherche en informatique de Toulouse (Toulouse Computer Science Research Institute), which compares grandmasters' moves against those of Stockfish, rated Magnus Carlsen as the best player of all time, as he had the highest probability among all World Chess Champions of playing the moves Stockfish suggested.
Computer chess tournament
In November 2017, chess.com held an open tournament of the ten strongest chess engines, leading to a "Super final" between the two finalists, Stockfish and Houdini. In the 20-game Super final, Stockfish beat Houdini 10.5–9.5. Five games were decisive and 15 were drawn. Of the decisive games, Stockfish won three (one as Black) and Houdini won two (both as Black). The average game length was 199.5 ply (about 100 moves). The tournament used a variety of time controls, and the engines were allocated equal computing resources: each ran on its own dedicated AWS virtualized instance with a hyper-threaded Intel Xeon at 2.90 GHz (two processors with 18 cores each) and 60 GB of RAM on a Windows-based server.
Stockfish versus AlphaZero
In December 2017, Stockfish 8 was used as a benchmark to evaluate AlphaZero, developed by Google's DeepMind division, with each engine running on different hardware. AlphaZero was trained through self-play for a total of nine hours and reached Stockfish's level after just four. Stockfish was allocated 64 threads and a 1 GB hash table; AlphaZero ran on four application-specific TPUs. Each program was given one minute of thinking time per move.
In 100 games from the normal starting position, AlphaZero won 25 games as White and 3 as Black, and drew the remaining 72, losing none. AlphaZero also played twelve 100-game matches against Stockfish starting from twelve popular openings, for a combined score of 290 wins, 886 draws and 24 losses, or 733:467 in points.[note 1] The research had not been peer reviewed, and Google declined to comment until it was published.
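The 733:467 point score follows from the win/draw/loss totals under standard chess scoring, one point per win and half a point per draw:

```python
# Reconstructing the point totals of the twelve 100-game matches
# (1 point per win, 0.5 per draw, 0 per loss).
alphazero_wins, draws, alphazero_losses = 290, 886, 24

alphazero_points = alphazero_wins + 0.5 * draws    # AlphaZero's total
stockfish_points = alphazero_losses + 0.5 * draws  # Stockfish's total
```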
In response to this, Stockfish developer Tord Romstad commented, "The match results by themselves are not particularly meaningful because of the rather strange choice of time controls and Stockfish parameter settings: The games were played at a fixed time of 1 minute/move, which means that Stockfish has no use of its time management heuristics (lot of effort has been put into making Stockfish identify critical points in the game and decide when to spend some extra time on a move; at a fixed time per move, the strength will suffer significantly). The version of Stockfish used is one year old, was playing with far more search threads than has ever received any significant amount of testing, and had way too small hash tables for the number of threads. I believe the percentage of draws would have been much higher in a match with more normal conditions."
Grandmaster Hikaru Nakamura also showed skepticism of the significance of the outcome, stating "I don't necessarily put a lot of credibility in the results simply because my understanding is that AlphaZero is basically using the Google super computer and Stockfish doesn't run on that hardware; Stockfish was basically running on what would be my laptop. If you wanna have a match that's comparable you have to have Stockfish running on a super computer as well."
Stockfish has been a very popular engine on various platforms. On the desktop, it is the default chess engine bundled with the Internet Chess Club interface programs BlitzIn and Dasher. On mobile, it has been bundled with the Stockfish app, SmallFish and DroidFish. Other Stockfish-compatible graphical user interfaces (GUIs) include Fritz, Arena, Stockfish for Mac, and PyChess. As of March 2017, Stockfish is the analysis engine used by Lichess, a popular online chess site.
- The academic paper on this sequence of games does not provide the computer resources allocated to each engine.