English: Overfitting (for example, in neural network training). The red curve is the error on the validation set over successive epochs; the blue curve is the error on the training set. When the validation error increases while the training error steadily decreases, overfitting is likely occurring (the model has become too specialized to the training data and no longer generalizes well).
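The description above gives the rule the figure illustrates: suspect overfitting when the validation error rises while the training error keeps falling. The Python sketch below (not part of the original file page) shows one minimal way to apply that rule to recorded error curves; the detect_overfitting function, the PATIENCE threshold, and the synthetic error values are illustrative assumptions, not any standard library API.

```python
# Minimal sketch of the overfitting check described in the figure caption:
# flag the epoch at which validation error has risen for several consecutive
# epochs while training error kept decreasing. All numbers are synthetic.

PATIENCE = 3  # assumed: consecutive "validation up, training down" epochs to tolerate


def detect_overfitting(train_errors, val_errors, patience=PATIENCE):
    """Return the 0-based epoch at which overfitting is suspected, or None."""
    rising = 0
    for epoch in range(1, len(val_errors)):
        val_up = val_errors[epoch] > val_errors[epoch - 1]
        train_down = train_errors[epoch] < train_errors[epoch - 1]
        if val_up and train_down:
            rising += 1
            if rising >= patience:
                return epoch
        else:
            rising = 0
    return None


if __name__ == "__main__":
    # Synthetic curves: training error keeps falling (blue curve),
    # validation error turns upward after epoch 4 (red curve).
    train = [0.90, 0.70, 0.55, 0.45, 0.38, 0.33, 0.29, 0.26, 0.24, 0.22]
    val   = [0.92, 0.75, 0.62, 0.55, 0.52, 0.53, 0.55, 0.58, 0.62, 0.67]
    print("Overfitting suspected at epoch:", detect_overfitting(train, val))
```

In practice this kind of check is the basis of early stopping: training is halted (or the best earlier weights are restored) once the validation error has risen for a chosen number of epochs.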
Permission is granted to copy, distribute and/or modify this document under the terms of the GNU Free Documentation License, Version 1.2 or any later version published by the Free Software Foundation; with no Invariant Sections, no Front-Cover Texts, and no Back-Cover Texts. A copy of the license is included in the section entitled GNU Free Documentation License. (GFDL: http://www.gnu.org/copyleft/fdl.html)
to share – to copy, distribute and transmit the work
to remix – to adapt the work
Under the following conditions:
attribution – You must give appropriate credit, provide a link to the license, and indicate if changes were made. You may do so in any reasonable manner, but not in any way that suggests the licensor endorses you or your use.
share alike – If you remix, transform, or build upon the material, you must distribute your contributions under the same or compatible license as the original.
This licensing tag was added to this file as part of the GFDL licensing update. (CC BY-SA 3.0: http://creativecommons.org/licenses/by-sa/3.0/)