From Wikipedia, the free encyclopedia
This is the talk page for discussing improvements to the Winnow (algorithm) article.
This is not a forum for general discussion of the article's subject.
WikiProject Computing (Rated Start-class)
WikiProject Robotics (Rated Start-class, Low-importance)
Removed the "insufficient context" tag
Is that update procedure correct?
The text specifies the update procedure as:
- If an example is correctly classified, do nothing.
- If an example is predicted to be 1 but the correct result was 0, all of the weights involved in the mistake are set to zero (demotion step).
- If an example is predicted to be 0 but the correct result was 1, all of the weights involved in the mistake are multiplied by the promotion factor α > 1 (promotion step).
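For concreteness, the three steps above can be sketched as a single update function. This is only an illustrative sketch: the threshold θ = n/2 and promotion factor α = 2 are conventional choices for Winnow1, not values taken from the article, and Boolean (0/1) features are assumed.

```python
def winnow1_update(weights, x, y, alpha=2.0, theta=None):
    """One Winnow1 step on a Boolean example x with label y (0 or 1).

    Illustrative sketch: theta defaults to n/2 and alpha to 2,
    which are conventional choices, not fixed by the description above.
    """
    n = len(weights)
    if theta is None:
        theta = n / 2
    prediction = 1 if sum(w * xi for w, xi in zip(weights, x)) >= theta else 0
    if prediction == y:
        return list(weights)  # correctly classified: do nothing
    if prediction == 1 and y == 0:
        # demotion: zero out every weight whose feature was active
        return [0.0 if xi else w for w, xi in zip(weights, x)]
    # promotion: multiply every active weight by alpha
    return [w * alpha if xi else w for w, xi in zip(weights, x)]


w = [1.0, 1.0, 1.0, 1.0]
w = winnow1_update(w, [1, 1, 0, 0], 0)  # false positive -> demotion
# w is now [0.0, 0.0, 1.0, 1.0]
```

Note that the promotion step multiplies, so a weight that has been set to zero by demotion stays zero forever, which is exactly the behavior questioned below.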
Step 2 sets weights to zero, and neither of the other steps changes a weight that is already zero. Because of that, I fear the weight vector will too easily collapse to all zeros.
- That is indeed the original (Winnow1) algorithm. Roughly, any feature that is implicated in a false positive should thereafter be ignored. I think the weights cannot all go to zero if the data is linearly separable: there will always be some instance that is on the other side of the current hyperplane. And if the data is not linearly separable, all bets are off, just as with most linear classifiers. —johndburger 18:09, 6 October 2011 (UTC)