Talk:Winnow (algorithm)

Removed the "insufficient context" tag

The tag was placed on 2007-10-29T16:33:23 by user Kallerdis, but the article is much improved since then. I think it does provide enough context. -Pgan002 (talk) 18:31, 22 October 2010 (UTC)

Is that update procedure correct?

The text specifies the update procedure as:

  • If an example is correctly classified, do nothing.
  • If an example is predicted to be 1 but the correct result was 0, all of the weights involved in the mistake are set to zero (demotion step).
  • If an example is predicted to be 0 but the correct result was 1, all of the weights involved in the mistake are multiplied by α (promotion step).

Step 2 sets weights to zero, and neither of the other steps changes a weight that is already zero (the promotion step only multiplies it by α). Because of that, I fear this will too easily drift towards an all-zero weight vector.

So, either explain why that will not happen, or adjust the description. —Preceding unsigned comment added by 145.36.235.3 (talk) 11:30, 19 April 2011 (UTC)
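
For concreteness, below is a minimal Python sketch of the procedure as quoted above. The promotion factor α = 2 and the threshold θ = n/2 are assumptions made for the example, not part of the quoted text:

  # Minimal sketch of the quoted update rule (Winnow1-style).
  # Assumed, not taken from the quoted text: binary 0/1 features,
  # promotion factor alpha = 2, threshold theta = n / 2.
  def winnow1_update(w, x, y, alpha=2.0, theta=None):
      """Apply one update for instance x (a 0/1 sequence) with true label y (0 or 1)."""
      if theta is None:
          theta = len(x) / 2.0
      # Predict 1 if the weighted sum of the active features reaches the threshold.
      pred = 1 if sum(wi * xi for wi, xi in zip(w, x)) >= theta else 0
      if pred == y:
          return list(w)            # correctly classified: do nothing
      if pred == 1 and y == 0:      # predicted 1, truth 0: demotion step
          return [0.0 if xi else wi for wi, xi in zip(w, x)]
      # predicted 0, truth 1: promotion step
      return [wi * alpha if xi else wi for wi, xi in zip(w, x)]

Under this rule a weight zeroed by a demotion can indeed never become positive again, since a promotion only multiplies it by α.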

That is indeed the original (Winnow1) algorithm. Roughly, any feature that is implicated in a false positive should thereafter be ignored. I think the weights cannot all go to zero if the data is linearly separable: there will always be some instance that is on the other side of the current hyperplane. And if the data is not linearly separable, all bets are off, just as with most linear classifiers. —johndburger 18:09, 6 October 2011 (UTC)
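
A quick check of that intuition, reusing the winnow1_update sketch above on a toy linearly separable target (the monotone disjunction x1 OR x2 over four Boolean features; the target and the number of passes are just illustrative assumptions):

  # Learn the disjunction x1 OR x2 over 4 Boolean features, a linearly
  # separable target, using the winnow1_update sketch from the thread above.
  from itertools import product

  def target(x):                  # assumed toy concept, for illustration only
      return 1 if (x[0] or x[1]) else 0

  w = [1.0] * 4                   # Winnow starts with all weights equal to 1
  for _ in range(5):              # a few passes over all 2**4 instances
      for x in product([0, 1], repeat=4):
          w = winnow1_update(w, x, target(x))

  print(w)                        # -> [2.0, 2.0, 0.0, 0.0], not all-zero

For a monotone disjunction, the relevant features are never active in a false-positive instance, so their weights can only be promoted, never set to zero.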