This bot patrols newly created pages in the main space and checks their contents against a web search. Pages found to contain a significant portion of text copied from another web page are tagged (and categorized) for human attention according to some guidelines:
^It will, in fact, queue every new page at creation, but it may defer reading it and (possibly) editing it for some time if the current Wikipedia load is too high. Regardless of the current load, there is a hard limit of one article processed every five seconds.
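The deferral and throttling behavior in the note above can be sketched as a simple rate limiter. This is an illustrative sketch only, not the bot's actual code; the `RateLimiter` class, its method names, and the injectable `clock` parameter are all assumptions made for this example.

```python
import time

MIN_INTERVAL = 5.0  # hard limit: at most one article every five seconds


class RateLimiter:
    """Illustrative sketch: enforce a minimum interval between articles.

    A pluggable ``clock`` (defaulting to time.monotonic) makes the
    limiter easy to test without real waiting.
    """

    def __init__(self, interval=MIN_INTERVAL, clock=time.monotonic):
        self.interval = interval
        self.clock = clock
        self.last = None  # time the previous article was processed

    def wait_time(self):
        """Seconds to wait before the next article may be processed."""
        if self.last is None:
            return 0.0  # nothing processed yet, no need to wait
        return max(0.0, self.interval - (self.clock() - self.last))

    def mark(self):
        """Record that an article was just processed."""
        self.last = self.clock()
```

In use, the bot loop would call `wait_time()` before each article, sleep for that long (plus any load-based deferral), then call `mark()` after processing.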
^Copy-and-pastes of Wikipedia pages are sometimes created as subtle vandalism.