MediaWiki talk:Spam-blacklist


This is an old revision of this page, as edited by Hu12 (talk | contribs) at 18:59, 9 April 2008 (→‎Request unlisting of hubpages.com). The present address (URL) is a permanent link to this revision, which may differ significantly from the current revision.

    Mediawiki:Spam-blacklist is meant to be used by the spam blacklist extension. Unlike the meta spam blacklist, this blacklist affects pages on the English Wikipedia only. Any administrator may edit the spam blacklist. See Wikipedia:Spam blacklist for more information about the spam blacklist.


    Instructions for editors

    There are four sections for posting comments below. Please make comments in the appropriate section; these links take you there:

    1. Proposed additions
    2. Proposed removals
    3. Troubleshooting and problems
    4. Discussion

    Each section has a message box with instructions. In addition, please sign your posts with ~~~~ after your comment.

    Completed requests are archived. Additions and removals are logged; the reasons for blacklisting can be found there.

    Adding the templates {{Link summary}} (for domains), {{IP summary}} (for IP editors) and {{User summary}} (for users with accounts) causes the COIBot reports to be refreshed. See User:COIBot for more information on the reports.


    Instructions for admins

    Any admin unfamiliar with this page should probably read this first, thanks.
    If in doubt, please leave a request and a spam-knowledgeable admin will follow up.

    Please consider using Special:BlockedExternalDomains instead, powered by the AbuseFilter extension. It is faster and more easily searchable, though it only supports whole domains and does not support whitelisting.

    1. Does the site have any validity to the project?
    2. Have links been placed after warnings/blocks? Have other methods of control been exhausted? Would referring this to our anti-spam bot, XLinkBot, be a more appropriate step? Is there a WikiProject Spam report? If so, a permanent link would be helpful.
    3. Please ensure all links have been removed from articles and discussion pages before blacklisting. (They do not have to be removed from user or user talk pages).
    4. Make the entry at the bottom of the list (before the last line). Please do not do this unless you are familiar with regex — the disruption that can be caused is substantial.
    5. Close the request entry here using either {{done}} or {{not done}} as appropriate. The request should generally be left open for about a week, as there will often be further related sites or an appeal in that time.
    6. Log the entry. Warning: if you do not log an entry you make on the blacklist, it may well be removed if someone appeals and no valid reasons can be found. To log the entry after you have closed the request, you will need this number: 204512887. See here for more info on logging.
    snippet for logging: {{/request|204512887#section_name}}
    snippet for logging of WikiProject Spam items: {{WPSPAM|204512887#section_name}}
    A user-gadget for handling additions to and removals from the spam-blacklist is available at User:Beetstra/Gadget-Spam-blacklist-Handler
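As a rough illustration of step 4: each non-comment line of the blacklist is, in effect, a case-insensitive regular expression matched against external-link URLs (the real extension combines all entries into one large regex, so this is only a sketch; the domains below are hypothetical, not actual blacklist entries):

```python
import re

# Hypothetical blacklist entries, one regex per line, in the style of
# MediaWiki:Spam-blacklist (text after "#" is treated as a comment).
blacklist_lines = r"""
\bexample-spam\.com\b
\bbad-pills\d*\.net\b  # matches bad-pills.net, bad-pills24.net, ...
"""

# Compile each non-empty, non-comment entry as its own pattern.
patterns = []
for line in blacklist_lines.splitlines():
    entry = line.split("#", 1)[0].strip()
    if entry:
        patterns.append(re.compile(entry, re.IGNORECASE))

def is_blacklisted(url: str) -> bool:
    """Return True if any blacklist regex matches somewhere in the URL."""
    return any(p.search(url) for p in patterns)

print(is_blacklisted("http://www.example-spam.com/page"))  # True
print(is_blacklisted("http://en.wikipedia.org/"))          # False
```

This also shows why step 4's regex warning matters: an entry like `pills.net` without the escaped dot would match `pillsxnet` too, and a malformed entry can break matching for the whole list.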

    Proposed additions

    freerepublic.com

    This is a result of a discussion here[1] about the usage of FreeRepublic.com as a reprinting service for a primary source. I was curious to see what other articles linked to FreeRepublic and found a small handful on en and on other languages. Looking into the specific links in article space, I'm finding that FreeRepublic is often being used in lieu of linking to the actual source [2][3], where it exists in a web archive [4], or just to link to it in the external links section[5]. I'm sure the articles were linked as references in good faith, but given that FreeRepublic is an unreliable source, should it be blacklisted and then whitelisted onto articles related to the site, added to one of the spambots, or periodically cleaned up by hand? --Bobblehead (rants) 23:51, 24 March 2008 (UTC)[reply]

    FreeRepublic.com is an unreliable source (self-published source) that includes a portion of a site that reprints articles from reliable sources (copyright violations). Most of the reprints actually include links to the reliable source's article, so the only reason they are being included is for traffic. FreeRepublic is itself a notable website, so freerepublic.com itself should be whitelisted, but at a minimum the area for the reprints (http://www.freerepublic.com/focus/f-news) should be blacklisted. --Bobblehead (rants) 02:31, 31 March 2008 (UTC)[reply]

    Just checking on the status of getting this site blacklisted. Please see the following discussion for more details.[6] Thanks! --Bobblehead (rants) 16:48, 2 April 2008 (UTC)[reply]

    mother-surrogate.net & mother-surrogate.com

    The same user (193.33.49.9) keeps inserting these Ukrainian commercial sites (advertising). The IP address for both URLs is 82.144.223.6. The affected page is Surrogacy, but they also added them to Commercial surrogacy, which has since been merged into Surrogacy. I have not included the diffs; the user IP above links to the contribs page. As can be seen, the user has edited only these two articles. The user's IP is (not surprisingly) also registered in Ukraine. It should also be considered whether this IP should be blocked. TINYMARK 06:17, 31 March 2008 (UTC)[reply]

    Accounts
    Added links to help review. --- Barek (talkcontribs) - 19:40, 4 April 2008 (UTC)[reply]
    Thanks, TinyMark and Barek. Nothing piques my interest like a spammer trying to delete spam records -- it's always an encouragement to dig a little deeper. So after additional digging, there appear to be more domains. Also I found this stuff was spammed cross-wiki, so it should be blacklisted at meta:Spam blacklist. I'll work on that later today. --A. B. (talkcontribs) 20:57, 6 April 2008 (UTC)[reply]
    Related domains:


    Possibly related domains:


    Accounts on other wikis:
    --A. B. (talkcontribs) 21:57, 6 April 2008 (UTC)[reply]


    See:
     Defer to Global blacklist --A. B. (talkcontribs) 22:16, 6 April 2008 (UTC)[reply]


    provacylonline.com

    Added by multiple IPs, all from Belarus, to multiple articles. A few examples: [7] [8] [9] -- Ed (Edgar181) 12:03, 1 April 2008 (UTC)[reply]

    Spamming of this site continues with new IPs. -- Ed (Edgar181) 18:07, 6 April 2008 (UTC)[reply]


    Proposed removals

    Request unlisting of aceshowbiz.com/celebrity/meagan_good

    It gives critical information on her heritage and is of great help, since finding information on the heritage of multiracial actors is quite difficult; e.g. Jada Pinkett Smith is part Cherokee, but this is hard to find since it isn't posted on the internet. Thank you. Mcelite (talk) 16:48, 9 April 2008 (UTC)mcelite[reply]

    Request unlisting of hubpages.com

    • Notability and Importance

    The domain has established notability (http://en.wikipedia.org/wiki/Talk:HubPages). We should be able to link to the domain's main page "www.hubpages.com" on the article concerning it. It feels pretty straightforward that we want to have an external link to the subject at the bottom of the article.

    • Rectified potential problem behavior

    While people used Wikipedia to promote their own hubs in the past, this behavior has been reduced since HubPages hired additional staff to remove overly promotional hubs. Furthermore, the chronic problem of site members incorrectly adding their links to Wikipedia articles can be prevented on a case-by-case basis. After all, some hubs constitute relevant, verifiable research. mroconnell (talk) 17:54, 9 April 2008 (UTC)[reply]

    HubPages links
    • Have no editorial oversight (see WP:RS) and articles are essentially self-published
    • Offers its authors financial incentives to increase page views
    • Fails Wikipedia's core content policies:
    While it may be encouraging that HubPages is attempting to control content, it is not a reason to delist at this time,  Not done. I have, however, whitelisted the root (main) page (www.hubpages.com/index.php) for use in the article HubPages. If a specific link is needed as a citation, an established editor can request it on the whitelist on a case-by-case basis, where the URL can be demonstrated to be a "Verifiable Reliable Source" (in an appropriate context) when there are no reasonable alternatives available. Main page whitelisted  Done. Thanks--Hu12 (talk) 18:55, 9 April 2008 (UTC)[reply]

    Troubleshooting and problems

    Discussion

    Blacklist logging

    {{WPSPAM|0#section_name}} → (replace '0' with the correct "oldid", i.e. the permalink; example shown here).

    For example:

    {{WPSPAM|182728001#Blacklist_logging}}

    results in:

    See WikiProject Spam report

    This should aid with requests originating from Wikipedia_talk:WikiProject_Spam and for use with the entry log here. I've added a snippet to the header --Hu12 (talk)


    Addition to the COIBot reports

    The lower list in the COIBot reports now has, after each link, four numbers in brackets (e.g. "www.example.com (0, 0, 0, 0)"):

    1. first number: how many links this user has added in total (the same after each link)
    2. second number: how many times this link has been added to Wikipedia (as far back as the linkwatcher database goes)
    3. third number: how many times this user added this link
    4. fourth number: how many different Wikipedias this user added this link to.

    If the third or fourth number is high relative to the first or second, the user has at least a preference for using that link. Be careful drawing other statistics from these numbers (e.g. good users also add a lot of links). If there are more statistics that would be useful, please notify me, and I will have a look at whether I can get the info out of the database and report it. The bots are running on a new database; Eagle 101 is working on transferring the old data into this database so it becomes more reliable.

    For those with access to IRC, this data is available in real time. --Dirk Beetstra T C 10:41, 26 March 2008 (UTC)[reply]
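To make the four numbers concrete, here is a minimal sketch, using hypothetical data and names (COIBot's actual linkwatcher database schema is not shown on this page), of how they could be computed from a log of link additions:

```python
# Hypothetical linkwatcher records: (user, link, wiki) per link addition.
additions = [
    ("SpamUser", "www.example.com", "en"),
    ("SpamUser", "www.example.com", "de"),
    ("SpamUser", "www.example.com", "fr"),
    ("SpamUser", "www.other.org",   "en"),
    ("GoodUser", "www.example.com", "en"),
]

def report_numbers(user, link):
    """Return the four bracketed numbers shown after a link in a
    COIBot report, for the given (user, link) pair."""
    total_by_user = sum(1 for u, l, w in additions if u == user)
    total_link    = sum(1 for u, l, w in additions if l == link)
    user_link     = sum(1 for u, l, w in additions
                        if u == user and l == link)
    wikis         = len({w for u, l, w in additions
                         if u == user and l == link})
    return (total_by_user, total_link, user_link, wikis)

print(report_numbers("SpamUser", "www.example.com"))  # (4, 4, 3, 3)
```

Here the third and fourth numbers (3 link additions across 3 wikis) are high relative to the user's 4 total additions, which is the cross-wiki, single-link pattern the paragraph above describes as suspicious.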

    archive script

    Eagle 101 said he had one running on meta, is it possible to get it up and going here?--Hu12 10:27, 15 November 2007 (UTC)[reply]

    Would be good - Eagle hasn't been working on Meta for a while though & I've not seen anything (there was supposed to be a logging script too!) --Herby talk thyme 12:10, 15 November 2007 (UTC)[reply]
    • Great news: I've written a script that can archive this page given the templates that we use. I can create an approved archive along with a rejected archive if people are interested. βcommand 06:51, 4 January 2008 (UTC)[reply]
    "Interested" - bit of an understatement there :) Great news - please feel free to help/supply the script. I tend to leave stuff around a week in case anyone shouts or adds more (archives once done should be left alone). How would you handle the "discussion" type bits? Cheers --Herby talk thyme 09:40, 4 January 2008 (UTC)[reply]
    First question: do you want approved and rejected requests in separate archives? As for the discussions, we could get Misza bot over here for things older than 30 days. βcommand 17:13, 4 January 2008 (UTC)[reply]
    I would think one archive, separate sections, like it is currently[10]; not sure if the script can do that, but if so, I doubt there would be objections to implementing it...--Hu12 (talk) 00:24, 10 January 2008 (UTC)[reply]
    There is no simple way of editing sections using the bot (section editing is evil). It would just be one large archive. βcommand 00:59, 10 January 2008 (UTC)[reply]