
User talk:Lemmey

From Wikipedia, the free encyclopedia

This is an old revision of this page, as edited by Lemmey (talk | contribs) at 21:26, 4 June 2008 (RCP). The present address (URL) is a permanent link to this revision, which may differ significantly from the current revision.

Open Source

LemmeyBOT has gone open source

  • LemmeyBOT.py Runs RefHistoryFix on each article in the broken-references category
    • RefHistoryFix.py Attempts to fix broken references on the specified article by looking at the article's history
  • LemmeyBOT2.py Runs RefHistoryFix2 on each article in the broken-references category
    • RefHistoryFix2.py Attempts to fix broken references on the specified article by looking at current versions of linked articles
  • RefHistoryFix3.py Attempts to fix broken references on the specified article by looking at a specific article or text file
  • Whoipedia.py A magical replacement of the Wikipedia class that doesn't require an editor to log in
  • basic.py An example bot that uses the whoipedia class. Finally, a bot that anyone can edit (with)...
  • TheDaily.py A method for posting current events to the proper pages.
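As a rough illustration of the RefHistoryFix idea — recovering a named ref's definition from an older revision of the same article — a minimal sketch follows. The function names and regexes here are my own illustration of the technique, not LemmeyBOT's actual code, and real ref syntax has more variants than these patterns cover:

```python
import re

def find_broken_refs(wikitext):
    """Return names used as bare <ref name=.../> that have no full
    <ref name=...>...</ref> definition anywhere in the text."""
    used = set(re.findall(r'<ref name="?([^">/]+)"?\s*/>', wikitext))
    defined = set(re.findall(r'<ref name="?([^">/]+)"?\s*>', wikitext))
    return used - defined

def fix_from_history(wikitext, old_revisions):
    """For each broken ref, scan older revisions (newest first) for a
    full definition and splice it into the first bare use."""
    for name in find_broken_refs(wikitext):
        full = re.compile(r'<ref name="?%s"?\s*>.*?</ref>' % re.escape(name),
                          re.DOTALL)
        for old in old_revisions:
            m = full.search(old)
            if m:
                bare = re.compile(r'<ref name="?%s"?\s*/>' % re.escape(name))
                ## Use a callable replacement so the cite text is inserted
                ## verbatim, without backreference interpretation.
                wikitext = bare.sub(lambda _: m.group(0), wikitext, count=1)
                break
    return wikitext
```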

Your comment

This comment is totally inappropriate. Please tone it down a notch. JACOPLANE • 2008-05-25 11:29

Final Warning

You have broken the three revert rule but I will not block you for it given a conflict of interest. However, I would suggest that you read up more thoroughly on our rules before continuing your current editing practices. If you keep this up, I will bring your unconstructive behaviour to the attention of others and you could well be blocked. The links were appropriate to the story. Capitalistroadster (talk) 21:28, 30 May 2008 (UTC)[reply]


Again, don't whine when you're wrong, and don't threaten. You need to read the Current Events guidelines on context --Lemmey talk 21:30, 30 May 2008 (UTC)[reply]
Camper, just waiting until I made an unrelated edit to launch a final warning --Lemmey talk 21:33, 30 May 2008 (UTC)[reply]

The links you added to the current events portal are non-contextual. Please review Wikipedia:How_the_Current_events_page_works#The_context_is_not_the_geographical_location and the following paragraph on the subject. The articles crane and the locations in New York are not specific to the event. --Lemmey talk 21:47, 30 May 2008 (UTC)[reply]

Whatever. VolatileChemical (talk) 21:52, 30 May 2008 (UTC)[reply]

You have been blocked

I have blocked you for 1 week for multiple policy violations. The full description is here. ^demon[omg plz] 21:13, 31 May 2008 (UTC)[reply]

LemmeyBOT Task 2c: Bastards

My observation has been that bastard refs are often created by an editor cutting a text snippet from one article and pasting it into another. The source article where the parent ref can be found can often be identified from the context surrounding the bastard, found by a text search for the ref name and/or other pasted-in text, or identified from clues left in the edit summary. -- Boracay Bill (talk) 23:18, 1 June 2008 (UTC)[reply]

Yes, I saw that. I modified the bot to copy identically named refs from the current version of another article, and also to check each of the pages linked in the article. The bot had great success with articles such as The Wire Season 3 and Criticisms of Tony Blair, where it matched references in The Wire and Tony Blair. I had to run this manually due to the possibility of false positives (mostly on music album sources), but it was very quickly done. I estimate I fixed approximately 100 references in 75 articles this way. However, this method has run its course, and there are still many articles remaining (any of the small towns in North Dakota) where it's obvious there is no source, never has been a source, and the bot cannot find a source on any of the linked pages. At some point any editor would call bullshit and add a fact-needed tag; why should the bot be held to a higher standard when it has already gone so far beyond what the average editor would do? --Lemmey talk 01:55, 2 June 2008 (UTC)[reply]
Examples of copying from another page

[1], [2], [3], [4], [5]

Obviously it's very easy to approve the change when it's a very distinctive reference name (reuters010908) or the articles have similar names. --Lemmey talk 02:01, 2 June 2008 (UTC)[reply]
You check every linked article for a matching named ref? That seems like a reasonable solution to me. Gimmetrow 02:52, 2 June 2008 (UTC)[reply]
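The linked-article approach described in this thread — copying an identically named ref from the current text of a wiki-linked page — could be sketched roughly as below. This assumes a `linked_pages` dict of title → wikitext fetched elsewhere; the names and regexes are illustrative, not LemmeyBOT's actual code:

```python
import re

def wiki_links(wikitext):
    """Extract the target titles of [[wiki-linked]] articles."""
    return [t.split('|')[0] for t in re.findall(r'\[\[([^\]]+)\]\]', wikitext)]

def copy_refs_from_links(wikitext, linked_pages):
    """For each bare <ref name=.../> with no local definition, look for a
    full <ref name=...>...</ref> with the exact same name in the current
    text of each linked article, splicing in the first match found.
    linked_pages maps article title -> that article's current wikitext."""
    used = set(re.findall(r'<ref name="?([^">/]+)"?\s*/>', wikitext))
    defined = set(re.findall(r'<ref name="?([^">/]+)"?\s*>', wikitext))
    for name in sorted(used - defined):
        full = re.compile(r'<ref name="?%s"?\s*>.*?</ref>' % re.escape(name),
                          re.DOTALL)
        for page_text in linked_pages.values():
            m = full.search(page_text)
            if m:
                bare = re.compile(r'<ref name="?%s"?\s*/>' % re.escape(name))
                ## Exact name match only; as noted above, a human still has
                ## to approve the result, since ref names can collide.
                wikitext = bare.sub(lambda _: m.group(0), wikitext, count=1)
                break
    return wikitext
```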

RCP

from BeautifulSoup import BeautifulSoup
import urllib
import re

## Scrape the soft (solid/leaning) electoral-vote numbers from the
## RealClearPolitics Obama vs. McCain map legend.
url = "http://www.realclearpolitics.com/epolls/maps/obama_vs_mccain/"
tag = "map-legend2"
f = urllib.urlopen(url)
html = f.read()
f.close()
soup = BeautifulSoup(html)
legend = soup.find(id=tag)

## Strip the images so only the text nodes remain.
for image in legend.findAll('img'):
	image.extract()

## Each candidate block is a div whose class starts with "candidate".
candidates = legend.findAll("div", {"class": re.compile('^candidate')})

for candidate in candidates:
	if candidate.find("p", {"class": "candidate-name"}):
		## Named candidate: the first <p> holds "Name NNN",
		## the second holds the solid/leaning breakdown.
		firstPTag, secondPTag = candidate.findAll('p')
		nametot = firstPTag.string
		## The second attribute is an inline style like "color:#rrggbb".
		style = str(firstPTag.attrs[1][1])[len("color:"):]
		name = nametot[:-3]              ## drop the 3-digit total
		total = nametot[len(name):]

		print name, total, style

		solid, leaning = secondPTag.findAll('strong')
		print "Solid:", solid.string
		print "Leaning:", leaning.string
	else:
		## Toss-up block: a single <p> with no candidate-name class.
		tossupPTag = candidate.find('p')
		nametot = tossupPTag.string
		name = nametot[:-3]
		total = nametot[len(name):]
		style = str(tossupPTag.attrs[1][1])[len("color:"):]

		print name, total, style