User talk:Topbanana

Archives: [ 1 - 2 - 3 - 4 - 5]

enfield paul (talk · contribs)[edit]

Please can you return the article Kingston Hospital Radio to my sandbox. I do not understand what you mean by redirects etc., and I find the Wikipedia guidelines confusing and complicated. I work for the charity involved, so I am not sure why I don't seem to be able to use our content. All I need is for someone to say do this, don't do this in relation to my article. If necessary I will move it to a special page. Thanks.

Re: Howdy[edit]

Hello, Topbanana. You have new messages at Staeiou's talk page.
You can remove this notice at any time by removing the {{Talkback}} or {{Tb}} template.

Link repair[edit]

Hi, regarding edits like this - how does that fall within WP:RLR? Neither were redlinks. --Redrose64 (talk) 16:31, 17 June 2014 (UTC)

I'm trying to train a new system to predict common misspellings of article titles; this would be of great use in suggesting potential matches for red links. I had hoped to use redirects marked with the {{R from misspellings}} template as training data, but there's not much consistency or logic behind the template's current use. So I'm trying to clean these up by hand to move things forward - so far I've checked 3,000 of them, with 2,000 or so to go ;) - TB (talk) 17:00, 17 June 2014 (UTC)
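A query along the lines below could pull those redirect/target pairs out of a database replica. This is only a sketch rather than the actual training pipeline: it assumes the 2014 enwiki_p replica schema and that the template is transcluded under the title 'R_from_misspellings', so pages tagged through one of the template's own redirects would be missed.
-- Sketch: list mainspace redirects tagged with {{R from misspellings}},
-- pairing each misspelt title with the title it points at
SELECT p.page_title AS misspelling,
       rd.rd_title  AS correct_title
FROM enwiki_p.page p
INNER JOIN enwiki_p.redirect rd      ON rd.rd_from = p.page_id
INNER JOIN enwiki_p.templatelinks tl ON tl.tl_from = p.page_id
WHERE p.page_namespace = 0
  AND p.page_is_redirect = 1
  AND rd.rd_namespace = 0
  AND tl.tl_namespace = 10
  AND tl.tl_title = 'R_from_misspellings';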

WikiProject Stub Sorting reports[edit]

It would be nice if you could update Wikipedia:WikiProject Stub sorting/Uncatted stubs and Wikipedia:WikiProject Stub sorting/Stubs catted via redirects - both of these were last updated in December. עוד מישהו Od Mishehu 04:39, 8 August 2014 (UTC)

And Wikipedia:WikiProject Stub sorting/missing stubs - not updated since 2012. עוד מישהו Od Mishehu 04:46, 8 August 2014 (UTC)
All done. Enjoy. - TB (talk) 20:48, 8 August 2014 (UTC)
Thanks. One more thing: next time you do it, please make sure that stub tags with equal signs get escaped properly (<nowiki>{{stub=tag-name}}</nowiki>, not {{tl|stub=tag-name}}). עוד מישהו Od Mishehu 03:41, 10 August 2014 (UTC)
Sure thing; this is normally handled automatically by my editing tools, but you caught me on my holidays. It seems my mobile phone's just no good at copying and pasting utf8 and html. I'll regenerate the reports again from my normal system in a couple of weeks time for you. - TB (talk) 11:30, 10 August 2014 (UTC)
By the way, keep in mind (for Wikipedia:WikiProject Stub sorting/Uncatted stubs) that transclusions of redirects are also counted as transclusions of the redirect target; counting each separately means that you end up counting them twice. עוד מישהו Od Mishehu 08:52, 12 August 2014 (UTC)
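A minimal sketch of that point, assuming the same replica schema as the script further down this page and a made-up stub template name: because templatelinks records a row for both a transcluded redirect and its target, counting distinct transcluding pages against the target title alone already covers uses that go through a redirect.
-- Sketch: count articles using the (hypothetical) Template:Foo-stub once each,
-- whether they transclude it directly or through a redirect
SELECT COUNT(DISTINCT tl.tl_from) AS articles_using_template
FROM enwiki_p.templatelinks tl
INNER JOIN enwiki_p.page p ON p.page_id = tl.tl_from
WHERE p.page_namespace = 0
  AND tl.tl_namespace = 10
  AND tl.tl_title = 'Foo-stub';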

Not all admins are active[edit]

About that box on your userpage... wouldn't it be more accurate to say, "... and of those 1,402 (0.01%) administrators, only 612 are active (as per WP:List of administrators)"? That would raise the number to one administrator for each 213 'active' users. -- œ 04:04, 12 August 2014 (UTC)

Update most wanted pages[edit]

Hello Topbanana,

I'm Gerardduenas from the Catalan Wikipedia. We were wondering whether there is a way to get the Most wanted pages content in order to modify it or filter out some of the pages. Due to my lack of programming skills (I only know a bit of Python), I've been searching for scripts on MediaWiki and other places until I got to the English Wikipedia's most wanted articles project. Beland told me that his scripts were too outdated to keep working and that you might have an updated script.

Thank you in advance,

Gerardduenas (talk) 10:54, 22 September 2014 (UTC)

The tools I use to generate the most wanted articles lists are part of the Red Link Recovery project. I have enabled the Red Link Recovery Live tool for the Catalan Wikipedia and extracted the 500 'most wanted' red links from it for you at ca:Usuari:Gerardduenas/MWA. - TB (talk) 16:32, 22 September 2014 (UTC)
Thank you very much! Just another little question, how did you extract the list? --Gerardduenas (talk) 20:15, 22 September 2014 (UTC)
I've pulled the relevant parts of the script together for you below.
-- Find all redlinks
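-- (pagelinks rows from articles and templates whose target page does not exist)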
CREATE TABLE pageredlinks (
  prl_from INT(8) UNSIGNED NOT NULL DEFAULT '0',
  prl_namespace INT(11) NOT NULL DEFAULT '0',
  prl_title varbinary(255) NOT NULL DEFAULT '',
  UNIQUE KEY pl_from (prl_from,prl_namespace,prl_title),
  KEY prl_namespace (prl_namespace,prl_title)
);
 
REPLACE INTO pageredlinks ( prl_from, prl_namespace, prl_title )
SELECT l.pl_from, l.pl_namespace, l.pl_title
FROM cawiki_p.pagelinks l
INNER JOIN cawiki_p.page f ON f.page_id = l.pl_from
LEFT OUTER JOIN cawiki_p.page t ON t.page_namespace = l.pl_namespace AND t.page_title = l.pl_title
WHERE f.page_namespace IN ( 0, 10 )
AND  t.page_id IS NULL;
 
-- Find those with a significant number of sources
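-- ('significant' here meaning a mainspace red link with more than ten incoming links)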
CREATE TABLE mwa AS
SELECT prl_namespace, prl_title, COUNT(*) AS 'c'
FROM pageredlinks
WHERE prl_namespace = 0
GROUP BY prl_namespace, prl_title
HAVING COUNT(*) > 10;
 
-- Define a function to estimate the 'wantedness' of each of these
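-- Each incoming link from an article counts one; for every linking page, the number
-- of times that page is itself transcluded into articles is then subtracted, which
-- cancels out links that only reach articles through a widely-transcluded template.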
DELIMITER //
 
DROP FUNCTION IF EXISTS wantedness //
 
CREATE FUNCTION wantedness( namespace INT, title VARCHAR(255) )
  RETURNS INT
  READS SQL DATA
  NOT DETERMINISTIC
BEGIN  
   DECLARE done INT DEFAULT FALSE;
   DECLARE w, pns, t INT;
   DECLARE ptit VARCHAR(255);
   DECLARE links CURSOR FOR
      SELECT page_namespace, page_title
      FROM cawiki_p.pagelinks
      INNER JOIN cawiki_p.page ON pl_from = page_id
      WHERE pl_namespace = namespace
      AND   pl_title = title;
   DECLARE CONTINUE HANDLER FOR NOT FOUND SET done = TRUE;
 
   SET w = 0;
 
   OPEN links;
   read_loop: LOOP
 
      FETCH links INTO pns, ptit;
      IF done THEN
         LEAVE read_loop;
      END IF;
 
      -- Count one for any link in namespace 0
      IF pns = 0 THEN
        SET w = w + 1;
      END IF;
 
      -- For each linking page, check how many times it has been transcluded into namespace 0
      SELECT COUNT(*)
      INTO t
      FROM cawiki_p.templatelinks
      INNER JOIN cawiki_p.page ON tl_from = page_id
      WHERE page_namespace = 0
      AND  tl_namespace = pns
      AND  tl_title = ptit;
 
      SET w = w - t;
 
   END LOOP;
   CLOSE links;
 
   RETURN w;
END;   
//
 
DELIMITER ;
 
-- Update the mwa table
UPDATE mwa SET c = wantedness( prl_namespace, prl_title );
 
-- Pick out those most wanted
SELECT concat( '*[[', prl_title, ']] - ', c, ' links' )
FROM mwa
ORDER BY c DESC
LIMIT 500;
Thank you for everything! --Gerardduenas (talk) 05:43, 25 September 2014 (UTC)