
User talk:Dr pda


BBA

(award moved to user page)

Thanks! Dr pda (talk) 01:36, 1 October 2009 (UTC)[reply]

Royal Company of Archers Peer Review

Thank you--Koakhtzvigad (talk) 02:01, 9 October 2009 (UTC)[reply]

extractpersondata.stx

Hello - thank you for your persondata scraping work. I am wondering if you could look at the page and see if the script posted is the same as the one you have working. I copied it straight and wasn't able to get any output when using it against the current Wikipedia download. I fixed the space in "substring-before" but it still doesn't work for me. Thanks for any assistance. —Preceding unsigned comment added by 208.120.207.201 (talk) 01:00, 15 October 2009 (UTC)[reply]

What you need to do, rather than just copying and pasting the text as shown on the page Wikipedia:Persondata/extractPersondata.stx, is to click the "edit the page" link at the top of that page, then copy the text which appears in the edit box. The reason there's a difference is that stx uses the strings &gt; and &lt; rather than the characters > and < when testing for inequality; however, when your browser renders the code it displays the former as the latter. Dr pda (talk) 03:21, 15 October 2009 (UTC)[reply]

I think I found the problem I was having with the stx file: it references the old namespace xmlns:m="http://www.mediawiki.org/xml/export-0.3/". When I switched this to match the 0.4 XML export, it worked. Thanks —Preceding unsigned comment added by 208.120.207.201 (talk) 00:27, 20 October 2009 (UTC)[reply]

Thanks. I should probably update the stx file. The reason I didn't come across this issue was that I tested the script on an old file. Dr pda (talk) 00:35, 29 October 2009 (UTC)[reply]

Prose size script

Hi Dr pda, I discovered your prose size script today, and just wanted to say thank you for writing it. Working out numbers of words, minus footnotes, headers etc, has always been a pain, and is now a pleasure. One small niggle is that it doesn't seem to count words in blockquotes, or at least doesn't highlight them in yellow, which I assume means it's not counting them. It's a minor issue as it won't affect the word count that much in most articles, but I thought you might want to know about it anyway, in case you don't already. And again, thank you for writing it. :) SlimVirgin talk|contribs 07:11, 24 October 2009 (UTC)[reply]

Thanks for the thanks! You are correct that the script doesn't count blockquotes. The reason is that when I first wrote the script, it turned out that simply counting the text which was in <p> </p> tags was almost exactly equivalent to determining the "readable prose". There are some occasions when this is not perfect; blockquotes are arguably one such case. Indeed, there was an argument (in which I didn't participate, though my name was thrown around) in December 2008 at Wikipedia talk:Did you know/Archive 36#SIZE over whether blockquotes should be counted as readable prose for the purposes of determining if an article was eligible for DYK. I have just reread WP:SIZE, and this doesn't exclude blockquotes from readable prose, so maybe I should make an effort to include them. One other reason I had for not including them was that people often use pull quotes instead of images, e.g. with templates like {{cquote}} or {{quote box2}}, which I thought used blockquote internally, and these should not be included in the readable prose. However, on further inspection most of these templates seem to use HTML tables instead. So maybe I should modify the script to include blockquotes. One case where it would still fail would be where articles incorrectly use pull quotes instead of blockquotes, e.g. the lead of Apollo program. I'll see what I can do when I have time; I might just add a toggle to include blockquotes. Dr pda (talk) 00:35, 29 October 2009 (UTC)[reply]
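(For illustration only: below is a minimal sketch of how blockquote text might be folded into a paragraph-based count. It is not the actual code of prosesize.js; the element id 'bodyContent' and the function names are assumptions.)

// Hypothetical sketch: approximate "readable prose" by summing the visible text
// of <p> elements plus <blockquote> elements inside the article body.
function roughProseSize() {
  var body = document.getElementById('bodyContent'); // assumed id of the article content area
  if (!body) return 0;
  var total = 0;
  // Count ordinary paragraphs.
  var paras = body.getElementsByTagName('p');
  for (var i = 0; i < paras.length; i++) {
    total += textOf(paras[i]).length;
  }
  // Count blockquotes, but only those that do not wrap their own <p> elements,
  // so the same text is not counted twice.
  var quotes = body.getElementsByTagName('blockquote');
  for (var j = 0; j < quotes.length; j++) {
    if (quotes[j].getElementsByTagName('p').length === 0) {
      total += textOf(quotes[j]).length;
    }
  }
  return total; // character count of the sampled prose
}

function textOf(elem) {
  // textContent where available; older versions of IE expose innerText instead
  return elem.textContent || elem.innerText || '';
}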

Hey there

You have email. Best, Fvasconcellos (t·c) 23:14, 17 December 2009 (UTC)[reply]

Replied. Dr pda (talk) 05:12, 18 December 2009 (UTC)[reply]

Season's Greetings and all that ...

Happy Holidays
Wishing you and yours a Happy Holiday Season, from the horse and bishop person. Ealdgyth - Talk 16:33, 24 December 2009 (UTC)[reply]

And now, for FV's traditional last-minute nonsectarian holiday greeting!

Here’s wishing you a happy end to the holiday season and a wonderful 2010.
Fvasconcellos (t·c) 15:14, 25 December 2009 (UTC)[reply]

I've replied regarding the plan. Nev1 (talk) 17:08, 30 January 2010 (UTC)[reply]

I've read your reply, and also Awadewit's and Elcobbola's comments. It seems this is indeed a grey area. I was just about to add a comment on the FAC page saying that this issue has been satisfactorily addressed, but I found it had already been promoted. Congratulations! Dr pda (talk) 01:53, 31 January 2010 (UTC)[reply]

Fewer reviewers in 2009

Thank you for such a clear presentation of the situation in the upcoming Signpost. Only one statement (in the pullquote) was somewhat confusing at first: "FAC nominators only up 250%". I did not immediately grasp that you were referring to those that only nominated but did not review. I thought you were saying that the FAC nominators (nominations) "only" increased by 250%. (As if that were a small increase!) But you mean those who nominated but did not contribute to reviewing. Sorry! It might help if you put that figure in the text also, as not all of us quickly realize that 2.5 times translates to 250%. Regards, —mattisse (Talk) 00:39, 9 February 2010 (UTC)[reply]

Thanks for your feedback. I used 'factor of 2.5' in the text, as percentages greater than 100% are often confusing; however, in the summary box I wanted to be consistent, so I used percentages for all the figures. I've now added 250% to the text as well. Regarding "nominators only up 250%", I was trying for brevity in the summary box, though it appears clarity suffered. I have since added quotes around 'nominators only' to make it clearer what I am referring to. Dr pda (talk) 01:20, 9 February 2010 (UTC)[reply]
Thanks. I think that helps. Regards, —mattisse (Talk) 01:34, 9 February 2010 (UTC)[reply]
Good article; thanks for your work! I hope you don't mind that I tried to lift the title and opening of the article ... a bit more positive, even though from inside the house it's easy to see the negative side. Tony (talk) 06:17, 9 February 2010 (UTC)[reply]
Not at all. That paragraph was largely copied and pasted from the corresponding Dispatch I wrote about this time last year, and I had been in two minds about its tone anyway (introduction vs more journalistic lead). The fact that reviewers gain respect for their contributions is probably something which deserves mentioning, and may entice more people into reviewing! Finetooth's suggestion at WT:PR about a userbox for the number of peer reviews done is in the same vein. Dr pda (talk) 10:13, 9 February 2010 (UTC)[reply]

There was a query about including A-Class reviews in the statistics; see this comment. Nice work overall. Dabomb87 (talk) 01:33, 10 February 2010 (UTC)[reply]

Replied there. Dr pda (talk) 03:13, 10 February 2010 (UTC)[reply]

(Barnstar from TomStar81 moved to user page)

Much appreciated. Dr pda (talk) 22:21, 17 February 2010 (UTC)[reply]

Thanks for cryonics cites and docs

I have them. That was about 1000 times easier than a trip to the library. :D Thank you! 99.22.95.61 (talk) 22:41, 17 February 2010 (UTC)[reply]

...and the other. I'm filled with gratitude and appreciation. 99.22.95.61 (talk) 01:25, 18 February 2010 (UTC)[reply]

I did get the Fahy et al. review, too, and don't need the others. "Directional freezing is useful for macro-homogeneous tissue (e.g., whole single organs) but not heterogeneous collections of more than one organ at a time," so it's of little use for my purposes. Thanks again! 99.191.75.124 (talk) 19:40, 23 February 2010 (UTC)[reply]

Page size script

Hey Dr pda, I just found your script today that shows page size in the toolbox. I recently switched to the beta version of Wikipedia, though, and the script doesn't work in beta. Is there any way to fix this? I don't know enough javascript to be able to do anything. Great tool! (I've gotten it to work on non-beta). Thanks! --Dudemanfellabra (talk) 23:01, 2 March 2010 (UTC)[reply]

Hi. The beta version of the Wikipedia interface uses a different skin (vector instead of monobook); therefore any scripts which you have in your monobook.js won't be picked up under beta. Instead you need to install them in User:yourusername/vector.js. I haven't explicitly tested the script for beta/vector, but I think it should work without further modifications. Dr pda (talk) 23:45, 2 March 2010 (UTC)[reply]
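(For reference, the line to put in User:yourusername/vector.js would presumably be the same one used for monobook.js; the snippet below is a sketch, assuming the standard importScript loader.)

// In User:yourusername/vector.js: load the prose size script under the vector (beta) skin
importScript('User:Dr pda/prosesize.js');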
Ah ok that explains it. I'd rather not create a new subpage, though.. Do you know if beta will switch over to monobook if it ever becomes umm.. non-beta? haha. If it will switch over, I'll just wait until the switch. --Dudemanfellabra (talk) 05:03, 3 March 2010 (UTC)[reply]
An alternative way to run the script without installing it is to go to the page you are interested in, then paste javascript:importScript('User:Dr pda/prosesize.js'); getDocumentSize(); into the address bar of your browser instead of the URL. It's also possible to make this a bookmark, to save having to type it out each time. Dr pda (talk) 10:20, 3 March 2010 (UTC)[reply]
Awesome.. I bookmarked it, and it works perfectly now. Thanks for the help! --Dudemanfellabra (talk) 14:27, 3 March 2010 (UTC)[reply]

Thank you for the symbolic regression help

I have that file. Thank you so much. 99.38.151.90 (talk) 09:51, 8 March 2010 (UTC)[reply]