Wikipedia talk:WikiProject Newspapers/Wikidata


@99of9: Do you know how to create a Wikidata-based infobox like {{Infobox artwork/wikidata}}? It would be nice if there were a quick and easy way to create a basic infobox for a newspaper. If I'm understanding right, this would involve a workflow like...

  1. Use {{Infobox newspaper/wikidata}} and get a basic, somewhat generic infobox
  2. If and when the article develops further, perhaps replace it with {{Infobox newspaper}} in order to customize it

Is that about right? -Pete Forsyth (talk) 16:29, 2 August 2018 (UTC)[reply]

I know enough technically to make (or copy) them, but I'm not current enough on the politics and procedures of using them! I.e. I'd be liable to step on toes occasionally :). I tend to favour automation and single-source-of-truth, so your step 2 is counterintuitive to me. I'd either shoot for improving the customisability of the /wikidata version to fit any desired usage or, failing that (not as good), for making it easy to manually override the values coming from wd. --99of9 (talk) 13:19, 3 August 2018 (UTC)[reply]
OK, thanks. I don't know that it's a high priority, but maybe we can play around with these ideas at some point. When I look at the code in the "artwork" example above, I'm rather mystified. What are the basic skills needed? Lua? Template code? Something else? -Pete Forsyth (talk) 23:39, 3 August 2018 (UTC)[reply]
Most likely just template code, combined with a few calls to existing Lua modules. For example, it looks like we should understand Module:WikidataIB. --99of9 (talk) 11:26, 9 August 2018 (UTC)[reply]
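(As an aside, not something suggested in the thread above: one way to see what such an infobox would have to work with is to list the raw statements on a newspaper's Wikidata item. The sketch below does that for News-Register (Q7019170), an item that comes up in the LCCN thread further down; any of these statements could in principle be surfaced by an {{Infobox newspaper/wikidata}}.)

  # List every direct statement on an example newspaper item, with English labels.
  # This is roughly the pool of data a Wikidata-driven infobox could display.
  SELECT ?propertyLabel ?value ?valueLabel WHERE {
    BIND(wd:Q7019170 AS ?item)                # News-Register, used here as an example
    ?item ?claim ?value .                     # every direct claim on the item
    ?property wikibase:directClaim ?claim .   # map the claim predicate to its property
    SERVICE wikibase:label { bd:serviceParam wikibase:language "en". }
  }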

Color codes in example map?

@99of9: Thanks for explaining the color codes in the main talk page. Is there a way to make a color distinction between, say...

  • Exists on Wikidata but not EN Wikipedia
  • enwp article is stub
  • enwp article is start class
  • enwp article is C class

...

Or

  • Exists on Wikidata but not enwp
  • enwp article lacks infobox
  • enwp article has infobox

? I realize these conditions might be hard to test for...but maybe not impossible? -Pete Forsyth (talk) 23:53, 3 August 2018 (UTC)[reply]

@Peteforsyth: Hmm, yes, this is challenging. Wikidata doesn't store anything about infoboxes or article assessments (other than badges: GA, FA, etc.). I think I can see ways of getting this data into a spreadsheet, but not onto the normal GUI map. For example, within OpenRefine, I could call this API to tell me which pages transclude the stub/infobox template (or others). Another option, which at least gets it into a SPARQL query, is querying the categories that get added by those templates, but unfortunately that page says it doesn't work with the GUI yet. Your question pushes a bit beyond my experience, so there may be some workaround that I haven't thought of yet. --99of9 (talk) 11:06, 4 August 2018 (UTC)[reply]
I've asked for expert help here. --99of9 (talk) 11:35, 4 August 2018 (UTC)[reply]

OK, I've solved the second case here.
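(For anyone following along: the fix presumably works along the lines of the full query later posted in the "Query timeout limit reached" section below. Its key ingredient is the mwapi search call sketched here, which asks the English Wikipedia search API for pages whose wikitext contains "infobox newspaper" and maps each hit back to its Wikidata item; items found by the search can then be distinguished on the map from items that are not.)

  # Sketch of the mwapi building block: enwiki pages containing "infobox newspaper",
  # returned as Wikidata items (the full map query wraps this in an OPTIONAL block).
  SELECT ?item ?pageid WHERE {
    SERVICE wikibase:mwapi {
      bd:serviceParam wikibase:endpoint "en.wikipedia.org" .
      bd:serviceParam wikibase:api "Generator" .
      bd:serviceParam mwapi:generator "search" .
      bd:serviceParam mwapi:gsrsearch "insource:\"infobox newspaper\"" .
      bd:serviceParam mwapi:gsrlimit "max" .
      ?item wikibase:apiOutputItem mwapi:item .
      ?pageid wikibase:apiOutput "@pageid" .
    }
  } LIMIT 100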

99of9 That's fantastic! I'll send out a tweet in the morning with the Oregon portion. -Pete Forsyth (talk) 06:39, 9 August 2018 (UTC)[reply]

With a project of this scale, a spreadsheet is critical for communicating this type of issue. --Robert Sterbal, call/text 412-977-3526, robert@sterbal.com Rsterbal (talk) 12:19, 20 September 2018 (UTC)[reply]

Newspapers and Wikidata conference call

Hi all, especially Bluerasberry and 99of9--

Michaelacaulfield has suggested that we should have a Wikidata conference call for this project later this week. Would either of you be interested/able to join in? I can provide some general info about Wikidata, but I think it would be very helpful to have somebody with deeper experience with it than I have. (I can create individual records and claims, but I don't know much of anything about building queries, maps, data-based infoboxes, etc. Hoping somebody who knows more about such things can join in!) Mike and I are both in the Pacific time zone. Apart from that, I don't know what times he's proposing, but I'd imagine there's some flexibility. -Pete Forsyth (talk) 00:20, 8 August 2018 (UTC)[reply]

I could do your Wednesday 4:10pm-5:50pm or your Thursday 5:10pm-5:50pm. Later hours in your evening are also possible. --99of9 (talk) 00:27, 8 August 2018 (UTC)[reply]
99of9 I've passed those times along to Mike, hopefully one will work. (They're both good for me, but Mike is the critical one.) By the way, would you like to join our Slack channel? It's been pretty helpful for coordinating, and is pretty low-volume.
Also -- I see that you're getting some good responses on more sophisticated maps from Wikidata -- glad to see that discussion progressing! -Pete Forsyth (talk) 00:47, 8 August 2018 (UTC)[reply]

MediaWiki podcast

https://sterbalssundrystudies.miraheze.org/wiki/Category:Between_the_Brackets_podcast — Preceding unsigned comment added by 72.22.12.62 (talk) 21:31, 19 September 2018 (UTC)[reply]

Query timeout limit reached

I keep getting a "Query timeout limit reached" error when running this query:

https://query.wikidata.org/#%23defaultView%3AMap%0A%0ASELECT%20DISTINCT%20%3Fitem%20%3FitemLabel%20%3Fplace%20%3FplaceLabel%20%3Fid%20%3Fcoords%20%3Farticle%20%3Fmapflags%20%3Frgb%20WHERE%20%7B%0A%20%3Fc%20wdt%3AP279%2a%20wd%3AQ11032%20.%0A%20%20%3Fitem%20wdt%3AP31%20%3Fc%20.%0A%20%3Fitem%20wdt%3AP17%7Cwdt%3AP495%20wd%3AQ30%20.%0A%20OPTIONAL%7B%3Fitem%20wdt%3AP5454%20%3Fid%20.%7D%0A%20%3Fitem%20wdt%3AP291%7Cwdt%3AP159%7Cwdt%3AP131%20%3Fplace%20.%0A%20%3Fplace%20wdt%3AP625%20%3Fcoords.%0A%20OPTIONAL%20%7B%20%20%20%0A%20%20%20SELECT%20%3Fitem%20%3Fpageid%20%3Fns%20WHERE%20%7B%0A%20%20%20%20SERVICE%20wikibase%3Amwapi%20%7B%0A%20%20%20%20%20%20bd%3AserviceParam%20wikibase%3Aendpoint%20%22en.wikipedia.org%22%20.%0A%20%20%20%20%20%20bd%3AserviceParam%20wikibase%3Aapi%20%22Generator%22%20.%0A%20%20%20%20%20%20bd%3AserviceParam%20mwapi%3Agenerator%20%22search%22%20.%0A%20%20%20%20%20%20bd%3AserviceParam%20mwapi%3Agsrsearch%20%22insource%3A%5C%22infobox%20newspaper%5C%22%22%20.%0A%20%20%20%20%20%20bd%3AserviceParam%20mwapi%3Agsrlimit%20%22max%22%20.%0A%20%20%20%20%20%20%3Fitem%20wikibase%3AapiOutputItem%20mwapi%3Aitem%20.%0A%20%20%20%20%20%20%3Fpageid%20wikibase%3AapiOutput%20%22%40pageid%22%20.%0A%20%20%20%20%20%20%3Fns%20wikibase%3AapiOutput%20%22%40ns%22%20.%0A%20%20%20%20%7D%0A%20%20%7D%20LIMIT%2015000%0A%20%7D%0A%20%20%20%20%0A%20%20OPTIONAL%20%7B%0A%20%20%20%20%20%20%3Farticle%20schema%3Aabout%20%3Fitem%20.%0A%20%20%20%20%20%20%3Farticle%20schema%3AinLanguage%20%22en%22%20.%0A%20%20%20%20%20%20FILTER%20%28SUBSTR%28str%28%3Farticle%29%2C%201%2C%2025%29%20%3D%20%22https%3A%2F%2Fen.wikipedia.org%2F%22%29%0A%20%20%7D%0A%0A%20%20BIND%28IF%28BOUND%28%3Fns%29%2C%22_infobox%22%2C%22_noinfobox%22%29%20AS%20%3Finfobox%29.%0A%20%20BIND%28IF%28BOUND%28%3Farticle%29%2C%22enwiki%22%2C%22noenwiki%22%29%20AS%20%3Fonwiki%29.%20%20%0A%20%20BIND%28CONCAT%28%3Fonwiki%2C%3Finfobox%29%20as%20%3Flayer%29%20.%0A%0A%20%20BIND%28%20IF%28BOUND%28%3Farticle%29%2C%20%20%20IF%28BOUND%28%3Fns%29%2C%20%22009500%22%20%2C%20%22FFF000%22%20%29%20%20%2C%20%20%22FF0000%22%20%20%20%29%20%20AS%20%3Frgb%29.%20%20%0A%20%20%0A%20%23hint%3APrior%20hint%3ArunFirst%20%22true%22.%0A%20%23OPTIONAL%7B%20%3Fitem%20wdt%3AP5454%20%3Fid%20.%20%7D%0A%20%20%0A%20SERVICE%20wikibase%3Alabel%20%7B%20bd%3AserviceParam%20wikibase%3Alanguage%20%22en%22.%20%7D%0A%7D%20
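(For readability, here is the SPARQL encoded in that URL, decoded directly from the link, with the whitespace tidied and a few explanatory comments added.)

  #defaultView:Map
  SELECT DISTINCT ?item ?itemLabel ?place ?placeLabel ?id ?coords ?article ?mapflags ?rgb WHERE {
    ?c wdt:P279* wd:Q11032 .                       # any subclass of newspaper
    ?item wdt:P31 ?c .
    ?item wdt:P17|wdt:P495 wd:Q30 .                # country / country of origin = United States
    OPTIONAL { ?item wdt:P5454 ?id . }
    ?item wdt:P291|wdt:P159|wdt:P131 ?place .      # place of publication / HQ / admin entity
    ?place wdt:P625 ?coords .
    OPTIONAL {
      SELECT ?item ?pageid ?ns WHERE {
        SERVICE wikibase:mwapi {
          bd:serviceParam wikibase:endpoint "en.wikipedia.org" .
          bd:serviceParam wikibase:api "Generator" .
          bd:serviceParam mwapi:generator "search" .
          bd:serviceParam mwapi:gsrsearch "insource:\"infobox newspaper\"" .
          bd:serviceParam mwapi:gsrlimit "max" .
          ?item wikibase:apiOutputItem mwapi:item .
          ?pageid wikibase:apiOutput "@pageid" .
          ?ns wikibase:apiOutput "@ns" .
        }
      } LIMIT 15000
    }
    OPTIONAL {
      ?article schema:about ?item .
      ?article schema:inLanguage "en" .
      FILTER (SUBSTR(str(?article), 1, 25) = "https://en.wikipedia.org/")
    }
    BIND(IF(BOUND(?ns), "_infobox", "_noinfobox") AS ?infobox) .
    BIND(IF(BOUND(?article), "enwiki", "noenwiki") AS ?onwiki) .
    BIND(CONCAT(?onwiki, ?infobox) AS ?layer) .
    BIND(IF(BOUND(?article), IF(BOUND(?ns), "009500", "FFF000"), "FF0000") AS ?rgb) .
    #hint:Prior hint:runFirst "true".
    #OPTIONAL { ?item wdt:P5454 ?id . }
    SERVICE wikibase:label { bd:serviceParam wikibase:language "en". }
  }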

Can someone else run the query and send me the results?

Thanks, Rsterbal (talk) 02:45, 22 September 2018 (UTC)[reply]

Hi Rsterbal, I didn't notice this request before, sorry. I assume, based on our chat a few days ago, it's not needed any more? Let me know if you still need something. (FWIW, I've found that this query fails on occasion for me as well. Coming back in a couple of hours usually does the trick. I assume it has to do with Wikidata server resources.) -Pete Forsyth (talk) 18:07, 4 October 2018 (UTC)[reply]

I found another query.

Do you have a preferred query to run to check on progress every week or so?

Rsterbal (talk) 18:40, 4 October 2018 (UTC)[reply]

LCCN?

Which property should I use to add an LCCN? E.g., for this one: https://chroniclingamerica.loc.gov/lccn/sn95001761/ aka News-Register (Q7019170), what should be added? Laboramus (talk) 21:51, 29 November 2018 (UTC)[reply]

Hi Laboramus, just noticed this. Are you looking to add LCCN to the Wikidata item? I've been using wikidata:P1144 which is "Library of Congress Control Number (LCCN) (bibliographic)". I find the naming confusing, and the resulting links don't seem to go anywhere useful. Might be worth bringing up at wikidata:Wikidata:WikiProject Periodicals or similar. The property names or descriptions could probably be improved, but I don't know enough to make a specific suggestion. -Pete Forsyth (talk) 05:41, 20 December 2018 (UTC)[reply]
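(A small side note, not something suggested in the thread: one way to sanity-check how P1144 is being used is to list the US newspapers that already carry it. A sketch, using the same newspaper and country items as the map query above:)

  # US newspapers that already have an LCCN (bibliographic), P1144
  SELECT ?item ?itemLabel ?lccn WHERE {
    ?item wdt:P31/wdt:P279* wd:Q11032 .   # instance of (a subclass of) newspaper
    ?item wdt:P17 wd:Q30 .                # country: United States
    ?item wdt:P1144 ?lccn .               # LCCN (bibliographic)
    SERVICE wikibase:label { bd:serviceParam wikibase:language "en". }
  }
  LIMIT 100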

Media sources for get out the vote

I'm trying to help people with their get out the vote campaigns by delivering a list of media sources for a zip code within each ballot level of a political race.

This project is a great start.

I'm hoping the list winds up in a table so I can query by the zip code the person running the campaign is in, and let them filter the list for media sources in nearby zip codes.

The more newspapers that get added, the better. I'm very impressed with the work that has gone into this so far.

Rsterbal (talk) 04:42, 20 December 2018 (UTC)[reply]

Thanks -- we've been getting a fair amount into Wikidata. Zip codes are not part of it, but if you can map zip codes to locations (like town names or county names), you can probably query Wikidata to get what you want. Others probably know the syntax better than I do, but I'd imagine it's doable. -Pete Forsyth (talk) 05:43, 20 December 2018 (UTC)[reply]
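(A rough sketch of what that could look like, under one assumption: that the zip code has already been mapped to a county or town item on Wikidata. wd:QXXXXXXX below is a placeholder, not a real item ID; the properties are the same ones the map query above uses.)

  # Newspapers published in, headquartered in, or located in places that fall within
  # a given county/town item (replace the wd:QXXXXXXX placeholder with a real item ID).
  SELECT DISTINCT ?item ?itemLabel ?placeLabel WHERE {
    ?c wdt:P279* wd:Q11032 .                     # any subclass of newspaper
    ?item wdt:P31 ?c .
    ?item wdt:P291|wdt:P159|wdt:P131 ?place .    # place of publication / HQ / admin entity
    ?place wdt:P131* wd:QXXXXXXX .               # place lies within the chosen county/town
    SERVICE wikibase:label { bd:serviceParam wikibase:language "en". }
  }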

USNPL ID links now go to malware

The domain for the US Newspaper Links (USNPL) now redirects to malware:
https://www.usnpl.com/

So the 6K Wikidata items with a USNPL ID now have malware links; e.g., Wikidata's NY Times page has this malware URL:
https://www.usnpl.com/search/newspapers?q=2293

Proposed fixes for the USNPL ID property are either:
1. Remove the "URL match pattern", or,
2. Change it to go to the Wayback Machine. For instance, this URL finds the latest capture in 2022 (when all the USNPL pages were still online):
https://web.archive.org/web/2022/https://www.usnpl.com/search/newspapers?q=5812

So the URL match pattern could be:
^https://web.archive.org/web/2022/https?:\/\/(?:www\.)?usnpl\.com\/search\/newspapers\?q=([1-9]\d*)

(Also posted this in Wikidata_talk:WikiProject_Periodicals)
Hearvox (talk) 19:01, 11 January 2024 (UTC)[reply]