CenPop Script Source and Info
My major contributions are historical United States Census population data for cities and towns (and boroughs, villages, townships, and any other form of municipality).
I have also done a good amount of vandalism reversion. User:ClueBot_NG is absolutely amazing for a lot of vandalism, but machine learning and natural language processing are still a long way from catching more subtle vandalism.
Areas of Expertise/Knowledge
- Cryptography, in particular lattice-based cryptography (I have a PhD in computer science focused on lattice-based cryptography). Disclosure: I am currently employed by NIST as a cryptographer, but any edits I make on Wikipedia do not reflect the views or opinions of the United States government, and to the best of my ability, do not reflect any point of view at all, in accordance with Wikipedia policy.
Historical Census Population Data
I am currently working on getting historical census population data (using the Template:US Census Population template) up for all incorporated places in the United States. So far, I have completed:
- All Incorporated Places (as of the 2015 estimates)
- All Minor Civil Divisions in Maine, New Jersey, and Vermont (as of the 2014 estimates), and in New Hampshire and Rhode Island (as of the 2015 estimates)
Unfortunately, compiling the old data can't be automated much, because the Census files for years before 1990 are scanned prints, and the scan quality is not good enough to trust OCR, at least with the tools I have access to.
However, to aid in adding the compiled data to Wikipedia, I've written User:DemocraticLuntz/CenPop, a modification of the AutoWikiBrowser script by User:joeyjte50. It can be run (at least by me) at Project:CenPop/Script
TODO: need to get gazetteer page for County subdivisions from server
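The compiled data ultimately becomes a Template:US Census Population call in each article. A minimal sketch of generating that wikitext from a year-to-population mapping is below; the parameter names (`|1990=`, `|estyear=`, `|estimate=`, `|footnote=`) are my reading of the template's interface, not something verified against the actual CenPop script, so treat them as assumptions.

```python
# Sketch: render {{US Census population}} wikitext from compiled census data.
# Parameter names below are assumed from the template's documentation, not
# taken from the CenPop script itself.

def census_template(populations, estimate=None, est_year=None, footnote=None):
    """Build a {{US Census population}} call from a {year: population} dict."""
    lines = ["{{US Census population"]
    for year in sorted(populations):
        lines.append(f"|{year} = {populations[year]}")
    if estimate is not None and est_year is not None:
        lines.append(f"|estyear = {est_year}")
        lines.append(f"|estimate = {estimate}")
    if footnote:
        lines.append(f"|footnote = {footnote}")
    lines.append("}}")
    return "\n".join(lines)

# Hypothetical data for illustration only.
print(census_template({1990: 1234, 2000: 1402, 2010: 1511},
                      estimate=1550, est_year=2015))
```

A script like CenPop can then paste this block into the article's Demographics section via AutoWikiBrowser.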
I have recently begun trying to update the SVG maps shown in many US place articles (as the original author, User:Arkyan, is gone, and his script with him). I forked a Python project for generating SVG maps, it being the closest thing I could find to what I wanted here.
It is far from what I would like in terms of automation, and I have not documented the changes I made for producing SVG map files for Wikipedia, but the actual code is available on GitHub.
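The core idea of such a map script can be illustrated in a few lines: draw every boundary polygon in a neutral fill and the subject place in a highlight color. The coordinates, colors, and helper names below are invented for the example; the real script works from Census boundary data, not hand-written points.

```python
# Minimal illustration of a locator-map generator: draw boundary polygons
# and highlight one place. All names, coordinates, and colors here are
# made up for the sketch; this is not the forked project's actual code.

def polygon(points, fill):
    """Render one SVG <polygon> from a list of (x, y) pairs."""
    pts = " ".join(f"{x},{y}" for x, y in points)
    return f'<polygon points="{pts}" fill="{fill}" stroke="#646464" stroke-width="1"/>'

def make_map(shapes, highlight, width=400, height=300):
    """shapes: {place_name: [(x, y), ...]}; the highlighted place is drawn in red."""
    body = "\n".join(
        polygon(pts, "#FF0000" if name == highlight else "#FFFFD0")
        for name, pts in shapes.items()
    )
    return (f'<svg xmlns="http://www.w3.org/2000/svg" '
            f'width="{width}" height="{height}">\n{body}\n</svg>')

svg = make_map({"Town A": [(10, 10), (100, 10), (100, 90), (10, 90)],
                "Town B": [(100, 10), (190, 10), (190, 90), (100, 90)]},
               highlight="Town A")
```

Writing the returned string to a `.svg` file produces a map any browser (or Wikipedia's thumbnailer) can render; the real work in a production script is projecting geographic boundary coordinates into this pixel space.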
Fortune 1000 Companies
I have also been working on adding and improving articles (mainly stubs, because I don't have sufficient reliable sources for full-length articles) on those Fortune 1000 (and other very large) companies currently missing from Wikipedia.
- Created pages for Health Care REIT, Superior Energy Services, Whiting Petroleum Corporation, Ventas, Team Health Holdings, Rush Enterprises, Hyster-Yale Materials Handling, LKQ Corporation, WPX Energy, Primoris Services Corporation, Scansource, Helmerich & Payne, Westlake Chemical, Adams Resources & Energy, MRC Global, Greif Inc., Magellan Health, Northern Tier Energy, H.B. Fuller, A-Mark Precious Metals, Renewable Energy Group
- TODO: Darling Ingredients, Crestwood Equity Partners, Enable Midstream Partners, Par Petroleum, Antero Resources, TrueBlue (fix with Labor Ready), RCS Capital, Seventy Seven Energy, Performance Food Group Company (needs Infobox), Univar (cleanup), Veritiv Corporation (cleanup), Builders FirstSource, California Resources Corporation, TRI Pointe Group, SPX FLOW (as distinct from SPX Corporation?), VCA Animal Hospitals (beef up infobox), TTM Technologies, Engility Holdings, On Assignment, Inc. (cleanup)