
Philip Meyer Journalism Award

From Wikipedia, the free encyclopedia


Philip Meyer Journalism Award
Description: Best journalism done using social science research methods
Country: United States
Presented by: National Institute for Computer-Assisted Reporting and Arizona State University's Walter Cronkite School of Journalism and Mass Communication
Reward(s): $500
First awarded: 2005
Website: ire.org/awards/philip-meyer-awards

The Philip Meyer Journalism Award has been awarded since 2005 to recognize the best journalism done using social science research methods.

Background

The Philip Meyer Journalism Award is a joint program of the National Institute for Computer-Assisted Reporting and Arizona State University's Walter Cronkite School of Journalism and Mass Communication. The award is named for Philip Meyer, a groundbreaking journalist and professor who has championed the use of scientific methods in the media. It is presented at the annual conference held by the National Institute for Computer-Assisted Reporting.

Thomas Hargrove, Fred Schulte and David Donald are the only reporters to have won the award twice.

Winners

Year Winner Entry Citation
2005[1] The Oregonian 'Unnecessary Epidemic' A series of articles over the past year showing how Congress and the Drug Enforcement Administration could have stopped the growth of meth abuse by aggressively regulating the import of the chemicals necessary to make it. Lead reporter Steve Suo's work included sophisticated statistical analyses of data on hospital and treatment center admissions, arrests, meth prices and purity, and chemical imports.
2006[2] The Wall Street Journal 'Perfect Payday' For a series of articles over the past year that exposed the widespread practice of secretly backdating stock option grants to benefit corporate insiders. Lead writers Charles Forelle and James Bandler used a statistical model to calculate the wildly improbable odds that options grant dates would just happen to be so favorably profitable to dozens of executives at some of the nation's best-known companies. Their stories about the scandal have spurred an ongoing federal securities investigation into rigged options at more than 100 companies to date.
2007[3] The Dallas Morning News 'Faking the Grade' For a three-day series that uncovered strong evidence of cheating on standardized tests by more than 50,000 students in Texas public and charter schools. Reporters Joshua Benton and Holly Hacker followed up on the paper’s groundbreaking 2004 investigation of cheating at the district and school level by analyzing a huge public records database of the scores and answers of hundreds of thousands of individual students taking the tests over a two-year period. The series prompted the state to announce stricter controls over test-taking conditions in Texas schools, and to adopt the cheat-detection statistical methods used by the paper.
2008[4] Scripps Howard News Service 'Saving Babies: Exposing Sudden Infant Death' Scripps Howard national reporters Tom Hargrove, Lee Bowman and Lisa Hoffman did a masterful job in exposing bureaucratic lapses that hinder the search for causes of Sudden Infant Death. Making good use of strong statistical tools, the team analyzed the sharp differences in cause-of-death diagnoses among the states and produced the first rigorous proof of the value of the local and state child death review boards that only some jurisdictions use. A few months after the project ran, then-U.S. Sen. Barack Obama introduced national legislation that would require medical examiners to make death scene investigations in all cases of unexpected infant death.
2009[5] USA Today 'The Smokestack Effect' USA Today reporters Blake Morrison and Brad Heath used techniques from social and physical sciences to examine the levels of air pollution at schools across the country. They gathered tens of millions of air quality and industrial pollution records and the locations of nearly 128,000 schools, then used the Environmental Protection Agency's own pollution model to identify thousands of schools where the air was far more toxic than in nearby neighborhoods. USA Today teams also spent weeks gathering air samples at 95 schools in 30 states, proving high pollution levels at two-thirds of them. The stories prompted immediate action from the EPA, including creation of a $2.25 million program to monitor air quality at schools.
2010[6] Los Angeles Times 'Grading the Teachers' A first-rate example of strong watchdog story-telling combined with innovative use of social science methods. Indeed, the point of the project was the failure of Los Angeles school officials to use effective methods to measure the performance of classroom teachers. The Los Angeles Times, applying a method called gain-score analysis to a huge database of individual students’ test scores and their teachers, identified the most and least effective teachers based on how much the students’ scores improved. The Times hired a national expert in gain-score analysis to do the data crunching, adding credibility to the results, but also did additional statistical analysis to identify high- and low-performing schools and otherwise verify their findings. In identifying and rating 6,000 teachers by name, the Times outraged the teachers’ union, but the series has prompted district officials to begin negotiating with the union to use the gain-score method in evaluations. Another sign of the impact of this series is that newspapers across the country have begun requesting similar data from local school districts.
2011[7] Scripps Howard News Service 'Murder Mysteries' The series is a sterling example of the power of precision journalism to find revealing patterns in data. Thomas Hargrove began the project by wondering if the FBI’s Supplementary Homicide Report could be used to detect the work of serial killers among the nation’s more than 185,000 unsolved murders. He first discovered that local police failed to report thousands of murders to the FBI and spent months using Freedom of Information laws to gather details of more than 15,000 unlogged murders across the country. After building what experts say is the most complete database of unsolved murders available, Hargrove developed a unique algorithm that used the statistical technique of cluster analysis to identify the likely traces of serial murders, as marked by victims of similar demographics killed by similar means. Police in at least eight cities have acknowledged that the clusters found by Hargrove are either confirmed serial cases or are likely to be such. The database was placed online so readers could do their own interactive analysis of local murders, and the entire dataset is available for anyone to download and explore. At least one armchair detective has used the data to find a cluster that police in his area agree is the work of a heretofore unacknowledged serial killer.
2012[8] The Center for Public Integrity 'Cracking the Codes' CPI uncovered the vast scale of Medicare billing errors and abuses that have padded the incomes of thousands of medical professionals to the tune of more than $11 billion over the past decade. Reporters Fred Schulte and Joe Eaton, working with project editor Gordon Witkin and database editor David Donald, analyzed 133 million Medicare records to demonstrate how so-called “upcoding” of diagnoses and procedures was steadily increasing Medicare payouts over the years. A key part of the analysis involved plotting the distribution curves of payment codes year by year, controlling for patients’ age and condition, thereby showing how use of more expensive codes was steadily increasing.
2013[9] ProPublica 'The Prescribers' This team of intrepid reporters did what Medicare officials had failed to do. They put a bright spotlight on the hundreds of millions of dollars that are wasted each year by a relative handful of doctors who prescribe expensive brand name drugs in high volumes instead of much cheaper generics. Using statistical tests to identify the outliers, they chewed through more than 1.2 billion Medicare Part D records and built an interactive "Prescriber Checkup" database that lets readers see the prescription patterns of their own physicians. Their reporting prompted Medicare to promise Congress that the agency would toughen its oversight of such abuses.
2014[10] The Center for Public Integrity 'Medicare Advantage Money Grab' In a superb series on behalf of the taxpayer, The Center for Public Integrity exposed how the medical industry has raised the “risk scores” for elderly patients to overbill the Medicare Advantage program tens of billions of dollars. Despite the challenges of dealing with complex and voluminous government data, the Center aptly dissected the shocking shortcomings of a program that was meant to stabilize costs, but instead has allowed the industry to harvest huge sums by saying patients were sicker than they were. The explanation of the risk score system and the analysis of how it is manipulated were particularly lucid.
2015[11] The Tampa Bay Times 'Failure Factories' The team used statistical analysis and linear regression of data from dozens of records requests to document how steady resegregation of Pinellas County schools left black children failing at higher rates than anywhere else in Florida. The series focused on failures of school district officials to give the schools the support necessary for success. The judges praised the reporters for dogged work on a project that took 18 months to report and write, and noted that the results underscored what decades of sociological research has shown happens in racially segregated schools.
2016[12] The Atlanta Journal-Constitution 'Doctors & Sex Abuse' The newspaper took data analysis for a story to new levels of sophistication. The goal was to root out instances in which doctors had abused patients and gone unpunished, but the task was more than daunting. The team built 50 scrapers to pull in more than 100,000 documents. They then used machine learning to analyze all of those documents, searching for keywords that alluded to cases of sexual misconduct. They backed up their findings with other sophisticated data analysis and shoe-leather reporting. The sheer scope of their project was impressive. Even more impressive were the results. The investigation found that doctors in every state had abused patients and, even when caught, still went unpunished.
2017[13] The Chicago Tribune 'Dangerous Doses' Dangerous Doses was groundbreaking work that made a remarkable discovery: More than half of the 255 pharmacies that the Chicago Tribune tested failed to warn patients about potentially deadly interactions. To identify the holes in patient safety, the paper consulted leading pharmacology researchers at universities to design the drug pairs for the pharmacy-testing project. The team then worked with a physician to obtain prescriptions, which 15 staff reporters took to pharmacies and documented whether they were told of potential adverse reactions. The results resonated in Illinois, with the governor launching new safety regulations, and nationwide with the country’s largest pharmacy chains, including CVS, Walgreens and Walmart, taking steps to improve patient safety for millions of consumers, potentially saving lives.

References

  • "2005 Philip Meyer Award winners". Investigative Reporters and Editors. Archived from the original on October 18, 2012. Retrieved October 20, 2012. {{cite web}}: Unknown parameter |deadurl= ignored (|url-status= suggested) (help)
  • "2006 Philip Meyer Award winners". Investigative Reporters and Editors. Archived from the original on October 18, 2012. Retrieved October 20, 2012. {{cite web}}: Unknown parameter |deadurl= ignored (|url-status= suggested) (help)
  • "2007 Philip Meyer Award winners". Investigative Reporters and Editors. Archived from the original on May 5, 2012. Retrieved October 20, 2012. {{cite web}}: Unknown parameter |deadurl= ignored (|url-status= suggested) (help)
  • "2008 Philip Meyer Award winners". Investigative Reporters and Editors. Archived from the original on May 5, 2012. Retrieved October 20, 2012. {{cite web}}: Unknown parameter |deadurl= ignored (|url-status= suggested) (help)
  • "2009 Philip Meyer Award winners". Investigative Reporters and Editors. Archived from the original on May 5, 2012. Retrieved October 20, 2012. {{cite web}}: Unknown parameter |deadurl= ignored (|url-status= suggested) (help)
  • "2010 Philip Meyer Award winners". Investigative Reporters and Editors. Archived from the original on May 5, 2012. Retrieved October 20, 2012. {{cite web}}: Unknown parameter |deadurl= ignored (|url-status= suggested) (help)
  • "2011 Philip Meyer Award winners". Investigative Reporters and Editors. Archived from the original on June 14, 2012. Retrieved October 20, 2012. {{cite web}}: Unknown parameter |deadurl= ignored (|url-status= suggested) (help)
  • "2012 Philip Meyer Award winners announced". Investigative Reporters and Editors. Retrieved January 9, 2012.
  • "2013 Philip Meyer Award winners". Investigative Reporters and Editors. Retrieved January 13, 2014.
  • "2014 Philip Meyer Award winners". Investigative Reporters and Editors. Retrieved February 1, 2015.
  • "2015 Philip Meyer Award winners". Investigative Reporters and Editors. Retrieved January 21, 2016.
  • "2016 Philip Meyer Award Winners Announced". Investigative Reporters and Editors. Retrieved January 23, 2017.
  • "2017 Philip Meyer Award winners announced". Investigative Reporters and Editors. Retrieved January 22, 2018.