How Reporting Statistical Significance Causes p-Value Hacking

A new article by researchers at the University of Amsterdam shows that publication bias towards statistically significant results may cause p-value misreporting. The team examined hundreds of published articles and found that authors had reported p-values < .05 when they were in fact larger. They conclude that publication bias may incentivize researchers to misreport results.

I have recently had a paper rejected, with one of the reviewers stating that I presented “primarily a big set of null findings across a variety of models.” This is, apparently, not uncommon. A growing range of studies provides evidence that rejecting articles with non-significant findings (p-values ≥ .05) is common practice in social science (Fanelli 2010, Mervis 2014).

A new article by Ivar Vermeulen and his co-authors (2015) now provides evidence that such publication bias may cause researchers to misreport their p-values, so that their findings seem statistically significant below the cut-off point of .05, when in fact they are not.

For their study, the authors extracted 5,834 p-values from articles containing F and t test statistics published in 25 communication science journals between 2010 and 2012. They also extracted 19,380 F and t test statistics from 1,725 papers published in 28 social psychology journals over the same period. Using the reported t and F values and their degrees of freedom, they recalculated the appropriate p-value and compared it to the reported significance level. (They performed additional analyses that you can read about in the paper.)
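The core consistency check can be sketched in a few lines, here with SciPy. This is my own illustration, not the authors' code, and it assumes two-tailed t tests: recompute the p-value from the reported statistic and degrees of freedom, then flag cases where the reported p claims significance but the recomputed one does not.

```python
# Sketch of the recalculation step (assumed two-tailed t tests); an
# illustration of the idea, not the authors' actual procedure.
from scipy import stats

def p_from_t(t, df):
    """Two-tailed p-value for a t statistic with df degrees of freedom."""
    return 2 * stats.t.sf(abs(t), df)

def p_from_F(F, df1, df2):
    """P-value for an F statistic with (df1, df2) degrees of freedom."""
    return stats.f.sf(F, df1, df2)

def critical_error(reported_p, computed_p, alpha=0.05):
    """Flag the 'critical' case: reported as significant when it is not."""
    return reported_p < alpha <= computed_p

# Example: t(10) = 2.00 reported as "p < .05" would be flagged,
# since the actual two-tailed p is about .073.
recomputed = p_from_t(2.00, 10)
print(round(recomputed, 3))
print(critical_error(0.04, recomputed))
```

Note that an F test with one numerator degree of freedom is equivalent to a two-tailed t test (F = t²), so both reporting styles can be checked the same way.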

Clear bias toward reporting p-values too low

In communication science, 8.8% of the reported p-values were misreported, and in social psychology 8.7%. Some of the misreported p-values were critical errors that turned non-significant results into apparently significant ones – a clear advantage if you want to get published. Interestingly, the authors also found that p-value misreporting is more common in higher-impact journals and when smaller samples were used.

The authors concluded:

With respect to errors in p-value reporting, our results show that p-value misreporting in communication science literature is rather frequent, similarly frequent to the field of social psychology (…). It is disturbing that, rather than being innocent mistakes, these errors appear to be driven by researchers’ motivations to demonstrate significant relationships. That is, our results reveal that erroneously reducing p-values (especially in favor of significance) occurs significantly more often than erroneously enlarging p-values.

Proposed solutions: Pre-registration and transparency

The authors propose several solutions that can be tackled by journals and by the scientific community as a whole. For example:

  1. Journals should ask editors and reviewers to run an automated statistics check on every submitted manuscript
  2. Increase transparency, e.g., by posting research data and analysis files in public repositories and promoting replication research
  3. Pre-register studies, i.e., researchers submit their hypotheses and analysis plans before starting data collection


Read the full study

Ivar Vermeulen, Camiel J. Beukeboom, Anika Batenburg, Arthur Avramiea, Dimo Stoyanov, Bob van de Velde, Dirk Oegema, Blinded by the Light: How a Focus on Statistical “Significance” May Cause p-Value Misreporting and an Excess of p-Values Just Below .05 in Communication Science, Communication Methods and Measures, Vol. 9, Iss. 4, 2015.

