
Negative results are disappearing from most disciplines and countries


Abstract

Concerns that the growing competition for funding and citations might distort science are frequently discussed, but have not been verified directly. Of the hypothesized problems, perhaps the most worrying is a worsening of positive-outcome bias. A system that disfavours negative results not only distorts the scientific literature directly, but might also discourage high-risk projects and pressure scientists to fabricate and falsify their data. This study analysed over 4,600 papers published in all disciplines between 1990 and 2007, measuring the frequency of papers that, having declared to have “tested” a hypothesis, reported a positive support for it. The overall frequency of positive supports has grown by over 22% between 1990 and 2007, with significant differences between disciplines and countries. The increase was stronger in the social and some biomedical disciplines. The United States had published, over the years, significantly fewer positive results than Asian countries (and particularly Japan) but more than European countries (and in particular the United Kingdom). Methodological artefacts cannot explain away these patterns, which support the hypotheses that research is becoming less pioneering and/or that the objectivity with which results are produced and published is decreasing.
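To make the reported trend concrete, the sketch below illustrates how a relative change in the yearly share of positive results can be computed. It is a minimal illustration under stated assumptions: the simulate_papers() helper and its 0.70-to-0.85 trend are hypothetical, not the study's data or its actual coding and statistical procedure.

```python
# Minimal sketch, assuming simulated data: computing the relative change in the
# yearly share of "positive" results. The simulate_papers() helper and its
# trend parameters are hypothetical; they are NOT the study's dataset.
import random
from collections import defaultdict

random.seed(0)  # reproducible toy example

def simulate_papers(n_per_year=250, start=1990, end=2007):
    """Return (year, outcome) pairs; outcome is 1 if the paper reports
    support for the tested hypothesis, 0 otherwise (hypothetical trend)."""
    papers = []
    for year in range(start, end + 1):
        p_positive = 0.70 + 0.15 * (year - start) / (end - start)
        papers.extend(
            (year, 1 if random.random() < p_positive else 0)
            for _ in range(n_per_year)
        )
    return papers

papers = simulate_papers()

# Tally positives and totals per year, then compute the yearly share.
counts = defaultdict(lambda: [0, 0])  # year -> [positives, total]
for year, outcome in papers:
    counts[year][0] += outcome
    counts[year][1] += 1
share = {year: pos / tot for year, (pos, tot) in sorted(counts.items())}

first, last = share[1990], share[2007]
print(f"Share of positive results: {first:.1%} (1990) -> {last:.1%} (2007)")
print(f"Relative increase: {100 * (last - first) / first:.1f}%")
```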



Acknowledgments

Robin Williams gave helpful comments, and François Briatte crosschecked the coding protocol. This work was supported by a Marie Curie Intra-European Fellowship (Grant Agreement Number PIEF-GA-2008-221441) and a Leverhulme Early-Career fellowship (ECF/2010/0131).

Author information


Corresponding author

Correspondence to Daniele Fanelli.


About this article

Cite this article

Fanelli, D. Negative results are disappearing from most disciplines and countries. Scientometrics 90, 891–904 (2012). https://doi.org/10.1007/s11192-011-0494-7
