
Exploring the Current State and Research Trends of the Reproducibility Crisis: A Bibliometric Analysis


DOI: 10.23977/infkm.2023.040104

Author(s)

Chao Huang 1

Affiliation(s)

1 Sichuan University, Chengdu, Sichuan, 610207, China

Corresponding Author

Chao Huang

ABSTRACT

This paper uses a bibliometric analysis of Web of Science data to explore the current state and research trends of the reproducibility crisis. Since around 2010, the crisis has been a major challenge in contemporary science, threatening the credibility of experiments and the academic authority of published papers. The paper traces the historical context of the crisis and identifies the research hotspots and trends of the literature at each stage, aiming to provide a foundation for further research on the reproducibility crisis. The analysis found that the number of articles published on the topic has increased year by year, indicating growing research intensity and attention from the academic community. The paper argues that bibliometric analysis is a useful tool for understanding how the crisis has developed and for identifying research hotspots and trends. Finally, it recommends establishing institutional mechanisms and a registry of responsible research practices to improve reproducibility in scientific research.
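The two core operations the abstract describes, counting publications per year and mapping keyword relationships, can be illustrated with a minimal sketch. The records below are hypothetical stand-ins for Web of Science export rows (a real analysis would parse an exported file and typically visualize the co-occurrence counts in a tool such as VOSviewer); the field names and sample keywords are assumptions for illustration only.

```python
from collections import Counter
from itertools import combinations

# Hypothetical Web of Science-style records: (publication year, author keywords).
records = [
    (2015, ["reproducibility", "psychology", "replication"]),
    (2016, ["reproducibility", "open science"]),
    (2018, ["replication", "preregistration", "open science"]),
    (2020, ["reproducibility", "preregistration"]),
]

# Publications per year: the year-by-year trend the paper reports as rising.
per_year = Counter(year for year, _ in records)

# Keyword co-occurrence: count pairs of keywords appearing in the same record,
# the raw input for a co-word (hotspot) map.
pairs = Counter()
for _, keywords in records:
    for a, b in combinations(sorted(set(keywords)), 2):
        pairs[(a, b)] += 1

print(sorted(per_year.items()))
print(pairs.most_common(3))
```

With real export data, the `pairs` table would be written out as an edge list and clustered to reveal the research hotspots at each stage of the crisis.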

KEYWORDS

Reproducibility crisis, scientometrics, visualization analysis, pre-registration, research integrity

CITE THIS PAPER

Chao Huang, Exploring the Current State and Research Trends of the Reproducibility Crisis: A Bibliometric Analysis. Information and Knowledge Management (2023) Vol. 4: 20-31. DOI: http://dx.doi.org/10.23977/infkm.2023.040104.


All published work is licensed under a Creative Commons Attribution 4.0 International License.

Copyright © 2016 - 2031 Clausius Scientific Press Inc. All Rights Reserved.