The Reproducibility Project: Psychology was a crowdsourced collaboration of 270 contributing authors who set out to repeat 100 published experimental and correlational psychological studies. The project was led by the Center for Open Science and its co-founder, Brian Nosek, who started it in November 2011; the results were published in August 2015. Reproducibility here means the ability to produce the same findings, using the same methodology as the original work, but on a different dataset (for instance, one collected from a different set of participants). The project illustrated the growing problem of failed reproducibility in social science, and it started a movement that has spread through the sciences to expand the testing of the reproducibility of published work.
Brian Nosek of the University of Virginia and colleagues set out to replicate 100 studies published in 2008 in three journals, Psychological Science, the Journal of Personality and Social Psychology, and the Journal of Experimental Psychology: Learning, Memory, and Cognition, to see whether they could obtain the same results as the initial findings. In their original publications, 97 of the 100 studies claimed statistically significant results. The group went to extensive lengths to remain true to the original studies, including consulting the original authors. Even with these extra steps to recreate the conditions of the 97 originally significant studies, only 35 (36.1%) replicated, and even where an effect did replicate, it was often smaller than in the original paper. The authors emphasized that these findings reflect a problem that affects all of science, not just psychology, and that there is room to improve reproducibility in psychology.
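As a back-of-the-envelope illustration (not the project's own analysis), the following Python sketch simulates how publication bias toward significant results can inflate originally reported effect sizes and depress replication rates. The scenario, including the true effect size `true_d = 0.3`, the per-group sample size `n = 30`, and the helper `run_study`, is an assumption chosen for illustration only.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

def run_study(true_d, n):
    """Two-sample t-test with true standardized effect true_d and n per group."""
    a = rng.normal(0.0, 1.0, n)
    b = rng.normal(true_d, 1.0, n)
    t, p = stats.ttest_ind(b, a)
    observed_d = b.mean() - a.mean()  # population SD is 1, so this approximates d
    return p, observed_d

true_d, n = 0.3, 30  # assumed: modest true effect, small samples

# Publication bias: only studies reaching p < .05 get "published".
published = []
while len(published) < 97:            # mirror the 97 significant originals
    p, d = run_study(true_d, n)
    if p < 0.05:
        published.append(d)

# Direct replications: same methodology, new participants, no selection.
replications = [run_study(true_d, n) for _ in range(97)]
replicated = sum(p < 0.05 for p, _ in replications)

print(f"mean published effect:   {np.mean(published):.2f}")
print(f"mean replication effect: {np.mean([d for _, d in replications]):.2f}")
print(f"replication rate: {replicated}/97")
```

Under these assumed numbers, the "published" effects come out roughly twice the true effect, while only about a fifth of the unselected replications reach significance, qualitatively matching the shrinkage pattern the project reported.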
In 2021, the follow-up Reproducibility Project: Cancer Biology showed that of 193 experiments from 53 high-impact cancer papers published between 2010 and 2012, only 50 experiments from 23 papers could be replicated.
Metascience (also known as meta-research) is the use of scientific methodology to study science itself. Metascience seeks to increase the quality of scientific research while reducing inefficiency. It is also known as "research on research" and "the science of science", as it uses research methods to study how research is done and find where improvements can be made. Metascience concerns itself with all fields of research and has been described as "a bird's eye view of science".
The replication crisis (also called the replicability crisis and the reproducibility crisis) is an ongoing methodological crisis in which the results of many scientific studies are difficult or impossible to reproduce. Because the reproducibility of empirical results is an essential part of the scientific method, such failures undermine the credibility of theories building on them and potentially call into question substantial parts of scientific knowledge.
Reproducibility, closely related to replicability and repeatability, is a major principle underpinning the scientific method. For the findings of a study to be reproducible means that results obtained by an experiment, an observational study, or a statistical analysis of a data set should be achieved again with a high degree of reliability when the study is replicated. There are different kinds of replication, but typically replication studies involve different researchers using the same methodology.
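To make the distinction concrete, here is a minimal Python sketch, an illustration rather than a standard taxonomy: re-running the same analysis on the same data should return identical numbers, whereas a replication applies the same methodology to newly collected data and should give a statistically compatible, but not identical, result. The functions `collect_data` and `analyze` and the distribution parameters are hypothetical stand-ins.

```python
import numpy as np

rng = np.random.default_rng(42)

def collect_data(n=100):
    """Stand-in for data collection: draws a fresh sample on each call."""
    return rng.normal(loc=0.5, scale=1.0, size=n)

def analyze(sample):
    """The 'same methodology': estimate the mean and its standard error."""
    return sample.mean(), sample.std(ddof=1) / np.sqrt(len(sample))

original = collect_data()

# Reproduction: the identical analysis on the identical data set
# must return exactly the same numbers.
assert analyze(original) == analyze(original)

# Replication: the same methodology on a new sample; the estimates
# should agree within sampling error, not to the last digit.
replication = collect_data()
m1, se1 = analyze(original)
m2, se2 = analyze(replication)
print(f"original:    {m1:.3f} ± {se1:.3f}")
print(f"replication: {m2:.3f} ± {se2:.3f}")
```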
Related lectures
Le cours "Critical Data Studies" s'inscrit dans la nouvelle offre d'enseignements TILT qui propose de croiser des savoirs provenant des SHS et des sciences de l'ingénieur afin d'aborder des thématique
Kinetic information extracted from biochemical methane potential (BMP) tests is often reported, but its value is unclear. Inter-laboratory reproducibility provides a useful indication of its value. Here we extracted estimates of the first-order rate constant ... (2022)
Simulation script for the paper "Regularization for distributionally robust state estimation and prediction". Run tests/test_cdc.py to reproduce results. Extended versions can be found at https://github.com/DecodEPFL/. ...
This upload contains the relevant scripts, notebooks, and datasets to reproduce the numerically obtained results of the journal article "Three-dimensional buoyant hydraulic fractures: finite volume release" by Möri and Lecampion (2023). ...