
Research publications repository


Many Labs 5: Testing Pre-Data-Collection Peer Review as an Intervention to Increase Replicability

original article
published version (no other version)
Author
Ebersole, Charles
Mathur, Maya B.
Baranski, Erica
Bart-Plange, Diane-Jo
Ropovik, Ivan (ORCID: 0000-0001-5222-1233; Web of Science: J-7404-2015; Scopus: 56095404500)

(and other authors)

Publication date
2021
Published in
Advances in Methods and Practices in Psychological Science
Volume / Issue
3 (3)
ISSN
2515-2459
eISSN
2515-2467
Collections
  • Faculty of Education

This publication has a published version with DOI 10.1177/2515245920958687

Abstract
Replication studies in psychological science sometimes fail to reproduce prior findings. If these studies use methods that are unfaithful to the original study or ineffective in eliciting the phenomenon of interest, then a failure to replicate may be a failure of the protocol rather than a challenge to the original finding. Formal pre-data-collection peer review by experts may address shortcomings and increase replicability rates. We selected 10 replication studies from the Reproducibility Project: Psychology (RP:P; Open Science Collaboration, 2015) for which the original authors had expressed concerns about the replication designs before data collection; only one of these studies had yielded a statistically significant effect (p < .05). Commenters suggested that lack of adherence to expert review and low-powered tests were the reasons that most of these RP:P studies failed to replicate the original effects. We revised the replication protocols and received formal peer review prior to conducting new replication studies. We administered the RP:P and revised protocols in multiple laboratories (median number of laboratories per original study = 6.5, range = 3-9; median total sample = 1,279.5, range = 276-3,512) for high-powered tests of each original finding with both protocols. Overall, following the preregistered analysis plan, we found that the revised protocols produced effect sizes similar to those of the RP:P protocols (Δr = .002 or .014, depending on analytic approach). The median effect size for the revised protocols (r = .05) was similar to that of the RP:P protocols (r = .04) and the original RP:P replications (r = .11), and smaller than that of the original studies (r = .37). Analysis of the cumulative evidence across the original studies and the corresponding three replication attempts provided very precise estimates of the 10 tested effects and indicated that their effect sizes (median r = .07, range = .00-.15) were 78% smaller, on average, than the original effect sizes (median r = .37, range = .19-.50).
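A percentage reduction of this kind is typically an average of per-study proportional reductions in effect size, which need not equal the reduction implied by comparing the two medians directly. The sketch below illustrates that computation with hypothetical (original r, cumulative replication r) pairs, not the paper's actual data; the paper's own aggregation may differ.

```python
# Minimal illustrative sketch: average proportional reduction in effect size
# across paired studies. All values below are hypothetical, not the paper's data.
from statistics import mean, median

# Hypothetical (original r, cumulative replication r) pairs for 10 studies.
pairs = [
    (0.37, 0.07), (0.50, 0.15), (0.19, 0.00), (0.42, 0.10), (0.30, 0.05),
    (0.45, 0.12), (0.25, 0.03), (0.38, 0.08), (0.33, 0.06), (0.48, 0.11),
]

# Per-study shrinkage: how much smaller the replication estimate is,
# as a fraction of the original estimate.
reductions = [(orig - rep) / orig for orig, rep in pairs]

print(f"median original r:        {median(o for o, _ in pairs):.2f}")
print(f"median replication r:     {median(r for _, r in pairs):.2f}")
print(f"mean proportional shrink: {mean(reductions):.0%}")
```

Because the average is taken over per-study ratios, it can differ from the single ratio formed from the medians: with the abstract's medians alone, (.37 - .07)/.37 is roughly 81%, not the reported 78%, which is consistent with the percentage being an average across studies rather than a ratio of medians.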
Keywords
replication, reproducibility, metascience, peer review, Registered Reports, open data, preregistered
Permanent link
https://hdl.handle.net/20.500.14178/2687
Show publication in other systems
WOS:000707042400001
SCOPUS:2-s2.0-85096661669
License

Licensor: the authors, represented by the first author, Charles R. Ebersole, Department of Psychology, University of Virginia. Licensee: Association for Psychological Science, 1800 Massachusetts Ave NW, Suite 402, Washington, DC 20036. Date of agreement: 13 November 2020. Year of publication: 2020.

(complete license conditions)
