To teach open and reproducible science effectively, educators need to make sense of almost a decade of literature across several fields and stay informed about ongoing (and often dynamic) debates. This is a tall ask for most educators. FORRT therefore sought to develop strategies and propose solutions that mitigate the effects of competing academic interests and help scholars implement open and reproducible science tenets in their teaching and mentoring workflows. To reduce some of the burden on educators wishing to learn or teach these concepts, FORRT has drawn on the expertise of more than 50 experts from its community to provide educators with a comprehensive yet accessible didactic framework. The FORRT clusters are the result of a comprehensive literature review, guided by educational, pedagogical, and didactic considerations, that aims to provide a pathway towards the incremental adoption of Open and Reproducible Science tenets into educators' and scholars' teaching and mentoring. The focus lies not on simply aggregating the literature into bins, but on making sense of existing works, weaving connections where none existed, and providing a sensible, learning-oriented Open and Reproducible Science taxonomy. The FORRT taxonomy is composed of 7 clusters:
We further break down each cluster into sub-categories to provide educators and scholars with useful information on the extent of open science scholarship and on how its parts are connected to one another. The idea behind specifying clusters and sub-clusters is to highlight that we have drawn deliberately fuzzy boundaries between clusters, while allowing for diversification and heterogeneity in how each educator integrates these clusters and sub-clusters with their respective field's content. To have a look at the sub-clusters within each cluster, please click on the clusters above.
See below for each cluster, its description, sub-clusters, and associated works geared for teaching. And here’s an attempt to visualize FORRT’s clusters:
Building on the clusters, we created an "Open and Reproducible Science" syllabus, which we hope can serve as a starting point for your class. It is available as a .pdf download or as an editable G-doc version; check out FORRT's syllabus page.
Attainment of foundational knowledge on the emergence and importance of reproducible and open research (i.e., grounding the motivations and theoretical underpinnings of Open and Reproducible Science), integrated with field-specific content (i.e., grounded in the history of replicability). There are 6 sub-clusters which aim to further parse the learning and teaching process:
Baker, M. (2016). 1,500 scientists lift the lid on reproducibility. Nature News, 533(7604), 452. doi: https://doi.org/10.1038/533452a
Baker, M. (2016). Is there a reproducibility crisis? Nature, 533(7604), 3-5. doi: https://doi.org/10.1038/d41586-019-00067-3
Chambers, C. (2017). The seven deadly sins of psychology: A manifesto for reforming the culture of scientific practice. Princeton University Press. http://dx.doi.org/10.1515/9781400884940
Crüwell, S., van Doorn, J., Etz, A., Makel, M. C., Moshontz, H., Niebaum, J., … Schulte-Mecklenbeck, M. (2018, November 16). 7 easy steps to open science: An annotated reading list. https://doi.org/10.31234/osf.io/cfzyx
Edwards, M. A., & Roy, S. (2016). Academic research in the 21st century: Maintaining scientific integrity in a climate of perverse incentives and hypercompetition. Environmental Engineering Science, 34(1), 51-61. DOI: https://doi.org/10.1089/ees.2016.0223
Merton, R. K. (1968). The Matthew effect in science. Science, 159(3810), 56-63. https://doi.org/10.1126/science.159.3810.56
Merton, R. K. (1988). The Matthew Effect in Science, II: Cumulative Advantage and the Symbolism of Intellectual Property. ISIS, 79(4), 606-623. https://doi.org/10.1086/354848
Munafò, M. R., et al. (2017). A manifesto for reproducible science. Nature Human Behaviour, 1, 0021. DOI: 10.1038/s41562-016-0021
Vazire, S. (2018). Implications of the Credibility Revolution for Productivity, Creativity, and Progress. Perspectives on Psychological Science, 13(4), 411-417. https://doi.org/10.1177/1745691617751884
Confirmatory analyses refer to tests of hypotheses that are formulated prior to data collection. Exploratory analyses refer to everything else.
Chambers, C. (2017). The seven deadly sins of psychology: A manifesto for reforming the culture of scientific practice. Princeton University Press. http://dx.doi.org/10.1515/9781400884940
Lin, W., & Green, D. P. (2016). Standard operating procedures: A safety net for pre-analysis plans. PS: Political Science & Politics, 49(3), 495-500.
Wagenmakers, E.-J., Wetzels, R., Borsboom, D., van der Maas, H. L. J., & Kievit, R. A. (2012). An agenda for purely confirmatory research. Perspectives on Psychological Science, 7(6), 632-638. doi:10.1177/1745691612463078
Wagenmakers, E.-J., Dutilh, G., & Sarafoglou, A. (2018). The Creativity-Verification Cycle in Psychological Science: New Methods to Combat Old Idols. Perspectives on Psychological Science, 13(4), 418-427. https://doi.org/10.1177/1745691618771357
The ways in which researchers engage in behaviors and decision-making that increase the probability of obtaining their (consciously or unconsciously) desired result; a short simulation illustrating the mechanism follows the readings below.
Gelman, A., & Loken, E. (2013). The garden of forking paths: Why multiple comparisons can be a problem, even when there is no "fishing expedition" or "p-hacking" and the research hypothesis was posited ahead of time. Unpublished manuscript. http://www.stat.columbia.edu/~gelman/research/unpublished/p_hacking.pdf
John, L. K., Loewenstein, G., & Prelec, D. (2012). Measuring the prevalence of questionable research practices with incentives for truth telling. Psychological Science, 23(5), 524-532. https://doi.org/10.1177/0956797611430953
Simmons, J. P., Nelson, L. D., & Simonsohn, U. (2011). False-Positive Psychology: Undisclosed Flexibility in Data Collection and Analysis Allows Presenting Anything as Significant. Psychological Science, 22(11), 1359-1366. https://doi.org/10.1177/0956797611417632
Smaldino, P. E., & McElreath, R. (2016). The natural selection of bad science. Royal Society Open Science, 3(9), 160384. https://doi.org/10.1098/rsos.160384
Wicherts, J. M., Veldkamp, C. L., Augusteijn, H. E., Bakker, M., Van Aert, R. C., & Van Assen, M. A. (2016). Degrees of freedom in planning, running, analyzing, and reporting psychological studies: A checklist to avoid p-hacking. Frontiers in Psychology, 7, 1832.
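To make the mechanism concrete, here is a minimal R sketch (our illustration, not code from the readings above) in the spirit of Simmons et al. (2011): even when no effect exists, measuring two correlated outcomes and reporting whichever one "works" inflates the false-positive rate well beyond the nominal 5%.

```r
# Hypothetical simulation: a researcher measures two correlated outcomes and
# claims a finding if EITHER correlates with the predictor at p < .05.
set.seed(2011)
n_sims <- 10000
n <- 30
false_positive <- replicate(n_sims, {
  x  <- rnorm(n)                              # predictor; the null is true
  y1 <- rnorm(n)                              # outcome 1
  y2 <- 0.5 * y1 + rnorm(n, sd = sqrt(0.75))  # outcome 2, correlated ~.5 with outcome 1
  p1 <- cor.test(x, y1)$p.value
  p2 <- cor.test(x, y2)$p.value
  min(p1, p2) < .05                           # "significant" if either test works
})
mean(false_positive)  # roughly .08-.09 rather than the nominal .05
```

Each additional flexible choice (optional stopping, extra covariates, subgroup analyses) compounds the inflation, which is the motivation for the checklist in Wicherts et al. (2016).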
Published checklists and other resources that can be used to shift behavior more toward improved practices.
Crüwell, S., van Doorn, J., Etz, A., Makel, M. C., Moshontz, H., Niebaum, J., … Schulte-Mecklenbeck, M. (2018, November 16). 7 easy steps to open science: An annotated reading list. https://doi.org/10.31234/osf.io/cfzyx
Lindsay, D. S. (2020). Seven steps toward transparency and replicability in psychological science. Canadian Psychology/Psychologie canadienne.
Ioannidis, J. P., Munafò, M. R., Fusar-Poli, P., Nosek, B. A., & David, S. P. (2014). Publication and other reporting biases in cognitive sciences: Detection, prevalence, and prevention. Trends in Cognitive Sciences, 18(5), 235-241.
Klein, R. A., Vianello, M., Hasselman, F., Adams, B. G., Adams, R. B., Alper, S., … Nosek, B. A. (2018). Many Labs 2: Investigating Variation in Replicability Across Samples and Settings. Advances in Methods and Practices in Psychological Science, 1(4), 443-490. https://doi.org/10.1177/2515245918810225
Munafò, M. R., et al. (2017). A manifesto for reproducible science. Nature Human Behaviour, 1, 0021. DOI: 10.1038/s41562-016-0021
Peng, R. (2015). The reproducibility crisis in science: A statistical counterattack. Significance, 12(3). https://doi.org/10.1111/j.1740-9713.2015.00827.x
Bahlai, C., Bartlett, L. J., Burgio, K. R., Fournier, A., Keiser, C. N., Poisot, T., & Whitney, K. S. (2019). Open science isn't always open to all scientists. American Scientist, 107(2), 78-82. https://doi.org/10.1511/2019.107.2.78
Chen, X., Dallmeier-Tiessen, S., Dasler, R., Feger, S., Fokianos, P., Gonzalez, J. B., … & Rodriguez, D. R. (2019). Open is not enough. Nature Physics, 15(2), 113-119. https://doi.org/10.1038/s41567-018-0342-2
Drummond, C. (2018). Reproducible research: A minority opinion. Journal of Experimental & Theoretical Artificial Intelligence, 30(1), 1-11. https://doi.org/10.1080/0952813X.2017.1413140
Fanelli, D. (2018). Opinion: Is science really facing a reproducibility crisis, and do we need it to? Proceedings of the National Academy of Sciences, 115(11), 2628-2631. https://doi.org/10.1073/pnas.1708272114
Fanelli, D., & Ioannidis, J. P. (2013). US studies may overestimate effect sizes in softer research. Proceedings of the National Academy of Sciences, 110(37), 15031-15036. https://doi.org/10.1073/pnas.1302997110
Fell, M. J. (2019). The economic impacts of open science: A rapid evidence assessment. Publications, 7(3), 46. https://doi.org/10.3390/publications7030046
Pashler, H., & Harris, C. R. (2012). Is the replicability crisis overblown? Three arguments examined. Perspectives on Psychological Science, 7, 531-536. https://doi.org/10.1177/1745691612463401
Brabeck, M. M. (2021). Open science and feminist ethics: Promises and challenges of open access. Psychology of Women Quarterly, 45(4), 457-474. https://doi.org/10.1177/03616843211030926
Bol, T., de Vaan, M., & van de Rijt, A. (2018). The Matthew effect in science funding. Proceedings of the National Academy of Sciences, 115(19), 4887-4890. https://doi.org/10.1073/pnas.1719557115
Chopik, W. J., Bremner, R. H., Defever, A. M., & Keller, V. N. (2018). How (and whether) to teach undergraduates about the replication crisis in psychological science. Teaching of Psychology, 45(2), 158-163. https://doi.org/10.1177/0098628318762900
Edwards, M. A., & Roy, S. (2016). Academic research in the 21st century: Maintaining scientific integrity in a climate of perverse incentives and hypercompetition. Environmental Engineering Science, 34(1), 51-61. DOI: https://doi.org/10.1089/ees.2016.0223
Fell, M. J. (2019). The economic impacts of open science: A rapid evidence assessment. Publications, 7(3), 46. https://doi.org/10.3390/publications7030046
Jones, N. L. (2007). A code of ethics for the life sciences. Science and Engineering Ethics, 13, 25-43. DOI: https://doi.org/10.1007/s11948-006-0007-x
Attainment of a grounding in fundamental statistics and measurement and their implications, encompassing conceptual knowledge, application, interpretation, and communication of statistical analyses. There are 5 sub-clusters which aim to further parse the learning and teaching process:
Banerjee, A., Chitnis, U. B., Jadhav, S. L., Bhawalkar, J. S., & Chaudhury, S. (2009). Hypothesis testing, type I and type II errors. Industrial Psychiatry Journal, 18(2), 127-131.
Gelman, A., & Carlin, J. (2014). Beyond power calculations: Assessing Type S (sign) and Type M (magnitude) errors. Perspectives on Psychological Science, 9(6), 641-651. doi: 10.1177/1745691614551642
Lakens, D. Improving your statistical inferences. Online course. https://www.coursera.org/learn/statistical-inferences
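To make Gelman and Carlin's (2014) Type S (sign) and Type M (magnitude) errors concrete, here is a minimal R simulation (our sketch, not the design-analysis code from their paper): with a small true effect and a small sample, the estimates that happen to reach significance necessarily exaggerate the effect, and occasionally even have the wrong sign.

```r
# Hypothetical design: true standardized effect of 0.2, n = 20 per group.
set.seed(2014)
true_effect <- 0.2
n <- 20
sims <- replicate(50000, {
  g1 <- rnorm(n, mean = true_effect)
  g2 <- rnorm(n, mean = 0)
  c(est = mean(g1) - mean(g2), p = t.test(g1, g2)$p.value)
})
sig <- sims["p", ] < .05
mean(sig)                    # power: only ~9% of such studies "work"
mean(abs(sims["est", sig]))  # Type M: significant estimates average ~0.75, not 0.2
mean(sims["est", sig] < 0)   # Type S: ~4-5% of significant results have the wrong sign
```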
Cumming, G. (2014). The new statistics: Why and how. Psychological Science, 25(1), 7-29. https://doi.org/10.1177/0956797613504966
Etz, A., Gronau, Q. F., Dablander, F., et al. (2018). How to become a Bayesian in eight easy steps: An annotated reading list. Psychonomic Bulletin & Review, 25, 219-234. https://doi.org/10.3758/s13423-017-1317-5
Greenland, S., Senn, S. J., Rothman, K. J., Carlin, J. B., Poole, C., Goodman, S. N., & Altman, D. G. (2016). Statistical tests, p values, confidence intervals, and power: A guide to misinterpretations. European Journal of Epidemiology, 31(4), 337-350. http://doi.org/10.1007/s10654-016-0149-3
Nuzzo, R. (2014). Statistical errors: P values, the "gold standard" of statistical validity, are not as reliable as many scientists assume. Nature, 506(7487), 150-152. doi:10.1038/506150a
Wagenmakers, E.-J., Dutilh, G., & Sarafoglou, A. (2018). The Creativity-Verification Cycle in Psychological Science: New Methods to Combat Old Idols. Perspectives on Psychological Science, 13(4), 418-427. https://doi.org/10.1177/1745691618771357
Brysbaert, M., & Stevens, M. (2018). Power analysis and effect size in mixed effects models: A tutorial. Journal of Cognition, 1(1), 9, 1-20. DOI: https://doi.org/10.5334/joc.10
Button, K. S., Ioannidis, J. P., Mokrysz, C., Nosek, B. A., Flint, J., Robinson, E. S., & Munafò, M. R. (2013). Power failure: why small sample size undermines the reliability of neuroscience. Nature Reviews Neuroscience, 14(5), 365-376. https://doi.org/10.1038/nrn3475
Greenland, S., Senn, S. J., Rothman, K. J., Carlin, J. B., Poole, C., Goodman, S. N., & Altman, D. G. (2016). Statistical tests, p values, confidence intervals, and power: A guide to misinterpretations. European Journal of Epidemiology, 31(4), 337-350. http://doi.org/10.1007/s10654-016-0149-3
Lakens, D. (2013). Calculating and reporting effect sizes to facilitate cumulative science: a practical primer for t-tests and ANOVAs. Frontiers in Psychology, 4, 863. 10.3389/fpsyg.2013.00863
Pek, J., & Flora, D. B. (2018). Reporting effect sizes in original psychological research: A discussion and tutorial. Psychological Methods, 23(2), 208-225. http://doi.org/10.1037/met0000126
Perugini, M., Gallucci, M., & Costantini, G. (2014). Safeguard power as a protection against imprecise power estimates. Perspectives on Psychological Science, 9, 319-332.
Gervais et al. (2015). A powerful nudge? Presenting calculable consequences of underpowered research shifts incentives towards adequately powered designs. Social Psychological and Personality Science, 6, 847-854. https://doi.org/10.1177/1948550615584199
Perugini, M., Gallucci, M., & Costantini, G. (2014). Safeguard power as a protection against imprecise power estimates. Perspectives on Psychological Science, 9, 319-332.
Wicherts, J. M., Veldkamp, C. L., Augusteijn, H. E., Bakker, M., Van Aert, R., & Van Assen, M. A. (2016). Degrees of freedom in planning, running, analyzing, and reporting psychological studies: A checklist to avoid p-hacking. Frontiers in Psychology, 7, 1832. doi: 10.3389/fpsyg.2016.01832
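The arithmetic behind these power readings is easy to demonstrate with base R's power.t.test(); the minimal sketch below (ours, not taken from the readings) echoes the warnings in Button et al. (2013) about small samples.

```r
# A priori power analysis: per-group sample size needed to detect a
# standardized mean difference of d = 0.4 with 80% power (two-sided test).
power.t.test(delta = 0.4, sd = 1, sig.level = .05, power = .80)
# -> n is roughly 99 per group

# The same function run in reverse exposes an underpowered design:
power.t.test(n = 20, delta = 0.4, sd = 1, sig.level = .05)
# -> power is roughly 0.23: most studies this size will "fail" even though
#    the effect is real, and the significant ones will overestimate it
```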
Flake, J. K., & Fried, E. I. (2019, January 17). Measurement schmeasurement: Questionable measurement practices and how to avoid them. https://doi.org/10.31234/osf.io/hs7wm
Flake, J. K., Pek, J., & Hehman, E. (2017). Construct validation in social and personality research: Current practice and recommendations. Social Psychological and Personality Science, 8(4), 370â378. https://doi.org/10.1177/1948550617693063
Hussey, I., & Hughes, S. (2018, November 19). Hidden invalidity among fifteen commonly used measures in social and personality psychology. https://doi.org/10.31234/osf.io/7rbfp
Rodebaugh, T. L., Scullin, R. B., Langer, J. K., Dixon, D. J., Huppert, J. D., Bernstein, A., . . . Lenze, E. J. (2016). Unreliability as a threat to understanding psychopathology: The cautionary tale of attentional bias. Journal of Abnormal Psychology, 125(6), 840-851. http://dx.doi.org/10.1037/abn0000184
Attainment of the how-to basics of reproducible reports and analyses. It requires students to move towards transparent and scripted analysis practices. There are 6 sub-clusters which aim to further parse the learning and teaching process:
Automating data analysis to make the process easier
Gandrud, C. (2016). Reproducible Research with R and RStudio. New York, NY: CRC Press.
Wilson G, Bryan J, Cranston K, Kitzes J, Nederbragt L, et al. (2017) Good enough practices in scientific computing. PLOS Computational Biology 13(6): e1005510. https://doi.org/10.1371/journal.pcbi.1005510
Monash’s Data Fluency Reproducible Research in R (RRR)
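As a minimal sketch of what automating an analysis can look like in practice (the file names here are hypothetical, not taken from the resources above), a single entry-point script can re-run the entire pipeline, so the report is always regenerated from the raw data:

```r
# run_all.R -- hypothetical one-command entry point for a project
# (assumes the rmarkdown package is installed)
source("scripts/01_clean_data.R")   # data cleaning and restructuring
source("scripts/02_fit_models.R")   # statistical analyses
rmarkdown::render("report.Rmd")     # knit the write-up from the saved results
```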
Writing analyses in a programming language rather than performing them with a point-and-click menu; a minimal scripted example follows the resources below.
Processing and restructuring data so that they are more useful for analysis.
Nick Fox’s Writing Reproducible Scientific Papers in R
PsyTeachR's Data Skills for Reproducible Science
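The contrast with point-and-click workflows is easiest to see in a small, self-contained script (a hypothetical base-R example of ours; paths and variable names are placeholders): every step from raw file to saved result is written down and can be re-run top to bottom by anyone.

```r
# analysis.R -- hypothetical scripted analysis
raw <- read.csv("data/raw_scores.csv")            # load the raw data

clean <- raw[!is.na(raw$score), ]                 # drop incomplete cases
clean$group <- factor(clean$group)                # restructure for analysis

fit <- t.test(score ~ group, data = clean)        # the focal test

write.csv(clean, "data/clean_scores.csv", row.names = FALSE)
capture.output(fit, file = "output/t_test.txt")   # results saved, not copied by hand
```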
Making sure anyone can reproduce analyses through well-commented scripts, codebooks, and the like.
Gandrud, C. (2016). Reproducible Research with R and RStudio. New York, NY: CRC Press.
Wilson, G., Bryan, J., Cranston, K., Kitzes, J., Nederbragt, L., & Teal, T. K. (2017). Good enough practices in scientific computing. PLoS computational biology, 13(6). e1005510. https://doi.org/10.1371/journal.pcbi.1005510
Learning statistics with R: A tutorial for psychology students and other beginners
Includes tools such as statcheck.io, GRIM, and SPRITE
Brown, N. J., & Heathers, J. A. (2016). The GRIM test: A simple technique detects numerous anomalies in the reporting of results in psychology. Social Psychological and Personality Science. Advance online publication. https://doi.org/10.1177/1948550616673876
Nuijten, M. B., Van Assen, M. A. L. M., Hartgerink, C. H. J., Epskamp, S., & Wicherts, J. M. (2017). The validity of the tool "statcheck" in discovering statistical reporting inconsistencies. Preprint retrieved from https://psyarxiv.com/tcxaj/
van der Zee, T., Anaya, J., & Brown, N. J. (2017). Statistical heartburn: An attempt to digest four pizza publications from the Cornell Food and Brand Lab. BMC Nutrition, 3(1), 54. DOI 10.1186/s40795-017-0167-x
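The GRIM logic is simple enough to implement in a few lines of R (our sketch of Brown and Heathers' published idea, not the authors' code): a mean of integer responses from n participants can only take values expressible as an integer sum divided by n, so many reported means can be checked for consistency.

```r
# Can a reported mean (rounded to `decimals` places) arise as an integer
# sum divided by n? Check the two nearest candidate sums.
grim_consistent <- function(reported_mean, n, decimals = 2) {
  total <- reported_mean * n
  candidates <- c(floor(total), ceiling(total))
  any(abs(round(candidates / n, decimals) - reported_mean) < 1e-9)
}

grim_consistent(5.19, 28)  # FALSE: no 28 integer responses can average 5.19
grim_consistent(5.18, 28)  # TRUE: a sum of 145 gives 145/28 = 5.1786 -> 5.18
```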
Attainment of a grounding in open (FAIR) data and materials. It requires students to learn about the FAIR principles for data (and educational materials): findability, accessibility, interoperability, and reusability; to engage with reasons to share data and the initiatives designed to increase scientific openness; and to consider the ethical implications and consequences of open (FAIR) data practices. There are 6 sub-clusters which aim to further parse the learning and teaching process:
Traditional publication models, open access models, preprints, etc.
Hardwicke, T. E., Mathur, M. B., MacDonald, K., Nilsonne, G., Banks, G. C., Kidwell, M. C., … & Lenne, R. L. (2018). Data availability, reusability, and analytic reproducibility: Evaluating the impact of a mandatory open data policy at the journal Cognition. Royal Society Open Science, 5(8), 180448. http://dx.doi.org/10.1098/rsos.180448
Klein, R. A., Vianello, M., Hasselman, F., Adams, B. G., Adams, R. B., Alper, S., … Nosek, B. A. (2018). Many Labs 2: Investigating Variation in Replicability Across Samples and Settings. Advances in Methods and Practices in Psychological Science, 1(4), 443-490. https://doi.org/10.1177/2515245918810225
Klein, R. A., Ratliff, K. A., Vianello, M., Adams, R. B., Bahník, Š., Bernstein, M. J., et al. (2014). Investigating variation in replicability: A "many labs" replication project. Social Psychology, 45, 142-152. https://doi.org/10.1027/1864-9335/a000178
Rouder, J. N. (2016). The what, why, and how of born open data. Behavior Research Methods, 48, 1062-1069. doi:10.3758/s13428-015-0630-z
Siler, K., Haustein, S., Smith, E., Larivière, V., & Alperin, J. P. (2018). Authorial and institutional stratification in open access publishing: the case of global health research. PeerJ, 6, e4269. doi:10.7717/peerj.4269
Tennant, J. P., Waldner, F., Jacques, D. C., Masuzzo, P., Collister, L. B., & Hartgerink, C. H. (2016). The academic, economic and societal impacts of Open Access: an evidence-based review. F1000Research, 5, 632. doi:10.12688/f1000research.8460.3
Colavizza, G., Hrynaszkiewicz, I., Staden, I., Whitaker, K., & McGillivray, B. (2020). The citation advantage of linking publications to research data. PloS One, 15(4), e0230416.
Klein, R. A., Vianello, M., Hasselman, F., Adams, B. G., Adams, R. B., Alper, S., … Nosek, B. A. (2018). Many Labs 2: Investigating Variation in Replicability Across Samples and Settings. Advances in Methods and Practices in Psychological Science, 1(4), 443-490. https://doi.org/10.1177/2515245918810225
Klein, R. A., Ratliff, K. A., Vianello, M., Adams, R. B., Bahník, Š., Bernstein, M. J., et al. (2014). Investigating variation in replicability: A "many labs" replication project. Social Psychology, 45, 142-152. https://doi.org/10.1027/1864-9335/a000178
Levenstein, M. C., & Lyle, J. A. (2018). Data: Sharing Is Caring. Advances in Methods and Practices in Psychological Science, 1(1), 95-103. https://doi.org/10.1177/2515245918758319
Piwowar, H.A., & Vision, T.J. (2013). Data reuse and the open data citation advantage. PeerJ, 1, e175 https://doi.org/10.7717/peerj.175
Rouder, J. N. (2016). The what, why, and how of born open data. Behavior Research Methods, 48, 1062-1069. doi:10.3758/s13428-015-0630-z
Stodden, V. C. (2011). Trust your science? Open your data and code. Amstat News, 409, 21-22.
Tennant, J. P., Waldner, F., Jacques, D. C., Masuzzo, P., Collister, L. B., & Hartgerink, C. H. (2016). The academic, economic and societal impacts of Open Access: an evidence-based review. F1000Research, 5, 632. doi:10.12688/f1000research.8460.3
Gilmore, R. O., Kennedy, J. L., & Adolph, K. E. (2018). Practical solutions for sharing data and materials from psychological research. Advances in Methods and Practices in Psychological Science, 1(1), 121-130. https://doi.org/10.1177/2515245917746500
Rouder, J. N. (2016). The what, why, and how of born open data. Behavior Research Methods, 48, 1062-1069. doi:10.3758/s13428-015-0630-z
Soderberg, C. K. (2018). Using OSF to Share Data: A Step-by-Step Guide. Advances in Methods and Practices in Psychological Science, 1(1), 115-120. https://doi.org/10.1177/2515245918757689
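In the spirit of the step-by-step guides above, here is a minimal R sketch (hypothetical variables and paths of our own) of the smallest useful sharing unit: tabular data in a plain, non-proprietary format plus a human-readable codebook describing every column.

```r
# Hypothetical processed dataset, ready to deposit (e.g., in an OSF project)
dat <- data.frame(id = 1:3, age = c(24, 31, 27), score = c(4.2, 3.8, 4.9))
write.csv(dat, "shared/scores.csv", row.names = FALSE)

# A plain-text codebook shipped alongside the data file
writeLines(c(
  "id:    anonymized participant identifier",
  "age:   age in years at time of testing",
  "score: mean rating across a 5-point scale"
), "shared/codebook.txt")
```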
Joel, S., Eastwick, P. W., & Finkel, E. J. (2018). Open sharing of data on close relationships and other sensitive social psychological topics: Challenges, tools, and future directions. Advances in Methods and Practices in Psychological Science, 1(1), 86-94. https://doi.org/10.1177/2515245917744281
Klein, R. A., Vianello, M., Hasselman, F., Adams, B. G., Adams, R. B., Alper, S., … Nosek, B. A. (2018). Many Labs 2: Investigating Variation in Replicability Across Samples and Settings. Advances in Methods and Practices in Psychological Science, 1(4), 443-490. https://doi.org/10.1177/2515245918810225
Klein, R. A., Ratliff, K. A., Vianello, M., Adams, R. B., Bahník, Š., Bernstein, M. J., et al. (2014). Investigating variation in replicability: A "many labs" replication project. Social Psychology, 45, 142-152. https://doi.org/10.1027/1864-9335/a000178
Piwowar, H.A., & Vision, T.J. (2013). Data reuse and the open data citation advantage. PeerJ, 1, e175 https://doi.org/10.7717/peerj.175
Wicherts, J. M., Borsboom, D., Kats, J., & Molenaar, D. (2006). The poor availability of psychological research data for reanalysis. American Psychologist, 61(7), 726-728. https://doi.org/10.1037/0003-066X.61.7.726
Wicherts, J. M., Veldkamp, C. L., Augusteijn, H. E., Bakker, M., Van Aert, R., & Van Assen, M. A. (2016). Degrees of freedom in planning, running, analyzing, and reporting psychological studies: A checklist to avoid p-hacking. Frontiers in Psychology, 7, 1832.
Hand, D. J. (2018). Aspects of data ethics in a changing world: Where are we now? Big Data, 6(3), 176-190. doi: https://doi.org/10.1089/big.2018.0083
O'Callaghan, E., & Douglas, H. M. (2021). #MeToo Online Disclosures: A Survivor-Informed Approach to Open Science Practices and Ethical Use of Social Media Data. Psychology of Women Quarterly, 45(4), 505-525. https://doi.org/10.1177/03616843211039175
Ross, M. W., Iguchi, M. Y., & Panicker, S. (2018). Ethical aspects of data sharing and research participant protections. American Psychologist, 73(2), 138-145. http://dx.doi.org/10.1037/amp0000240
Siler, K., Haustein, S., Smith, E., Larivière, V., & Alperin, J. P. (2018). Authorial and institutional stratification in open access publishing: the case of global health research. PeerJ, 6, e4269. doi:10.7717/peerj.4269
Walsh, C. G., Xia, W., Li, M., Denny, J. C., Harris, P. A., & Malin, B. A. (2018). Enabling open-science initiatives in clinical psychology and psychiatry without sacrificing patients' privacy: Current practices and future challenges. Advances in Methods and Practices in Psychological Science, 1(1), 104-114. https://doi.org/10.1177/2515245917749652
Houtkoop, B. L., Chambers, C., Macleod, M., Bishop, D. V. M., Nichols, T. E., & Wagenmakers, E.-J. (2018). Data sharing in psychology: A survey on barriers and preconditions. Advances in Methods and Practices in Psychological Science, 1(1), 70-85. https://doi.org/10.1177/2515245917751886
Peng, R. (2015). The reproducibility crisis in science: A statistical counterattack. Significance, 12(3). https://doi.org/10.1111/j.1740-9713.2015.00827.x
Rouder, J. N. (2016). The what, why, and how of born open data. Behavior Research Methods, 48, 1062-1069. doi:10.3758/s13428-015-0630-z
Walsh, C. G., Xia, W., Li, M., Denny, J. C., Harris, P. A., & Malin, B. A. (2018). Enabling open-science initiatives in clinical psychology and psychiatry without sacrificing patients' privacy: Current practices and future challenges. Advances in Methods and Practices in Psychological Science, 1(1), 104-114. https://doi.org/10.1177/2515245917749652
Wicherts, J. M., Borsboom, D., Kats, J., & Molenaar, D. (2006). The poor availability of psychological research data for reanalysis. American Psychologist, 61(7), 726-728. https://doi.org/10.1037/0003-066X.61.7.726
Wicherts, J. M., Veldkamp, C. L., Augusteijn, H. E., Bakker, M., Van Aert, R., & Van Assen, M. A. (2016). Degrees of freedom in planning, running, analyzing, and reporting psychological studies: A checklist to avoid p-hacking. Frontiers in Psychology, 7, 1832.
Preregistration entails laying out a complete methodology and analysis plan before a study has been undertaken. This facilitates transparency and removes several potential QRPs. When teaching, students should attain knowledge of what a preregistration entails, why it is important for removing potential QRPs, and how to address deviations from preregistered plans. There are 6 sub-clusters which aim to further parse the learning and teaching process:
Distinguishing exploratory and confirmatory analyses, transparency measures.
Dal-Ré, R., Ioannidis, J. P., Bracken, M. B., Buffler, P. A., Chan, A.-W., Franco, E. L., La Vecchia, C., & Weiderpass, E. (2014). Making prospective registration of observational research a reality. Science Translational Medicine, 6(224), 224cm1. DOI: https://doi.org/10.1126/scitranslmed.3007513
Nosek, B. A., & Lakens, D. (2014). Registered reports: A method to increase the credibility of published results. Social Psychology, 45, 137-141.
Lin, W., & Green, D. P. (2016). Standard operating procedures: A safety net for pre-analysis plans. PS: Political Science & Politics, 49(3), 495-500.
Nosek, B. A., Ebersole, C. R., DeHaven, A., & Mellor, D. (2018). The Preregistration Revolution. Proceedings of National Academy Sciences, 115(11), 2600-2606. https://doi.org/10.1073/pnas.1708274114
Wicherts, J. M., Veldkamp, C. L., Augusteijn, H. E., Bakker, M., Van Aert, R., & Van Assen, M. A. (2016). Degrees of freedom in planning, running, analyzing, and reporting psychological studies: A checklist to avoid p-hacking. Frontiers in Psychology, 7, 1832. doi: 10.3389/fpsyg.2016.01832
Nuzzo, R. (2015). How scientists fool themselves – and how they can stop. Nature, 526, 182-185.
Wagenmakers, E. J., & Dutilh, G. (2016). Seven selfish reasons for preregistration. APS Observer, 29(9).
OSC, CREP, ManyLabs, etc.
Chambers, C. D. (2013). Registered reports: A new publishing initiative at Cortex. Cortex, 49(3), 609-610. https://doi.org/10.1016/j.cortex.2012.12.016
Chambers, C. D., Feredoes, E., Muthukumaraswamy, S. D., & Etchells, P. (2014). Instead of "playing the game" it is time to change the rules: Registered Reports at AIMS Neuroscience and beyond. AIMS Neuroscience, 1(1), 4-17. DOI: 10.3934/Neuroscience2014.1.4
Chambers, C. D., Dienes, Z., McIntosh, R. D., Rotshtein, P., & Willmes, K. (2015). Registered Reports: Realigning incentives in scientific publishing. Cortex, 66, A1-2. DOI: https://doi.org/10.1016/j.cortex.2015.03.022
Haven, T. L., & Van Grootel, L. (2019). Preregistering qualitative research. Accountability in Research, 26(3), 229-244. DOI: https://doi.org/10.1080/08989621.2019.1580147
Kirtley, O. J., Lafit, G., Achterhof, R., Hiekkaranta, A. P., & Myin-Germeys, I. (2019, April 10). Making the black box transparent: A pre-registration template for studies using Experience Sampling Methods (ESM). https://doi.org/10.31234/osf.io/seyq7
Mertens, G., & Krypotos, A. (2019, February 20). Preregistration of secondary analyses. https://doi.org/10.31234/osf.io/ph4q7
Nosek, B. A., Ebersole, C. R., DeHaven, A. C., & Mellor, D. T. (2018). The preregistration revolution. Proceedings of the National Academy of Sciences, 115(11), 2600-2606. https://doi.org/10.1073/pnas.1708274114
Mertens, G., & Krypotos, A. (2019, February 20). Preregistration of secondary analyses. https://doi.org/10.31234/osf.io/ph4q7
COS: 8 Answers About Registered Reports and Research Preregistration
Haven, T. L., & Van Grootel, L. (2019). Preregistering qualitative research. Accountability in Research, 26(3), 229-244. https://doi.org/10.1080/08989621.2019.1580147
Nosek, B. A., Ebersole, C. R., DeHaven, A., & Mellor, D. (2018). The Preregistration Revolution. PNAS, 115(11), 2600-2606. https://doi.org/10.1073/pnas.1708274114
Attainment of a grounding in ‘replication research’, which takes a variety of forms, each with a different purpose and contribution. Reproducible science requires replication research. When teaching, students should understand the purpose of and need for replications in their variety of forms and be able to conduct (and join) replication projects. There are 6 sub-clusters which aim to further parse the learning and teaching process:
Fidler, F., & Wilcox, J. (2018). Reproducibility of scientific results. In E. N. Zalta (Ed.), The Stanford encyclopedia of philosophy (Winter 2018). Metaphysics Research Lab, Stanford University. https://plato.stanford.edu/archives/win2018/entries/scientific-reproducibility/
Frank, M. C., & Saxe, R. (2012). Teaching replication. Perspectives on Psychological Science, 7(6), 600-604. https://doi.org/10.1177/1745691612460686
García, F. M. (2016). Replication and the manufacture of scientific inferences: A formal approach. International Studies Perspectives, 17(4), 408-425. https://doi.org/10.1093/isp/ekv011
Van Bavel, J. J., Mende-Siedlecki, P., Brady, W. J., & Reinero, D. A. (2016). Contextual sensitivity in scientific reproducibility. Proceedings of the National Academy of Sciences, 113(23), 6454-6459. https://doi.org/10.1073/pnas.1521897113
Zwaan, R.A., Etz, A., Lucas, R.E, Donnellan, M.B. (2018). Making replication mainstream. Behavior and Brain Sciences, 41, e120. https://doi.org/10.1017/S0140525X17001972
OSC, CREP, ManyLabs, etc.
Klein, R. A., Ratliff, K. A., Vianello, M., Adams, R. B., Bahník, Š., Bernstein, M. J., et al. (2014). Investigating variation in replicability: A "many labs" replication project. Social Psychology, 45, 142-152. https://doi.org/10.1027/1864-9335/a000178
Klein, R. A., Vianello, M., Hasselman, F., Adams, B. G., Adams, R. B., Alper, S., … Nosek, B. A. (2018). Many Labs 2: Investigating Variation in Replicability Across Samples and Settings. Advances in Methods and Practices in Psychological Science, 1(4), 443-490. https://doi.org/10.1177/2515245918810225
Open Science Collaboration (2012). An open, large-scale, collaborative effort to estimate the reproducibility of psychological science. Perspectives on Psychological Science, 7, 657-660.
Open Science Collaboration. (2015). Estimating the reproducibility of psychological science. Science, 349(6251), aac4716. DOI: 10.1126/science.aac4716
Van Bavel, J. J., Mende-Siedlecki, P., Brady, W. J., & Reinero, D. A. (2016). Contextual sensitivity in scientific reproducibility. Proceedings of the National Academy of Sciences, 113(23), 6454-6459. https://doi.org/10.1073/pnas.1521897113
Direct replications use the exact same methods and materials, while conceptual replications test the same concept but with different methods, materials, or both.
Kunert, R. (2016). Internal conceptual replications do not increase independent replication success. Psychonomic bulletin & review, 23(5), 1631-1638. https://doi.org/10.3758/s13423-016-1030-9
Simons, D. J. (2014). The Value of Direct Replication. Perspectives on Psychological Science, 9(1), 76-80. https://doi.org/10.1177/1745691613514755
Van Bavel, J. J., Mende-Siedlecki, P., Brady, W. J., & Reinero, D. A. (2016). Contextual sensitivity in scientific reproducibility. Proceedings of the National Academy of Sciences, 113(23), 6454-6459. https://doi.org/10.1073/pnas.1521897113
Zwaan, R.A., Etz, A., Lucas, R.E, Donnellan, M.B. (2018). Making replication mainstream. Behavior and Brain Sciences, 41, e120. https://doi.org/10.1017/S0140525X17001972
Grahe, J. E., Brandt, M. J., Wagge, J. R., Legate, N., Wiggins, B. J., Christopherson, C. D., . . . LePine, S. (2018). Collaborative Replications and Education Project (CREP). Retrieved from https://osf.io/wfc6u/
Grahe, J. E., Reifman, A., Hermann, A. D., Walker, M., Oleson, K. C., Nario-Redmond, M., & Wiebe, R. P. (2012). Harnessing the undiscovered resource of student research projects. Perspectives on Psychological Science, 7(6), 605-607. https://doi.org/10.1177/1745691612459057
Frank, M. C., & Saxe, R. (2012). Teaching replication. Perspectives on Psychological Science, 7(6), 600-604. https://doi.org/10.1177/1745691612460686
Lenne & Mann (2016). CREP project report. https://osf.io/sdj7e/
Stanley, D. J., & Spence, J. R. (2014). Expectations for replications: Are yours realistic? Perspectives on Psychological Science, 9(3), 305-318. https://doi.org/10.1177/1745691614528518
Wagge, J. R., Brandt, M. J., Lazarevic, L. B., Legate, N., Christopherson, C., Wiggins, B., & Grahe, J. E. (2019). Publishing research with undergraduate students via replication work: The collaborative replications and education project. Frontiers in psychology, 10, 247.
Registered Reports are studies that are peer-reviewed prior to data collection, with an agreement between the journal and the author(s) that it will be published regardless of outcome as long as the preregistered methods are reasonably followed. Registered REPLICATION Reports are a special category of these that only include replications.
Simons, D. J., Holcombe, A. O., & Spellman, B. A. (2014). An Introduction to Registered Replication Reports at Perspectives on Psychological Science. Perspectives on Psychological Science, 9(5), 552-555. https://doi.org/10.1177/1745691614543974
Ongoing Replication projects https://www.psychologicalscience.org/publications/replication/ongoing-projects
Alogna, V. K., Attaya, M. K., Aucoin, P., Bahník, Š., Birch, S., Birt, A. R., … & Buswell, K. (2014). Registered replication report: Schooler and Engstler-Schooler (1990). Perspectives on Psychological Science, 9(5), 556-578.
Eerland, A., Sherrill, A. M., Magliano, J. P., Zwaan, R. A., Arnal, J. D., Aucoin, P., … & Crocker, C. (2016). Registered replication report: Hart & Albarracín (2011). Perspectives on Psychological Science, 11(1), 158-171.
Psychological Science Accelerator (PSA) Ongoing Replications
Sometimes responses to replication research can be negative. Failed replications of famous work, most notably power posing, ego depletion, stereotype threat, and facial feedback, have received a lot of attention.
Neuliep, J. W., & Crandall, R. (1990). Editorial bias against replication research. Journal of Social Behavior & Personality, 5(4), 85-90.
Neuliep, J. W., & Crandall, R. (1993). Reviewer bias against replication research. Journal of Social Behavior & Personality, 8(6), 21-29.
Attainment of a grounding in topics related to academia and academics. Students should understand how individuals, teams, institutions, and academic culture work together to promote (or hinder) openness, inclusion, diversity, equity, and transparency; gather perspectives on navigating scientific and academic life; and learn about the challenges and rewards of the academic setting, including the "hidden curriculum" of academic life.
There are 8 sub-clusters which aim to further parse the learning and teaching process:
Diversity is the presence of difference within a specific environment, e.g., racial diversity, gender diversity, socioeconomic diversity, etc.
Bahlai, C., Bartlett, L. J., Burgio, K. R., Fournier, A., Keiser, C. N., Poisot, T., & Whitney, K. S. (2019). Open science isn't always open to all scientists. American Scientist, 107(2), 78-82. https://doi.org/10.1511/2019.107.2.78
Cislak, A., Formanowicz, M., & Saguy, T. (2018). Bias against research on gender bias. Scientometrics, 115(1), 189-200. https://doi.org/10.1007/s11192-018-2667-0
Flaherty, C. (2020, August, 20). Something’s Got to Give. Inside Higher Ed. Retrieved from https://www.insidehighered.com/news/2020/08/20/womens-journal-submission-rates-continue-fall
Kim, E., & Patterson, S. (2020). The Pandemic and Gender Inequality in Academia. Available at SSRN 3666587. http://dx.doi.org/10.2139/ssrn.3666587
Larivière, V., Ni, C., Gingras, Y., Cronin, B., & Sugimoto, C. R. (2013). Bibliometrics: Global gender disparities in science. Nature News, 504(7479), 211. https://doi.org/10.1038/504211a
Myers, K. R., Tham, W. Y., Yin, Y., Cohodes, N., Thursby, J. G., Thursby, M. C., … & Wang, D. (2020). Unequal effects of the COVID-19 pandemic on scientists. Nature human behaviour, 4(9), 880-883.
Quagliata, T. (2008). Is there a positive correlation between socioeconomic status and academic achievement? Education master's thesis (p. 78). https://fisherpub.sjfc.edu/cgi/viewcontent.cgi?article=1077&context=education_ETD_masters
Roberson, M. L. (2020). On supporting early-career Black scholars. Nature Human Behaviour, 1-1. https://doi.org/10.1038/s41562-020-0926-6
APA (2010). Sexual Orientation, Gender identity & Socioeconomic Status [Blog post]. Retrieved from https://www.apa.org/pi/ses/resources/publications/lgbt
APA (2010). Disability & Socioeconomic Status [Blog post]. Retrieved from https://www.apa.org/pi/ses/resources/publications/disability
APA. (2017, July). Women & Socioeconomic Status [Blog post]. Retrieved from https://www.apa.org/pi/ses/resources/publications/women
APA (2017, July). Ethnic and Racial Minorities & Socioeconomic Status [Blog post]. Retrieved from https://www.apa.org/pi/ses/resources/publications/minorities
APA (2017, July). Education and Socioeconomic Status [Blog post]. Retrieved from https://www.apa.org/pi/ses/resources/publications/education
Equity means that everyone has access to the same opportunities; because we all have different privileges and barriers, we do not all start from the same position.
APA (2010). Sexual Orientation, Gender identity & Socioeconomic Status [Blog post]. Retrieved from https://www.apa.org/pi/ses/resources/publications/lgbt
APA (2010). Disability & Socioeconomic Status [Blog post]. Retrieved from https://www.apa.org/pi/ses/resources/publications/disability
APA (2017, July). Women & Socioeconomic Status [Blog post]. Retrieved from https://www.apa.org/pi/ses/resources/publications/women
APA (2017, July). Ethnic and Racial Minorities & Socioeconomic Status [Blog post]. Retrieved from https://www.apa.org/pi/ses/resources/publications/minorities
APA (2017, July). Education and Socioeconomic Status [Blog post]. Retrieved from https://www.apa.org/pi/ses/resources/publications/education
Bahlai, C., Bartlett, L. J., Burgio, K. R., Fournier, A., Keiser, C. N., Poisot, T., & Whitney, K. S. (2019). Open science isn't always open to all scientists. American Scientist, 107(2), 78-82. https://doi.org/10.1511/2019.107.2.78
Cislak, A., Formanowicz, M., & Saguy, T. (2018). Bias against research on gender bias. Scientometrics, 115(1), 189-200. https://doi.org/10.1007/s11192-018-2667-0
Flaherty, C. (2020, August, 20). Something’s Got to Give. Inside Higher Ed. Retrieved from https://www.insidehighered.com/news/2020/08/20/womens-journal-submission-rates-continue-fall
Kim, E., & Patterson, S. (2020). The Pandemic and Gender Inequality in Academia. Available at SSRN 3666587. http://dx.doi.org/10.2139/ssrn.3666587
Larivière, V., Ni, C., Gingras, Y., Cronin, B., & Sugimoto, C. R. (2013). Bibliometrics: Global gender disparities in science. Nature News, 504(7479), 211. https://doi.org/10.1038/504211a
Myers, K. R., Tham, W. Y., Yin, Y., Cohodes, N., Thursby, J. G., Thursby, M. C., … & Wang, D. (2020). Unequal effects of the COVID-19 pandemic on scientists. Nature human behaviour, 4(9), 880-883.
Quagliata, T. (2008). Is there a positive correlation between socioeconomic status and academic achievement?. Paper: Education masters (p. 78). https://fisherpub.sjfc.edu/cgi/viewcontent.cgi?article=1077&context=education_ETD_masters
Roberson, M. L. (2020). On supporting early-career Black scholars. Nature Human Behaviour, 1-1. https://doi.org/10.1038/s41562-020-0926-6
Inclusion means that individuals with different representations, identities, and feelings are respected, have influence, and are welcomed in a specific environment.
Bahlai, C., Bartlett, L. J., Burgio, K. R., Fournier, A., Keiser, C. N., Poisot, T., & Whitney, K. S. (2019). Open science isn't always open to all scientists. American Scientist, 107(2), 78-82. https://doi.org/10.1511/2019.107.2.78
Carli, L. L., Alawa, L., Lee, Y., Zhao, B., & Kim, E. (2016). Stereotypes about gender and science: Women ≠ scientists. Psychology of Women Quarterly, 40(2), 244-260. https://doi.org/10.1177/0361684315622645
Cislak, A., Formanowicz, M., & Saguy, T. (2018). Bias against research on gender bias. Scientometrics, 115(1), 189-200. https://doi.org/10.1007/s11192-018-2667-0
Eagly, A. H., & Miller, D. I. (2016). Scientific eminence: Where are the women? Perspectives on Psychological Science, 11(6), 899-904.
Flaherty, C. (2020, August, 20). Something’s Got to Give. Inside Higher Ed. Retrieved from https://www.insidehighered.com/news/2020/08/20/womens-journal-submission-rates-continue-fall
Henrich, J., Heine, S. & Norenzayan, A. (2010) Most people are not WEIRD. Nature 466, 29. https://doi.org/10.1038/466029a
Larivière, V., Ni, C., Gingras, Y., Cronin, B., & Sugimoto, C. R. (2013). Bibliometrics: Global gender disparities in science. Nature News, 504(7479), 211. https://doi.org/10.1038/504211a
Macoun, A., & Miller, D. (2014). Surviving (thriving) in academia: Feminist support networks and women ECRs. Journal of Gender Studies, 23(3), 287-301. https://doi.org/10.1080/09589236.2014.909718
Myers, K. R., Tham, W. Y., Yin, Y., Cohodes, N., Thursby, J. G., Thursby, M. C., … & Wang, D. (2020). Unequal effects of the COVID-19 pandemic on scientists. Nature human behaviour, 4(9), 880-883.
Risner, L. E., Morin, X. K., Erenrich, E. S., Clifford, P. S., Franke, J., Hurley, I., & Schwartz, N. B. (2020). Leveraging a collaborative consortium model of mentee/mentor training to foster career progression of underrepresented postdoctoral researchers and promote institutional diversity and inclusion. PloS one, 15(9), e0238518. https://doi.org/10.1371/journal.pone.0238518
Roberson, M. L. (2020). On supporting early-career Black scholars. Nature Human Behaviour, 1-1. https://doi.org/10.1038/s41562-020-0926-6
Skitka, L. J., Melton, Z. J., Mueller, A. B., & Wei, K. Y. (2020). The Gender Gap: Who Is (and Is Not) Included on Graduate-Level Syllabi in Social/Personality Psychology. Personality and Social Psychology Bulletin, 0146167220947326. https://doi.org/10.1177/0146167220947326
Citizen science is scientific research conducted, in whole or in part, by amateur (or nonprofessional) scientists. Citizen science is sometimes described as "public participation in scientific research", participatory monitoring, or participatory action research; its outcomes often include advancements in scientific research through improvements in the scientific community's capacity, as well as increases in the public's understanding of science.
Hart, D. D., & Silka, L. (2020). Rebuilding the ivory tower: bottom-up experiment in aligning research with societal needs. Issues Sci Technol, 36(3), 64-70. https://issues.org/aligning-research-with-societal-needs/
Bonney, R., Cooper, C. B., Dickinson, J., Kelling, S., Phillips, T., Rosenberg, K. V., & Shirk, J. (2009). Citizen science: a developing tool for expanding science knowledge and scientific literacy. BioScience, 59(11), 977-984.
Bonney, R., Shirk, J. L., Phillips, T. B., Wiggins, A., Ballard, H. L., Miller-Rushing, A. J., & Parrish, J. K. (2014). Next steps for citizen science. Science, 343(6178), 1436-1437.
Cohn, J. P. (2008). Citizen science: Can volunteers do real research?. BioScience, 58(3), 192-197.
Team science institutions coordinate a large group of scientists to solve a problem. Individual scientists are rewarded with a publication by the institution for contributing their efforts and resources. Once a group signs onto a team science project, the institution serves in a coordinating role, merging the resources of all participating scientists and focusing them on a common project.
Forscher, P. S., Wagenmakers, E. J., DeBruine, L., Coles, N., Silan, M. A., & IJzerman, H. (2020). A Manifesto for Team Science. Retrieved from https://psyarxiv.com/2mdxh
Silberzahn, R., & Uhlmann, E. L. (2015). Crowdsourced research: Many hands make tight work. Nature News, 526(7572), 189. https://doi.org/10.1038/526189a
Wagge, J. R., Brandt, M. J., Lazarevic, L. B., Legate, N., Christopherson, C., Wiggins, B., & Grahe, J. E. (2019). Publishing research with undergraduate students via replication work: The collaborative replications and education project. Frontiers in psychology, 10, 247. https://doi.org/10.3389/fpsyg.2019.00247
Tijdink, J. K., Verbeke, R., & Smulders, Y. M. (2014). Publication pressure and scientific misconduct in medical scientists. Journal of Empirical Research on Human Research Ethics, 9(5), 64-71. https://doi.org/10.1177/1556264614552421
Bateman, I., Kahneman, D., Munro, A., Starmer, C., & Sugden, R. (2005). Testing competing models of loss aversion: An adversarial collaboration. Journal of Public Economics, 89(8), 1561-1580.
Bol, T., de Vaan, M., & van de Rijt, A. (2018). The Matthew effect in science funding. Proceedings of the National Academy of Sciences, 115(19), 4887-4890.
Corker, K. S. (2017). Why a Focus on Eminence is Misguided: A Call to Return to Basic Scientific Values. https://doi.org/10.31234/osf.io/yqfrd
Diener, E. (2016). Improving departments of psychology. Perspectives on Psychological Science, 11(6), 909-912.
Ebersole, C. R., Axt, J. R., & Nosek, B. A. (2016). Scientists' reputations are based on getting it right, not being right. PLoS Biology, 14(5), e1002460.
Fanelli, D. (2012). Negative results are disappearing from most disciplines and countries. Scientometrics, 90(3), 891-904. https://doi.org/10.1007/s11192-011-0494-7
Feist, G. J. (2016). Intrinsic and extrinsic science: A dialectic of scientific fame. Perspectives on Psychological Science, 11(6), 893-898.
Ferreira, F. (2017). Fame: I'm Skeptical. https://doi.org/10.31234/osf.io/6zb4f
Flier, J. (2017). Faculty promotion must assess reproducibility. Nature, 549(7671), 133. https://doi.org/10.1038/549133a
Foss, D. J. (2016). Eminence and omniscience: Statistical and clinical prediction of merit. Perspectives on Psychological Science, 11(6), 913-916.
Gernsbacher, M. A. (2018). Rewarding research transparency. Trends in cognitive sciences, 22(11), 953-956. https://doi.org/10.1016/j.tics.2018.07.002
Hirsch, J. E. (2010). An index to quantify an individual's scientific research output that takes into account the effect of multiple coauthorship. Scientometrics, 85(3), 741-754.
Innes-Ker, Å. (2017). The Focus on Fame Distorts Science. https://doi.org/10.31234/osf.io/vyr3e
Ioannidis, J. P., & Thombs, B. D. (2019). A user's guide to inflated and manipulated impact factors. European Journal of Clinical Investigation, 49(9), e13151. https://doi.org/10.1111/eci.13151
Jamieson, K. H., McNutt, M., Kiermer, V., & Sever, R. (2019). Signaling the trustworthiness of science. Proceedings of the National Academy of Sciences, 116(39), 19231-19236. https://doi.org/10.1073/pnas.1913039116
Jamieson, K. H., McNutt, M., Kiermer, V., & Sever, R. (2020). Reply to Kornfeld and Titus: No distraction from misconduct. Proceedings of the National Academy of Sciences of the United States of America, 117(1), 42. https://doi.org/10.1073/pnas.1918001116
Kornfeld, D. S., & Titus, S. L. (2016). Stop ignoring misconduct. Nature, 537(7618), 29-30. https://doi.org/10.1038/537029a
Kornfeld, D. S., & Titus, S. L. (2020). Signaling the trustworthiness of science should not be a substitute for direct action against research misconduct. Proceedings of the National Academy of Sciences of the United States of America, 117(1), 41. https://doi.org/10.1073/pnas.1917490116
Li, W., Aste, T., Caccioli, F., & Livan, G. (2019). Early coauthorship with top scientists predicts success in academic careers. Nature communications, 10(1), 1-9.
Matosin, N., Frank, E., Engel, M., Lum, J. S., & Newell, K. A. (2014). Negativity towards negative results: a discussion of the disconnect between scientific worth and scientific culture. Disease Models & Mechanisms, 7(2), 171. https://doi.org/10.1242/dmm.015123
Morgan, A. C., Economou, D. J., Way, S. F., & Clauset, A. (2018). Prestige drives epistemic inequality in the diffusion of scientific ideas. EPJ Data Science, 7(1), 40. https://doi.org/10.1140/epjds/s13688-018-0166-4
Naudet, F., Ioannidis, J., Miedema, F., Cristea, I. A., Goodman, S. N., & Moher, D. (2018). Six principles for assessing scientists for hiring, promotion, and tenure. Impact of Social Sciences Blog. http://eprints.lse.ac.uk/90753/
Pickett, C. (2017). Let’s Look at the Big Picture: A System-Level Approach to Assessing Scholarly Merit. https://doi.org/10.31234/osf.io/tv6nb
Roediger III, H. L. (2016). Varieties of fame in psychology. Perspectives on Psychological Science, 11(6), 882-887.
Ruscio, J. (2016). Taking advantage of citation measures of scholarly impact: Hip hip h index! Perspectives on Psychological Science, 11(6), 905-908.
Shiota, M. N. (2017). "Fame" is the Problem: Conflation of Visibility With Potential for Long-Term Impact in Psychological Science. https://doi.org/10.31234/osf.io/4kwuq
Simonton, D. K. (2016). Giving credit where credit's due: Why it's so hard to do in psychological science. Perspectives on Psychological Science, 11(6), 888-892.
Tressoldi, P. E., Giofrè, D., Sella, F., & Cumming, G. (2013). High impact = high statistical standards? Not necessarily so. PloS One, 8(2), e56180.
Sternberg, R. J. (2016). "Am I famous yet?" Judging scholarly merit in psychological science: An introduction. Perspectives on Psychological Science, 11(6), 877-881.
Vazire, S. (2017). Against eminence. https://doi.org/10.31234/osf.io/djbcw
Van Dijk, D., Manor, O., & Carey, L. B. (2014). Publication metrics and success on the academic job market. Current Biology, 24(11), R516-R517.
Gomez, P., Anderson, A. R., & Baciero, A. (2017). Lessons for psychology laboratories from industrial laboratories. Research Ethics, 13(3-4), 155-160. https://doi.org/10.1177/1747016117693827
Nature. (2020). Postdocs in crisis: science cannot risk losing the next generation. Nature, 580, 160. https://doi.org/10.1038/d41586-020-02541-9
Nature. (2019). The mental health of PhD researchers demands urgent attention. Nature, 575, 257-258. https://doi.org/10.1038/d41586-019-03489-1
Nature. (2020). Seeking an 'exit plan' for leaving academia amid coronavirus worries. Nature, 583, 645-646. https://doi.org/10.1038/d41586-020-02029-6
It aims to understand the nature of gender inequality. Themes explored include discrimination, objectification, oppression, patriarchy, stereotyping, and aesthetics. It examines women’s and men’s social roles, experiences, interests, chores, and feminist politics in a variety of fields.
Eagly, A. H., Eaton, A., Rose, S. M., Riger, S., & McHugh, M. C. (2012). Feminism and psychology: Analysis of a half-century of research on women and gender. American Psychologist, 67(3), 211-230, https://doi.org/10.1037/a0027260
Matsick, J. L., Kruk, M., Oswald, F., & Palmer, L. (2021). Bridging Feminist Psychology and Open Science: Feminist Tools and Shared Values Inform Best Practices for Science Reform. Psychology of Women Quarterly, https://doi.org/10.1177/03616843211026564
Lazard, L., & McAvoy, J. (2020). Doing reflexivity in psychological research: What's the point? What's the practice? Qualitative Research in Psychology, 17(2), 159-177. https://doi.org/10.1080/14780887.2017.1400144
Macleod, C. I., Capdevila, R., Marecek, J., Braun, V., Gavey, N., & Wilkinson, S. (2021). Celebrating 30 years of Feminism & Psychology. Feminism & Psychology, 31(3), 313-325. https://doi.org/10.1177/09593535211027457
Crawford, M., & Marecek, J. (1989). Feminist theory, feminist psychology: A bibliography of epistemology, critical analysis, and applications. Psychology of Women Quarterly, 13(4), 477-491. https://doi.org/10.1111/j.1471-6402.1989.tb01015.x
Marecek, J. (2016). Invited reflection: Intersectionality theory and feminist psychology. Psychology of Women Quarterly, 40(2), 177-181. https://doi.org/10.1177/0361684316641090
OpenSexism Archives on Open Science https://opensexism.wordpress.com/tag/open-science/