2023
Krähmer, Daniel; Schächtele, Laura; Schneck, Andreas
Care to share? Experimental evidence on code sharing behavior in the social sciences (Journal Article)
In: PLOS ONE, vol. 18, no. 8, pp. e0289380, 2023.
@article{krahmer_care_2023,
title = {Care to share? Experimental evidence on code sharing behavior in the social sciences},
author = {Daniel Krähmer and Laura Schächtele and Andreas Schneck},
url = {https://dx.plos.org/10.1371/journal.pone.0289380},
doi = {10.1371/journal.pone.0289380},
year = {2023},
date = {2023-08-01},
urldate = {2023-08-16},
journal = {PLOS ONE},
volume = {18},
number = {8},
pages = {e0289380},
abstract = {Transparency and peer control are cornerstones of good scientific practice and entail the replication and reproduction of findings. The feasibility of replications, however, hinges on the premise that original researchers make their data and research code publicly available. This applies in particular to large-N observational studies, where analysis code is complex and may involve several ambiguous analytical decisions. To investigate which specific factors influence researchers’ code sharing behavior upon request, we emailed code requests to 1,206 authors who published research articles based on data from the European Social Survey between 2015 and 2020. In this preregistered multifactorial field experiment, we randomly varied three aspects of our code request’s wording in a 2×4×2 factorial design: the overall framing of our request (enhancement of social science research, response to replication crisis), the appeal why researchers should share their code (FAIR principles, academic altruism, prospect of citation, no information), and the perceived effort associated with code sharing (no code cleaning required, no information). Overall, 37.5% of successfully contacted authors supplied their analysis code. Of our experimental treatments, only framing affected researchers’ code sharing behavior, though in the opposite direction to the one we expected: scientists who received the negative wording alluding to the replication crisis were more likely to share their research code. Taken together, our results highlight that the availability of research code will hardly be enhanced by small-scale individual interventions but instead requires large-scale institutional norms.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}
Krähmer, Daniel
MFCURVE: Stata module for plotting results from multifactorial research designs (Miscellaneous)
2023.
@misc{krahmer_mfcurve_2023,
title = {MFCURVE: Stata module for plotting results from multifactorial research designs.},
author = {Daniel Krähmer},
year = {2023},
date = {2023-01-01},
publisher = {Statistical Software Components S459224, Boston College Department of Economics},
abstract = {mfcurve plots an outcome across multifactorial treatment combinations. The ado facilitates the visualization of results from factorial survey experiments, conjoint analysis, and other related research designs. mfcurve produces a plot that consists of two subgraphs: an upper panel, displaying the mean outcome per group; and a lower panel, indicating the presence/absence of each factor level. The graph mimics the aesthetics of a specification curve but displays effect variation across treatment specifications (instead of variation across model specifications).},
keywords = {},
pubstate = {published},
tppubtype = {misc}
}