Reproducibility Internships: Goals and Outcomes

Lars Vilhuber

2024-10-07

Richard Ball asked the (rhetorical) question “Is it feasible to include reproducible research methods in undergraduate training?” (Ball 2023), and answered it with “Yes we can!”

I will go beyond that, and argue “Yes we should!”

Why?

There is a lot of opportunity

  • As of September 2024, the LDI Replication Lab has verified the computational reproducibility of around 2,500 distinct AEA journal manuscripts (Vilhuber 2024)

This is being done right now …

  • Since 2014 (starting before the official AEA work), more than 200 undergraduates have been trained and have helped conduct these assessments.

… but not in the classroom (mostly)

  • The training is not part of a regular (Cornell) curriculum.

What is a replication package?

Student involvement

What do the students do?

  • Analyze data provenance as described by authors
  • Verify ability to computationally reproduce results
  • Attempt reproduction as per instructions (README); a minimal sketch follows this list
  • Inevitably debug
  • Write a shareable report
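In practice, a first pass often boils down to running each program named in the README, in order, and logging what happens, so that the log can feed the report. A minimal sketch in Python; the package layout, script names, and run order here are hypothetical:

```python
# Hypothetical sketch of a first-pass reproduction attempt.
# The directory layout and script names are assumptions, not a real package.
import subprocess
from pathlib import Path

PACKAGE_DIR = Path("replication_package")          # assumed layout
PROGRAMS = ["01_clean_data.py", "02_analysis.py"]  # order taken from the (hypothetical) README

log_lines = []
for program in PROGRAMS:
    result = subprocess.run(
        ["python", str(PACKAGE_DIR / program)],
        capture_output=True,
        text=True,
    )
    status = "OK" if result.returncode == 0 else f"FAILED (exit {result.returncode})"
    log_lines.append(f"{program}: {status}")
    if result.returncode != 0:
        # Keep the error output: it goes verbatim into the internal report.
        log_lines.append(result.stderr.strip())

Path("reproduction_log.txt").write_text("\n".join(log_lines) + "\n")
```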

How are the students trained?

  • Intensive one-day training on principles and technology
  • Loosely supervised follow-up on three practice examples
  • Then real cases.

What are students trained on?

  • Versioning: All reproducibility checks are versioned (git is crucial)
  • Running code (R, Stata, Python, Julia, Matlab, SAS, …)
  • Running code reproducibly (see the sketch after this list)
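To illustrate what a versioned, reproducible run can look like: one simple approach records the environment and a checksum of each output in a small file that is committed with git, so any later re-run can be diffed against it. A sketch, with a hypothetical output filename:

```python
# Minimal sketch of a reproducibility check: record the environment and a
# hash of an output file, so a re-run (by anyone) can be compared byte-for-byte.
# The output filename is an assumption for illustration.
import hashlib
import json
import platform
import sys
from pathlib import Path

output_file = Path("results/table1.csv")  # hypothetical output of the package

record = {
    "python": sys.version,
    "os": platform.platform(),
    "output": str(output_file),
    "sha256": hashlib.sha256(output_file.read_bytes()).hexdigest(),
}
Path("check_record.json").write_text(json.dumps(record, indent=2))
# Committing check_record.json to git versions the check itself:
# if a later re-run produces a different hash, the diff shows it immediately.
```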

Data Acumen

  • What is data provenance?
  • How to obtain data
  • Importance of knowing your rights and obligations (licenses, etc.)
  • How to convey how you obtained the data, and where (a sketch of a provenance record follows this list)
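One concrete way to convey provenance is a small machine-readable record kept alongside the data. The fields and values below are illustrative, not a prescribed schema:

```python
# Sketch of a machine-readable data provenance record; the dataset, URL,
# license, and all other values are hypothetical examples.
import json
from pathlib import Path

provenance = {
    "dataset": "County unemployment rates",          # hypothetical dataset
    "source_url": "https://example.org/data/unemployment.csv",
    "accessed": "2024-06-03",
    "license": "CC BY 4.0",                          # rights and obligations
    "access_conditions": "public, no registration",
    "citation": "Example Statistical Agency (2024)",
    "notes": "Downloaded via browser; no transformations before archiving.",
}
Path("data_provenance.json").write_text(json.dumps(provenance, indent=2))
```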

Communication

  • Debugging: There is always some, and debugging reproducibly is important (internal communication)
  • Writing reports: external communication to authors, conveying what was done, and what may have gone wrong (constructive and objective criticism)

Organization

  • Organizing their work within the Lab
  • Structuring reports and other documents

Technical skills

  • Use of a specific workflow management system
  • Use of Windows computers!
  • Other tools

Is this useful?

Students think so

2020 Sociology graduate working for a nonprofit research organization

“[I received] overwhelmingly positive feedback on my documentation method in code reviews, which is all thanks to my time with LDI”

2021 Economics graduate, currently pursuing a Ph.D. in Political Economics

“… I feel like [LDI Replicator position] was the single most important thing … to prepare myself to succeed in [predoctoral fellowship]….”

2024 Economics intern

“For every issue I did solve, there was a gratifying moment, knowing I’ve explored something new and that the authors would read and heed my documented solution. It’s always a huge confidence boost whenever I take the initiative to research and dabble with problems I might not be able to solve.”

Summer 2024 internship at AEA

Participants

  • Initiated by Cornell University (Vilhuber)
  • 5 partner institutions (Wellesley, Haverford, Bryn Mawr, Notre Dame, U Colorado at Boulder, UNC Chapel Hill*), each with a local coordinator
  • 9 students selected in total

Timeline

  • Training in April (to avoid conflicting with finals)
  • Start after finals (May 20 - June 1, depending on the institution)
  • 12 weeks of core participation

Structure

  • 12-16 weeks of participation
  • Funded by the journal for real work (real articles, real reports)
  • Partially supported by various host institutions

Student contributions

  • Part of regular team (others were continuing Cornell students)
  • 12-16 “cases” (~ 1/week)
  • A small number of revisions (same article, improved replication package)

Student learning

  • Successful debugging (with assistance when needed)
  • Structured internal reporting (how to convey what you tried and what did not work)
  • Writing reports

Student learning (technical)

  • Use of remote Windows and, in some cases, Linux servers
  • Basic use of multiple software not previously used
  • Repeated use of modern software development tools: Git, Markdown, automation (see the sketch below)
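As one example of the Git + Markdown + automation combination, a small script can stamp a Markdown report stub with the current git commit and pull in the run log. The file names here are assumptions for illustration:

```python
# Sketch of the kind of automation interns pick up: turn a run log into a
# Markdown report stub stamped with the current git commit.
# File names are assumptions for illustration; assumes git is installed.
import subprocess
from pathlib import Path

commit = subprocess.run(
    ["git", "rev-parse", "--short", "HEAD"],
    capture_output=True,
    text=True,
).stdout.strip() or "unknown"

log = Path("reproduction_log.txt")
body = log.read_text() if log.exists() else "(no log found)"

report_lines = [
    "# Reproducibility report",
    "",
    f"Checked at commit: `{commit}`",
    "",
    "## Run log",
    "",
]
# Indent log lines so Markdown renders them as a code block.
report_lines += ["    " + line for line in body.splitlines()]
Path("REPORT.md").write_text("\n".join(report_lines) + "\n")
```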

Student experiences and reproducibility services

  • Journals are organizationally distinct from universities (mostly)
  • Most curricula currently do not include the methods and tools necessary for reproducible work
  • Most research institutions do not provide reproducibility services to their researchers

Timing of publication

  • In the context of the AEA journals, students provided feedback prior to (first) publication
  • The World Bank and some research institutes provide this service to their staff / members, with public artifacts (early institute-level publication)¹
  • i4R (the Institute for Replication) is providing more in-depth feedback post-publication

A note on graduate education

  • Many graduate seminars have elements of this… yet not nearly enough!
  • This morning and tomorrow: at the ROCKWOOL Foundation / Humboldt Universität (thanks to Alexandra Spitz-Oener for putting this together)
  • Also Cornell ECON 7850

Ideal graduate curriculum

  • Day 1 reproducibility
  • Useful tools: Why you should learn (and love) the command line
  • Writing articles that combine text and code (and when not to do this)
  • High-performance computing and why you should care about it
  • Reproducibility, transparency, and data ethics: How and when to share data, and when not to²

Yes we should

  • Include training on reproducible practices in undergraduate curricula (junior / penultimate year)³
  • Include training on reproducible practices in graduate curricula (after coursework)
  • Training should include practice on pre-publication (own faculty) and post-publication (journal publications)
  • May involve structured reporting (Social Science Reproduction Platform)

Thank you.

References

Ball, Richard. 2023. “‘Yes We Can!’: A Practical Approach to Teaching Reproducibility to Undergraduates.” Harvard Data Science Review 5 (3). https://doi.org/10.1162/99608f92.9e002f7b.
Butler, Courtney R. 2023. “Publishing Replication Packages: Insights From the Federal Reserve Bank of Kansas City.” Harvard Data Science Review 5 (3). https://doi.org/10.1162/99608f92.aba61304.
Jones, Maria Ruth. Forthcoming. “Introducing Reproducible Research Standards at the World Bank.” Harvard Data Science Review 6 (4). https://hdsr.mitpress.mit.edu/.
Mendez-Carbajo, Diego, and Alejandro Dellachiesa. 2023. “Data Citations and Reproducibility in the Undergraduate Curriculum.” Harvard Data Science Review 5 (3). https://doi.org/10.1162/99608f92.c2835391.
Peer, Limor. 2024. “Why and How We Share Reproducible Research at Yale University’s Institution for Social and Policy Studies.” Harvard Data Science Review 6 (1). https://doi.org/10.1162/99608f92.dca148ba.
Vilhuber, Lars. 2024. “Report of the AEA Data Editor.” AEA Papers and Proceedings 114 (May): 878–90. https://doi.org/10.1257/pandp.114.878.
Vilhuber, Lars, Hyuk Harry Son, Meredith Welch, David N. Wasser, and Michael Darisse. 2022. “Teaching for Large-Scale Reproducibility Verification.” Journal of Statistics and Data Science Education 30 (3): 274–81. https://doi.org/10.1080/26939169.2022.2074582.

Footnotes

  1. Butler (2023), Peer (2024), Jones (forthcoming)

  2. Possibly see https://labordynamicsinstitute.github.io/tutorial-data-sharing-archiving-2021/

  3. See also Mendez-Carbajo and Dellachiesa (2023)