Metascience and Lab2

I have multi-disciplinary research interests and a strong taste for methodological questions. These interests naturally brought me to metascience, the scientific study of science itself.

As a research practitioner, I have become passionate about understanding my own trial-and-error process, so that I can be more rigorous in my own attempts and provide better guidance to my students about theirs. I have also been asking myself which methods to recommend to them and which paradigms are most robust.

I have become convinced that studying a single data point (ourselves) is necessary but not sufficient. We need a systematic approach to studying how science works and how to interpret its results.

Without theory and data, we are left with intuition alone. We can do better, because we have those tools at our disposal.

To work at scale, I joined a growing team of researchers building a metascience incubator called Lab2. Current efforts are led by Levent Neyse (WZB) and Anna Dreber Almenberg (SSE). You can learn about it here. If you wish to contribute, please reach out!


PEER REVIEW

To understand how we produce research, it is also important to examine how we evaluate research. I began preliminary work on the peer review process in 2020-2021. With several colleagues from other institutions, we surveyed over 1,400 economists about their views on the current system and how it could be improved. While there is very little low-hanging fruit, it is only by measuring researchers' current activity and their interactions with the system that we can identify ways of improving its efficiency and equity. You can find a summary of our main results in a VoxEU article, with more in-depth discussions on a dedicated website here. A preliminary manuscript is also available:

Peer Review in Economics: Where Are We? Where Can We Go?, with G. Charness, A. Dreber, D. Evans and A. Gill (2023)

Abstract: Peer review is the cornerstone of scientific publishing in economics, but data issues have inhibited past efforts to study it. To address this challenge, we combine insights from a novel survey dataset documenting the submission and review activity of over 1,400 researchers with observational evidence on how peer review in economics compares to the evaluation processes in other disciplines. In this paper, we present the main takeaways of our efforts, with an emphasis on exploring three key trade-offs that govern any attempts at reform: (i) the appropriate balance between relying on expert opinions and seeking input from a diverse pool of referees; (ii) the challenge of reducing turnaround time while maintaining rigor in peer review; and (iii) the difficulty of increasing the transparency of evaluations while preserving confidentiality. We conclude by fleshing out possible steps towards reform on these sets of issues.