Incentives for replication: A collective action problem among journals

[Image: Science magazine cover]

The December 2 issue of Science has a great special section on Data Replication and Reproducibility. This might sound like a dull topic, but it is critical for science. In the publish-or-perish era, the focus is on novel results rather than on replicating existing studies. Given the high-profile fraud cases of recent years, in which scholars fabricated their own data, we should put more emphasis on documenting and archiving data. Doing this comes at a cost, which is why scholars rarely volunteer their time for such activities, even though most of them are paid from tax revenue.

Fortunately, NSF has started requiring a data management plan, although there is no consensus on what the minimum conditions for a valid plan are. Journals may require data archiving, but many don't.

I am involved in several fields and experience different standards. Journals that publish experimental economics papers typically require full documentation of the experimental protocol as a condition of acceptance. Although this is not always stated in the guidelines, reviewers will not recommend acceptance without proper documentation.

In journals that publish results of agent-based modeling, the culture of model archiving and replication is uneven and problematic. Journals often don't require model documentation and expect reviewers to evaluate papers on the basis of pretty figures. From a journal's perspective, raising the requirements would reduce submissions of the interesting papers that boost its impact factor, and since other journals don't require documentation, none wants to be the first to do so. That is the collective action problem. As a result, many papers have been published, including in journals like Science, Nature, and PNAS, whose results could not be reproduced, or could be reproduced only after uncovering the shaky assumptions behind the models. This shows that better documentation and model archiving are essential for the development of the field. Not doing so will lead to agent-based modeling becoming a temporary fashion that is not taken seriously by the broader scientific community.

I am involved with openabm.org, which hosts a model archive; more than 100 models are now archived there. That is only a small fraction of the published models. At a small conference I was involved in, which required but did not enforce model archiving, only 10% of the models were archived.

There is also some positive news. We worked with the journal Ecology & Society to make it a requirement for authors who use an agent-based model to archive their work on openabm.org. We will have to see what kinds of problems pop up, but we plan to approach more journals about requiring proper documentation and archiving.

So, if you are involved with agent-based modeling, archive your work. If you review papers, request documentation sufficient to replicate the results. And if you serve on a journal's editorial board, contact the journal about raising its standards for the replicability of the research it publishes.
