Hi Cristian and Carsten,

Thanks for giving us the chance to revise the paper on workflow.

In response to your comments and those of the reviewer, we have revised the paper. Because we realize that the issue is really about transparency rather than experiments in particular, we did not end up mentioning experiments in this round of revisions. We could add a few mentions here and there if that would help.

About the comment that much of the material previously appeared in a non-peer-reviewed venue: When I was approached to participate in this journal, I volunteered to send this previously non-peer-reviewed article as a submission to be peer reviewed. That idea was accepted. As I worked to update the previously published version, I invited Maarten Voors to work on the paper with me, given our shared interests and to diversify the paper (i.e., to ensure that it was not merely the opinion of a single person based on one set of experiences). Maarten has a PhD in economics and works in an economics department, so we both agreed that we would bring different perspectives that would enhance the paper. As a result, the version that we sent differs a bit more from the 2011 version than I had anticipated. That said, I don't currently see us writing a whole new paper on this topic. We really appreciate that you allowed our paper to go through the peer-review process and very much hope that the version attached here is satisfactory. It is still more of a version 2.0 than a completely new set of thoughts on the topic.

We have attached the file here in PDF format and will happily follow up with the LaTeX and image sources when needed. Don't hesitate to let us know if you would like other revisions.

Best

Jake and Maarten

Regarding the comments of Reviewer 1:

We very much appreciate the thoughtful, supportive, and helpful review. We respond to each point below.

  • We have now pointed out explicitly that we have updated and changed the paper to reflect changes in our own thinking and in the technology landscape.

  • We have kept the primary lessons the same: we submitted this manuscript to RCP in order to gain more impact for the non-peer-reviewed version, and this idea was approved by the people working on the project at that time. We also feel that the primary lessons stand the test of time even as the specific methods for achieving them might change.

  • The editors highlighted to us that the issue is not primarily about experiments but rather about transparency, so we have not expanded the discussion of experiments in this version.

  • We hope that we have clarified the role of Dropbox, as well as the content and role of makefiles and the make system.

  • We also have edited the paper to emphasize that there may be many best practices as long as they are all designed with the central lessons of the paper in mind. (For example, we believe that one can do transparent scientific workflow using many different programming languages, computing systems, and processes as long as the central lessons are followed.)

  • We have tried to quash the typos and formatting problems, although we expect that your copyeditors will help continue this work.

  • We have downplayed the parenting references.

Reviewer 1

This is a helpful paper that guides readers through a set of lessons to improve the way they organize their work. If this piece appears in RCP, the authors should do more to highlight the new contributions over the very similar piece in The Political Methodologist. The primary lessons and many of the examples are identical or very similar. The paper should also be more closely tied to experimental work, rather than data analysis in general. Are any of these problems specific to experiments? Are there other issues that come up in experiments that are unique to that research environment?

The authors’ perspective on the role of Dropbox is unclear. It seems that there are a lot of potential problems here. In this way, the paper wavers from being about best practices to being about a diversity of approaches. The authors may want to take a harder line, at the expense of trying to be comprehensive. Also, if it mentions them, the paper should be more clear about the content and role of the “makefile” (p. 7) and “make” (p. 10).

The paper has many typos and mechanical problems like missing spaces, inconsistent formatting of “GitHub” and “LaTeX”, a missing figure number (“??”), inconsistent spacing around operators in code (see the tests on p. 9 vs. the modeling on pp. 2 and 3), etc. There are also some odd phrasings and interjections that I recommend removing (footnote 12, “Google those terms”, “People at UC Berkeley”, etc.). I appreciate and enjoyed the paper’s casual tone, but it goes too far in my opinion. The parenting references, for example, are frequent and detailed enough to distract the reader.