Addressing outdated software in paper reviews
5.9 years ago
igor 13k

What is the best approach to dealing with outdated software during paper reviews (or other official correspondence)?

For example, whenever anyone here asks a question involving TopHat, the first response is usually that TopHat is outdated and should not be used. If you are reviewing a paper that used TopHat, would that be an appropriate comment? Is there a way to phrase it more eloquently? Ideally, you would cite a reference, but that's not really an option in this type of situation as far as I can tell.

I don't mean to imply that papers should be rejected, but reviewers frequently raise concerns over specific words or phrases that do not really affect the results. Some would say that outdated software would be on a similar level.

As long as there is additional evidence included in the paper that supports the conclusions, I would say you can't ask them to use a different program. TopHat has not been proven wrong; it is just that newer/better methods are available, and people should be encouraged to use them if they are starting new analyses.

I would totally side with Genomax and Devon on this -- context is everything, IMO. If this is a mostly wet-lab paper where the bulk of the evidence for the story comes from diverse sets of experiments, asking them to re-do the analysis of an RNA-seq data set that probably only served for hypothesis generation is annoying and unnecessary. However, I would definitely let the researchers know that TopHat is not state of the art and that they may want to look into alternative tools for their _future_ analyses (also to increase their likelihood of identifying interesting things they can actually confirm eventually).

On the other hand, if this is a bioinformatics paper that claims to present a fantastic new tool for predicting biology from RNA-seq data (e.g., a tool that expects some DE measures as input), I would definitely ask them to repeat their benchmarking and data preparation with tools that have been identified as best of breed.

For a bioinformatics paper, no question about it.

For a wet-lab paper, I agree it shouldn't matter if the relevant part of the results is validated anyway. On the other hand, if the paper is meant to be used as a resource, the implication is that all the results are relevant (not just the parts discussed in detail).

Well, then give them the choice: either re-run their analyses and be a useful resource, or drop the claim of being a resource from the manuscript. Again, I would say this should be context-dependent and based entirely on the main message(s) that the authors want to get across.

I would ask them to repeat with HISAT2 and then recommend (to the journal) a major revision. I would write:

'I am unsure why the authors utilised TopHat when this program is no longer recommended for use by its own developers (citation 1; citation 2). The successor to TopHat is HISAT2 (citation 3), which the authors should use to repeat their analysis.'

Citation 1: https://www.the-scientist.com/?articles.view/articleNo/51260/title/Scientists-Continue-to-Use-Outdated-Methods/

Citation 2: Pachter, L (2017)

Citation 3: https://ccb.jhu.edu/software/hisat2/
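
For what it's worth, switching the aligner is usually a small change to the analysis scripts. The following is a rough, hypothetical sketch (not from this thread) of what the replacement step might look like if the alignment were scripted in Python, assuming paired-end FASTQ files and a prebuilt HISAT2 index; the index prefix, file names, and parameter values are all placeholders.

```python
# Hypothetical sketch: replacing a TopHat alignment step with HISAT2.
# Index prefix, file names, and thread count are placeholder assumptions.
import subprocess

def align_with_hisat2(index_prefix, fq1, fq2, out_sam, threads=8):
    """Run a paired-end HISAT2 alignment in place of the old TopHat call."""
    cmd = [
        "hisat2",
        "-x", index_prefix,   # prebuilt HISAT2 genome index
        "-1", fq1,            # forward reads
        "-2", fq2,            # reverse reads
        "-S", out_sam,        # SAM output file
        "-p", str(threads),   # number of alignment threads
    ]
    subprocess.run(cmd, check=True)

align_with_hisat2("grch38_index", "sample_R1.fastq.gz", "sample_R2.fastq.gz",
                  "sample.hisat2.sam")
```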

Peer_Reviewal

We may be getting into semantics, but how does one define "outdated software"? Is it outdated because someone says so (TopHat is not a good example, since its authors are encouraging people to look elsewhere), because it was written a long time ago, or because the reviewer in question does not like it for some reason? The chances of an author using a piece of software that produces incorrect results should be small (though hard to say they are zero).

Good point. I used TopHat as an example because it's a very clear one. But even in such an obvious case, it would be hard to argue that it's wrong to use and that the authors should repeat the analysis, as long as the findings they discuss are validated (as you already pointed out in an earlier comment).

If "outdated" means flawed here, then advising authors to use the latest software is fine, though current software doesn't mean bug-free either. I think as long as the experimental results support the computational predictions, it is good to go. IMO, experimental validation is the gold standard for a biologist, not the software (outdated or new). If the methods (theory or software) are wrong, the experimental validation should fail anyway.


From the reviewer's side

Before addressing how to reply, let's first think about when it's appropriate to do so.

There are two basic types of papers that use software: those using it to generate one piece of a research project, and those using it for comparisons with other methods (often of the authors' own creation). In the latter case, it is absolutely correct as a reviewer to point out inappropriate comparisons, which are the equivalent of using an inappropriate control in the wet lab.

For the former group of papers, however, I urge caution in making strong demands. I would argue that it's only appropriate to suggest (read "demand", since we all know how this works) that a wet-lab paper using an outdated tool redo its analysis if you can reasonably argue that said tool likely produced results biased enough to skew the conclusions of the paper. Generally such papers present a story with many parallel confirmatory experiments, so it's the totality of the evidence you need to go on. Just as it's unfair to ask authors to perform time-consuming experiments only vaguely related to their paper (the stereotypical reviewer 2 reply), it's unfair to ask them to reanalyze their data when you yourself think it unlikely to make any difference.

Regarding the actual reply: concretely state whether the tool itself is outdated (citations are nice, but not a must), and lay out the theoretical grounds for why the tool is inappropriate for the task at hand (e.g., a known bias in its results). If the tool is generally OK but the authors are using some ancient version, then mention the relevant issues that have been addressed in subsequent releases. Again, it's only fair criticism if the tool, or the version used, is likely to produce either problematic results or an unfair comparison that the naive reader is unlikely to be aware of.

From the author's side

It's really annoying to receive nit-picking replies from reviewers. The best practice (in my opinion) is to first estimate how much time would be involved in complying with the request. If you've used Snakemake (or equivalent) to create a reproducible workflow, then it's probably only a day or two of crunching to reanalyze things. This is often pointless, but as my father always said, "choose your battles". You're in a better position with the editor if you aren't combative about everything.
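
To make the "day or two of crunching" concrete, here is a rough, hypothetical sketch (not part of the original answer) of why a parameterized workflow makes this cheap: if the aligner is a single configuration value, rerunning every sample with a newer tool touches one line. In Snakemake the equivalent would be a single rule whose shell command is swapped. All sample names, paths, and options below are placeholder assumptions.

```python
# Hypothetical sketch: the aligner as one config entry in a scripted workflow,
# so reanalysing all samples with a newer tool means editing a single line.
import subprocess

# Command templates keyed by aligner name (placeholder paths and options).
ALIGNERS = {
    "tophat": "tophat -p 8 -o aligned/{sample}_tophat genome_index "
              "fastq/{sample}_R1.fastq.gz fastq/{sample}_R2.fastq.gz",
    "hisat2": "hisat2 -p 8 -x genome_index "
              "-1 fastq/{sample}_R1.fastq.gz -2 fastq/{sample}_R2.fastq.gz "
              "-S aligned/{sample}.sam",
}

CHOSEN = "hisat2"                                     # the only line that changes
SAMPLES = ["ctrl_1", "ctrl_2", "treat_1", "treat_2"]  # placeholder sample names

for sample in SAMPLES:
    cmd = ALIGNERS[CHOSEN].format(sample=sample)
    subprocess.run(cmd, shell=True, check=True)
```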

There are, of course, cases where it's not feasible to redo the entire analysis. In such cases you have two options:

  1. Rebut the reviewer's request. This requires at least a theoretical argument and, ideally, a published comparison. The latter is quite important if the editor isn't himself/herself an expert in the topic.
  2. Reanalyze a subset of the data and see whether it makes a difference (see the sketch after this list). If you can show that it doesn't, then so be it. If it does, well, then the reviewer was correct.
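
A rough, hypothetical sketch of option 2 (not part of the original answer): compare the quantity the paper actually relies on, e.g. per-gene log2 fold changes, between the original and the updated pipeline on the reanalyzed subset. The file and column names below are placeholder assumptions.

```python
# Hypothetical sketch: check whether the aligner choice changes the DE results
# on a reanalyzed subset. File and column names are placeholder assumptions.
import pandas as pd
from scipy.stats import spearmanr

old = pd.read_csv("de_results_tophat_subset.csv", index_col="gene_id")
new = pd.read_csv("de_results_hisat2_subset.csv", index_col="gene_id")

shared = old.index.intersection(new.index)
rho, pval = spearmanr(old.loc[shared, "log2FoldChange"],
                      new.loc[shared, "log2FoldChange"])
print(f"Spearman rho across {len(shared)} genes: {rho:.3f} (p = {pval:.2g})")

# How many genes flip significance status at FDR < 0.05 between the two pipelines?
flipped = ((old.loc[shared, "padj"] < 0.05) != (new.loc[shared, "padj"] < 0.05)).sum()
print(f"Genes changing significance status: {flipped}")
```

If the correlation is high and few genes change status, that is a compact, concrete rebuttal to include in the response to reviewers.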
This is a very thoughtful viewpoint.

I didn't mean to imply that you should demand a new analysis, but reviewers frequently raise concerns over specific words or phrases that do not really affect the results. Some would say that outdated software is a similar "offense". And even if certain findings were validated, perhaps others were missed.

I also clarified the initial question.

Ah, I read the question as coming from the viewpoint of a reviewer. I'll update my answer to address things from the author's side too.
