When misconduct occurs, how should journals and institutions work together?

When the World Conference on Research Integrity kicks off at the end of this month, one topic that will be on attendees’ minds is how journals and research institutions should collaborate when investigating the integrity of individual publications. That’s because this week, a group of stakeholders from institutions and the publishing world released draft guidelines on bioRxiv for how this relationship might work, dubbed the Cooperation And Liaison Between Universities And Editors (CLUE) guidelines. We spoke with first author Elizabeth Wager, Publications Consultant at Sideview and co-Editor-in-Chief of Research Integrity & Peer Review (as well as a member of the board of directors of our parent non-profit organization).

Retraction Watch: The Committee on Publication Ethics (COPE) has issued guidelines for how research institutions and journals should cooperate when investigating concerns over research integrity. What do the CLUE guidelines add to that discussion?

Elizabeth Wager: The COPE guidelines are a great start, but we realized they didn’t answer all the questions. For example, institutions were concerned that if journals always asked authors for an explanation before alerting the institution, this might give the researchers a chance to destroy or tamper with evidence. We also wanted to explore differences around the world, to understand why some institutions were able to share reports from investigations while others maintained these were strictly confidential.

RW: In what ways do the CLUE guidelines differ from what COPE recommends?

EW: The CLUE guidelines suggest that journals need to think about times when they should contact an institution directly, either before or at the same time as alerting the authors. This would happen only in rare cases where the journal had strong suspicions of data fabrication or falsification, but journals need to recognize that it might happen.

We also address the question of problems with research done some time ago and the need for institutions to have data archives, and for journals to keep records of peer review. Typical practice on storing data varies, but for clinical data in the US, the current recommendation is only six years.

With cloud storage, space shouldn’t be an issue, so ideally records should be kept permanently, and certainly for at least 10 years. Journals often hit problems when investigators say they can’t supply data because it has been destroyed. We feel institutions and funders should take responsibility for preventing this by ensuring proper archiving.

Another new proposal is that journals could ask authors for contact details of their university’s research integrity officer when they submit a manuscript. We don’t envisage these details being published, just kept on file in case they’re needed. This is a response to the problems journals often report of finding the right person to contact. We also think it might encourage universities to appoint such people and publicize their role, both externally and to researchers.

RW: What are your more radical proposals?

EW: I think our most radical suggestion is that institutions should develop new systems designed to assess the reliability of a publication. This wouldn’t prevent investigations into individual researchers designed to show whether they are guilty of misconduct, but would be quite separate from them. The CLUE guidelines recognize that publications may be unreliable due to honest error, or sloppy science, but the current system — focusing on narrow definitions of misconduct — doesn’t address these questions. We hope that processes for evaluating reported research could be much quicker (and cheaper) than misconduct hearings. That would also help journals alert readers to problematic publications sooner.

RW: You suggest that institutions provide journals with relevant sections of reports of misconduct investigations – what is the typical practice now?

EW: This is so varied, it’s not really possible to say what’s typical. In some parts of the world, institutions (or research integrity bodies) publish reports; Retraction Watch contains several examples, such as the investigations into Stapel (in the Netherlands) and Fujii (in Japan). Some national bodies also release reports, e.g., in Denmark and the U.S., but in other parts of Europe, such as the UK, universities often won’t release them.

Editors also say that universities sometimes share a report but then forbid the journal from citing or quoting from it, which can make writing an informative retraction notice tricky.

RW: We’ve published a few stories recently that show how long it can take for journals to act on a paper that an institution has told them is problematic, meaning it should be retracted or corrected. Do the new guidelines offer any suggestions for how to speed up that process?

EW: Yes, we hope our suggestion of investigating publications rather than people will speed things up considerably. Currently, journals may have to wait not only for an investigation to conclude (which can take several years) but also for an appeal process on top of that. And if a university can’t even say whether it IS investigating somebody, the journal may not feel it can issue an Expression of Concern. Being able to judge the validity of an article without having to decide whether a researcher committed misconduct could really speed things up.

Another thing we propose is that institutions should contact journals directly rather than relying on researchers to do this. This would apply both after an investigation and, if our new system for assessing publications works out, under that process as well.
