Making Peer Review More Transparent with Open Annotation

By heatherstaines | 13 September, 2017

Crossposted on ORCID as a part of Peer Review Week 2017

Editors, reviewers and scholars are recognizing the potential for open annotation to streamline and improve traditional forms of peer review and create a framework for new practices. Transparency, the focus of this year’s Peer Review Week, ultimately depends on design choices that communities make when agreeing how scholarship can benefit from collective review. Open annotation can dramatically improve the potential for transparency by enabling novel approaches for review that otherwise would be difficult or impossible to achieve.

Annotation can enhance all types of peer review, including single- and double-blind, open review, and post-publication peer review. By connecting observations, questions and suggestions to text selections, annotations enable a precise, fine-grained collaborative conversation on top of documents that can persist across versions and even after publication. Annotating in groups, whether private ones for closed reviews or public-facing ones for open and post-publication reviews, colleagues and peers can provide a variety of feedback using different models — even on the same article.

Hypothesis is an organization dedicated to the development and spread of open, standards-based annotation technologies and practices that enable anyone to annotate anywhere. Publishers embed Hypothesis in their platforms to support pre-publication workflows like peer review and post-publication engagement with invited experts and general readers. Scientists and researchers use Hypothesis to engage with documents and their peers, organize research, and embed related resources on top of existing texts. Educators and students in K-12 and higher education annotate with Hypothesis to embed teaching and learning directly in digital content. Journalists use Hypothesis to connect and discuss documents in investigative research and enrich coverage of other texts.

Hypothesis and ORCID have a longstanding collaboration to connect scholarly identifiers in publication workflows, documents, and annotations to establish reliable mechanisms that support trust, attribution and transparency across all scholarship. Scholars can already add their ORCID iDs to their Hypothesis profiles, and publishers and platforms that use ORCID for authentication can now automatically provision their users with annotation capabilities.

Traditional Peer Review

While manuscript submission systems have automated many parts of the traditional peer review process, the reviewer’s workflow has changed little until now. Critique is still delivered via an unwieldy long-form document that references page, paragraph, and line numbers, sending editors and authors on scavenger hunts to track down the related text. Further, as revisions proceed and the text changes, those numbers can be rendered meaningless. Inline annotation brings the critique directly over the relevant text and allows a fluid conversation to unfold with editorial guidance.
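
To make that concrete, here is a minimal sketch (in Python) of what an anchored review comment can look like, following the W3C Web Annotation selector model that Hypothesis builds on. The document URL, quoted passage, comment, and tags below are invented for illustration, not taken from any real review.

    # Illustrative sketch only: an annotation payload that anchors a review
    # comment to a quoted passage rather than to page or line numbers.
    # The URL, quote, comment, and tags are invented placeholders.
    annotation = {
        "uri": "https://example.org/preprints/12345",
        "text": "Please clarify how the control group was selected.",
        "tags": ["reviewer-2", "methods"],
        "target": [{
            "source": "https://example.org/preprints/12345",
            "selector": [{
                "type": "TextQuoteSelector",
                "exact": "participants were assigned to conditions",
                "prefix": "After screening, ",
                "suffix": " using a block design",
            }],
        }],
    }

Because the anchor is the quoted text itself, with a little surrounding context, the comment can still be located after paragraphs are added, removed, or renumbered in a revised draft.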

Open source annotation frameworks like Hypothesis allow submission systems to incorporate annotation directly into their existing web apps, even hosting the annotations locally if they prefer. APIs enable annotated reviews to flow into dashboards for editors, reviewers, and authors, respecting granular permissions that indicate who can (or should) see various types of feedback. Additional capabilities like custom tagging or filtering can be added easily. Authors and reviewers can see the annotations in context atop the documents themselves and via summary documents like decision letters. Hypothesis’ deep linking also enables reviewers and authors to connect specific passages to additional resources across the web to augment the review process and the final manuscript.
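
As a rough illustration of that API-driven flow, the sketch below uses the public Hypothesis search endpoint to pull a review group’s annotations on one manuscript into a simple listing, the kind of summary that could feed an editor dashboard or a decision letter. The API token, group ID, and manuscript URL are placeholders; Python and the requests library are simply one convenient way to call the API.

    import requests

    API = "https://api.hypothes.is/api/search"
    TOKEN = "YOUR_API_TOKEN"      # developer token for an account in the review group (placeholder)
    GROUP = "YOUR_GROUP_ID"       # private review group ID (placeholder)
    MANUSCRIPT = "https://example.org/preprints/12345"  # manuscript URL (placeholder)

    # Fetch the group's annotations on the manuscript, newest first.
    resp = requests.get(
        API,
        headers={"Authorization": f"Bearer {TOKEN}"},
        params={"uri": MANUSCRIPT, "group": GROUP, "sort": "created", "order": "desc", "limit": 200},
    )
    resp.raise_for_status()

    # Print each comment next to the passage it anchors to.
    for row in resp.json()["rows"]:
        quote = next(
            (sel.get("exact", "")
             for target in row.get("target", [])
             for sel in target.get("selector", [])
             if sel.get("type") == "TextQuoteSelector"),
            "",
        )
        tags = ", ".join(row.get("tags", []))
        print(f'{row["user"]}: "{quote}" -> {row["text"]} [{tags}]')

Filtering the same call by tag, for example a per-reviewer or “major revision” tag, is one way the custom tagging and granular views mentioned above could be layered onto an editorial dashboard.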

Post-publication Peer Review

Post-publication or crowdsourced peer review is another iteration on traditional peer review, but even here a number of challenges remain. Readers may lack context without access to data or additional resources that traditional peer reviewers might receive. All comments may be lumped into a single bucket irrespective of reviewer expertise. Reader comments that live on blogs or on Twitter often don’t connect back to the original article, so other readers don’t even know that such feedback exists. Further, post-publication reviewers might submit corrections or updates that also don’t connect effectively back to the original documents. And with content disseminated today across multiple platforms, readers may be contributing feedback on only one of the many copies live on the web.

Annotation technology can help remove these barriers. With inline annotation, authors, journal editors, or post-publication reviewers can connect additional materials through deep linking. Readers can tie their feedback to the particular parts of an article that match their expertise, making it easier for other readers to home in on those precise areas. Deep linking can also connect discussions taking place on other platforms to the publisher’s version of record, where annotations can surface corrections that are visible to readers. More importantly, relational URLs, DOIs, and PDF fingerprinting can connect reviewer comments across multiple copies of articles published in varying document formats on different hosting platforms.
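
One rough sketch of what that cross-copy connection could look like in practice: query the public Hypothesis search API for each known form of an article (the landing page, the DOI URL, and the PDF addressed by its fingerprint) and merge the results. All of the identifiers below are invented for illustration.

    import requests

    API = "https://api.hypothes.is/api/search"

    # Invented identifiers for one article that exists as several copies online:
    # the publisher landing page, the DOI resolver URL, and the PDF, which
    # Hypothesis can address by fingerprint as a urn:x-pdf: URI.
    COPIES = [
        "https://publisher.example.org/article/10.1234/example.5678",
        "https://doi.org/10.1234/example.5678",
        "urn:x-pdf:0123456789abcdef0123456789abcdef",
    ]

    seen, merged = set(), []
    for uri in COPIES:
        rows = requests.get(API, params={"uri": uri, "limit": 200}).json()["rows"]
        for row in rows:
            if row["id"] not in seen:  # the same annotation can surface for more than one copy
                seen.add(row["id"])
                merged.append(row)

    print(f"{len(merged)} unique public annotations found across {len(COPIES)} copies")

When copies share a DOI or PDF fingerprint in their metadata, feedback left on one copy can become discoverable from the others, which is what makes the single scattered comment problem tractable.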

Peer Review of Preprints, Data, and More

Outside of journals — for preprints, conference proceedings, textual data sets and other scholarly works that fall outside the traditional pre-publication review process — annotation can provide an equally powerful collaborative capability. Only a tiny percentage of the scholarly literature offers even a rudimentary commenting capability, so most scholarship online lives in a silent place, without easily discoverable discussion or critical review that could help improve the work or inform others. Where commenting does exist, it mostly fails to deliver a meaningful transport layer for peer feedback because it lacks specific anchoring in texts and lacks mechanisms to connect user identities to scholarly credentials via ORCID and other identity systems.

As preprints become part of peer review workflows, reviewer remarks need to flow with manuscripts into production and even through to publication of the versions of record. Various innovations have been tried over the years, including overlay journals like Discrete Analysis, but they suffer from the same fundamental problem: they operate at a distance from the article itself. Annotation overcomes this limitation and can enable experimentation with innovative approaches to peer feedback in a way that’s fundamentally more precise, discoverable and flexible.

Hypothesis and Peer Review

Putting a powerful toolchain into the workflow of editors, reviewers, and researchers fits the Hypothesis mission of bringing annotation to all knowledge. By equipping peer review processes with the same basic tools that researchers use in writing and updating their manuscripts, Hypothesis harnesses the power of standards-based, interoperable technologies that fulfill the promise of the web. We are excited to work with partners to explore how annotation can make peer review more transparent.

If you’re interested in seeing how Hypothesis works — whether as an editor, an author, or another reviewer — you can experiment right away, before making any technical integrations, by setting up your own private Hypothesis groups to add annotations to your journals, preprints and other documents.

Get in touch to learn more about peer review using open annotation with Hypothesis.
