Making Peer Review More Transparent with Open Annotation
Editors, reviewers and scholars are recognizing the potential for open annotation to streamline and improve traditional forms of peer review and create a framework for new review practices.
Hypothesis and Cold Spring Harbor Laboratory announce the selection of Hypothesis as the primary annotation mechanism for the bioRxiv preprint service.
MIT Press adds Hypothesis annotation to their CogNet platform to offer open, standards-based collaboration tools to researchers in the brain and cognitive sciences.
Join the conversation connecting FAIR data to digital annotation at the second annual Annotating All Knowledge Coalition face-to-face meeting, co-located in Berlin with FORCE2017.
With support from the Hypothesis Open Annotation Fund, the TextThresher team has developed software that allows researchers to enlist citizen scientists in the complex annotation of large bodies of text.
Scientific journals come and go, but the scientific record is permanent, and its annotation layer should be too. New Hypothesis support for DOIs (digital object identifiers) helps ensure a robust connection between articles and annotations.
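As a rough illustration of what DOI-anchored annotation lookup could enable, here is a minimal sketch that queries the public Hypothesis search API for annotations on an article identified by its DOI. The /api/search endpoint and its uri parameter are part of the public API; passing a "doi:"-prefixed identifier as the uri value, and the example DOI itself, are assumptions made for illustration only.

```python
# Minimal sketch: list public annotations on an article identified by DOI,
# via the Hypothesis search API. Treating "doi:<DOI>" as an accepted `uri`
# value is an assumption based on the DOI support described above.
import requests

API_SEARCH = "https://api.hypothes.is/api/search"

def annotations_for_doi(doi: str, limit: int = 20) -> list:
    """Return public annotations whose target document carries this DOI."""
    resp = requests.get(API_SEARCH, params={"uri": f"doi:{doi}", "limit": limit})
    resp.raise_for_status()
    return resp.json().get("rows", [])

if __name__ == "__main__":
    # Hypothetical DOI, used purely for illustration.
    for ann in annotations_for_doi("10.1101/123456"):
        print(ann["user"], "-", ann.get("text", "")[:80])
```

Keying the lookup to a DOI rather than a journal URL is the point of the feature described above: if the article moves to a new platform, the identifier, and therefore the annotation layer, stays attached to it.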
Take a deep dive into open annotation from 31 July to 4 August 2017 with two intensive courses at the FORCE11 Scholarly Communications Summer Institute.
Originally published 12 May 2017 on the QDR blog by Sebastian Karcher. Scholars are increasingly being called on – by journal editors, funders, and each other – to “show their […]
Originally posted at Pundit by Francesca Di Donato. The diffusion and public endorsement of data FAIRness have been rapid. The FAIR Data Principles were published in late 2014 and early 2015. […]
Hypothesis and HighWire Press announce a partnership to bring a high-quality, open annotation capability to more than 3,000 journals, books, reference works, and proceedings published on HighWire’s JCore platform.