Annotation-Powered Questionnaires
Radio buttons, checkboxes, and input boxes are the usual ways to answer survey questions. But what if the answer to a question is a selection in a document?
Seven colleges and universities are conducting joint research on annotation’s impact on student reading comprehension and writing outcomes.
In August 2018, 86 people from 58 different organizations gathered in Berkeley, CA and remotely to attend the first workshop convened by the Joint Roadmap for Open Science Tools.
The open-access journal Murmurations collaborated with the Public Knowledge Project to launch open peer review using Hypothesis annotation on the Open Journal Systems platform.
Explore all the proceedings from I Annotate 2018, the sixth annual conference for annotation technologies and practices.
On 1 June 2018, The Andrew W. Mellon Foundation approved a 2-year $2M grant, Scaling Annotation in Scholarship and the Humanities, to Hypothesis to support feature enhancements for its annotation software and activities related to the expansion of its humanities user base.
No pop ups, no mass emails: Hypothesis takes a different approach to GDPR, making real changes for privacy, accessibility, and community without the pesky notifications.
Michigan Publishing announces a collaboration with Hypothesis to integrate open annotation with Fulcrum, the new ebook publishing platform.
Just six months after reaching two million annotations, Hypothesis users have now created over three million — right as our new team members were hitting their stride.
Cambridge University Press, the Qualitative Data Repository & Hypothesis join to pioneer making qualitative research more transparent with annotation.