1,000 Students and 20,000 Annotations: First Data in Large-Scale Social Annotation Research


27 Sep 2021

This special guest post is by Dr. Remi Kalir, Associate Professor at the University of Colorado Denver School of Education and Human Development and inaugural Hypothesis scholar in residence. It is the second in a series we are publishing about the large-scale research collaboration we are participating in at Indiana University to investigate how social annotation improves reading and writing practices for undergraduate students in core English literature and composition courses for majors and non-majors. Read the first post in the series, by project lead Dr. Justin Hodgson, and subscribe to hear about additional posts as the team prepares formal work for peer-reviewed publication.

During the Spring 2021 semester, our interdisciplinary research team at Indiana University Bloomington (IU) made noteworthy progress studying how undergraduate students participate in social annotation (SA) as a form of writing, and how participation in SA activities can improve students’ writing and learning outcomes. We started this research because SA is an important feature of various English literature and composition courses that enroll both majors and non-majors at IU, including English W131, which is required for all first-year students. Our study represents a distinctive opportunity to facilitate large-scale, multi-stakeholder research, coordinated among IU Bloomington’s Department of English, IU’s eLearning Research and Practice Lab, and Hypothesis, to better understand the relationship between students’ SA and their learning.

With the project’s first full academic term complete, we now have data to answer questions such as: how students’ SA activity correlates with their course outcomes; whether student SA activity generates social networks, and how those peer-to-peer discursive networks change over time; and how course-level SA activity differs among sections that share similar characteristics. We are eager to report findings from these analyses and more in future peer-reviewed publications, and informally in this series of blog posts.

Over the past few months, members of our research team have begun to share resources and presentations that provide useful background information about the study, as well as our early insights and curiosities. As a complement to this project update, Principal Investigator Dr. Justin Hodgson, Associate Professor of English at IU, has invited discussion about the role of instructor presence and participation in students’ SA activities through his post “To Annotate or Not To Annotate (With Students).” During I Annotate 2021, Chris Andrews — a PhD candidate in Learning Sciences at IU — presented “Instructors’ Design and Use of Social Annotation Activities in Undergraduate Reading and Composition Courses,” as part of the Social Annotation Research panel. I Annotate also featured team members Sarah Fischer (a PhD student in English), Laura Rosche (a PhD candidate studying English rhetoric), and Mary Helen Truglia (a PhD candidate studying Early Modern Literature and an IU Teaching Fellow) in conversation with Dr. Hodgson during the conference’s final Educator “Office Hours” session.

Following our first semester of data collection, we can report that our study’s current sample of participants and annotation data includes:

  • 1,063 undergraduate students, 967 of whom participated in at least one SA activity during the spring 2021 semester.
  • 57 courses and/or sections that incorporated Hypothesis SA, including 51 sections of English W131 (first-year composition).
  • 50 instructors, including 12 who completed a questionnaire about their use of Hypothesis, 8 who participated in SA activities alongside students, and 6 who joined the semester-long inquiry team facilitated by Chris Andrews to discuss their instructional planning and pedagogy.
  • 26,175 Hypothesis annotations written by students and instructors, totaling 852,227 words.

Given that SA functions as a form of close reading and discussion directly “anchored” to a source text, it was important that our team also identified the texts that served as discursive contexts throughout the semester. A total of 126 unique texts were annotated across the 62 course sections, inclusive of both readings and assignments (in some instances, students voluntarily added a handful of annotations to descriptions of their course’s writing assignments). For English W131, instructors chose whether the five SA activities assigned during the semester featured texts from a shared, department-curated archive of readings (such as essays) or texts from other “outside” sources. Students and instructors in the 51 W131 sections annotated 90 unique texts, 23 of which appeared in more than five sections. Four readings were annotated in at least 10 sections of W131, including:

  • Chapter 1 from Jonathan Crary’s 24/7: Late Capitalism and the Ends of Sleep (12 sections);
  • “Monster Culture (Seven Theses)” from Jeffrey Jerome Cohen’s Monster Theory: Reading Culture (11 sections);
  • Chapter 3 from John Berger’s Ways of Seeing (11 sections); and
  • “How to Tame a Wild Tongue” from Gloria Anzaldúa’s Borderlands/La Frontera: The New Mestiza (10 sections).

As our team begins data analysis, it has been no small task to first collect, de-identify, and organize our study’s first semester of data. This summer, thanks to support from the Faculty Assistance in Data Science program, we have been assisted by Chinmayee Modak and Chinmayee Mundhe, both of whom are current IU graduate students studying data sciences. With their support, we have established the technical infrastructure necessary to: a) determine how characteristics of students’ SA activity (such as total number of annotations, replies, and word count) correlate with their course outcomes (i.e., final grades); b) identify social networks created through student participation in SA activity, and how those peer-to-peer discursive networks change over time; and c) compare course-level SA activity among sections of W131 that share similar characteristics. We are eager to report findings from these analyses in future peer-reviewed publications.
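To illustrate the shape of these analyses, the three steps above can be sketched in a few lines of pandas. The toy data, column names (`annotation_id`, `parent_id`, `student_id`, `word_count`, `section`, `final_grade`), and values below are hypothetical assumptions for illustration only, not the project’s actual schema or results:

```python
# Hypothetical sketch of the three analyses; schema and data are illustrative.
import pandas as pd

# Toy annotation records: replies reference a parent annotation.
annotations = pd.DataFrame({
    "annotation_id": [1, 2, 3, 4],
    "parent_id": [None, 1, None, 3],      # None = top-level annotation
    "student_id": ["a", "b", "c", "a"],
    "word_count": [40, 12, 55, 20],
    "section": ["W131-01"] * 4,
})
grades = pd.DataFrame({"student_id": ["a", "b", "c"],
                       "final_grade": [3.7, 3.0, 3.3]})

# (a) Correlate per-student SA activity with course outcomes.
activity = (annotations.groupby("student_id")
            .agg(n_annotations=("annotation_id", "count"),
                 total_words=("word_count", "sum"))
            .reset_index()
            .merge(grades, on="student_id"))
corr = activity["n_annotations"].corr(activity["final_grade"])

# (b) Build the peer-to-peer network: an edge points from the student
# who replied to the author of the parent annotation.
author = annotations.set_index("annotation_id")["student_id"]
replies = annotations.dropna(subset=["parent_id"])
edges = {(row["student_id"], author[int(row["parent_id"])])
         for _, row in replies.iterrows()}

# (c) Compare course-level SA activity across sections.
per_section = annotations.groupby("section")["annotation_id"].count()
```

In a real pipeline, the edge set would feed a graph library for network metrics over time, and the per-section counts would be restricted to sections sharing similar characteristics before comparison.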

Our efforts continue this fall semester with ongoing data collection and data analysis while we remain attentive to contextual circumstances surrounding the study. Whereas all courses included in our study were online during the spring 2021 semester due to the pandemic, fall 2021 courses will be offered via in-person, fully online, and hybrid formats while continuing to incorporate SA activities. We are also currently planning a series of professional development opportunities for instructors interested in refining and reflecting on their use of SA to support students’ writing and learning. As our study continues, any researchers interested in social annotation are welcome to contact our team at hodgson@indiana.edu.

This research project has been supported by a grant in Support of Research and Creative Activity from the College Arts and Humanities Institute at Indiana University Bloomington, awarded to PI Justin Hodgson, Ph.D. Data analysis on this project has been completed by Chinmayee Modak and Chinmayee Mundhe, whose work was supported by the Faculty Assistance in Data Science Program through the Office of the Vice Provost and the University Graduate School, Indiana University Bloomington.

Subscribe

Subscribe to stay informed about social annotation with news from Hypothesis.

About Hypothesis

Hypothesis is a mission-driven organization dedicated to the development and spread of open, standards-based annotation technologies and practices that enable anyone to annotate anywhere on the web. Our mission is to help people reason more effectively together through a shared, collaborative discussion layer over all knowledge. Hypothesis is based in San Francisco, CA, USA, with a worldwide team.

Hypothesis develops its open-source annotation software in collaboration with many contributors. We thank our funders, partners, and entire community for working with us to advance standards-based, interoperable annotation for all.

Contacts

Media: Nate Angell, Director of Marketing
Twitter: @hypothes_is
Web: web.hypothes.is