Teaching Students to Marshal Evidence and Evaluate Claims

By judell | 11 April 2017

Note: This post appeared in EDUCAUSE Review vol. 52, no. 2 (March/April 2017), and is reprinted here with permission.


New Horizons - The Technologies Ahead

A month before the 2016 U.S. presidential election, President Barack Obama spoke at the White House Frontiers Conference and said: “We’re going to have to rebuild, within this Wild Wild West of information flow, some sort of curating function that people agree to.”1 In the 1960s and 1970s, Walter Cronkite’s nightly newscast sign-off (“That’s the way it is”) reached tens of millions of viewers and defined a broad consensus. How can we rebuild such a consensus? Here’s one way higher education can help: teach critical thinking modes that bring scholarly best practices to the modern web.

To evaluate literary, scientific, or historical evidence, scholars and researchers must first marshal that evidence. Footnotes identify sources. Links to web pages and PDFs grant access to those sources. And now online annotations can identify and link to claims in those web pages and PDFs. Web annotation marries an ancient tradition — underlining passages in books, writing glosses in their margins — to modern publishing that’s online and social.2 My company, Hypothesis, is among the enterprises developing web annotation software used to highlight online evidence, attach notes to the highlights, and discuss the cited passages in groups or on the open web. Unlike comments at the bottom of online news stories, or in Twitter replies, or in Facebook posts, such annotations appear in overlays that are separate from — but precisely connected to — the evidence to which they refer.

The creator of one such overlay is Climate Feedback, a group of scientists who vet mainstream reporting on climate change. When a Climate Feedback scientist evaluates a climate-related claim in the Wall Street Journal, for example, readers using annotation-aware browsers see that expert gloss directly on the WSJ web page. Sites may or may not choose to invite this kind of intimate analysis. But the web’s open architecture guarantees that, one way or another, it’s possible to create and share authoritative overlays. Annotation tools and services are converging on open standards that will enable them to work with one another,3 just as different web browsers and email clients are able to work with different web and email servers. This movement toward open and interoperable web annotation sets the stage for a democratization of the scholarly arts of close reading, line-by-line analysis, and accurate citation.
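To make the idea concrete, here is a minimal sketch of what such an annotation carries under the W3C Web Annotation Data Model cited in note 3, written as a Python dictionary. The article URL, quoted passage, and comment are invented placeholders, not a real Climate Feedback annotation.

```python
# A minimal annotation shaped per the W3C Web Annotation Data Model (note 3).
# The source URL, quote, and comment below are invented placeholders.
annotation = {
    "@context": "http://www.w3.org/ns/anno.jsonld",
    "type": "Annotation",
    "body": {
        # The annotator's note about the highlighted passage.
        "type": "TextualBody",
        "value": "This figure is consistent with the peer-reviewed literature.",
        "format": "text/plain",
    },
    "target": {
        # The page being annotated.
        "source": "https://example.com/climate-story",
        # A TextQuoteSelector anchors the note to an exact passage,
        # not just to the page as a whole.
        "selector": {
            "type": "TextQuoteSelector",
            "exact": "global surface temperatures have risen by about one degree Celsius",
            "prefix": "Since the late nineteenth century, ",
            "suffix": " according to multiple datasets.",
        },
    },
}
```

Because the selector records the exact quoted text (plus a little surrounding context), the overlay can reattach the note even if the page’s layout changes.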

Here are some of the ways teachers use web annotation:

  • To prepopulate an online text with questions for students to answer
  • To mark and explain rhetorical strategies
  • To teach students to check facts, trace provenance, and evaluate sources4

In 2017 the need to teach fact-checking and source analysis looms larger than ever. Among the responses to that need, Mike Caulfield, 2017 editor of this New Horizons series of columns in EDUCAUSE Review, has launched the Digital Polarization Initiative. It’s a template for a cross-institutional course in which students learn how to evaluate claims in news stories. Here’s a sample claim: “Minnesota Affordable Care Act insurance premiums increased by up to 66% last year.” A student begins by citing the claim itself, using an annotation tool to select the statement as it appears in the story and to create an annotation that anchors to the claim. The annotation is represented by a link that points not just to the page but, more precisely, to the highlighted statement within the page. This direct link5 captures context, and because each annotation can grow a discussion thread, it enables students to work together in that context.
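For readers who want to see what this anchoring step looks like under the hood, here is a hedged sketch of creating such an annotation through the Hypothesis web API rather than the browser client. The API token, story URL, tags, and exact payload fields are assumptions to check against the current API documentation; in class, students would simply highlight the sentence and annotate it.

```python
import requests

# Sketch only: anchor an annotation to the exact wording of a claim via the
# Hypothesis API. Token, URL, tags, and field details are assumptions to
# verify against the current API documentation.
API_TOKEN = "YOUR_HYPOTHESIS_API_TOKEN"              # placeholder
STORY_URL = "https://example.com/mn-premiums-story"  # placeholder URL

payload = {
    "uri": STORY_URL,
    "text": "Claim under investigation: what do the state filings actually show?",
    "tags": ["digipo", "mn-aca-premiums"],  # hypothetical investigation tag
    "target": [{
        "source": STORY_URL,
        # Anchor to the claim itself, not just the page.
        "selector": [{
            "type": "TextQuoteSelector",
            "exact": "Minnesota Affordable Care Act insurance premiums "
                     "increased by up to 66% last year.",
        }],
    }],
}

resp = requests.post(
    "https://api.hypothes.is/api/annotations",
    json=payload,
    headers={"Authorization": f"Bearer {API_TOKEN}"},
)
resp.raise_for_status()
print("Annotation id:", resp.json()["id"])
```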

From there, the investigation moves upstream to discover and cite the sources on which the story relies and laterally to gather the background information needed to evaluate the claim and its sources. A single investigation may require students to find, organize, and present evidence found online in dozens of HTML or PDF documents. For each document, the student may need to cite several statements, ideally using annotations to point to them directly. Once all this evidence has been gathered and organized, the student draws on it to write an analysis, which may conclude that the claim is true, false, or indeterminate.

The Digital Polarization Initiative aims to inculcate both traditional and modern literacies. Footnotes and bibliographies belong to a tradition that we must preserve and adapt for the web. Evaluating the sources noted and listed, though, requires some genuinely new skills. To help students master them, we at Hypothes.is have created the DigiPo toolkit.6 It’s a Chrome extension that embodies best practices for fact-checkers and works closely with the DigiPo wiki widgets that display annotation-based evidence.

To evaluate the reputation of an unfamiliar website, for example, students are taught to use an advanced search that excludes that site’s own pages from search results. The toolkit keeps that Google query handy, just a right-click away. Another right-click option sends a selected statement to a set of fact-checking websites. Because not all sources are available online, yet another right-click option sends a book title to the Online Computer Library Center’s WorldCat service, which may report that a copy is available in the student’s local library. Fact-checking is hard work! When there’s a lot of evidence to process, these affordances help streamline the work.
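The reputation query itself is easy to reproduce by hand. Here is a rough sketch, in Python, of the kind of search URL the toolkit builds; the extension’s actual query may differ, and the domain is a placeholder.

```python
from urllib.parse import quote_plus

def reputation_query(site: str) -> str:
    """Build a Google query that asks what *other* sites say about `site`,
    excluding the site's own pages from the results."""
    q = f'"{site}" -site:{site}'
    return "https://www.google.com/search?q=" + quote_plus(q)

# Example with a placeholder domain:
print(reputation_query("unfamiliar-news.example"))
# -> https://www.google.com/search?q=%22unfamiliar-news.example%22+-site%3Aunfamiliar-news.example
```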

These helpers also build an awareness of capabilities that can make students more competent web citizens and thus better critical thinkers. “Many assume that because young people are fluent in social media they are equally savvy about what they find there,” the Stanford History Education Group wrote in a recent report. “Our work shows the opposite.”7 So we need to teach students how to debunk fake news, know when they are reading sponsored content, and separate national newspapers of record from fringe publications.

More broadly, we need to lay a foundation for evidence-based reasoning in social, professional, and civic realms. Students must know how to marshal and manage growing bodies of evidence distributed around the web. To that end, the DigiPo toolkit also provides right-click options that embody best practices for web information management.

Here’s an underappreciated best practice: if you tag a set of documents consistently, you create a collection that can be cited with a URL that queries for the tag. In the Digital Polarization Initiative projects, every investigation happens on its own wiki page. When annotations are tagged with the name of the wiki page, they appear in several collections included in the page. One collection gathers all of the evidence that supports the investigation. Another arranges a subset of the evidence on a timeline so that investigators (and readers) can reason about the history of the topic. Students could assign those tags manually, but that’s awkward and error-prone. So right-click options to tag a source page (or a selected claim) offer a list of current investigations. Selecting from the list is an easy way to add evidence to collections. It also teaches controlled naming, a form of digital literacy that, like the advanced Google queries mentioned earlier, won’t always be so helpfully supported with training wheels.
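To make the “collection as a query” idea concrete, here is a minimal sketch that fetches everything carrying an investigation’s tag from the public Hypothesis search API. The tag name is a hypothetical example, and the same query can be expressed directly as a shareable URL.

```python
import requests

# Sketch only: retrieve a tag-defined collection of evidence from the
# public Hypothesis search API. The tag is a placeholder for a DigiPo
# investigation name.
TAG = "mn-aca-premiums"

resp = requests.get(
    "https://api.hypothes.is/api/search",
    params={"tag": TAG, "limit": 50},
)
resp.raise_for_status()

for row in resp.json()["rows"]:
    # Each row is one piece of tagged evidence: the annotated page
    # plus the student's note about it.
    print(row["uri"], "-", row.get("text", "")[:60])
```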

Other best practices are emerging as web annotation matures:

  • Cite evidence using links that resolve to quotes in context
  • Work with others in annotation layers that gather and enhance dispersed web resources
  • Use annotation tools that are open, standard, and interoperable

What the Digital Polarization Initiative aims to teach, above all, is a set of strategies for evaluating claims: go upstream, read laterally, check sources, marshal evidence. If higher education can build consensus around those strategies and the digital literacies that support them, it will help us establish “some sort of curating function that people agree to.”

Notes

  1. “White House Frontiers Conference” (video), Pittsburgh, PA, October 13, 2016; “Remarks by the President in Opening Remarks and Panel Discussion at White House Frontiers Conference” (transcript), Office of the Press Secretary, The White House, October 13, 2016.
  2. For more on web annotation, see the W3C Web Annotation Working Group web page.
  3. See “Web Annotation Data Model,” W3C Proposed Recommendation, January 17, 2017.
  4. Jeremy Dean, “Back to School with Annotation: 10 Ways to Annotate with Students,” Hypothes.is blog, August 25, 2015.
  5. Bob Salera, “Huge Obamacare Premium Increases in Minnesota: Where are Rick Nolan and Angie Craig?” NRCC blog, September 1, 2016 (Hypothes.is annotated version).
  6. Jon Udell, “A Hypothesis-Powered Toolkit for Fact Checkers,” Hypothes.is blog, January 17, 2017.
  7. Stanford History Education Group, “Evaluating Information: The Cornerstone of Civic Online Reasoning,” November 22, 2016, p. 7 (Hypothes.is annotated version).

Jon Udell is Director, Integrations, for Hypothes.is.

© 2017 Jon Udell. The text of this article is licensed under the Creative Commons BY 4.0 International License.

EDUCAUSE Review 52, no. 2 (March/April 2017)
