Opinion
Assessment CTQ Collaboratory

A Practical Guide to Evidence-Based Writing Across Subject Areas

By Linda Yaron — August 30, 2016 6 min read

The newly redesigned SAT is the latest example of the need for students to skillfully communicate evidence-based ideas across content areas.

Navigating the “treacherous waters of effective evidence-based writing,” as a colleague recently put it, is far more intricate than one might initially expect. A gatekeeper skill that appears indirectly as early as 1st grade, it can play a large role in determining college admissions and success. The essay on the new SAT asks students: “As you read the passage below, consider how [the author] uses evidence, such as facts or examples, to support claims.” This is strikingly similar to the Common Core Writing Standard 1 for high schoolers, which asks students to “write arguments to support claims in an analysis of substantive topics or texts, using valid reasoning and relevant and sufficient evidence.”

As part of an evolving process, I’ve found the following 10 accessible strategies helpful in teaching students to skillfully work with evidence in responding to essay prompts. To foster a growth mindset while building confidence and engagement, I begin with paragraph writing and build outward in the quantity of writing and the complexity of texts and questions.

1. Selection. To introduce the idea of evidence, students mark the top two sentences in their essay that best capture the message of a text. They share with a partner, explaining why they chose those sentences, what they mean to them, and what message the author is trying to convey. In this seemingly simple task, students evaluate and differentiate evidence, justify why they made the selection, analyze the quotation, and create an idea based on these thoughts. For classes that might need more scaffolding, sentences can be selected as a class, or students can work as a team to select, justify, analyze, and create their ideas.

2. Validity. Students evaluate various types of evidence, examining what makes evidence valid in proving an idea. This can be incorporated into lessons where students are given pre-determined texts and need to select evidence from the texts to prove an idea. Or students can find and evaluate their own researched sources to determine the credibility of the source and author, the purpose of the text, and any existing biases. Students can sequence evidence or sources from most to least effective, providing rationales for their ideas, or they can hold up fingers to differentiate the strength of validity on a scale of level 3-definitely, level 2-somewhat, or level 1-nonexistent. They justify responses with peers or the class to defend their scores.

3. Sufficiency. In the same way, students determine what makes for sufficient evidence in their readings or in their lives. They may evaluate legislation, court cases, or pro/con issues that apply to a given subject area, and then examine how much valid evidence is needed to vote for or against something. They may also apply the need for valid evidence to their own lives, and examine questions like: How much (valid) proof do you need to hear in order to believe _________?

Ideas for Using Evidence Across Subject Areas:

Social Studies: Analyze evidence that political leaders give to justify their decisions to go to war or propose policies.

Science: Evaluate evidence needed to prove a scientific theory (or create their own theories) and evaluate the function of that evidence.

Math: Determine evidence for how they arrived at their answers and justify their responses.

Art: Justify evidence as to why they chose certain colors, styles, or techniques in their own work. Or evaluate the validity of evidence used to determine authenticity in the artwork of others.

World Language: Using evidence from linguistic research, analyze the effectiveness of U.S. language policy in determining how students learn language.

Physical Education: Evaluate evidence-based theories of the impact of physical activity on health.

4. Ideas. Creating valid ideas involves understanding and addressing the prompt. Students initially highlight key words in the prompt, paying special attention to the verbs in the prompt. They then number or list the different tasks the prompt asks of them. Students then can examine, sort, and classify their evidence to create their ideas. To do this, they develop a coding system to label the evidence about one topic in one color and another topic in another color. They may put a star next to evidence that fits one idea and a circle next to the ones that fit another. Or students write out (or type) their evidence, each on a separate index card, and then group them. They then look at what the threads are between the different pieces of evidence. Do they each express a certain concept or word? What are the connections between the different types of evidence? Does the evidence address the prompt? For a beginning thesis, students can initially reword the prompt question into a statement that includes their idea.

5. Relevance. Students can paraphrase both evidence and ideas to determine if they match. Or they can even do a matching “game” to pair ideas with relevant evidence. After writing, it’s also helpful for them to outline their ideas and evidence to clearly see if they are relevant to each other and the prompt.

6. Sequencing. Three ways I initially suggest sequencing evidence are chronologically, in order of importance, or leading and ending with students’ best ideas. I’m not as interested in the sequence they choose as I am in how they think through sequencing their ideas.

7. Analysis. Students initially work with the function of analysis: elaborating on and extending the evidence to show how and why it proves their main ideas. When a student contributes a quotation in class discussion, I initially ask, “What does/might that mean to you?” This phrasing removes the idea that there is a right answer and emboldens students to freely voice their interpretations. For variations, the class can work with one quote, each student analyzing it in different ways. They can view samples of analysis, create analysis together in groups or as a class, use sentence starters and extenders, fill in teacher-created blanks, or hold a gallery walk of student analysis, evaluating each other’s work with sticky-note responses. For stylistic flow, they can also use transitional words or create synonyms from key words of the evidence to use in their analysis.

8. Three Questions. To put it all together, students need to ask three questions with regard to their ideas, evidence, and analysis: Does it have depth? Is it relevant? Is it precise? Through examining samples and peer critiques, students can evaluate whether answers are surface level or deep, whether they directly answer the question, and whether their work is specific enough to avoid generalizing or vagueness.

9. Feedback. A good rubric is a crucial part of the feedback process. Mine is student generated, easily accessible, and aligned with standards and assessments. It essentially includes deep, relevant, and precise ideas, evidence, analysis, style, and organization. For team or peer feedback, students also use our writing work-list holistically or in component parts.

10. Revision. I’ve found that students greatly benefit from a targeted essay upgrade (sometimes to the tune of Beyoncé’s song “Upgrade U”) where they have a fixed period of time to incorporate feedback into their papers. Following a writing workshop, they may color code each writing component (ideas, evidence, analysis) to build organization and balance, or mark their writing with an “I” where they list ideas, an “E” for evidence, and an “A” for analysis. They adjust ideas that don’t directly answer the prompt, examine the validity and sufficiency of evidence, or go through each analysis sentence and add depth with “why” explanations.

A student wrote in a class reflection: “My writing skills have grown by tremendous measures... Before entering your class, the evidence and analysis aspects were lacking. Well, that is not the case anymore. I have really started to read (for) and interpret the evidence and analyze it.”

In the same way students are being asked to do what they’ve never done, teachers are too. That will take time, quality professional development, and a school community willing to work together to strengthen our collective teaching of evidence-based writing and, in turn, increase learning opportunities for our students.
