How Test Prep Works Against the Spirit of Formative Assessment
And What We Can Do About It
As a classroom assessment specialist, I understand that formative assessment is essential for good teaching and learning. Teachers closely observe students, using a wide range of strategies to figure out where they are in their individual learning journeys and what assistance they need, while also empowering students to become more reflective about their own learning.
Many teachers are working hard to get better at these strategies, and in the best cases, have supportive school leaders who carve out time for professional learning. But sometimes, under the pressure of external assessments, teachers use formative assessment processes to support test preparation. This use not only hijacks an important classroom practice, but risks derailing teachers’ well-intended work to master it.
What am I saying? Hang in there. I’d love to explain.
Recently, I was part of a group of assessment researchers who visited an elementary school in Singapore. This school felt very similar to a U.S. school, its walls decorated with science projects and information about the school’s history and values. The school leadership team was supportive of its teachers as they learned about and used formative assessment, and eager for us to hear from two of those teachers.
The Spirit vs. the Letter of Formative Assessment
The teachers presented some recent revisions they had made to a rubric to better support formative assessment practices with their students. They used their weekly professional development hour to examine the rubric’s language and revised it when they realized it was not student-friendly. They tried out the revised rubric with students to help them self-assess their work, and also used it themselves to provide feedback to the students to help them refine their self-assessment skills. It all sounded very promising.
So what was the problem? The rubric was designed to help Grade 3 students self-assess whether they had correctly followed the steps for answering multiple-choice test questions: Did they underline key words in the question stem? Did they eliminate obvious wrong answers? Did they check their work? So, while the rubric was a supporting structure to help students self-assess, it was not focused on the most important stuff: their understanding of a big idea in the discipline. Instead, it served as a procedural checklist for how to answer test questions.
Two researchers I admire, Bethan Marshall and Mary Jane Drummond, wrote an article about the letter versus the spirit of formative assessment, and I thought a lot about that article as I listened to the two teachers. In their well-intentioned efforts to help students reflect on their learning, these teachers seemed to have lost the spirit of formative assessment (reflecting on the learning itself) and retained only the letter of it (having a tool to support a fairly procedural, low-level test-taking strategy).
How Exam Culture Can Hijack Formative Assessment
I’ve been back from Singapore about a month now and I’m still thinking about the effect of external assessments on formative practices. Admittedly, my experience of the impact of exam culture in Singapore was limited. But I caught a glimpse of the same test-prep dynamics at work there that I see here in the U.S.: how easily external assessments can push aside deeper learning, rendering formative assessment a learning process in name only.
One example of test culture trickling down to classrooms is in the use of practice assessments before students take state summative assessments in the spring. We certainly don’t want students surprised by the format of a high-stakes assessment, but I wonder: How much exposure to the test format do Grade 7 students, for instance, really need? They’ve been tested annually in at least two content areas since Grade 3 and likely also take online interim assessments. I doubt that many students are unfamiliar with the test format by this stage.
You also see the trickle-down effect in the format of classroom and district-level assessments. I’ve noticed that some teachers and school and district leaders are unwilling to use assessments during the school year that diverge too much from the format of items on the state summative assessment, which can lead to an atomization of content and low-level or procedural thinking. Since test questions, particularly in mathematics, tend to focus on discrete content from a single standard, classroom teaching and learning tend to follow suit, with few opportunities to integrate learning across multiple standards.
Resisting Test Culture, Protecting Formative Assessment Practices
How can teachers and education leaders protect formative strategies from the influence of accountability test culture?
Define your vision of teaching and learning. To do this, you need a disciplinary perspective on learning. What does it mean to think and act like a mathematician, scientist, writer, or historian? These disciplinary ways of thinking rarely, if ever, require responding to multiple-choice questions. Understanding that the ultimate outcomes from school are not measured just by test results must affect how we think about the kinds of messy, authentic learning opportunities that will produce those outcomes.
Embrace the spirit of formative assessment. Those messy, authentic learning experiences lend themselves directly to formative practices. Engaging students in learning experiences that help them learn disciplinary ways of knowing and doing (or having access to high-quality curricula that do this, so that teachers don’t have to continually reinvent the wheel) requires formative assessment: to help students understand where learning is going, to help both students and teachers understand how students are progressing in that learning, and to identify what will help them continue to progress.
A group of teachers I worked with developed a habit of asking each other, “What’s formative about that?” It was a great way of reminding themselves to make sure that new classroom strategies really did align to the spirit of formative assessment rather than just the letter.
But perhaps we need to add a couple of additional questions to help us stay focused on the nature of the learning. Maybe one is something like this: “Does this learning activity help students become better mathematicians, scientists, writers, or historians?” And the other is this: “Was I able to use students’ responses to give them useful, formative feedback that helps them deepen their learning?”
Insulate the classroom as much as possible. Trends and patterns in state test results can be used to monitor and evaluate educational outcomes, contribute to program evaluation, and inform resource allocation. All important things, but they cannot inform day-to-day instruction. School and district leaders need to be vocal in reminding teachers of that. They must keep the vision of disciplinary learning in the foreground at school and avoid practices that put undue emphasis on the state tests, such as testing pep rallies or test-prep boot camps.
Make it a collective effort. While individual teachers can perhaps do these things on their own, they shouldn’t have to. Formative assessment is much more likely to be sustained when it is championed by school and district leaders who lead the work of defining a vision of teaching and learning, protect teacher collaboration time, and support meaningful classroom assessment practices.
This is hard work. And there are fewer systemic examples of truly balanced assessment systems than I care to admit. Creating them will require attention at all levels to what it means to have a learning-centered assessment system, along with discussions about visions of what teaching and learning should look like.
These visions need to be developed locally, grounded in culturally relevant pedagogy and the latest research on how students learn, independent of accountability assessments. Accountability assessments, in turn, need to be designed to support these visions of teaching and learning, rather than distorting the vision to fit the assessment.