Teaching & Assessing 21st Century Skills: A Focus on Analytical Thinking
Research and Best Practices: One in a Series on 21st Century Skills
My colleagues and I have spent the past several years writing about promising instructional and assessment practices to cultivate 21st century skills in K-12 schools. We now have a repository of literature reviews and blog posts on defining, teaching, and assessing these skills. In this post, I'll summarize what we've learned about one of those skills: analytical thinking.
Defining Analytical Thinking
We developed a working definition of analytical thinking with our partners at the International Baccalaureate (IB).
Analytical thinking is a cognitive process that consists of (1) identifying and decomposing a complex concept, problem, system, or process into parts, (2) examining those parts and their distinct characteristics or functions, and (3) communicating or articulating how the parts relate to the whole.
Analytical thinking is most often used to deeply understand how something works, whether that thing is a concept, problem, system, or process. For example, analytical thinking may be applied to discover trends in data, identify cause-and-effect relationships, make connections between factors, identify patterns or themes, and develop heuristics. As a tool for deeper understanding, analytical thinking also acts as an essential ingredient in problem-solving, critical thinking, and creative thinking.
For a deeper dive into what we’ve learned about analytical thinking, see our literature review in the Center for Assessment’s toolkit, Assessing 21st Century Skills.
Relationship With Other 21st Century Skills
Analytical thinking has been framed as an umbrella term that encompasses many cognitive skills. At the same time, many cognitive skills include some form of analytical thinking as an essential component.
For example, the World Economic Forum’s “Defining Education 4.0” taxonomy organizes creativity, critical thinking, digital skills, programming, problem-solving, and systems analysis under the general label of analytical thinking. Conversely, definitions of critical thinking and creative thinking include aspects of analytical thinking, such as examining or refining ideas, as essential sub-skills.
Because of this, these frameworks may initially appear contradictory. Does analytical thinking subsume critical and creative thinking, or vice versa? A deeper dive into these skills reveals their overlapping nature and how both claims can be true.
The relationship among these skills—for instance, whether analytical thinking should be considered an overarching skill or a subskill of critical and creative thinking—depends on the lens we’re applying to a task or problem.
When students think critically about something, analysis is a necessary subskill. When students want to create something, analysis is a necessary subskill. When students think analytically, they might be doing so to understand how something works (analytical thinking), render a judgment (critical thinking), or create something useful (creative thinking).
Instruction for Analytical Thinking
Research suggests that analytical thinking has domain-general and domain-specific characteristics. This has prompted educators to ask, “Should analytical thinking be taught as a domain-general process (explicitly teaching students the general steps involved in analytical thinking)? Or is analytical thinking better taught and learned through domain-specific instruction, where these steps are implicitly used?” Though not definitive, recent research suggests that both approaches can be effective.
Two meta-analyses (Abrami et al., 2008; Abrami et al., 2015) shed light on effective strategies for teaching analytical-thinking skills and dispositions via both domain-general and domain-specific instructional approaches. These studies identified several effective strategies associated with the development of analytical-thinking skills and achievement: (1) opportunities for dialogue, (2) exposure to authentic or situated problems and examples, and (3) mentoring, tutoring, coaching, and apprenticeship opportunities that include one-on-one interaction between an expert (the teacher) and a novice (the student).
Implications of Research for Assessment Design and Use
Below, I summarize three practical suggestions for how school leaders and teachers can assess analytical thinking. Notably, these suggestions apply when assessing any 21st century skill.
Develop (or adopt) a clear, research-based construct definition of analytical thinking. A clear definition is essential for ensuring that the assessment truly measures analytical thinking and that teachers know how to teach it. The literature review that my colleague Will Lorié and I wrote on analytical thinking provides a more detailed definition of this skill.
Use open-ended items and performance-based tasks to elicit the depth of students' analytical thinking. When designing items/tasks, consider their intended use by asking, "What claims do I want to make about a student's analytical-thinking skills from this task?" For example, a performance task may be used:
- as a summative judgment to determine how well a student applies analytical-thinking skills in solving a specific problem
- as feedback for teachers and students to reflect on and adjust their process as they attempt to solve a problem
In the first case, the assessment task is results-oriented, representing what a student knows and can do at a certain point in time. Learning is assumed to be static during the assessment process, and tools for learning are isolated and held constant. To ensure results are credible, a teacher might post helpful tips on the wall as a fixed resource, prohibit any additional hints, and require students to work alone on the task. This helps ensure that a student's skill level at the end of the performance task is identical to when she started it.
In the second case, the task is process-oriented; the assessment becomes a tool for learning. Learning is assumed to be dynamic during the assessment process, and the assessment's credibility rests on the quality and usefulness of the information it provides to improve a student's skill level at certain steps, or milestones, along the way. The goal in this case is to ensure that a student's skill level at the end of the task is higher than when she started it.
Both use cases are important for teaching and learning. The key is to be clear about the assessment's purposes and uses from the start, so that the administration and instructional strategies a teacher applies during the assessment match those intended purposes and uses.
Provide students a variety of opportunities to practice and demonstrate analytical thinking across content areas and settings. A teacher's primary use of assessment information should be to support the learning process, so most assessments should address use case #2 above. As students practice applying these skills in a wide range of content areas and novel situations, and as teachers use assessment information to appropriately scaffold instruction, students will naturally become better analytical thinkers.
In a recent book, my colleagues Carla Evans and Scott Marion identified 10 assessment features that work together to support instructionally useful assessment. These features are important when interrogating an assessment’s instructional utility in developing analytical thinking (not to mention other 21st century skills).
Concluding Thoughts
Developing robust approaches for teaching and assessing analytical thinking is essential to help students thrive in today’s world. Through our work with our partners at IB, we now know more about how to define, teach, and assess analytical thinking and other skills.
Our research on analytical thinking was supported by the International Baccalaureate (IB) Programme. The author would like to acknowledge the thought partnership of colleagues who contributed to this work, including Jen Merriman and Magda Balica from IB, and Will Lorié and Carla Evans at the Center for Assessment. Any errors and omissions are my own.