More on assessment empowerment: The power of knowing your purpose

Have you ever stopped to note how many decisions you make in a day? It can be as many as 1,500! No wonder you’re tired.

Educators’ decisions about assessment processes, tools, and data use can become automatic, which helps us consider all our learners’ strengths, interests, and needs as quickly as possible. If we don’t regularly check our decisions, however, we risk making choices that undermine student learning success, well-being, and self-efficacy. One thing we may not know to check, forget to examine, or become too busy to question is assessment purpose. Assessment processes and tools are essential parts of the responsive teaching and learning cycle, but are we always clear what our assessments are actually meant to do?

Assessment empowerment principle #3, attend to purpose, reminds us to pay close attention to the reason for our assessment processes and tools. If we don’t have a clear idea of that reason, we can end up over-assessing, under-assessing, or using inaccurate results to inform important decisions. These issues can add to educator and learner fatigue, distrust, and toxic stress, not to mention inequity.

How to approach assessment with intention

We need to ensure that our assessment processes, tools, and data use are part of the responsive learning cycle that fuels learners and learning. To do that, we can partner with colleagues, students, and families to regularly pause and ask three key questions:

  1. What is the purpose? (Remember that for learner- and learning-centered success, educators and students partner to routinely engage in three main assessment purposes: formative, interim, and summative. Formative processes elicit evidence to inform day-to-day responsive teaching and learning “moves.” Interim assessment processes occur at intervals, for example, every 6–8 weeks, and are used to further elicit evidence to inform responsive learning moves. Summative assessment processes are used at or near the end of a learning journey, like at the end of a unit or quarter, to elicit learning evidence that informs determinations such as grade reports, certification of competency, class/course placement, or program improvement.)
  2. Do the assessment and its purpose fit the context, chosen outcome (e.g., learning goal), and placement in the learning cycle, progression, or continuum of development?
  3. Do the assessment process, tool, and data use match the purpose they were designed to serve?

Notice that these three questions are deliberately crafted for use at multiple levels of the education ecosystem: classroom, professional learning community (PLC), school, district, and beyond. For a balanced, coherent, and articulated web of assessment processes, tools, and data use that effectively improves learning outcomes, addresses unfinished learning, and closes opportunity gaps, all levels must be involved and aligned. Educational organizations and policymakers need to be aligned, too, so that their actions, resources, and guidance do not create barriers for the other parts of the education ecosystem.

Examples of intention in action

Let’s explore how the three purpose questions can be applied to two scenarios from different levels of the education ecosystem. After demonstrating how to answer the questions, I’ve included suggested next steps because checking for assessment purpose is a continuous improvement process.

Scenario 1: PLC example

The members of the eighth-grade ELA PLC at Easton Middle School are collaborating to develop a common formative assessment (CFA) so they can work smarter, not harder, to generate data that can inform how they adjust responsive teaching and learning practices. The current common learning goal of focus is the ELA literacy standard RL.8.2: “Determine a theme or central idea of a text and analyze its development over the course of the text, including its relationship to the characters, setting, and plot; provide an objective summary of the text.”

For the first CFA in the sequence of learning for this goal, the PLC members decide to copy and paste five comprehension questions from the unit summative materials that are provided in the teacher’s edition of the district-wide adopted textbook. They agree to each use the five copied-and-pasted questions with their students, gather and analyze data on their responses, and then determine next steps. Here’s how they answered the three purpose questions as a team.

1. What is the purpose?

The reason for the CFA is to inform how to adjust instruction and supports as students practice how to analyze the development of theme over the course of a text.

2. Do the assessment and its purpose fit the context, chosen outcome (e.g., learning goal), and placement in the learning cycle, progression, or continuum of development?

No. There’s a mismatch between the learning goal (analyzing theme) and what the questions measure (comprehension).

Building and checking for comprehension are certainly necessary steps toward successful theme analysis, but we can’t just copy and paste comprehension questions without adjusting them to scaffold up to questions that directly ask about the development of theme. If we don’t fix this, we’ll go through the CFA process only to find we don’t have the data we need to make sure all students are learning and growing in the learning goal of focus.

3. Do the assessment process, tool, and data use match the purpose they were designed to serve?

No. We selected comprehension questions from the end-of-unit summative materials without adjusting for the learning goal (analyzing theme) or the formative purpose (gauging student progress in analyzing theme). The summative questions could certainly inform how to craft or select CFA questions, but they should not be copied, pasted, and used as-is because they were designed and placed in the textbook for a summative purpose.

Next steps

Because there’s a clear disconnect between the assessment and its purpose in this example, the PLC has some work ahead of them. But how much better is it to regroup this early in the process than to realize the assessment data you’ve spent days, weeks, or longer gathering can’t help you gain the understanding or make the decisions you need?

Educators may feel like they must use district- or school-adopted curriculum materials, such as the assessments in provided textbooks, as-is, just like the eighth-grade PLC did. But keep in mind that textbook writers must create resources for a broad audience; they do not know your specific students, your learning goal(s) of focus, or when and where the learning goal is placed in your course or class learning journey. In other words, they don’t know the best-fit responsive moves; you do.

Especially in the formative phases of learning, it is important to check and, if necessary, adjust assessment tasks or processes to ensure a match between the assessment purpose and the learning goal, timeframe, and scaffolds all students need to succeed in demonstrating their abilities. If we don’t, we risk operating as though our educators and learners are robots on an assembly line, which can undermine the learner- and learning-centered aim of assessment empowerment. A factory model approach to education used to be the expectation and norm; that is no longer the case.

Scenario 2: Leadership example

At Seaside Elementary, third-graders take MAP® Growth™, NWEA’s interim assessment, in the fall, winter, and spring. The district leadership team is working on improving reading instruction, so they use MAP Growth results to evaluate teacher performance.

1. What is the purpose?

We are using MAP Growth to evaluate teacher performance.

2. Do the assessment and its purpose fit the context, chosen outcome (e.g., learning goal), and placement in the learning cycle, progression, or continuum of development?

No. As district leaders, we are working on improving reading instruction, and while MAP Growth measures reading skills, it’s very difficult to draw a straight line between teacher instructional practice and student outcomes on a test because there are so many other factors to consider. In addition, we cannot rely on a single data point. Are we also looking at a baseline, progress monitoring, feedback, or any other empowering components used in a responsive cycle or progression of teacher skill development? By using MAP Growth in this way, we are leaping to an evaluative (summative) use without attention to a continuum of development.

3. Do the assessment process, tool, and data use match the purpose they were designed to serve?

No. MAP Growth was not designed for the purposes of teacher evaluation, according to NWEA’s document “Guidance on the use of student test results in teacher evaluation systems.”

Next steps

Just like with the PLC group, there’s a mismatch between assessment and purpose here. There’s also a lot to be gained from slowing down and finding alignment.

District leaders can continue to monitor researchers’ most recent reports regarding the best combination of teacher performance measures and read articles like “Measuring teacher effectiveness” by the RAND Corporation. It takes time to find the right blend of processes and tools for teacher evaluation, yet investing time up front to ensure proper purpose will save time, energy, and frustration down the line. Furthermore, ensuring appropriate purpose of teacher performance measures is a powerful way to model assessment empowerment principle #3 for other levels of the education ecosystem; your leadership actions with assessment purpose will speak louder than words.

Moving forward

I don’t know about you, but checking assessment processes, tools, and data use for alignment to purpose was not something I was taught to do in my teacher preparation program, nor was it a topic of professional development until I was well into my career. I wish I had known about all of this many years ago. I could have saved so many hours of teacher and student frustration and even toxic stress!

I cannot go back and change those decisions, but moving forward I can commit to checking for purpose and supporting others to do the same. If you’re not sure how to go about this work, ask for help from an instructional coach, special education specialist, or assessment for learning leader. Know that once you practice checking for purpose mismatches in more formal, planned assessments, like in the examples earlier, you’ll create new habits that can help you make quick, well-informed, and aligned decisions about in-the-moment assessments, too. And why not involve your students in this process? What a great way to empower them to build self-efficacy and share the weight of this hard work.

Here are some questions to get you started thinking about all of this and how it applies to your school or classroom:

  • In what ways do I already partner with colleagues to check the purpose of assessment processes, tools, and data use?
  • How do I talk about assessment purpose with my students and their families in ways that emphasize learner- and learning-empowering reasons and benefits (instead of compliance-driven reasons and consequences)?
  • What is an example of one assessment process, tool, or data use I can check for purpose right now? How can I partner with others to do so?

To learn more about assessment empowerment, read “It’s time to embrace assessment empowerment,” “Begin your assessment empowerment journey with principle #1: Learner context,” “Continue your assessment empowerment work with principle #2: Learning environments and relationships,” “Assessment empowerment principle #4: Responsive learning cycles,” and “Last but not least: The role of communication in assessment empowerment.”
