The subject line on the email from my daughter’s third grade teacher read, “Totally Baffled.”
It was mid-January, and my daughter’s class had taken the winter MAP Growth Math assessment that day. Her teacher was aware that I worked for NWEA and that I had used MAP Growth data in my own classroom when I was teaching, so she’d come to me in the fall with some questions about understanding and utilizing her class summary data.
This time, however, the question she had was about an individual student whose score had dropped between fall and winter testing: my daughter’s.
Both as a classroom teacher using MAP Growth data and in the years I’ve worked with NWEA, I’ve talked with many parents and educators about what it means when scores go down. I’ll admit it felt a little strange to be explaining it as the parent to the teacher instead of the other way around.
My daughter’s score had gone down four RIT points, the email said. Four points is not much greater than the standard error of measurement, so I wasn’t all that alarmed. Still, I wanted to know more about what had happened.
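To put a rough number on that intuition: if we assume a standard error of about 3 RIT points on each test (a ballpark figure I’m using purely for illustration, not the actual reported error for her test events), the uncertainty on the fall-to-winter difference is even larger than the uncertainty on either score alone:

$$SE_{\text{difference}} = \sqrt{SEM_{\text{fall}}^2 + SEM_{\text{winter}}^2} \approx \sqrt{3^2 + 3^2} \approx 4.2 \text{ RIT points}$$

By that back-of-the-envelope math, a four-point drop sits well within the noise of the measurement itself.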
I asked the same questions I ask whenever I see that a student’s score has gone down:
- Did she take much less time on the winter test than she did in the fall? Often, when a student’s score drops, the test duration helps tell the story. Maybe the student took 45 minutes to complete the test the first time but then rushed through the second test and finished in 20 minutes. In that case, the drop in the score was likely because the student was rushing and not necessarily trying their best.
In my daughter’s case, however, the reports showed that she had actually taken longer on the second test than she had on the first.
- Did she seem distracted? Was there anything going on that day or during testing that may have made it hard for her to do a good job? For my sweet, creative, social, and often scatter-brained child, I knew this was a distinct possibility. I would not have been surprised to find out that she’d been whispering with a friend or drawing pictures of animals on scratch paper during the assessment. This time, however, her teacher said the opposite was true. In the fall, she’d been off task a lot, but for the winter test, the proctor reported that my daughter had been “not very wiggly” or off task.
I was certainly glad to hear it, even if it didn’t help explain the test score.
- How did she feel about how she had done? I taught upper-grade students (grades 5-8), and when I talked with my own students about their test scores, I’d often start by asking how they felt about how they did – no matter what their scores were. Hearing their perspective was often very informative. I remember one student who seemed unhappy when we sat down to talk about his score. He said that the test had seemed really hard and that he’d struggled. He didn’t think he had done well. Looking at his scores, however, it turned out that he had grown several RIT points beyond his growth projection! The test felt harder because he had gotten into some higher-level questions. He’d done exceptionally well! He left that conference with me with a huge smile on his face.
In my daughter’s situation, she’d been smiling, too. Her teacher said my daughter was very proud of how she had done on the test. And her teacher was proud of her effort and focus, regardless of how the score turned out.
So, finally, I asked the most important question:
- Is she learning? Do you see evidence in class that she is making progress on the skills she needs? The teacher assured me that my daughter was learning and that she had shown a lot of progress in class since the fall. She mentioned some specific concepts they’d been working on and some recently completed class assignments.
We decided that we would keep an eye on her, but otherwise not worry too much about the drop, and that we’d see in the spring if her score came back up.
Why did my daughter’s score go down? It’s hard to know for sure. However, MAP Growth now provides even more information about student engagement during the test, information that could have helped answer this question.
The researchers at NWEA have done a lot of work around student engagement in assessment. As a result, this past fall the MAP Growth assessments introduced exciting new tools that monitor how long a student takes to respond to each test item and compare that time to the average time other students have taken to respond to the same item.
Two new tools are available:
- First, if a student is responding much more quickly than is typical for the test items, a behavior called rapid-guessing, an alert will appear on the proctor’s screen so the proctor can redirect the student back to focusing on the test. The difficulty of the items will lock at the current level instead of becoming easier or harder, and the test will start adapting the questions again once the student reengages and responds more purposefully.
- Second, two additional pieces of data have been added to the Student Profile Report: the percentage of the test on which the student was rapid-guessing and an estimate of how much that behavior affected their test score. (For the curious, a rough sketch of how this kind of timing check might work follows this list.)
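Here is a minimal sketch of what a timing-based rapid-guessing check could look like. This is my own illustration, not NWEA’s actual algorithm; the 10 percent threshold, the function names, and the sample numbers are all assumptions made up for the example.

```python
# Illustrative sketch only: flag "rapid guesses" by comparing each response
# time to a fraction of the typical time other students take on that item.
# The 10% threshold and all names here are assumptions, not NWEA's method.

def flag_rapid_guesses(response_times, typical_times, threshold_fraction=0.10):
    """Return a list of booleans: True where a response looks like a rapid guess."""
    flags = []
    for taken, typical in zip(response_times, typical_times):
        flags.append(taken < threshold_fraction * typical)
    return flags

def rapid_guess_summary(response_times, typical_times):
    """Summarize what share of the test was answered with rapid guesses."""
    flags = flag_rapid_guesses(response_times, typical_times)
    percent = 100.0 * sum(flags) / len(flags)
    return flags, percent

if __name__ == "__main__":
    # Seconds a student spent on five items vs. the typical time for each item.
    student_times = [4, 35, 3, 2, 40]
    typical_times = [45, 40, 50, 38, 42]

    flags, percent = rapid_guess_summary(student_times, typical_times)
    print("Rapid-guess flags per item:", flags)
    print(f"Percent of items rapid-guessed: {percent:.0f}%")
```

In the real assessment, something along these lines is presumably what drives the proctor alert and the new Student Profile Report data; the criteria NWEA actually uses are certainly more sophisticated than this toy threshold.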
As a classroom teacher, I relied heavily on MAP Growth data as a starting point when planning instruction for my students. For the teachers doing the same now, these new tools will make it that much easier to have accurate data about what a student is ready to learn and, in situations like my daughter’s, to help answer the additional questions that may arise.