School & test engagement
Educators need accurate assessment data to help students learn. But when students rapid-guess or otherwise disengage on tests, the validity of their scores can suffer. Our research examines the causes of test disengagement, how it relates to students’ overall academic engagement, and its impact on individual test scores. We also look at its effects on aggregated metrics used for school and teacher evaluations, achievement gap studies, and more. This research further explores better ways to measure and improve engagement, helping ensure that test scores more accurately reflect what students know and can do.
Researchers can detect when students aren’t trying on computerized tests
A testing company spots disengaged students who are guessing answers too quickly.
The Hechinger Report
Mentions: Steven Wise
Topics: Equity, Innovations in reporting & assessment, School & test engagement
Why we’re all-in on student test engagement
A lack of test engagement can negatively impact scores. Learn about NWEA’s work to prevent and mitigate the impacts of rapid guessing.
By: Steven Wise
Topics: School & test engagement, Innovations in reporting & assessment
Computer-based testing offers glimpse into ‘rapid guessing’ habits
When students speed through a computer-based test, their responses are far less likely to be accurate than if they took longer to find the solution, according to new research.
Education Dive
Mentions: Steven Wise
Topics: Equity, School & test engagement
Can test metadata help schools measure social-emotional learning?
Social-emotional learning (SEL) competencies like self-efficacy and conscientiousness can be predictive of long-term academic achievement. But they can also be difficult to measure. In a new study led by NWEA’s James Soland, researchers investigated whether assessment metadata (the way students approach tests and surveys) can provide useful SEL data to schools and educators. Soland joins CPRE research specialist Tesla DuBois to discuss his findings, their implications, and the promise and limitations of student metadata in general.
Consortium for Policy Research in Education Knowledge Hub podcast
Mentions: James Soland
Topics: School & test engagement, Innovations in reporting & assessment, Social-emotional learning
Are achievement gap estimates biased by differential student test effort?
New research shows that test effort differs substantially across student gender and racial subgroups. What does this mean for achievement gap estimates?
By: James Soland
Topics: Equity, School & test engagement, Social-emotional learning
The relationship between test-taking disengagement and performance on MAP Growth retests
Educators sometimes ask: do students rapidly guess because they don’t know the answer to a question, or do rapid guesses reflect a lack of engagement with the test? Would a student’s scores improve if that student engaged more with the assessment and rapidly guessed on fewer items? An examination of MAP® Growth™ test scores and levels of student test engagement across more than 100,000 tests in which students retested within one day showed that students’ test-taking engagement often differed between the initial test and the retest.
By: Steven Wise
Topics: School & test engagement
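The entries above describe flagging rapid guesses from response times on computer-based tests. A minimal sketch of that idea, assuming the response-time-effort approach associated with Steven Wise’s research: a response faster than an item-level threshold is flagged as a rapid guess, and the proportion of unflagged responses serves as an engagement index. The function name, data, and 3-second threshold here are illustrative assumptions, not NWEA’s actual scoring rules.

```python
def response_time_effort(response_times, thresholds):
    """Proportion of items answered at or above the item's rapid-guess
    threshold (higher values indicate more engaged, solution-oriented
    behavior). Requires one threshold per item."""
    if len(response_times) != len(thresholds):
        raise ValueError("one threshold per item required")
    # Count responses slow enough to count as genuine solution attempts.
    solution_behaviors = sum(rt >= th for rt, th in zip(response_times, thresholds))
    return solution_behaviors / len(response_times)

# Illustrative data: six item response times in seconds, with a uniform
# 3-second threshold; the 1.1 s and 2.3 s responses are flagged as rapid guesses.
times = [12.4, 1.1, 8.0, 2.3, 15.2, 9.7]
thresholds = [3.0] * len(times)
print(round(response_time_effort(times, thresholds), 3))  # 0.667
```

In practice, thresholds are typically set per item (longer or harder items warrant longer thresholds), and tests with an engagement index below some cutoff can be flagged for review rather than discarded outright.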
In this study, we introduce disengagement metrics for a policy and evaluation audience, including a discussion of how disengagement might bias estimates of educational effectiveness. Analytically, we use data from a state administering a computer-based test to examine the effect of test disengagement on estimates of school contributions to student growth, achievement gaps, and summer learning loss.
By: Megan Kuhfeld, James Soland
Topics: Measurement & scaling, School & test engagement, Student growth & accountability policies