Before I address the question, indulge me while I set some context by discussing NAEP scores. As you know, 2015 NAEP scores showed a fairly broad-based decline, breaking a trend of small increases over several administrations. We had become used to the idea that NAEP scores would creep up incrementally, so these results demanded explanation. Education analysts immediately began to assert causes. Opponents of the Common Core State Standards (CCSS) pointed to the CCSS, proponents of the CCSS discussed an “implementation dip,” others pointed to the lingering effects of the Great Recession, others to the rise of charter schools, still others to state policy changes that included more students in testing, and so on, ad infinitum.
These explanations fall into two categories: changes in the tested population and changes in the treatment (instruction) of the tested population. Documenting changes in the tested population, though arduous, can be done; even then, the actual effect on scores is less clear. The effect of changing instruction will always be more speculative, and how convincing those explanations are will depend on your personal views about education.
NWEA Norms Changed, Too
Both factors mentioned for NAEP, as well as one additional factor, influenced the minor changes in the 2015 NWEA norms. This third factor is changes in the methodology used to develop the norms. We can document the changes in methodology and the changes in the demographic composition of the sample used to develop the norms. Just as with NAEP, it is less clear exactly how each factor affected the norms. It is much harder to document the changes in the instructional environment during the last five years, though there are three broad trends we will cite: standards, economics, and policy.
The norms changed because what they describe has changed and because how they are developed has changed.
Changes in student demographics.
- Student demographics evolved between 2011 and 2015, with more minority, high-poverty, and English Language Learner (ELL) students entering schools.
Changes in the educational environment.
- The implementation of the Common Core State Standards and other new career- and college-ready standards, along with their associated assessments, in almost all states.
- This implementation was asynchronous; that is, schools and districts adopted the assessments at different times, at different paces, and with differing degrees of effectiveness.
- The aftermath of the last recession, which resulted in the largest layoff of teachers since the Great Depression.
- The implementation of Race to the Top, which more closely tied teacher evaluation and school accountability to student performance on tests.
Improvements to the norming process.
- A larger and more diverse student population from which a more representative sample could be drawn.
- Refinements to the model for estimating growth, including changes to better estimate summer loss.
- Moving from five terms of data to nine terms, which supports better estimates of growth (a toy illustration of both of these points follows this list).
- Other methodological improvements.
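
To make the last two bullets a bit more concrete, here is a minimal, hypothetical sketch in Python. It is not NWEA's norming methodology, and the growth rate, summer-loss amount, and noise level in it are invented; it only illustrates the general statistical intuition that modeling summer loss explicitly avoids understating instructional growth, and that fitting the same model to nine terms of data instead of five yields a less variable growth estimate.

```python
# Hypothetical sketch only -- this is NOT NWEA's norming model, and every
# number here (growth rate, summer loss, noise level) is made up.
# It illustrates two general statistical points behind the bullets above:
#   1. Explicitly modeling summer loss avoids biasing growth estimates downward.
#   2. Nine terms of data give a less variable growth estimate than five.
import numpy as np

rng = np.random.default_rng(0)

def make_terms(n_terms):
    """Instructional months and summers elapsed for fall/winter/spring terms."""
    months, summers = [0.0], [0]
    for i in range(1, n_terms):
        prev_season = (i - 1) % 3        # 0 = fall, 1 = winter, 2 = spring
        if prev_season == 2:             # spring -> next fall: a summer passes
            months.append(months[-1])
            summers.append(summers[-1] + 1)
        else:                            # fall -> winter, winter -> spring
            months.append(months[-1] + 4.0)
            summers.append(summers[-1])
    return np.array(months), np.array(summers)

def growth_estimates(n_terms, model_summer, n_reps=500):
    """Fit a linear growth model to simulated term means; return slope estimates."""
    months, summers = make_terms(n_terms)
    start, true_growth, summer_drop = 200.0, 1.5, 3.0   # made-up values
    if model_summer:
        X = np.column_stack([np.ones(n_terms), months, summers])
    else:
        X = np.column_stack([np.ones(n_terms), months])
    ests = []
    for _ in range(n_reps):
        # term-level mean scores with sampling noise (toy stand-in for a norm sample)
        y = (start + true_growth * months - summer_drop * summers
             + rng.normal(0, 0.5, size=n_terms))
        coef, *_ = np.linalg.lstsq(X, y, rcond=None)
        ests.append(coef[1])             # estimated growth per instructional month
    return np.array(ests)

# Point 1: ignoring summer loss drags the growth estimate below the true 1.5
with_summer = growth_estimates(9, model_summer=True)
without_summer = growth_estimates(9, model_summer=False)
print(f"9 terms, summer modeled: mean growth = {with_summer.mean():.2f}")
print(f"9 terms, summer ignored: mean growth = {without_summer.mean():.2f}")

# Point 2: more terms -> a tighter (less variable) growth estimate
five = growth_estimates(5, model_summer=True)
nine = growth_estimates(9, model_summer=True)
print(f"5 terms: spread of growth estimates = {five.std():.3f}")
print(f"9 terms: spread of growth estimates = {nine.std():.3f}")
```

The actual norming models are, of course, far more sophisticated than this toy regression; the sketch only shows why these two listed changes plausibly improve the precision and validity of the growth estimates.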
Basically, the norms changed because what they describe changed and because how they are developed has changed. The result is norms that are more valid and precise. The 2011 norms were our best effort to describe the student achievement and growth occurring in 2009-2010, and the 2015 norms are our best effort to describe the student achievement and growth occurring during 2012-2014. NWEA researchers will continue to refine our methods to create the best description we can of student achievement and growth, and these efforts will also likely lead to changes in the norms. In the end, NWEA will provide reliable data that districts, schools, and teachers can use to make informed decisions.