By Sunil Joy
Research Data Analyst for Kent ISD
Welcome to the 2018-19 school year everyone! Let’s hope this year is our best school year yet. And what better way to kick things off than to talk about state test results, right?
Okay, probably not.
Truth is, the new school year also coincides with the release of the previous school year's (2017-18) state assessment data from the Michigan Student Test of Educational Progress (M-STEP). And for at least a few days each year, the public is inundated with stories on television, radio, newspapers and social media, all seeking to understand the newest results and their repercussions.

Now, we realize that most parents have plenty of other things to worry about when it comes to their child's education. It is likely that many busy parents don't fully understand what the fuss is all about, or whether the M-STEP is something that warrants their attention. In reality, these results have big implications, and we all should have a better understanding of what they are telling us.
For these reasons, we decided to go beyond the simple headlines and soundbites, instead, taking a deeper dive into the test’s purpose and implications. It is our hope this information will equip parents with the knowledge necessary to make meaning of the results for themselves.
Getting Some Background
Before diving head first into interpreting the results, it is important to understand the background behind the M-STEP, including the context in which it exists in our state.
What is the M-STEP?
M-STEP is Michigan’s common assessment system: The M-STEP is Michigan’s mandated state assessment for public elementary and middle school students. The assessment evaluates performance in English Language Arts and Mathematics, along with Science and Social Studies in select grades. The vast majority of public school students—with a few exceptions for special populations—are required to take this assessment during the spring of each school year.
Why are students required to take the M-STEP assessment each year and to what end?
M-STEP is required by federal and state law, and is designed to measure mastery of state academic standards: In exchange for federal and state dollars, public school students are required to take a statewide assessment in certain subjects and grade levels each school year (Michigan public school students are required to take the SAT exam in high school). More broadly, these tests measure a student's performance against the state's academic standards, the subject-level expectations students should be able to meet at the end of each grade level.
My child’s teacher already administers their own tests, why do they need to take the M-STEP too?
M-STEP’s primary purpose is to provide a snapshot of end-of-year student learning across the state: When teachers give students assessments throughout the school year, the primary purpose is to gauge their students’ progress, and make adjustments to their instruction as necessary. Parents also get a sense of how their child is advancing during the school year from the assessments teachers prepare.
The M-STEP does not serve this purpose. Instead, think of the M-STEP as a final exam you would take in high school. Once the exam is over and you've received your results, you can't simply step inside a time machine and take it again. Similarly, by the time a teacher receives their students' M-STEP results, the school year has already ended. Rather, as a summative assessment, the M-STEP's purpose is to provide a snapshot of student learning over the past school year. And because most public school students statewide take the same assessment every year, it's possible to compare one school's results to another's.
What happens if my child or school performs poorly on the M-STEP?
Poor performance on the M-STEP has real repercussions both for schools and students: The state uses the M-STEP as one of the primary sources for identifying and intervening in schools that are underperforming or failing. Current state law also requires that teachers receive annual performance evaluations, which in part must be based on their students’ state assessment data.
And while there historically haven't been prescribed consequences beyond those for educators and schools, a 2016 state law requires that, beginning in 2019-20, students unable to pass the third grade English Language Arts assessment be retained in third grade. Though good-cause exemptions exist, in this case the consequences for poor performance fall not just on adults, but on students.
How to soundly interpret the M-STEP results
Now that you have some background, and assuming you are still awake, let's turn our attention to how best to make meaning of the M-STEP results.
I’ve heard Michigan’s M-STEP scores are not so great. Should I move my child to another state where kids perform much better?
Michigan's results aren't so unique when compared to other states: With each release of statewide test scores, the recent norm for news stories is to point out that Michigan's students are falling behind the nation. This would lead one to conclude that something distinctively "bad" is happening in Michigan. But as we've highlighted in the past, our state isn't as unique as some would like us to believe. To demonstrate, we looked at state assessment results from Massachusetts and Tennessee, places that experts had previously identified as top education states, and compared them with the M-STEP. What we found were headlines not too different from Michigan's. While every state has its own assessment system, meaning comparisons aren't exactly apples to apples, the following examples simply illustrate the point:
- According to the 2017 Massachusetts assessment results for third grade reading, 47 percent of students were proficient. In Michigan, 44 percent of students were proficient in English Language Arts for 2017 in third grade.
- In Tennessee, 35 percent of students were proficient on the 2018 Mathematics assessment in grades 6-8. In comparison, 34 percent of students were proficient in Michigan.
Now let's look at a few recent media headlines to put our own state's headlines into perspective:
- Massachusetts: “Just half of Massachusetts students met MCAS expectations in 2017”
- Tennessee: “TNReady testing scores mostly flat, but results dip in high school English”
- Michigan: “More than half of Michigan students failed M-STEP literacy exam”
There are clearly areas for improvement. Is this because teachers are doing a bad job or students simply aren’t that smart?
Blaming students or their teachers for subpar results focuses attention in the wrong places: When M-STEP results aren't where they ought to be for a school or district, the immediate inclination is often to blame either students or their teachers. This occurs in large part because media reporting tends to focus on the deficits, with little explanation of why such inequities may exist. And when little explanation is provided, it shouldn't be a surprise that most people blame either the test takers (students) or the people preparing the test takers (teachers). But by focusing narrowly on teachers and students, we ignore the systemic factors that may be contributing to the results. Moreover, this limited mindset distracts us from the questions that need to be answered in order to genuinely move forward:
- Are schools and districts receiving adequate and equitable resources to best support student learning—particularly for disadvantaged students who have historically lagged behind their peers? Students face a diverse set of obstacles every day—ranging anywhere from emotional trauma to access to clean clothes. Research from the School Finance Research Collaborative reveals that despite the diverse obstacles students may face, our state isn’t distributing resources equitably to those students who need them the most.
- Are we providing schools with relevant and engaging instruction that makes students not only interested in their courses, but able to reach their maximum potential?
- Are we ensuring educators have access to high-quality instructional or curriculum materials and ongoing professional development, which we know are essential for teaching and learning?
- Are we providing strategies to address matters that go beyond the four walls of the school building, including areas like chronic absenteeism?
- Are we strategizing on approaches to better involve parents/guardians in their child’s education?
- Are those schools that are truly making an impact or “beating the odds” being promoted as models for success regionally and statewide?
Does the M-STEP point to a child’s success later in life?
M-STEP is a single data point, and can't tell the whole story of student success on its own: It would be naïve to believe that one assessment given at a single point in time, regardless of quality, can sufficiently predict a child's future. To be clear, this is not to say the M-STEP's data on content mastery is unimportant. In fact, the Kent Intermediate Superintendent Association (KISA), a group composed of Kent ISD's 20 school district superintendents, has anchored three of its four long-term goals on data provided by current state assessment systems.
Rather, we know from our business partners in the region that indicators like collaboration, innovation and teamwork are just as vital for success in life. Though these skills are difficult to measure, we simply can't forget that they matter.
Our purpose here is to ensure our readers are better informed on this complex, yet important topic in public education. Despite its intricacies, the implications behind the M-STEP are too important to overlook.
Ensuring high academic achievement for students is a multi-faceted strategy, including the strategic use of data. But it also means understanding what the data is and isn't telling us. Only then can we truly ensure that success is attainable for all of our students.
Check out School News Network for more stories about students, schools, and faculty in West Michigan.