Misconception: If students are already high (or low) achieving, it is harder to show growth.

Educators serving high- or low-achieving students are often concerned that their students' entering achievement level makes it more difficult to show growth. However, with EVAAS, educators are neither advantaged nor disadvantaged by the types of students they serve. The modeling reflects the philosophy that all students deserve to make appropriate academic growth each year; accordingly, EVAAS provides reliable and valid measures of growth for students regardless of their achievement level.

EVAAS in Theory

The value-added models used in Michigan are designed to follow the progress of individual students over time and to estimate whether those students made the average amount of growth observed in the state (or in the population of test-takers) in the current year for the subject and assessment of interest.
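To make that idea concrete, the following is a minimal sketch, not the EVAAS model itself. It follows one hypothetical student's scores across years on an NCE-like scale that is re-centered to the statewide average each year, so maintaining the same score corresponds to making the average growth observed in the state. The expectation used here (the mean of the prior years) is a simplified stand-in for the model's prediction, and all values are illustrative.

    # A minimal sketch, not the EVAAS model itself. Scores are on a hypothetical
    # NCE-like scale re-centered to the statewide average each year, so a student
    # who maintains the same score has made the average growth observed in the
    # state. The "expectation" (mean of prior years) is a simplified stand-in
    # for the model's prediction; all values are illustrative.
    from statistics import mean

    prior_scores = {2016: 42.0, 2017: 45.0, 2018: 44.0}  # prior years
    observed_2019 = 49.0                                  # current year

    expected_2019 = mean(prior_scores.values())  # simplified expectation
    difference = observed_2019 - expected_2019

    print(f"Expected 2019 score: {expected_2019:.1f} NCE")
    print(f"Observed 2019 score: {observed_2019:.1f} NCE ({difference:+.1f} vs. expectation)")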

Furthermore, although the M-STEP and MAP assessments are designed to distinguish proficient from non-proficient performance, they are also designed with sufficient stretch to measure student performance across a wide range of achievement levels. Accordingly, the M-STEP testing scales have enough stretch to measure the growth of high- or low-achieving students.

In fact, any test used in EVAAS analyses must meet three criteria, and the M-STEP and MAP assessments satisfy all three. The tests:

  • Must be designed to assess the academic standards.
  • Must be sufficiently reliable from one year to the next.
  • Must demonstrate sufficient stretch at the extremes to ensure that progress can be measured for both low-achieving and high-achieving students.

Some educators are concerned about their students who make perfect scores and how that might impact their value-added reporting. In practice, very few students make perfect scores in the same subject from year to year. In 2019, only 0.14% of students made a perfect score in consecutive years on M-STEP Math, and only 0.11% did so on M-STEP ELA.

Some educators are concerned about their students who make very low scores and how that may impact their value-added reporting. However, EVAAS is focused on growth rather than achievement, and this approach uses multiple years of data, when available, to follow the progress of individual students over time. The growth model itself assesses whether, on average, the achievement of a group of students increased, decreased, or stayed about the same over a period of time. This can happen regardless of whether students' prior achievement was relatively low, middle, or high. As a conceptual example, if students' average prior achievement was at the 10th NCE (normal curve equivalent, a scale similar to a percentile), the growth model would expect those students' ending achievement to be near the 10th NCE. Likewise, if students' average prior achievement was at the 70th NCE, the growth model would expect those students' ending achievement to be near the 70th NCE. In other words, educators are not disadvantaged by serving low-achieving students who are not yet proficient.
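The sketch below illustrates the conceptual example above under the same assumption of NCE-scaled scores. Under an average-growth expectation, a group's expected ending NCE is simply its entering NCE, so groups entering near the 10th and 70th NCE are evaluated in exactly the same way. The function name and the data are hypothetical, and this is an illustration rather than the EVAAS estimation procedure.

    # A minimal sketch of the conceptual example above. Scores are hypothetical
    # NCEs; under an average-growth expectation, a group's expected ending NCE
    # equals its entering NCE, so low- and high-achieving groups are treated
    # identically. This is an illustration, not the EVAAS estimation procedure.
    from statistics import mean

    def growth_estimate(entering_nces, ending_nces):
        """Observed ending average minus expected ending average (= entering average)."""
        expected_ending = mean(entering_nces)  # expectation: maintain relative position
        observed_ending = mean(ending_nces)
        return observed_ending - expected_ending

    # Hypothetical groups entering near the 10th and 70th NCE
    low_group  = growth_estimate([ 9, 11, 10, 10], [11, 12, 10, 12])
    high_group = growth_estimate([69, 71, 70, 70], [70, 69, 71, 68])

    print(f"Low-achieving group:  {low_group:+.2f} NCE relative to expectation")
    print(f"High-achieving group: {high_group:+.2f} NCE relative to expectation")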

EVAAS in Practice

Actual data offer perhaps the most direct way to demonstrate that high- or low-achieving students show growth similar to that of other achievement groups. The figure below plots each Michigan school's average entering achievement against its growth index (the value-added estimate divided by its standard error) for M-STEP Mathematics in grades 4–8 in 2019. There is typically little or no correlation between a school's average achievement and its growth index. In other words, the dots representing individual schools do not trend up or down as achievement increases; the cluster of dots is fairly even across the achievement spectrum.

[Figure: Michigan growth index versus average achievement by school]
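As a rough sketch of the check described in this section, the snippet below computes a growth index (the value-added estimate divided by its standard error, as defined above) for a handful of hypothetical schools and correlates it with average entering achievement. The school-level numbers are invented for illustration; with real data, this comparison would be run across every school in the state.

    # A rough sketch of the check described above, using invented school-level
    # data. The growth index is the value-added estimate divided by its standard
    # error, as defined in the text. statistics.correlation requires Python 3.10+.
    from statistics import correlation

    # (average entering achievement in NCE, value-added estimate, standard error)
    schools = [
        (28.0,  1.4, 1.0),
        (41.0, -0.6, 0.9),
        (55.0,  0.3, 1.1),
        (63.0, -1.8, 1.2),
        (74.0,  0.9, 1.0),
    ]

    achievement  = [a for a, _, _ in schools]
    growth_index = [est / se for _, est, se in schools]

    # A correlation near zero is consistent with the claim that entering
    # achievement does not drive the growth index.
    print(f"Correlation: {correlation(achievement, growth_index):+.2f}")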