How to Measure Student Growth with Screening Data

By: Yvette Arañas

Measuring student growth can help teachers find out whether students are learning and improving in a particular skill. There are a few questions to ask ourselves when we measure student growth. These questions matter because students come into school at varying levels: some are ready for school, while others may be at risk of falling behind their peers, and some students show a higher rate of improvement than others.

  1. What is the student’s ability level at the beginning of a school year?
  2. Is the student meeting benchmark or grade-level expectations? How much growth has the student made within the school year (i.e., from fall to winter and from winter to spring)? How much across school years?
  3. How does the student’s level and growth compare to other students?

Keep in mind that an individual student’s performance can be compared to different groups of students (e.g., the class, the school, the district, or the nation).

There are a few ways teachers can measure growth. When looking at screening data across seasons, you can see how much a student has improved per week over time: the score gain divided by the number of weeks between screenings. This is called the rate of improvement, or ROI. FastBridge Learning provides ROI growth reports for groups of students.
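As a rough illustration, the sketch below shows the ROI arithmetic in Python. The screening dates and scores are made-up values for illustration, not FastBridge’s internal calculation.

```python
from datetime import date

def rate_of_improvement(fall_score, spring_score, fall_date, spring_date):
    """Weekly rate of improvement (ROI): score gain divided by weeks elapsed.

    Illustrative only: the screening dates and rounding here are
    assumptions, not FastBridge's internal procedure.
    """
    weeks = (spring_date - fall_date).days / 7
    return (spring_score - fall_score) / weeks

# Hypothetical CBMreading scores (words read correct per minute).
# A gain from 45 to 68 over roughly 34 weeks yields an ROI near 0.68.
roi = rate_of_improvement(45, 68, date(2015, 9, 15), date(2016, 5, 10))
print(f"ROI: {roi:.2f} words per week")
```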

Sample fall-to-spring ROI growth report in FAST

The sample growth report above shows fourth-grade students’ rates of improvement from fall to spring for CBMreading. (FastBridge Learning’s group growth reports also allow users to compare growth from winter to spring and to look at monthly growth.) Under the “Fall” and “Spring” columns are the students’ scores (words read correct per minute) for those benchmark seasons. The students’ rates of improvement are listed under the “Growth” column, which indicates how many words read correct per minute the student gained per week from fall to spring. In the example, Ivan Freely showed the highest rate of improvement of the four students, gaining an average of 5.10 words per week from fall to spring. Jennell Jernigan showed the lowest rate of improvement (0.67 words per week). One might decide that Jennell would likely benefit from reading support to help her increase her rate of improvement.

Users can also see how each student’s rate of improvement compares to other students’. The report provides percentile ranks at the class, school, district, and national levels. Doloris Denham’s rate of improvement of 2.49, for example, falls at the 33rd percentile in her district, meaning that her estimated growth is the same as or higher than that of 33 percent of all fourth graders in her district.

You might notice that there are two columns for national percentile ranks. “National All” compares the student’s growth to that of all fourth graders nationwide. “National Banded” compares the student’s growth only to students who had the same starting score in the fall. When Jennell Jernigan’s growth is compared to all fourth graders in the country, her rate of improvement is the same as or better than that of 38 percent of them. When her growth is compared only to students who scored 45 in the fall, she scored the same as or better than 32 percent of those students.
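The sketch below shows how these two comparisons differ, using made-up peer data. The percentile-rank definition (“same or better than X percent”) follows the report description; treating “banded” as an exact match on the fall score is a simplification, since actual norms may group starting scores into ranges.

```python
def percentile_rank(student_roi, peer_rois):
    """Percent of peer ROIs at or below the student's ROI."""
    at_or_below = sum(1 for r in peer_rois if r <= student_roi)
    return 100 * at_or_below / len(peer_rois)

# Hypothetical peers: (fall score, ROI) pairs.
peers = [(45, 0.50), (45, 0.90), (52, 1.20), (45, 0.40), (60, 2.00)]
student_fall, student_roi = 45, 0.67

# "National All": compare against every peer's ROI.
all_pct = percentile_rank(student_roi, [roi for _, roi in peers])

# "National Banded": compare only against peers with the same fall score.
banded = [roi for fall, roi in peers if fall == student_fall]
banded_pct = percentile_rank(student_roi, banded)

print(f"All: {all_pct:.0f}th percentile; banded: {banded_pct:.0f}th percentile")
```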

Individual growth reports are also available. Below is an example of a report for aMath.

Sample individual growth report for aMath in FAST

In the sample report above, you can see that Dawyne Buchanan began the school year meeting grade-level expectations, and his scores increased from fall to winter to spring. This suggests that Dawyne showed steady growth in the 2015-16 school year. His scores also suggest that he is exceeding grade-level expectations and is not at risk for math difficulty this school year.

If data are available from previous years, you can also compare growth across school years.


Sample individual growth report for CBMreading across school years

In this second example, fourth grader Doloris Denham has CBMreading scores from fall 2013 (when she was in second grade) all the way to spring 2016. In her case, you can see that her scores have varied greatly over time. Most of her scores fell in the high-risk or some-risk ranges, but she also met grade-level expectations in fall 2013 and winter 2014. When looking at these data, a teacher might want to check whether any notes were documented suggesting that the test administration was not done correctly or that environmental factors may have affected her performance. If CBMreading was administered with fidelity during all screening periods and her scores are believed to be valid estimates of her reading ability, a teacher might want to see what type of instruction or interventions Doloris received during the two periods when she met expectations. The teacher could then decide whether those interventions should be implemented again to help Doloris improve.

There are a few things to consider when interpreting student growth data. First, it is important to remember that the scores students obtain (e.g., the number of words read correct per minute) are estimates of achievement in a particular skill; some level of measurement error and unreliability is associated with any score. To minimize unreliability, make sure that the screening assessments are administered correctly. Second, some students’ scores may decrease over time. This is more common with high-ability students, particularly those with a very high starting score. While it is important to make sure these students do not fall behind, negative growth is less of a concern for them than for students who begin with a much lower starting score.

Measuring student growth can help teachers determine whether students are improving over time. Fortunately, FastBridge Learning displays this information automatically, making it easy for teachers to review their students’ growth. For students who start out at some or high risk, it is important to provide interventions that help them improve more quickly than their peers, because unless they make additional gains (called catch-up growth), they may never reach grade-level expectations.
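As a back-of-the-envelope illustration of why catch-up growth matters, you can compare a student’s current ROI with the ROI needed to reach a benchmark by spring. The benchmark value, weeks remaining, and scores below are all hypothetical.

```python
def required_roi(current_score, benchmark_score, weeks_remaining):
    """Weekly gain needed to reach a benchmark by the end of the window."""
    return (benchmark_score - current_score) / weeks_remaining

# Hypothetical numbers: a student at 45 words per minute in fall,
# a spring benchmark of 95, and 34 weeks of instruction remaining.
needed = required_roi(45, 95, 34)
print(f"Needed ROI: {needed:.2f} words per week")  # about 1.47

# If the student's current ROI is 0.67, she will fall short of the
# benchmark without an intervention that accelerates her growth.
```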

Yvette Arañas is a doctoral student at the University of Minnesota. She was a part of FastBridge Learning’s research team for four years and contributed to developing the FAST™ reading assessments. Yvette is currently completing an internship in school psychology at a rural district in Minnesota.
