By: Rachel Brown, Ph.D., NCSP
As the school year moves into its final months, this blog serves as a reminder that a problem-solving approach is the best way to help students who are still struggling in school. FastBridge Learning endorses a five-step problem-solving method that includes:
- Problem Identification
- Problem Analysis
- Plan Development
- Plan Implementation
- Plan Evaluation
When used at any time of the school year, the problem-solving model helps school teams engage in data-based decision-making for the benefit of all students.
Problem Identification
Any teacher knows that a student can develop school-related problems at any time of the year. Problem identification is the point at which a possible problem first shows up on the “radar” of school staff. In the spring, especially prior to the final screening period, common problems include:
- Students who have not (yet) responded to intervention
- Students who were doing well earlier in the year but are struggling with more challenging material
- Classes or grades in which fewer than 80% of students met the winter learning benchmarks
In each of the above cases, the next step is to analyze the problem based on available evidence.
Problem Analysis
Problem analysis involves using collected data to identify the size and effects of the problem. Some problems are small enough that they do not justify additional resources. For example, if a student’s progress scores dipped somewhat after a school break but quickly returned to prior levels, additional attention probably is not warranted because performance recovered once school resumed. A bigger problem arises when many students in a class or grade do not meet a benchmark screening goal. All students are expected to make growth throughout the school year, so benchmark goals are adjusted upward for each screening period. If students met the benchmark in the fall but not in the winter, more information is needed to analyze the source of the problem. Helpful questions include:
- What percentages of students did and did not meet the goal?
- Were they all from the same or different classes?
- Did they have similar scores in the fall?
- How close to the goal are they now?
- Exactly what instruction has been provided for such students?
To address these questions and complete a thorough analysis, additional data about student performance and instructional practices might be needed. The most recent (e.g., winter) screening scores should answer the first four questions; an interview with the teacher(s) or classroom observations can answer the last one. The goal is to develop a hypothesis about exactly why an unexpected number of students did not meet the winter benchmark.
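To make the analysis concrete, a team that exports its screening results to a simple spreadsheet could answer the first four questions with a short script. The sketch below is only an illustration, not FastBridge functionality: the file name, column names (student, class, fall_score, winter_score), benchmark values, and the use of pandas are all assumptions about what a local data export might look like.

```python
# Minimal sketch of a winter screening analysis on a hypothetical CSV export
# with columns: student, class, fall_score, winter_score.
# Benchmark values are placeholders, not actual FastBridge norms.
import pandas as pd

FALL_BENCHMARK = 40    # hypothetical fall goal
WINTER_BENCHMARK = 55  # hypothetical (higher) winter goal

scores = pd.read_csv("winter_screening.csv")

# What percentage of students did and did not meet the winter goal?
met_winter = scores["winter_score"] >= WINTER_BENCHMARK
pct_met = met_winter.mean() * 100
print(f"{pct_met:.1f}% met the winter benchmark; {100 - pct_met:.1f}% did not")

# Which students met the fall goal but not the winter goal,
# and are they clustered in particular classes?
slipped = scores[(scores["fall_score"] >= FALL_BENCHMARK) & ~met_winter]
print(slipped.groupby("class")["student"].count())

# Did those students have similar fall scores, and how close to the goal are they now?
print(slipped["fall_score"].describe())
slipped = slipped.assign(points_below_goal=WINTER_BENCHMARK - slipped["winter_score"])
print(slipped[["student", "class", "points_below_goal"]].sort_values("points_below_goal"))
```

The final question, about exactly what instruction the students have received, cannot be answered from a data export; it still requires the teacher interviews or classroom observations described above.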
Plan Development
With the hypothesis in hand, the school team then turns to consider possible plans that can address the problem. For the underachieving students, the planning needs to cover what steps will be taken to improve their scores. One approach could be to add those students to existing intervention groups so that they can participate in additional instruction. This might work if the number of students is small, but if there are many students who need intervention, there might not be enough groups or interventionists to teach them. Another approach would be to revise the Tier 1 core general education instruction so that it includes the specific skills that these students are lacking. This approach has the benefit of meeting the needs of more students at one time, but will work only if the students have relatively similar learning needs. The team might need to explore both of these options and then compare the costs and benefits of each in order to make a decision.
In addition to developing a short-term solution to the students’ learning gaps, the team should plan how to prevent the same problem from happening next year. Collecting and using continuous data to set longer-term goals is an important part of system-level data-based decision-making. If the team reviews the effects of the selected plan throughout the rest of the school year, that information can guide a decision about whether additional plans are needed. Specifically, in the course of observing the effects of either small-group interventions or whole-class instructional changes, the team will learn whether the Tier 1 core instruction that appeared to work in the fall needs to change in the future. Such changes could include additional training for teachers, a new pacing guide that narrows the focus of what to teach, or an entirely new set of materials and methods. Using group-level data from the current school year is an effective way to plan for the future needs of all students.
Plan Implementation
Once a specific plan for addressing the students’ needs is developed, the next step is to implement it. Any plan will likely require additional resources. If the students are added to existing intervention groups, who will gather and prepare the necessary materials? If the Tier 1 core instruction is changed, how will the teacher(s) learn about the changes and become ready to teach the revised lessons? To support those charged with plan implementation, it is very helpful for a member of the school’s problem-solving team to conduct regular check-ins with those implementing the plan. These checks can be weekly and help staff know that they are supported in their efforts to meet individual student needs. Another component of implementation is verifying intervention or teaching accuracy, also called teaching integrity or fidelity. This step matters because it provides evidence about whether the planned change was implemented correctly. Checking teaching integrity can involve interviewing the teacher(s) or observing lessons, and the method used should match the type, location, and nature of the instruction. Teaching integrity is essential because unless the instruction was delivered as planned, the data collected during the change cannot be trusted, and there is no point in evaluating the plan.
Plan Evaluation
The final step of the problem-solving model is to evaluate the data collected during and after the changes to see whether they worked. For instructional changes made between the winter and spring benchmark periods, one way to evaluate the plan is to see whether the students’ spring screening scores reflect a significant improvement over their winter scores. The downside of relying entirely on spring screening scores is that they might not be collected for weeks or months. Instead, it can be better to gather additional data on student performance before the spring screening. If the team opted to have the students join existing intervention groups, the newly added students should complete regular progress measures alongside the other students in those groups. Given the time of year and the urgency of the learning needs, weekly progress monitoring is recommended. If the team opted to change the Tier 1 core instruction, other assessments might be better, depending on whether there are brief, easy-to-administer options that all students can complete.
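As one concrete way to judge whether weekly progress-monitoring data show enough growth before the spring screening, a team member could compare a student’s observed rate of improvement with the rate needed to reach the spring goal. The sketch below is a minimal illustration of that arithmetic only; the weekly scores, spring goal, and weeks remaining are hypothetical values, and this is not a FastBridge calculation.

```python
# Minimal sketch of evaluating one student's weekly progress-monitoring data;
# all scores and goals below are hypothetical, not FastBridge norms.
from statistics import linear_regression  # requires Python 3.10+

weekly_scores = [48, 50, 49, 53, 55, 58]   # hypothetical weekly scores since the plan began
weeks = list(range(len(weekly_scores)))

# Observed rate of improvement: slope of a simple regression line through the scores.
slope, intercept = linear_regression(weeks, weekly_scores)

# Rate needed to reach a hypothetical spring goal in the weeks remaining.
SPRING_GOAL = 70
WEEKS_TO_SPRING_SCREENING = 8
needed = (SPRING_GOAL - weekly_scores[-1]) / WEEKS_TO_SPRING_SCREENING

print(f"Observed growth: {slope:.2f} points per week")
print(f"Growth needed:   {needed:.2f} points per week")
if slope >= needed:
    print("On track: continue the current plan and keep monitoring.")
else:
    print("Not on track: the team may need to revise or intensify the plan.")
```

The same comparison can be repeated for each student in the group, so the team sees not only whether scores are rising but whether they are rising fast enough to close the gap by spring.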
FastBridge Learning offers a number of assessments that can be used both for weekly progress monitoring and for whole-class interim assessments. FastBridge reading assessments that work for both small and large groups include AUTOreading and COMPefficiency. In math, FastBridge offers CBMmath-Automaticity and CBMmath-Concepts and Applications (CAP). These are all computer-administered and computer-scored assessments that provide immediate feedback on student performance. For younger students, selected measures from earlyReading and earlyMath can also be used for progress monitoring because they can be administered at least monthly.
Summary
All FastBridge Learning tools are aligned with a problem-solving approach to assisting students. The five problem-solving steps can be used continuously throughout the school year to identify, define, and address both individual and group learning needs. Importantly, educators do not need to wait until the new school year to address such learning needs; they can continue to use the problem-solving steps with new problems as they arise. Using the problem-solving approach throughout each school year benefits both individual students and groups. At the individual student level, the benefit is improved skills and a more satisfactory school experience. At the group (e.g., grade or school) level, the benefit is planning for improved system-level practices that will make both students’ and teachers’ school experiences better in the future. For more information about the FastBridge Learning assessments mentioned in this blog, see the FastBridge Knowledge Base.
Dr. Rachel Brown is FastBridge Learning’s Senior Academic Officer. She previously served as Associate Professor of Educational Psychology at the University of Southern Maine. Her research focuses on effective academic assessment and intervention, including multi-tier systems of support, and she has authored several books on Response to Intervention and MTSS.