“Use it or lose it” approach to technology garners impressive results


High-stakes test scores are like time: both must continually advance.

A distinctive trait of North Carolina’s Cumberland County Schools (CCS) is that we never rest on our laurels, which explains why we went searching for a way to improve student performance at a time when the district was already doing well. We were not satisfied with test scores staying the same; we wanted to do even better than we had. Plus, at that time, the No Child Left Behind Act, with its statewide accountability and Adequate Yearly Progress requirements, was just coming into effect, and we wanted to ensure that we could show improvement each year.

Additionally, in 2003, the district created the "Vision 2010: What Our Schools Will Become" program, which was designed to help us meet the needs of 21st century students. Among the program’s elements were mandates for the district to provide integrated instructional technology and flexible structures for teaching and learning.

At the time, we had become interested in how brain research could improve student learning. However, since we knew we’d be using state, local, or federal funds, we wanted to find something that was scientifically based and proven to be successful. Several staff members, myself included, had heard about a company called Scientific Learning, which developed software based on neuroscience research. A group of us investigated the research behind its Fast ForWord software and found that it accelerated learning by training the brain to process more efficiently, the way physical workouts train the body to be fit and strong.

We persuaded the superintendent to pilot the program at one primary school. We pre- and post-tested the students, and the results were so good that the superintendent agreed to fund more schools immediately. Because of our diverse student population, however, each school implements the software differently. In fact, we have 45 different implementation models, one for each school currently using the software.

Each time CCS expands the program, schools must submit a proposal to the district to apply for the software. This process creates buy-in from the school improvement team rather than leaving the decision to the principal alone. Further, if the principal leaves, the team can continue to ensure the software is successfully implemented.

The proposal form asks questions such as, "How will this enhance your current reading initiative?" We emphasize that the software is not meant to replace the school’s basal text; it is meant to enhance what educators are already doing. We also ask what type of data they’ll look at, both quantitative and qualitative. While it’s good to see test scores rise, if students’ scores stay flat but their self-esteem improves, that’s important as well.

The schools have to show why they want the software, which children will use it, which protocols they’ll follow, and what sets them apart from other schools that want it. I have not had any pushback about the proposal form. It’s a good exercise for the schools because it gives them a plan: if they address all of these issues before they ever install the software, it heads off many of the problems that might otherwise arise.

The district then rates the proposals to determine which schools will get the software.

We have removed the software from two schools that were not implementing it properly. One was not implementing it at all, and the other made only a half-hearted attempt. They weren’t getting results, so we moved the software to the next two schools on the waiting list. With 87 schools in the district, and given the investment the county has made, we want to be sure something this valuable is used in a way that gets results. The district’s goal is to continue to see improvement at all schools at all levels.

The software works by delivering intensive exercises that adapt to each student’s level and build the cognitive skills required to read and learn effectively. The computer-based activities focus on the National Reading Panel’s five essential components for becoming an effective reader: phonemic awareness, phonics, fluency, vocabulary, and comprehension. This approach fit in very well with what we were already doing with our district’s literacy initiative, so teachers didn’t feel like we were adding something new.

Although we run all types of students through the software, we especially try to target those who are borderline at-risk. We try to catch them before they’re tested at the third-grade level and identified as low performers. That way, we can keep them out of remedial or Exceptional Children programs from the start.

I should note that the implementation process was not without its challenges. For one thing, we discovered during the first year of use that it was imperative to use the right protocol to get optimal results. For example, we initially ran a 100-minute protocol, and the children burned out after a while. But when we switched to 40- and 50-minute protocols, students could fully enjoy the program.

We also found scheduling to be a challenge at first. It was very easy for a principal to simply say, "We can’t do this because our schedule is full." In response, we told those principals that if they found the lab time too difficult to schedule, we could move the software elsewhere. No one complained, so we were able to work it out. Of course, once principals saw the gains after the first and second years, they suddenly became very creative in their scheduling.

Getting buy-in from teachers and parents was critical as well. When teachers have to give up 50 minutes a day for a whole semester and then be held accountable for students’ test scores, they had better know why that lab time is so important. Parents need to know, too. It’s a good idea to bring parents in for parent nights and let them try the exercises on the computer. We didn’t do these things initially and met some resistance. For a program to succeed, it’s imperative that everyone understands why it’s important and how it works.

We learned from our mistakes, and during the fall semester following the pilot, we implemented the software in 14 schools. A year later, we had the software in 28 schools, and by that Christmas, 31 schools were using it. Today, 45 of the district’s 87 schools use the software in computer labs.

All in all, CCS’s emphasis on improving literacy has yielded many positive results. In the years since the initial implementation, we have made great strides, boosting student achievement and self-confidence, thanks to our flexible yet accountability-driven approach to technology. End-of-Grade Test scores have risen each year, and several schools have told dramatic stories of students who completely turned around after using the software.

As I mentioned, each of the schools has its own implementation model. For the most part, there is growth no matter which model you look at. Whether we’re looking at Running Records in grades K-2 or End-of-Grade Tests in grades 3-8, we find that students using the reading intervention software tend to show greater growth than those who do not. That has held true for all six years of data we have.

For example, one of our elementary schools, Stedman, tested two groups of fourth-grade students using the Scholastic Reading Inventory (SRI). The school tested the Fast ForWord students before and after they completed the program, and tested the comparison group at the same times. Before using the software, the Fast ForWord students’ average SRI score was 442; the comparison group’s average was 507. Afterward, the Fast ForWord students’ average was 615 and the comparison group’s was 561. Both groups grew, but the Fast ForWord group gained 173 points to the comparison group’s 54.
 
Another example comes from Douglas Byrd High School, where students who failed the North Carolina Competency Test were required to use the software. After completing the program and retesting, students improved by an average of 10 points on the reading comprehension portion. In addition, all 12th graders who went through the program passed the Competency Test.

Whether they were lower-achieving students, higher-achieving students, or anyone in between, all of them made growth and continue to grow each year. While we’ve targeted reading, we’ve seen math scores improve as well. Principals tell us students’ attention in class is better, attendance is better, behavior problems are down, and students are more confident. Parents tell us the same things. The program affects the development of the whole child: academically, socially, mentally, every way that one can think of. When students start to have success, many of the problems that come from being unsuccessful vanish. We want to continue to get as many students involved in the program as we can.

Anyone planning a software implementation should keep these tips in mind:

• Make sure your computers and infrastructure will support the software.

• Get buy-in from principals, teachers, and parents.

• If you can’t schedule the software into the school day, be creative. Use it before school, after school, or in summer school programs.

• Follow the software guidelines and protocols to get optimal results. Too little time or too much time can negatively impact students’ results.

Ron Phipps is the executive director of the testing and assessment center for Cumberland County Schools.
