Thanks to a U.S. Department of Education (ED) grant, school district administrators across the nation are receiving training on how to make informed decisions about instruction using data collected from students.

In 1999, the American Association of School Administrators (AASA), in conjunction with the University of California, Los Angeles (UCLA), wrote a grant proposal to ED's Office of Educational Research and Improvement (OERI).

The grant has helped further the development of a software program from UCLA’s Center for Research on Evaluation, Standards, and Student Testing (CRESST) that enables data-driven decision making at school districts.

CRESST’s Quality School Portfolio (QSP) software lets users disaggregate data with flexibility, import data from a variety of sources, and report on the data in any of 12 different formats that are tailored to the way educators tend to use and report school information.

The grant also enabled CRESST researchers and officials from AASA’s Center for Accountability Solutions to create a training program for educators and administrators interested in learning how to use QSP and how to make data-driven instructional decisions.

The three-year grant—now in year two—will train 15 districts each year. This year’s second group of five districts completed their training May 6.

New Hampshire’s Rochester School District was one of the first districts to receive training through the program.

“At the training sessions we learned different ways to use the data we collect,” said Rochester Superintendent Raymond Yeagley.

“For instance, we can use QSP to pull up student records information, Iowa Test of Basic Skills scores, and state assessment information and examine all that information for patterns,” said Yeagley.

The benefits of this type of cohesive comparison are numerous, say advocates of data-driven decision making.

“There are tons of data collected by school districts,” said Michael Parker, AASA’s assistant director of the Center for Accountability Solutions. “What’s new is that we are trying to get school districts to organize their data electronically and use that data to drive instructional practices.”

One benefit is that using QSP helps teachers and administrators disaggregate their data, so they can recognize early on if there are groups of students in need of intervention. And on a districtwide basis, the software allows educators to evaluate some of their programs and take corrective action if necessary, said Parker.
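The disaggregation Parker describes amounts to grouping assessment scores by a demographic or program field and flagging subgroups that fall below an intervention threshold. The sketch below illustrates that idea in Python; it is not QSP's actual implementation, and all field names, thresholds, and data are invented for illustration.

```python
# Illustrative sketch of disaggregating assessment data, in the spirit of
# what the article describes QSP doing. Field names and data are hypothetical.

from collections import defaultdict

def disaggregate(records, group_field, score_field):
    """Return the mean score for each subgroup of group_field."""
    totals = defaultdict(lambda: [0.0, 0])
    for rec in records:
        t = totals[rec[group_field]]
        t[0] += rec[score_field]
        t[1] += 1
    return {group: s / n for group, (s, n) in totals.items()}

def flag_for_intervention(means, threshold):
    """List the subgroups whose mean score falls below the threshold."""
    return sorted(g for g, m in means.items() if m < threshold)

# Hypothetical sample records.
students = [
    {"grade": "4", "lunch_status": "free", "reading_score": 61},
    {"grade": "4", "lunch_status": "paid", "reading_score": 78},
    {"grade": "4", "lunch_status": "free", "reading_score": 65},
    {"grade": "4", "lunch_status": "paid", "reading_score": 82},
]

means = disaggregate(students, "lunch_status", "reading_score")
print(flag_for_intervention(means, threshold=70))  # ['free']
```

Running the same grouping on different fields (grade level, program enrollment, and so on) is what lets educators spot, early on, which groups of students may need intervention.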

“We can also report things better,” said Yeagley. “One of my board members recently said, ‘I wish we knew how our kids are doing on this set of reading skills,’ so I pulled up QSP on my laptop, ran the report right there, and we had the answer to that question within five minutes.”

QSP works by letting educators export student data from their existing records systems into the fields that make up each student's QSP record. Users import the data into QSP, then enter any assessment data they have, such as national test scores or locally developed assessments, to get a picture of how that student, or group of students, is performing throughout the year.
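The import-and-merge step described above is essentially a join of a student roster with assessment results on a student ID. The following sketch shows that join using Python's standard csv module; the file layouts, test names, and scores are all hypothetical, not QSP's actual formats.

```python
# Illustrative sketch of merging a roster export with assessment data by
# student ID, as the article describes. All data and layouts are invented.

import csv
import io

roster_csv = """student_id,name,grade
1001,Ana,5
1002,Ben,5
"""

scores_csv = """student_id,test,score
1001,ITBS_reading,72
1002,ITBS_reading,88
1001,state_math,65
"""

def load(text):
    """Parse CSV text into a list of dicts keyed by column name."""
    return list(csv.DictReader(io.StringIO(text)))

# Index the assessment rows by student ID.
by_student = {}
for row in load(scores_csv):
    by_student.setdefault(row["student_id"], []).append(
        (row["test"], int(row["score"]))
    )

# Attach each student's assessments to the roster record.
merged = [
    {**rec, "assessments": by_student.get(rec["student_id"], [])}
    for rec in load(roster_csv)
]

for rec in merged:
    print(rec["name"], rec["assessments"])
```

Once the roster and assessments are linked this way, score patterns can be examined per student or rolled up across any group, which is the "picture throughout the year" the article refers to.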

According to Parker, QSP is not meant to replace a district’s student information system; rather, it is meant to inform decision-making.

“QSP can hold approximately 54,000 student records every year and around 100 different data fields in each record,” he said.

“The technology has made this possible,” said Yeagley. “We are now able to do analysis on laptop computers that 30 years ago required a mainframe that was only available to a university. And the best part is, our laptops will do it faster and better.”

Yeagley uses QSP on a 166 MHz laptop. He said the minimum technology requirements would not be a barrier for most school districts.

What’s more, most districts stand to benefit greatly from the implementation of data-driven decision making and tools such as QSP.

Many schools are moving toward using research-based instructional techniques, said Yeagley. But the problem is, those schools often aren’t tracking the results.

“They need to know how those different software packages or school initiatives are working to make sure they are being implemented equally across the school population,” he said. In schools that are racially and economically diverse, teachers can use QSP to make sure no groups are disenfranchised.

The key, according to educators and AASA officials, lies in figuring out what the right questions are for your specific district.

“Sometimes districts approach this in the wrong way—they’ll pull all the indicators out there into one giant mass of data, and it can be overwhelming,” said Yeagley. “It is info overload.” QSP allows educators to manage those mountains of data.

“Info overload” is just one of the challenges facing administrators who want to start making data-driven decisions.

According to Parker and Yeagley, educators who have completed the training agree it is clearly beneficial, but they acknowledge that a number of changes must occur before districtwide adoption is possible.

“The biggest challenge is connected with the classroom practices,” said Parker. “The first thing that has to happen is that teachers [must] want to use data.”

According to Yeagley, QSP is a very user-friendly product, but it is not the only solution on the market. For example, a program from the Center for Resource Management, called Socrates, also provides schools with a web-based solution, in which the company does the analysis portion for educators.

“QSP happens to be free, while Socrates you pay for,” said Yeagley. “But with QSP, you have to delve in and find out the answers yourself, whereas Socrates will do the work for you. So it really depends on what you want.”

Parker urged school leaders to evaluate which indicators of student achievement they should use to report students' progress.

“Data can be scary and underutilized, but [they] can be [an educator’s] friend and make a difference in student achievement,” said Parker. AASA and CRESST emphasize that the cultural changes needed for making data-driven decisions are different for every school or district.

“It’s not something that has a formula. Every school district is unique, and it takes a long time to plan,” he said. “We recommend bringing the decision-makers in on the ground floor.”

Links:

AASA’s Center for Accountability Solutions
http://www.aasa.org/cas

Center for Research on Evaluation, Standards, and Student Testing (CRESST)
http://www.cse.ucla.edu

Quality School Portfolio (QSP)
http://qsp.cse.ucla.edu

The Center for Resource Management’s Socrates
http://www.crminc.com/socrates/