New report reveals the problem isn’t time on testing; it’s the quality of the tests

A new report is shedding light on what the nation might not know but teachers have known for a while: time spent on testing depends on district requirements and the quality of the tests. The report argues that it’s time to shift the national conversation from time spent on testing to the quality of testing.

The report, “The Student & the Stopwatch: How much time do American students spend on testing?” produced by Teach Plus and authored by Mark Teoh, Ed.D., director of Research & Knowledge at Teach Plus and a former teacher and administrator, encompasses research from 32 districts across the U.S.—both urban and suburban—and over 300 teachers.

The report measured testing habits in both English Language Arts (ELA) and math in kindergarten, third and seventh grades.

What researchers found was that there are three enormous misconceptions when it comes to student testing, with implications for district policy, test design and the implementation of Common Core testing.

“The report clearly demonstrated that the current polarized testing debate is not rooted in the reality our students face across the country,” said Dr. Celine Coggins, CEO of Teach Plus. “The amount of time students spend taking tests is considerably lower than most people would estimate. It is time to shift the national conversation on testing from the amount of test time to the quality of tests and ensuring that teachers have the information they need to help their students succeed.”


3 misconceptions of student testing

1. Time spent on testing takes up most of the school year.

According to the report, across 12 urban districts, the average amount of time students spend on state and district tests equals just 1.7 percent of the school year in third and seventh grade, and substantially less in kindergarten.

The typical kindergartener’s time on state and district testing is calculated at 2.1 hours for ELA and 1.0 hours for math annually, says the report.

Third and seventh grade students tend to have similar experiences in terms of time-on-testing, spending about 10 hours per year on mandated ELA testing and more than six hours on mandated math testing.
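As a rough sanity check (not a calculation from the report itself), those figures hang together: assuming a typical 180-day school year with about six instructional hours per day, the roughly 16 combined hours of mandated ELA and math testing works out to about 1.5 percent of the year, in the ballpark of the reported 1.7 percent.

```python
# Rough sanity check of the reported time-on-testing share.
# Assumptions (NOT from the report): a 180-day school year with
# about six instructional hours per day.
school_year_hours = 180 * 6        # ~1,080 instructional hours per year
testing_hours = 10 + 6             # ~10 hrs mandated ELA + ~6 hrs math (report figures)

share = testing_hours / school_year_hours
print(f"{share:.1%}")              # roughly 1.5% of the school year
```

The exact percentage depends on how a district counts instructional hours, which is likely why the report's 1.7 percent average differs slightly.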

2. What administrators report for testing is how long students really spend on testing.

Even though the amount of time logged by administrative reports for student testing may seem low (1.7 percent on average), this data only encompasses how long students spend taking the actual test.

Teachers calculate test administration time to be more than double the length reported in district calendars in elementary grades, notes the report.

“These…figures do not reflect the many time demands that may be associated with testing such as preparing students or analyzing data,” the report explains. “However, it is an important baseline figure. It reflects the cumulative time impact that districts currently use to communicate with parents and the general public about the time students are being tested.”

In several open-ended questions as part of the report, teachers emphasize that classroom and school-based assessments absorbed substantially more time than state- and district-mandated assessments.

Also, teachers note that the quality of the assessments factored greatly into time spent on testing.

One third-grade teacher, who describes good assessments, says the assessments help—rather than hinder—time in the classroom, since “district assessments are administered to students when it is convenient to teachers during a two-week window…teachers utilize these results much more successfully than state-mandated tests. They are timely and inform teaching immediately. Instructional time is often not missed because the tests are based on the standards being taught.”

However, one seventh-grade teacher says assessments done poorly take up too much student time, as students must “practice getting into testing groups, take practice tests, etc. We also typically take time from our usual instruction to focus on test prep in the week or two leading to the test. For example, I stop teaching the novel we are reading for a week to do multiple choice test prep. Also, during the week of the test, we literally have no instruction. I would say, overall, we lose about 15-20 days of instruction to testing.”


3. All districts spend roughly the same amount of time on testing.

Researchers in the report found that urban districts spend, on average, more time than their suburban counterparts on testing, with suburban districts averaging less than 1.3 percent of the school year on testing, compared to 1.7 percent for urban districts.

Also, the variation in test time across urban districts is large, with high-test districts spending five times as much time on testing as low-test districts.

For example, kindergarten testing is highest in Atlanta, Ga., at 10 hours, whereas Shelby County, Tenn., spends zero hours on testing.

Testing for third-graders is highest in Cleveland, Ohio, at 25 hours, whereas Chicago, Ill., spends only 5.1 hours. Testing for seventh grade is highest in Houston, Texas, at 25 hours; Chicago again spends only 5.1 hours.

“Consider that the typical student in the district with the most testing in our sample, Denver, will have about 159.4 hours of math and ELA testing by the time he/she finishes the eighth grade,” emphasizes the report. “By comparison, the typical student in Chicago will have had just 38.8 hours of math and ELA testing.”

“The difference of about 120 hours, after nine school years, amounts to about 22 instructional days, or more than four weeks of school,” the report continues.

How to improve the discussion on testing

Teach Plus lists several policy recommendations to focus on what it considers to be the real issues of student testing:

  • Shift the debate from global to local: According to the report, the research shows that the tests that take up the most time are not the state tests administered in response to federal requirements, but district tests. “Individual districts should evaluate their current testing regime,” says the report.
  • Work with teachers to streamline testing in high-test districts: Districts at the higher-end of the testing spectrum should commit to “ensuring that their students are not shortchanged on instructional time and should streamline testing requirements,” explains the report. “As a first step, they should ask teachers which district-mandated tests are useful—and which aren’t.”
  • Focus on test content over test time.
  • Recognize that some of the test features teachers value take time: Constructed-response items, essays, and other items that assess higher-order thinking take longer than simple multiple-choice items, yet teachers want the data they provide, says the report.
  • Proceed with Common Core implementation, recognizing that long-term gain will exceed short-term pain.
  • Report test time in ways that better reflect teachers’ reality, especially in the elementary grades.

For more detailed information on student testing and the report’s findings, read the report.