Report: International tests severely misrank U.S. students

By Meris Stansbury, Associate Editor
January 16th, 2013

Authors question why PISA releases its first data sets as averages, saying the practice is misleading.

“What’s puzzling is why international tests like PISA release overall average scores first, then more nuanced data weeks or months later, since this promotes misleading analyses,” said a co-author of the report.

Prominent international tests skew comparisons of test scores, and U.S. student performance actually ranks much higher than believed, according to a new report released by the Economic Policy Institute (EPI).

The truth, says the report, is that—when comparing apples to apples in weighing U.S. student performance against that of other industrialized countries—U.S. students don’t rank 25th in math, but 10th; and in reading, the country is not 14th, but 4th.

The report, “What do international tests really show about U.S. student performance?,” is the first of its kind to detail what it claims is an inaccurate analysis of student performance on international tests such as the Programme for International Student Assessment (PISA) and the Trends in International Mathematics and Science Study (TIMSS).

The report’s analysis found that average U.S. scores in reading and math on the PISA are low partly because a “disproportionately greater share of U.S. students come from disadvantaged social class groups, whose performance is relatively low in every country.”

When differences in countries’ social class compositions are adequately taken into account, the report says, the performance of U.S. students in relation to students in other countries improves markedly. “Errors in selecting sample populations of test-takers and arbitrary choices regarding test content contribute to results that appear to show U.S. students lagging,” it says.
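The adjustment the report describes amounts to reweighting: instead of comparing raw national averages, recompute each country’s average using a common reference distribution of social-class groups, so that differences in composition do not drive the comparison. A minimal sketch of that idea follows; the group means, shares, and the `reweighted_average` helper are hypothetical illustrations, not the report’s actual figures or methodology, which is considerably more involved.

```python
# Sketch of composition reweighting: recompute a country's average
# score using a common reference distribution of social-class groups.
# All numbers below are hypothetical and purely illustrative.

def reweighted_average(group_means, shares):
    """Weight per-group mean scores by the given population shares."""
    return sum(group_means[g] * shares[g] for g in group_means)

# Hypothetical mean scores by social-class group for one country
us_means = {"low": 450, "middle": 500, "high": 560}

# Raw average under a disadvantaged-heavy composition...
us_shares = {"low": 0.40, "middle": 0.40, "high": 0.20}
raw = reweighted_average(us_means, us_shares)

# ...versus the same group means weighted by a common reference mix
reference = {"low": 0.25, "middle": 0.50, "high": 0.25}
adjusted = reweighted_average(us_means, reference)

print(round(raw, 1))       # composition-driven raw average
print(round(adjusted, 1))  # composition-adjusted average
```

With these made-up numbers, the adjusted average exceeds the raw one simply because the reference mix contains fewer low-scoring disadvantaged students, which is the report’s core point about cross-country comparisons.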


11 Responses to “Report: International tests severely misrank U.S. students”

So we as US educators are to be satisfied because we’re doing ok / average (how are our competitors doing? As long as they’re doing about the same, that’s all we care …)? When we know we can do so much better – quite possibly with the resources already available – why should we be satisfied with this result???

January 16, 2013

I don’t find a single statement in the text that suggests we should be satisfied with the results. Rather the message is that misinterpretation of data could well result in misguided policy decisions. Certainly many have used the rankings to criticize public education as a whole with the achievement gap being an area of focus. It turns out that the US is better in both of those measures and it can’t hurt to give credit where credit is due. Educators might say a little good news is OVERdue.

January 16, 2013

That is not the point here. We do need to have higher aspirations, that’s a given and well taken, but the real message is that US educational policy MAY be misguided and over-reacting.
The original numbers paint a dismal picture, one that prompts a quick, perhaps “knee-jerk” reaction and one that shouts, “Crisis!” If these numbers are not that dismal, not that crisis oriented, then policy (and strategic planning) can be more accurately focused and less reactionary. That makes for a better response, one that truly addresses a realistic picture of the problem. That’s the point of this article.

January 16, 2013

Looks like another attempt to avoid the reality of our continued decline in student achievement, especially relative to the gain in other countries.
I am an old guy, near 70, and remember that when I went to high school much more was required and learned than today.

    I can’t agree with you. I went to high school about the same time you did. And I have taught high school science for over 40 years. At least in science and math we are requiring more now than we did then.

January 16, 2013

Dear jcbjr,
I don’t believe the intention of the report was to show educators that “we’re doing ok,” as you put it, and therefore that’s all we should care about.

I believe the intention of the report was simply to show that data can be misleading, as is the case with the PISA data.

In this case, data from the PISA was not disaggregated by socioeconomic status (i.e., free or reduced-price lunch eligibility). Research has consistently shown that socioeconomic status affects school performance.

In addition, the data fails to report gains made by disadvantaged learners. In the end, that is what educators really want to know: who is making gains and who is not. That is how we can determine how to proceed with our instruction.


It would be nice to know that US K-12 teachers have at least an undergraduate degree in a discipline–for example, history majors teaching history. Instead, too many teachers have bachelor’s degrees in education. That isn’t the case in excellent private schools in the US. Well-educated teachers are more likely to help children from various socioeconomic backgrounds succeed.

Because we do not track our students the way most other countries do, our top students are not given the opportunity to shine. Most other countries track students based on perceived ability, challenging their most gifted students from an early age. Those whom they expect to steer into STEM fields are pushed ahead, often doing advanced mathematics much earlier. There is no reason to require nearly all of our students to repeat lower-level math facts as much as they do, and we spend much more time reviewing and reteaching entire classes for the benefit of a few. Instead, we need to encourage our top students to go as far as they can while maintaining mastery – introducing fractions in grade 3 and algebra in grade 6, with calculus in grade 10 for those students. We spend far too much time catering to special needs and not enough time pushing our top students ahead.

January 22, 2013

OK, our students who are not economically disadvantaged are doing reasonably well as a group. Is that supposed to be a comfort when such a high proportion of our students ARE economically disadvantaged? Even if only for the most selfish reason to those of us nearing the ends of our careers (earnings of the workers in coming decades will largely determine the sustainability of Social Security in coming decades), the capacity of the overwhelming majority of students to thrive as working adults matters to us all. (And the numerous non-selfish reasons are far better.) If results for other countries are skewed toward the top end, that’s their problem to address, not ours.

January 26, 2013

This article implies that we must change our economic policy instead of our educational policy. Raise the economically disadvantaged and the test scores will rise as well. Simple.