Cautions About Inferences From International Assessments: The Case of PISA 2009


by Kadriye Ercikan, Wolff-Michael Roth & Mustafa Asil — 2015

Background/Context: Two key uses of international assessments of achievement have been (a) comparing country performance to identify the countries with the best education systems and (b) generating insights about effective policy and practice strategies that are associated with higher learning outcomes. Do country rankings really reflect the quality of education in different countries? What are the fallacies of simply looking to higher-performing countries to identify strategies for improving learning in our own countries?

Purpose: In this article we caution against (a) using country rankings as indicators of better education and (b) using correlates of higher performance in high-ranking countries as a way of identifying strategies for improving education in our home countries. We elaborate on these cautions by discussing methodological limitations and by comparing five countries that scored very differently on the reading literacy scale of the 2009 PISA assessment.

Population: We use PISA 2009 reading assessment data for five countries/jurisdictions as examples to elaborate on the problems with interpreting international assessments: Canada, Shanghai-China, Germany, Turkey, and the US, that is, jurisdictions from three continents that span the spectrum of high-, average-, and low-ranking countries and jurisdictions.

Research Design: Using the data from these five jurisdictions in an exemplary fashion, our analyses focus on the interpretation of country rankings and correlates of reading performance within countries. We first examine the profiles of these jurisdictions with respect to high school graduation rates, school climate, student attitudes, and disciplinary climate, and how these variables are related to reading performance rankings. We then examine the extent to which two predictors of reading performance, reading enjoyment and out-of-school enrichment activities, may be responsible for higher performance levels.
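To make the kind of within-jurisdiction analysis described above concrete, the following is a minimal sketch, not the authors' procedure, of one way to estimate the relation between reading enjoyment and reading performance separately for each jurisdiction. The file name and column names (CNT, JOYREAD, PV1READ, W_FSTUWT) are assumptions based on PISA 2009 public-use-file conventions; a faithful analysis would also combine all five plausible values in reading and use the replicate weights for standard errors.

# Minimal sketch (not the authors' analysis): weighted correlation between
# a reading-enjoyment index and reading performance, computed per jurisdiction.
# Column names (CNT, JOYREAD, PV1READ, W_FSTUWT) are assumed here and should be
# checked against the PISA 2009 codebook; only the first plausible value is used.
import numpy as np
import pandas as pd


def weighted_corr(x, y, w):
    """Weighted Pearson correlation of x and y with sampling weights w."""
    mx, my = np.average(x, weights=w), np.average(y, weights=w)
    cov = np.average((x - mx) * (y - my), weights=w)
    sx = np.sqrt(np.average((x - mx) ** 2, weights=w))
    sy = np.sqrt(np.average((y - my) ** 2, weights=w))
    return cov / (sx * sy)


# Hypothetical flat extract of the PISA 2009 student file.
students = pd.read_csv(
    "pisa2009_students.csv",
    usecols=["CNT", "JOYREAD", "PV1READ", "W_FSTUWT"],
).dropna()

# Correlate enjoyment of reading with reading performance within each jurisdiction.
for cnt, grp in students.groupby("CNT"):
    r = weighted_corr(grp["JOYREAD"].to_numpy(),
                      grp["PV1READ"].to_numpy(),
                      grp["W_FSTUWT"].to_numpy())
    print(f"{cnt}: weighted r(enjoyment, reading) = {r:.2f}")

Under these assumptions, finding correlations of similar size in jurisdictions with very different rankings would illustrate the article's point that such correlates cannot, on their own, account for differences in ranking.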

Conclusions: This article highlights the importance of establishing comparability of test scores and data across jurisdictions as the first step in making international comparisons based on international assessments such as PISA. When it comes to interpreting jurisdiction rankings in international assessments, researchers need to be aware that there is a variegated and complex picture of the relations between reading achievement ranking and rankings on a number of factors that one might think to be related, individually or in combination, to quality of education. This makes it highly questionable to use reading score rankings as a criterion for adopting the educational policies and practices of other jurisdictions. Furthermore, reading scores vary greatly for different student sub-populations within a jurisdiction (e.g., gender, language, and cultural groups) that are all part of the same education system. Identifying effective strategies for improving education using correlates of achievement in high-performing countries should also be done with caution. Our analyses present evidence that two factors, reading enjoyment and out-of-school enrichment activities, cannot be considered solely responsible for higher performance levels. The analyses suggest that the PISA 2009 results are variegated with regard to attitudes toward reading and out-of-school learning experiences, rather than exhibiting clear differences that might explain the different performances among the five jurisdictions.

Cite This Article as: Teachers College Record, Volume 117, Number 1, 2015, pp. 1-28
http://www.tcrecord.org, ID Number: 17724, Date Accessed: 10/17/2017 9:25:49 AM

About the Author
  • Kadriye Ercikan
    The University of British Columbia
    KADRIYE ERCIKAN is a professor and deputy department head at the University of British Columbia, as well as the director of the Cross-Cultural Assessment & Research Methods in Education (CARME) Research Lab. She has written articles about developments in assessment of student learning and achievement, comparability of bilingual versions of assessments, and multiple-scoring in assessments.
  • Wolff-Michael Roth
    The University of Victoria
WOLFF-MICHAEL ROTH is the Lansdown Professor of Applied Cognitive Science at the University of Victoria. He conducts research on learning in mathematics and science and has contributed to research on graphing as social practice, gesture studies, coteaching, and cultural-historical activity theory.
  • Mustafa Asil
    The University of Auckland
MUSTAFA ASIL is a research fellow at the Quantitative Data Analysis and Research Unit at the University of Auckland, where he analyzes large-scale data.