Are School-SES Effects Theoretical and Methodological Artifacts?
by Gary N. Marks - September 14, 2012
Several recent studies have made substantive claims about the effects of school-SES on student achievement. This article argues that school-SES effects are an artifact of the very high correlation, at the aggregate school level, between SES and student achievement. School-SES effects are strong because they are proxies for school-level prior achievement. Theoretical processes are offered to explain the effects of school-level prior achievement.
Recently, several articles have made substantive claims about the effect of school-SES on student achievement, after including school-level aggregated measures of socioeconomic background (SES) in their analyses. Analyzing PISA data, Willms (2010, p. 1017) finds a strong relationship between average school-SES and average school achievement in the U.S. For Australia, Perry and McConney (2010) conclude that the socioeconomic composition of the school matters greatly for students' academic performance. Analyzing PISA data from 57 countries, Willms (2010, p. 1024) finds that the school-SES effect (at 62 score points) is much larger than the student-level SES effect (at 17 score points), although together they account for only 4.4% of the variation in students' science scores. In countries with large between-school variation in achievement, the effects of school-SES are large and, net of school-SES, the effects of individual student-SES are surprisingly small. For example, in the 2009 PISA study the effects of school-ESCS (ESCS is the OECD's broad composite measure of SES)1 and student-level ESCS on achievement were 111 and 13 score points, respectively, in Belgium, 123 and 14 in the Czech Republic, 122 and 10 in Germany, and 93 and 5 in the Netherlands. These effects compare to bivariate effects for student-level ESCS of 47, 46, 44, and 37, respectively (OECD, 2010, pp. 186-187). Although within a country there is less variation in school-ESCS than in student-ESCS, these effects for school-ESCS are large.2 Because of the apparently large effects of school-SES, Willms (2010) and Perry and McConney (2010) call for policy responses to the effects of school-SES.
School-SES can also explain school sector differences. Analyzing differences in student achievement between private independent, private government-dependent, and government schools in 16 countries, Dronkers and Robert (2008, p. 260) conclude that the explanation of the gross differences in mathematical achievement is the better social composition of private schools. Similarly, school sector (independent, Catholic, and government) differences in PISA test scores in Australia are not statistically significant when controlling for school-SES, leading the authors to claim that school sector differences can be entirely attributed to SES (Thomson, De Bortoli, Nicholas, Hillman, & Buckley, 2010, pp. 61-63).
The mechanisms for the contextual effects of SES are unclear (Dumay & Dupriez, 2008). Bourdieu (cited by Nash, 2003, p. 443) postulates that if the proportion of working-class students exceeds a certain threshold, school classes become more disordered, impeding learning. Alexander et al. (1979, p. 223) offer two mechanisms for the contextual effects of SES: a change in the academic climate of the school (academic press), or educational benefits produced by changes in peer networks. Rumberger and Palardy (2005) posit three mechanisms: alterable school characteristics (resources, structures, and practices); peer effects; and schools' responses to the student composition (for example, dumbing down the curriculum to cater for low-SES students, reduced teacher morale and efficacy, etc.). However, for any of these mechanisms to be viable, they would need to involve student achievement: inadequate resources or poor administration affecting overall achievement, the influence of high- or low-achieving peers, or changing the curriculum or expectations in accordance with the students' general level of achievement. Therefore, the effects of school-SES must be indirect and involve student achievement.
An important policy and research question is: Are the large effects for school-SES simply an artifact? The ecological fallacy has been well known for over 60 years: aggregated data show much higher correlations than the same variables at the individual level, and a correlation at the aggregate level cannot be used to make assertions about social processes at the micro level (Robinson, 1950; Snijders & Bosker, 2012, p. 15). This is because aggregation removes the considerable individual-level variation in both SES and achievement. For both achievement and SES, the within-school variation is typically much larger than the between-school variation, but aggregation removes the within-school variation. Hauser (1970, p. 659; 1974) argues that contextual effects of SES relate to the ecological fallacy in that residual differences between groups (in this case, schools) are interpreted as social processes. Such differences should disappear once relevant individual student-level predictors (correlated with schools) are included. Nash (2003, p. 446) makes a similar point, arguing that the contextual effects of school-SES are due to unmeasured non-cognitive or family factors that affect school performance. Gorard (2006, p. 91) suggests that the school composition effect may be spurious because there is measurement error for SES at the student level, but measurement error for SES aggregated at the school level is lower since the errors cancel out. Thrupp, Lauder and Robinson (2002) suggest that the effects of school-SES may not survive controls for prior achievement and advocate a full set of entry-level variables.
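The aggregation mechanism behind the ecological fallacy can be illustrated with a minimal simulation. The numbers below are made up for illustration (they are not drawn from PISA or any study cited here); the point is only that averaging students within schools strips out the large within-school variation, so the correlation between the two school means far exceeds the student-level correlation even though the individual-level process is identical:

```python
# Hypothetical illustration of the ecological fallacy: the same data yield
# a modest student-level correlation but a much higher school-level one,
# because aggregation removes within-school variation in SES and achievement.
import numpy as np

rng = np.random.default_rng(0)
n_schools, n_students = 200, 50

# Each school has a mean SES; students vary widely around it
# (within-school SD of 2 vs between-school SD of 1).
school_ses = rng.normal(0, 1, n_schools)
student_ses = school_ses[:, None] + rng.normal(0, 2, (n_schools, n_students))

# Achievement depends modestly on the student's own SES, plus large noise.
achievement = 0.3 * student_ses + rng.normal(0, 2, (n_schools, n_students))

# Correlation with all students pooled (individual level).
r_student = np.corrcoef(student_ses.ravel(), achievement.ravel())[0, 1]

# Correlation between the two school-level aggregates.
r_school = np.corrcoef(student_ses.mean(axis=1),
                       achievement.mean(axis=1))[0, 1]

print(f"student-level r: {r_student:.2f}")  # modest
print(f"school-level r:  {r_school:.2f}")   # much larger
```

Nothing about the school-level figure reflects an extra social process; it is the same individual-level relationship with the within-school noise averaged away.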
There is an even simpler explanation for school-SES effects: school-SES is acting as a proxy for the contextual effects of prior achievement. As Thrupp, Lauder and Robinson (2002, p. 486) point out, school-level prior achievement is rarely entered as a variable in these studies. The theoretical reasons for a contextual effect of achievement are more direct than the parallel arguments for school-SES. Students in a high-achieving school perform better, over and above what would be expected from their prior achievement, for a variety of reasons: the curriculum and teaching are delivered at a higher level, the school's and teachers' expectations are higher, students' norms regarding the usefulness of academic work are more conducive to learning, and there is possibly less disruption to teaching and learning. For converse reasons, students in low-achieving schools perform worse than expected.
The correlations at the school level between SES and achievement are surprisingly large, indicating that high multicollinearity would be a problem in multivariate analyses that included both school-level variables.3 White's (1982, p. 467) and Sirin's (2005) meta-analyses, based mainly on U.S. studies, calculated mean correlations between school-level SES and school-level achievement of 0.60 and 0.73, respectively, compared to correlations between 0.2 and 0.3 at the individual level. For Belgium, Opdenakker and Van Damme (2001) report a correlation of 0.82 between school-mean ability and school-mean father's education. For Australia, Marks (2010, endnote 1) reports a correlation of 0.8 between school-mean ESCS and school-mean PISA score. For New Zealand, Harker and Tymms (2004, p. 188) report a correlation of 0.87 between school-mean prior achievement and school-mean SES.
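The severity of the multicollinearity implied by these correlations can be gauged with the standard variance inflation factor, VIF = 1/(1 - r²), which shows how much the sampling variance of a coefficient is inflated when two predictors are correlated at r. Plugging in the correlations reported above:

```python
# Variance inflation factor for two correlated predictors: 1 / (1 - r^2).
# The r values are the school-level SES-achievement correlations reported
# in this section (White 1982; Sirin 2005; Opdenakker & Van Damme 2001;
# Harker & Tymms 2004).
for r in (0.60, 0.73, 0.82, 0.87):
    vif = 1 / (1 - r**2)
    print(f"r = {r:.2f}  ->  VIF = {vif:.1f}")
```

At r = 0.87 the coefficient variance is roughly quadrupled, so estimates of the separate effects of school-SES and school-mean achievement become highly unstable.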
Once school-level prior achievement is included in multivariate (including multilevel) analyses, the effects for school-SES disappear or, in some studies, become negative. Zimmer and Toma (2000) found strong effects for academic context in four school systems using data from a 1981 cross-national mathematics study that collected both pre- and post-test scores. Later studies also show that the effects for school-SES tend to disappear after controlling for school-mean prior achievement or school-mean student ability (Marks, 2010; Opdenakker & Van Damme, 2001, p. 417). Scheerens, Bosker and Creemers (2000, p. 136) conclude that it is the contextual effects of IQ (or prior achievement), rather than the contextual effects of SES, that predominate for student achievement. Snijders and Bosker (2012, pp. 83-86) provide an example from a study of reading literacy among Dutch grade 8 students in which, in the presence of school-mean IQ and student-level measures of IQ and SES, the effect of school-mean SES on literacy score is negative.
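The proxy argument can be sketched in a simple simulation. The data below are entirely hypothetical: school-level achievement is generated to depend only on school-mean prior achievement, while school-SES is merely correlated with prior achievement (at roughly 0.85, in line with the correlations cited above). A regression that omits prior achievement attributes a large "effect" to school-SES; adding prior achievement collapses it:

```python
# Hypothetical sketch: school-SES has no causal effect on the outcome,
# but proxies for school-mean prior achievement when prior is omitted.
import numpy as np

rng = np.random.default_rng(1)
n = 500  # simulated schools

prior = rng.normal(0, 1, n)                  # school-mean prior achievement
ses = 0.85 * prior + rng.normal(0, 0.53, n)  # school-SES, r ~ 0.85 with prior
score = 0.8 * prior + rng.normal(0, 0.5, n)  # outcome driven by prior only

def ols(y, *xs):
    """OLS coefficients [intercept, b1, b2, ...] via least squares."""
    X = np.column_stack([np.ones(len(y)), *xs])
    return np.linalg.lstsq(X, y, rcond=None)[0]

b_ses_alone = ols(score, ses)[1]              # large, spurious SES "effect"
b_ses_with_prior = ols(score, ses, prior)[1]  # collapses toward zero

print(f"SES coefficient, prior omitted:  {b_ses_alone:.2f}")
print(f"SES coefficient, prior included: {b_ses_with_prior:.2f}")
```

This is only a sketch of the omitted-variable logic, not a reanalysis of any dataset discussed here, but it mirrors the pattern the studies above report.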
The very large effects for school-SES in PISA at first appear puzzling. Why should the effects of school-SES be so strong in countries such as Belgium, the Czech Republic, Germany, and the Netherlands? It is because these countries have tracked school systems, in which students are selected at a young age into different school types largely based on their performance in primary school or early secondary school, so school-SES becomes a proxy for prior achievement aggregated at the school level. Similarly, the reason why school-SES explains differences between private and government schools in the Dronkers and Robert (2008) and Thomson et al. (2010) studies is also that it is highly correlated with school-level prior achievement. To explain differences in school-level prior achievement, the focus should be on the provision of teaching and learning between schools rather than on some type of contagion effect involving SES.
1. For the 2006 PISA study, Economic, Social and Cultural Status (ESCS) comprised the higher of the father's or mother's occupational status, the higher of the father's or mother's educational attainment, and data from approximately 20 wealth, educational, and cultural items. Similar broad composite measures were constructed for the 2000, 2003, and 2009 PISA studies.
2. In PISA, one standard deviation in student achievement is standardized at 100 score points (calculated from OECD countries). The meaning of a one-unit change in school-SES differs between countries, depending on the distribution of SES across schools within a country.
3. Alexander et al. (1979) caution against the use of school-level variables due to their high collinearity.
Alexander, K. L., Fennessey, J., McDill, E. L., & D'Amico, R. J. (1979). School SES influences: Composition or context? Sociology of Education, 52(October), 222-237.
Dronkers, J., & Robert, P. (2008). Differences in scholastic achievement of public, private government-dependent, and private independent schools: A cross-national analysis. Educational Policy, 22(4), 541-577.
Dumay, X., & Dupriez, V. (2008). Does the school composition effect matter? Evidence from Belgian data. British Journal of Educational Studies, 56(4), 440-477.
Gorard, S. (2006). Is there a school mix effect? Educational Review, 58(1), 87-94.
Harker, R., & Tymms, P. (2004). The effects of student composition on school outcomes. School Effectiveness and School Improvement, 15(2), 177-199.
Hauser, R. M. (1974). Contextual analysis revisited. Sociological Methods and Research, 2(3), 365-373.
Hauser, R. M. (1970). Context and Consex: A Cautionary Tale. American Journal of Sociology, 75, 645-664.
Marks, G. N. (2010). What aspects of schooling are important? School effects on tertiary entrance performance in Australia. School Effectiveness and School Improvement 21(3), 267-287.
Nash, R. (2003). Is the School Composition effect Real? A Discussion with Evidence from the UK PISA data. School Effectiveness and School Improvement, 14(4), 441-457.
OECD. (2010). PISA 2009 results: Overcoming social background. Equity in learning opportunities and outcomes (Vol. II). Paris: Organisation for Economic Co-operation and Development.
Opdenakker, M.-C., & Van Damme, J. (2001). Relationship between School Composition and Characteristics of School Process and their Effect on Mathematics Achievement. British Educational Research Journal, 27(4), 407-432.
Perry, L. B., & McConney, A. (2010). Does the SES of the school matter? An examination of socioeconomic status and student achievement using PISA 2003. Teachers College Record, 112(4), 1137-1162.
Robinson, W. S. (1950). Ecological correlations and the behavior of individuals. American Sociological Review, 15, 351-357.
Rumberger, R. W., & Palardy, G. J. (2005). Does Segregation still matter? The effect of Student Composition on Academic Achievement in high Schools. Teachers College Record, 107(9), 1999-2045.
Scheerens, J. C. J., Bosker, R. J., & Creemers, B. P. M. (2000). Time for Self-Criticism: on the Viability of School Effectiveness Research. School Effectiveness and School Improvement, 12(1), 131-157.
Sirin, S. R. (2005). Socioeconomic Status and Academic Achievement: A Meta-Analytical Review of Research. Review of Educational Research, 75(3), 417-453.
Snijders, T. A. B., & Bosker, R. J. (2012). Multilevel analysis: An introduction to basic and advanced multilevel modelling (2nd ed.). Los Angeles: SAGE.
Thomson, S., De Bortoli, L., Nicholas, M., Hillman, K., & Buckley, S. (2010). Challenges for Australian education: Results from PISA 2009. Melbourne: Australian Council for Educational Research.
Thrupp, M., Lauder, H., & Robinson, T. (2002). School composition and peer effects. International Journal of Educational Research, 37(5), 483-504. doi:10.1016/S0883-0355(03)00016-8
White, K. R. (1982). The Relationship between socio-economic status and academic achievement. Psychological Bulletin, 91(3), 461-481.
Willms, J. D. (2010). School composition and contextual effects on student outcomes. Teachers College Record, 112(4), 1008-1037.
Zimmer, R. W., & Toma, E. F. (2000). Peer effects in private and public schools across countries. Journal of Policy Analysis and Management, 19(1), 75-92.