Epistemology, Science, and the Politics of Educational Research
by Aaron Cooley - December 18, 2007
I was delighted to read Neil Eckardt’s (2007) commentary “The Prevalence of Qualitative Methodology at AERA’s Annual Meeting and the Potential Consequences.” The questions he raised about ideology in educational research are important, and for raising them I commend his effort at establishing a dialogue on the related issues. That being said, I would not be responding to the commentary if I were not troubled by many of the statements, assertions, and conclusions that he makes in the piece. I will deal with each of the most problematic of them in turn.
“For the past two years Rick Hess has offered some biting commentary on AERA” (Eckardt, 2007, para. 1). I think “biting” may be a stretch, as educators (and now educational researchers) are sadly used to being criticized. It seems to me that Eckardt should have been a bit more wary of citing Hess (Hess & LoGerfo, 2006; Hess & Lowe, 2007) as a source on the problems of ideologically driven research, since Hess works at the American Enterprise Institute. Of course, I am well aware of why Eckardt chose the pieces by Hess to begin the barely veiled attack on non-quantitative research. Hess is a major figure in the wider policy debates on education among a certain set of the thinktankerati, so this theoretically should lend weight to Eckardt’s concerns. I don’t think it does.
The attacks Hess has leveled, and will likely level again this upcoming spring, are part of a political strategy to marginalize any educational research that does not fall in line with what he thinks is useful in promoting a specific educational agenda. What is unfortunate in Eckardt’s citing of these pieces is that he fails to engage with Hess’s critiques of AERA on anything more than a superficial level. In contrast to Eckardt’s uncritical acceptance of Hess’s view of AERA and its conference, I resoundingly reject the assertion, which both seem to share, that AERA is unconcerned with public policy, parents’ worries, and student learning. Further, the object of the piece in question, qualitative research in particular, does investigate these issues, just in different ways. This is by no means a criticism of quantitative work, but it is an acknowledgment that different research questions require different methodologies. Epistemologically, no one method is better than the others in any absolute sense. (I really thought we were past all this qual vs. quant stuff anyway.) Regardless, it would seem that those who are committed to non-ideologically driven research would be happy to let the marketplace decide what types of methodologies are used to study educational issues; in this case, the marketplace seems to have given a strong “buy” rating to qualitative and mixed methods work.
Another point that must be addressed is the devaluing of other scholars’ research by taking shots at the titles of their papers and critiquing their work on the basis of its being too difficult to understand. AERA is an academic conference, so the complexity and sophistication of language in paper titles demonstrate a depth of inquiry and a richness of understanding in the study of complex social relations. Hence, I was disappointed by Eckardt’s contention that AERA should move to a “jargon-free discourse” (Eckardt, 2007, para. 7). One researcher’s jargon is another’s technical specificity. I am left to hope that these critics attend some of the sessions they criticize and that this exposure would help them understand that schools are sites of interconnected social complexity that are not easily fixed, or even improved, by simple or piecemeal solutions.
Another issue that Eckardt raises is that of federal efforts to improve the “scientific” quality of educational research (Eckardt, 2007, para. 2). This is troublesome because the invocation and promotion of science by the present federal administration is a miserable joke. To take one shining example[1] among many, one need only look to the present administration’s environmental policy, which has been far from science based, and the scientists know it. The point here is that merely claiming something is scientific does not strip away the political and, yes, ideological aims of any research. So, I would assert that if the administration thought it could achieve its aim of undercutting the public schools through qualitative research, then the administration would promote such research and undoubtedly would claim it was more scientific. Obviously, allegations of ideological research can go both ways.
An example of ideology getting in the way of the facts comes, not so coincidentally, from a piece Hess mentioned in one of his attacks on AERA. Kevin Welner and Alex Molnar (2007) relay the achievement of The Program for Education Policy and Governance at Harvard University in receiving the “Damned Lies Award for Statistical Subterfuge” (para. 14). Welner and Molnar describe the reason for this infamous honor:
The Harvard report, however, deserves special recognition. Dissatisfied with the work of other researchers, who found private school to have worse academic results than public schools when educating comparable students, the authors of the report offered an alternative model using, at best, tangentially related statistics that failed to factor in the student demographic differences that were supposedly at the core of the analysis. (para. 15)
My rationale for citing this piece is that it illustrates that severing ideological ties is not as easy as Eckardt seems to think. Statistics don’t lie, but neither does qualitative data. In both cases, readers of research must evaluate the interpretation of the data and not judge the research simply by the methodology used in the project, nor write it off based on the title of the paper. Again, I would assert that all the methodologies one sees at work at AERA have merit and value, even the “conceptual/theoretical” paper (Eckardt, 2007, para. 3) that Eckardt dismisses, with apologies to all the philosophers and historians.
These reservations notwithstanding, Eckardt’s perspective and question (which I will get to shortly) do reveal some large divisions between sections of the academy and public policy circles. This gap is a very real problem, as colleges and universities are places in American society where progressive ideas have often arisen. If there is no connection to the levers of government, it won’t matter what type of research is done at AERA. So this disconnect must be addressed, and it seems this was part of the challenge Hess has been laying down: if you do research that does not translate well into forms that policymakers understand, then they are unlikely to pay you much attention, and they will certainly not be interested in funding your next study.
On this point, Hess and Eckardt are onto something, but to me this is a challenge to the entire educational research community, not just those who do qualitative or mixed methods work. The field as a whole must realize that, unlike our brethren in other academic fields, our research and ideas have at least two audiences: our peers (AERA) and the policymakers who fund the schools we study and the colleges and universities from which we study them. Meeting the challenge of addressing multiple audiences is something I would contend necessitates methodological diversity (not methodological retrenchment), not to mention a greater effort at collaboration across traditional methodological and disciplinary boundaries.
Finally, to answer Eckardt’s closing question: No. Instead, I would argue that the rise of qualitative research can be linked to the fact that strictly quantitative research is not infallible in answering all of the questions that emerge from the American educational system.
1. Other examples would have worked as well, such as the promotion of the Houston/Texas Miracle in educational reform. The Houston/Texas Miracle was touted as data-driven and scientific, but hindsight showed it to be an old-fashioned con job.
Eckardt, N. (2007, November 8). The Prevalence of Qualitative Methodology at AERA’s Annual Meeting and the Potential Consequences. Teachers College Record. Retrieved December 8, 2007, from http://www.tcrecord.org/content.asp?contentid=14741
Hess, F., & LoGerfo, L. (2006, May 11). Silly Season for the School Scholars. Education News. Retrieved December 8, 2007, from http://www.educationnews.org/Commentaries/Silly_Season_for_the_School_Scholars.htm
Hess, F., & Lowe, F. (2007, April 12). The Education Research We Need?: Think tanks just aren’t keeping pace with the professional education research community. National Review Online. Retrieved December 8, 2007, from http://article.nationalreview.com/?q=OTZlYTkyOTZiNTBjZTU4M2JkMzNlNTI5MTM0ZTZlZTM=
Welner, K., & Molnar, A. (2007, February 27). Truthiness in Education. Education Week. Retrieved December 8, 2007, from http://www.edweek.org/ew/section/tb/2007/02/27/1640.html