
The Prevalence of Qualitative Methodology at AERA’s Annual Meeting and the Potential Consequences

by Neil Eckardt - November 08, 2007

Last spring, thousands of graduate students and researchers descended upon Chicago for the annual meeting of the American Educational Research Association (AERA). For the past two years, Rick Hess has offered some biting commentary on AERA. In 2006, in a piece that appeared under two different titles (“Silly Season for the School Scholars” and “Chicanas from Outer Space”), Frederick Hess and Laura LoGerfo argued that AERA members do not focus enough on “analyzing public policy, improving teaching and learning, and addressing the practical concerns of parents and teachers.” In contrast, they claimed, the education research profession grants a prominent place to scholarship that “promotes narrow values and spouts incomprehensible nonsense,” while considering work on questions like student achievement boring (Hess & LoGerfo, 2006). This year Hess was at it again, teaming up with Francesca Lowe on a similar critique in a widely publicized piece entitled “The Education Research We Need?” Here the central argument was that a great deal of educational research is “ideological, frivolous, poorly executed, and jargon-laden” (Hess, 2007).

An important aspect of the Hess et al. commentaries, one that is implicit in the 2007 piece and explicit in the 2006 piece, is that too many studies at AERA suffer from very small sample sizes and are not conducted systematically. Some AERA studies, as Hess notes, even have a sample size of one. This aspect is important and deserves greater attention, particularly in light of ongoing federal efforts to improve the scientific quality of education research. Such efforts have been a consistent subject of debate at AERA and within education journals since Congress passed the Education Sciences Reform Act in 2002, thereby creating the Institute of Education Sciences within the U.S. Department of Education. But debate within and around AERA rarely draws on empirical data capable of showing the actual distribution of education research methodologies. It fails, for example, to address the actual proportions of qualitative research (work that seeks to depict the full complexity of particular, concrete situations) and quantitative research (work that uses statistics to determine the descriptive characteristics and explanatory mechanisms at play in large samples) across the field.

Data from the annual meeting can shed light on the distribution of research methodologies at AERA. For example, data obtained from AERA for the 2006 meeting in San Francisco reveal that of the 8,612 papers submitted, 28.7% used exclusively quantitative methods, 23.2% used a combination of “mixed” methods, 37.8% used exclusively qualitative methods, and 10.4% took a “conceptual/theoretical” approach in which no original data were presented. These percentages are remarkably similar (30.4%, 23.8%, 37.9%, and 8.0%, respectively) for the 4,550 papers accepted and presented at the conference.

This distribution of methodologies is likely to be interpreted differently by different people within AERA. Some will claim that the distinctions drawn between methodologies are spurious or too broad. Indeed, better, more descriptive data would be useful. But the distinctions nonetheless carry significant weight: the categories are defined by AERA’s meeting committees and self-reported by researchers during paper submission. Every year AERA charges a special Annual Meeting Committee with determining the research methodologies and subject keywords offered as descriptors to those making submissions. Anyone who has submitted to AERA will be familiar with this part of the process.

These data tell us something important about the supply side of education research, insofar as they describe the research methods of presented papers. They tell us that qualitative research is the most popular research methodology in the field. They tell us that fewer than 1 in 3 research papers use a strictly quantitative methodology, and that roughly 7 of every 10 research papers use one of the following methodologies: qualitative, “mixed method,” or “conceptual/theoretical.” One has to question this disproportionate share of qualitative research and whether it positions the field well for scientific progress.

Strictly qualitative studies fail to operationalize and systematically treat the multitude of factors likely to have an impact on phenomena. Studies that can be characterized as qualitative, “mixed method,” or “conceptual/theoretical” also too often culminate as “one and done” investigations. Strictly qualitative studies tend to be difficult to replicate, and are sometimes so procedurally esoteric as to be incomprehensible outside of a very small clique of scholars. This does not bode well for developing applied scientific knowledge of educational phenomena, and would seem to legitimate current federal efforts to transform the culture of education research.

With these data in hand, one begins to wonder whether the high proportion of qualitative research studies in education is related to the lack of intellectual cohesion across the field. It seems to me that any reasonable person, irrespective of their personal methodological orientation, would have to concede that qualitative research studies are far more difficult to write up and convey than quantitative studies, which places them at a disadvantage in terms of communicating findings and engendering scientific clarity and collaboration. Similarly, these data raise the question of whether the high proportion of qualitative studies is related to the field’s inability to build up a shared, common domain of knowledge or even an accepted, jargon-free discourse. To the extent that qualitative studies are hard to replicate, they are also detrimental to the development of accepted concepts and shared ideas. In formidable sciences, a prevalence of quantitative or mathematical inquiry is far more conducive to co-authorship, replication, and citation. These practices are critical to the generation of accepted core ideas, which, in turn, are critical to the establishment and legitimation of a knowledge domain.

Evidence obtained from AERA’s 2006 Annual Meeting seems to raise an important question, which I encourage the field to take seriously: Might the seemingly high proportion of qualitative work at AERA be related to the struggles the field of education experiences in attempting to come together and build up a core knowledge domain?


Hess, F. M., & LoGerfo, L. (2006). Chicanas from outer space. National Review Online. Retrieved October 31, 2007, from http://article.nationalreview.com/?q=ZDYwOGExMmUxOWY0ZDgxNGQxMGEwZjg4NTNhMzQ2M2M=

Hess, F. M. (2007). The education research we need. Education News. Retrieved October 31, 2007, from http://www.ednews.org/articles/10473/1/An-Interview-with-Frederick-Hess-The-Education-Research-We-Need-And-why-we-dont-have-it/Page1.html

Cite This Article as: Teachers College Record, Date Published: November 08, 2007. https://www.tcrecord.org, ID Number: 14741.


About the Author
  • Neil Eckardt
    Teachers College, Columbia University
    NEIL ECKARDT is a 2007-2008 Research Fellow of the Office of Policy & Research and a Ph.D. Candidate in the Politics & Education Program at Teachers College, Columbia University. He is also the Director of Analytics for a network of KIPP Schools in Newark, NJ. His doctoral research offers a systematic, empirical examination of the impact that political ideology has on contemporary educational research in the United States.