Re: The Prevalence of Qualitative Methodology at AERA’s Annual Meeting and the Potential Consequences

The "Struggling" Field of Educational Research

Posted By: Dick Schutz on February 5, 2008
Question: “Might the seemingly high portion of qualitative work at AERA be related to the struggles the field of education experiences in attempting to come together and build up a core knowledge domain?”

Answer: Yes

With that question taken care of, we can go on. (Of course, the answer doesn't "take care of" the question, but there is reason to move on.)

Eckardt presents further useful information, largely drawing on Moody’s graphical representations of data illuminating the sociology of the sciences. Education is squashed by Psychology—which is about the way I see it.

The “qualitative” vs. “quantitative” spitting match, like the reading wars, has gone on for decades with a lot of heat but no fire. In each case the rhetorical terms mask the operational matters involved. What is termed “qualitative” in Education is akin to what is termed “clinical” in Psychology, and what is termed “quantitative” is akin to what is termed “experimental” in Psychology. The clinical/qual literature in each profession tends to be miles wide and inches deep, while the experimental/quant literature tends to build.

Experimental/quant inquiry has been the scientific foundation of Psychology. Education has borrowed some from this foundation but has developed no foundation of its own.

The use of “numbers” is an indicator, but it is not the cause of the differences involved or of their effect on the building of a core knowledge domain. Law, it seems to me, is largely “commentary,” and Economics is as much analytic as quantitative. So something more and different than simply the use of quantitative methodology has to be at work.

Education research suffers from being largely “one shot,” with little follow-up by the researchers involved or by anyone else. It also suffers from puny, adventitious databases. Where a solid database (such as NCES’s Early Childhood Longitudinal Study, Kindergarten and Birth Cohorts) is available, it’s virtually ignored by both “policy makers” and “academics.” NCLB would look much different, as would the debate surrounding the legislation, if people were “looking at the data” rather than at a popularized version of a meta-analysis termed the “science of reading.”

The experimental/quant side of educational research is not without “problems.” Treating “science” in education as limited to randomized controlled experiments is scientifically ignorant. Yet that’s the world we’re living in. Couple that with the dismissal of the D in R&D, and “there’s trouble, deep trouble in ‘Education City.’” Getting out of the predicament will require “culture building,” but we’re more likely to get “76 Trombones and a big parade” instead.

I'm grateful to Eckardt for raising and for further pursuing the question. The matter deserves much wider attention.

Dick Schutz
