Colleges: Making the Grade?
by Karen Gift & Thomas Gift - April 13, 2012
Despite frequent complaints that U.S. News has poisoned the college admissions process, schools have done precious little to combat the rankings infatuation. If higher education is serious about reversing the negative consequences (and perverse incentives) that stem from the one-number-fits-all U.S. News craze, it needs to devise a viable alternative. The best solution is to track student performance, from the first hour students walk in the door to years after they earn a diploma, and then make this information readily accessible. This would allow for meaningful comparisons of schools on the metrics that matter most to prospective freshmen and their families.
For the past 12 months (and much of the past 12 years), America's most ambitious high school seniors have obsessed over the C-word. Taking (and retaking) the SATs. Stuffing their resumes with extracurriculars. Sharpening their personal statements. Searching for that x-factor (founding a nonprofit, anyone?) that will mean the difference between a mailbox full of glossy admittance packets and a small stack of "thanks, but no thanks" form letters, typed in perfectly impersonal prose. But as May Day (and decision deadlines) looms, the tables have turned. It's now the students asking, "Is this college good enough for me?"
As graduate students at Duke University, we've watched an ever more frenzied parade of eager prospective freshmen (parents in tow) arrive on campus. Never mind that they've been admitted to one of the country's finest universities. The anxiety on the quad is palpable. For some, the appeal of a great engineering program or the chance to be a Cameron Crazy will be enough to sell them on Duke. For others, the lure of the Ivy League, a sophisticated urban metropolis, or a plum financial aid package will send them packing elsewhere.
For most, though, the red carpet treatment (campus visits, alumni phone calls, free Nalgene bottles) is probably more pomp than circumstance. The endgame, after all, is often as predictable as it is anticlimactic: Just turn to the talismanic U.S. News & World Report Best Colleges rankings. Columbia eclipses Duke this year? Headed for the Big Apple. Duke beats out Brown? Hello, Durham, North Carolina.
We've all heard the horror stories. Christopher Nelson, president of St. John's College in Annapolis, Maryland, says there is "real evil" in the annual ranking of colleges. Hyperbole leads, as it often does, to angst. In 2007, 124 selective liberal arts colleges, including the likes of Williams, Middlebury, and Hamilton, banded together to express their displeasure. Self-dubbed the Annapolis Group, they pledged "not to mention U.S. News . . . in any of [their] new publications," since such lists mislead the public into thinking that the complexities of American higher education can be reduced to one number.
Colleges love to decry college rankings as unscientific or without merit, even as they (not so surreptitiously) maneuver to climb the page. But despite the finger-pointing, colleges have done precious little to combat the rankings infatuation. If higher education is serious about reversing the negative consequences (and perverse incentives) that stem from the one-number-fits-all U.S. News craze, it needs to devise a viable alternative.
The best solution would start with addressing two questions that U.S. News hasn't dared broach (at least at the undergraduate level): What the heck are students learning in college, anyway? And what happens to them after they graduate?
The answers can't be captured directly in raw figures on alumni giving, peer reputation, or financial resources. But this is exactly what students and parents deserve to know, whether they're plunking down 5,000 bucks per year to enroll in a junior college or more than 50,000 at Duke.
So how do colleges fare in providing this information? We decided to investigate our beloved alma maters to find out.
One school, Washington & Lee University, offers both freshman and senior survey data on its website. It also posts online its Collegiate Learning Assessment results, a systematic attempt to measure student gains in analytical writing and argumentation (measured by testing selected students as both freshmen and seniors).
But the other universities are less transparent.
Duke, for example, conducts a senior exit survey asking students to self-report gains in 25 areas associated with a liberal education, but does not make the results publicly available, citing confidentiality concerns of its institutional review board. (Self-reports, moreover, are indirect measures of gains and may reflect students' perception of their education rather than its real merit. A better, although inevitably more expensive, measure would pretest at matriculation and posttest at graduation.) The school also administers alumni surveys to cohorts 5, 10, and 20 years after graduation to evaluate how their education affected their lives, further schooling, and careers. On email request, generic figures on alumni enrollment in various graduate programs were made available, but with no comparison with peer schools.
Yale University also makes general data on alumni activities available on its website. But it notes that "only office of institutional research staff have access to [student] survey data." Privacy is a serious matter. But couldn't the data be anonymized?
Unfortunately, our anecdotes appear symptomatic of a larger trend: little data and, even more worryingly, hints of little learning, too.
In a 2011 study, Andrew Kelly of the American Enterprise Institute and Kevin Carey of Education Sector investigated scores of four-year colleges and universities and found that "the large majority of colleges are in total noncompliance" with some of the most widely cited provisions of the Higher Education Opportunity Act (HEOA), which was designed to track student employment and opportunity postgraduation. And "even when colleges [were] technically in compliance with the law . . . much of the information [was] all but useless in informing consumer choice." Out of 152 colleges that Kelly and Carey surveyed, only 73 had functioning web pages designed to provide the required HEOA disclosures.
All this might be moot, of course, if there were compelling evidence that most colleges and universities were successfully educating students. But according to NYU's Richard Arum, whose book Academically Adrift: Limited Learning on College Campuses hit shelves last year, students aren't getting much bang for their buck. Arum followed 2,322 traditional-age students from the fall of 2005 to the spring of 2009, from the highly selective to the less selective. He found that 45% of students made no significant improvement in their critical thinking, reasoning, or writing skills during the first two years of college. After four years, 36% showed no significant gains in these so-called "higher order" thinking skills.
Not only do colleges and universities need to gather more data, they must find better ways to synchronize it, allowing for meaningful and consistent comparisons. U.S. News, for all its flaws, was the first to compile basic information on colleges, such as average SAT scores and statistics on average class sizes, in a single encyclopedic format. But colleges need to go well beyond these metrics, evaluating criteria such as analytical and critical thinking skills; knowledge of particular subject matters and disciplines; and clarity and concision of expression. Does studying some disciplines develop skills more quickly? How much gain is made during the first year, compared with years two, three, and four?
Quantifying these issues won't be easy, but in the Age of Big Data, and for an industry that adopted the SAT, PSAT, ACT, and a battery of other metrics, we're cautiously optimistic. College courses, after all, from accounting to astrophysics, routinely hammer home the importance of meticulously tracking and evaluating data. It's how we measure progress and how we assess success. Colleges and universities should pay attention: it's how to make the grade.