Colleges: Making the Grade?


by Karen Gift & Thomas Gift - April 13, 2012

Despite frequent complaints that U.S. News has poisoned the college admissions process, schools have done precious little to combat the rankings infatuation. If higher education is serious about reversing the negative consequences—and perverse incentives—that stem from the one-number-fits-all U.S. News craze, it needs to devise a viable alternative. The best solution is to track student performance—from the first hour students walk in the door to years after they earn a diploma—and then make this information readily accessible. This would allow for meaningful comparisons of schools on the metrics that matter most to prospective freshmen and their families.

For the past 12 months (and much of the past 12 years), America’s most ambitious high school seniors have obsessed over the C-word. Taking (and retaking) the SATs. Stuffing their resumes with extracurriculars. Sharpening their personal statements. Searching for that x-factor—founding a nonprofit, anyone?—that will mean the difference between a mailbox full of glossy admittance packets and a small stack of “thanks, but no thanks” form letters, typed in perfectly impersonal prose. But as “May Day”—and decision deadlines—loom, the tables have turned. It’s now the students asking, Is this college good enough for me?


As graduate students at Duke University, we’ve watched an ever more frenzied parade of eager prospective freshmen (parents in tow) arrive on campus. Never mind that they’ve been admitted to one of the country’s finest universities. The anxiety on the quad is palpable. For some, the appeal of a great engineering program or the chance to be a “Cameron Crazy” will be enough to sell them on Duke. For others, the lure of the Ivy League, a sophisticated metropolis, or a plum financial aid package will send them packing elsewhere.


For most, though, the red carpet treatment—campus visits, alumni phone calls, free Nalgene bottles—is probably more pomp than circumstance. The endgame, after all, is often as predictable as it is anticlimactic: Just turn to the talismanic U.S. News & World Report “Best Colleges” rankings. Columbia eclipses Duke this year? Headed for the Big Apple. Duke beats out Brown? Hello, Durham, North Carolina.


We’ve all heard the horror stories. Christopher Nelson, president of St. John’s College in Annapolis, Maryland, says there is “real evil” in the annual ranking of colleges. Hyperbole leads, as it often does, to angst. In 2007, 124 selective liberal arts colleges—including the likes of Williams, Middlebury, and Hamilton—banded together to express their displeasure. Self-dubbed “the Annapolis Group,” they pledged “not to mention U.S. News . . . in any of [their] new publications, since such lists mislead the public into thinking that the complexities of American higher education can be reduced to one number.”


Colleges love to decry the rankings as “unscientific” or “without merit,” even as they (not so surreptitiously) maneuver to climb them. But despite the finger-pointing, colleges have done precious little to combat the rankings infatuation. If higher education is serious about reversing the negative consequences—and perverse incentives—that stem from the one-number-fits-all U.S. News craze, it needs to devise a viable alternative.


The best solution would start with addressing two questions that U.S. News hasn’t dared broach (at least at the undergraduate level): What the heck are students learning in college, anyway? And what happens to them after they graduate?


The answers can’t be captured directly in raw figures on alumni giving, peer reputation, or financial resources. But they’re exactly what students and parents deserve to know, whether they’re plunking down $5,000 a year at a junior college or more than $50,000 at Duke.


So how do colleges fare in providing this information? We decided to investigate our beloved alma maters to find out.


One school, Washington & Lee University, offers both freshman and senior survey data on its website. It also posts online its results on the “Collegiate Learning Assessment,” a systematic attempt to measure student gains in analytical writing and argumentation by testing selected students as both freshmen and seniors.


But the other universities are less transparent.


Duke, for example, conducts a senior exit survey asking students to self-report gains in 25 areas associated with a liberal education—but it does not make the results publicly available, citing confidentiality concerns of its institutional review board. (Self-reports, moreover, are indirect measures of gains and may reflect students’ perceptions of their education rather than its real merit. A better, though inevitably more expensive, measure would pretest students at matriculation and posttest them at graduation.) The school also administers alumni surveys to cohorts 5, 10, and 20 years after graduation to evaluate how their education affected their lives, further schooling, and careers. When we requested data by email, Duke provided generic figures on alumni enrollment in various graduate programs—but no comparisons with peer schools.


Yale University also makes general data on alumni activities available on its website. But it notes that only Office of Institutional Research staff “have access to [student] survey data.” Privacy is a serious matter. But couldn’t the data be anonymized?


Unfortunately, our anecdotes appear symptomatic of a larger trend: little data and, even more worryingly, hints of little learning, too.


In a 2011 study, Andrew Kelly of the American Enterprise Institute and Kevin Carey of Education Sector investigated scores of four-year colleges and universities and found that “the large majority of colleges are in total noncompliance with some of the most widely cited provisions” of the Higher Education Opportunity Act (HEOA), which was designed to track student employment and opportunity postgraduation. And “even when colleges [were] technically in compliance with the law . . . much of the information [was] all but useless” in “informing consumer choice.” Out of 152 colleges that Kelly and Carey surveyed, only 73 had functioning web pages designed to provide the required HEOA disclosures.


All this might be moot, of course, if there were compelling evidence that most colleges and universities were successfully educating students. But according to NYU’s Richard Arum, whose book with Josipa Roksa, Academically Adrift: Limited Learning on College Campuses, hit shelves last year, students aren’t getting much bang for their buck. Arum and Roksa followed 2,322 traditional-age students from the fall of 2005 to the spring of 2009, at institutions ranging from the highly selective to the less selective. They found that 45% of students made no significant improvement in their critical thinking, reasoning, or writing skills during the first two years of college. After four years, 36% showed no significant gains in these so-called higher-order thinking skills.


Not only do colleges and universities need to gather more data, they must find better ways to standardize it, allowing for meaningful and consistent comparisons. U.S. News, for all its flaws, was the first to compile basic information on colleges—such as average SAT scores and average class sizes—in a single encyclopedic format. But colleges need to go well beyond these metrics, evaluating criteria such as analytical and critical thinking skills; knowledge of particular subjects and disciplines; and clarity and concision of expression. Does studying some disciplines develop skills more quickly? How much gain is made during the first year, compared with years two, three, and four?
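
To make the idea concrete, here is a minimal sketch (ours alone, not any school's actual methodology) of the kind of apples-to-apples calculation we have in mind. The school names and scores are invented, and the metric itself, average gain scaled by the spread of incoming scores, is just one plausible choice among many.

```python
# Hypothetical sketch: compare schools on a standardized learning gain,
# given pretest (matriculation) and posttest (graduation) scores for the
# same students. All names and numbers below are invented for illustration.

from statistics import mean, stdev

scores = {
    "College A": {"pretest": [1050, 1110, 980, 1200, 1075],
                  "posttest": [1180, 1235, 1090, 1310, 1170]},
    "College B": {"pretest": [1060, 1125, 990, 1210, 1080],
                  "posttest": [1095, 1150, 1010, 1240, 1105]},
}

def standardized_gain(pretest, posttest):
    """Average per-student gain, divided by the spread of incoming
    scores so that schools with different intakes share one scale."""
    gains = [post - pre for pre, post in zip(pretest, posttest)]
    return mean(gains) / stdev(pretest)

for school, s in scores.items():
    print(f"{school}: standardized gain = "
          f"{standardized_gain(s['pretest'], s['posttest']):.2f}")
```

Scaling by the incoming spread is one crude way to keep a school that simply admits stronger freshmen from looking better than a school that actually teaches more.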


Quantifying these issues won’t be easy, but in the Age of Big Data—and for an industry that adopted the SAT, PSAT, ACT, and a battery of other standardized tests—we’re cautiously optimistic. College courses, after all, from accounting to astrophysics, routinely hammer home the importance of meticulously tracking and evaluating data. It’s how we measure progress—and how we assess success. Colleges and universities should pay attention—it’s how to make the grade.






About the Author
  • Karen Gift, Duke University School of Law
    KAREN GIFT is a third-year law student at Duke University School of Law and a graduate of Yale University.
  • Thomas Gift, Duke University
    THOMAS GIFT is a doctoral candidate in political science at Duke University and a graduate of Washington & Lee University.