Copyright 2006 The Chronicle via U-Wire
University Wire
September 1, 2006 Friday
HEADLINE: College ranking formulas vary by magazine
BYLINE: By Meg Bourdillon, The Chronicle; SOURCE: Duke
DATELINE: DURHAM, N.C.
BODY:
Number of students in the Reserve Officer Training Corps? Percentage of international faculty? Food quality? Not one appears in the formula U.S. News and World Report uses to rank colleges, but all factor into at least one other media outlet's ranking methodology.
Duke University came in eighth in this year's U.S. News ranking, probably the best-known of the bunch, but other publications' differing evaluation criteria lead to variation in Duke's placement.
Published lists of top colleges each reflect distinctive evaluation systems, designed to condense disparate data on students, professors and endowments into a simple, numerical ranking.
U.S. News scores national universities and liberal arts colleges on seven weighted categories of measures: peer assessment, student selectivity, faculty resources, graduation and retention rates, financial resources, alumni giving and graduation rate performance, which is the difference between actual and predicted graduation rates.
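In rough terms, a composite of this kind works like a weighted average: each category is scored on a common scale, multiplied by its weight, and the products are summed. The sketch below illustrates that arithmetic only; the category names follow this article, but the weights and sample scores are invented placeholders, not U.S. News' actual figures.
```python
# Illustrative sketch of a weighted-composite ranking score.
# Category names follow the article; the weights and sample scores
# below are invented for demonstration, not U.S. News' real figures.

weights = {
    "peer_assessment": 0.25,
    "student_selectivity": 0.15,
    "faculty_resources": 0.20,
    "graduation_and_retention": 0.20,
    "financial_resources": 0.10,
    "alumni_giving": 0.05,
    "graduation_rate_performance": 0.05,
}

def composite_score(category_scores: dict[str, float]) -> float:
    """Combine 0-100 category scores into one weighted composite."""
    return sum(weights[c] * category_scores.get(c, 0.0) for c in weights)

# Hypothetical inputs for one school, each on a 0-100 scale.
example = {
    "peer_assessment": 92,
    "student_selectivity": 95,
    "faculty_resources": 88,
    "graduation_and_retention": 96,
    "financial_resources": 90,
    "alumni_giving": 60,
    "graduation_rate_performance": 80,
}

# Prints the school's weighted composite on the same 0-100 scale.
print(f"{composite_score(example):.1f}")
```
Under any formula of this shape, the final order depends as much on the chosen weights as on the underlying data -- a point critics of the rankings return to below.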
"The method has been unchanged for the last four years and for three years prior to that," said Robert Morse, director of data research for U.S. News.
Year-to-year movements in the rankings are, therefore, rarely due to shifts in the magazine's methodology.
Morse explained that the newsweekly's analysts choose and weight scoring criteria based on regular discussions with experts in higher education and data analysis.
"We're not using a scientific system -- whatever that may mean, exactly -- to come up with the rankings," Morse said.
Scientific or not, scholars say students pay significant attention to the numbers.
An article in the Nov./Dec. 1999 issue of Change magazine showed the impact of colleges' U.S. News rankings on their selectivity and tuition. According to the article, a jump from 10th place to 6th place corresponds to a 5.5-point increase in average SAT scores.
"People have become obsessed with the U.S. News and World Report rankings," said Ronald Ehrenberg, director of Cornell Higher Education Research Institute and co-author of the article.
HOW THE RANKINGS WORK
Data on colleges comes from a variety of sources. Some of U.S. News' criteria are based on information from the federal government, and the magazine surveys colleges on other measures it has created, such as the percentages of classes with 50 or more students and with fewer than 20 students.
Morse called the current ranking "vastly better" than the methodology the magazine used when it first published college rankings in 1983. He explained that editors meet with a committee of admissions deans and a panel of institutional researchers at least once a year to make sure the scoring reflects current, expert thinking about higher education.
Changes to the methodology, however, are only made when deemed necessary.
Over time, the magazine has shifted the focus from inputs, such as entering students' SAT scores, to outputs, such as graduation rates, Morse explained.
Revisiting the rankings every year is also standard practice at U.S. News' competitors. There are innumerable annually published guidebooks, each with its own system.
For example, the Princeton Review's "The Best 361 Colleges" uses student surveys to rate universities on a variety of nontraditional criteria, such as "Reefer Madness" or "Is it Food?"
Criteria used by Washington Monthly may seem just as unfamiliar. The magazine's formula includes three equally weighted components: social mobility, research and community service. The community service score reflects participation in ROTC and the Peace Corps, as well as use of federal work-study grants for service projects.
This formula generates a list quite unlike U.S. News' top 100. Four University of California schools were among the top 10 in 2006, and the Massachusetts Institute of Technology placed first. Duke came in 23rd, beating 28th-ranked Harvard University -- a perennial rankings leader in other published lists.
Tom Frank, consulting editor for Washington Monthly, explained that the ranking is unique in focusing solely on how much good colleges are doing for society.
"If colleges feel a need to live up to these rankings ... then the effects will be positive for the country," Frank said. "This guide is probably not going to be your first choice for deciding where to go to school."
One group of academics proposed that rankings should score colleges' selectivity via "revealed preference," a system that captures students' choices when they are admitted to competing universities. Such a measure would discourage colleges from manipulating admissions numbers such as acceptance rates and yield, the researchers argued in a September 2004 working paper for the National Bureau of Economic Research.
The scholars said revealed preference data is not generally available, but they surveyed seniors in the high school class of 2000 to construct a preliminary ranking.
Their model ranked Duke 19th, and first place again went to Harvard.
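The revealed-preference idea can be illustrated with a deliberately simplified scheme: treat each student admitted to two or more schools as deciding a set of head-to-head "matches," with the school the student actually attends beating each school the student turned down, and rank schools by winning percentage. The records in the sketch below are invented, and the researchers' working paper fits a more sophisticated tournament-style statistical model rather than tallying raw win rates; this is only an illustration of the underlying idea.
```python
from collections import defaultdict

# Invented example records: the schools that admitted one student,
# and the school that student chose to attend.
admit_choices = [
    ({"Harvard", "Duke", "MIT"}, "Harvard"),
    ({"Harvard", "Duke"}, "Harvard"),
    ({"Harvard", "Duke"}, "Duke"),
    ({"Duke", "Berkeley"}, "Duke"),
    ({"MIT", "Berkeley"}, "MIT"),
]

wins = defaultdict(int)      # head-to-head matches won
contests = defaultdict(int)  # head-to-head matches played

for admitted, chosen in admit_choices:
    # The chosen school "beats" every other school that admitted the student.
    for declined in admitted - {chosen}:
        wins[chosen] += 1
        contests[chosen] += 1
        contests[declined] += 1

# Rank by winning percentage; every school in this toy data has at
# least one contest, so the division is safe. Harvard comes out first here.
ranking = sorted(contests, key=lambda s: wins[s] / contests[s], reverse=True)
for school in ranking:
    print(school, round(wins[school] / contests[school], 2))
```
Because the signal comes from enrollment decisions students actually make, rather than from acceptance rates or yield figures a college can manage, such a measure is harder for institutions to manipulate -- the advantage the researchers cite.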
LOOKING OVERSEAS
Major media sources also generate research-centered rankings of universities from around the world, not just the United States.
Newsweek published a list in August of the "Top 100 Global Universities." The magazine's system factored in "openness and diversity," as measured by the percentages of students and faculty of international origin, in addition to published articles, faculty citations and student-faculty ratios.
In the Newsweek ranking, Duke placed 14th internationally and 12th among U.S. universities, while Harvard scored highest.
A similar ranking published in October 2005 by the London-based Times Higher Education Supplement also ranked Harvard first worldwide. Now in its second annual edition, the ranking put Duke 11th in the world and eighth in North America.
TO RANK OR NOT TO RANK
Differences in rankings and the methodologies behind them reflect a lack of expert agreement on how to evaluate university performance.
Ehrenberg said he would prefer to see the media publicize data on colleges without putting them in numerical order, since the weightings assigned to specific criteria are arbitrary. "It's a difficult thing, because having the information out there is very useful to students," Ehrenberg noted. "The only real problem is trying to summarize the data in one number."
Despite such criticisms, U.S. News' annual rankings capture America's attention upon their publication every August.
"If our system has so many weaknesses," Morse said, "why has it become the standard external benchmark for comparative measurement in higher education?"