The Best, the Top, the Most
NYT, August 3, 2003
By NICHOLAS THOMPSON
U.S. NEWS & WORLD REPORT ranks Columbia University as the
10th best university in the country. The Princeton Review
gives its food good scores but says its bureaucracy is the
nation's 17th worst. Columbia also has the third, sixth or
10th best business school, depending on whether you believe
The Financial Times, Business Week or The Wall Street
Journal (which just a year earlier called the school, with
mostly the same faculty, same library and many of the same
students, the 34th best).
America is fascinated with competitive lists, and by
combining that obsession with higher education's
status-defining role, the assorted ranking systems have
become a highly lucrative and influential industry. U.S.
News sells hundreds of thousands of copies of its annual
college and graduate school guides. The day after
announcing its 2003 rankings last October, Business Week's
Web site received two million hits. And new rankings hit
the market every year. This year, Entrepreneur magazine and
Seventeen inaugurated their own. (Rice is the ''coolest
school in the land'' in part for its proximity to great
shopping and cute boys.)
Be they whimsical or dead serious, assessments of colleges
and universities are always controversial. The core
criticism is that it's impossible to quantify something as
complicated as a university. Every September when U.S. News
announces its results, a chorus of pundits weighs in with
variations on the aphorism supposedly tacked to Einstein's
office wall: ''Not everything that counts can be counted,
and not everything that can be counted counts.''
''Rankings give a false sense of the world and an
inauthentic view of what a college education really is,''
says Lee C. Bollinger, president of Columbia, adding that
they reflect and contribute to ''a steadily rising level of
competitiveness and anxiety among young people about
getting into the right college.''
Bruce Hunter, chief college counselor at Rowland Hall-St.
Mark's, a private school in Salt Lake City, says he stands
in front of the student body every fall and tears the
rankings pages from his U.S. News guidebook for just this
reason.
Different systems can approximate a campus's wealth,
credentials of the people who come to it, how much students
think they learn and what they do afterward. But no one has
come up with a formula for combining these factors that
satisfies educators. Even if someone did, students care
about the various components to different degrees.
''You can't quantify the important things,'' says Edward B.
Fiske, author of ''The Fiske Guide to Colleges.'' Instead
of rankings, the guide relies on subjective analysis, which
is something Mr. Fiske is proud of. ''We do not use
arbitrary formulas as a matter of principle, nor do we
operate with any gnostic sorts of information,'' he says.
Nonetheless, in this world of ''bests'' and ''mosts,'' the
Fiske guide includes 43 ''Best Buys'' (in Mr. Fiske's
opinion).
Each of the rankings uses a distinct methodology to
quantify some element of the college experience for the
prospective consumer. Understanding how they work can help
students decide if a given ranking says something worth
knowing.
THE BIG BOY
U.S. NEWS & WORLD REPORT
If there were a ranking of the rankings based on influence,
the winner would undoubtedly be U.S. News's ''America's
Best Colleges.'' Selling out in bookstores since its
arrival in 1983, U.S. News is the New York Yankees of the
field: the richest, most followed and most disliked. The
media report the results each September, and colleges
routinely crow when they do well, even in obscure
categories -- Northeastern University in Boston hung a
banner on campus last fall announcing its success in the
''internships/co-ops'' ranking.
Eugene P. Trani, president of Virginia Commonwealth
University, carries a card in his briefcase listing his
presidential priorities for the next five years. The top
one: becoming a tier-two university. The trustees have
promised him a $25,000 raise if the university jumps a tier
under his watch. Dr. Trani insists he will not change the
nature of the university to improve in rankings. But, he
says, ''you have to accept the fact that they're here.'' He
notes that they clearly matter to his alumni, the board and
the politicians who set his budget.
While there were other considerations, ''U.S. News was one
factor'' tipping the scales in favor of an admissions
policy change at Hamilton College, says Richard M. Fuller,
who retired in May after almost a decade as admissions
dean. Hamilton, a highly ranked liberal arts college in
Clinton, N.Y., recently adopted an early-decision
admissions program and made the submission of SAT scores
optional. Low-scoring students might well withhold their
SAT scores, improving the college's averages, and binding
early-decision programs have traditionally increased an
institution's score in the category of yield -- the
percentage of accepted students who actually go to an
institution. But this summer, acknowledging that colleges
thought they could improve their rankings by manipulating
their yield, U.S. News eliminated the category.
The magazine's executive editor, Brian Kelly, says that in
reality yield was a small factor in the rankings. He notes
that some categories, like freshman retention rates and
percentage of alumni donors, encourage positive behavior.
''There's a lot of obsessions,'' he says. ''If a school is
doing something merely to look good in the rankings, it is
behaving unethically.''
Initially, U.S. News relied on a simple opinion survey.
Twenty years of steady evolution later, it now has a
formula that could flummox a statistics major. The magazine
gathers raw data on factors ranging from average class size
to expected graduation rate to reputation, as judged by a
survey of university presidents, provosts and admissions
deans. The information is turned into rankings through a
formula that gives subjective weights to each factor, as
determined by the magazine's editors. For example, the
proportion of professors with the highest degrees in their
fields matters three times as much as student-faculty ratio
but a bit less than half as much as average faculty salary,
adjusted for regional differences in living expenses.
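In spirit, the calculation is a weighted sum: normalize each factor to a common scale, multiply by an editor-chosen weight and add. The following is a minimal sketch of that kind of formula; the factor names, weights and scores are hypothetical stand-ins chosen only to mirror the proportions described above, not the magazine's actual, unpublished formula.

    # Hypothetical weighted-sum ranking in the style described above.
    # Weights and scores are illustrative; this is not U.S. News's formula.
    WEIGHTS = {
        "peer_assessment": 0.25,
        "graduation_rate": 0.20,
        "faculty_salary": 0.13,       # adjusted for regional living costs
        "terminal_degrees": 0.06,     # 3x the ratio weight, a bit under half the salary weight
        "student_faculty_ratio": 0.02,
        "alumni_giving": 0.05,
        "other_factors": 0.29,
    }  # weights sum to 1.0

    def composite(scores):
        """Combine 0-100 factor scores into one weighted composite."""
        return sum(WEIGHTS[f] * scores.get(f, 0.0) for f in WEIGHTS)

    # Made-up schools, purely for illustration.
    schools = {
        "College A": {"peer_assessment": 90, "graduation_rate": 85,
                      "faculty_salary": 80, "terminal_degrees": 95,
                      "student_faculty_ratio": 70, "alumni_giving": 60,
                      "other_factors": 75},
        "College B": {"peer_assessment": 82, "graduation_rate": 92,
                      "faculty_salary": 88, "terminal_degrees": 85,
                      "student_faculty_ratio": 90, "alumni_giving": 40,
                      "other_factors": 70},
    }

    ranked = sorted(schools, key=lambda s: composite(schools[s]), reverse=True)
    for place, name in enumerate(ranked, start=1):
        print(place, name, round(composite(schools[name]), 1))

Small changes to either the weights or the reported numbers can reorder schools sitting near a tier boundary, which is the sensitivity critics of the formula keep returning to.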
Wealth (endowment spending, alumni giving) figures
significantly, Mr. Kelly says. ''But our rankings formula
shows that we don't mainly measure wealth. It is one of a
series of factors that include the academic standing of
students, student satisfaction after freshman year, peer
assessment. We see one wealth indicator -- alumni
contributions -- as a partial proxy for student
satisfaction and accomplishment. Students who had a good
experience tend to give more than those who didn't.''
With these factors, it's easy to guess who's going to end
up on top: Harvard, Yale and Princeton round out the first
three essentially every year. In fact, when asked how he
knew his system was sound, Mel Elfin, the rankings'
founder, often answered that he knew it because those three
schools always landed on top. When a new lead statistician,
Amy Graham, changed the formula in 1999 to what she
considered more statistically valid, the California
Institute of Technology jumped to first place. Ms. Graham
soon left, and a slightly modified system pushed Princeton
back to No. 1 the next year. (In 2001, Ms. Graham and I
co-wrote an article dissecting the U.S. News rankings for
Washington Monthly, and as an undergraduate at Stanford in
1996, I helped found a student group that was critical of
rankings.)
U.S. News works diligently to ensure the rigor of the
numbers it crunches. As a result, many in academia credit
the magazine with forcing colleges to standardize the way
they record and report information. Few challenge the
contention by Mr. Kelly that ''with data collection, we
believe that what we do is basically state of the art.''
But the system still has flaws. For one, surveys can be an
imprecise science. University presidents and senior
administrators may not know a whole lot about every school
they are asked to evaluate, and they may be biased: 7
percent of the presidents surveyed in 2001 admitted to
deliberately downgrading peers, according to a poll by the
Association of Governing Boards of Universities and
Colleges.
In addition, with much of the data submitted by the
colleges themselves, institutions can shade their
information, though U.S. News says it double-checks
everything it can. Nonetheless, errors slip in. For
instance, the 2003 rankings report that one in every four
students at Manhattanville College in Purchase, N.Y.,
scored higher than 1500 on the SAT. That's not true: the 75th-percentile score was just 1150. Manhattanville's president,
Richard A. Berman, says he has no idea how the
misinformation got to U.S. News and expresses relief that
the error was not the reason the college leaped in the
current rankings from the second tier to the bottom of the
first. (Only the mean SAT score is factored in to determine
a ranking, and Manhattanville submitted that figure
correctly.)
But to Mr. Berman, having a number so markedly wrong in the
table just emphasizes the difficulty of quantifying
educational quality. With slightly different formulas and
slightly different numbers, ''we could be No. 7,'' he says.
THE OTHER GUYS
KIPLINGER'S PERSONAL FINANCE
U.S. News rankings tend to ignore a college's cost. In
search of campuses ''that combine great academics with
affordable tuitions,'' Kiplinger's magazine, with a
circulation of one million, has run a guide to the ''100
Best Public Colleges'' every other fall since 1998. Large
public universities that do respectably (if not
astoundingly well) in U.S. News fare the very best in these
rankings.
In addition to academic data similar to U.S. News's,
Kiplinger's considers cost factors like tuition, financial
aid and living expenses. But its formula has a few bugs.
The algorithm is supposed to exclude colleges with average
SAT scores below 1030. That screening failed last fall: five institutions below the cutoff made it onto the final list, only to be removed several weeks after the rankings hit newsstands. Among those kicked off
were all the historically black colleges and universities
on the list, including 49th-rated North Carolina Central
University, which had already started publicizing its
success. The editors don't know why the formula failed but
expect to have it worked out by the next ranking, in 2004.
THE CENTER: LOMBARDI PROGRAM ON MEASURING UNIVERSITY
PERFORMANCE
For college applicants, one of the frustrations of rankings
is trying to figure out what matters and by how much. ''Top
American Research Universities,'' an annual report from the
Center, the Lombardi Program on Measuring University
Performance, a research institution at the University of
Florida in Gainesville, offers a ranking far simpler and
more transparent than U.S. News's and Kiplinger's.
The report, published online in September
(thecenter.ufl.edu/research2001.pdf) and distributed free
to anyone who asks for a copy, ranks universities in nine
categories, including median SAT scores, doctorates granted
and faculty awards received. Universities scoring in the
top 25 in all nine categories form one group, followed by
those scoring in the top 25 in eight categories, in seven
categories and so on.
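The grouping rule itself is simple to state: count how many of the nine categories a university places in the top 25 of, then sort by that count. Here is a minimal sketch; only median SAT, doctorates granted and faculty awards are named above, so the other category names and all of the ranks below are assumptions for illustration.

    # Sketch of the Lombardi-style grouping. Categories beyond the three
    # named in the article, and all the ranks, are hypothetical.
    CATEGORIES = ["median_sat", "doctorates_granted", "faculty_awards",
                  "federal_research", "total_research", "endowment",
                  "annual_giving", "academy_members", "postdocs"]

    def top25_count(ranks):
        """How many of the nine categories a school ranks in the top 25 of."""
        return sum(1 for c in CATEGORIES if ranks.get(c, 999) <= 25)

    universities = {
        "University X": dict.fromkeys(CATEGORIES, 10),   # top 25 in all nine
        "University Y": {**dict.fromkeys(CATEGORIES, 12),
                         "postdocs": 40},                # misses one category
    }

    for count in range(9, 0, -1):
        group = sorted(u for u, r in universities.items()
                       if top25_count(r) == count)
        if group:
            print(f"Top 25 in {count} of 9 categories: " + ", ".join(group))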
By giving schools the same credit for scoring sixth in a
category as, say, ninth, this ranking avoids the complaint
often hurled at U.S. News that tiny differences in scores
matter too much. ''Universities cluster, and the borders
between the clusters are fuzzy,'' says John V. Lombardi,
the researcher for whom the system is named and now
chancellor of the University of Massachusetts at Amherst.
But by dealing with that problem, the Lombardi system creates another one: huge differences between the institutions that barely slide into the top 25 and those
that just miss it. For instance, if Johns Hopkins
University had granted 392 doctorates in 2001 instead of
the 384 that it did, it would have landed in the first tier
instead of the second tier.
STUDENT KNOWS BEST
PRINCETON REVIEW
If U.S. News, Kiplinger's and the Center
are primarily concerned with an institution's credentials,
the Princeton Review's ''Best 345 Colleges,'' which sells
about 100,000 copies annually, focuses more on what happens
there. To many academics, this is the right thing to
measure. The question remains: Can it be measured?
The Princeton Review gives every college a ''selectivity''
rating on a one-to-four scale based on factors like
incoming students' SAT scores and admissions yield. But all
other ratings spring from student surveys. ''We survey kids
to get a feel for the school,'' says the chief executive,
John S. Katzman. ''And then it's just too much fun to not
rank the top 20 and the bottom 20.'' The surveys, he says,
give a better picture of what actually happens on a campus
than anything else available. ''U.S. News very carefully
measures things that are not important,'' Mr. Katzman says.
''We survey 100,000 kids a year, which is 100,000 more than
they do.''
To sample students for its rankings and general profiles,
the Princeton Review passes out surveys using a technique
known as random intercept. For example, for the new edition
coming out later this month (as ''The Best 351 Colleges''),
two young men handed out questionnaires last January in the
cafeteria of Manhattanville College to whoever passed by
(with no way of tracking whether someone came by more than
once).
Based on conversations with students that day, the college
will do well when it comes to the food ranking but not so
well on location: Purchase, N.Y.
Conventional wisdom seldom applies with the Princeton
Review. In ''best overall academic experience,''
Northwestern tops the 2003 scales; Harvard doesn't even
crack the top 20. Wabash, the all-men's college ranked 44th
by U.S. News, is one of the Princeton Review's more
academically rigorous colleges, making top-10 lists for
both quality of professors and strong student work ethic.
Many administrators don't trust the data. ''Terrible is the
only word for it,'' says David Davis-Van Atta, director of
institutional research at Carleton College in Northfield,
Minn. He faults the data collection as haphazard and random
intercept as ''well established to often give
nonrepresentative, not cross-sectional results.'' In
addition, he points out that e-mail addresses are optional
on paper questionnaires (few students in the Manhattanville
cafeteria that day included one), meaning students can
retake the anonymous survey online after filling one out by
hand.
Moreover, Mr. Davis-Van Atta says the questions are
inappropriate for seriously evaluating a campus. For
example, one question on the current survey asks students
to grade their college's financial aid on a scale of A
(''It's nirvana'') to E (a vulgarism). The question, he
says, ''does not meet any normal standards for survey
design.'' Last fall, he queried directors of institutional
research at other universities and received 35 complaints
similar to his.
Mr. Katzman concedes cheating is possible but says the
problem is small enough not to throw off results. ''We have
never claimed that the survey is scientific or precise,''
he says. ''But it's consistently useful and accurate. You
will find more alcohol and drugs at our party schools than
at our 'stone-cold sober' schools, and the same holds true
for our other lists. In fact, we ask kids to read and rate
the prior year's write-up: 76 percent say either 'on
target' or 'extremely on target'; only 7 percent say either
'off base' or 'way off base.''' Asked about Mr. Davis-Van
Atta's criticism of the financial aid question, Mr. Katzman
quips: ''He's right. We should have asked students to rate
their schools from 'nirvana' to 'untouchable.'''
As with U.S. News, ''The Best 345 Colleges'' is considered
a boon to colleges that do well, and administrators see it
as significant enough that they have been known to try to
affect their scores. In 2001, when the rankings declared
that Macalester College in St. Paul was one of 20 colleges
with the least religious activity, its president raised a
ruckus by sending an e-mail to deans suggesting they get
students to say on the next survey that they pray a lot.
Macalester has a picture of the Princeton Review on its
admissions home page, along with a complimentary quote from
a former editor.
GRADUATE SCHOOLS
U.S. NEWS, BUSINESS WEEK, FINANCIAL TIMES, WALL STREET
JOURNAL
One key indicator of the value of an education is the immediate success of its graduates. That's a problem with
undergraduate rankings: students want to do different
things. But most everyone who gets an M.B.A. wants to work
in business; and most everyone who goes to law school wants
to be a lawyer. So programs can be judged by tracking the
entry-level success of graduates in the job market. Twenty
percent of U.S. News's law school rankings are based on the
rate of graduates who pass the bar and rates of employment
both at graduation and nine months later.
To an extent even greater than with its undergraduate
rankings, U.S. News's law school list dominates the field.
As a result, its ''Best Graduate Schools'' special issue
and guidebook, published in April, causes even more
consternation than ''America's Best Colleges.'' Peter M.
Shane, former dean of the University of Pittsburgh's law
school, reports that his school's drop in the rankings one
year and rise the next ''was treated as an enormous crisis,
and then an enormous victory -- though it wasn't as if
anything particularly significant changed in the quality of
the school.'' That pressure, he says, also pushed him to
set goals that perhaps differed from the school's or
students' best interest, like focusing fund-raising efforts
on merit-based scholarships for students with high grades
or LSAT scores instead of need-based scholarships.
The rankings also rankle because academics don't think they
account for what matters most. ''What you don't know, you
can't rate,'' says Richard A. Epstein, a professor and
former interim dean at the University of Chicago Law
School. ''U.S. News doesn't know teaching and scholarship.
So what does it mean to rate these schools if you take out
the two essential missions of a law school?'' More than 90
percent of the deans of the schools accredited by the
American Bar Association have signed a letter denouncing
the rankings. Mr. Kelly, on the other hand, points to the
reputation survey of law school deans and faculty members
that makes up 25 percent of the ranking. ''Our ratings
reflect the view of people who do know about teaching and
scholarship,'' he says.
Business school rankings by U.S. News are also based on
factors like admissions test scores, a reputation survey
and a survey of corporate recruiters. But unlike with law
schools, there's real competition here. Three other
publications put out widely read surveys that students,
recruiters and administrators all take seriously.
''Rankings are maybe the single most important thing on the
minds of most business school deans in America,'' says Tim
Westerbeck, who focuses on business schools as an executive
vice president of the marketing firm Lipman Hearne.
The leader in the field, Business Week, has been putting
out ''The Best B-Schools'' every other October since 1988.
Forty-five percent of its rankings are based on a student
satisfaction survey, 45 percent on a survey of about 200
corporate recruiters and 10 percent on faculty scholarship,
based on the number of articles in selected journals and
books reviewed in prestigious publications.
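Those stated weights make the Business Week composite easy to express. In this minimal sketch the 45/45/10 split comes from the breakdown above, while the three input scores are invented for illustration.

    # Business Week's stated weighting: 45% student satisfaction,
    # 45% recruiter survey, 10% faculty scholarship.
    # The scores passed in below are hypothetical.
    def bw_composite(student, recruiter, scholarship):
        return 0.45 * student + 0.45 * recruiter + 0.10 * scholarship

    print(round(bw_composite(student=88.0, recruiter=91.0, scholarship=70.0), 2))
    # -> 87.55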
Instead of a student satisfaction survey, The Financial
Times measures salaries of recent graduates as well as
faculty diversity and publications. Using only data that
can be compared across countries, the newspaper measures
schools from around the world. In its latest special
section, published in January, the United States took the
top five spots; Insead, outside Paris, was the top entry from abroad, in sixth place.
The Wall Street Journal's ''Top Business Schools'' also
ranks internationally but is based entirely on a survey of
corporate recruiters whose names are provided by the
schools. The 2003 rankings, issued in September, are The
Journal's second foray into the ratings business; this
time, it surveyed 2,201 recruiters, compared with 1,600 for
2002. Both years, Dartmouth landed on top.
The Journal's top choices tend to differ from the other
publications'. The University of California at Berkeley is
the only West Coast school in its top 25, although four
make U.S. News's top 20. Stanford comes in second in U.S.
News and does well in The Financial Times and Business
Week. But it's 39th in The Journal. Ronald J. Alsop, editor
of The Journal's guide, suggests this anomaly may occur
because, although Stanford's admissions test scores are
high and its faculty publishes prolifically, many
recruiters see it as a place for the ''arrogant'' and
complain about its career services office. According to Mr.
Alsop, ''When asked 'What is the first thing that comes to
mind' about Stanford, recruiters in the survey made
comments like 'California country club.'''
Another possibility is that fewer West Coast recruiters
answered the survey; 20 percent of a school's score is
based on the number of recruiters who list it at all.
(Harris Interactive, which conducts the poll for The
Journal, says it cannot provide a regional breakdown of
respondents.) The Journal may also differ because it relies on a single indicator, recruiter input, whose swings would be damped if it were one of 20 factors cobbled together.
As with other rankings, business school assessments have
almost certainly had an impact on decision making,
sometimes for the better and sometimes for the worse. David
C. Schmittlein, deputy dean at the Wharton School, says his
school now tries to pinpoint what students think of campus
opportunities and improve them. ''You want to figure out
any problems before The Financial Times does,'' he says. On
the other side of the fence, MaryBeth Walpole, an assistant
professor at Rowan University in Glassboro, N.J., who
studied rankings in 1999, says that nearly two of five
business school professors surveyed thought that Business
Week's student satisfaction surveys put pressure on them to
make courses easier.
As with all the rankings, the surest way to find someone
who thinks they have had a pernicious effect is to find
someone whose school has recently dropped. People who
believe they foster virtuous competition can often be found
at schools that have done well. Asked about making it into
U.S. News's top tier for the first time last fall, Mr.
Berman of Manhattanville gives a big smile and says: ''When
we weren't in the top tier we could tell you all the flaws
in the ranking system. Now we think they have finally
realized the right way to rank.''
Nicholas Thompson is a fellow at the New America
Foundation.
http://www.nytimes.com/2003/08/03/edlife/03EDRANK.html?ex=1061905305&ei=1&en=abcad410ae360380
Copyright 2003 The New York Times Company

