The Other One-Third: The Relative Worthlessness Of USNWR’s Graduate History Program Rankings

March 28, 2008

I learned from the Chronicle of Higher Education‘s News Blog that U.S. News and World Report released its 2009 ranking of graduate schools yesterday. For amusement’s sake, I perused the social sciences and humanities listings, focusing of course on history. Everyone hopes that his or her program will appear in this year’s top ten, yes?!

The usual suspects rounded out the top ten:

1. Yale University New Haven, CT Score 4.9
2. Princeton University Princeton, NJ Score 4.8
2. University of California–Berkeley Berkeley, CA Score 4.8
4. Harvard University Cambridge, MA Score 4.7
4. Stanford University Stanford, CA Score 4.7
4. University of Chicago Chicago, IL Score 4.7
7. Columbia University New York, NY Score 4.6
7. University of Michigan–Ann Arbor Ann Arbor, MI Score 4.6
9. Johns Hopkins University Baltimore, MD Score 4.5
9. University of California–Los Angeles Los Angeles, CA Score 4.5

Aside from not being in the top ten, I learned that my graduate history program, at Loyola University Chicago, was neither scored nor ranked. It had a sorry-looking “N/A” designation. Bummer! Sigh. How could this be? Just last year a SUNY/Academic Analytics productivity study had us in the top ten. I wrote about it here. What gives?

I searched the USNWR site for more on the tabulation process. Along the way I discovered that Loyola was actually not in poor company. Of 151 institutions named on the list, only 100 received numerical rankings (numbered 1–87, with several ties). Wondering how this could be, I found this page on the methodology behind the social sciences and humanities lists (posted March 26, 2008). Here are some excerpts (bold mine):

Rankings of doctoral programs in the social sciences and humanities are based solely on the results of peer assessment surveys sent to academics in each discipline. Each school (or, in the case of psychology, each institutional unit) offering a doctoral program was sent two surveys. The questionnaires asked respondents to rate the academic quality of the program at each institution on a 5-point scale: outstanding (5); strong (4); good (3); adequate (2); or marginal (1). Those who were unfamiliar with a particular school’s programs were asked to select “don’t know.” Scores for each school were determined by computing a trimmed mean (eliminating the two highest and two lowest re[s]ponses) of the ratings of all respondents who rated that school; average scores were then sorted in descending order.

Surveys were conducted in the fall of 2004 by Synovate. Questionnaires were sent to department heads and directors of graduate studies (or, alternatively, a senior faculty member who teaches graduate students) at schools that had granted a total of five or more doctorates in each discipline during the five-year period from 1998 through 2002, as indicated by the 2003 Survey of Earned Doctorates. The surveys asked about Ph.D. programs in economics (response rate: 38 percent); English (39 percent); history (33 percent); political science (40 percent); psychology (23 percent); and sociology (50 percent).
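The scoring rule USNWR describes, a mean computed after discarding the two highest and two lowest ratings, can be sketched in a few lines of Python. This is my own reconstruction, not USNWR's actual code: the function name, the rounding to one decimal place, and the handling of programs with too few ratings are all assumptions on my part.

```python
def trimmed_mean(ratings, trim=2):
    """Average the ratings after dropping the `trim` highest and
    `trim` lowest values, per USNWR's stated method.

    Returns None when no ratings survive the trim -- one plausible
    reason a program rated by four or fewer respondents shows "N/A".
    """
    if len(ratings) <= 2 * trim:
        return None  # four or fewer ratings: nothing left after trimming
    kept = sorted(ratings)[trim:-trim]  # discard two lowest, two highest
    return round(sum(kept) / len(kept), 1)
```

On seven hypothetical ratings, `trimmed_mean([5, 5, 4, 4, 3, 5, 4])` drops the 3, one 4, and two 5s, leaving [4, 4, 5] and a score of 4.3; with only four ratings the trim consumes everything, and the program goes unscored.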

It appears, then, that the one-third of schools responding in 2004 could collectively say nothing about 51 of the 151 graduate programs in history. It is possible, of course, that some of those 51 received four or fewer ratings apiece, in which case trimming the two highest and two lowest responses would have eliminated every rating and left no score at all. Still, it appears that one-third of graduate programs in history know next to nothing about another one-third. Curious. I wonder how this affects the job market for new Ph.D.s from those 51 schools? Not positively, I’m sure.

If you were choosing (again, for some of us) a graduate program in history, how much stock would you put in a list where one-third of those consulted in its construction apparently knew almost nothing about another one-third of those being considered? – TL



2 Comments
  1. Tim,

    Here is a tool that takes several more factors into account when evaluating history doctoral programs. Its conclusions are pretty much the same as those of USN&WR.


  2. Sterling,

    Thanks for the link. I recall playing around with that site once before—thanks to your weblog.

    Even if the conclusions of both are the same, that doesn't take the onus off USNWR to come up with something better. Granted, it's a glossy news magazine. But people give a surprising amount of weight to lists like this. What's most surprising is that the weight comes from others in the academy as much as from uninformed seekers on the outside. That's why I questioned the implications for the job market.

    – TL

