Any reason why mercer dropped to t3? I was pretty surprised to see that.
The answer to this question is complicated and, at least in part, unknowable. I know that sounds strange, but the U.S. News ranking methodology is a curious beast. If you're interested, this is a really good article that explains the ins and outs of the U.S. News ranking: Theodore P. Seto, Understanding the U.S. News Law School Rankings, 60 SMU L. Rev. 493 (2007), available at http://papers.ssrn.com/sol3/papers.cfm?abstract_id=937017
Professor Seto's article is very thorough; excluding appendices, it's 77 pages. I say that the cause of Mercer's drop is "unknowable" because the ranking of one school depends heavily on what other schools report. And it's not as simple as comparing the numbers reported by School A against every other school's. A minute change to one school's numbers can greatly affect how other schools are ranked, and the re-ranked schools have no control over it. Thus, although a school can do everything within its power to improve its numbers, some things are simply out of the school's hands.
Consider this excerpt from Professor Seto's article:
I begin with my conclusions. First, U.S. News’ law school “ranks” are unreliable – that is, they are subject to significant random error. . . .
The first conclusion can be illustrated by a simple example involving a change in the numbers of U.S. News's lowest-ranked school--which I will call the “bottom anchor” but otherwise leave unnamed. Assume that the reported nine-month employment rate for graduates of the bottom anchor falls by just one percentage point and nothing else changes at any school in the country
. . . .
As one might expect, nothing happens to the bottom anchor's overall score (by definition, zero) or rank (180th). But this tiny change wreaks havoc on the relative ranking of the top one hundred law schools.
Seattle and San Francisco jump six ranks, Fordham jumps from 32nd to 27th, and Rutgers Camden, San Diego, and Indiana Indianapolis each jump four. Houston, Kansas, Nebraska, and Oregon, by contrast, each drop three ranks. Overall, forty-one of the top one hundred schools change rank.
Fordham's dean gets a bonus. Fingers are pointed and voices raised at Houston. All because of a trivial change in the employment statistics of a single school far away in the spreadsheet. Stranger still, if the bottom anchor's nine-month employment rate falls an additional four percentage points (that is, a total of five percentage points)--and nothing else changes at any school in the country--most of these effects disappear, but the reordering moves into the Top Ten. University of California (UC) Berkeley and Virginia both drop from 8th to 9th place. At the other schools named above, it is as if nothing had ever happened.
Prospective students, employers, and faculty members, reading that UC Berkeley and Virginia have dropped to 9th place, may decide to go elsewhere. Regents, trustees, and university presidents, reading that Seattle, San Francisco, and Fordham have advanced dramatically in the rankings, may record this accomplishment in the apparently responsible deans' performance evaluations. What the foregoing example suggests, however, is that basing decisions on this kind of difference or change in U.S. News ranks is unwarranted.
Id. at 509-10 (citations omitted) (emphasis added).
Although it's impossible to tell for sure why Mercer dropped out of the top 100 this year, one thing that probably had a negative impact was reporting the at-graduation employment rate. Because of the bizarre way this number fits into the rankings, many schools get a "boost" by not reporting it. To put it simply, a school benefits in the rankings by not reporting if its at-graduation employment rate is more than 30 percentage points lower than its 9-month employment rate. In fact, 64 schools (quite a few of which are in the top 100, including Georgia State) did not report at-graduation employment rates. Mercer was one of 23 schools that did report the rate and may have been "hurt" by doing so: http://taxprof.typepad.com/taxprof_blog/2009/05/rankings-malpractice.html. There is, of course, a legitimate argument that schools should respond to the U.S. News survey as truthfully as possible, but many schools "game" the rankings in every imaginable way (I won't go into it right now, but trust me--there are many ways). Mercer has attempted to report as truthfully as possible, but this honest reporting may have caused a dip in the rankings. Maybe this will change, but maybe not. Time will tell.
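The reporting incentive described above can be sketched in a few lines of code. This is purely illustrative, not U.S. News's actual formula: it assumes (as reported at the time) that a school declining to report its at-graduation rate was imputed a value roughly 30 percentage points below its 9-month rate, so reporting only helps when the true number beats that fallback. The function names and the exact imputation gap are my own for illustration.

```python
# Hypothetical sketch of the at-graduation reporting incentive.
# Assumption: an unreported at-graduation rate is imputed as the
# 9-month rate minus ~30 percentage points (the gap cited above).

IMPUTATION_GAP = 30.0  # assumed penalty, in percentage points

def effective_at_grad_rate(nine_month_rate, at_grad_rate=None):
    """Rate used in scoring: the reported value, or the imputed fallback."""
    if at_grad_rate is None:  # school chose not to report
        return max(nine_month_rate - IMPUTATION_GAP, 0.0)
    return at_grad_rate

def should_report(nine_month_rate, true_at_grad_rate):
    """Reporting helps only if the true rate beats the imputed fallback."""
    return true_at_grad_rate > effective_at_grad_rate(nine_month_rate)

# A school with a 90% nine-month rate but a 55% at-graduation rate
# (a 35-point gap) scores better by withholding the number, since the
# imputed 60% exceeds the true 55%:
print(should_report(90.0, 55.0))  # False
print(should_report(90.0, 65.0))  # True
```

Under this assumption, truthful reporters with large gaps between the two rates (Mercer's situation, apparently) are penalized relative to schools that withhold.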