Brian Leiter's New "Rankings" of Law School Faculty Citations.
Professor Brian Leiter, of the University of Texas at Austin School of Law, is in the midst of conducting a study to determine the top law schools based on citations to their faculty's scholarship. He has done this sort of ranking before (see the 2005 Rankings, for example). In the 2005 results, Leiter indicated:
This is a ranking of the top 30 law faculties based on a standard “objective” measure of scholarly impact: per capita citations to faculty scholarship. We looked only at the top quarter of each faculty, largely for logistical reasons--it made the study more manageable--but partly because the scholarly standing of a school depends more on its best faculty, than its average.
On June 3, Leiter posted the following (bolding in the original) on his Law School Reports Blog:
Over the summer, I plan to carry out a new citation study, and in the fall, we may (finally) undertake a new on-line reputational survey. To that end, we've compiled draft lists of faculty for 49 law schools that appear likely to rank in the top 35-40 by these different measures. We may add a few more faculties to the mix, especially for purposes of a reputational survey. The faculty lists include only academic faculty (an effort has been made to exclude clinical, adjunct, and legal writing faculty, since these studies will focus on scholarly output). The ranking study will be per capita across the whole faculty, since many have worried that by looking only at the top quarter of the faculty in the past produced distorted results because of one or two very highly cited faculty at certain schools (e.g., Chemerinsky at Duke, or Delgado at Pittsburgh). . . .
Leiter's call has generated some criticism, which he has not handled particularly well. Specifically, the decision to exclude ALL clinical and legal writing faculty would seem to leave a gaping hole in the credibility of a study purporting to measure which law school faculties are most often cited. When Richard Neumann of Hofstra Law, for example, pointed out the flawed methodology of completely ignoring clinical and legal writing faculty, Leiter responded by referring to Professor Neumann as "someone named Richard Neumann," suggesting that Professor Neumann has a "chip on the shoulder," and calling Professor Neumann's comments "irrational and self-serving." Never mind that Professor Neumann is nationally known, has authored significant textbooks, teaches both clinical and non-clinical courses, and is likely more recognizable in the community of law school faculty than Leiter.
It's worth noting that Leiter is also, per his own request for input, starting with a list of schools he "expects" to rank at the top. So not only is he ignoring a potentially very large body of highly cited legal scholarship by clinical and legal writing faculty at law schools, he is also beginning with a list of schools he expects to be ranked at the top -- based in part on past studies that likewise ignored significant legal scholarship. And then he seems shocked when others point out the flaws in his basic premises. Notably, Leiter *does* include part-time and adjunct faculty in his evaluations (such as Judges Posner and Easterbrook at the University of Chicago) -- while, again, leaving out full-time, tenured or tenure-track faculty in clinical and legal writing.
Leiter seems to ignore entirely the changing landscape of legal education. Clinical and legal writing faculty, unlike even 10 years ago, are now increasingly tenured or tenure-track, and produce scholarship on a variety of legal topics every bit as meaningful as that of non-clinical faculty. Most clinical and legal writing faculty are now eligible for summer research grants, and studies by the Association of Legal Writing Directors (ALWD) indicate a significant trend toward requiring legal scholarship of such faculty. It is not difficult to find information about the growing impact of clinical and legal writing faculty in the world of scholarship -- peruse the sources in this post at The Legal Writing Professors Blog for some examples.
As a final matter, it's worth noting some very basic reasons why Leiter's study is flawed, beyond the seemingly obvious fact that ignoring a significant number of law schools entirely, and then ignoring a significant portion of the faculty at the schools that are considered, casts doubt on the credibility of the survey results. In the comments on Leiter's blog, Dean Stephen Ellman of New York Law School commented:
In sum, it is probably true that a citation survey of only non-clinical faculty would measure the impact of the work of those who are, in general, the most likely to be the most active authors -- though not without some striking omissions and no doubt with the inclusion of a number of very unproductive classroom teachers. But such a survey would miss not only the work of a sizable number of people who are genuinely and productively committed to scholarship, by reason of professional obligation and personal inclination, but also the impact of the institutional decisions that have concentrated these people at some schools rather than others.
It seems fair to add that the task of separating clinical and nonclinical faculty is itself so difficult that undertaking it is bound to generate errors along the way. Many clinicians, of course, have titles that are identical to those of their nonclinical colleagues, and so they cannot be distinguished by title. Many also cannot be distinguished by contractual status, since they hold full, regular tenure. In addition, many clinicians do not teach only clinically. Suppose a clinical professor also teaches Civil Procedure, and so his or her teaching time is 1/4 clinical, 3/4 nonclinical. Would this professor count as a clinician? What if his or her teaching load was half clinical and half nonclinical? Or 3/4 nonclinical and 1/4 clinical? Or suppose that the professor in question is, initially, a nonclinical faculty member, but over time comes to spend a portion of his or her time teaching in a clinic? Which of these people (and I think there are a lot of people who fit one or another of these models) would count as clinicians, and why? I think that figuring out the right time-share definition of "clinician" won't be simple, but it’s important also to keep in mind that formulating the definition may well be easier than collecting the data with which to apply it.
One other anomaly bears mentioning: a failure to count citations to scholarship by clinicians will mean that citations to otherwise comparable works of scholarship, perhaps appearing in the pages of the same law review issue, will be counted, or omitted, depending solely on the identity of their author. An issue of the Journal of Legal Education, for example, might include articles on law school pedagogy by both clinical and nonclinical faculty – but only citations to those written by the latter group would be counted. Clinicians, it should be noted, write on a great many topics; a symposium on torts, for instance, might feature articles by clinicians and nonclinicians, but again only those written by nonclinicians would be accounted for. Or to take one more example, there may be instances where a clinician from one school and a nonclinician from another school co-author a piece (I’m not speaking hypothetically, since I know of a book that fits this description precisely); citations to the book, or article, would be credited to the nonclinician and his or her school, but not to the clinician and the school he or she taught at.
Like most law school rankings, Leiter's must be taken with a grain of salt. His failure to recognize the significant contributions that clinical and legal writing faculty make to legal scholarship, at least at a good number of schools -- consider Mary Beth Beazley at Ohio State, one example of a clinical/legal writing scholar who is both highly published and very highly cited -- is reason enough to view his results with a highly skeptical eye, no matter how highly his own school might end up ranking in his study (top 10 in 2005, by the way).