Fourteen of the University's doctoral programs have been ranked among the top 10 nationally in a survey released this week by the National Research Council. Among the 14, the anthropology program is ranked number one nationally; four other programs are ranked in the top three and eight in the top five.
The study, "Research-Doctorate Programs in the United States: Continuity and Change," was guided by the Committee for the Study of Research-Doctorate Programs in the United States and sponsored by The Conference Board of Associated Research Councils.
The reputational study examined programs in 41 fields in the arts and humanities, biological sciences, engineering, physical sciences and mathematics, and social and behavioral sciences. It covered 3,634 programs at 274 universities (105 private and 169 public) that have about 78,000 faculty members and graduate about 90 percent of the Ph.D.s produced in the studied fields between 1986 and 1992. This year's study, which builds on one reported in 1982, includes 214 institutions that participated in the 1982 study and many additional programs.
Fields were included in the study based on a combination of three factors:
Overall, 38 U-M programs are ranked, out of more than 100 that are offered. The U-M programs ranked in the top 10 and their rankings are: anthropology (first); psychology (second); political science and classics (both third); sociology and industrial engineering (both fourth); aerospace engineering and mechanical engineering (both fifth); electrical engineering (sixth); philosophy (eighth); mathematics, French and music (all ninth); and civil engineering (10th).
While it will take some time to analyze the rankings, President James J. Duderstadt says they indicate that the academic reputation of the University has not only been sustained but actually strengthened in many key areas over the past decade. "This is all the more remarkable in view of the significant deterioration in state support over this same period."
The University, Duderstadt says, continues to be the national leader in the social sciences, among the leaders in the humanities and engineering, and improving in the sciences. "While there are some areas of concern, these rankings do indicate the exceptional strength of our faculty and the quality of our academic programs in these important areas of graduate education."
Robert Weisbuch, interim dean of the Horace H. Rackham School of Graduate Studies, notes that it is important for people to look at raw scores, not just rankings. Twenty-four U-M programs that were ranked in 1982 for perceived faculty quality are also ranked in the current survey, and the raw score is up in 19 of the 24.
"We have to be concerned with competing with ourselves and seeing whether we are improving, and we are." Michigan, he adds, does even better on the question of the effectiveness of its programs. "We have good curriculums, the students have good experiences, demonstrating that we are good and conscientious educators."
Weisbuch notes that a survey done as carefully as the NRC report "provides lots of helpful information that will be important for us to consider. In addition, we have much more extensive data on all our programs than any such survey can provide, and we're constantly reviewing it."
"It's important for people to understand," Weisbuch says, "that reputation follows reality by several years. There is a problem with these types of surveys. When I fill one out, I think of two things: recent events and how the program ranked when I was applying to graduate school. It gets skewed toward the past, like light from the moon. It's real, but in the past. We can use that data, but we have more immediate and extensive data on how our programs are working."
Weisbuch cites two U-M programs as examples:
"When the 1982 survey was done, the English department admitted about half of its applicants. Now we get three to four times the number of applicants and admit only 8 percent, and student quality has improved extraordinarily. That is internal data we have that is not translated into a ranking in a reputational study."
"There's no question that Michigan is up. It's just not getting fully reported," Weisbuch says.
Weisbuch also notes that the U-M has a number of programs, such as American culture and classical art and archaeology, that are the most respected in the country but don't get ranked. "We have other programs that don't offer discrete degrees, such as women's studies, which is one of the strongest in the country. The same is true with the Center for Afroamerican and African Studies. When rankings are done of Black or African American programs, we're among the top four or five. There are places where we are strong, but that's not evident in the survey, given its categories."
"This is not an athletic contest," Weisbuch says. "To use rankings reductively is anti-intellectual, anti-educational. We need to look at the figures very carefully and bring our intelligence to bear on them. It's not perfect and it's very complex, and we need to give the complexity careful attention."
Much of the data on faculty opinion of program quality was generated by the National Survey of Graduate Faculty, done in spring 1993. The survey elicited ratings on the scholarly quality of program faculty and the effectiveness of each program in educating research scholars. At least 100 faculty raters evaluated each program. The committee also updated statistics from the 1982 study and included other demographic data drawn from a variety of sources.
Ratings for scholarly quality of faculty were pooled, resulting in an average rating on a five-point scale, with 0 signifying "not sufficient for doctoral education" and 5 signifying "distinguished."
According to the report, about 62 percent of the programs were rated as distinguished, strong or good, although this varies by field.
The same approach was used for program effectiveness, with 0 being "not effective" and 5 "extremely effective." About two-thirds of the programs, the report says, were considered to be "extremely effective" or "reasonably effective." Fewer than 10 percent were considered to be "not effective."
The rank orderings in each field are based on a mean rating derived from the pooled responses to the national survey. Acknowledging that rank-ordered information requires careful interpretation, the report includes an appendix that illustrates the relative standing of programs with respect to a number of variables.
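To make that procedure concrete, the following is a minimal sketch of pooling raters' scores into a mean rating and rank-ordering programs by it. The program names and scores are invented for illustration and are not drawn from the NRC report.

from statistics import mean

# Each program's list holds individual raters' scores on the 0-5 scale.
# These values are hypothetical, used only to illustrate the arithmetic.
ratings = {
    "Program A": [4.5, 4.8, 4.2, 4.9],
    "Program B": [3.9, 4.1, 4.0, 3.8],
    "Program C": [4.6, 4.4, 4.7, 4.5],
}

# Pool the responses into a single mean rating per program.
mean_ratings = {name: mean(scores) for name, scores in ratings.items()}

# Rank programs from highest to lowest mean rating.
ranked = sorted(mean_ratings.items(), key=lambda item: item[1], reverse=True)

for rank, (name, score) in enumerate(ranked, start=1):
    print(f"{rank}. {name}: {score:.2f}")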
John H. D'Arms, former dean of the Horace H. Rackham School of Graduate Studies and former vice provost for academic affairs, was a member of the committee that conducted the study. He is now a visiting distinguished professor at the National Humanities Center in Research Triangle Park, N.C., working on portions of a book on funding patterns in the humanities for the Mellon Foundation.
D'Arms characterizes the report as significant and of considerable value, particularly in light of the criticism frequently leveled at colleges and universities that they do not evaluate themselves.
"This is an attempt on the part of the people producing the Ph.D.s to look at the quality of the institutions doing the producing," he says. He also notes that this report is much more comprehensive than those done, for example, by U.S. News and World Report, which last spring reported on only 13 fields of doctoral study.
Nine fields not included in the 1982 report are in this year's report. Of interest, D'Arms notes, is that almost all of them are interdisciplinary in nature, such as comparative literature, materials science and biomedical engineering. This not only indicates that Ph.D. production has reached a certain level, he explains, but also points up the shifting nature of these fields.
D'Arms says there are a number of broad findings illustrated in the report:
D'Arms notes, however, that women and minorities in the higher-ranked programs are just as likely to earn Ph.D.s as non-minority males. "If they enter the programs, their chances of success are just as good. It's important that we continue to encourage them to pursue graduate study."
The study was funded by the Ford, Andrew W. Mellon, Alfred P. Sloan, and William and Flora Hewlett Foundations and the National Academy of Sciences.
An electronic file of selected tables from the 740-page report is available on the Research Council's World Wide Web home page at http://www.nas.edu. A CD-ROM that will include more detailed program-level data is being developed and will be distributed for public use.
Paper copies of the report, $59.95 (prepaid) plus shipping of $4 for the first copy and $.50 for each additional copy, are available from the National Academy Press, 2101 Constitution Ave., N.W., Washington, D.C. 20418; (202) 334-3313 or 1-800-624-6242.
Noting that the NRC study is reputational, with quality based on perceived reputation in the eyes of the 100 or more faculty who reviewed each program, John H. D'Arms does have a few disappointments.
"This is not the only way to measure quality," he says, "particularly since reputation changes more slowly than facts."
"For instance, the English Department at the U-M has undergone dynamic changes and probably rates more highly [than in the report]. The changes are not yet well reflected."
And some early hopes on the part of D'Arms and several other committee members for including data from other sources were not realized because of time and resource limitations.
"We hoped to evaluate the experiences of recent Ph.D. holders who were graduates of the programs. This is another way to test reputation, particularly since the questionnaire does not emphasize the degree to which the program prepares one for life in the academy or industry."
"We also were unable to get reactions from employers of recent Ph.D.s who work in their research laboratories, another independent measure. It also would have been nice to ask international students who have returned to their countries what their experiences had been."
D'Arms also would have liked to include data on a more elusive characteristic: the value added to the doctoral experience by non-degree-granting centers and institutes and other resources on a campus, such as the Institute for the Humanities or the Clements Library at the U-M.
"The Humanities Institute is a very lively place," D'Arms says, "where doctoral students are able to connect and come away with a richer, more diverse experience. There are similar centers elsewhere. We have superb collections at the Clements and the Kelsey, and there's no way of capturing this. The same is true for the Center for Afroamerican and African Studies and women's studies and foreign area centers."
The presence of these resources sometimes adds immeasurably to the experience, he says, and makes a positive impact on graduate education in these fields.
Noting that a number of fields were added for the current report, D'Arms says that finding a way to organize the biological sciences was one of the biggest frustrations faced by the committee.
"We had a terrible time organizing the biological sciences, since they come from literary schools, medical and pharmacy programs and engineering. What this does say is that life sciences are in a major ferment. There are big differences since 1982. Other fields, such as women's studies, also have changed, but their organization has not changed, so the change is not so noticeable."
As a committee member, D'Arms says, he had hoped in the long run for an even better report. "There is value, however, in trying to do what we did. It provides a reference for what to study in the future, a benchmark to work from even if it is not perfect."