Training for What? Issues in Graduate Education

Originally published in Perspectives, the newsletter of the American Historical Association, as the introduction to a new column, in February 1999.

Welcome to “Issues in Graduate Education,” a new column in Perspectives. Personal narratives, historiographical reviews, interviews, and reports about recent conferences and projects already find a likely medium in the proliferating electronic discussion networks. I intend to reserve this space instead as a forum for essays that generate debate on leading issues in graduate education—pieces that treat broad themes and posit general relationships and connections. Contributors should frame their topics in a context compelling enough to address graduate students, undergraduates, and faculty engaged in a variety of fields. Consider, by way of introduction, the following essay, which briefly identifies a few broad concerns in history graduate education.

At its best, graduate training nurses the mind and enlivens the imagination. It encourages playfulness and discovery, setting loose the creative intellect. The current generation of historians has opened the field to an endlessly fascinating panoply of topics so that today’s graduate students can probe the interstices of thought, politics, and culture with greater license than ever before. Scholars preparing for careers as professional historians can press an array of exciting theoretical departures into the service of their work, and, increasingly, they can shape their scholarship without fear of condescension from a rigid, elitist profession. These constitute important advances, for which every beginning scholar should be grateful.

Paradoxically, graduate school also can foster careerism, political sterility, and narrowness of thought. Ideas can become incidental to achieving a degree and fashioning a professional career. “Anyone who has taught in an American university knows how often the bright and lively undergraduate undergoes a depressing change in style soon after entering graduate school,” observed Irving Howe more than three decades ago. “He becomes professionally cautious, intellectually timid, concerned to please and adapt to professors.” Today, too many graduate students feel pressured into early professionalization and obligated to conform their ideas to a menu of subfields that, individually, do not satisfy their hungers. They worry that search committees measure candidates not by the originality of their imaginations but by the extent to which they have fulfilled the rituals of apprenticeship and genuflected before the logic of specialization. Reading the lamentations of senior figures such as Herbert Gutman and Christopher Lasch (“we find ourselves uncomfortable in academic life and often at odds with the profession and the university,” wrote Lasch in 1973), they wonder about the intellectual costs of socialization into the profession.[1]

Are graduate students workers or students—or both? The bitterness of recent confrontations at Yale University and the University of California and the growing problem of adjunct, graduate, and part-time teachers underscore the stakes of this question. Whatever the long-term answers, the chronic surfeit of PhDs brings the notion of exploitation to the foreground of discussions about academic labor. Surveying developments across the humanities, English professors Michael Bérubé and Cary Nelson argue that “it is time to say, bluntly, that graduate education is losing its moral foundation.” They observe that “we overwork graduate teaching assistants for seven years or so, then cast them aside. The money we pay them each school year is often not enough to live on in the summer. Their health benefits are often marginal, their retirement benefits nonexistent.” Bérubé and Nelson ask, “What are the ethics of training students for jobs that few of them will ever have?”[2]

To be sure, politicians and administrators increasingly deploy teaching assistants and adjunct lecturers as implements in the commodification of the American university. Yet many teaching assistants and part-timers in the United States also enjoy numerous privileges and resources. Graduate training in many places affords flexible work arrangements, tuition waivers, relatively secure political and academic freedom, social prestige, and an exhilarating assembly of print and electronic materials about which students elsewhere in the world can only dream. Aspiring intellectuals in China, Indonesia, and Russia—to say nothing of the working classes in this country—may properly regard the conditions of graduate training in the United States as a feast of opportunity. During the last generation, moreover, admissions committees in history and elsewhere have shown a greater commitment to cultural diversity, and graduate training is now a meaningful possibility for women and for racial and religious minorities. The goal for which these and other graduate students strive, the tenured professorship, might very well prove illusory in the coming decades; for now, it remains “the last good job in America,” as sociologist Stanley Aronowitz observes. Harvesting the potential of graduate training means struggling against the deplorable practices Bérubé and Nelson identify. It also means acknowledging—and defending—the privileges that endure.[3]

Social critics and professional historians routinely worry that “the public” does not understand its own past and, correspondingly, that PhDs must work harder to fashion connections with nonspecialized audiences. They note, for instance, that business and management undergraduate majors doubled in the last 15 years as enrollments across the humanities declined sharply. (One study, reports historian Russell Jacoby, concluded that courses in the humanities and social sciences now account for fewer than one third of the total credits taken by American undergraduates.)[4] Yet the gap between rhetoric and reality persists. Many history graduate programs discourage—or at least fail to reward—work that explicitly seeks to reach popular audiences. Despite incessant hand-wringing over a “present-tense” American culture, the profession indeed seems as circumspect as ever about the status and implications of work with “the public,” and commonly constructs and defends rigid categories to distinguish “service” from “scholarship.”[5] That liberal education might act as a leavening force against the encroachment of utilitarian social values too often remains mere sentiment—a view bereft of the sort of seriousness that might yield meaningful reforms.

Should the controversies and possibilities of the past decade prove a reliable portent for the future, however, popular history will require the participation of a new generation of scholars. At present, graduate students might ask whether their training includes preparation for engaging in such disputes as the controversies over the National History Standards, the proposed Disney theme park, and the Enola Gay exhibit. They might inquire whether graduate training incorporates new technologies like CD-ROMs and the Internet, already influential agents of historical representation. And they might investigate the assumptions that underwrite the differences between “service” and “scholarship.” As they contemplate their dual role as scholars and citizens, graduate students cannot afford to delay reckoning with these difficult questions. What constitutes acceptable scholarship? For whom do they write and teach? What do they hope to achieve?

As I hope the foregoing begins to suggest, graduate education in history stands today at the center of a series of tensions. Questions about the direction of the profession have always loomed large for aspiring scholars. But the very idea of a graduate student is now undergoing an earnest and potentially perilous reexamination, and no serious observer argues that the production of graduate degrees has somehow eluded the multiple crises confronting higher education in the 1990s. I hope this column can serve as a forum for discussion, and I invite graduate students, undergraduates, and professors of every persuasion to contribute.

[1]      Irving Howe, “Berkeley and Beyond” in Revolution at Berkeley: The Crisis in American Education, edited by Michael V. Miller and Susan Gilmore (New York: Dell, 1965), xx; Christopher Lasch, “Foreword” to Richard Hofstadter, The American Political Tradition and the Men Who Made It (New York: Vintage, 1973), xxiii. For Gutman’s complaint about the “overspecialization” of the profession, see Herbert G. Gutman, Work, Culture, and Society in Industrializing America (New York: Vintage, 1977), xii-xiii.

[2]      Michael Bérubé and Cary Nelson, “Introduction: A Report from the Front” in Higher Education Under Fire: Politics, Economics, and the Crisis of the Humanities, ed. Bérubé and Nelson (New York: Routledge, 1995), 20, 21.

[3]      Stanley Aronowitz, “The Last Good Job in America,” Social Text 15 (Summer 1997): 93-108. Aronowitz argues that graduate students and tenured faculty must devise a strategy “for universalizing the self-managed time some of us still enjoy” (108).

[4]      Russell Jacoby, Dogmatic Wisdom: How the Culture Wars Divert Education and Distract America (New York: Doubleday, 1994), 3, 8, 9; Christopher Hitchens, “Goodbye to All That: Why Americans Are Not Taught History,” Harper’s 297 (November 1998): 37-43, 46-47. For another view of the historical sensibilities of nonprofessionals, see Roy Rosenzweig and David Thelen, The Presence of the Past: Popular Uses of History in American Life (New York: Columbia University Press, 1998).

[5]      See, for example, Lynn Parsons, Joan Shelley Rubin, and Owen Ireland, “Redefining Scholarship: Some Problems,” OAH Newsletter 23 (1995): 6-7.