History as a Vocation

Introduction to Every Fury on Earth. Davies, 2008.

Although I was born and raised in Gettysburg, my interest in history consisted of scouting the battlefield for patches of land where my friends and I could swill beer without attracting the park rangers. The only evidence that I possessed books lies in my memory of using them to light the bonfire at our high school graduation party.

Not until I moved to Washington D.C., following a lackluster performance at college in rural Virginia, did I decide to take evening classes in the master’s program at George Mason University. There I met Roy Rosenzweig, who introduced me to academic history and encouraged me in all the ways that count for a young man enduring his first ecstasies of learning without a background in higher education. Led by Roy and Lawrence Levine, who had recently migrated from Berkeley, George Mason’s History Department was staffed by a crew of assistant professors hip to the literary theory then in fashion. Had I entered the academy with Roy, twenty years earlier, I might have taken my training in social history. I might have tried to relate my discoveries to a social movement outside the university. But social history, sociology, and social movements had exhausted their innovations by the nineties. Cultural Studies had replaced them as a source of radical political speculation across the humanities. The result was bracing. I went from burning books to describing them as “cultural artifacts.”

I quit my job and went full-time. Roy was the hub. Already he saw in new technology a means to consolidate and distribute the gains social history had made. He worked closely with Steve Brier and Josh Brown at the American Social History Project (ASHP) at the Graduate Center, City University of New York. Having synthesized those gains in the textbook Who Built America?, they had transformed the first volume into a multimedia CD-ROM. I signed on as research assistant for the second volume and had a hand in minor projects for years thereafter. The solidarity was engrossing. Roy founded the Center for History and New Media several months after I began classes. He built it from a sheet of paper tacked to his office door into one of the world’s leading centers of its kind.

My studies at George Mason culminated in a conventional paper on Dale Carnegie’s How to Win Friends and Influence People. Even this modest exercise, however, illustrated the whole world of contrasts that opened between my past and my choice of vocation. Who Built America? identified the history of the country with the history of its working class. As the son and grandson of small businessmen, I grew up despising labor unions. That most of my professors were not only pro-labor radicals, but Jewish radicals, disclosed aspects of culture (and reality) that my upbringing had ignored or denied.[i] In my family, moreover, How to Win Friends and Influence People ranked with the great documents of the twentieth century, slightly above Russell Conwell’s “Acres of Diamonds” and Andrew Carnegie’s “The Gospel of Wealth.”

My family members—teachers, nurses, and small businessmen—inhabited the hamlets of Adams County, Pennsylvania, as self-conscious descendants of conservative German and English farmers. We have been living in the area since the eighteenth century. The county had voted Republican in nearly every presidential election since 1920. In the 1980s, Ronald Reagan was a culture hero as well as a political leader. Now it is George Bush who moves them. All along they have paid out large sums to attend the Dale Carnegie Institute, at once affirming their ideological inheritance of Protestant individualism and adapting it to the realities of the modern corporation. In my final paper I criticized Carnegie as the Machiavelli of the petite bourgeoisie. My family applauded his ingenuity in defending American capitalism against communists, hippies, queers, blacks, Jews, Jesse Jackson, and Jane Fonda.

The radical social movements that had shaped Roy Rosenzweig’s youth in Queens had never visited my part of rural Pennsylvania, not even when they were most needed. “Many servicemen in Vietnam feel that when they are sent over here it is for a duty that they themselves cannot fully understand and thus will not give their complete cooperation,” my father had written from somewhere in Southeast Asia, in a letter published in the local newspaper in December 1970. Earlier that summer, having gone eagerly into a war that was already lost, he had found himself fighting a rearguard battle against his fellow Marines. In his letter, he traced variations in their morale to the varying political attitudes of their home communities. The quality of his own patriotism he brandished proudly. “Being over here is not a duty to me, but rather a privilege—a chance to help our great country in whatever way I can, also a chance to give something back to the community which has helped me in so many fine ways.”[ii]

My father left Saigon with a wounded leg and a Purple Heart. If any one book caused me to enroll at George Mason, it was Neil Sheehan’s A Bright Shining Lie, which I read almost by accident after college. At home, broaching any discussion of the morality of the war aims provoked the furies. Vietnam invigilated my adolescence as the Lost Cause once invigilated southern youth. There was the same intimate connection between history and biography on display; the same embattled refusal to confront the toll of a mistaken war; the same odor of unconfessed shame. At sixteen, watching the Iran-Contra hearings on television with my grandfather, I donated a portion of my summer wages to Colonel Oliver North’s defense fund. By the time I left George Mason a decade later I had renounced my patrimony and embraced my re-baptism, without, however, knowing where it might lead.

I cannot believe any doctoral program in the country afforded greater freedom to think through such matters than the Department of History at the University of Rochester, where I enrolled in 1996. The faculty there drew no firm line between the practice of history, journalism, and social science because they rejected the modern graduate school’s invidious distinction between scholarship and citizenship. They never discouraged graduate students from contributing to upmarket magazines and newspapers, or to the broadest possible range of scholarly journals. I loved it. In addition to completing seminar work in intellectual history, I served as president of the university’s Graduate Organizing Group, participated in the Rochester Labor Council’s Workforce Education Program, hosted Christopher Hitchens, Katha Pollitt, Ralph Nader, and other public figures, and published my first essays.

The generosity entailed by the Department’s conception of history as a vocation spared me the assumption, rigorously enforced among the graduate students I began to meet elsewhere, that to achieve success in the field was to drive for disciplinary knowledge. It also spared me the snobbish assumption, increasingly apparent in those same graduate students, that democratic societies cannot generate cultural excellence. Many and varied were the obligations of the aspiring historian, and mastering historiography was high on the list at Rochester, to be sure. But the first and most important was landing an argument on the level of abstraction between the specialists and the boobs. Telling them apart was the hardest part.

The legendary personalities in the Department’s history still excited interest among the graduate students, encouraging us to honor its legacy of bold, politically active scholarship. Norman O. Brown and Eugene Genovese had taught in the Department. So had Herbert Gutman, co-founder of the American Social History Project and co-author of Who Built America?[iii] Christopher Lasch had outlasted them all. Lasch died two years before I arrived, but his exemplary body of work presided over the Department as a standing challenge. The New Radicalism in America helped me to locate myself in a tradition of intellectual revolt against middle-class families and taught me to beware the social isolation to which it often led. The Culture of Narcissism and The True and Only Heaven tutored me in a sensibility aloof from the canting conservatism I once knew and the liberal tradition I might have considered its only alternative.

In the genealogy of radical social thought that focused my attention, no family of ideas was more significant than pragmatism. I learned to respect the relativity of knowledge, to test the value of ideas by their power to influence action, to define inquiry in the vocabulary of naturalism, to associate freedom with “creative intelligence.” These lessons collected toward a center in Robert Westbrook’s John Dewey and American Democracy, the most accomplished book in the Department, at least among those sensitive to its recent political history. Rather than inflating the significance of his subject, Westbrook made Dewey appear less influential than received opinion suggested. Dewey was not the representative liberal philosopher of the Progressive Era, though Progressivism established the major context for his work. He was a prophet of democracy whose most radical insights had been scanted, misunderstood, or ignored by his critics. In The Revolt of the Elites, Christopher Lasch, one such critic, reversed his judgment of Dewey and dedicated the book, his last, to Westbrook.

“Democracy is neither a form of government nor a social expediency,” Dewey once wrote, “but a metaphysic of the relation of man and his experience to nature.”[iv] With Westbrook as my advisor, I began to see the implications. Democracy was not only procedures and institutions. It was an embattled political theory indebted to theology as well as to the human sciences. It was a social struggle on the part of movements to improve the quality of community life. It was the main story in the ongoing drama of American power abroad.

As for pragmatism, a revival had been brewing in literary criticism, law, and philosophy since the seventies. But timing is the key to inspiration, and the political entailments of Dewey’s pragmatism were never more compelling than in the decade after 1991, when John Dewey and American Democracy appeared. Westbrook presented a theoretically sophisticated argument for radical democracy on premises independent of liberal capitalism and Marxist socialism. In the decade after the fall of the Soviet Union, while Marxism was pronounced obsolete and democracy and capitalism were held to be analytically and actually indistinct, to be armed with the Deweyan dissent was immensely useful. Of the dwindling number of leftist radicals I met in Rochester in the late nineties, a handful continued to call themselves Marxists. Few did so without hesitation. The rest could call themselves democrats or anarchists, as I eventually decided to call myself, without flinching.

The choice to write a dissertation about C. Wright Mills came easily. Mills’s writings in the forties and fifties made him, chronologically speaking, the key figure in the development of radical social thought from John Dewey to Christopher Lasch. Mills had written his dissertation about Dewey (one of his “godfathers”) and Lasch had cited Mills repeatedly and enthusiastically. The disorderly transmission of radical values from generation to generation had puzzled all three men. A study of Mills seemed a good way to repair some broken links and to improve the scattershot and polemical quality of the secondary literature. In Achieving Our Country (1998), Richard Rorty accused the “Mills-Lasch thesis” of misleading the New Left on the threat of international communism. Rorty criticized Mills’s call for “our own separate peace” with communist intellectuals. But Rorty did not tell readers that Mills had made his call after the death of Stalin, during a political moment pregnant with reasons to hope for détente. Elsewhere, Rorty (mis)attributed the idea of “separate peace” to Lasch.[v] I thought these casual errors, all too common in the literature surrounding Mills, masked his rightful place in the Deweyan tradition that Rorty admired and promoted.[vi]

Then, too, the emotional and ethical force of Mills’s style moved me. After reading The Power Elite, there was no going back. Soon I found myself grappling with a genuine biography, not a study of ideas, but a chronicle of a life-in-ideas. Mills claimed that history as a record of presumptive collective action was ending, that the machinery of the “post-modern epoch” was hollowing out the moral culture of democracy. His personal history seemed to tell a more sanguine tale. Born and educated in Texas, he arrived at Columbia University in 1945 as a prince of sociology, winning tenure there before he turned forty. He published ten books, including four bestsellers in nine years, and experimented with organic farming, architecture, marriage, photography, and motorcycling.

I decided to treat his flamboyant egotism as the instrument of his energies, hoping to engage my readers in the completest possible range of legitimate responses. In “What Makes a Life Significant,” William James offered a warrant for this decision. “What our human emotions seem to require is the sight of the struggle going on,” James wrote. “The moment the fruits are being merely eaten, things become ignoble. Sweat and effort, human nature strained to its utmost and on the rack, yet getting through alive, and then turning its back on its successes to pursue another more rare and arduous still—this is the sort of thing the presence of which inspires us, and the reality of which seems to be the function of all the higher forms of literature and fine art to bring home to us and suggest.”[vii] Dead at forty-five, Mills spent his brief life “on the rack.”

The challenge is to lash an unusually wide range of subjective judgments onto the objective rigor of historical analysis, to honor the technical standards of scholarship without dishonoring the spiritual qualities that stamp biography with its humanistic bias. How might biographies of charismatic intellectuals heal the wound between the argumentative and evocative dimensions in our literature? How might “force and fire” and “sweetness and light,” those dueling motifs in Anglo-American letters, mingle in mutual aid and sympathy?

A biography of Mills also seemed a good way to explore the history of the academic profession to which I had entrusted myself. More and more, I grew suspicious. Already by the time Herbert Gutman, Eugene Genovese, and Christopher Lasch had converged in the Rochester History Department in the early seventies, several generations of satirists had taken aim at the petty ironies and defrauded ambitions dogging campus society, mining the conflict between liberal education, with its emancipatory ideals, and the vocational ethos of colleges and universities. “We find ourselves uncomfortable in academic life and often at odds with the profession and the university,” Lasch wrote in 1973.[viii] To read today’s satirists alongside the novels that founded the genre in this period is to be struck by the irony of a satire that never quite transcended the object of its mockery.

If the least that could be said for the academic profession in the seventies was that it offered job security, health insurance, and a full-time salary, this was more than subsequent generations had the right to expect. The job market collapsed. The last person who believed it would recover was the economist and higher education expert William G. Bowen, President of Princeton University until 1988, when he became President of the Andrew W. Mellon Foundation. The next year, in an annual report widely paraphrased and disseminated to college students, myself included, Bowen at once overlooked the biggest story in graduate education and helped to exacerbate its worst features. “The immediate pressures on our graduate schools to provide faculty members for colleges and universities will be considerable in the 1990s,” he wrote. “The results of recent research persuade us that there will be serious staffing problems in essentially all fields within the arts and sciences.” Bowen predicted “very substantial [faculty] shortages” in the humanities and social sciences, such that, by the late nineties, “the competition for faculty members could become so acute that it would threaten the quality of teaching and research in all of higher education.”

Legions of students took Bowen’s advice into graduate school and learned to sight the trouble from the opposite direction. University managers applied to the academic profession the downsizing and outsourcing techniques perfected by the corporations. Neither the political history of this purge, nor its cumulative effect on intellectual freedom, was well understood at the time. In 1998, in conjunction with the Rochester Labor Council, I co-organized a one-day conference on the changing nature of academic work and came away impressed by the meagerness of our collective understanding. Apparently, I was not alone in finding it difficult to comprehend how ruthlessly the life of the mind could be made the prey of market forces. Academic labor? The idea made me recoil. Could the scholar’s natural rights to contradiction and spontaneity obey an industrial discipline? If those rights were alienated, would not the qualities of mind they represent wither in their turn?

The University of Rochester was led by Thomas Jackson, a bankruptcy attorney by training and by spirit. Of all the bad ideas Jackson and his minions promulgated, none affected me more directly than their decision to institute a writing program for the benefit of the “customers,” i.e., the undergraduates. Curriculum-wide composition programs are notoriously expensive to administer and laborious to teach. The Jackson administration, finding graduate students the cheapest and most convenient solution, thus conscripted us into service. The terms of our fellowships shifted, the decision made by administrative fiat. To continue to receive our stipends we would have to interrupt our studies and teach remedial writing. I gave the course once, then left the rest of the money on the table, moving to Austin to work in the C. Wright Mills papers at the University of Texas.

Over the next three years, 1999 to 2002, I served as graduate-student representative to the American Historical Association’s first Committee on Part-Time and Adjunct Employment. In this same period I founded and edited a column on graduate education in the AHA’s newsletter, Perspectives. Here, as elsewhere, I argued that if history was a craft, then historians should organize themselves into guilds and strive for independence at the point of production. A guild, so conceived, regulates admission to its ranks; enforces the ethical and aesthetic standards of its craft against the pecuniary interests of employers; multiplies classes and types of jobs in sympathy with the skills and desires of its membership.

Organizations such as the American Historical Association, which present themselves as custodians of the standards by which scholars are educated, accredited, and employed, bear a passing resemblance to the guild ideal. Yet professors in America have never formed such self-governing bodies and invested them with necessary economic powers and political responsibilities. As a result, the young scholar has been defenseless, confronted by insultingly narrow job descriptions and forced to transmit knowledge by paraphrase, in prefabricated, digestible forms and methods easily copied. Then the demands made upon the successful teacher establish the criteria for legitimate research. Promotion means proving one’s competence by writing a thesis-driven monograph in one small corner of one small field. Tenure means retaining responsibility for it, and often for it alone, forever after.

Most literate adults could master professional history’s dozen Big Ideas in a long summer of reading the “state of the field” handbooks. Could they read the dissertations and monographs, which hang like edicts over the souls of scholars? The graduate student or assistant professor bores into a mass of dumb facts with tools handed down by forebears, then “deposits” the product in the library-bank for reference and storage; but not often for reading.

As a permanent class of lecturers and adjunct teachers took shape in the nineties, it fell to a new group of union organizers to establish a measure of independence at the point of academic production. In return, they provoked vicious opposition from the most powerful faculties in the country, although these faculties have had little to offer but a restatement of their privileges. And from the rest? The plight of the scholar who once expected to command a seller’s market elicited vested indifference or outright incomprehension. Professors have continued to think of themselves as craftsmen even as they fulfilled the economic functions of corporate managers and bosses. How well I remember Rochester historians bemoaning the divestment of upstate New York, where disappearing manufacturing jobs stranded thousands of unemployed and unemployable workers. Yet these same historians balked at their complicity in a system of graduate education that wasted its products by similar dynamics of over-production and under-employment.

Who is responsible? Nobody can say. The losers are referred to The Market. That they are so referred by the same professors who have built careers on criticizing the irrationality of unregulated markets—everywhere, of course, except their own departments—has made the news all the more difficult to bear.

My experience with the American Historical Association suggested a collective inability on the part of professors to act. The discussions of the Committee on Part-Time and Adjunct Employment taught me that, with some exceptions, the leading members of the AHA are convinced of their helplessness. They might affirm the right of teaching assistants to form unions, but they would not bring the fight for full-time, benefits-eligible work to the accrediting agencies, where power lies. Nor would they tolerate much impudence. After sixteen months of editing “Issues in Graduate Education,” in which I fed readers of Perspectives a steady diet of protest and analysis, I was instructed to moderate the message. I resigned instead, and was replaced by a cadre of graduate students who set about commissioning articles on the best layout for a curriculum vitae, and other Significant Subjects.

The intellectual fragmentation of the humanities has undermined the ability of their spokespeople to mount a common defense of the very idea of a free professional, with rights and values distinct from those of the mere employee. Yet the social life of professionalism persists. Critics who voice their arguments outside the cool, detached register of academic discourse meet epigrams aimed at dissuasion (“pick your battles”—a euphemism for do not fight this battle) or maxims against “burning bridges,” as if bridges built upon a suspension of one’s critical faculties might lead anywhere one might wish to go.

Within these broad limits I have struggled, no doubt unsuccessfully, to reduce the bombast in my writing. Some bombast appears to be inevitable. If the political psychology of righteous indignation is too prominent among anarchists, that is because our method of locating responsibility for suffering outside immediate experience continually risks blurring distinctions between I and other, self and society. Anarchism has no philosophy of history, no superannuated system of concepts to blunt the knife-edge of personal complaints. Criticism turns to querulous resentments. “Every poor devil finds pleasure in scolding,” Nietzsche wrote of the anarchist temper. “It gives him a little of the intoxication of power. There is a small dose of revenge in every complaint. On the basis of this logic, one makes revolutions.”[ix] Then again, without punishment there is no culture, even if the antithesis (without culture there is only punishment) is also true. The resolution, if there is one, should lie in the critic’s cultivation of distinctions between blame and pique, justice and revenge. The facility for keeping them in view must come out of radical self-awareness and self-discipline.

The dissuaders flee from the risks and burdens of engaged criticism by taking shelter in a ready-made hierarchy of values. Or they dissolve the ugliness of public life into the old mugwump gentility, warning against being negative because they believe conflict inflicts psychic damage. Either they withdraw into an absolutist morality; or they withdraw into a nihilism that cowers before the task of judgment because it cannot imagine any values worth defending. The alternative, the Dale Carnegie way, skirts the whole problem. “When dealing with people, remember you are not dealing with creatures of logic, but creatures of emotion,” Carnegie says. “The only way to get the best of an argument is to avoid it.”[x] Touché.

On the campus itself, the professional tone is still available to those who would insulate themselves from criticism over values. Governed by committees and rituals of dissent, the professor beholds a morally passionate challenge as little more than heckling. The erosion of the economic basis of middle-class professionalism might have disturbed the quietism in the professional manner if it had not found support from an unexpected source in “the feminization of history” and humanities teaching.[xi] American women in the nineteenth century seized control over the primary schools with a thoroughness unknown in any other part of the world. Since the last formal barriers to women faculty fell in the seventies, they have achieved comparably disproportionate gains in universities. That their achievement has coincided with a decline in the status of college teaching and a partial withdrawal of the jobs from the full-time economy lies within the historical norm in America. Has the feminization of campus society also inculcated a sentimental attitude toward students, a political correctness that causes criticism to die on the tongue?

Carnegie promised white-collar men that if they emasculated themselves—if they suppressed their thirst for conflict by eliding the distinction between sincerity and the performance of sincerity—then success would be theirs. Feminization may offer the same bargain by a pedagogy of therapeutic uplift and ego-building. I am not sure how one might demonstrate this. When I began teaching at Harvard University I knew that more than half my colleagues were women, and I thought feminization omnipresent. Toward the end, when President Larry Summers lost his job and was replaced by Drew Faust, a woman historian, I thought I saw it running at high tide.

Arriving at Harvard in the summer of 2000, I joined the staff of the Committee on Degrees in Social Studies, the fifth largest concentration in the College and the most distinguished of its honors programs. As Tutor, then Lecturer, I advised senior theses, conceived and conducted freshman and junior seminars, and taught the sophomore tutorial six times. The fractured nature of my appointment, renewed annually for six successive years while never amounting to more than 65 percent of a full-time post in any one year, kept me on the margins of prestige and promotion even as it kept me there long enough to serve three Chairpersons of Social Studies and two Directors of Studies.

The position afforded an exhilarating scope of intellectual engagement. I taught the classics of modern European thought, Hobbes to Habermas. Founded in 1960 to combat the fragmentation of liberal arts education, the Committee took the name Social Studies (rather than Social Sciences) in the bold spirit of C. Wright Mills’s The Sociological Imagination. It was an ideal place to study.

But it was not an ideal place to teach. The Committee had no ladder faculty of its own. It borrowed assistant professors and staffed the remainder with lecturers and teaching assistants. The students faced a permanent, unavailable faculty on the one hand, and a transient but available class of adjuncts on the other. The fragmented composition of the teaching staff continually subverted the Committee’s interdisciplinary ideals. With no possibility of promotion within Social Studies, the assistant professors and lecturers had to prove themselves in their disciplines, where tenure waited at the edge of their thoughts. A colleague once explained to me that although Social Studies gave equal time to Michel Foucault and Jürgen Habermas, assistant professorships in political theory split between “Foucaultians” and “Habermasians.” One had to choose. Rather than encouraging a transvaluation of academic methods and disciplines, as the founders intended, Social Studies trundled from one to the next. Its isolation, in turn, was physically represented by its location on the edge of the campus, far from Harvard Yard.

Because I was an adjunct, I had no job description, no political rights, no promise of renewal beyond each year, and thus no basis to protest the opaque manner by which tutorial assignments were decided and distributed. Class consciousness among the adjuncts was nil. The corporate mentality was much in evidence. Accordingly, it was never enough to perform one’s duties well. One was expected to attest to the benevolence of the institution. This could be accomplished by aping the local definition of the Good Attitude. As consumers turn themselves into adjuncts of corporate marketing departments, so adjunct faculty identify their self-worth with the propaganda interests of their employers. And why not? Everyone recognizes the value of the Harvard Corporation’s brand.

I found the students the strangest part of the scene. It was not that their demand for relevance vitiated the mission of Social Studies; relevance was always the price the liberal arts have had to pay to gain a place in the curriculum. It was their fetish for wealth and status I found so striking. Harry Lewis, the former Dean of Harvard College, described the campus culture as Excellence Without a Soul (2006), a harsher phrase than any that crossed my mind. Yet I, too, was startled by the cynicism of the students.

During my time at George Mason, I had sent my parents a copy of “So You Want to be a Historian,” an essay in the Washington Post Education Review. It described the life of the mind as a calling that demanded not only a heroic commitment to reading and writing, but a spiritual conviction of the importance of knowledge and sensibility freely achieved.[xii] I had sent the essay home because it explained why I wanted to pursue a life of scholarship. Over the next decade, as religious enthusiasm swept through Gettysburg, the spiritual component of my vocation remained strictly eccentric there. At Harvard, meanwhile, I confronted another form of anti-intellectualism. My interview for the position in Social Studies ended with a warning from the Director of Studies. Never show weakness in the presence of the students, she said. She called them “sharks” and promised that they would “eat you alive.”

The post-pubescents of notables for whom I suddenly found myself holding curricular responsibility included the daughter of a U.S. Senator, the son of a Hollywood director, and the son of New Jersey real-estate developer Charles Kushner, named here because he is a leading example in Daniel Golden’s piquant The Price of Admission. Not long before Mr. Kushner was indicted for bribery (he was convicted on tax charges) he got his son Jared into Harvard with a $2.5 million donation. Back in prep school (according to Mr. Golden), Jared had not impressed his teachers or guidance counselors as the kind of student who might excel academically. Of this curious background I knew nothing at the first meeting of my first seminar of my first year, when Jared entered my classroom and promptly took the seat across from mine, sharing the room, so to speak. I was drawing an annual salary of $15,500, and borrowing the remainder for survival in the Cambridge housing market, in order that Jared might be given the best possible education. I was subsidizing him.[xiii]

Most of the students I encountered had already embraced the perspectives of the rich, the powerful, and the unalienated, and they seemed to have done so with appalling ease. In keeping with the tradition of the American rich, they worked exceptionally long hours, they were aggressive in exercising their talents, and on the ideological features of market-capitalism they were unanimous. Their written work disclosed the core components of the consensus upheld by their liberal parents: wherever equality of opportunity presents itself, equality of welfare is likely to ensue; the meaning of liberty lies in the personal choice of consumers; free competition in goods and morals regulates value; technological progress is an unmixed good; war is unfortunate.

Around this consensus crystallized an ethos. One of my less affluent students, the son of a postman, asked me once for advice about a financial investment. He said his friends had told him to invest “in prisons”—in one of the new private companies gaining management contracts over correctional facilities. I told him what I thought about this recommendation, but only later, when I learned how little he had to invest ($2,000 was his total savings), did I think I understood his question. No amount of money may be permitted to lie idle if something may be gotten for nothing. The capitalist theory of life as a game disallows uncapitalized advantages.

I asked each of my seminars whether they had so far encountered a teacher they genuinely appreciated. What aspects of manner did they admire? Invariably the students replied that good teachers made them “feel comfortable.” To sense the sterility, one had only to listen. When I proposed to teach a junior seminar entitled Anarchist Cultural Criticism in America, I was instructed to go ahead only if I first changed the title to America and Its Critics. Here was the same method of cultural hygiene that has transformed Harvard Square from a bohemian enclave into an outdoor mall. “Shopping period” was the name of the week in which the students selected their classes.

Grading, the one instrument of power I wielded, offers the best example of the degradation of pedagogy by the frenzy of success. The Boston Globe’s exposé of grade inflation at Harvard left little doubt that grading is a semi-rigged competition, another subsidized risk. The formal scale runs from A to F. The tacit scale runs from A to B. I learned the latter from students and supervisors, but especially from colleagues, few of whom wish to carry the opprobrium of the low end. Be this as it may, the presence of two standards of value, one official and one tacit, is always a sign of corruption: the one necessarily dishonors the other. It also abridges academic freedom. Although I never dared to give a final grade below B minus, I can attest to the petty harassment that teachers attract in such cases. I do not mean merely that the students are never so aggressive and articulate as when they hunt for grades. I mean that they wage political reprisals against the B minus grader and send gifts to high-placed academic directors.[xiv] Electronic mail abets all varieties of grade-jockeying by abolishing what little remains of the presumptive moral distance between teachers and students.

In January 2008, a “Group of Harvard Alumni from the Vietnam War Era” sent an open letter to the university’s president. “We are concerned by what we see to be the widespread apathy and political indifference of the student body at Harvard College today,” said the letter, which defined the problem as “self-examination and broad intellectual growth versus the careerist, vocational orientation.”[xv] The letter was only half-right: the students are the opposite of apathetic and indifferent. The new student rich have retained the radical energy of the sixties, only to engage it in lushly monetized competencies. The New Left occupied universities to protest the bureaucratic hollowness of examination rituals and grading rationales. Now its children complete the attack on the authority of teachers, who are simply annexed to the management of student careers, drawn into an agreement between corporation and client in which failure is not an option. I had to grade the students, and I had to grade them well. Everyone expected a recommendation letter.

The ethos mimics the psychodynamics of inflation in this age of unlimited markets. Apparently, ever since the students were young, their parents and teachers have bathed them in ambitious glances, so that the source of their identity has come to lie in their potential. This is why, though they demand to be graded, they resent the teacher’s claim to judgment based on performance, which implies a stable set of values. A relatively low judgment may be met by the always available thought that they could have done better.

This thought is not as easy to rebut as one might suppose. Harvard students may be divided into three types. The first infer from their presence on campus that they have already made it; the second, that they are on their way to making it. Both types are keenly aware of the prestige-value of their situation. To mention to a stranger where one attends college is to drop the “H-Bomb.” Neither type, accordingly, has encountered any really good reason to suppose that their potential is anything but limitless. Members of the third type, the ironists and the scoffers, have their degree and eat it too. Their anti-Harvard posturing incurs no risk. The gigantic endowment, that symbol of unspent potential, blesses their skepticism by indexing their value on the credentials market. Consider that the grading scandal (an open secret on campus) broke into the public discussion at the same time the dot-com bubble burst. Try to see these phenomena as twin instances in the chronic overextension of the credit markets. When scholars act as clerks and students act as clients, college teachers do not differ from corporate accountants.

If youth is wasted on the young, is teaching wasted on students? And what about the scholar? What happens to those of us who discover a jealous passion when we discover our calling, then find the university, the only institution formally dedicated to upholding the scholarly ideal, unmindful of our aspirations and sedulously banal in its own? “Every young man who feels called to scholarship has to realize clearly that the task before him has a double aspect,” Max Weber wrote. “He must qualify not only as a scholar, but also as a teacher. And the two do not at all coincide.”[xvi] The gap between the moral culture of scholarship and the vulgarity of college teaching deserves to be drawn sharply, if only so that it may be mitigated.

To hope for reform, however, is to be fooled twice. The proliferation of ancillary services and the growing power of student affairs offices show all trends running in the direction of consumer services. They have been running that way for quite some time. As the sentimentalization of manners has taken a feminine direction in the professional ethos, so the art of pedagogy has become a service-class preoccupation whose chief duty is flattering students on their way to monied careers.[xvii] Vain is the image of the college teacher as a bearer of culture, a representative of a value-standard, a model of aspiration and achievement, unapproachable except by knowledgeable respect.

“The great historians, with few exceptions, are those who have not merely studied, but lived; and whose subjects have ranged over a much wider field than the period or subject of which they write.”[xviii] This passage, from Samuel Eliot Morison’s History as a Literary Art: An Appeal to Young Historians, is full of unfashionable wisdom. I think Morison, in urging scholars to come out of the classroom, meant to protect their self-perfection from the constant company of the very errors only recently left behind. Teaching, so understood, is a form of sabotage. “Every teacher acquires a continually increasing stationary force, a cumulative inertia in proportion to the eloquence of his innovating doctrines,” Emerson complained to his journal in 1834.[xix] Emerson was one of a surprising number of thinkers who have exerted a pronounced influence on modern thought without ever holding a regular college or university appointment, surprising since colleges and universities are virtually the only places where his influence peeks through.[xx] Perhaps he retained his faith in education because the Lyceum was the social setting for his pedagogy. He addressed his pupils, his public, inspired souls in the garden of his perceptions.

Should I say that I am grateful for the chance to teach at Harvard? I am. Should I acknowledge the many fine exceptions it was my privilege to instruct? I do, with pleasure.[xxi] I finished my dissertation and won three teaching prizes. At the end, however, I made my assignations in the mists. We were all drugged. Teaching on the temporary staff at Harvard was a little like visiting Disneyworld. The magic dust induced a light narcosis. The mind went incontinent in the presence of paradox and conflict, and it was tough to tell how much fun I was having from how much I was having to pretend. The most important thing was never to become the screamer who ruins the ride for everyone. The line is long.


[i] The American Jewish Year Book for 1989, the year I graduated from high school, does not find any town in south-central Pennsylvania hosting more than 100 Jews. Were there 99 or none? I never knew any.

[ii] “Lions Receive Note of Appreciation,” Hanover Sun, December 1, 1970.

[iii] See Ira Berlin, “Herbert G. Gutman and the American Working Class,” in Gutman, Power and Culture: Essays on the American Working Class, ed. Ira Berlin (New York: Pantheon, 1987), 3-69.

[iv] Quoted in Richard Rorty, Achieving Our Country: Leftist Thought in Twentieth-Century America (Cambridge: Harvard University Press, 1998), 18.

[v] Against Bosses, Against Oligarchies: A Conversation with Richard Rorty, eds. Rorty, Derek Nystrom, and Kent Puckett (Chicago, 2002), 50.

[vi] Rorty later published an essay in the London Review of Books in which he predicted the desiccation of free institutions should another terrorist attack strike U.S. soil. “At the end of this process of erosion, democracy would have been replaced by something quite different. This would probably be neither military dictatorship nor Orwellian totalitarianism, but rather a relatively benevolent despotism.” Since Rorty went on to ascribe the origins of the democratic predicament to the same national security shibboleths Mills criticized at their moment of conception, it would be interesting to know whether he reconsidered his call to “redescribe” the portrait of American political society Mills sketched in his vast writings. See Rorty, “Post-Democracy,” London Review of Books, v. 26 (April 1, 2004).

[vii] William James, “What Makes a Life Significant,” in The Writings of William James: A Comprehensive Edition, ed. John J. McDermott (1967; Chicago: University of Chicago Press, 1977), 648.

[viii] Christopher Lasch, “Foreword” to Richard Hofstadter, The American Political Tradition and the Men Who Made It (1948; New York: Vintage, 1973), xxiii.

[ix] Friedrich Nietzsche, Twilight of the Idols, trans. R.J. Hollingdale (New York: Penguin, 1968), 86.

[x] Dale Carnegie, How to Win Friends and Influence People (New York: Simon and Schuster, 1936), 143.

[xi] Lynn Hunt, “Has the Battle Been Won? The Feminization of History,” Perspectives, May 1998. Dr. Hunt’s answer is: probably.

[xii] Only later did I discover that Dr. Fields had omitted the one career fact it should have been impossible for her to ignore. Most of those who “want to be a historian” spend their lives in the classroom, teaching. Of this, she said nothing. In this manner, the graduate schools qualify students for a career in college teaching, and for nothing else, without first wooing the kinds of people who wish to dedicate their careers to pedagogy, or imparting to the rest the skills and resources adequate to its arts. The mismatch between the technical qualifications of doctoral recipients and their real qualities of character and aspiration would appear to be the cause of the halfhearted instruction so noticeable on our campuses.

[xiii] Jared later purchased the New York Observer for $10 million, part of which he made buying and selling real estate while he was failing my seminar at Harvard. As publisher, one of his first moves was to reduce pay for the Observer’s stable of book reviewers. I had been writing reviews for the Observer in an effort to pay my debts.

[xiv] Once, a judge and his wife went to my supervisor to complain about a grade I had assigned in a senior orals examination. They rested their complaint on the fact that I was not yet in possession of the all-encompassing credential, the Ph.D. They pointed out that the second examiner in the room had assigned the exam a slightly higher grade, and that this second examiner was, in fact, a Ph.D. The judge and his wife did not know, nor did they care to discover, that I was by far the more experienced of the two graders. I had been conducting exams for four years; the second examiner had never before conducted one. A minor gaffe, to be sure, but one that William James, author of “The Ph.D. Octopus,” could have understood and appreciated.

[xv] See “Alumni Protest over Lack of Protest,” Times Higher Education Supplement, Jan. 4, 2008.

[xvi] Max Weber, “Science as a Vocation,” in Hans Gerth and C. Wright Mills, eds., From Max Weber: Essays in Sociology (New York: Oxford University Press, 1946), 135.

[xvii] See Burton J. Bledstein, The Culture of Professionalism: The Middle Class and the Development of Higher Education in America (New York: Norton, 1976).

[xviii] Samuel Eliot Morison, History as a Literary Art: An Appeal to Young Historians (Boston: Old South Leaflets, Series II, No. 1, 1948), 12.

[xix] Quoted in Robert D. Richardson, Jr., Emerson: The Mind on Fire (Berkeley: University of California Press, 1995), 169.

[xx] The list also includes Max Weber, Friedrich Nietzsche, Sigmund Freud, Karl Marx, Alexis de Tocqueville, Simone de Beauvoir, J.S. Mill, Henry Adams, Charles Beard, and many others.

[xxi] See Matthew Mahan, “Harvard College and the Liberal Arts,” Senior Honors Thesis, Harvard University, 2005; and “Disillusioned at the Top,” Harvard Crimson, April 14, 2005.