The Arc-Hive Mind?


I have spent this past week conducting preliminary research for my dissertation prospectus at the National Archives II in College Park, Maryland. While there I had an epiphany (of sorts) about the problems academic historians have reaching a broader public. Unlike other critics, who point to the difficulties presented by academic writing or stress that the nuance of academic histories is too much for the “average” reader (whoever that is), I think the problem is the archive, or rather, the relationship between academic standards of evidence and narrative.

Two interrelated events led to my epiphany. Working at a massive research facility like Archives II, I was confronted by a dizzying array of brilliant historians and archivists. Everyone had a thoughtful research project. The reading room resembled a historian hive, with researchers buzzing between carts, computers, and copy machines. The collection of source material, divided into its binders, folders, boxes, and carts, was overwhelming. But one thing was missing: narratives. There was no box labeled “Narratives” and you couldn’t find one searching the online database (believe me, I tried).

After a long day at the archive, I was killing time in Dupont Circle before meeting friends for drinks. I wandered into Kramerbooks and Afterwords – a charming independent bookstore – and began browsing their new history releases section. Being a poor graduate student and an Amazon Prime member, I rarely peruse the new history releases and was shocked by how few names I recognized despite being a professional (in training) historian. Flipping through books by Mark Kurlansky and other popular historians, I – again – noticed something was missing: the archive. Most of the evidence in the popular histories came from published primary and secondary sources; unsurprisingly, many journalists were partial to newspapers. These histories were overwhelmingly story-driven. The few histories I had read were by aging or deceased academics like Hofstadter, Lasch, Schlesinger Jr., Zinn, and Woodward. It seemed as though the history academy had lost the popular audience it once enjoyed during the 1960s and 1970s. But was their work really so different? Feeling confused, I walked out empty-handed.

I thought about all the hustle and bustle in the archives earlier in the day and wondered: where is the output from all this research going? Who is reading all the brilliant work produced by researchers at Archives II? How did popular historians come to fill the void left by academic historians of earlier eras?

Then I began to think of the archive. Hofstadter was notoriously archive-resistant. Lasch seems to have drifted from archival work as he became more of a public intellectual after the publication of The Culture of Narcissism in 1979. Schlesinger’s The Vital Center is almost entirely supported by secondary sources. I thought of all the little archive boxes filled to the brim with folders and wondered: are archives limiting our horizons as historians? This problem seems particularly acute among young historians trying to demonstrate academic rigor to their senior colleagues. I remember writing my master’s thesis and building chapters around novel archival evidence instead of published material because I thought it would showcase my research ability. Narrative was often the last of my concerns, or something I simply hoped would emerge if I added enough biographical anecdotes.

Beyond limiting research, the archive also seems to exacerbate the bifurcation between academic historical research and politics. It seems to me that the archive creates yet another separate community – along with the academy and scholarly associations – that isolates academic historians from the public. Academic isolation has intensified the professorial argot, but more importantly it has led to group-think about what is important and how to communicate themes that academics believe are significant to a broader audience. I was shocked by how few histories of race, sexuality, and gender were carried at a well-stocked independent bookstore (and those that were carried were written by non-academic historians like Isabel Wilkerson). Military histories still predominate in bookstores and on television, despite their declining significance in the academy. As the Supreme Court debate over DOMA continues, the value of historical context to politics is readily apparent. Histories of gender and sexuality composed by articulate academics, if widely read, could help Americans understand why gay marriage is an important issue, in the same way C. Vann Woodward’s The Strange Career of Jim Crow made legible the origins of the Civil Rights Movement.

Obviously, I think academic historians need to look beyond the archive and focus on storytelling to broaden our audience. Outgoing American Historical Association president William Cronon has already made this point in his annual address, so there is no point in belaboring it here, but the question remains: how can professional historians be encouraged to publish books for public consumption while retaining high research standards? Though it will undoubtedly be a long and difficult process, I am confident an accommodation can be reached. It will mean, however, that historians will have to enter the scrum of politics, where the rules of reasoned debate rarely hold. But getting a little dirty is a small price to pay for the transformative potential of academic historians acting as popular social critics.

Researchers at Archives II and other reading rooms across the globe have a tremendous wealth of information to share. It is time we began to focus on communicating that information to the widest popular audience. To accomplish this, researchers will need to leave the arc-hive and venture out with their knowledge to pollinate the world.

President Obama’s Moral Revolution and Its Dying Vanguard


On May 27, President Obama became the first US president to visit Hiroshima since the United States dropped an atomic bomb on that city on August 6, 1945. The visit was the culmination of a long process of reconciliation between the United States and Japan since the end of World War II when Japan changed from wartime enemy to valuable ally almost overnight. In his speech delivered before the bombing’s survivors and their relatives, President Obama called for a “moral revolution” in the face of increased technological capacities to kill large numbers of people. “Technological progress without an equivalent progress in human institutions can doom us,” he warned (full text of the speech can be found here).

President Obama’s speech echoed the concern, felt by many American journalists and academics in Hiroshima’s immediate aftermath, that technological change was outpacing human moral capacity. Journalist John Hersey, sent to interview bombing survivors for The New Yorker, told stories of dread, shock, and suffering. Lieutenant Daniel McGovern captured film footage of the bomb’s impact, showing bombed-out buildings and the bleached skulls of the blast’s victims. Upon hearing about the bombing, the painter Pablo Picasso is supposed to have remarked, “the genius of Einstein leads to Hiroshima,” linking the beauty of scientific discovery to the devastation of instantaneous mass murder.

Nothing captured concerns about the ethics of using an atomic bomb better than Alexander Leighton’s 1949 book Human Relations in a Changing World, however. Before arriving in Hiroshima in December 1945 to map the bomb’s psychological effects on Japanese civilians for the United States Strategic Bombing Survey, Leighton had spearheaded research on the morale of Japanese-Americans interned at the Colorado River War Relocation Center at Poston, Arizona. The aim of his research at Poston was to assess how the Japanese community responded to the stress of relocation and internment. Leighton hoped that an administration informed by social scientific knowledge – group psychology in particular – would be more efficient and humane. While Leighton did not oppose internment, he advocated administrative reform in the camps emphasizing cooperation between administration and internees on issues ranging from public health to community leadership, with the hope of combating the dehumanization of internees. When he left in 1943 to take a job with the Office of War Information, Leighton was confident that social science had ameliorated conditions in the camps by improving relations between camp administrators and internees.

Leighton’s attitudes differed markedly between his work at Poston and his trip to Hiroshima, showcasing a lost confidence in the ability of administrative reform to keep pace with the technology of dehumanization and killing. Whereas the poor conditions at Poston – sweltering heat, unsanitary and overcrowded facilities, and popular distrust of administrators – could be overcome by administrative reforms and improved communication, at Hiroshima there was little left to reform. Describing his first impression upon arriving at Hiroshima, Leighton invoked a “city dump with its smells of wet ashes, mold and things rotting, but one that runs from your feet out almost to the limits of vision.” The 4.4 square miles of downtown Hiroshima were completely destroyed. Leighton found a people shattered by the experience of vaporized lives and lost loved ones. An elderly schoolteacher told Leighton the bomb had transformed Hiroshima from “Paradise to Hades” in an instant. What haunted Leighton most was a feeling that Hiroshima was only the beginning. “This is a preview of things to come where I live,” he wrote, “These thousands dead will not be strangers. They will include friends, brother, sister, father, mother, wife and children. Some will die instantly and some will survive awhile to realize their slow death and to beckon for help where there will be no one to answer.”

Leighton came to believe Hiroshima was made possible by the outpacing of moral or civilizational progress by technological development. He hoped that social scientific advances would make using weapons of mass destruction obsolete by easing international tensions. Work in the fields of sociology and anthropology had important roles to play as well, highlighting commonalities unifying the human species. Furthermore, the very place of the social sciences in tying the impersonal work of the hard sciences to the moral world of human beings was significant. Leighton believed social scientific interventions into the natural sciences were necessary for moral guidance. “Moral values when pertinent dominate scientific values at three contiguous points: the selection of the problem to be investigated, the limits of the human and other materials that may be used, and the determination of what shall be done with the results.” Social scientists with their specialty in human values and experience would prevent scientists from privileging scientific theories and results over ethical concerns.

Leighton made numerous recommendations for how to disseminate social scientific knowledge ranging from expanded university fellowships to public education initiatives. Explaining the values and experiences unifying humanity was, for Leighton and others who experienced Hiroshima’s aftermath, an obligation shared across American society from policymakers in Washington to families in small towns.

Leighton’s suggestions make for uneasy reading alongside the continued national defunding of the social sciences during the Obama administration. The Obama administration has vocally supported the STEM fields but has offered lukewarm (at best) support for the social sciences and humanities. In April 2015 the Republican-led House of Representatives Committee on Science, Space and Technology proposed a 45% reduction in federal funding for the social sciences (a useful summary can be found here). This came even as the committee increased the overall budget for the National Science Foundation, “adding more than one hundred million dollars each to the offices for biology, computing, engineering, math, and physical sciences.” National cuts reflect declining university enrollments in the social sciences. The University of Washington, for example, reported enrollment declines in the social sciences ranging from four to forty-five percent, depending on the department, and responded by cutting twenty-five teaching assistant positions. The 2015 panic over national cuts confirmed fears that waning American economic competitiveness had made separating the “useful” natural sciences from the superfluous social sciences a priority for policymakers and universities alike.

Alexander H. Leighton in Poston, AZ, during World War II.

President Obama’s visit comes at a crucial moment as America’s East Asian allies are challenged in the South and East China Seas by an expansionist China. His speech was both a reaffirmation of his commitment to Japan as a US ally and a warning to China about the dangers of expansionism. The President’s speech also underlined the perils of dehumanizing language for American audiences. Donald Trump has risen to the Republican Presidential nomination on hateful rhetoric meant to demonize racial, gender, and cultural “others” as inferior and dangerous.

The moral revolution Obama sees as the antidote to aggressive expansionism abroad and xenophobic nationalism at home begins by reaffirming the human obligations of global citizenship. Yet it is difficult to imagine constructing a civically responsible American populace while systematically defunding its social scientific and humanistic vanguard. Moral revolutions are not spontaneous. They begin with an understanding of the ethical problems currently facing humankind and of how we confront those problems together as part of a single global community. The social sciences and humanities have an important role to play in demystifying other cultures and educating Americans in how to become contributing global citizens.

Academic Inequality: A Problem of Ideas and Institutions


Yesterday, an article on Slate, “The Academy’s Dirty Secret”, showed the extent to which elite universities dominate faculty hiring. History was the study’s worst offender: it found that eight universities account for half of all history professors. The few students from non-elite universities able to find university jobs usually did so at schools even less prestigious than their graduating institutions. The article concludes with a warning that such a concentration of power in the hands of a few schools could stifle creativity and marginalize paradigm-shifting ideas contributed by academic outsiders.

While the Slate article is effective in showing the scale of the current crisis, it fails to put its findings in historical context. This institutional disparity is nothing new. I have found in my own research that many of the same concerns and frustrations hindering less prestigious schools today were expressed decades ago. The creation of the Institute of Far Eastern and Russian Studies at the University of Washington in the mid-1940s is a useful case in point. Its founder, George E. Taylor, was up against a field defined by elite Ivy League programs at Harvard, Yale, and Columbia. Despite success attracting funding to his Institute, he routinely lost promising faculty to Ivy programs and struggled to place Washington graduate students at top universities. By contrast, John K. Fairbank, a Harvard history professor and Taylor’s biggest competitor, used his connections to other Ivy institutions as well as powerful government officials to secure placements for his graduate students at top schools like Stanford and the University of California, Berkeley (Harvard also employed several of Fairbank’s students). He was so well connected with university administrators at other institutions that he often knew about job openings before the hiring departments did, giving him an additional advantage in pressing a hiring committee to take on one of his students. By 1950, Taylor and his staff at Washington were fed up. Convinced that there was an Ivy League conspiracy against their program and animated by the early Cold War’s anti-communist hysteria, Taylor, along with his colleagues Karl Wittfogel and Nicholas Poppe, became witnesses for the anti-communist loyalty committees chaired by Senator Joseph McCarthy and, later, Patrick McCarran that investigated the China studies community during the Second Red Scare.

As the episode above illustrates, there is continuity in the concentration of academic power in the hands of elite institutions between the mid-20th century and today. This continuity is relevant to intellectual historians for two important reasons. First, it shows the central role institutions play in shaping ideas and intellectuals. A recent post by Audra Wolfe on the S-USIH blog laid out the wonderful recent studies by historians like Joel Isaac and Jamie Cohen-Cole on the way university institutions shaped work done in the social sciences. There have also been fruitful (if somewhat limited) forays into the ways funding, either through the government or private grants, has influenced American ideas in the 20th century. Still, there is much more work to be done in exploring the relationship between ideas and institutions. Much of the recent scholarship recapitulates institutional inequalities by only examining elite institutions. There has been little work done on less prestigious schools and how lack of connections and funding shaped their intellectual production. There has also yet to emerge much research on recent developments in the relationship between university scholars and institutions after the collapse of the New Deal coalition in the 1970s. Olivier Zunz has shown how the emergent New Right devised its own forms of private philanthropy during the Reagan years, but its impact on intellectual production has not been explored.[1]

The second important function such continuity serves for intellectual historians is contextualizing ideas and intellectuals within longer durées that show how funding and institutional mantras perpetuate strains of thought. Institutions have long lifespans, often outlasting generations of scholars. They are also not value neutral. The Rockefeller Foundation, for example, has had the same mission (“to improve the well-being of humanity around the world”) since 1913. This mission has been interpreted in different ways over the hundred years it has been in use. Again looking at China: in the 1930s Rockefeller sponsored large development and education programs in China, with particular attention paid to agriculture and medicine. War, hot and cold, compelled Rockefeller to channel money for China away from direct investment in Chinese development and into American university programs devoted to studying Chinese history, culture, and society. Despite changes in practice, Rockefeller continually promoted Chinese development and democratic institution-building. Its investment in university China studies ensured that intellectuals who shared its vision would have the financial resources to pursue their work, which was of no small significance in a new field with limited connections to sources of funding.

It is tempting to idly despair at stories of institutional inequality, particularly when taken together with news about the perpetually shrinking academic job market. As paralyzing as the prospect of future unemployment can be, despair does little to help us understand or address the problem. Like the parallel problems of race and gender inequality, using history to contextualize institutional inequality will both help us better understand how these disparities were created and undermine arguments that such inequalities are natural or inevitable products of the higher education system. But if historians are going to properly contextualize our current plight, we need to be more sensitive to the role institutions play in shaping ideas. Addressing inequality involves stripping away harmful mythologies about meritocracy and “great thinkers” to get at the institutional roots of their creation and popularization. Only by understanding these roots can we properly adjust the discipline to create fairer and more egalitarian hiring practices.

[1] Olivier Zunz, Philanthropy in America: A History (Princeton: Princeton University Press, 2012). For another book that effectively explores the relationship between conservative ideas and funding sources see, Angus Burgin, The Great Persuasion: Reinventing Free Markets Since the Depression (Cambridge, MA: Harvard University Press, 2012).

The Past Is Not A Cold Dead Place: Perry Anderson, Genealogy, History


[This piece first appeared at the Society for US Intellectual History Blog on July 18, 2014]

In his triptych of articles for The New Left Review – “Homeland”, “Imperium” and “Consilium” – Perry Anderson examines the rise of the current American neoliberal national security state. Anderson uses each article to tackle a different aspect of this rise: “Homeland” looks at domestic politics, “Imperium” at foreign policy, and “Consilium” at current mainstream academic thinking on US foreign policy. These articles, and “Imperium” in particular, are historically oriented. Anderson traces the creation and development of the national security state from the 19th century to the present day, with special emphasis on the way the presidencies of Theodore Roosevelt and Woodrow Wilson shaped US policy thinking about America’s place in the world. Anderson’s erudition and the breadth of his project are impressive. The further I read his NLR articles, though, the more one question nagged at me: is this history?

I believe Anderson is writing as a genealogist, not a historian. Since being adopted by the philosopher Friedrich Nietzsche in the late 19th century, genealogy has been one of social criticism’s most powerful and versatile weapons. It is also an important corrective to the historian’s orientation toward objectivity and context. For the purposes of this article, I define genealogy as an archeology of knowledge interested in exposing incidents of knowledge/power as constructed, with the hope of disrupting their application and effecting future change. In contrast to the historian, Anderson’s aim as a genealogist is to show how the American neoliberal national security state developed from the Founding to the present. His hope is that by exposing the contours of American empire, it can be more effectively combatted.[1]

Friedrich Nietzsche developed the genealogical method in the 1870s. Genealogy’s inspiration was not historical. In his “On the Uses and Disadvantages of History for Life” (1874), Nietzsche dismissed most historical thinking for its undue emphasis on presenting an objective past. Attempting to present an objective past denied human subjectivity and deadened life-giving lessons by diluting them with spurious contextualization and detail.[2] Instead, Nietzsche crafted his genealogical method by combining his training as a classical philologist with his interest in Charles Darwin’s ideas about evolution. The relevance of non-historical scholarship to genealogy is evinced by one of Nietzsche’s fundamental questions in On the Genealogy of Morals (1887): “What light does linguistics, and especially, the study of etymology, throw on the history of the evolution of moral concepts?”[3] Here both linguistics and evolutionary science play a crucial role in showing the development of Western morality and values. To Nietzsche, a virtue shared by linguistics and evolution is their rejection of objectivity.

The culmination of Nietzsche’s genealogical method is found in Michel Foucault’s work on institutional power and authority. In his essay, “Nietzsche, Genealogy, History”, Foucault provided a concise outline of his genealogical method and its indebtedness to Nietzsche. Foucault is explicit that genealogy is neither opposed to history nor is it a less rigorous version of it. “Genealogy does not oppose itself to history as the lofty and profound gaze of the philosopher might compare to the molelike perspective of the scholar,” Foucault wrote, “It opposes itself to the search for ‘origins’.”[4] Whereas Nietzsche used genealogy to undermine the foundations of Western morality, Foucault used the same method to unmask the power dynamics underlying institutions. His method was more historically oriented than Nietzsche’s, but in all of his genealogies (The Order of Things (1966), Discipline and Punish (1975), and The History of Sexuality (1976-84)) Foucault was committed to Nietzsche’s orientation toward the present and future at the expense of historical objectivity.

While Foucault was the most visible adopter of Nietzsche’s genealogical method, it also found an audience among American historians. These historians, including Gabriel Kolko and William Appleman Williams, believed that history should be used as a tool to criticize American culture, society, and politics.[5] In the preface to his The Contours of American History, William Appleman Williams defined the purpose of history: it “is neither to by-pass and dismiss nor to pick and choose according to preconceived notions; rather it is a study of the past so that we can come back into our own time of troubles having shared with the men of the past their dilemmas, having learned from their experiences, having been buoyed up by their courage and creativeness and sobered by their shortsightedness and failures.”[6] Like Nietzsche, Williams believed history should be future-oriented; it should inform us as we try to make a better future. Though history cannot be the sole means of informing a better future – in Williams’ words, “History offers no answers per se, it only offers a way of encouraging men to use their minds to make their own history” – it can be a useful tool.[7] It should be a means of “enrichment and improvement through research and reflection,” not a backward-looking scholasticism myopically obsessed with the past for its own sake.[8]

Beyond a common devotion to history for life, Nietzsche and Williams shared a commitment to using history as a tool to disrupt the search for singular origins, continuities, and wholes. Both reject the notion that total understanding of the past is possible. Perspective and subjectivity feature prominently in the work of both authors. In the same way Nietzsche could only create a genealogy of morals, Williams is limited to tracing the contours of American history. Nietzsche traces a few common themes in his genealogy of morals: the rise of slave morality, the triumph of life-denying religion over life-loving warrior culture, and the coming of the übermenschen. Though Williams’ values could not be more dissimilar from Nietzsche’s, he also rejects a totalizing approach to history. Themes like frontier expansion, private property versus social property, and community are embodied in subjects like the Earl of Shaftesbury, Charles Beard, and Herbert Hoover. There are no impersonal historical forces for Williams; any historical continuities depend on individual agency for carriage between generations.

Perry Anderson is writing in the tradition begun with Nietzsche and carried on by Williams. Like Nietzsche and Williams, he is most interested in trying to understand how contemporary American domestic and foreign policy has come about. In line with his argument in “Homeland” that American political power has become unduly concentrated in the executive, Anderson embodies the characteristics of the neoliberal national security state in various presidents. Woodrow Wilson, in particular, is a pivotal figure for Anderson, unifying these strands in his conviction that “Religion, capitalism, democracy, peace, and the might of the United States were one.”[9] It is no coincidence, then, for Anderson that despite Wilson’s peace-loving rhetoric, he entered the United States into a world war that would massively expand American militarism at home and influence abroad. For Anderson, despite periods of isolationism, “the ideology of national security, US-style was inherently expansionist.”[10]

Anderson’s vision of American foreign policy is erudite, but has clear limits. He’s not particularly interested in context or depth. To cover such a rangy topic as American foreign policy over a century’s duration in only a couple hundred pages compels Anderson to narrowly focus on a few individuals. He is also not concerned with counter-evidence or providing alternative opinions to his own. Like other genealogists, Anderson’s argument is polemical. While discussing the aims of the Truman administration after World War II, for example, Anderson focuses only on Europe.[11] The American government’s occupation of Japan, attempt to arbitrate the formation of a coalition government between the Chinese Communist Party and the Guomindang, and decolonization in South and Southeast Asia are omitted from Anderson’s account of postwar American policy. This omission streamlines Anderson’s argument regarding the expansion of executive power and political consensus regarding liberalism (in both its political and economic forms) across the world. The postwar reality was much messier. Congressional support for Chiang Kai-shek frustrated Truman and the State Department, opinion was divided as to the future of Southeast Asian independence movements with substantial support in the State Department for independent (even if leftist) nationalist regimes in Vietnam and Indonesia, and despite the surprising ease of US occupation in Japan, the future role of the US there was unclear. While not acknowledging these seminal events in US history would be a mortal sin if committed by a historian seeking to write an authoritative historical account of 20th century US foreign policy, providing this historical context is not important for realizing Anderson’s aims as a genealogist. Like Williams before him, Anderson is able to keep his argument focused and coherent by eschewing historical complexity.

Despite being interested in the past, Anderson rejects the methodological dogmas of the professional historian. Though Anderson is writing during a more methodologically inclusive period of post-linguistic, post-cultural turn historical scholarship, historians still prize values that Nietzsche criticized. Objectivity, now seen as a ‘noble dream’ rather than a realizable goal by historians, nevertheless remains a valued mindset. Polemics and jeremiads, while useful fodder for historical articles and monographs, fall outside the bounds of acceptable historical scholarship. Many historians also continue to believe that history should be solely concerned with the past. Jill Lepore and others who have ventured to approach history with an eye to the present have been labeled “presentist” by their peers.[12] Frequently, these accusations amount to little more than political disagreement couched in the language of good scholarship. Still, they posit a hard, artificial dividing line between the past and present.

Genealogy provides an alternative to endless debates about historical methodology. It is a separate method with its own values. At the same time, it can inform our thinking about the past and its relevance to present and future events. Historians need not have a monopoly on the past. History can tell us about the past as it was and for its own sake. The genealogist is first and foremost a social critic, interested in history as a means of interpreting the present. Both are necessary for a complete understanding of the past and how it informs current events. Embracing methodological diversity will allow scholars, be they historians or genealogists, to construct a thoroughgoing and socially responsible vision of the past that can inform how we live in the present.

[1] Anderson, “Imperium”, 4.

[2] Michel Foucault wrote about Nietzsche’s criticism of objective history that, “The objectivity of historians inverts the relationships of will and knowledge and it is, in the same stroke, a necessary belief in providence, in final causes and teleology – the beliefs that place the historian in the family of ascetics.” The historian’s obedience to unchanging, dead facts denies, for Nietzsche, the perspectivalism of our understanding of past events as well as the historian’s own subjectivity. Michel Foucault, “Nietzsche, Genealogy, History” in Paul Rabinow (ed.), The Foucault Reader (New York City: Pantheon Books, 1984): 76-100.

[3] Friedrich Nietzsche, On the Genealogy of Morals in Walter Kaufmann (ed.), Basic Writings of Nietzsche (New York City: The Modern Library, 1992): 491. Italics Nietzsche’s.

[4] Foucault, “Nietzsche, Genealogy, History”, 77.

[5] It should be noted that the genealogical method is not the exclusive purview of leftist social critics. I think the best way to understand much of Christopher Lasch’s late-career work is as genealogy instead of history. The Lasch monograph best fitting the genealogical description is The True and Only Heaven: Progress and Its Critics (1991).

[6] William Appleman Williams, The Contours of American History (New York City: New Viewpoints, 1961): 23.

[7] Williams, Contours of American History, 480.

[8] Williams, Contours of American History, 23.

[9] Anderson, “Imperium”, 9.

[10] Anderson, “Imperium”, 30.

[11] Anderson, “Imperium”, 17-18.

[12] This is obviously not a new phenomenon. American communist historians in the mid-20th century were often labeled unscholarly for viewing history as a tool for political struggle. My reference to Jill Lepore concerns a 2011 spat between her and Gordon Wood concerning her book The Whites of Their Eyes: The Tea Party’s Revolution and the Battle over American History (2011). See Gordon Wood, “No Thanks For the Memories”, The New York Review of Books (January 13, 2011): http://www.nybooks.com/articles/archives/2011/jan/13/no-thanks-memories/ and Claire Potter’s thoughtful riposte at The Chronicle of Higher Education: http://chronicle.com/blognetwork/tenuredradical/2011/01/department-of-snark-or-who-put-tack-on/. David Sehat wrote on the debate for the S-USIH blog back in 2011. See his post here: http://s-usih.org/2011/01/wood-on-lepore-on-presentism-or-why.html.

Not My Liberalism

Uncategorized

[first appeared on the Society for U.S. Intellectual History Blog, July 5, 2014]

Confusion about liberalism’s definition is ubiquitous in American popular and scholarly discourse. To the conservative Fox News set, liberalism has become a catch-all term for ineffective governance and flimsy morals. During the 2004 Presidential election Republicans had so successfully tarred liberalism that Democratic Presidential nominee John Kerry shied away from using the term. In recent years a resurgent left has also been critical of liberalism without properly defining it. Magazines like Jacobin have often criticized unspecified liberals for embracing capitalism and refusing to take strong ethical stands on poverty and racism. While the left’s criticisms undoubtedly hold true for some liberals, without an agreed-upon definition of liberalism it’s difficult to determine if sweeping criticisms from the right and left are defensible.
In his rangy synthesis, Liberalism: The Life of an Idea, Edmund Fawcett attempts to provide an authoritative definition of liberalism. By picturing liberalism as a fluid philosophy continually reacting to social, political, and technological problems, Fawcett convincingly demonstrates why liberalism has endured for centuries while evading definition. Liberalism’s very nebulousness explains its success. The expansiveness of the term allows it to accommodate seemingly contradictory values, such as individual freedom and social security, without fragmenting. Fawcett’s descriptive argument about defining liberalism from 1830 until the present is largely successful. His prescriptive attempt to highlight liberalism’s value through careful definition is less successful, however. In his attempt to salvage a unitary liberalism, Fawcett recapitulates many of its most grievous sins, including the exclusion of non-Western voices, papering over substantive ideological differences between thinkers, and dismissing the troublesome history of political liberals in power.

Fawcett’s definition is chronological and thematic. Chronologically, Fawcett situates liberalism in four separate epochs (1830-1880, 1880-1945, 1945-1989, and after 1989). His liberalism is not a static philosophy. Instead, it’s continually adapting to address the problems of its era, whether they be social, technological, or intellectual. To Fawcett, liberalism’s adaptability explains why it has endured despite challenges ranging from economic depression to world war. “The story of liberalism is in a way a coming-of-age tale as liberals learn, or fail to learn, from experience,” Fawcett tells his readers (6). Despite this seeming capitulation to the Whiggish liberal narrative, he is careful to avoid a simple story of continual progress. Examining the efflorescence of human rights thinking after World War II, Fawcett shows how the 1948 Universal Declaration of Human Rights built an intellectual consensus among its diverse body of drafters, only to see that agreement erode once the Declaration was passed on to the UN’s member nations. Critics of human rights feared that the doctrine would be overused or misused to infringe on sovereignty or force reparations on former colonial powers. Human rights doctrine had no suitable response to these critics. In Fawcett’s words, “Intellectual disenchantment with human rights grew with a seeming failure to find stable, publicly available defenses for them against mockers, debunkers, and deniers” (295).

Anchoring Fawcett’s liberal chronology are four persistent themes: conflict, resistance to power, progress, and respect (10). By conflict he means that to liberals “social harmony was not achievable, and to pursue it was foolish” (10). Fawcett’s liberalism is not a utopian philosophy. Instead, it’s pragmatic and looks to find temporary, moderate solutions to assuage, not eliminate, conflict. Second, Fawcett’s liberals are skeptical about power. Power should never be absolute, and liberals sought to check or limit power whenever it became concentrated. The third and fourth ideas, progress and respect, are both fundamental and perpetually in conflict. To Fawcett, liberals view “human character and human society as…not static but dynamic” (11). This dynamism possesses promise and peril. People have the ability to improve their lives and communities. At the same time, there is always the threat that progress could be lost, order could be disordered, and liberty could be bound. Liberals are also wary of coercive improvement. Individual autonomy should be respected by superior authority. Good liberals should not “obstruct and intrude on people in pursuit of their chosen enterprises or beliefs” (11). Fawcett recognizes that these four themes are often in conflict. Still, he views “such disputes as family quarrels, not as wars among rival sects” and consistent with the liberal worldview.

Fawcett fundamentally views liberalism as a “practice of politics” instead of a speculative philosophy (25). His focus on political ‘doers’ leads him to populate Liberalism with a diverse and unexpected selection of characters. There are the expected political totems – Abraham Lincoln, John Stuart Mill, Franklin Delano Roosevelt – but there are also surprise guests, including the obscure Franz Schulze-Delitzsch, Republican President Herbert Hoover, and free-market economist Milton Friedman. Ultimately, Fawcett is only able to fuse thought and political practice until World War II. He admits that, “after 1945 the separation of ideas and politics appeared to be complete as each side professionalized itself” (316). Fortunately for Fawcett, this separation was never complete, and though more speculative philosophers figure into the post-1945 sections, political practitioners like Ronald Reagan and Margaret Thatcher loom large. Fawcett also privileges political contributions over great books. For example, he dwells at length on Mill’s undistinguished career in Parliament while devoting comparatively little attention to his landmark On Liberty. Though this is a bit frustrating for the intellectual historian, Fawcett’s focus on politics allows him to avoid the more abstract and abstruse aspects of philosophical and economic liberalism and cover greater temporal and geographic ground. This simplicity, when taken with Fawcett’s facility as a writer, also makes Liberalism a useful foundation text for undergraduate courses on liberal ideas or politics.

Liberalism is an admirable attempt to synthesize the diverse strands of liberal thought and practice. Regrettably, Fawcett’s examination of liberalism is flawed by a search for its singular origin. He is interested in defining and delineating liberalism, not ‘liberalisms’. Fawcett’s justification for singularity is twofold. First, he’s concerned that once liberalism is divided, it’s susceptible to infinite fracture. Second, he’s concerned about questions of authenticity in a fractured liberal environment. To Fawcett, multiple liberalisms raise the question of which is the authentic or true liberalism. At best, a debate about authentic liberalism “risks turning an indispensible label into an unnecessary puzzle”; at worst, it could lead to a “hunt for nonbelievers” and a violation of liberalism’s fundamental commitment to toleration (25-26).

In fact, Fawcett fails to avoid his second pitfall because of his search for a unified liberalism. By looking for a singular origin, he recapitulates liberalism’s tendency to exclude dissenting minority voices. His story of liberalism is limited to a white, Euro-American worldview. The first, and only, prominent female character in the book is Margaret Thatcher and (aside from all the political problems of having Thatcher as your only female voice) she is not introduced until page 379. Fawcett’s non-Euro-American representatives are George Orwell and Albert Camus who, despite being born in European colonies, were thoroughly enmeshed in and responsive to European ideas and politics. As Erez Manela and others have shown, liberalism was a potent global idea by the dawn of the 20th century.[1] Fawcett’s liberalism fails to take into account liberalism’s globalism and fails to mention non-Western thinkers who were essential to its expansion. If Fawcett’s liberalism is capacious enough to include the likes of Michael Oakeshott and Jean-Paul Sartre, why not Lu Xun, Sun Yat-sen, and Jawaharlal Nehru?

Fawcett’s exclusion of non-Western contributors to liberal thought and practice is particularly troubling because he bristles at and dismisses the harm caused by Western liberal imperialism. He buys into the canard of the liberal civilizing mission. To Fawcett, liberal empire’s good intentions make imperial practice’s violence and folly justifiable, if not justified. After briefly conceding that there was no ideological conflict between liberalism and empire, he stumbles into defending liberal colonial domination as “not all rapine, domination, and unequal exchange” (198). Liberal empire brought “progress and modernity” to areas lacking technological innovation and egalitarian values (198). Furthermore, Fawcett’s liberal empires were not seen as hated conquerors by colonized peoples. In fact, “liberal benefits of modernity were often sought for and welcomed by colonized peoples” (199). Obviously, there were excesses. He admits “that in raising up backward peoples and showering them with the boons of modernity, the governments of liberal civilizations had them killed at the same time by the tens of thousands” (204).

The very capaciousness of Fawcett’s liberalism also presents problems. While some of his characters, like German legal theorist Carl Schmitt, act as foils for his liberal protagonists, Fawcett’s willingness to include intellectuals and politicians who are rarely understood as liberal and who did not view themselves as such is puzzling. He often accuses his characters of denying their own liberalism. “Friendly critics suspected MacIntyre was, in effect, a closet liberal”, “Sartre was more liberal than he cared to admit”, and “Oakeshott’s liberal quietism was apt for a ship in calm seas” (353, 336, 321). Strangely, Fawcett does not question his liberal exemplars’ bona fides.

He also omits prominent liberals who could disrupt his definition. There is no John F. Kennedy to upset his liberal characteristic of resistance to power. His omission of Kennedy also makes the relationship between liberalism and conservatism unidirectional. Conservatives like Hoover and Reagan may be closet liberals or have liberal aims. Fawcett’s liberals are not susceptible to conservatism’s allure, however. Kennedy’s (or even Obama’s) technocratic liberal militarism could serve as a useful corrective to this imbalance. Just as under the proper conditions conservatives have embraced a narrative of progress, liberals have justified reaction and maintenance of the status quo when under threat.

Fawcett’s Liberalism mirrors the promise and peril of its intellectual namesake. It’s an ambitious synthesis and tackles an important problem. Fawcett’s delineation of liberalism as fluid and historically contingent provides a useful way of thinking about it as an ideology. Liberalism’s fluidity also partially explains why it has evaded definition for so long. Still, Liberalism shares its namesake’s flaws. Its principal protagonists are white men who speak in universals about ethics and good government while presuming that non-male, non-white, and non-Western people will share their values. Its focus on political practice tacitly accepts liberal naturalism, denying that liberalism is a man-made ideology. Its capaciousness and toleration of dissent (at least among white men) make me question liberals’ depth of feeling about their values. I appreciate Fawcett’s genealogy of liberalism, but as someone who has sometimes defined himself as a liberal I found myself constantly thinking as I read his book, “I hope Fawcett’s liberalism isn’t my liberalism.”

[1] Erez Manela, The Wilsonian Moment: Self-Determination and the International Origins of Anticolonial Nationalism (Oxford: Oxford University Press, 2007).