
The Bridge from Academe


My career as a history graduate student seems like a lifetime ago already. It was January 2018 and I had just defended my dissertation, removing the one constant that had unified my intellectual life for the past five years. I didn’t really have any job prospects. My furtive attempts to secure a tenure-track academic job the previous year had been met with such a deafening silence from university employers that I realized I needed an alternate path. I had applied to a bunch of university administration jobs in the kind of slipshod, arbitrary manner that showed that I had no idea what I was doing. Even though I had held many part-time jobs on campus, I wasn’t sure how any of them translated to an actual career. I was simply looking to do something because I wasn’t going to be able to make a living doing what I was trained to do.

Now almost two months into my job as the Publications and Communications Manager at the Council of Graduate Schools (CGS), it’s stunning to reflect on how far away that all seems. The restless nights after receiving another rejection. The weird combination of elation and anxiety that surged through my body after opening an email requesting an interview. All the wonderful and kind people I met in the hodgepodge of part-time jobs I worked as I tried to figure out what to do next. It all seems distant.

And in some ways it is distant. I’ve moved to a new city. Reconnected with old friends and met new people. The 9 to 5 office schedule provides a structure that I never had in graduate school, though it can feel stifling on a beautiful day (of which, between the scalding heat and the pouring rain, Washington fortunately has few). I used to wake up at 8; now I wake up at 6. Everything seems different.

I’m not sure there’s a lesson here. The usual platitudes – all variants of ‘don’t worry, it will get better!’ – are too simple. The emotions are much more complex. There is certainly relief that the economic insecurity of graduate school and the post-graduate transition is over. I am incredibly lucky to have found a job that I enjoy at a supportive organization staffed by smart, compassionate people.

But there is also a nostalgic patina that is already starting to wear away the sharpness of those desperate feelings of my last days in graduate school. I had the opportunity to do the research that I loved. I got to work with students, faculty, and administrators from different backgrounds and with different life experiences who broadened my worldview. I lived in a beautiful place and was surrounded by people who cared about me (and hopefully still do).

Even though graduate school seems far away, real questions and doubts persist in my mind. As Erin Bartram and others have more eloquently noted, there is an identity crisis that accompanies the transition out of academia and into another field. For many of us, and I know for me, “intellectual-in-training” or “scholar” was a core component of who we were and wanted to be. I often wonder if being a scholar or historian will be an important part of my identity going forward or if, through some transformative magic, it’s already gone. I would still like to be part of the history community, participate in scholarly debates in my field, and publish on my research interests. Since my research never seemed to generate much interest when I was a graduate student, however, I’m not sure I have enough momentum or a large enough public profile to continue on as part of that community.

I’ve been thinking often about how to bridge my former and current professional identities. I agree with the career pathways approach that rejects the bifurcation between academic and “alt-ac” employment trajectories in favor of thinking more holistically about post-graduate employment. That sounds good as an ideal, but what does it look like as a lived experience? Is it really possible to pursue multiple career paths simultaneously without (either implicitly or explicitly) ranking them? Are we fighting in vain against the inevitable disappointment that occurs when graduate students enroll in programs expecting to become professors, only to realize that career path is an incredibly difficult one? There are no easy answers to these questions.

Changing a core component of one’s identity is trying, but if we can work together as current graduate students, faculty, administrators, and alumni, I think we can better show graduate students the many opportunities available to them. There will never be a perfect solution, but I think there is room to make the post-graduate identity change feel less like a crisis and more like a new phase of personal and intellectual discovery.

Writing a Book Proposal – A Crowd-Sourced How-To


About a month ago I wrote on Twitter that I was looking for advice about how to write a book proposal. I am revising my dissertation into a book manuscript and wanted to begin writing a book proposal to circulate to academic presses. I do not currently work in a university and had no idea where to start, so I put a call out on Twitter. I received so many wonderful responses that I decided to compile them all in a Twitter story and post it here. The most interesting revelation was the diversity of responses I received. It seems there really are many different ways to successfully compose a book proposal that will be accepted by a university press.

I hope others will benefit (as I already have) from all this generous advice. The Twitter story can be found by clicking the link below:

http://wke.lt/w/s/K7zBN

I Defended My Dissertation – What Do I Do Next?


I defended my dissertation this past December. In the defense’s afterglow, I started to wonder – what’s next? I knew I wanted to turn it into a book, but I had no idea where to start. Should I contact publishers? Start writing a book proposal? My advisor suggested stepping away from the project to gain perspective. Still, I was skeptical. I didn’t have a job and thought the only way to get a job (or even a post-doc) was to have, at the very least, the book under contract.

After taking some time to think through this, I did what any good millennial would do: took to Twitter. I received a staggering number of replies from professors, publishers, and graduate students offering advice or commiserating about the opacity of the publishing process. I am distilling the substance of the discussion into five points, in the hope that in the future they can guide graduate students wrestling with the same questions.

1. Take a break. Writing a monograph-length dissertation is a long and oftentimes arduous process, and in the midst of it, it is easy to lose perspective. In my case, I had largely stopped reading books outside my narrow subdiscipline and immersed myself in the project’s archival and primary sources. Most Twitter respondents recommended putting the manuscript aside, focusing on other projects (an article, teaching a class, etc.), and returning to it with fresh eyes after six months to a year. The hope is that by then you can bring a new perspective to the project.

2. Read other things. Fresh perspective cannot be attained through idleness, however. When taking a break from the manuscript, respondents advised delving into books outside your disciplinary niche. Many recommended reading fiction, since narrative and readability – fiction’s strengths – are two qualities lacking in many manuscripts. Others recommended reading prize-winning books, which can act as models for a book proposal or provide insight into how best to frame arguments. For Twitter respondents, the message was nearly universal: the best writing begins with omnivorous reading.

3. Network. As with any other employment opportunity, finding a publisher for your manuscript is most easily achieved through networking. The best place to find these networking opportunities is at academic conferences. Respondents in the publishing industry shared that they meet many first-time academic authors at conferences. Larger academic conferences (AHA, OAH, MLA, etc.) usually have the greatest number of publishers, but smaller conferences can present greater opportunities to meet and have sustained discussions with publisher representatives. If conferences are too expensive (and for many graduate students they are), rely on your existing social network. Ask your advisor, faculty in your department, or alumni if there is anyone at their publisher you could speak to about your manuscript.

4. Write your dissertation as a book. If you have not yet completed your dissertation or are in the beginning stages of your graduate career, you may want to think about your dissertation as a book. This is a polarizing approach and one that will need to be worked out with your advisor. There are at least two ways of thinking about a dissertation. First, the dissertation-as-certification approach sees the dissertation as a document proving your abilities as a scholar. This generally means lengthy forays into historiography, rigorous citation using mostly archival sources, and favoring argument over narrative. Scholars advocating this approach see the dissertation as a showcase for all the skills you have learned as a graduate student and the defense of the project as certification that you belong in the company of other professional academic historians. Second, dissertation-as-book believers argue that, since the real disciplinary standard is a publishable manuscript, emphasis should be placed on the traits publishers find desirable – narrative, a clear argument, and a readable style – over skill demonstration. While there is disagreement over which approach is best, writing the dissertation as a book has obvious benefits in the transition from manuscript to published book.

5. Write a different book. The most surprising suggestion I received was not to transform the manuscript into a book at all. Instead, these respondents suggested thinking of the book as a totally different project from the dissertation. On the surface this seems ridiculous. I just spent five, six, seven years writing a dissertation and now you’re telling me to scrap it and start over! What a waste of time! Yet, when you think more deeply about the divergences in form and audience, treating the book as a new project makes more sense (particularly if you took the dissertation-as-certification approach, as I did). One respondent put it particularly succinctly: “You don’t revise your dissertation; you steal from your dissertation while you’re writing your first book.” Thinking about your manuscript as a second project can free you to think more capaciously about your topic than you could while revising a dissertation intended for a narrower audience and with more limited objectives.

These five points are heuristics for the manuscript-into-book transformation that I intend to follow over the next six months. I’m sure there will be disagreement, and all of these points are subject to debate (if you have further questions or comments, feel free to post below). Thank you to everyone who responded, and I hope this short memo will help graduate students feel a little less lost after they defend.

Making History Too Big To Fail?: The History Manifesto and the Return of History as Science


In The History Manifesto, historians David Armitage and Jo Guldi add their voices to a growing body of literature on the humanities in crisis.[1] As the book’s title suggests, they focus on the field of history and its diminishing influence on public life. This descent into irrelevance is not the result of a changing public. Instead, academic historians’ embrace of short-term thinking has made the discipline unresponsive to the global crises of the day, including wealth inequality and environmental degradation.

Despite historians’ retreat from their publics, Armitage and Guldi are hopeful for the resuscitation of publicly minded history. They see two ways for historians to regain their audiences. First, historians need to reject short-termism and return to studying longue durée narratives. Looking back to earlier history (it seems here that Armitage and Guldi are thinking pre-industrial) will allow historians to show policymakers real alternatives to ingrained economic and political systems. They provide many examples of historians who have used the longue durée to challenge established institutions and systems, including the Webbs, R.H. Tawney, and Eric Hobsbawm. Furthermore, Armitage and Guldi see an expansion of temporal scope as a useful counterpart to historians’ embrace of larger geographic areas. Just as transnational oceanic, continental, and comparative imperial histories have allowed historians to tell new stories about systems, institutions, and ideas, an expanded time scale would provide the same benefits. Their second solution is an embrace of “big data”. Armitage and Guldi believe historians are uniquely suited to use big data effectively because of their ability to make data meaningful through contextualization and narrative: “History has an important role to play in developing standards, techniques, and theories suited to the analysis of mutually incompatible datasets where a temporal element is crucial to making sense of causation and correlation” (104).

To Armitage and Guldi, a focus on the longue durée and an embrace of big data need to go hand in hand in combating short-termism. Long temporal scopes bring with them the inevitable problem of information overload. The tools of big data offer a solution to this problem by condensing and visualizing large datasets into manageable graphs, maps, and charts. They highlight the Google Ngram Viewer as a freely available, easy-to-use big data tool already being used by historians. Embracing big data also offers historians a way to remain relevant in a technologically modernizing university. Armitage and Guldi recognize that the crisis of the humanities extends to employment as well as larger social relevance: “If History departments train designers of tools and analysts of big data, they stand to manufacture students on the cutting edge of knowledge-making within and beyond the academy” (107).

While The History Manifesto presents itself as a revolutionary way of approaching novel problems facing contemporary history scholarship, its solutions are old ones. The crisis of the public intellectual has been a preoccupation of historians since at least the 1980s, if not earlier. A lack of responsiveness to public needs is often viewed as an important explanation for the public intellectual’s demise. The entire discourse surrounding the “ivory tower” is a reflection of academics’ insecurities about their relationship to the larger public and the perceived differences between subjects of scholarly interest and public needs. Even if historians embrace the longue durée and big data, it seems unlikely that the tension between scholarly freedom of inquiry and the public embrace of intellectuals will be resolved.

The tools proposed by Armitage and Guldi also have problems. As Deborah Cohen and Peter Mandler have convincingly demonstrated, The History Manifesto’s data often doesn’t support its argument.[2] Historians have not retreated from longer temporal studies and embraced short-termism. Anecdotally, scholars held up by Armitage and Guldi as exemplars of long-term thinking, like Arthur Schlesinger Jr. and Charles Beard, published many books examining short time periods. It’s also unclear how the longue durée will lead to more relevant history scholarship. As several others have pointed out, some of the most politically engaged fields – notably the history of American capitalism – seem particularly plagued by short-termism, but only because the ascendancy of neoliberalism is a relatively recent phenomenon.

The turn to big data represents another tried and true response by the history field in times of crisis: an appeal to science. An earlier history crisis in the mid-20th century provides a useful lens for understanding both the current crisis and the Armitage/Guldi response. After World War II, the expansion of social science funding and prestige put history in an undesirable position. While leading scholars like Arthur Schlesinger Jr. had success as traditional narrative historians, younger academics and those in less established sub-disciplines were concerned about diminished influence and funding. An enterprising few, particularly those studying strategically important areas like the Soviet Union and China, began rebranding the discipline as a social science. These historians claimed that they were like scientists because they used data to find objective answers to historical questions.

They envisioned a substantive role for historians in interdisciplinary social science in particular. In an interesting parallel to Armitage and Guldi, they claimed that historians were essential in contextualizing the findings of other social scientists. At Harvard, Washington, and Johns Hopkins (among others), interdisciplinary social science programs were established with substantial history components. At Johns Hopkins, Owen Lattimore led an interdisciplinary study of Xinjiang province in China. Its aim was to analyze the province’s politics and its role as a Chinese frontier, but a large part of the study was devoted to understanding how Xinjiang’s history shaped its politics. These programs were incredibly successful at attracting funding and grew exponentially between 1945 and 1965. As the hierarchy of university disciplines shifted in the mid-20th century from a humanities-centered university toward one more in line with American national security interests (what Rebecca Lowen has called “the Cold War university”), history maintained its high standing by appealing to science.[3]

The History Manifesto is a call to revolutionary action. It aims to persuade students and faculty to use the longue durée and new technology to seek broader audiences and answer bigger questions. These are noble and worthwhile goals. They are also not revolutionary. As Deborah Cohen and Peter Mandler have shown, the longue durée never left and the short-termism of recent historical scholarship is a canard. The appeal to data is similarly an old tactic dressed up in new language and concepts. In an age when the value of scientific research is rarely questioned amid massive cuts to university budgets, it’s natural for historians to appeal to science in an effort to defend their discipline. It has worked in the past (to a certain extent), and the current enthusiasm around big data may allow it to succeed again. Still, in the spirit of Armitage and Guldi, I think it’s important not to become myopically focused on the current crisis. Instead, a deeper exploration of why history faces periodic methodological crises is necessary, as is a more precise definition of what public engagement means for scholars. While size and scale are important metrics for gauging influence, extension without clear goals can not only compromise historians’ relations with a wider public but also jeopardize our stature within the university.

[1] David Armitage and Jo Guldi, The History Manifesto (Cambridge, UK: Cambridge University Press, 2014).

[2] This is not to say Cohen and Mandler’s response is unproblematic. They go too far in opposing Armitage and Guldi to the point of denying any sort of crisis in the humanities. Though their riposte was likely intended as a full-throated call for methodological pluralism, it often reads as a defense of the status quo.

[3] Rebecca S. Lowen, Creating the Cold War University: The Transformation of Stanford (Berkeley: University of California Press, 1997).

The Past Is Not A Cold Dead Place: Perry Anderson, Genealogy, History


[This piece first appeared at the Society for US Intellectual History Blog on July 18, 2014]

In his triptych of articles for The New Left Review – “Homeland”, “Imperium”, and “Consilium” – Perry Anderson examines the rise of the current American neoliberal national security state. Anderson uses each article to tackle a different aspect of this rise: “Homeland” looks at domestic politics, “Imperium” at foreign policy, and “Consilium” at current mainstream academic thinking on US foreign policy. These articles, and “Imperium” in particular, are historically oriented. Anderson traces the creation and development of the national security state from the 19th century to the present day, with special emphasis on the way the presidencies of Theodore Roosevelt and Woodrow Wilson shaped US policy thinking about America’s place in the world. Anderson’s erudition and the breadth of his project are impressive. The further I read into his NLR articles, though, the more one question nagged at me: is this history?

I believe Anderson is writing as a genealogist, not a historian. Since being adopted by the philosopher Friedrich Nietzsche in the late 19th century, genealogy has been one of social criticism’s most powerful and versatile weapons. It is also an important corrective to the historian’s orientation toward objectivity and context. For the purposes of this article, I define genealogy as an archeology of knowledge interested in exposing instances of knowledge/power as constructed, with the hope of disrupting their application and effecting future change. In contrast to the historian, Anderson the genealogist aims to show how the American neoliberal national security state developed from the Founding to the present. His hope is that by exposing the contours of American empire, it can be more effectively combatted.[1]

Friedrich Nietzsche developed the genealogical method in the 1870s. Genealogy’s inspiration was not historical. In his “On the Uses and Disadvantages of History for Life” (1874), Nietzsche dismissed most historical thinking for its undue emphasis on presenting an objective past. Attempting to present an objective past denied human subjectivity and deadened life-giving lessons by diluting them with spurious contextualization and detail.[2] Instead, Nietzsche crafted his genealogical method by combining his training as a classical philologist with his interest in Charles Darwin’s ideas about evolution. The relevance of non-historical scholarship to genealogy is evinced by one of the fundamental questions of On the Genealogy of Morals (1887): “What light does linguistics, and especially, the study of etymology, throw on the history of the evolution of moral concepts?”[3] Here both linguistics and evolutionary science play a crucial role in showing the development of Western morality and values. To Nietzsche, a virtue shared by linguistics and evolution is their rejection of objectivity.

The culmination of Nietzsche’s genealogical method is found in Michel Foucault’s work on institutional power and authority. In his essay “Nietzsche, Genealogy, History”, Foucault provided a concise outline of his genealogical method and its indebtedness to Nietzsche. Foucault is explicit that genealogy is neither opposed to history nor a less rigorous version of it. “Genealogy does not oppose itself to history as the lofty and profound gaze of the philosopher might compare to the molelike perspective of the scholar,” Foucault wrote. “It opposes itself to the search for ‘origins’.”[4] Whereas Nietzsche used genealogy to undermine the foundations of Western morality, Foucault used the same method to unmask the power dynamics underlying institutions. His method was more historically oriented than Nietzsche’s, but in all of his genealogies (The Order of Things (1966), Discipline and Punish (1975), and The History of Sexuality (1976-84)) Foucault was committed to Nietzsche’s orientation toward the present and future at the expense of historical objectivity.

While Foucault was the most visible adopter of Nietzsche’s genealogical method, it also found an audience among American historians. These historians, including Gabriel Kolko and William Appleman Williams, believed that history should be used as a tool to criticize American culture, society, and politics.[5] In the preface to The Contours of American History, William Appleman Williams defined the purpose of history: it “is neither to by-pass and dismiss nor to pick and choose according to preconceived notions; rather it is a study of the past so that we can come back into our own time of troubles having shared with the men of the past their dilemmas, having learned from their experiences, having been buoyed up by their courage and creativeness and sobered by their shortsightedness and failures.”[6] Like Nietzsche, Williams believed history should be future-oriented; it should inform us as we try to make a better future. Though history cannot be the sole means of informing a better future – in Williams’ words, “History offers no answers per se, it only offers a way of encouraging men to use their minds to make their own history” – it can be a useful tool.[7] It should be a means of “enrichment and improvement through research and reflection,” not a backward-looking scholasticism myopically obsessed with the past for its own sake.[8]

Beyond a common devotion to history for life, Nietzsche and Williams shared a commitment to using history as a tool to disrupt the search for singular origins, continuities, and wholes. Both reject the notion that a total understanding of the past is possible. Perspective and subjectivity feature prominently in the work of both authors. In the same way that Nietzsche could only create a genealogy of morals, Williams is limited to tracing the contours of American history. Nietzsche traces only a few common themes in his genealogy of morals: the rise of slave morality, the triumph of life-denying religion over life-loving warrior culture, and the coming of the übermenschen. Though Williams’ values could not be more different from Nietzsche’s, he also rejects a totalizing approach to history. Themes like frontier expansion, private property versus social property, and community are embodied in subjects like the Earl of Shaftesbury, Charles Beard, and Herbert Hoover. There are no impersonal historical forces for Williams; any historical continuities depend on individual agency for carriage between generations.

Perry Anderson is writing in the tradition begun by Nietzsche and carried on by Williams. Like Nietzsche and Williams, he is most interested in trying to understand how contemporary American domestic and foreign policy has come about. In line with his argument in “Homeland” that American political power has become unduly concentrated in the executive, Anderson embodies the characteristics of the neoliberal national security state in various presidents. Woodrow Wilson, in particular, is a pivotal figure for Anderson, unifying these strands in the claim that “Religion, capitalism, democracy, peace, and the might of the United States were one.”[9] It is no coincidence, then, for Anderson that, despite Wilson’s peace-loving rhetoric, he brought the United States into a world war that would massively expand American militarism at home and influence abroad. For Anderson, despite periods of isolationism, “the ideology of national security, US-style was inherently expansionist.”[10]

Anderson’s vision of American foreign policy is erudite, but it has clear limits. He is not particularly interested in context or depth. Covering a topic as rangy as a century of American foreign policy in only a couple hundred pages compels Anderson to focus narrowly on a few individuals. He is also not concerned with counter-evidence or with presenting opinions other than his own. Like other genealogists, Anderson writes as a polemicist. While discussing the aims of the Truman administration after World War II, for example, Anderson focuses only on Europe.[11] The American government’s occupation of Japan, its attempt to arbitrate the formation of a coalition government between the Chinese Communist Party and the Guomindang, and decolonization in South and Southeast Asia are all omitted from Anderson’s account of postwar American policy. This omission streamlines Anderson’s argument regarding the expansion of executive power and the political consensus around liberalism (in both its political and economic forms) across the world. The postwar reality was much messier. Congressional support for Chiang Kai-shek frustrated Truman and the State Department; opinion was divided over the future of Southeast Asian independence movements, with substantial support in the State Department for independent (even if leftist) nationalist regimes in Vietnam and Indonesia; and despite the surprising ease of the US occupation of Japan, the future role of the US there was unclear. Failing to acknowledge these seminal events would be a mortal sin for a historian seeking to write an authoritative account of 20th-century US foreign policy, but providing this historical context is not important for realizing Anderson’s aims as a genealogist. Like Williams before him, Anderson keeps his argument focused and coherent by eschewing historical complexity.

Despite his interest in the past, Anderson rejects the methodological dogmas of the professional historian. Though he is writing during a more methodologically inclusive period of post-linguistic-turn, post-cultural-turn historical scholarship, historians still prize values that Nietzsche criticized. Objectivity, now seen by historians as a ‘noble dream’ rather than a realizable goal, nevertheless remains a valued mindset. Polemics and jeremiads, while useful fodder for historical articles and monographs, fall outside the bounds of acceptable historical scholarship. Many historians also continue to believe that history should be solely concerned with the past. Jill Lepore and others who have dared to approach history with an eye to the present have been labeled “presentist” by their peers.[12] Frequently, these accusations amount to little more than political disagreement couched in the language of good scholarship. Still, they posit a hard, artificial dividing line between the past and the present.

Genealogy provides an alternative to endless debates about historical methodology. It is a separate method with its own values. At the same time, it can inform our thinking about the past and its relevance to present and future events. Historians need not have a monopoly on the past. History can tell us about the past as it was and for its own sake. The genealogist is first and foremost a social critic, interested in history as a means of interpreting the present. Both are necessary for a complete understanding of the past and how it informs current events. Embracing methodological diversity will allow scholars, be they historians or genealogists, to construct a thoroughgoing and socially responsible vision of the past that can inform how we live in the present.

[1] Anderson, “Imperium”, 4.

[2] Michel Foucault wrote of Nietzsche’s criticism of objective history that, “The objectivity of historians inverts the relationships of will and knowledge and it is, in the same stroke, a necessary belief in providence, in final causes and teleology – the beliefs that place the historian in the family of ascetics.” The historian’s obedience to unchanging, dead facts denies, for Nietzsche, the perspectivalism of our understanding of past events as well as the historian’s own subjectivity. Michel Foucault, “Nietzsche, Genealogy, History” in Paul Rabinow (ed.), The Foucault Reader (New York City: Pantheon Books, 1984): 76-100.

[3] Friedrich Nietzsche, On the Genealogy of Morals in Walter Kaufmann (ed.), Basic Writings of Nietzsche (New York City: The Modern Library, 1992): 491. Italics Nietzsche’s.

[4] Foucault, “Nietzsche, Genealogy, History”, 77.

[5] It should be noted that the genealogical method is not the exclusive purview of leftist social critics. I think the best way to understand much of Christopher Lasch’s late-career work is as genealogy instead of history. The Lasch monograph best fitting the genealogical description is The True and Only Heaven: Progress and Its Critics (1991).

[6] William Appleman Williams, The Contours of American History (New York City: New Viewpoints, 1961): 23.

[7] Williams, Contours of American History, 480.

[8] Williams, Contours of American History, 23.

[9] Anderson, “Imperium”, 9.

[10] Anderson, “Imperium”, 30.

[11] Anderson, “Imperium”, 17-18.

[12] This is obviously not a new phenomenon. American communist historians in the mid-20th century were often labeled unscholarly for viewing history as a tool for political struggle. My reference to Jill Lepore concerns a 2011 spat between her and Gordon Wood concerning her book The Whites of Their Eyes: The Tea Party’s Revolution and the Battle over American History (2011). See Gordon Wood, “No Thanks For the Memories”, The New York Review of Books (January 13, 2011): http://www.nybooks.com/articles/archives/2011/jan/13/no-thanks-memories/ and Claire Potter’s thoughtful riposte at The Chronicle of Higher Education: http://chronicle.com/blognetwork/tenuredradical/2011/01/department-of-snark-or-who-put-tack-on/. David Sehat has written on the debate for the S-USIH blog back in 2011. See his post here: http://s-usih.org/2011/01/wood-on-lepore-on-presentism-or-why.html.