
The Minimalism Morass

Blog

In contemporary culture, minimalism is in. Whether it’s Marie Kondo-ing your closet, clearing your mind with meditation, or buying fewer items, minimalism has become a buzzword signifying mindfulness, taste, and eco-consciousness. The minimalist movement seems to be divided into three interlocking categories:

  1. Material minimalism
  2. Consciousness minimalism
  3. Aesthetic minimalism

Material minimalism means having less stuff. It reached its cultural apex with the Marie Kondo craze of 2018-19, when people feverishly tossed out every item in their closets that didn’t “bring them joy.” It is also echoed in the sparsely decorated homes of the Kardashian-Wests and YouTube influencers. These empty spaces signify tidiness and mental clarity.

Minimal spaces are intended by their designers and curators to be windows into the resident’s mind and spirit. Consciousness minimalism is the mental and emotional counterpart of material minimalism: by de-cluttering mentally, emotionally, and spiritually its practitioners can achieve greater happiness, clarity, and productivity. Consciousness minimalism is practiced in many ways including meditation and yoga, fasting, and unplugging from technology. The hope is that by stripping away the noise and baggage of modernity, the minimalist will uncover a hidden well-spring of happiness and efficiency buried within.

Aesthetic minimalism binds material minimalism to consciousness minimalism. You could say material and consciousness minimalism are born out of aesthetic minimalism, which, as an artistic and architectural movement, stripped away material ornamentation to uncover hidden beauty. Aesthetic minimalism moved from the art and literary worlds into day-to-day life through technology: the synthesis of form and function preached by Apple’s Jony Ive or photo editing apps like Unfold and VSCO.

Taken together, these three forms of minimalism form a holistic worldview that is compatible with many ethical systems. Environmentalists, for example, can easily incorporate minimalism into their ethical system by stressing the importance of minimizing consumption (even though trends like Marie Kondo-ing have ended up producing huge quantities of waste). This portability is, perhaps, why minimalism has been able to transcend cultural differences and find expression in global online communities like Youtube and Reddit.

But is this minimalism business just corporate bullshit or, at best, a fleeting trend? There is certainly evidence to support this view. YouTube influencers, for example, will often post videos proclaiming the virtues of minimalism alongside videos hawking products or gushing over their latest luxury purchase. Minimalist advocates promote the instrumental value of minimalism for productivity. Ultimately, this is a story about more, not less: minimizing or disengaging from certain tasks in order to prioritize others. It is easy to see how this view would find currency among younger Millennials and Gen-Z creatives whose employment is more likely to be contingent on the ability to produce more work in less time.

Minimalism is also aspirational. These aspirations can be grand (having a massive modern apartment like the Kardashian-Wests) or more moderate (having a clean closet). This aspirational element is why minimalism is largely defined by salespeople who project success, orderliness, and contentment. It’s the ultimate “one neat trick” that will simultaneously allow you to be more successful and happier in less time.

It is no coincidence that the minimalism craze is happening at this moment. Minimalism promises that you can gain control over your life without superhuman intelligence or work ethic. In a world defined by political disorder, climate catastrophe, and economic inequality, minimalism promises that, with the right habits, you can overcome the barriers that stand between you and happiness. It is also a response to the pressure to be always “on” made possible by flexible work scheduling and mobile technology.

Still, minimalism is an insufficient response to the challenges of contemporary society. It blames individuals for social failings and fragments collective action against social ills by emphasizing individual betterment instead of solidarity. Real, lasting change requires a movement, not minimalism.

More Than Just Packaging?: The Humanities and Interdisciplinarity in the Modern University

Blog

Several weeks ago I had the privilege of attending the National Academies of Sciences, Engineering, and Medicine event on “Intersecting Identities and Transdisciplinary Research at Historically Black Colleges and Universities.” It was a fascinating event that showed how resource scarcity at HBCUs and their strong commitment to a common mission have been fertile ground for trans- and interdisciplinary partnerships across STEM, the arts, and the humanities. It also argued that there is a consonance between the promotion of interdisciplinarity across academic fields and attempts to promote demographic diversity in student and faculty populations (Branches from the Same Tree, 15-16). Both rely on creating a flexible academy able to embrace multiple points of view and integrate them into a coherent whole. In an era when the university is in flux – shifting ever more toward the STEM fields and increasingly responsive to employer influence on curriculum – it was heartening, as a humanist, to hear STEM faculty and university administrators speak to the value of humanistic learning in creating well-rounded thinkers and informed citizens.

Despite the generally positive tone of the discussion, it was hard to look at the examples of transdisciplinary cooperation and not wonder whether the arts and humanities were really equal partners in this endeavor or simply a way to make scientific and/or mathematical concepts more palatable to students. A representative from one university spoke about an entrepreneurship class that reached across disciplinary lines to design, create, package, and market a product. The purpose of the project was to show how the strengths of different team members were necessary to shepherd a project from an idea to the hands of an interested consumer. Unfortunately, the project was framed in a way that suggested that engineers and scientists would be the principal product developers and creators while the humanists would be there for marketing. This was echoed in a community college class at another university where music theory and history were used to teach the physics of sound. Again, the concepts were scientific and the communication humanistic.

Marketing and communications are viable and important career pathways for humanists (in fact, that’s the field I went into after earning my doctorate in history). Trans- and interdisciplinary projects that focus solely on the communicative function of the humanities badly misunderstand the value of those fields, however. The ways that fields like English and philosophy encourage critical engagement are valuable for challenging assumptions and pushing back against groupthink. History teaches research skills that are vital in an information economy where it is often difficult to distinguish fact from fiction. Even more importantly, those fields provide useful information and learning experiences. History helps us understand ourselves and one another. Tangling with a complex ethical dilemma prepares us to confront moral challenges at work and at home. Reading a novel or examining a piece of art gives us a vocabulary to understand and express aesthetic preferences and ideas.

One possible area of cooperation explored at the National Academies event and elsewhere is the integration of ethics into STEM. A panelist teaching physics at a community college argued that every STEM student should be required to take an ethics course before beginning a STEM major. She argued this would ensure that ethical concerns were foregrounded in scientific study instead of tacked on at the end of a major or relegated to an elective course. This suggestion echoes Cathy O’Neil’s excellent book Weapons of Math Destruction, which urged the integration of ethical matrices into corporate big data projects in order to avoid perpetuating racial, gender, and social biases in the construction of algorithms.


These uneven partnerships have a long history. From physicists deriding social scientists for being insufficiently scientific to political scientists and economists labeling area specialists as mere fact-gatherers, the history of American higher education is riddled with spoiled and fraught interdisciplinary partnerships (see Solovey 2013 for the former and my forthcoming book for the latter). Most of these partnerships failed in either the short or medium term. Their failures are as diverse as the slights that occasioned the split: insufficient funding causing competition between fields or ways of knowing, too much funding disincentivizing collaboration, interpersonal rivalry, the retirement or death of a pioneering advocate, and the list goes on.

Two features of these failures are fairly constant, however: inequality and a lack of empathy. For contemporary interdisciplinary and transdisciplinary partnerships between STEM and other fields to succeed, all participants need to be treated as equal partners even if STEM funding outpaces the arts and humanities. More importantly, all parties must identify with the triumphs and challenges of their compatriots in other fields. Trust is required for honest intellectual exchange and engagement. These relationships take time to build, but they must be built on a foundation of rough equality and respect. Partners must share common goals and a vision for their shared enterprise. Using one another – for accessibility or financial reasons – is not partnership, and interdisciplinary projects rooted in that vision are doomed to duplicate the failures of the past.

On the Pleasures of Slow Reading

Blog, Uncategorized

Like many prospective historians, I was drawn to history because of my love of reading. As a child I spent countless hours reading at home and at the school library. I remember being enchanted by the Scholastic book fair and thrilled at the notion that I would be given – GIVEN – a book just for attending. As I grew older reading remained a sanctuary. In high school and in college, I always had a long reading list and stack of books beside my bed, which I would often read instead of spending time studying assigned coursework. Longer or more difficult books would hang around. I’d pick them up and get sidetracked only to return to them weeks or months later.

Becoming a historian seemed like the perfect career path to take my love of reading from hobby to profession. This was (obviously) before I knew what being a historian actually entailed. I learned a little bit about archival research in one of my undergraduate courses, but saw it as supplemental to secondary source reading. I also didn’t understand how difficult the path would be – both through graduate school and finding a career as a historian after graduation.

Still, what shook me most in graduate school was its approach to reading. The sheer pace of the exercise was exhausting; books were mutilated (“just read the intro and a review” was the common refrain) into content that needed to be crammed for classes and comprehensives. Graduate school reading was nothing like any reading I had done before. It was a marathon that felt like a sprint.

So, also like many graduate students in the humanities, my love of reading waned. I didn’t read for pleasure very often and when I did it was either for circumstantial (places with no internet) or social (book clubs) reasons. I lost touch with why books had mattered to me.

Choosing not to pursue an academic career was traumatic, but a silver lining has been that my passion for reading has been reignited. No longer taking part in the information-gathering arms race that is graduate school has allowed me to not only read slower, but also pause and reflect on what I’ve read.

It’s wonderful to really live with a book, to let it sit with you and accompany you. This has held true for fiction and non-fiction. I’ve been reading Thomas Mann’s The Magic Mountain for about three months, slowly chipping away at it in chunks of ten to twenty pages. Since it’s in many ways a novel about how time passes, this approach has made me appreciate how Mann structured the novel to flow with the uneven ebbs and flows of subjective time. It has been similarly rewarding to read Jill Lepore’s These Truths. Like Mann, Lepore is a pleasure to read, and her prose and organization reward close reading. It’s also refreshing to read a longue durée history from beginning to end and not feel the pressure to skip around or mine for argument.

Obviously, this approach is made possible by the privilege of working routine hours and having a low-stress, low-responsibility home life. At the same time, I can’t help but wonder if I would have learned better – maybe not more, but more deeply – without the pressure to consume as much information as possible. I also wonder if this type of churn disadvantages certain types of students who would benefit from more time to read and reflect on each item.

As anyone who has written a dissertation (or any long document) knows well, it takes time to write. So, too, to read. Even though there are many days when I yearn for the intellectual engagement and debate of my graduate school years, the solace of slow reading tempers my nostalgia for those days and reminds me of the promise life after graduate school holds for intellectual growth.

This time of year can be full of holiday- and project-related bustle. Looming end of year deadlines can further heighten an already acute anxiety about not working fast enough. But if you can, resist the urge to rush, take some time to sit down, and read slow. 

The Bridge from Academe

Advice, Blog

My career as a history graduate student seems like a lifetime ago already. It was January 2018 and I had just defended my dissertation, removing the one constant that had unified my intellectual life for the past five years. I didn’t really have any job prospects. My furtive attempts to secure a tenure-track academic job the previous year had been met with such a deafening silence from university employers that I realized I needed an alternate path. I had applied to a bunch of university administration jobs in the kind of slipshod, arbitrary manner that showed that I had no idea what I was doing. Even though I had held many part-time jobs on campus, I wasn’t sure how any of them translated to an actual career. I was simply looking to do something because I wasn’t going to be able to make a living doing what I was trained to do.

Now almost two months into my job as the Publications and Communications Manager at the Council of Graduate Schools (CGS), it’s stunning to reflect on how far away that all seems. The restless nights after receiving another rejection. The weird combination of elation and anxiety that surged through my body after opening an email requesting an interview. All the wonderful and kind people I met in the hodgepodge of part-time jobs I worked as I tried to figure out what to do next. It all seems distant.

And in some ways it is distant. I’ve moved to a new city. Reconnected with old friends and met new people. The 9-to-5 office schedule provides a structure that I never had in graduate school, though it can feel stifling on a beautiful day (of which, fortunately, between the scalding heat and pouring rain, there are few in Washington). I used to wake up at 8; now I wake up at 6. Everything seems different.

I’m not sure there’s a lesson here. The usual platitudes – all variants of ‘don’t worry it will get better!’ – are too simple. The emotions are much more complex. There is certainly relief that the economic insecurity of graduate school and the post-graduate transition is over. I am incredibly lucky to have found a job that I enjoy at a supportive organization staffed by smart, compassionate people.

But there is also a nostalgic patina that is already starting to wear away the sharpness of those desperate feelings of my last days in graduate school. I had the opportunity to do the research that I loved. I got to work with students, faculty, and administrators from different backgrounds and with different life experiences who broadened my worldview. I lived in a beautiful place and was surrounded by people who cared about me (and hopefully still do).

Even though graduate school seems far away, real questions and doubts persist in my mind. As Erin Bartram and others have more eloquently noted, there is an identity crisis that accompanies the transition out of academia and into another field. For many of us, and I know for me, “intellectual-in-training” or “scholar” was a core component of who we were and wanted to be. I often wonder if being a scholar or historian will be an important part of my identity going forward or, if through some transformative magic, it’s already gone. I would still like to be part of the history community, participate in scholarly debates in my field, and publish on my research interests. Since my research never seemed to generate much interest when I was a graduate student, however, I’m not sure I have enough momentum or a large enough public profile to continue on as part of that community.

I’ve been thinking often about how to bridge my former and current employment identities. I agree with the career pathways approach that rejects the bifurcation between academic and “alt-ac” employment trajectories in favor of thinking more holistically about post-graduate employment. That sounds good as an ideal, but what does it look like as a lived experience? Is it really possible to pursue multiple career paths simultaneously without (either implicitly or explicitly) ranking them? Are we fighting in vain against the inevitable disappointment that will occur when graduate students enroll in programs expecting to become professors only to realize that career path is an incredibly difficult one? There are no easy answers to these questions.

Changing a core component of one’s identity is trying, but if we can work together as current graduate students, faculty, administrators, and alumni I think we can better show graduate students the many opportunities available to them. There will never be a perfect solution, but I think there is room to make the post-graduate identity change feel less like a crisis and more like a new phase of personal and intellectual discovery. 

Writing a Book Proposal – A Crowd-Sourced How-To

Advice, Blog

About a month ago I wrote on Twitter that I was looking for advice about how to write a book proposal. I am revising my dissertation into a book manuscript and wanted to begin writing a book proposal to circulate to academic presses. I do not currently work in a university and had no idea where to start, so I put a call out on Twitter. I received so many wonderful responses that I decided to compile them all in a Twitter story and post it here. The most interesting revelation was the diversity of responses I received. It seems there really are many different ways to successfully compose a book proposal that will be accepted by a university press.

I hope others will benefit (as I already have) from all this generous advice. The Twitter story can be found by clicking the link below:

http://wke.lt/w/s/K7zBN

The Arc-Hive Mind?

Blog

I have spent this past week conducting preliminary research for my dissertation prospectus at the National Archives II in College Park, Maryland. While there I had an epiphany (of sorts) about the problems academic historians have reaching a broader public. Unlike other critics who point to the difficulties presented by academic writing or stress that the nuance of academic histories is too much for the “average” reader (whoever that is), I think the problem is the archive – or rather, the relationship between academic standards of evidence and narrative.

Two interrelated events led to my epiphany. Working at a massive research facility like Archives II, I was confronted by a dizzying array of brilliant historians and archivists. Everyone had a thoughtful research project. The reading room resembled a historian hive with researchers buzzing between carts, computers, and copy machines. The collection of source material divided into its binders, folders, boxes, and carts was overwhelming. But one thing was missing: narratives. There was no box labeled “Narratives” and you couldn’t find it searching through the online database (believe me, I tried).

After a long day at the archive, I was killing time in Dupont Circle before meeting friends for drinks. I wandered into Kramerbooks and Afterwords – a charming independent bookstore – and began browsing their new history releases section. Being a poor graduate student and an Amazon Prime member, I rarely peruse the new history releases and was shocked by how few names I recognized despite being a professional (in training) historian. Flipping through books by Mark Kurlansky and other popular historians, I – again – noticed something was missing: the archive. Most of the evidence in the popular histories was from published primary and secondary sources; unsurprisingly, many journalists were partial to newspapers. These histories were overwhelmingly story-driven. The few histories I had read were by old or deceased academics like Hofstadter, Lasch, Schlesinger Jr., Zinn, and Woodward. It seemed as though the history academy had lost the popular following it enjoyed during the 1960s and 1970s. But was their work really so different? Feeling confused, I walked out empty-handed.

I thought about all the hustle and bustle in the archives earlier in the day and wondered: where is the output from all this research going? Who is reading all the brilliant work produced by researchers at Archives II? How did popular historians come to fill the void left by academic historians of earlier eras?

Then I began to think of the archive. Hofstadter was notoriously archive-resistant. Lasch seems to have drifted from archival work as he became more of a public intellectual after the publication of The Culture of Narcissism in 1979. Schlesinger’s The Vital Center is almost entirely supported by secondary sources. I thought of all the little archive boxes filled to the brim with folders and wondered: are archives limiting our horizons as historians? This problem seems particularly acute among young historians who are trying to demonstrate academic rigor to their senior colleagues. I remember writing my master’s thesis and building chapters around novel archival evidence instead of published material because I thought it would showcase my research ability. Narrative was often the last of my concerns or something I simply hoped would happen if I added enough biographical anecdotes.

Beyond limiting research, the archive also seems to exacerbate the bifurcation between academic historical research and politics. It seems to me that the archive creates another separate community – alongside the academy and scholarly associations – that isolates academic historians from the public. Academic isolation has intensified the professorial argot, but more importantly it has led to groupthink about what is important and how to communicate the themes academics believe are significant to a broader audience. I was shocked by how few histories of race, sexuality, and gender were carried at a well-stocked independent bookstore (the ones that were carried being written by non-academic historians like Isabel Wilkerson). Military histories still predominate in bookstores and on television, despite their declining significance in the academy. As the Supreme Court debate over DOMA continues, the value of historical context to politics is readily apparent. Histories of gender and sexuality composed by articulate academics, if widely read, could help Americans understand why gay marriage is an important issue in the same way C. Vann Woodward’s The Strange Career of Jim Crow made legible the origins of the Civil Rights Movement.

Obviously, I think academic historians need to look beyond the archive and focus on storytelling to broaden our audience. Outgoing American Historical Association president William Cronon has already made this point in his annual address, so there is no point in belaboring it here, but the question remains: how can professional historians be encouraged to publish books for public consumption while retaining high research standards? Though it will undoubtedly be a long and difficult process, I am confident an accommodation can be reached. It will mean, however, that historians will have to enter the scrum of politics where the rules of reasoned debate rarely hold. But getting a little dirty is a small price to pay for the transformative potential of academic historians acting as popular social critics.

Researchers at Archives II and other reading rooms across the globe have a tremendous wealth of information to share. It is time we began to focus on communicating that information to the widest popular audience. To accomplish this, researchers will need to leave the arc-hive and venture out with their knowledge to pollinate the world.

I Defended My Dissertation – What Do I Do Next?

Blog, Uncategorized

I defended my dissertation this past December. In the defense’s afterglow, I started to wonder – what’s next? I knew I wanted to turn it into a book, but I had no idea where to start. Should I contact publishers? Start writing a book proposal? My advisor suggested stepping away from the project to gain perspective. Still, I was skeptical. I didn’t have a job and thought the only way to get a job (or even a post-doc) was to have, at the very least, the book under contract.

After taking some time to think through this, I did what any good millennial would do: took to Twitter. I received a staggering number of replies from professors, publishers, and graduate students offering advice or commiserating about the opacity of the publishing process. I am distilling the substance of the discussion into five points, in the hope that in the future they can guide graduate students wrestling with the same questions.

  1. Take a break. Writing a monograph-length dissertation is a long and, oftentimes, arduous process. In the process, it is easy to lose perspective. In my case, I had largely stopped reading books outside my narrow subdiscipline and immersed myself in the project’s archival and primary sources. Most Twitter respondents recommended putting the manuscript aside, focusing on other projects (an article, teaching a class, etc.), and returning to it with fresh eyes after six months to a year. By then, the hope is that you can bring a new perspective to the project.

  2. Read other things. Fresh perspective cannot be attained through idleness, however. When taking a break from the manuscript, respondents advised delving into books outside your disciplinary niche. Many recommended reading fiction, since its narrative drive and readability are two qualities lacking in many manuscripts. Others recommended reading prize-winning books. These could act as models for a book proposal or provide insight into how best to frame arguments. For Twitter respondents the message was nearly universal: the best writing begins with omnivorous reading.

  3. Network. As with any other employment opportunity, finding a publisher for your manuscript is most easily achieved through networking. The best place to find these networking opportunities is at academic conferences. Respondents in the publishing industry shared that they meet many first-time academic authors at conferences. Larger academic conferences (AHA, OAH, MLA, etc.) usually have the greatest number of publishers, but smaller conferences can present greater opportunities to meet and have sustained discussions with publisher representatives. If conferences are too expensive (and for many graduate students they are), rely on your existing social network. Ask your advisor, faculty in your department, or alumni if there is anyone at their publisher you could speak to about your manuscript.

  4. Write your dissertation as a book. If you have not yet completed your dissertation or are in the beginning stages of your graduate career, you may want to think about your dissertation as a book. This is a polarizing approach and one that will need to be worked out with your advisor. There are at least two ways of thinking about a dissertation. First, the dissertation-as-certification approach sees the dissertation as a document proving your abilities as a scholar. This generally means lengthy forays into historiography, rigorous citation using mostly archival sources, and favoring argument over narrative. Scholars advocating this approach see the dissertation as a showcase for all the skills you have learned as a graduate student and the defense of the project as certification that you belong in the company of other professional academic historians. Second, the dissertation-as-book believers argue that since the real disciplinary standard is a publishable manuscript, emphasis should be placed on the traits publishers find desirable – narrative, clear argument, and a clear writing style – over skill demonstration. While there is disagreement over which approach is best, writing the dissertation as a book has obvious benefits in the transition from manuscript to published book.

  5. Write a different book. The most surprising suggestion I received was not to transform the manuscript into a book at all. Instead, these respondents suggested thinking of the book as a totally different project from the dissertation. On the surface this seems ridiculous. I just spent five, six, seven years writing a dissertation and now you’re telling me to scrap it and start over! What a waste of time! Yet, when you think more deeply about the divergences in form and audience, thinking about the book as a new project makes more sense (particularly if you took the dissertation-as-certification approach, as I did). One respondent put it particularly succinctly: “You don’t revise your dissertation; you steal from your dissertation while you’re writing your first book.” Thinking about the book as a second project can free you to think more capaciously about your topic than trying to revise a dissertation intended for a narrower audience and with more limited objectives.

These five points are heuristics for the manuscript-into-book transformation that I intend to follow over the next six months. I’m sure there will be disagreement and all of these points are subject to debate (and if you have further questions or comments feel free to post below). Thank you to everyone who responded and I hope this short memo will help graduate students feel a little less lost after they defend.

It’s Only Business: Neoliberalism and the National Basketball Association

Blog

Following a hotly contested NBA Finals Game Six, which saw the Golden State Warriors’ star guard Stephen Curry ejected for throwing his mouthpiece into the stands, Curry’s wife, Ayesha, wrote several controversial tweets expressing her displeasure with how the game was officiated. “I’ve lost all respect sorry this is absolutely rigged for money,” Ayesha Curry wrote, “Or ratings not sure which.” Though she deleted the tweet and explained that she had written it out of frustration with the treatment of her father by Cavaliers security, Curry’s tweet gave voice to a rumor widely whispered on NBA message boards and joked about between friends after a few beers: that the 2016 NBA playoffs have been officiated to keep series competitive, ensuring they last longer and mean more money for sponsors, television providers, and league owners.

Whispers that the 2016 playoffs were rigged began in Game Three of the Western Conference Finals when Warriors forward Draymond Green kicked Oklahoma City Thunder center Steven Adams in the groin. Green received a flagrant 1 foul on the play and, though it was upgraded to a flagrant 2 upon league review, did not receive a suspension for Game Four. Rumors swirled that Green was not suspended because, with the Warriors down 2-1 and Green playing such a pivotal role as a rebounder and defender, his absence would ensure a Thunder victory and, theoretically, a short series.

Cries of conspiracy grew louder after Game Four of the NBA Finals when Draymond Green was suspended for flagrant foul accumulation after a swipe at the groin of Cleveland Cavaliers’ superstar LeBron James. At the time, Golden State led the series 3-1, most of the games had been blowouts, and as the series moved back to the Bay Area for a close-out Game Five (the Warriors had only lost two games at home all year) it appeared a five-game series was in the offing. With heroic performances by James and Kyrie Irving, the Cavs won in Oakland and now the series is moving to a Game Seven. Even mainstream journalists like ESPN’s Michael Wilbon questioned whether the NBA suspended Green to make the series more competitive.

Accusations of league tampering are nothing new. In the inaugural draft lottery, the NBA was accused of using a cold envelope to guarantee the New York Knicks would receive the first overall draft pick – and with it the rights to draft Georgetown University superstar Patrick Ewing. NBA Commissioner David Stern was said to have rigged the lottery for New York because the city was the league’s largest market and the selection of Ewing would immediately make the Knicks a playoff contender. Similarly, in the 2002 Western Conference Finals between the Los Angeles Lakers and Sacramento Kings, a series of bad refereeing decisions allowed the Lakers to erase a 3-2 series deficit and win in overtime of Game Seven. A massive free throw disparity (40-25 Lakers) and the fouling out of Kings’ big men Vlade Divac and Scot Pollard helped the Lakers win a close Game Six. Suspicions of impropriety were seemingly confirmed in 2007 when Tim Donaghy, one of the referees for the game, pleaded guilty to betting on games he officiated and said he had shaved points by calling phantom fouls on teams he was betting against. While David Stern claimed Donaghy was a lone wolf and that his calls did not impact the results of games, Donaghy served over two years in prison for crimes related to illegal gambling.

Yet, what separates charges of conspiracy during the 2016 playoffs from earlier accusations during the David Stern period is that the 2016 conspiracies are not “for” any particular team or player but, instead, for the league as a business. In one sense this points to the NBA’s national viewership. With stars like Curry, LeBron, and Kevin Durant attracting national audiences, it is no longer necessary to have big-market teams like the Knicks and Lakers reach the Finals to attract large viewerships. NBA Finals Game One set a league record with over 19 million viewers, despite two mid-market teams vying for the title. Game Six set the record once again with over 20.7 million viewers. The evidence suggests that Game Seven will push the viewership record even higher on Sunday. What Ayesha Curry is claiming is supported by the viewership numbers: it is in the NBA’s best business interest to extend the 2016 Finals for as many games as possible.

All the 2016 conspiracy talk brings into relief a conflict for the NBA and its fans – balancing the league as a business with the sanctity of free competition in professional sports. The NBA wants to make the most money it can, but it is constrained by the sporting ethic of non-interference in free competition. If too many fans believe the NBA is fixed, it will likely lose its audience and cease to be a successful business. Yet, as Americans living in a capitalist society know all too well, when business interests and the public interest clash it is usually business that wins out. The American university, which was once thought to value education and freedom of thought above all, has gradually succumbed to operating like a business, cutting positions and programs that are not profitable or are not thought to have instrumental value. Public transport has been privatized in many major American cities, and those companies have raised fares to the point that riding is a hardship for many. Why should sports be exempt from these economic pressures when civic institutions like education and transportation have already succumbed?

So why should the values of free competition in professional sports be any different? As the subtle market-logic of neoliberalism creeps deeper into American institutions, it is an open question whether any professional sphere exists outside of profit-making. While accusations of the NBA putting profits over competition have frustrated Ayesha Curry and Thunder fans, the vast majority of the American public is excited by the opportunity for more basketball. And isn’t it the cardinal rule of American capitalism that more is always better?

President Obama’s Moral Revolution and Its Dying Vanguard

Blog

On May 27, President Obama became the first US president to visit Hiroshima since the United States dropped an atomic bomb on that city on August 6, 1945. The visit was the culmination of a long process of reconciliation between the United States and Japan since the end of World War II when Japan changed from wartime enemy to valuable ally almost overnight. In his speech delivered before the bombing’s survivors and their relatives, President Obama called for a “moral revolution” in the face of increased technological capacities to kill large numbers of people. “Technological progress without an equivalent progress in human institutions can doom us,” he warned (full text of the speech can be found here).

President Obama’s speech echoed the concern about technological change outpacing human moral capacity that many American journalists and academics felt in Hiroshima’s immediate aftermath. Journalist John Hersey, sent to interview bombing survivors for The New Yorker, told stories of dread, shock, and suffering. Lieutenant Daniel McGovern captured film footage of the bomb’s impact showing bombed-out buildings and the bleached skulls of the blast’s victims. Upon hearing about the bombing, the painter Pablo Picasso is supposed to have remarked, “the genius of Einstein leads to Hiroshima,” linking the beauty of scientific discovery to the devastation of instantaneous mass murder.

Nothing captured concerns about the ethics of using an atomic bomb better than Alexander Leighton’s 1949 book Human Relations in a Changing World, however. Before arriving in Hiroshima in December 1945 to map the bomb’s psychological effects on Japanese civilians for the United States Strategic Bombing Survey, Leighton had spearheaded research on the morale of Japanese-Americans interned at the Colorado River War Relocation Center at Poston, Arizona. The aim of his research at Poston was to assess how the Japanese community responded to the stress of relocation and internment. Leighton hoped that an administration informed by social scientific knowledge – group psychology in particular – would be more efficient and humane. While Leighton did not oppose internment, he advocated administrative reform in the camps, emphasizing cooperation between administration and internees on issues ranging from public health to community leadership in the hope of combating the dehumanization of internees. When he left in 1943 to take a job with the Office of War Information, Leighton was confident that social science had ameliorated conditions in the camps by improving relations between camp administrators and internees.

Leighton’s attitudes differed markedly between his work at Poston and his trip to Hiroshima, showcasing a lost confidence in the ability of administrative reform to keep pace with the technology of dehumanization and killing. Whereas the poor conditions at Poston – sweltering heat, unsanitary and overcrowded facilities, and popular distrust of administrators – could be overcome by administrative reforms and improved communication, at Hiroshima there was little left to reform. Describing his first impression upon arriving at Hiroshima, Leighton invoked a “city dump with its smells of wet ashes, mold and things rotting, but one that runs from your feet out almost to the limits of vision.” The 4.4 square miles of downtown Hiroshima were completely destroyed. Leighton found a people shattered by the experience of vaporized lives and lost loved ones. An elderly schoolteacher told Leighton the bomb had transformed Hiroshima from “Paradise to Hades” in an instant. What haunted Leighton most was the feeling that Hiroshima was only the beginning. “This is a preview of things to come where I live,” he wrote, “These thousands dead will not be strangers. They will include friends, brother, sister, father, mother, wife and children. Some will die instantly and some will survive awhile to realize their slow death and to beckon for help where there will be no one to answer.”

Leighton came to believe Hiroshima was made possible by the outpacing of moral or civilizational progress by technological development. He hoped that social scientific advances would make using weapons of mass destruction obsolete by easing international tensions. Work in the fields of sociology and anthropology had important roles to play as well, highlighting commonalities unifying the human species. Furthermore, the very place of the social sciences in tying the impersonal work of the hard sciences to the moral world of human beings was significant. Leighton believed social scientific interventions into the natural sciences were necessary for moral guidance. “Moral values when pertinent dominate scientific values at three contiguous points: the selection of the problem to be investigated, the limits of the human and other materials that may be used, and the determination of what shall be done with the results.” Social scientists with their specialty in human values and experience would prevent scientists from privileging scientific theories and results over ethical concerns.

Leighton made numerous recommendations for how to disseminate social scientific knowledge ranging from expanded university fellowships to public education initiatives. Explaining the values and experiences unifying humanity was, for Leighton and others who experienced Hiroshima’s aftermath, an obligation shared across American society from policymakers in Washington to families in small towns.

Leighton’s suggestions make for uneasy reading alongside the continued national defunding of the social sciences during the Obama administration. The Obama administration has vocally supported the STEM fields but has offered a lukewarm (at best) endorsement of the social sciences and humanities. In April 2015 the Republican-led House of Representatives Committee on Science, Space and Technology proposed a 45% reduction in federal funding for the social sciences (a useful summary can be found here), all while increasing the overall budget for the National Science Foundation, “adding more than one hundred million dollars each to the offices for biology, computing, engineering, math, and physical sciences.” National cuts reflect declining university enrollments in the social sciences. The University of Washington, for example, reported declining enrollments in the social sciences ranging from four to forty-five percent depending on the department and responded by cutting twenty-five teaching assistant positions. The 2015 panic over national cuts confirmed fears that waning American economic competitiveness made separating the “useful” natural sciences from the superfluous social sciences a priority for policymakers and universities alike.

Alexander H. Leighton in Poston, AZ during World War II.

President Obama’s visit comes at a crucial moment as America’s East Asian allies are challenged in the South and East China Seas by an expansionist China. His speech was both a reaffirmation of his commitment to Japan as a US ally and a warning to China about the dangers of expansionism. The President’s speech also underlined the perils of dehumanizing language for American audiences. Donald Trump has risen to the Republican Presidential nomination on hateful rhetoric meant to demonize racial, gender, and cultural “others” as inferior and dangerous.

The moral revolution Obama sees as the antidote to aggressive expansionism abroad and xenophobic nationalism at home begins by reaffirming the human obligations of global citizenship. Yet, it is difficult to imagine constructing a civically responsible American populace while systematically defunding its social scientific and humanistic vanguard. Moral revolutions are not spontaneous. They begin with an understanding of the current ethical problems facing humankind and of how we are all facing those problems together as part of a single global community. The social sciences and humanities have an important role to play in demystifying other cultures and educating Americans on how to become contributing global citizens.

Making History Too Big To Fail?: The History Manifesto and the Return of History as Science

Blog

In The History Manifesto, historians David Armitage and Jo Guldi add their voices to a growing body of literature on the humanities in crisis.[1] As the book’s title suggests, they focus on the field of history and its diminishing influence on public life. This descent into irrelevance is not, they argue, the result of a changing public. Instead, academic historians’ embrace of short-term thinking has made the discipline unresponsive to the global crises of the day, including wealth inequality and environmental degradation.

Despite historians’ retreat from their publics, Armitage and Guldi are hopeful about the resuscitation of publicly minded history. They see two ways for historians to regain their audiences. First, historians need to reject short-termism and return to studying longue durée narratives. Looking back to earlier history (it seems here that Armitage and Guldi are thinking pre-industrial) will allow historians to show policymakers real alternatives to ingrained economic and political systems. They provide many examples of historians who have used the longue durée to challenge established institutions and systems, including the Webbs, R.H. Tawney, and Eric Hobsbawm. Furthermore, Armitage and Guldi see an expansion of temporal scope as a useful counterpart to historians’ embrace of larger geographic areas. Just as transnational oceanic, continental, and comparative imperial histories have allowed historians to tell new stories about systems, institutions, and ideas, an expanded time scale would provide the same benefits. Their second solution is an embrace of “big data.” Armitage and Guldi believe historians are uniquely suited to effectively utilize big data because of their ability to make data meaningful through contextualization and narrative: “History has an important role to play in developing standards, techniques, and theories suited to the analysis of mutually incompatible datasets where a temporal element is crucial to making sense of causation and correlation” (104).

To Armitage and Guldi, a focus on the longue durée and an embrace of big data must go hand in hand in combating short-termism. A long temporal scope brings with it the inevitable problem of information overload. The tools of big data offer a solution to this problem by condensing and visualizing large datasets into manageable graphs, maps, and charts. They highlight the Google Ngram Viewer as a freely available, easy-to-use big data tool already being used by historians. Embracing big data also offers historians a way to remain relevant in a technologically modernizing university. Armitage and Guldi recognize that the crisis of the humanities extends to employment as well as larger social relevance: “If History departments train designers of tools and analysts of big data, they stand to manufacture students on the cutting edge of knowledge-making within and beyond the academy” (107).
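To make the kind of analysis they have in mind a little more concrete, here is a minimal, hypothetical sketch of the Ngram-style exercise a historian might run on their own digitized sources: counting how often a term appears, year by year, and plotting its relative frequency. This is only an illustration, not anything drawn from The History Manifesto; the file name corpus.csv and its year and text columns are assumptions made for the example.

    # Illustrative sketch only: trace a term's relative frequency over time in a
    # hypothetical corpus ("corpus.csv" with "year" and "text" columns is assumed).
    import pandas as pd
    import matplotlib.pyplot as plt

    TERM = "longue durée"  # the phrase to trace; any term works

    docs = pd.read_csv("corpus.csv")  # assumed columns: year, text

    # Count the term's occurrences and the total words in each document so the
    # final measure is a relative frequency rather than a raw count.
    docs["hits"] = docs["text"].str.lower().str.count(TERM.lower())
    docs["words"] = docs["text"].str.split().str.len()

    yearly = docs.groupby("year")[["hits", "words"]].sum()
    yearly["freq"] = yearly["hits"] / yearly["words"]

    yearly["freq"].plot(title='Relative frequency of "%s" by year' % TERM)
    plt.xlabel("Year")
    plt.ylabel("Occurrences per word")
    plt.tight_layout()
    plt.show()

Normalizing by the total words per year is what keeps the trend line comparable across periods with very different amounts of surviving text, which is the basic move behind the Ngram Viewer’s charts as well.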

While The History Manifesto presents itself as a revolutionary way of approaching novel problems facing contemporary history scholarship, its solutions are old ones. The crisis of the public intellectual has been a preoccupation of historians since at least the 1980s, if not earlier. A lack of responsiveness to public needs is often viewed as an important explanation for the public intellectual’s demise. The entire discourse surrounding the “ivory tower” is a reflection on academics’ insecurities about their relationship to the larger public and perceived differences between subjects of scholarly interest and public needs. Even if historians embrace longue durées and big data, it seems unlikely that the tension between scholarly freedom of inquiry and the public embrace of intellectuals will be resolved.

The tools proposed by Armitage and Guldi also have problems. As Deborah Cohen and Peter Mandler have convincingly demonstrated, The History Manifesto’s data often doesn’t support its argument.[2] Historians have not retreated from longer temporal studies and embraced short-termism. Anecdotally, scholars held up by Armitage and Guldi as exemplars of long-term thinking, like Arthur Schlesinger Jr. and Charles Beard, published many books examining short time periods. It’s also unclear how longue durées will lead to more relevant history scholarship. As several others have pointed out, some of the most politically engaged fields – notably the history of American capitalism – seem particularly plagued by short-termism, but only because the ascendancy of neoliberalism is a relatively recent phenomenon.

The turn to big data represents another tried and true response by the history field in times of crisis: an appeal to science. An earlier history crisis in the mid-20th century provides a useful lens for understanding both the current crisis and the Armitage/Guldi response. After World War II, the expansion of social science funding and prestige put history in an undesirable position. While leading scholars like Arthur Schlesinger Jr. had success as traditional narrative historians, younger academics and those in less established sub-disciplines were concerned about diminished influence and funding. An enterprising few, particularly those studying strategically important areas like the Soviet Union and China, began rebranding the discipline as a social science. These historians claimed that they were like scientists because they used data to find objective answers to historical questions.

They envisioned a substantive role for history in interdisciplinary social science in particular. In an interesting parallel to Armitage and Guldi, they claimed that historians were essential in contextualizing the findings of other social scientists. At Harvard, Washington, and Johns Hopkins (among others), interdisciplinary social science programs were established with substantial history components. At Johns Hopkins, Owen Lattimore led an interdisciplinary study of Xinjiang province in China. Its aim was to analyze the province’s politics and its role as a Chinese frontier, but a large part of the study was devoted to understanding how its history shaped its politics. These programs were incredibly successful at attracting funding and grew exponentially between 1945 and 1965. As the hierarchy of university disciplines shifted in the mid-20th century from a humanities-centered university toward one more in line with American national security interests (what Rebecca Lowen has called “the Cold War university”), history maintained high standing by appealing to science.[3]

The History Manifesto is a call to revolutionary action. It aims to persuade students and faculty to use the longue durée and new technology to seek broader audiences and answer bigger questions. These are noble and worthwhile goals. They are also not revolutionary. As Deborah Cohen and Peter Mandler have shown, the longue durée never left and the short-termism of recent historical scholarship is a canard. The appeal to data is similarly an old tactic dressed up in new language and concepts. In an age when the value of scientific research is rarely questioned amid massive cuts to university budgets, it is natural for historians to appeal to science in an effort to defend their discipline. It has worked in the past (to a certain extent) and the current enthusiasm around big data may allow it to succeed again. Still, in the spirit of Armitage and Guldi, I think it’s important not to become myopically focused on the current crisis. Instead, a deeper exploration of why history faces periodic methodological crises is necessary. It is also necessary to define with greater precision what public engagement means for scholars. While size and scale are important metrics for gauging influence, extension without clear goals can not only compromise historians’ relations with a wider public but jeopardize our stature within the university as well.

[1] David Armitage and Jo Guldi, The History Manifesto (Cambridge, UK: Cambridge University Press, 2014).

[2] This is not to say Cohen and Mandler’s response is unproblematic. They go too far in opposing Armitage and Guldi to the point of denying any sort of crisis in the humanities. Though their riposte was likely intended as a full-throated call for methodological pluralism, it often reads as a defense of the status quo.

[3] Rebecca S. Lowen, Creating the Cold War University: The Transformation of Stanford (Berkeley: University of California Press, 1997).
