Tag: higher education

More Than Just Packaging?: The Humanities and Interdisciplinarity in the Modern University

Blog

Several weeks ago I had the privilege of attending the National Academies of Sciences, Engineering, and Medicine event on “Intersecting Identities and Transdisciplinary Research at Historically Black Colleges and Universities.” It was a fascinating event that showed how resource scarcity at HBCUs and their strong commitment to a common mission have created fertile ground for trans- and interdisciplinary partnerships across STEM, the arts, and the humanities. Speakers also argued that there is a consonance between the promotion of interdisciplinarity across academic fields and attempts to promote demographic diversity in student and faculty populations (Branches from the Same Tree, 15-16). Both rely on creating a flexible academy able to embrace multiple points of view and integrate them into a coherent whole. In an era when the university is in flux – shifting ever more toward the STEM fields and increasingly responsive to employer influence on curriculum – it was heartening, as a humanist, to hear STEM faculty and university administrators speak to the value of humanistic learning in creating well-rounded thinkers and informed citizens.

Despite the generally positive tone of the discussion, it was hard to look at the examples of transdisciplinary cooperation and not wonder whether the arts and humanities were really equal partners in this endeavor or simply a way to make scientific and mathematical concepts more palatable to students. A representative from one university spoke about an entrepreneurship class that reached across disciplinary lines to design, create, package, and market a product. The purpose of the project was to show how the strengths of different team members were necessary to shepherd a project from an idea to the hands of an interested consumer. Unfortunately, the project was framed in a way that suggested that engineers and scientists would be the principal product developers and creators while the humanists would be there for marketing. This was echoed in a community college class at another institution where music theory and history were used to teach the physics of sound. Again, the concepts were scientific and the communication humanistic.

Marketing and communications are viable and important career pathways for humanists (in fact, that’s the field I have gone into, having earned my doctorate in history). Trans- and interdisciplinary projects that focus solely on the communicative function of the humanities, however, badly misunderstand the value of those fields. The ways that fields like English and philosophy encourage critical engagement are valuable for challenging assumptions and pushing back against groupthink. History teaches research skills that are vital in an information economy where it is often difficult to distinguish fact from fiction. Even more importantly, those fields provide useful knowledge and learning experiences. History helps us understand ourselves and one another. Tangling with a complex ethical dilemma prepares us to confront moral challenges at work and at home. Reading a novel or examining a piece of art gives us a vocabulary to understand and express aesthetic preferences and ideas.

One possible area of cooperation explored at the National Academies event and elsewhere is the integration of ethics into STEM. A panelist teaching physics at a community college argued that every STEM student should be required to take an ethics course before beginning a STEM major. She argued this would ensure that ethical concerns were foregrounded in scientific study instead of tacked on at the end of a major or relegated to an elective course. This suggestion echoes Cathy O’Neil’s excellent book Weapons of Math Destruction, which urges the integration of ethical matrices into corporate big data projects in order to avoid perpetuating racial, gender, and social biases in the construction of algorithms.

These uneven partnerships have a long history. From physicists deriding social scientists for being insufficiently scientific to political scientists and economists labeling area specialists mere fact-gatherers, the history of American higher education is riddled with soured and fraught interdisciplinary partnerships (see Solovey, 2013, for the former and my forthcoming book for the latter). Most of these partnerships failed in the short or medium term. Their failures are as diverse as the slights that occasioned the splits: insufficient funding causing competition between fields or ways of knowing, too much funding disincentivizing collaboration, interpersonal rivalry, the retirement or death of a pioneering advocate, and so on.

Two features of these failures are fairly constant, however: inequality and a lack of empathy. For contemporary interdisciplinary and transdisciplinary partnerships between STEM and other fields to succeed, all participants need to be treated as equal partners even if STEM funding outpaces the arts and humanities. More importantly, all parties must identify with the triumphs and challenges of their compatriots in other fields. Trust is required for honest intellectual exchange and engagement. These relationships take time to build, but they must be built on a foundation of rough equality and respect. Partners must share common goals and a vision for their shared enterprise. Mutual exploitation – for accessibility or financial reasons – is not partnership, and interdisciplinary projects rooted in that vision are doomed to duplicate the failures of the past.

The Bridge from Academe

Advice, Blog

My career as a history graduate student already seems like a lifetime ago. It was January 2018 and I had just defended my dissertation, removing the one constant that had unified my intellectual life for the previous five years. I didn’t really have any job prospects. My futile attempts to secure a tenure-track academic job the previous year had been met with such deafening silence from university employers that I realized I needed an alternate path. I had applied to a bunch of university administration jobs in the kind of slipshod, arbitrary manner that showed I had no idea what I was doing. Even though I had held many part-time jobs on campus, I wasn’t sure how any of them translated to an actual career. I was simply looking to do something because I wasn’t going to be able to make a living doing what I was trained to do.

Now almost two months into my job as the Publications and Communications Manager at the Council of Graduate Schools (CGS), it’s stunning to reflect on how far away that all seems. The restless nights after receiving another rejection. The weird combination of elation and anxiety that surged through my body after opening an email requesting an interview. All the wonderful and kind people I met in the hodgepodge of part-time jobs I worked as I tried to figure out what to do next. It all seems distant.

And in some ways it is distant. I’ve moved to a new city, reconnected with old friends, and met new people. The 9-to-5 office schedule provides a structure that I never had in graduate school, though it can feel stifling on a beautiful day (of which, between the scalding heat and the pouring rain, Washington fortunately has few). I used to wake up at 8; now I wake up at 6. Everything seems different.

I’m not sure there’s a lesson here. The usual platitudes – all variants of ‘don’t worry it will get better!’ – are too simple. The emotions are much more complex. There is certainly relief that the economic insecurity of graduate school and the post-graduate transition is over. I am incredibly lucky to have found a job that I enjoy at a supportive organization staffed by smart, compassionate people.

But there is also a nostalgic patina that is already softening the sharpness of those desperate feelings from my last days in graduate school. I had the opportunity to do the research that I loved. I got to work with students, faculty, and administrators from different backgrounds and with different life experiences who broadened my worldview. I lived in a beautiful place and was surrounded by people who cared about me (and hopefully still do).

Even though graduate school seems far away, real questions and doubts persist in my mind. As Erin Bartram and others have more eloquently noted, there is an identity crisis that accompanies the transition out of academia and into another field. For many of us, and I know for me, “intellectual-in-training” or “scholar” was a core component of who we were and wanted to be. I often wonder whether being a scholar or historian will be an important part of my identity going forward or if, through some transformative magic, it’s already gone. I would still like to be part of the history community, participate in scholarly debates in my field, and publish on my research interests. Since my research never seemed to generate much interest when I was a graduate student, however, I’m not sure I have enough momentum or a large enough public profile to continue on as part of that community.

I’ve been thinking often about how to bridge my former and current employment identities. I agree with the career pathways approach that rejects the bifurcation between academic and “alt-ac” employment trajectories to think more holistically about post-graduate employment. That sounds good as an ideal, but what does that look like as a lived experience? Is it really possible to pursue multiple career paths simultaneously without (either implicitly or explicitly) ranking them? Are we fighting in vain against the inevitable disappointment that will occur when graduate students enroll in programs expecting to become professors only to realize that career path is an incredibly difficult one? There are no easy answers to these questions.

Changing a core component of one’s identity is trying, but if we can work together as current graduate students, faculty, administrators, and alumni I think we can better show graduate students the many opportunities available to them. There will never be a perfect solution, but I think there is room to make the post-graduate identity change feel less like a crisis and more like a new phase of personal and intellectual discovery. 

The Trouble(s) with Dissertations

Uncategorized

Lately, most of the chatter among American historians has focused on two debates about the dissertation. First, should graduate students approach dissertation writing with the intention of having it ready for publication upon completion, or is dissertation writing somehow different from book writing? Second, should universities automatically embargo dissertations – that is, prevent digital copies from being made available to scholars – or allow access to them immediately upon completion? I have largely remained on the sidelines of both debates for several reasons. I am in the early stages of my dissertation research and do not have much valuable wisdom to offer on either topic. I am not particularly interested in these kinds of debates and would rather argue about historical content than the politics of the history profession. But more than anything else, I have stood aside because the answer to both questions seems clear to me: allow each graduate student their own approach to writing their dissertation and the choice of whether – and for how long – their university will embargo it.

I am approaching my dissertation as a dissertation and not as a book. My dissertation is on the history of American China Studies and how it shaped and was shaped by mid-20th century American politics. While I feel it has the potential for mainstream appeal, I am not sure that writing a book for a wide audience is the best way to present the significance of my argument or my skills as a historian to my peers. Fundamentally, I see the dissertation as a certification of one’s qualifications as a historian. Demonstrating these qualifications – the ability to use archives, work in foreign languages, and articulate a novel and significant argument – does not always make for the most compelling reading, even for one’s scholarly peers. Yet I believe it is important for my project and for my potential employers that I demonstrate these skills, although doing so may mean substantial revisions (including cutting, adding, and rewriting chapters) when the dissertation is transformed into a book. The process may take more time, but I am confident in my dissertation prospectus and believe the final product will be well worth the wait.

Though I am not approaching my dissertation as a book, that does not mean every graduate student should avoid writing their dissertation as a book. At the Society of U.S. History blog, Rachel Shelden has given a litany of reasons why writing her dissertation as a book worked for her. Ultimately, each graduate student and their advisors and mentors must choose their own path. There is no “right” answer.

I feel similarly about embargoing dissertations; each student should be allowed to choose whether her dissertation will be embargoed by her university and for how long. The debate over embargoing dissertations was brought to the fore by an American Historical Association statement in June urging universities to embargo all student dissertations. This attracted criticism from many historians who saw the announcement as a foolhardy commitment to the dying medium of the print monograph and a disservice to young scholars and the profession as a whole, keeping the innovative work of young scholars out of the hands of their peers. Further arguments for the embargo have been advanced since the AHA’s initial announcement, most eloquently by former AHA President William Cronon. I understand this puts a lot of stress on university administrators and library personnel who have to process these requests. I understand that it is easier to approach embargoing with an all-or-nothing mentality. But in the end, the dissertation is the intellectual property of the graduate student who researched and wrote it, and they should be allowed to restrict or provide access to it as they see fit.

There are some obvious pitfalls to this case-by-case approach. What if a graduate student forgoes embargoing her dissertation and it is never published as a result? What if a young author’s work is preempted while her dissertation is embargoed? Shouldn’t the university have some control over the dissertation, given that it provided at least some of the financial and material support necessary for its completion? Though these issues may seem significant – and indeed many are – the fundamental point remains that neither the AHA nor the university should be compelling graduate students either to embargo or not to embargo their dissertations. The choice should remain theirs and theirs alone. Historians differ in how they want their work to reach their target audience. Some may want their dissertation to be published as a book; others may not want an academic career and therefore see no need to revise their dissertation into a book. All of these approaches are valid, and the university should be compelled to respect all of them, even when they are inconvenient.

To me, both of these controversies point to the continued employment crisis facing young historians. With their traditional means of intellectual dissemination (the print book) and their workplace (the university) contracting, even as the number of graduate students continues to grow, the uncertainty facing young scholars adds urgency to debates that may seem like small potatoes to outsiders. After all, writing dissertations as books and embargoing dissertations are relevant issues only if there continues to be a publishing industry looking to publish those books and universities looking to hire their writers. Despite their seeming insignificance, both debates highlight the one thing the graduate student does control in this unstable professional climate – their own work and ideas. If control over those ideas and their form is taken out of the young scholar’s hands, be it by the university or the AHA, then there is nothing left for the young historian or the future of the profession.

I Defended My Dissertation – What Do I Do Next?

Blog, Uncategorized

I defended my dissertation this past December. In the defense’s afterglow, I started to wonder – what’s next? I knew I wanted to turn it into a book, but I had no idea where to start. Should I contact publishers? Start writing a book proposal? My advisor suggested stepping away from the project to gain perspective. Still, I was skeptical. I didn’t have a job and thought the only way to get a job (or even a post-doc) was to have, at the very least, the book under contract.

After taking some time to think through this, I did what any good millennial would do: took to Twitter. I received a staggering number of replies from professors, publishers, and graduate students offering advice or commiserating about the opacity of the publishing process. I am distilling the substance of the discussion into five points, in the hope that in the future they can guide graduate students wrestling with the same questions.

1. Take a break. Writing a monograph-length dissertation is a long and oftentimes arduous process, and along the way it is easy to lose perspective. In my case, I had largely stopped reading books outside my narrow subdiscipline and immersed myself in the project’s archival and primary sources. Most Twitter respondents recommended putting the manuscript aside, focusing on other projects (an article, teaching a class, etc.), and returning to it with fresh eyes after six months to a year. By then, the hope is, you can bring a new perspective to the project.

2. Read other things. Fresh perspective cannot be attained through idleness, however. When taking a break from the manuscript, respondents advised delving into books outside your disciplinary niche. Many recommended reading fiction, since its emphasis on narrative and readability showcases two qualities lacking in many manuscripts. Others recommended reading prize-winning books, which can act as models for a book proposal or provide insight into how best to frame arguments. For Twitter respondents the message was nearly universal: the best writing begins with omnivorous reading.

3. Network. Like any other employment opportunity, finding a publisher for your manuscript is most easily achieved through networking. The best place to find these networking opportunities is at academic conferences. Respondents in the publishing industry shared that they meet many first-time academic authors at conferences. Larger academic conferences (AHA, OAH, MLA, etc.) usually have the greatest number of publishers, but smaller conferences can present greater opportunities to meet and have sustained discussions with publishers’ representatives. If conferences are too expensive (and for many graduate students they are), rely on your existing social network. Ask your advisor, faculty in your department, or alumni whether there is anyone at their publisher you could speak to about your manuscript.

4. Write your dissertation as a book. If you have not yet completed your dissertation or are in the beginning stages of your graduate career, you may want to think about your dissertation as a book. This is a polarizing approach and one that will need to be worked out with your advisor. There are at least two ways of thinking about a dissertation. First, the dissertation-as-certification approach sees the dissertation as a document proving your abilities as a scholar. This generally means lengthy forays into historiography, rigorous citation drawing mostly on archival sources, and favoring argument over narrative. Scholars advocating this approach see the dissertation as a showcase for all the skills you have learned as a graduate student and the defense of the project as certification that you belong in the company of other professional academic historians. Second, dissertation-as-book believers argue that since the real disciplinary standard is a publishable manuscript, emphasis should be placed on the traits publishers find desirable – narrative, a clear argument, and an accessible writing style – over skill demonstration. While there is disagreement over which approach is best, writing the dissertation as a book has obvious benefits in the transition from manuscript to published book.

5. Write a different book. The most surprising suggestion I received was not to transform the manuscript into a book at all. Instead, these respondents suggested thinking of the book as a totally different project from the dissertation. On the surface this seems ridiculous. I just spent five, six, seven years writing a dissertation and now you’re telling me to scrap it and start over! What a waste of time! Yet, when you think more deeply about the divergences in form and audience, approaching the book as a new project makes more sense (particularly if you took the dissertation-as-certification approach, as I did). One respondent put it particularly succinctly: “You don’t revise your dissertation; you steal from your dissertation while you’re writing your first book.” Thinking of the book as a second project can free you to think more capaciously about your topic than trying to revise a dissertation intended for a narrower audience and with more limited objectives.

These five points are heuristics for the manuscript-into-book transformation that I intend to follow over the next six months. I’m sure there will be disagreement and all of these points are subject to debate (and if you have further questions or comments feel free to post below). Thank you to everyone who responded and I hope this short memo will help graduate students feel a little less lost after they defend.

President Obama’s Moral Revolution and Its Dying Vanguard

Blog

On May 27, President Obama became the first US president to visit Hiroshima since the United States dropped an atomic bomb on that city on August 6, 1945. The visit was the culmination of a long process of reconciliation between the United States and Japan since the end of World War II when Japan changed from wartime enemy to valuable ally almost overnight. In his speech delivered before the bombing’s survivors and their relatives, President Obama called for a “moral revolution” in the face of increased technological capacities to kill large numbers of people. “Technological progress without an equivalent progress in human institutions can doom us,” he warned (full text of the speech can be found here).

President Obama’s speech echoed the concern, felt by many American journalists and academics in Hiroshima’s immediate aftermath, that technological change was outpacing human moral capacity. Journalist John Hersey, sent to interview bombing survivors for The New Yorker, told stories of dread, shock, and suffering. Lieutenant Daniel McGovern captured film footage of the bomb’s impact, showing bombed-out buildings and the bleached skulls of the blast’s victims. Upon hearing about the bombing, painter Pablo Picasso is supposed to have remarked that “the genius of Einstein leads to Hiroshima,” linking the beauty of scientific discovery to the devastation of instantaneous mass murder.

Nothing captured concerns about the ethics of using an atomic bomb better than Alexander Leighton’s 1949 book Human Relations in a Changing World, however. Before arriving in Hiroshima in December 1945 to map the bomb’s psychological effects on Japanese civilians for the United States Strategic Bombing Survey, Leighton had spearheaded research on the morale of Japanese-Americans interned at the Colorado River War Relocation Center at Poston, Arizona. The aim of his research at Poston was to assess how the Japanese community responded to the stress of relocation and internment. Leighton hoped that an administration informed by social scientific knowledge – group psychology in particular – would be more efficient and humane. While Leighton did not oppose internment, he advocated administrative reform in the camps, emphasizing cooperation between administrators and internees on issues ranging from public health to community leadership in the hope of combating the dehumanization of internees. When he left in 1943 to take a job with the Office of War Information, Leighton was confident that social science had ameliorated conditions in the camps by improving relations between camp administrators and internees.

Leighton’s outlook changed markedly between his work at Poston and his trip to Hiroshima, showcasing a lost confidence in the ability of administrative reform to keep pace with the technology of dehumanization and killing. Whereas the poor conditions at Poston – sweltering heat, unsanitary and overcrowded facilities, and popular distrust of administrators – could be overcome by administrative reforms and improved communication, at Hiroshima there was little left to reform. Describing his first impression upon arriving at Hiroshima, Leighton invoked a “city dump with its smells of wet ashes, mold and things rotting, but one that runs from your feet out almost to the limits of vision.” The 4.4 square miles of downtown Hiroshima were completely destroyed. Leighton found a people shattered by the experience of vaporized lives and lost loved ones. An elderly schoolteacher told Leighton the bomb had transformed Hiroshima from “Paradise to Hades” in an instant. What haunted Leighton most was a feeling that Hiroshima was only the beginning. “This is a preview of things to come where I live,” he wrote. “These thousands dead will not be strangers. They will include friends, brother, sister, father, mother, wife and children. Some will die instantly and some will survive awhile to realize their slow death and to beckon for help where there will be no one to answer.”

Leighton came to believe Hiroshima was made possible by technological development outpacing moral and civilizational progress. He hoped that social scientific advances would make the use of weapons of mass destruction obsolete by easing international tensions. Work in sociology and anthropology had important roles to play as well, highlighting the commonalities unifying the human species. Furthermore, the social sciences occupied a significant place in tying the impersonal work of the hard sciences to the moral world of human beings. Leighton believed social scientific interventions into the natural sciences were necessary for moral guidance: “Moral values when pertinent dominate scientific values at three contiguous points: the selection of the problem to be investigated, the limits of the human and other materials that may be used, and the determination of what shall be done with the results.” Social scientists, with their specialty in human values and experience, would prevent scientists from privileging scientific theories and results over ethical concerns.

Leighton made numerous recommendations for how to disseminate social scientific knowledge ranging from expanded university fellowships to public education initiatives. Explaining the values and experiences unifying humanity was, for Leighton and others who experienced Hiroshima’s aftermath, an obligation shared across American society from policymakers in Washington to families in small towns.

Leighton’s suggestions make for uneasy reading alongside the continued national defunding of the social sciences during the Obama administration. The Obama administration has vocally supported the STEM fields but has been lukewarm (at best) about promoting the social sciences and humanities. In April 2015 the Republican-led House of Representatives Committee on Science, Space and Technology proposed a 45% reduction in federal funding for the social sciences (a useful summary can be found here), even while increasing the overall budget for the National Science Foundation, “adding more than one hundred million dollars each to the offices for biology, computing, engineering, math, and physical sciences.” National cuts reflect declining university enrollments in the social sciences. The University of Washington, for example, reported enrollment declines in the social sciences ranging from four to forty-five percent depending on the department and responded by cutting twenty-five teaching assistant positions. The 2015 panic over national cuts confirmed fears that waning American economic competitiveness had made separating the “useful” natural sciences from the supposedly superfluous social sciences a priority for policymakers and universities alike.

Alexander H. Leighton in Poston, AZ during World War II.

President Obama’s visit comes at a crucial moment as America’s East Asian allies are challenged in the South and East China Seas by an expansionist China. His speech was both a reaffirmation of his commitment to Japan as a US ally and a warning to China about the dangers of expansionism. The President’s speech also underlined the perils of dehumanizing language for American audiences. Donald Trump has risen to the Republican Presidential nomination on hateful rhetoric meant to demonize racial, gender, and cultural “others” as inferior and dangerous.

The moral revolution Obama sees as the antidote to aggressive expansionism abroad and xenophobic nationalism at home begins by reaffirming the human obligations of global citizenship. Yet it is difficult to imagine constructing a civically responsible American populace while systematically defunding its social scientific and humanistic vanguard. Moral revolutions are not spontaneous. They begin with an understanding of the ethical problems currently facing humankind and of how we confront those problems together as part of a single global community. The social sciences and humanities have an important role to play in demystifying other cultures and teaching Americans how to become contributing global citizens.