The problem with preparing grad students for alternative, non-academic careers


Anthony Grafton and Jim Grossman recently published an editorial in the AHA magazine, Perspectives on History, that argues that graduate schools must do a better job preparing students for work outside of academe.

As Grafton and Grossman note, many - perhaps most - graduates will not find tenure-track jobs. “Why not tell our students, from the beginning, that a PhD in history opens a broad range of doors? As historians, let's begin with some facts. Holders of doctorates in history occupy, or have recently occupied, a dizzying array of positions outside the academy: historical adviser to the Chief of Staff of the Army, Speaker of the House of Representatives, the Chief of Staff to the Speaker of the House of Representatives, museum curators, archivists, historians in national parks, investment bankers, international business consultants, high school teachers, community college teachers, foundation officers, editors, journalists, policy analysts at think tanks (yes, an entry-level position). The skills that these historians mastered as graduate students—doing research; conceptualizing relationships between structure, agency, and culture; combining research and analysis to present arguments with clarity and economy; knowing how to plan and carry out long-term projects—remain vital in their daily work. In many organizations outside the academy, a doctorate is a vital asset for those who want to rise above the entry level.”

While acknowledging the already severe time limitations on students, they recommend broadening the curriculum to include courses in statistics, economics, public policy, and perhaps other disciplines (including acting!).

Their article misses some key features of the economy that make their analysis, worrisome as it already is, too optimistic. What’s worse, students of premodern Europe are in an especially unfortunate position.

Last spring, Stanford hosted the Bibliotech conference. The aim of the gathering was to bring Silicon Valley bigwigs and Stanford professors together to discuss the role of advanced humanities training in the tech industry. While almost every discussion was illuminating, what emerged was not encouraging. We learned that many tech moguls identified the humanities with the “creative right brain,” as though all of us were artists, and that others did not even know what the humanities were (one eminent Googler consistently spoke of social psychology and political science as humanistic disciplines).

Those speakers who did know what graduate programs entailed were unanimous: advanced study in the humanities was nice, but did not confer a distinct advantage over people who had merely majored in the humanities in college. In other words, 30-year-old PhDs would be competing with 22-year-old bachelors for the same positions. If anything, the decision to get a PhD aroused suspicion (as one venture capitalist put it) that we had merely been trying to escape the real world by getting a doctorate.

I am amazed that scholars continue to believe that advanced skills in the humanities are an asset in the working world. Policymakers and business leaders have declared, time and again, that they want brief, clear, and actionable analysis from their employees. Professional scholars are trained to write long, complicated, and nuanced analysis: almost precisely the opposite of a memo. I do not doubt that some of us could learn to produce what the working world demands, but our training in no wise prepares us for that activity. 

Then there is the problem of passion. The phrase “corporate slavery” suggests total servitude in a stultifying job in exchange for being very well paid. In reality, people compete as aggressively for jobs in the corporate world as we do for academic positions, and for the same reason: passion. Employers expect job applicants to have a demonstrated interest in their sector or field. While in some cases doctoral work may prove passionate interest in a field, more often than not, our specialities have no clear analog in the outside world (by “clear” I mean clear to employers).

The situation for memsians* is particularly dire. As scholars of the premodern world, we do not have an obvious interest in the urgent political or economic issues of our time. On the contrary, historians at least are methodologically committed to purging every trace of presentism from their work. Even if my work on seventeenth-century bureaucracy sheds some light on the modern world, I could not pass myself off as an expert on contemporary bureaucracy.

Then there is the foreign language problem. These are the languages memsians probably know: Latin, French, Italian, German, and Spanish. None of these tongues is considered rare, so none commands much value in the marketplace. By contrast, scholars of the Ottoman Empire in the age of Suleiman the Magnificent know Persian, Turkish, and Arabic. There are plenty of jobs outside of academe for Ottoman scholars, even if their dissertations dealt with love poetry or mystical philosophy.

If memsians face an especially difficult time of things, however, a separate set of calculations affects all of us equally. That is opportunity cost.

Most graduate students who have passed their oral exams, conducted their research, and started writing their theses face an insistent question: are they good enough to make it? Like Weber’s Protestant artisans, they can answer that question only by working hard (VERY hard) and leading a virtuous life. Preparing oneself for a job outside of academe is incompatible with that goal.

Changing careers is always difficult and time consuming, and week after week, the marginal benefit of doing more academic work seems high. Let’s say that grad students have six hours of “free” time per week. They could spend that time either preparing for a non-academic job or reading another book related to their field. That’s fifty-two books a year: it might make a substantial difference to the quality of their dissertations and give them the edge that helps them land tenure-track jobs.

Spending substantial time preparing for a non-academic job makes it that much more likely that one will not land an academic job. 


~Corey Tazzara

 Editor's Note:

memsians*: specialists in the field of MEMS (Medieval and Early Modern Studies)


Author Profile: 

Thanks for this illuminating, first-hand response to the current dilemma of graduate programs in the Humanities: stick to the narrow definition of what we train graduates for (in earnest, reproduce their professors' paths for the lucky few, and for the rest at the very best find a job in other parts of academia such as administrative positions or libraries, in the publishing industry, long-form writing, or maybe consulting) and risk deluding students into an all-or-nothing mindset about the sacrosanct tenure-track holy grail, or address the realities of the job market and realize that training graduate students for other jobs is an immense challenge.

I was struck that you confessed that NOT relating the past to the present is, more or less, a sign of good scholarship in history. It does not have to be that way, and I would argue it should not be, at least if we want to share why the past matters with a larger audience (those CEOs in the tech industry who would benefit from learning about information technology in the 17th century, for instance), and if we want to be honest about why we are passionate about it. We need only look at the long history of academic trends (in topics and methodologies) to see that what's "hot" at one time cannot be separated from the cultural, economic, and political context of the scholars themselves. Why are Persian and Arabic studies in fashion right now? Because they matter to us, who live in a post-9/11, Arab Spring world (with, in Europe, large communities of third- and fourth-generation immigrants embracing Islam).

So why not change the culture of graduate studies so that what we do connects to the present in more explicit, rather than buried, ways?

In addition, I would not understate the value to an employer of the work ethic and intellectual profile that a PhD (in whatever field) implies: hard work, autonomy, and perseverance at mastering huge amounts of data and "cracking" the meaning of at first illegible strings of events or facts. Who would not want an employee like that?

And we should be reminded that not all nuclear physicists end up working as nuclear physicists either: in fact, one of my good friends, who was at one of the top three labs in the world in that field, is now a financial analyst on Wall Street (what can I say in my defense: I met him in high school). So all graduate programs face the same problem of placement in and outside the field. We are just not as good at marketing our programs for the real, rare intellectual qualities they nurture, and more reluctant to do so. (Any CEO will also tell you that business school graduates have learnt close to nothing that is directly applicable to their first job in a company.)


Did I mention that our hope is that a revamped, open-minded CMEMS could change the perception of graduate programs in the Humanities both in the outside world and in our own professional culture?

A problem in the Grafton/Grossman piece is that they focus on changing the "vocabulary"--important enough, to be sure--but don't want to change the practices. That is insufficient. There is no reason to assume that the form of doctoral programs, as they now exist, is the only form they could ever have. In fact, they have changed rapidly: according to the Survey of Earned Doctorates, the time to degree in the literature fields has grown from around 5 years in the 1960s to more than 9 today. History degrees take even longer. (By way of comparison, the M.D. is a four-year degree.) Keeping graduate students in an extended limbo does not benefit most of them: only about half who start, finish; and only about half who finish end up in faculty positions in academia. Many of them, of course, leave with considerable debt (one of the big issues in the current protest wave).


It's time that academic programs take a close look at the trajectory of doctoral education and ask what is really necessary and what is just habit. We should also be equipping students with skills that go beyond their research needs--skills that could be transferred to non-academic careers or that, alternatively, could prepare our doctoral students for leadership careers within the academy. If we humanists don't do that, we should not complain when presidents and provosts come from other sectors.

This problem reaches far beyond the humanities. In many universities, graduate programs in the social sciences and the physical sciences as well as the humanities really exist (or exist at a much larger scale) to provide low-cost teaching and laboratory assistants.

And this problem reaches far beyond graduate education. The move to equip undergraduates with "skills" necessary for outside employment was the central conceit of last year's notorious Browne Report (see Stefan Collini's account here), a terrifying re-imagining of the UK's higher education system as the handmaiden of economic growth.

The truth is: very few jobs require a Ph.D., and in few others are applicants significantly advantaged by possessing one rather than a less time-intensive postgraduate degree.

All this in the face of the worst youth labor market since record-keeping began in 1948.

To me, all this suggests that advocating for humanities education by discussing the value such an education offers employers is the wrong approach.

Finally, re: Cécile's advocacy of showing links between present and past: I understand the sentiment, but there are real reasons why historians are suspicious of such projects. It's not just that we're stuffy--I mean, we are stuffy, but it's not just that. Actively shaping narratives to serve present purposes is a constant temptation. But that tendency has been responsible for the profession's greatest failures--mythic histories used to celebrate, legitimate and justify regimes of domination and oppression.

My suspicion is that it is no accident that popular history is generally filled with lies. Historical stories play important roles in people's conceptions of who they are and what part they play in the world. To the extent that historians are willing to distort their narratives to create romances, they can command popular attention. Most of us regard it as our intellectual responsibility to puncture these stories, not perpetuate them.

Author Profile: 

Thanks for the pointed, if dark, argument you make here, Noah. A tiny correction: I don't think I was suggesting that historians (and we are all historians when we study the past) should romanticize the past to make it alive to present audiences. Quite the contrary: myth-making is not the prerogative of the modern age, and tracing the processes and discourses in which our modern myths and political concepts originate clearly defines them as what they are, cultural and ideological constructs, while also pointing to similar kinds of social constructions, and their dubious grounds, in our present time. Connecting past and present is not finding mythological origins of the latter in the former: rather, it means bringing awareness to the historicity of our times, and meeting the exacting demands of speaking accurately and critically of ourselves.


First I want to thank Corey for his forthright appraisal of the situation, as well as the others who have responded. I truly believe in the value of what we do, but also suspect, as Corey does, that this may not be merely a branding/awareness issue.


Perhaps it is overly pessimistic, but when I hear scholars in the humanities talk about how the business or corporate world “needs” the skills of Ph.Ds, I can’t help but think of an old Calvin & Hobbes comic strip. Calvin is selling “a swift kick in the butt” for $1, and is mystified that his business is terrible because, “Everybody I know needs what I am selling.”


In my (limited) experience, it’s almost certainly true, though I hope to be shown otherwise, that people in the business world generally don’t care about or don’t understand humanities PhDs. Panelists at the Bibliotech conference (graduates of our own Stanford University, I hasten to add) who could not even identify what the humanities are provide an excellent case in point.


Even when they do understand, though, the positions they give to humanities PhDs seem to be one-of-a-kind, and hardly an outlet for large numbers of graduate students. Talks at the Bibliotech conference, given by humanities PhDs working in Silicon Valley, left me with the impression that they were court/corporate philosophers – every court needs one, but only one. Additionally, although they would be the first to say that an outside or critical perspective is useful for a business, I suspect what they mean is that they want an insider who can also think like an outsider, not just someone with a critical perspective.


I don’t have the sense that professors and academics (here I include myself) have a great deal of experience in the world of business. For exactly that reason, I suspect that they are not credible advisors (for graduate students or for CEOs) about what that world needs, nor credible critics of what companies should or should not be doing. They may even be suspected, rightly in some cases, of being disdainful of or hostile to the culture of business. I am sure that to people in the working world, much of the humanities-produced discourse about what they “need” sounds pretty flat: the product of a group of people, trained to assemble hammers, for whom every problem looks like a nail.


Finally, regarding the issue of presentism: pointing out that “modern myths and political concepts” are “cultural and ideological constructs,” and showing the “dubious grounds” of these beliefs, is an enterprise I believe to be essential to a functioning liberal society, but beside the point of this discussion. It is a massive leap from identifying constructs to providing some kind of useful expertise or service outside the academy. That’s not to say that we shouldn’t deflate them – people should know that blood libel is a vicious lie in the name of which appalling violence and persecution have been perpetrated. Still, as Calvin & Hobbes suggests, it’s an open question whether puncturing people’s illusions makes for a good business model.

Thank you for opening this dialogue on the value of preparing doctoral students in the humanities for non-academic careers. I'd like to respond to your post as both a scholar of early modern French literature as well as the organizer of the BiblioTech Conference and Director of the new BiblioTech Program at Stanford.


I think it is important to remember that the BiblioTech Conference was the first of its kind and served to begin a conversation between humanist professionals and industry leaders. I agree that the discussion was illuminating and do believe that it was encouraging. As your comments reveal, one of the critical things that we learned from the conference is that we need to do a much better job of articulating who we are, what we do, and what value we bring to the world outside of the academy. We need to do this in very concrete terms and repeat ourselves until we are understood. It is not that we do not possess important and unique skills that the world of industry needs, it is that we have not articulated them clearly enough to those outside of the university.


In my ongoing conversations with industry leaders, I am struck by the consistency with which they tell me that they like what we do and don't think we should pass ourselves off as someone else. We need to champion our difference, for it is precisely this difference that makes us attractive to industry, and it is this difference that synergizes best with our colleagues in the social sciences, sciences, engineering, medicine, and business. It is the concept of difference that defines Apple's success. We need to define our difference and market it.


As a scholar of the early modern period, I believe there is great value in what we do precisely because it is different and difficult, and requires stamina, discipline, and intense intellectual focus to master. Our prize commodity is not to be found in the product of our study, but in the process.


We need to articulate the difference of our study of the early modern. As information begins to replace knowledge, scholars of the early modern period are even more valuable. Why? Because we have honed our skills of research, sustained inquiry, discipline, and engagement with a period at once different from -- yet having echoes in -- our own, and we can see trends over the long haul.


I wholeheartedly agree with Grafton and Grossman and applaud their insightful and progressive article. I especially agree that a Ph.D. in a humanist field opens a broad range of doors. I do think that those doors may have been closed for some time now for myriad reasons, but I know that now is the time to open them. And since we know best what it is that we bring to the table, it is up to us to use our skill of persuasive communication to convince the world of our value.


Please don't be discouraged by individual public comments regarding the perceived lack of value of a Ph.D. in a humanities discipline. Industry leaders love "smart people" and there is no need to prove your intelligence beyond those three letters after your name.


My goal is for a Ph.D. in a humanities discipline to be regarded as a professional degree that justifies entrance into mid-level positions across employment sectors. As we work to reach that goal, Ph.D.s in humanities disciplines may need to accept entry-level positions. The good news is that Ph.D.s tend to advance in their careers much more quickly than those possessing a bachelor's or master's degree.


As for passion, again, it is not passion for our subject that matters most, it is passion, period. If you can demonstrate your passion for the early modern period, you can demonstrate your passion for the sector in which you'd like to work. Mind you, this is just as true in academe. Professors who transfer their passion to their students are better professors. Period.


I am quite optimistic that this is an exciting moment in history to be a Ph.D. in a humanities discipline. This is the first time in history that we are able to openly discuss the merits of our work in non-academic employment. We need to articulate our value to the "outside" world -- though I would argue that the line between inside the academy and outside the academy is becoming more and more faint -- and do our best to prepare for the real possibility that we will need to find a job outside of the academy.


As for the foreign language "problem," I see no problem at all. Again, it is not the product; i.e., the language that we speak, but the process of mastering one or more languages and inevitably those cultures, as well, which is most important. It is up to us to articulate that to industry. If I had to choose a language to master today, I would choose French again. I love French, I'm passionate about French, and my mastery of it and French culture influences my work every day in the best possible way.


As for time, yes, a graduate student's schedule is full. It is also flexible. I must disagree with your statement that, "Preparing oneself for a job outside of academe is incompatible with that goal [of working very hard and leading a virtuous life]." While one could certainly put forward many possible goals of a humanities Ph.D., for the sake of discussion, I would say that it may very well help graduate students to finish their dissertation if they budget some time to work in other employment sectors. The different perspective they would bring to the dissertation could be very productive.


The BiblioTech Program has just gotten under way, and I encourage you to follow our progress over the year. We aim to articulate the value of humanities Ph.D.s to the world outside of the academy; establish internships for humanities Ph.D. students; hold a BiblioTech Symposium for humanities professionals and industry leaders to discuss in-depth the concrete ways in which humanities Ph.D.s can enter into compelling mid-level careers across a broad spectrum of industry; design curricula that can aid our humanities Ph.D.s in attaining non-academic jobs; and offer opportunities for networking with industry leaders. Our website will be up soon. In the meantime, I encourage anyone with an interest in BiblioTech to contact me.


Having a doctorate in history gives a person the ability to apply logical thinking beyond the average temperament. This person's view of the world is beyond most people's scope of imagination, or brain power, for that matter.


Granted, outside of academia, their career paths seem slim and outright, well - weird. But if such people are wise in marketing themselves, they would still have a wealth of opportunity.

The style of thinking that this degree confers would make a person valuable in many parts of the medical industry, such as software, medicine, or human resources as they apply to the masses. Stay with me: when it comes to the medical industry, you have to consider a mass of people and/or cultures as your target audience. The medical care field is one that is about to crumble in our society, so this is a delicate matter. A historian would have input on how to delicately approach sales in a field like this.

This, again, is applicable to a lot of other industries. However, when I graduate, I am most definitely going to be looking to remain in the academic realm. Things are much easier that way :).