The American Historical Association (AHA) held its annual meeting recently, which sparked the usual discussions on historian Twitter about a perennial problem in the field: the job market is dreadful. The odds of landing a tenure-track position grow worse by the year, and Covid has only exacerbated the issue. A couple of days ago, I saw a thread in which one professor argued that history PhD programs should no longer take six to seven years to complete, and another historian raised the possibility of graduate students co-authoring dissertations.
With respect to both of these scholars, I fail to see how making it easier to complete a PhD would remedy anyone’s job search woes. If there are too many people on the job market as is, what is the benefit of churning out more graduates at an even higher rate? At the same time, I understand the importance of a strong background in the humanities. Everyone is better off in a society where people understand the complexity of the past. The answer, then, is not to turn people away from graduate programs en masse, either. On a practical level, fewer graduate students means more work for professors and fewer course offerings for undergraduates. How can academia balance these competing concerns?
I believe the solution lies in creating a bottleneck between master’s programs and doctoral programs. As it stands, most universities usher master’s students into a doctoral program at the same school almost automatically; a rejection typically occurs only if the student’s work is shockingly subpar and it is clear the student is incapable of producing acceptable scholarship. This should change. Acceptance into a doctoral program should no longer be a foregone conclusion for most graduate students. Under my proposed system, here’s how graduate studies in history would play out:
- Schools would accept the same number of master’s students (or possibly an even higher number than they currently accept). These students would receive full funding and the typical stipend, working as TAs and research assistants, just as they do now.
- It would become far more common for students to change schools between the master’s and doctoral programs. Second-year master’s students would reflect on their experience at their current school and apply to several universities for their doctoral program, not just the one at which they are currently enrolled.
- Universities would have far fewer spots in their doctoral programs, and being accepted to one of these programs would become a serious challenge. A student’s scholarship would need to rise above the level of “decent enough” or “pretty good” in order to secure a spot.
- Students who are accepted into a doctoral program would be provided with an increased salary, on par with that of a first-year teacher in the same area. This would reduce the financial burden on young academics as they spend the next four or five years completing their doctoral studies. A smaller total number of doctoral students at the school would make this increased funding possible.
- The majority of students would not be accepted into a doctoral program. However, there would be options that allow these people to become qualified for academia-adjacent careers. What if schools gave history master’s students the opportunity to complete an additional, one-year master’s degree in topics such as library science or museum studies or secondary education? With three years of (fully funded) graduate schooling, these students would be able to find good jobs that put their history education to use while also improving their community.
On some level, I understand why universities would be hesitant to adopt this model. No one wants to be the one to tell a young person, “Sorry, but you’re almost certainly not going to get a tenure-track academic job. It’s time to move on.” Even for hardened academics completely lacking in sentimentality, losing graduate students translates to more time spent grading and leading discussion sections.
My model avoids the latter scenario by maintaining the same number of master’s students (or even increasing that number). As for the former, we need to take a hard look at the difference between kindness and avoidance. It might seem kind in the moment to allow someone to continue through a program despite having practically no chance of receiving their desired job. What is the final outcome, though? Is it truly kind to produce thousands of PhDs who have spent their young adulthood in pursuit of an impossible dream? Many of these people try to stay in the game by adjuncting, which pays very little and provides no benefits. Is it kind to ignore reality and set up young scholars for failure?
It is rare for an average or merely decent master’s student to become a stellar doctoral student. Why delay this revelation, then? While it might sting in the moment, I would rather learn that I have no future in academia after two years of work than after six or seven. Having several options for training in history-adjacent fields would also soften the blow.
Academics can dress up the issue however they like; the problem remains the same. The field of history has long had a crisis of overproduction. You don’t address a crisis of overproduction by developing convoluted ways to up your productivity. You address a crisis of overproduction by producing less. My model allows universities to produce fewer, but higher quality, PhDs without denying people an opportunity to study history at the graduate level.