The articles in this symposium raise a host of issues about the role that desk rejects should play in our discipline and how journal editors should balance often-competing considerations. How can we strike a balance between providing meaningful feedback to authors and managing a taxed reviewer pool? How can we focus on publishing the best work in the field while simultaneously helping scholars develop their research so it can eventually meet that bar? How can we ensure that well-resourced individuals are not the only authors for whom top-tier journal gates are open?
Although we cannot answer these questions definitively here, we can use our experience as coeditors in chief of the American Journal of Political Science (AJPS) to summarize how we attempt to balance these competing considerations. From our perspective, the information we provide brings transparency to our desk-reject process, begins to debunk some myths about desk rejects (at least at AJPS), and offers useful data for the discipline as it grapples with the issue. We hope this article also allows authors to understand better the desk-reject procedures we use at AJPS.
THE DESK-REJECT PROCESS
At AJPS, we do not predetermine the number or percentage of submissions to desk reject. Rather, we approach each manuscript with an eye toward determining whether it is suitable for review at a top-tier, general-audience journal. This assessment led us to desk reject 377 of the 1,408 submissions we received in our first year as editors (figure 1). Because this number is not driven by a quota or target, it should quell concerns among those who believe that premier journals in the discipline desk reject as many manuscripts as they send out for review.
How do we assess whether a manuscript is suitable for review? Unlike the process that Professor Gibson fears, in which "one pair of eyes" determines a manuscript's fate, we use a systematic process that includes several people—at least one of whom is a subfield expert when a substantive decision is required.
The first step in the submission process is a technical check, during which a highly trained team of doctoral students reviews submissions to ensure that they meet formatting and anonymity guidelines. Before considering formatting and anonymity, however, the editorial assistants flag manuscripts that do not appear to meet the baseline criteria for a manuscript at AJPS: journalistic or opinion pieces, review essays, or those that contain no data or theoretical argument. As coeditors in chief, we review the flagged manuscripts and determine whether each, in fact, is inappropriate for AJPS. Because this decision precedes the formatting and anonymity aspects of the technical check, we can issue it quickly (i.e., typically within three to five days). Of the 377 manuscripts we desk rejected, more than half (194) fell into this category (figure 2). The majority of desk rejections we issue, therefore, involve manuscripts that simply are not political science as we commonly understand it.
Manuscripts that make it through the complete technical check then are assigned to subfield experts: the coeditors for American politics and the relevant associate editors for comparative politics, international relations, political theory, and formal theory and methods. At this stage of the process, the subfield expert considers whether the manuscript makes a theoretical and/or empirical contribution sufficient to have a chance of receiving positive external reviews. In the context of a journal that accepted less than 6% of the manuscripts sent out for review during the first year of our term, it does not make sense for us to send out for review a manuscript that clearly does not meet the substantive criteria for potential success.
During the first year of our term, 183 manuscripts fell into this category. Some manuscripts were essentially empirical exercises with no animating theoretical foundation or clear research question. Others offered mismatched theory and data. Some manuscripts submitted in 2020 took on interesting and relevant political questions but relied on dated evidence that did not adequately test the authors' hypotheses. Many submissions failed to break new ground. Still others were duplicate submissions; our policy precludes resubmitting a revised version of a manuscript that previously was rejected, even by a former editorial team.
Because we recently instituted a more streamlined technical-check process that eliminates many of the more time-consuming formatting requirements, we also can quickly transmit substantive desk-reject decisions to authors. In the overwhelming majority of cases, authors will receive the desk-reject decision within a week of submitting their manuscript (and often more quickly than that).
RESPONDING TO TWO MAJOR CRITICISMS OF DESK REJECTS
Critics of desk rejects often point out that the practice—regardless of how judiciously it is used—carries with it two undesirable consequences. First, it leaves authors without the type of significant feedback that might strengthen their work. Second, it can undermine equity by disproportionately disadvantaging scholars who are not well resourced and/or are from traditionally underrepresented groups. These criticisms often are made without the benefit of available data; therefore, we address them using data from our first year at AJPS.
Providing Feedback versus Taxing the Reviewer Pool
At AJPS, submissions have increased each year for the past 20 years, but the relatively recent escalation has been dramatic. Ten years ago, AJPS received 760 manuscripts per year. In our first year as editors, we received 1,408—an 85% increase. This surge undoubtedly results from myriad factors: authors who “aim high” in submitting manuscripts; departments that require junior scholars to publish in or at least submit to one of the top three journals; widespread submission of papers to top journals simply to receive high-quality review; a requirement by faculty in some graduate programs that every seminar student must submit a paper to a journal; and, in the case of AJPS, a reputation for a short turnaround time (i.e., only 44 days in our first year).
Regardless of the cause, this rapid growth in submissions—along with the launch of several new journals since 2010—means that "reviewer fatigue" is a real and growing problem. At AJPS, we are relatively fortunate that our reviewer refusal rate does not appear to be as high as at many other journals. But when people decline our invitations, the primary reason is that they already are overcommitted to other reviews.
If we do not see a manuscript as substantively strong enough to have even a small chance of receiving positive reviews, a desk rejection allows us to "save" reviewers for a manuscript that has a better chance of success. It also allows authors to submit the manuscript immediately to a more appropriate journal, saving them time on the publication clock. Our desk-reject policy, in other words, benefits authors as much as it does reviewers.
We recognize that even our careful approach does not eliminate the concern that we have denied desk-rejected authors the reviews that could help them prepare their manuscripts for submission to another journal. Although we try to provide at least some helpful suggestions in our desk-reject letters about how to revise the paper for a more specialized outlet, our feedback is not as thorough as a complete set of reviews would be. This is why we need the broader conversation about the role of journals in the discipline. Professor Gibson's desire for a practice in which journals send out for review all manuscripts that they receive means that the discipline must be ready with a solution to the problem of overburdened reviewers.
Issues of Equity
Critics of desk rejects suggest that the practice disproportionately disadvantages certain scholars—namely, those outside of the R1 universe, junior scholars, female scholars, and scholars of color. The logic is that members of these groups are less likely to be established in networks in which they can gain feedback and critiques of their work. Although some might contend that this argument has a slightly patronizing tone toward these scholars' professional networks, we can examine whether the evidence supports this assumption.
We hand-coded all desk-rejected manuscripts for a series of equity-related characteristics about the authors. More specifically, we coded for their sex, whether there was a Black author, the tenured/untenured status of authors, whether the authors were at an R1 or non-R1 institution, and whether they were affiliated with an institution outside of the United States. We also coded the subfield and whether the submission was solo- or coauthored. These data, of course, can speak only to the representation of authors within the pool of desk rejects. They do not reveal anything about whether women, untenured faculty members, or scholars from outside of the United States are overrepresented in the desk-rejected pool as a proportion of their presence in the pool of submissions. In our year as coeditors, we have learned the limitations of the editorial management systems that most journals use and how little information we gather about the people who submit manuscripts.
Although we can describe the characteristics of only those authors whose manuscripts we desk rejected, we find no clear evidence that authors who might be considered by some to be from a disadvantaged group were any more likely than others to be desk rejected. Solo-authored papers constituted the majority of desk rejects (54%) but not by a substantial margin. Half of the papers we desk rejected had at least one author from an R1 institution. More than four of every 10 desk-rejected manuscripts (43%) had at least one tenured author. Moreover, only 3% of desk rejects included a Black author. Sex is the one variable for which we have information on the entire pool of submissions: women comprise about one third of submitting authors and one third of desk-rejected authors.
Although we must be cautious when interpreting these data and careful not to make direct comparisons to the overall pool of submitted manuscripts, these numbers do not suggest that any one category of scholars is bearing the brunt of the desk-reject process.
SUGGESTIONS FOR DISCIPLINARY CHANGE
The practice of desk rejecting journal submissions raises a host of important issues that relate to one of the central dictates of our discipline: the need to publish in peer-reviewed outlets. The pressure scholars face to publish their work contributes to the escalating submission rates at top journals, the proliferation of journals, and reviewer fatigue. The "elephant in the room" is the degree to which our disciplinary norms have socialized scholars to use the journal-submission process to receive feedback on their work.
Many scholars "aim high" in their submissions. They know that they have only a slight chance of having a manuscript accepted at a top journal, but their likely consolation is three helpful reviews. This misuse of the journal-submission process must change. Scholars absolutely need feedback on their academic work before they submit it for review at the journal where it ultimately will land, but other journals and their reviewers do not have the bandwidth to serve that function along the way. We urge authors to think more carefully about the contributions of any particular manuscript and to balance aiming high with aiming realistically high given the project at hand.
Several venues are appropriate for fostering the kind of professional development that involves offering thoughtful substantive feedback and identifying suitable outlets for manuscripts. Dissertation directors, seminar instructors, and departmental and subfield colleagues already do a substantial amount of this work. Perhaps our professional associations—whether the American Political Science Association, the Midwest Political Science Association, or the International Studies Association—and subfield organizations should contribute as well. Presenting work at academic conferences often is a first step in developing a manuscript. Our professional associations could ensure that manuscripts presented at conferences receive substantive feedback from assigned discussants and reviewers, making the conference-participation stage more valuable to authors. Associations also could investigate the possibility of establishing pools of reviewers willing to read and provide feedback on manuscripts in the pre-publication stage.
Furthermore, departments and graduate programs could help. Rather than encourage (or require) junior faculty and graduate students to use journals as a way to receive feedback, they could consider developing regional or online workshops as an alternative. Moreover, as scholars and mentors, we all can play a part. When we offer comments on a paper or provide advice about where to send a manuscript, we should consider the substance of the piece, the extent to which it would appeal to a general versus more specialized audience, and the quality of the research.
These are only a few plausible suggestions, raised in response to legitimate issues that should spark a broader conversation in the discipline about what the journal peer-review process is designed to accomplish and how much professional development should be part of a journal’s mission. But this conversation should be driven by data and a more fully informed understanding of how different journals handle desk rejects. That is what this symposium sets out to do, and we are pleased to be able to contribute.