This is the third year in which I have attempted to put together data on development journals that is not otherwise publicly available or easy to access (see 2017, 2018). Thanks again to all the journal editors and editorial staff who graciously shared statistics with me.
1. Is this a good quality, high visibility journal to publish my work?
The first point to note here is that if the mean impact factors are only 1 to 3, and we expect citations to have a long right tail, this would suggest that lots of articles published in these journals are cited only once or not at all! Luckily things are not that bleak – it is just that impact factors only capture citations in published works, which, given the long time lags for publication in economics, makes them (to my mind) almost completely meaningless. I therefore took the 2017 issues of each journal and looked up (in March 2019) the Google Scholar citations of each paper published. Figure 1 gives a boxplot of the data, sorted by median citation rate, with the development papers published in the AER and AEJ Applied in 2017 as benchmarks (I took a random sample of half the issues of World Development (WD) and the Journal of Development Studies (JDS), given how many articles they publish).
Figure 1: Boxplot of citations as of March 2019 of articles published in 2017
This figure presents a much more optimistic view of research getting cited. The median AER development paper published in 2017 has accumulated 66 cites, and the median development paper in AEJ Applied has accumulated 34 cites. WBRO publishes a small number of review-type pieces, which get more cites on average than other development papers. The median citations are then very similar, in the 9 to 11 range, for a group of four development journals (JDE, WD, WBER, and EDCC) – suggesting a rough exchange rate, in citation terms, of one top-5 paper to two AEJ Applied papers, or to six top field journal papers. JDS and Economia have medians of 5 to 6 cites, and the remaining journals have medians of 2 to 3 cites. Some key points to note are:
- Zero citations is rare: only 25 of the 517 papers considered in Figure 1 have 0 cites (<5%). Zero cites are most common for papers in Development Engineering, the IZA Journal of Development and Migration, and Economia, where 15% or more of papers published in 2017 currently have no citations. And citations aren't the only goal of research, of course – papers can still be helpful for policy, teaching, or research practice even if they are not cited.
- There is massive heterogeneity, with the top tail of papers in many development journals having more citations than papers in top general journals: don't judge the paper by the journal.
- As noted last year, these comparisons are complicated further by differences in the ages of papers by the time they finally get published at different journals: this depends on how many other journals the authors tried first, how efficiently journals process papers, how long authors take to make revisions, how many rounds of revisions are required, and the backlog between acceptance and publication.
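The ranking in Figure 1 (journals sorted by median Google Scholar cites per paper) can be sketched as follows. The citation counts below are made-up illustrative numbers, not the actual 2017 data:

```python
from statistics import median

# Hypothetical per-paper Google Scholar citation counts by journal
# (illustrative numbers only, chosen to echo the medians discussed above)
cites = {
    "AER (dev papers)": [12, 40, 66, 90, 210],
    "JDE": [0, 4, 9, 15, 55],
    "JDS": [1, 3, 5, 8, 20],
}

# Sort journals by median citations, highest first, as in Figure 1
ranked = sorted(cites, key=lambda j: median(cites[j]), reverse=True)
for j in ranked:
    print(j, median(cites[j]))
```

Using the median rather than the mean is what makes the comparison robust to the long right tail of highly cited papers.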
Card and DellaVigna document that top-5 journals have seen tremendous growth in the number of submissions over time. While not all development journals were able to provide me with long time series on submissions, Figure 2 shows more than a doubling of submissions over the last decade: WBER went from 179 submissions in 2008 to 493 in 2018; the Journal of African Economies from 210 in 2008 to 420 in 2018; and the Journal of Development Studies from 515 in 2008 (up from 289 in 2004) to 1,255 submissions in 2018. Even over shorter periods the growth is striking: World Development received over 1,000 more submissions in 2018 than the 1,720 it received in 2014.
Figure 2: Trends in Number of Submissions to Development Journals over Time
(note: EDCC, WBER and J African Econ left axis, JDS and WD right axis).
Table 2 then shows the number of submissions, the number of papers published each year (excluding online supplements and proceedings), and the acceptance rate. Calculating the acceptance rate is complicated by articles being submitted in one year and accepted in another, so I use what each journal office reports to me. You could also just take the number published over the average of the number of submissions in the last two years and get similar figures. My three takeaways from Table 2 are:
- Three journals have expanded the number of papers they publish: WBER, which moved from 21 papers in 2016 (and 2015) to 35-36 papers in each of the last two years; World Development, which has dramatically increased the number of papers from 183 in 2016 to 335 in 2018; and the JDE, which published 112 papers in 2018 compared to 80 or fewer in each of the previous two years. These expansions have helped keep acceptance rates stable (or even let them rise) at these journals over the last three years as submissions have grown.
- Acceptance rates of 5 to 7 percent at some of the development journals are similar to the acceptance rates at the AER and Econometrica (see Card and DellaVigna). Given Figure 2, and that most journals have not increased their space much, acceptance rates in many cases will be roughly half of what they were a decade ago.
- The arrival of new journals like the Journal of Development Effectiveness, the IZA Journal of Development and Migration, and Development Engineering has opened up more opportunities for publishing good research that doesn't face such high rejection rates.
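The back-of-the-envelope acceptance rate mentioned above (papers published in a year over the average of the last two years' submissions) can be sketched as follows; the submission figure for the earlier year is hypothetical, since only the 2018 numbers appear above:

```python
def acceptance_rate(published, subs_prev_year, subs_this_year):
    """Approximate acceptance rate: papers published in year t divided by
    the average of submissions in years t-1 and t (to allow for the lag
    between submission and acceptance)."""
    return published / ((subs_prev_year + subs_this_year) / 2)

# WBER-style illustration: 36 papers published in 2018, 493 submissions
# in 2018, and a hypothetical 450 submissions in 2017
rate = acceptance_rate(36, 450, 493)
print(f"{100 * rate:.1f}%")  # roughly 7-8 percent
```

Averaging the last two years of submissions is a crude correction for decision lags; journal offices' own reported rates handle this more carefully.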
3. How long does the review process take?
A new interactive website provides desk rejection rates and expected decision times for a number of general journals in economics. Table 3 provides the information I have been able to gather for development journals. The first column shows the desk rejection rate, which is above 50 percent at most journals, and as high as 77 percent. I then use the acceptance rates from Table 2 together with these desk rejection rates to estimate the chance of acceptance conditional on the paper going to referees. These rates are a bit more encouraging than the overall rates above – and should be some guidance for your over-zealous referee 2s who want to reject everything – if editors are doing a good job of desk rejecting, then at least one-third of the papers sent to referees should be good candidates for acceptance (eventually, after the authors have written the paper the way you would have done, cited several of your papers, and exerted enough blood, sweat and tears to have earned it, of course...).
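The conditional acceptance calculation is simple: since every accepted paper was refereed, the chance of acceptance given refereeing is the overall acceptance rate divided by the share of papers not desk rejected. A minimal sketch, with illustrative rates taken from the ranges above:

```python
def p_accept_given_refereed(accept_rate, desk_reject_rate):
    # Every accepted paper went to referees, so
    # P(accept | refereed) = P(accept) / P(refereed)
    return accept_rate / (1.0 - desk_reject_rate)

# e.g. a 7% overall acceptance rate with a 77% desk rejection rate
p = p_accept_given_refereed(0.07, 0.77)
print(f"{100 * p:.0f}%")  # roughly 30 percent
```

So even at a journal with single-digit overall acceptance, a paper that clears the desk has a far better than single-digit chance.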
The remaining columns report the time taken for decisions. I asked for decision times both unconditional on going to referees (which include lots of quick desk rejects) and conditional on going to referees. However, not all journals are able to split their data this way, and I caution that it is sometimes unclear whether the last two columns are conditional on being refereed or also include all the desk rejects. The Journal of African Economies gave me both numbers – and you can see it makes a huge difference for the 3-month horizon (only 23% of papers that went to referees were decided in this time frame, versus 82% of all submissions).
My takeaway from this is that, conditional on going to referees, 60-70 days seems about average. Add to this the time taken for the editors to review a submission and decide whether to send it to referees, and for referees to agree to the assignments, and you are at about 3 months. The big concern for authors is probably the chance of ending up in the right tail – so journals where there is a 10% or greater chance of a paper taking more than 6 months to referee will hopefully be looking to improve here. Of course this also depends heavily on referees doing their part – so don't procrastinate too much on those reports!
Comparing these results to last year, the most improved award goes to Economia-Lacea, which greatly increased its desk rejection rate (from 28% to 49%) and lowered its mean decision time conditional on going to referees from 148 days to 82 days. Two other strong improvers were the IZA Journal of Development and Migration (which lowered the mean time conditional on going to referees by 20 days) and EDCC (which lowered it by 34 days). Even more pleasing, no journals got slower.
4. Open Science
I thought for a last topic this year, I would look at what journals are doing in terms of making research openly accessible, and in terms of data transparency.
Open access: given that research in economics comes out as publicly accessible working papers well before the final journal versions are published, and that many authors put the final versions on their webpages, I think this is less of an issue in economics than in other fields (Berk's caveats about working papers aside). Nevertheless, gated articles are still a concern for many, and a number of donors are pushing researchers to publish open access. All of the development journals listed above offer this as an option except Economia; at the IZA Journal of Development and Migration, paying the open access fee is mandatory if your paper is accepted. The fee for open access publication is typically $2,500 to $3,000, but a number of the journals have cheaper rates for authors from developing countries. So if you are putting together a grant proposal, this might be one additional item to include.
Finally, an area that is rapidly evolving is whether journals require authors to post data and replication materials for empirical work. Many journals have relatively new policies in this area, and they range from requiring online posting of the data, to encouraging authors to post materials (e.g. EDCC's policy), to not currently requiring data to be posted. One trend at the AEA journals has been for more and more empirical papers to receive exceptions to data availability policies because of the use of confidential administrative data, such as tax data. None of the development journals had statistics on exemptions to their data-sharing policies, but note that these policies have been in place for relatively short periods and the journals have not seen many requests for exceptions yet.