Covid-19 refereeing duration and impact in major medical journals

Abstract Two partly conflicting academic pressures from the seriousness of the Covid-19 pandemic are the need for faster peer review of Covid-19 health-related research and greater scrutiny of its findings. This paper investigates whether decreases in peer review durations for Covid-19 articles were universal across 97 major medical journals, as well as Nature, Science, and Cell. The results suggest that on average, Covid-19 articles submitted during 2020 were reviewed 1.7–2.1 times faster than non-Covid-19 articles submitted during 2017–2020. Nevertheless, while the review speed of Covid-19 research was particularly fast during the first 5 months (1.9–3.4 times faster) of the pandemic (January–May 2020), this speed advantage was no longer evident for articles submitted in November–December 2020. Faster peer review was also associated with higher citation impact for Covid-19 articles in the same journals, suggesting it did not usually compromise the scholarly impact of important Covid-19 research. Overall, then, it seems that core medical and general journals responded quickly but carefully to the pandemic, although the situation returned closer to normal within a year.


INTRODUCTION
The daily international death tolls from the Covid-19 pandemic in 2020 and 2021 created an urgent need for health-related research into treatments, vaccines, and safety precautions, as well as fundamental science to investigate the cell biology of the virus. The social restrictions and economic disruption subsequently generated spin-off urgent requirements for many other types of academic research. This has, in turn, generated the need for faster research than normal to tackle urgent unexpected problems (Bramstedt, 2020;Kousha & Thelwall, 2020). For example, in January 2020, Wellcome called for fast, transparent, open access academic publishing to help humanity deal with the pandemic (Wellcome, 2020). Academia apparently responded quickly, with about 4% of published research being about the pandemic by mid-2020, although mostly not addressing its key health priorities (Odone, Galea et al., 2020).
The rapid production of Covid-19 research faced a publishing-speed bottleneck in the form of editorial peer review, with the separate stages of editorial review/rejection, reviewing, and author revisions each often taking over a month, depending on the journal and field (Huisman & Smits, 2017). The peer review system before Covid-19 was already considered slow and overburdened by many academics (Huisman & Smits, 2017; Nguyen, Haddaway et al., 2015). For instance, there is evidence that most of the preprints about the Ebola and Zika outbreaks later published in peer-reviewed journals were accessible online more than 100 days before their publication (Johansson, Reich et al., 2018). Peer review delays may be exacerbated by increased Covid-19 submission rates for some medical journals (Kondziolka, Couldwell, & Rutka, 2020; almost tripling at JAMA: Bauchner, Fontanarosa, & Golub, 2020). Many academics have bypassed this stage by publishing medRxiv, bioRxiv, or arXiv preprints (e.g., Aviv-Reuven & Rosenfeld, 2021; Kataoka, Oide et al., 2020; Vlasschaert, Topf, & Hiremath, 2020), attracting academic and nonacademic audiences. For example, about 25% of all Covid-19-related publications from the first 10 months of the coronavirus pandemic were deposited in the bioRxiv and medRxiv preprint repositories for medicine and the health sciences, indicating the recognized value of preprints to bypass the peer review process, given the pandemic urgency (Fraser, Brierley et al., 2021). Peer review is vital to evaluate academic findings, however, because "rapid publication of clinically actionable information during a pandemic is important, [but] publishing results that are not valid can do harm" (Bauchner et al., 2020, p. 453).
This threat has been highlighted by high-profile Covid-19 retractions (Agoramoorthy, Hsu, & Shieh, 2020;Mann, 2020).
Peer review has evolved over time, becoming standardized in the 20th century (Burnham, 1990), but with processes varying between journals (Tennant & Ross-Hellauer, 2020). Some journals, including eLife, have relaxed their peer reviewing standards during the pandemic for all research by ruling out reviewer requests that would be unreasonable for lockdown-affected authors, such as long extra experiments (Eisen, Akhmanova et al., 2020). In contrast, JAMA: The Journal of the American Medical Association has modified its editorial process for all papers and has a fast track for key Covid-19 research that is designed to be as rapid as possible without compromising rigor (Bauchner et al., 2020). The Lancet (Agoramoorthy et al., 2020), PNAS (Berenbaum, 2020), and probably many other journals also have formal or informal fast-track processes (e.g., Welch, Long et al., 2020). It is not clear whether this translates into falling standards, however. For example, if a peer review is submitted after one day instead of three weeks, is the reviewer likely to have rushed the review rather than prioritizing it and giving it the same (or more) time? Intuitively, it seems possible that both happen: Reviewers might spend extra time on a Covid-19 paper because of its higher importance, might forgive its shortcomings (e.g., in writing style) on the same basis, or might attempt to complete the review quickly, recognizing its urgency, even if they did not have time for an adequate review. If large high-impact journals are uniquely able to build bespoke rigorous peer review pathways for Covid-19 research, then this would explain why they initially published the largest share of it (Kambakamba, Geoghegan, & Hoti, 2020).
A qualitative investigation into open peer review reports for eLife and the British Medical Journal found similar overall levels of thoroughness, but reviewers tended to use partly different criteria for Covid-19 research, such as requesting less extra work, being more constructive, and requesting that limitations are acknowledged rather than addressed (Horbach, 2021). Comparing Covid-19 and non-Covid-19 research is complicated by the non-Covid-19 health research having changed its average nature due to the pandemic; for example, with fewer randomized controlled trials published in three medical journals (Bian & Lin, 2020).
Previous studies have investigated the duration of acceptance or peer review for Covid-19 research using early data from 2020. However, assessing Covid-19 research published in 2020 introduces biases: Covid-19 publishing effectively started in early 2020, and restricting data to papers submitted in 2020 excludes submitted papers that had not yet been published due to long reviewing times. Some studies have nevertheless suggested that Covid-19 reviewing processes tended to be shorter, which seems intuitively likely. Covid-19 articles posted to PubMed by June 2020 tended to spend a fifth of the time in review compared to non-Covid papers posted during the same period and in the same journals (Putman, Ruderman, & Niforatos, 2020), but this comparison is biased because the collection included older non-Covid-19 papers (e.g., 1,089 articles submitted in 2017) than was possible for Covid-19. A similarly large-scale study investigated PubMed Covid-19 records from 30 January to 23 April 2020, finding them to be published twice as quickly as comparator sets of articles, with a median of 6 days taken to accept Covid-19 papers (Palayew, Norgaard et al., 2020; see also Kambakamba et al., 2020). This comparison was similarly biased in favor of Covid-19 publishing speed. An investigation of 259 Covid-19 and 270 non-Covid-19 articles in 14 health and general journals before and during the Covid-19 pandemic found that publication delays for Covid-19 research had almost halved (51%) due to faster peer review, whereas non-Covid-19 articles had no change in publishing speed (Horbach, 2020). Another investigation pointed out that some (orthopedics-related) Covid-19 articles were published on the day they were submitted to a journal, suggesting special treatment (Khalifa & Ahmed, 2021).
A study using full text records from the Dimensions database compared the average days between submission and acceptance dates of academic and news articles about infectious diseases (e.g., influenza, pneumonia, or vaccines), finding that weekly review durations of publications during January to July 2020 had reduced by about 30% (40 days) compared with the same period in 2019 and 2018 (Hook, Porter et al., 2021). Another investigation compared average acceptance durations of Covid-19 articles from 11 medical journals published between January and June 2020 with average acceptance durations of all non-Covid articles published during 2016–2019, finding that Covid-19-related publications were accepted almost five times faster (19.3 vs. 91.3 days, respectively) (Aviv-Reuven & Rosenfeld, 2021). However, the acceptance durations were defined as "the period between the date received and the date accepted or the date online for a paper, whichever is earlier." This may introduce bias by mixing review and publication durations. Despite the studies reviewed above, no study has investigated a large set of major journals for the influence of Covid-19 on peer review on a per-journal basis. This is important because high-impact journals may generate bespoke pathways, whereas others may not have the resources for this. In addition, most previous studies of peer review speed have used early Covid-19 publication data for small sets of full text articles and were not able to control for Covid-19 studies being young at the time. Moreover, it is not clear whether the initially fast peer review for Covid-19 research continued to operate over time (January–December 2020). In response, this article uses a semiautomatic method to assess peer review speed for Covid-19 research in major health-related journals, on a per-journal basis, updating previous studies (mostly) from early 2020. It is driven by the following research questions.
• RQ1: Is Covid-19 research in high-impact medically relevant journals published more quickly than non-Covid-19 research during (2020) and before the pandemic (2017–2019)? Are there significant differences between journals?
• RQ2: Does faster-reviewed Covid-19 research generate more citation impact than slower-reviewed Covid-19 research in high-impact relevant journals?

METHODS
To compare the duration of review processes for Covid-19 and non-Covid-19 articles submitted during and before the coronavirus pandemic (January 2017 to December 2020), the received and accepted dates of 104,851 PubMed indexed articles from 100 mostly medical journals with the highest impact factors were automatically extracted (see Table S1 in the Supplementary material). PubMed seems to have more coverage of current biomedical research than Scopus and the Web of Science, making it a useful source for addressing the research questions (e.g., Falagas, Pitsouni et al., 2008; Tober, 2011). Although high citation impact is not the same as high quality and impact factors vary greatly between fields, impact factors are a useful quantitative indicator to generate a set of journals that are likely to be much more influential than average for the pandemic. The average review durations (days) of Covid-19 and non-Covid-19 articles were compared across individual journals with confidence intervals at the 95% level to help assess if differences between review processes are statistically significant. Here, the "review process" includes all stages between initial submission and formal acceptance of articles, including refereeing, author revisions, and any other editorial steps. Correlations between review durations and Scopus citations to Covid-19 and non-Covid-19 articles were calculated to examine if articles with faster reviews had more citations. Scopus citations were used because there is evidence that Scopus has wider coverage and indexes Covid-19 publications more quickly than the Web of Science (da Silva, Tsigaris, & Erfanmanesh, 2021; Kousha & Thelwall, 2020). The details of this are given below.

Data Set
Clarivate Journal Citation Reports ( JCR) 2019 was used to identify medical journals and PubMed (https://pubmed.ncbi.nlm.nih.gov/) was queried to extract the bibliographic information and the publication history of their articles (received and accepted dates, if any) from January 2016 to June 2021.
The main JCR journal ranking list was manually checked to select relevant medical journals with the highest impact factors. We did not select journals based on JCR subcategories (e.g., Medicine, General & Internal) because some high-impact medical journals with many Covid-19-relevant publications were indexed in other JCR categories. For instance, The Lancet Respiratory Medicine (JCR rank: 57) was indexed in "Respiratory System," The Lancet Infectious Diseases (JCR rank: 59) in "Infectious Diseases," Immunity (JCR rank: 65) in "Immunology," and Gut (JCR rank: 88) in "Gastroenterology & Hepatology." Moreover, initial checking showed that three high-impact multidisciplinary and cell biology journals, Nature, Science, and Cell, which were not indexed in any medical-related JCR categories, had published many Covid-19 articles. Although this is a multidisciplinary set of journals of different types, some general, some specialist, and with different Impact Factors, the analyses focus on comparisons between the duration of review processes for Covid-19 and non-Covid-19 articles within journals rather than comparisons between journals.
For each initially selected journal, searches were performed in PubMed to check if it had articles with received and accepted dates in PubMed during January 2017 to June 2021. As an example, a PubMed query restricted to The Lancet and the date range January 2017 to June 2021 was used to download its articles. PubMed provides "received for review" and "accepted for publication" dates in the Publication History Status (PHST) field (e.g., "PHST - 2020/06/01 [received]" and "PHST - 2020/07/15 [accepted]"). Many journals do not provide article publication histories or only provide either submission or acceptance dates. For instance, The New England Journal of Medicine (JCR rank 2) does not provide publication histories and Nature Reviews Drug Discovery (JCR rank 4) only provides acceptance dates.¹ To identify which journals provide both dates for articles, we checked the PubMed results for all medical journals with the highest impact factors. Although some journals provided submission and acceptance dates, they had very few published standard articles (excluding editorials, standard reviews, notes, letters, and other nonarticle contributions). However, systematic reviews and meta-analyses were not excluded because they present results by analyzing and combining data from relevant investigations (e.g., "Diagnostic performance of different sampling approaches for SARS-CoV-2 RT-PCR testing: a systematic review and meta-analysis"). For example, CA: A Cancer Journal for Clinicians (JCR rank 1), with 75 articles during 2017–June 2021, had only one Covid-19-relevant article at the time of data collection. To have meaningful statistical evidence, only journals with 13 or more Covid-19 articles were selected for analysis.

¹ The required submission and acceptance dates were not identified for articles published in Nature Reviews and JAMA journals such as Nature Reviews Immunology and JAMA Internal Medicine.
To automatically extract publication histories of articles from PubMed outputs, a program was written and added to the free Webometric Analyst software (https://lexiurl.wlv.ac.uk: Services menu, "MeSH and PubMed/Convert PubMed plain text format to tab-delimited"). The program extracts bibliographic information as well as received, revised, and accepted dates of articles (if any) from saved files in PubMed format. The differences between submission and acceptance dates were calculated using the DATEDIF function in Excel. Covid-19-relevant articles submitted during 2020 were identified through relevant Covid-19 terms in their titles, abstracts, or keywords ("Covid-19" OR "Covid" OR "coronavirus" OR "corona virus" OR "2019-nCoV" OR "SARS-CoV-2").
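The date extraction and duration arithmetic described above can be sketched in a few lines of Python. This is an illustrative reimplementation, not the study's actual code (which used Webometric Analyst and Excel's DATEDIF), and the regex and sample PHST lines are assumptions about the PubMed export format:

```python
import re
from datetime import date

# Matches PubMed-format publication history lines such as
# "PHST- 2020/06/01 00:00 [received]" (assumed spacing; exports vary).
PHST_RE = re.compile(r"PHST\s*-\s*(\d{4})/(\d{2})/(\d{2})[^\[]*\[(\w+)\]")

def review_duration(record: str):
    """Days between the [received] and [accepted] dates, or None if absent."""
    dates = {}
    for year, month, day, status in PHST_RE.findall(record):
        dates[status] = date(int(year), int(month), int(day))
    if "received" in dates and "accepted" in dates:
        return (dates["accepted"] - dates["received"]).days
    return None

# Hypothetical record fragment mirroring the PHST example in the text.
sample = "PHST- 2020/06/01 00:00 [received]\nPHST- 2020/07/15 00:00 [accepted]"
```

For the sample dates above (1 June to 15 July 2020), `review_duration(sample)` gives 44 days, matching what DATEDIF with the "d" unit would return in Excel.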

Data Cleaning
Although the PubMed filter option was used to limit the search results to "Journal Articles," manual checking showed that some PubMed results were not articles. These are important to exclude because peer review might be quicker for editorial material or slower for review papers. To obtain a more uniform data set for statistical analysis, the DOIs of the extracted PubMed publications were searched and matched against both Scopus and the Web of Science (WoS) to get additional type classifications, and any publications identified as nonarticles in either were excluded. Moreover, to minimize the duration advantage of Covid-19 articles, the submission years of articles were used instead of publication years for the analysis, so that no article was older than a year. This reduces bias because the submission dates of all Covid-19 articles were after January 2020 (during the pandemic), but many non-Covid-19 articles published in 2020 were submitted in 2019; hence, comparisons between the speed of review processes during the pandemic (2020) could be unfair. For instance, the non-Covid-19 article "Integrin α6 mediates the drug resistance of acute lymphoblastic B-cell leukemia" was submitted to the journal Blood on 20 March 2019 (almost 9 months before the pandemic) and published on 27 March 2020, and so would automatically have a longer review time than all Covid-19 articles published on the same date. To obtain Scopus citations, queries were automatically generated by combining DOIs from the PubMed data and searched via the Scopus Advanced Search interface (e.g., DOI("10.1128/jvi.02422-20") OR…).
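The step of combining DOIs into Scopus Advanced Search strings can be sketched as follows; the function name and batch size are illustrative assumptions, since the paper does not describe how the queries were grouped:

```python
def scopus_doi_queries(dois, batch_size=100):
    """Combine DOIs into Scopus Advanced Search strings of the form
    DOI("...") OR DOI("..."), batched so each query stays a manageable length."""
    return [
        " OR ".join(f'DOI("{doi}")' for doi in dois[i:i + batch_size])
        for i in range(0, len(dois), batch_size)
    ]
```

Each returned string can then be pasted into (or sent to) the Scopus Advanced Search interface, and the matched records give the citation counts and type classifications used for cleaning.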
In summary, the final data set mainly consisted of 97 medical journals with the highest impact factors that also published submission and acceptance dates in PubMed and at least 13 Covid-19 articles. It also contained two leading multidisciplinary journals (Science and Nature) and one cell biology journal (Cell), all of which had published many Covid-19 publications. For convenience, this set will be described as 100 high-impact medically relevant journals.
A shared data set provides bibliographic information, submission and acceptance dates, and Scopus citations of 104,851 PubMed indexed articles from 100 medical journals used in this study (https://figshare.com/s/fac222ae70ce013128cd).

Analysis
Geometric means were used to compare the average peer review durations for Covid-19 and non-Covid-19 articles published by year and by journal. The geometric mean was used as an appropriate central tendency measure for highly skewed data and confidence intervals were calculated at the 95% level to assess if differences between the average review processes for Covid-19 and non-Covid-19 articles were statistically significant in each journal.
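A minimal sketch of this calculation in Python, assuming the conventional log-scale construction of a geometric mean confidence interval (the paper does not state its exact formula):

```python
import math
from statistics import mean, stdev

def geometric_mean_ci(durations, z=1.96):
    """Geometric mean of positive review durations (in days) with an
    approximate 95% confidence interval computed on the log scale."""
    logs = [math.log(d) for d in durations]
    centre = mean(logs)
    half_width = z * stdev(logs) / math.sqrt(len(logs))
    return math.exp(centre), (math.exp(centre - half_width),
                              math.exp(centre + half_width))
```

Non-overlap of the intervals returned for a journal's Covid-19 and non-Covid-19 durations is the significance criterion used in the figures.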
Spearman correlation tests were used to assess the strength of association between the review durations and Scopus citation counts for Covid-19 and non-Covid-19 articles. For instance, negative Spearman correlations between peer review durations and Scopus citations would suggest that faster-published articles tend to generate more citation impact.
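For tie-free data, Spearman's coefficient reduces to the classic rank-difference formula; the pure-Python version below is illustrative only (the study presumably used standard statistical software) and does not handle tied values, which would need averaged ranks:

```python
def spearman(x, y):
    """Spearman rank correlation via 1 - 6*sum(d^2)/(n*(n^2-1)).
    Assumes no tied values."""
    def ranks(values):
        order = sorted(range(len(values)), key=lambda i: values[i])
        result = [0] * len(values)
        for rank, index in enumerate(order, start=1):
            result[index] = rank
        return result
    rank_x, rank_y = ranks(x), ranks(y)
    n = len(x)
    d_squared = sum((a - b) ** 2 for a, b in zip(rank_x, rank_y))
    return 1 - 6 * d_squared / (n * (n ** 2 - 1))
```

A coefficient of −1 would mean that the fastest-reviewed articles in a journal were always its most cited.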

Review Process Duration for Covid-19 and Non-Covid-19 Articles
On average, Covid-19 articles submitted to the 100 selected medically relevant journals during the coronavirus pandemic in 2020 were reviewed almost 1.7 times faster than non-Covid articles submitted to the same journals in 2020, a 36-day shorter average review process (49 vs. 85 days). The reviewing of Covid-19 research was also 2.0 to 2.1 times faster than that of non-Covid-19 research submitted to the same medical journals before the pandemic, during 2017–2019. Because the confidence intervals do not overlap, the difference between Covid-19 and non-Covid-19 article review durations is statistically significant at the 95% level (Figure 1).
The fastest Covid-19 review processes were for articles submitted during March and April 2020 (3.4 and 2.7 times faster, respectively). However, the fast reviewing of Covid-19 research slowed over the following months, and review durations became similar to those for non-Covid-19 research during November and December 2020 (Figure 2). This suggests that, on average, the peer review system in the medical journals responded appropriately to prevent publication delays for Covid-19 research during the first months of the coronavirus pandemic. For instance, of the 185 Covid-19 articles reviewed in 5 days or less, 83% (153 articles) were submitted to the journals analyzed during the first 6 months of the pandemic (January to June 2020).

Review Process Duration for Covid-19 and Non-Covid-19 Articles by Journal
The overall Covid-19 reviewing process speed increase described above could be universal across journals or restricted to a small number of them. To check this, Figure 3 shows the average (geometric mean) review duration for Covid-19 and non-Covid-19 articles submitted to the 50 journals in the set analyzed with the highest JCR impact factors during and before the pandemic, in 2019 and 2020 separately. Figure S1 in the Supplementary material gives a similar evaluation for the journals ranked 51–100. Overall, the average review durations for Covid-19 articles were shorter than for non-Covid-19 articles submitted in either 2019 or 2020 across all journals (except Journal of Virology; see the Discussion), and this difference was statistically significant for 80% of the journals (80 journals out of 100) at the 95% level. Figures 3 and S1 in the Supplementary material also show that the average review durations for non-Covid-19 articles submitted during the pandemic in 2020 were significantly shorter than for non-Covid-19 articles submitted before the pandemic in 2019 for 35% of journals, whereas only 2% of journals had shorter review processes for 2019 non-Covid-19 articles than for 2020 non-Covid-19 articles.
On average, the Covid-19 article review durations are shorter by 49% (47 days) and 53% (55 days) than for non-Covid-19 articles submitted during 2020 and 2019, respectively. This reduction in review duration for Covid-19 research is more substantial in some journals, such as Gastrointestinal Endoscopy, at about 91% and 92%, and Journal of Infection, at 90% and 89%, compared with non-Covid-19 articles submitted during and before the pandemic (2020 and 2019, respectively; Table 1). The three journals with the highest impact factors, The Lancet (62% and 66%), Science (62% and 69%), and Nature (57% and 65%), had also significantly shortened their review processes for Covid-19 articles compared with non-Covid-19 articles submitted during and before the pandemic (2020 and 2019, respectively), perhaps to avoid publication delays for coronavirus scientific results. Table 1 reports the 35 journals with the highest relative decrease in review processes for Covid-19 articles compared with non-Covid-19 articles published during 2020–2021.
To estimate whether the publication processes of journals could influence the results, the average durations between submission to a journal and addition to PubMed were calculated, finding that the publication processes for Covid-19 articles were shorter than for non-Covid-19 articles in both 2019 and 2020 for all journals. This difference was statistically significant for 84% of the journals (84 journals out of 100) at the 95% level (see Figures S3 and S4 in the Supplementary material). PubMed addition dates were used because they were the only data relevant to publication dates available in PubMed, so these results should be interpreted cautiously.

Correlations Between Review Processes Duration and Research Impact
Spearman correlation tests were calculated separately for Covid-19 and non-Covid-19 articles submitted to each of the 100 journals during 2019–2020 to assess whether shorter review processes for Covid-19 research are an indicator of likely future citation impact. Figures 4 and S2 in the Supplementary material show that, in general, there were negative correlations between the duration of review processes and Scopus citations to articles for most high-impact journals. However, the magnitudes of the correlations between citation counts and the duration of review processes of Covid-19 articles are much higher than for non-Covid-19 articles submitted to the journals either during or before the pandemic.
The Spearman coefficients for 24% of the journals (24 out of 100) were strongly negative (stronger than −0.70) and for 36% moderately negative (−0.50 to −0.70) for Covid-19 articles. Hence, the results suggest a high to medium degree of association between reviewing speed for Covid-19 research and higher citation impact for 60% of the medical journals. Nevertheless, many important medical journals do not provide the publication history of articles (e.g., The New England Journal of Medicine), and the results for these journals might be different. Presumably, the importance of the most critical Covid-19 research was clear to the authors, editors, and reviewers, who sped up the entire process in response. These importance judgments might take into account factors such as whether the research was likely to lead to effective safety measures, treatments, or vaccines for humans rather than being of indirect value for the pandemic. The correlation may have been influenced by any tendency for faster-reviewed papers to also be available simultaneously as preprints, giving them more time to attract citations. It is not clear whether this is a factor, however. The difference between journals in correlation magnitudes might be affected by some journals attracting a higher share of papers from early in the pandemic, when they would have had more time to attract citations and would therefore generate a higher correlation, other factors being equal. The stronger correlations for Covid-19 than for non-Covid-19 articles in journals that are not gold open access may be partly due to a higher proportion of the Covid-19 papers being open access, generating more citations and thus making the correlation test statistically more powerful.
In contrast, correlations between review duration and Scopus citation counts for non-Covid-19 research submitted during 2019–2020 were in most cases insignificant and weak (less than 0.30); only 8% and 3% of the journals had moderate negative correlations, respectively. For instance, the three Covid-19 articles with the fastest review process (1, 2, and 3 days) from the 10 highest-impact journals were "Feasibility of controlling COVID-19 outbreaks by isolation of cases and contacts" (Lancet Global Health), "Clinical features of patients infected with 2019 novel coronavirus in Wuhan, China" (The Lancet), and "Nowcasting and forecasting the potential domestic and international spread of the 2019-nCoV outbreak originating in Wuhan, China: a modelling study" (The Lancet), attracting 775, 14,854, and 1,478 Scopus citations as of June 2021. Thus, faster review processes for more cited articles are a feature of Covid-19 publishing, although it is not clear whether the same is true in general or for other topics.

Further Analysis of Topics and Types of the Fastest Published Covid-19 Research
To assess the main topics and types of coronavirus research with the shortest review processes, 49 Covid-19 articles from the 30 journals from the set analyzed here with the highest impact factors and with quick (up to 10 days) review processes were manually checked (Table S2 in the Supplementary material).
The first author read the articles to broadly classify their topics and types. Three general topics were used to classify the broad subject of the articles:
• Covid-19 infection, transmission, or control.
• Covid-19 genomic structures: Laboratory research about coronavirus genome structure, sequence, replication, or mutation.
• Covid-19 vaccine: Research about the safety, efficacy, and immunogenicity of vaccines.
Most Covid-19 articles with the fastest review process were about infection, transmission, or control (81.6%), genomic structures (10.2%), and vaccines (8.2%). A clear majority (59.2%) of Covid-19 articles were clinical, including clinical trials, clinical observations, or clinical case reports (e.g., clinical trials for vaccine safety and effectiveness), 18.4% were practice guidelines (e.g., guidance for management of coronavirus), 10.2% were epidemiological modeling (e.g., forecasting the spread of Covid-19), 8.2% were laboratory research (e.g., structure of the coronavirus spike), and 4.1% were surveys (e.g., symptoms among positive cases). Intuitively, all these articles seem to be of high and universal value in addressing the pandemic, suggesting that the fast review processes were appropriate and necessary for them (see also Table S2 in the Supplementary material).

DISCUSSION
This study has some methodological limitations. First, some important Covid-19 research is published in journals not analyzed here, including The New England Journal of Medicine (JCR 2019 rank 2) and JAMA (JCR rank 11), and the results may differ for the excluded journals if they had provided publication histories for their articles. As the patterns found here are universal or near-universal across journals, they seem likely to apply to all major medical journals, but may not apply to minor medical journals or other fields. Second, the citation counts could not use a 2-year citation window that would give articles submitted during 2020 enough time to be cited by other researchers, and hence should be interpreted cautiously, especially for non-Covid-19 articles submitted during 2020. It is therefore possible that some of the patterns found are due to differences over time in the impact of Covid-19 research. This is possible because early papers may have been the most impactful as a side effect of the rapidly expanding Covid-19 literature (moderately early Covid-19 papers would have had few Covid-19 papers to choose from to cite). Moreover, the editorial policies of publishers could have influenced the reviewing and publishing of Covid-19 research by prioritizing its fast review and publication. For instance, a group of academic publishers and organizations have joined an initiative to ensure that Covid-19 research is peer reviewed and published as quickly as possible (OASPA, 2021). Future research could investigate the impact of publishers' reviewing policies on the rapid reviewing of coronavirus research in more detail.
An important factor that this study does not take into account and has no data about is the possibility that some articles had fast review processes because they had been submitted to one journal, then rejected but cascaded with reviews to another journal from the same publisher. If the cascade was to a lower Impact Factor journal, then this might result in fast review (or almost immediate acceptance based on the reviews) and a relatively highly cited article for the second journal (because it had been seriously considered by the higher impact journal). These two factors combined would contribute to a negative correlation between publishing speed and citations for the second journal, other factors being equal.
Moreover, the analysis of the 49 Covid-19 articles with the fastest review processes (see Table S2 in the Supplementary material) showed that three were published as a "Brief Communication" or "Viewpoint" rather than an "Article," despite being recorded as articles in PubMed, Scopus, and WoS. There is no universally agreed definition of articles that could be used to decide whether these classifications were correct. Hence, there might be some nonarticle Covid-19 publications in the main data set with shorter review processes. We could not find any practical method to systematically identify these cases for a large-scale study, however, and presumably the same applies to the non-Covid-19 articles in the comparator sets.
The results confirm prior findings that Covid-19 research is published more rapidly (Horbach, 2020;Palayew et al., 2020), strengthening them by considering a much larger set of journals, a longer period, and methods that greatly reduce bias in the comparisons. The results also show, for the first time, that faster-published Covid-19 articles tend to generate higher citation impact, apparently breaking the normal scientific pattern in this regard.
A further investigation was made to understand the exceptional result for the Journal of Virology, which had slightly shorter review processes for 2019 and 2020 non-Covid-19 articles than for 2020 Covid-19 articles (although the difference was not statistically significant). In many cases (n = 96), the term "coronavirus" was used in the Covid-19 data set for other, non-Covid-19 infectious diseases, such as Middle East respiratory syndrome coronavirus (MERS-CoV), severe acute respiratory syndrome coronavirus (SARS-CoV), and avian coronaviruses. For instance, although the articles "Coronavirus Endoribonuclease and Deubiquitinating Interferon Antagonists Differentially Modulate the Host Response during Replication in Macrophages" and "Evolutionary Arms Race between Virus and Host Drives Genetic Diversity in Bat Severe Acute Respiratory Syndrome-Related Coronavirus Spike Genes" had the term "coronavirus" in their titles and were submitted during the pandemic (2020), they were not related to Covid-19. We identified these cases after manually checking the papers (n = 64) and moved them to the 2020 non-Covid-19 data set to assess whether this affected the results. Again, although the average review processes for 2019 and 2020 non-Covid-19 articles (38.9 and 37.9 days) were slightly shorter than for 2020 Covid-19 articles (39.7 days), the difference was not statistically significant. Hence, it seems that the review procedure had about the same speed for Covid-19 and non-Covid-19 research before and during the pandemic for this journal.
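Significance comparisons of the kind reported above can be illustrated with a simple two-sample permutation test on mean review durations. The sketch below uses made-up durations, not the journal's actual data, and is one possible test choice rather than the method used in this study.

```python
import random
from statistics import mean

def perm_test_diff_means(a, b, n_iter=5000, seed=0):
    """Two-sided permutation test for a difference in mean review duration.

    Returns the proportion of label shuffles whose absolute mean
    difference is at least as large as the observed one (a p-value).
    """
    rng = random.Random(seed)
    observed = abs(mean(a) - mean(b))
    pooled = list(a) + list(b)
    k = len(a)
    hits = 0
    for _ in range(n_iter):
        rng.shuffle(pooled)  # randomly reassign group labels
        if abs(mean(pooled[:k]) - mean(pooled[k:])) >= observed:
            hits += 1
    return hits / n_iter

# Hypothetical review durations in days (illustration only)
covid_2020 = [35, 41, 39, 44, 38, 42, 40, 37]
non_covid_2020 = [36, 40, 38, 35, 39, 41, 37, 36]
p_value = perm_test_diff_means(covid_2020, non_covid_2020)
```

A large p-value here would correspond to the "not statistically significant" conclusion reported for this journal.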
The three Covid-19 articles with the longest review processes for this journal were "LY6E Restricts Entry of Human Coronaviruses, Including Currently Pandemic SARS-CoV-2" (97 days), "HTCC as a Polymeric Inhibitor of SARS-CoV-2 and MERS-CoV" (95 days), and "The Enzymatic Activity of the nsp14 Exoribonuclease Is Critical for Replication of MERS-CoV and SARS-CoV-2" (82 days). Even these longest times are relatively short, suggesting that the journal has fast review processes for all articles. As all three papers covered multiple coronaviruses, it is possible that they were adapted to include Covid-19 during the review process, or perhaps during the final stages of preparation, so the reviewers might have identified related procedural issues (this is pure speculation). If this occurred often for the Journal of Virology, it would explain the lack of shorter review processes for its Covid-19 articles.

Impact of Rapid Reviewing on Health Policy
An additional small exploration was conducted to test whether quickly reviewed Covid-19 research might influence policy. To assess whether faster-reviewed coronavirus research could affect Covid-19 vaccination policy-making in the UK, the 26 references in the Joint Committee on Vaccination and Immunisation (JCVI) report on priority groups for Covid-19 vaccination 2 were manually checked for the durations of their review processes. Of these, eight medical journal articles provided submission and acceptance dates. The median review duration for these references was 11 days (Table 2), which is very fast compared with the overall duration for Covid-19 articles in this study (49 days; see Figure 1). The report also cited four preprints from medRxiv (a preprint service for medicine and the health sciences), bypassing peer-review safeguards. Hence, it seems that the fast journal reviewing found here, together with the dissemination of coronavirus research via preprint platforms such as medRxiv and bioRxiv (see Fraser et al., 2021), could have had a substantial impact on Covid-19 vaccine policy-making. However, more research on other Covid-19 reports and on government advice in other countries (e.g., the European Union or United States) is needed to investigate this further.

Correlation Between Faster-Reviewed Covid-19 Research and their Citation Impact
The findings show that faster-published coronavirus research tends to have higher citation impact (Figure 4). This seems likely to occur primarily because authors, editors, and/or reviewers recognize the time sensitivity of the pandemic and try to ensure fast reviewing and revisions for articles that are potentially important. Although there are some high-profile examples of retractions of apparently hastily published Covid-19 research, the results suggest that the peer-review process has not been compromised into publishing a substantial number of poor, uncitable studies. Supporting this, Table 3 shows that 10 of the 15 Covid-19 articles with the most Scopus citations in the study had been reviewed in 9 days or less; the most cited articles in Table 3 include "Safety and efficacy of the ChAdOx1 nCoV-19 vaccine (AZD1222) against SARS-CoV-2: an interim analysis of four randomised controlled trials in Brazil, South Africa, and the UK."
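The speed-impact associations reported in this paper are rank correlations. The sketch below shows a minimal pure-Python Spearman calculation (no tie handling); the example numbers for review duration and citations are invented for illustration only.

```python
def rank(values):
    """Ranks starting at 1 (assumes no tied values, for simplicity)."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    for r, i in enumerate(order, start=1):
        ranks[i] = float(r)
    return ranks

def spearman(x, y):
    """Spearman rank correlation: the Pearson correlation of the ranks."""
    rx, ry = rank(x), rank(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    num = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    den = (sum((a - mx) ** 2 for a in rx)
           * sum((b - my) ** 2 for b in ry)) ** 0.5
    return num / den

# Invented example: review duration (days) vs. Scopus citations
days = [2, 3, 9, 20, 45, 60]
cites = [14854, 1478, 900, 300, 120, 40]
rho = spearman(days, cites)  # -1.0: perfectly inverse ranking in this toy sample
```

A strongly negative rho, as in this toy sample, corresponds to the pattern reported here: the faster-reviewed articles are the more cited ones.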

For instance, the articles "Clinical features of patients infected with 2019 novel coronavirus in Wuhan, China" and "Nowcasting and forecasting the potential domestic and international spread of the 2019-nCoV outbreak originating in Wuhan, China: a modelling study" had the shortest review processes (2 and 3 days) and had received 14,854 and 1,478 Scopus citations, respectively, as of June 2021. Presumably, these and other articles analyzed here were reviewed quickly because of the fast-track submission services provided by some journals to accelerate the review procedure. For instance, The Lancet has a fast-track publication service for randomized controlled trials and original research of major health importance, aiming "to peer review and publish papers within four weeks of submission" 3 .

CONCLUSIONS
On average, Covid-19 research papers were reviewed 1.7 to 2.1 times faster than non-Covid-19 papers submitted during and before the coronavirus pandemic (e.g., 49 days compared with 85 and 105 days in 2019 and 2020, respectively). However, there were some differences between journals. The three journals with the highest Impact Factors, The Lancet, Nature, and Science, also reduced their review times for Covid-19 articles by 57%-62% compared with non-Covid-19 articles submitted during the pandemic in 2020. This confirms a widespread, fast academic response to Covid-19 in the form of apparently universal support for faster publishing of relevant research. This is a welcome conclusion for the health of academia and its contribution to societal challenges. More specifically, given that academic journals must be cautious and conservative to maintain standards, it is reassuring that health journals can respond effectively and quickly when necessary.
Faster-reviewed Covid-19 research had more Scopus citations across all major Covid-19-related journals, and the correlations between review duration and citations were moderately to strongly negative (above 0.50 in magnitude) for most journals (60%). In contrast, for non-Covid-19 research submitted during 2019-2020 the corresponding correlation was in most cases small (below 0.30 in magnitude) and statistically insignificant. This suggests that faster reviewing for articles that go on to be highly cited is a distinctive feature of Covid-19 publishing. It again suggests that the academic publishing system has responded well to the need for publishing speed, ensuring that the most relevant Covid-19 research (assuming that citations are an indicator of this) can be properly assessed and disseminated quickly enough to help with the pandemic.