While funders increasingly request evidence of the societal benefits of research, all academics in the UK must periodically provide this information to gain part of their block funding within the Research Excellence Framework (REF). The impact case studies produced in the UK are public and can therefore be used to gain insights into the types of sources used to justify societal impact claims. This study focuses on the URLs cited as evidence in the last public REF to help researchers and resource providers to understand what types can be used and the disciplinary differences in their uptake. Based on a new semiautomatic method to classify the URLs cited in impact case studies, the results show that there are a few key types of online source for most broad fields, but these sources differ substantially between subject areas. For example, news websites are more important in some fields than others, and YouTube is sometimes used for multimedia evidence in the arts and humanities. Knowledge of the common sources selected independently by thousands of researchers may help others to identify suitable sources for the complex task of evidencing societal impacts.

Although knowledge-building is a core goal of much scholarship, it is important to assess the impacts of research outside academia when evaluators or funders need evidence of its societal impacts (Dinsmore, Allen, & Dolby, 2014; Thelwall, Kousha et al., 2015). This is because funders consider research findings to have added value when they benefit society, such as by influencing policy (Oliver, Innvar et al., 2014). Assessing these nonacademic impacts is difficult because there are many types and no systematic record of them. In contrast, academic impacts are partly trackable by citation indexes. To illustrate the variety of potential nonacademic impacts, 27 categories of impact within four broad areas (research-related, policy, service, societal) have been suggested to help health researchers describe the benefits of their research when writing impact narratives (Kuruvilla, Mays et al., 2006). At a finer-grained level, 100 indicators have been suggested for the policy, health, economic, teaching, and career development impacts of biomedical research alone (Guthrie, Krapels et al., 2017). General recommendations have also been provided for interpreting nonacademic indicators of research impacts (Wilsdon, Allen et al., 2015).

The UK Research Excellence Framework (REF) is an exercise that runs every 6–7 years, assessing scholarly and nonscholarly research achievements to allocate block grant research funding. It groups UK academic research into four broad disciplinary panels (A, B, C, and D), containing 36 field-based Units of Assessment (UoAs, see Supplementary Information Tables S1–S4 for a list) in the 2014 iteration. The REF assesses the nonacademic impacts of research primarily through impact case studies, which are structured evidence-based narrative claims of nonacademic impacts written by the groups of researchers evaluated. In the REF context, research impact has been defined as "an effect on, change or benefit to the economy, society, culture, public policy or services, health, the environment or quality of life, beyond academia" (Research Excellence Framework, 2014, p. 26). The weighting of the case studies for funding purposes has been increased from 20% in REF 2014 to 25% in REF 2021 (Research Excellence Framework, 2019). Each impact case study has a mandatory "Sources to corroborate the impact" section containing citations to the evidence underpinning the narrative. These sources must be found by the researchers themselves, typically with the support of university impact support officers and digital resources, such as Altmetric.com (nonacademic citations to academic publications) and Overton.io (policy documents mentioning researchers).

A range of sources may be used as evidence of nonacademic impacts. These include government publications, regulations, legislation, policy documents, parliamentary reports, statistics, white papers, medical treatment information sheets, clinical guidelines, patents, standards, book reviews, and news stories. For example, an independent review of the role of metrics in research assessment in REF 2014 and future exercises has suggested that “citations from online ‘grey’ literature seem to be an additional useful source of evidence of the wider impact of research, but there do not seem to be any systematic studies of these” (Wilsdon et al., 2015, p. 38). These sources of nonacademic impact evidence cannot be easily captured through scientific databases and may need extensive searches on the web to locate, if they are online at all. Although there have been attempts to propose methods to capture different types of nonacademic impacts based on web citation searches (Kousha, 2019) and social media websites (Thelwall, Haustein et al., 2013), these have tended to focus on assessing the availability of information rather than its utility for evidencing nonacademic impacts. It is therefore important to identify sources commonly used by academics to corroborate their claims of nonscholarly impacts in different subject areas. This may help researchers and university impact support officers to build their cases and may help altmetrics providers or others to index the necessary sources.

Most previous studies of REF case studies have used text mining (e.g., King’s College London and Digital Science, 2015; Parks, Ioppolo et al., 2018) or content analysis (e.g., Brook, 2018; Wilkinson, 2019) to identify the types of impacts claimed by researchers, rather than the types of evidence cited. In contrast, one (not peer-reviewed) study has listed the 40 websites most cited in impact case studies, broken down into four broad disciplinary groups (Digital Science, 2016) but did not analyze the cited URLs further. There seems to have been no large-scale assessment of the types of URLs cited in “Sources to corroborate the impact” evidence sections. The current study addresses this gap with a hybrid automatic and manual method to extract and classify the most cited of these URLs for 6,637 downloadable REF 2014 impact case studies across all 36 UoAs. The same method can also be used for the systematic classification of URLs cited in future impact case studies (e.g., REF 2021) or other similar large-scale exercises with URL citations outside the UK to understand their characteristics and disciplinary differences.

2.1. Text Mining Analyses of REF 2014 Impact Case Studies

Several text mining studies have assessed the narrative sections of impact case studies. A large-scale topic modeling of the REF 2014 impact case studies found subject differences in the types of impact reflected in them. For instance, in medical and biological sciences (Panel A), about 20% of the case studies related to the topic "Clinical guidance," whereas in the arts and humanities (Panel D) the most common topic was "Media" (26%) (King's College London and Digital Science, 2015). Another text mining analysis of REF 2014 impact case studies used seven categories (People, Economic, Reach, Significance, Prestige, Health, and Environment) to identify quantitative indicators of impact, finding that sentences matching the categories People (35%) and Economic (30%) were the most common (Parks et al., 2018). A further study classified the words in two sections ("Summary of impact" and "Details of impact") of the impact case studies into six categories: Education (22.8%), Public engagement (17%), Environmental and energy solutions (17.7%), Enterprise (11.8%), Policy (17.1%), and Clinical uses (13.7%). Differences between broad disciplines in types of impact were also identified. For instance, in the Social Sciences (Panel C) over one-third (34%) of the identified impact types were classified as Museums and cultural heritage, whereas in the Life Sciences (Panel A) about half of the impact types were categorized as Public health policy (Terämä, Smallman et al., 2016).

2.2. Content Analyses of REF 2014 Impact Case Studies

Several content analyses have used human coders to classify aspects of the REF 2014 impact case studies. In terms of the types of documents cited, most Business and Management case studies corroborated impact with at least one Testimonial (80%) or Project report (78%), compared with Websites (30%) or Media (26%) (Hughes, Webber, & O'Regan, 2019).

The types of narrative impact claims found have differed greatly between disciplines. An analysis of the REF impact case studies from one university faculty in Health and Applied Sciences (n = 18) found impacts on Policy (e.g., policy reports and guidelines), Specific information and advice (e.g., online materials or toolkits), Research field (e.g., clinical trial procedures), and Patient interventions, protocols or standards of care (Wilkinson, 2019). For 162 impact case studies submitted to the Public Health, Health Services and Primary Care sub-panel, three quarters (75%) had impacts on New or revised clinical guidelines and more than half influenced International, national or local policy (54%) or changed Clinical or public health practice (52%) (Greenhalgh & Fahy, 2015). For 194 REF 2014 impact case studies in Business and Management, impact claims mentioned "Specific actions by practitioners or policy-makers" (93%), "Specific and quantifiable results" (43%), "Indirect influence on the public" (31%), or "Direct influence on the public" (4%) (Hughes et al., 2019). One study used a different approach to select case studies, identifying 1,309 relevant impact case studies through selected Leadership, Governance, and Management keywords. Their most common impacts related to Government policy (52%), Training (47%), Impact on understanding (e.g., awareness, attitude, or behaviors) (39%), and Strategy (e.g., knowledge transfer, organizational development, or performance) (37%) (Morrow, Goreham, & Ross, 2017).

Different types of evidence can be presented to justify impact claims. Most of the 63 Arts REF impact case studies contained evidence of the number of people who attended an event (73%). Other common types of evidence were implementing policy or influencing policymakers, industry, or other activities (60%), media coverage (52%), the number of events in a festival or other relevant cultural program (52%), and benefit to artists, curators, and cultural institutions (51%). The study argued that it is particularly challenging to provide evidence for artistic impacts in the REF because it requires looking at the opinions or behaviors of a wide range of audiences (Brook, 2018).

Some content analyses have examined the sources of evidence cited. For 46 cancer trials impact case studies, most (93%) of the supporting evidence was from either clinical guidelines (e.g., National Institute for Health and Care Excellence, National Comprehensive Cancer Network, or European Society for Medical Oncology) or trial research published by medical journals (e.g., The Lancet, Journal of Clinical Oncology, New England Journal of Medicine) (Hanna, Gatting et al., 2020). Another analysis of 25 Library and Information Science (LIS) case studies found that the most frequent types of impact evidence identified were about Cultural and heritage preservation, Historical archives, and Informing government policy. The categories Workers, Policymakers, Companies/businesses, and Governments were most frequently mentioned as research beneficiaries (Marcella, Lockerbie, & Bloice, 2016).

2.3. Alternative Sources for Assessing Wider Impacts of REF Case Studies

Alternative indicators might help to evidence the societal impacts of publications submitted as research outputs or referenced in impact case studies within REF 2014. One study identified mentions of social media platforms in REF 2014 impact case studies (all sections) through 42 terms, finding that blogs (52%), podcasts (21%), and YouTube (25.6%) were more commonly mentioned in Panel D case studies (Arts and Humanities) than in the other main panels. In Panel A (Medicine, Health, and Life Sciences), however, about a quarter (23.7%) of social media mentions were for YouTube, whereas Google Scholar (46%) was commonly referenced in Panel B (Physical and Mathematical Sciences), despite being a primarily academic source: It was sometimes used to evidence the credentials of the researcher or the scholarly uptake of the research, even though these were not assessed (e.g., https://impact.ref.ac.uk/casestudies/CaseStudy.aspx?Id=20952, https://impact.ref.ac.uk/casestudies/CaseStudy.aspx?Id=938). In Panel C (Social Sciences), blogs (about 40%) were most common (Jordan & Carrigan, 2018). Another investigation gathered six altmetric indicators (Twitter, Wikipedia, Facebook, blogs, news, and policy-related documents) for publications (with DOIs) submitted either as REF 2014 research outputs or cited in impact case studies to support the underpinning research. It found that the publications referenced in impact case studies tended to be mentioned more often in altmetric sources than publications submitted as REF research outputs (Bornmann, Haunschild, & Adams, 2019). Although an early study of REF 2014 case studies found no obvious association between altmetric scores and REF impact scores (Ravenscroft, Liakata et al., 2017), a later investigation found a significant correlation between altmetric scores and expert peer review ratings of nonacademic impacts for publications (with DOIs) cited in the "Underpinning Research" sections of 1,469 REF 2014 impact case studies submitted under main panel B (Wooldridge & King, 2019).

It seems that only one study has assessed the frequency of URL citations across all impact case studies, reporting the 40 most cited websites (Digital Science, 2016, p. 30, Annex 4). That report did not classify the URLs by type and did not use manual checking to exclude URLs mentioned for other reasons (e.g., archived copies of submitted REF impact case studies from https://www.wiki.ed.ac.uk/ and https://apps.lse.ac.uk/). It also did not merge related cited URLs into a single category (e.g., URL citations to all newspapers, news agencies, and similar sources under the category "News and media"). Thus, it did not give an overall picture of the types of URL cited in REF case studies.

The objective is to identify the main types of websites cited in REF 2014 impact case studies. This will shed light on how academics in all fields use online sources differently to reflect the nonacademic impacts of their research. The following questions address different aspects of this.

  1. Which types of website (e.g., news and media, governmental, clinical guidelines, or social media) are cited in UK REF impact case studies to evidence research impacts?

  2. Which websites (e.g., BBC, UK Parliament, or NHS) are most frequently cited in the impact case studies in all broad fields and all 36 Units of Assessment?

  3. Are there disciplinary differences in the answers to the above questions?

4.1. The Data Set of REF 2014 Impact Case Study URL Citations

The metadata and full text of all 6,637 REF 2014 case studies1 were downloaded from the main REF website2 in Excel format. Note that of the 6,975 impact case studies submitted to REF 2014, 6,637 were downloadable from the REF database due to reuse and licensing arrangements (https://impact.ref.ac.uk/casestudies/FAQ.aspx). A program was designed and added to the free Webometric Analyst software (see https://lexiurl.wlv.ac.uk/, "Extract URLs from Impact Case Studies" option under "Citations") to automatically identify and extract URL citations from these impact case studies. The term "URL citation" in this article refers to mentions of URLs in the "Sources to corroborate the impact" section of impact case studies (see Figure 1). Only this section of the case studies was used for analysis because researchers "should list sufficient sources that could corroborate key claims made about the impact of the unit's research" in it, such as "reports, reviews, web links or other documented sources of information in the public domain" (Research Excellence Framework, 2014, p. 54). The official case study template recommended an indicative maximum of 10 references in this section (Research Excellence Framework, 2014). The software extracted 32,196 raw URLs from all impact case studies based on mentions of http://, https://, or www. anywhere in the references to corroborate the impact.
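The extraction rule above can be illustrated with a short script. The following is a minimal sketch, not the Webometric Analyst implementation, assuming the text of a "Sources to corroborate the impact" section is available as a plain string:

```python
import re

# Rough approximation of the rule described above: capture anything starting
# with http://, https://, or www. up to the next whitespace or closing bracket.
URL_PATTERN = re.compile(r'(?:https?://|www\.)[^\s<>"\')\]]+', re.IGNORECASE)

def extract_url_citations(corroboration_text: str) -> list:
    """Return raw URL citations from a 'Sources to corroborate the impact' section."""
    urls = URL_PATTERN.findall(corroboration_text)
    # Trim trailing punctuation that often clings to URLs in prose.
    return [u.rstrip('.,;:') for u in urls]

if __name__ == "__main__":
    sample = ("1. NICE guidance: https://www.nice.org.uk/guidance/cg71 "
              "2. Media coverage: www.bbc.co.uk/news/health-18366437.")
    print(extract_url_citations(sample))
```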

Figure 1. Examples of URLs cited in an impact case study from the section "Sources to corroborate the impact."

4.1.1. Data cleaning

An initial check of the 32,196 extracted URLs showed that 1,929 were from the link shortening websites tinyurl.com (1,055) or bit.ly (874). Hence, a program in Webometric Analyst was used to identify the redirected URLs (see "Get redirected URLs" under the "Service" menu), resolving 1,871 (97%) of them to the ultimately cited URLs, which were used for analysis. Manual checks of URLs containing the terms "REF," "impact," or "case study" revealed that 1,059 of the extracted URLs (mostly from a few universities) were either archived copies of submitted REF impact case studies (e.g., https://ref2014.inf.ed.ac.uk/impact/) or inaccessible uploaded files and other information about the submitted case studies (e.g., https://apps.lse.ac.uk/impact/download/file/1194), and these were excluded from the study. To obtain unique and reliable cited URLs for analysis, duplicate URLs within case studies were also excluded (e.g., see https://impact.ref.ac.uk/casestudies/CaseStudy.aspx?Id=38782), giving a final total of 29,830 URLs from all 36 UoAs (the data are available via https://doi.org/10.6084/m9.figshare.14447295.v1).
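As an illustration of this cleaning step, the following minimal sketch (not the Webometric Analyst "Get redirected URLs" routine) resolves shortened links and removes duplicates within one case study's reference list; the shortener list and error handling shown here are assumptions for illustration only:

```python
import requests

SHORTENER_DOMAINS = ("tinyurl.com", "bit.ly")  # the two shorteners found in the data

def resolve_if_shortened(url: str, timeout: float = 10.0) -> str:
    """Follow redirects for link-shortener URLs and return the final target URL."""
    if not any(domain in url for domain in SHORTENER_DOMAINS):
        return url
    try:
        response = requests.head(url, allow_redirects=True, timeout=timeout)
        return response.url  # the ultimately cited URL after all redirects
    except requests.RequestException:
        return url  # keep the original if resolution fails (about 3% in the study)

def clean_case_study_urls(raw_urls: list) -> list:
    """Resolve shorteners and drop duplicate URLs within one case study."""
    seen, unique = set(), []
    for url in (resolve_if_shortened(u) for u in raw_urls):
        if url not in seen:
            seen.add(url)
            unique.append(url)
    return unique
```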

4.2. Semiautomatic Classification of the Websites of the Cited URLs

An initial URL classification scheme was developed by checking the most cited websites (i.e., domain names or domain name endings) of the URL citations from all UoAs. For instance, manual checks showed that many URLs cited in impact case studies in the arts, humanities, and social sciences were from news and media (e.g., BBC News, the Guardian, and the Daily Telegraph) or governmental websites (e.g., the UK government and the UK parliament). In Clinical and Applied Medicine, healthcare organizations (e.g., the National Health Service) and clinical guidelines (e.g., NICE clinical guidelines) commonly documented research impacts. In Science and Engineering subject areas, commercial or business websites (e.g., Rolls-Royce or Apple) also frequently evidenced societal impacts. The initial categories were subsequently modified to include new types of websites identified during the classification process. For instance, a single general category was first assigned for social media websites, but because many cited URLs pointed to online videos, it was split into two: Social media and blogs (e.g., Twitter, Facebook, WordPress) and Video and photo sharing websites (e.g., YouTube, Vimeo, Flickr). Moreover, in the arts and music a new category was added for arts-related websites that could not be classified elsewhere (e.g., music, film, television, galleries, and museums). The URLs cited by impact case studies were eventually classified into 18 categories and eight broad areas, as shown below.

4.2.1. Initial automatic classification of cited URL websites

Because it was not practical to manually classify the websites of all 29,830 cited URLs extracted from the impact case studies, a systematic method was developed to automatically match the domains of the cited URLs (e.g., https://www.bbc.co.uk/news/health-18366437) against a manually curated list of relevant websites in predefined categories (e.g., bbc.co.uk in the category News and media). The relevant URLs for each category were identified and extracted from different sources such as DMOZ—The Directory of the Web (https://dmoz-odp.org/), Wikipedia lists of websites (e.g., https://en.wikipedia.org/wiki/List_of_intergovernmental_organizations), and top visited websites listed by alexa.com in different categories (e.g., https://www.alexa.com/topsites/category/Top/Reference/Encyclopedias/). Additional searches were carried out to identify reliable lists of websites for each category, such as the Webometrics Ranking of World Universities (https://www.webometrics.info/) for university websites worldwide, a list of UK healthcare organizations published by the NHS (https://www.england.nhs.uk/tis/our-members/), Ulrich’s Periodicals Directory (https://www.ulrichsweb.com/) for academic journal websites, the Directory of Intellectual Property Offices (https://www.wipo.int/directory/en/urls.jsp) for URL citations to patents, or National and International Clinical Guidelines Organizations (https://www.openclinical.org/guidelines.html/) for clinical guidelines. A program was written and added to Webometric Analyst to match lists of domain names in one category against the URLs from the impact case studies (see “Copy all URLs from long results files that match list of domain names” option under “Utilities”).
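The matching logic can be sketched as follows. The category lists below are deliberately tiny illustrations (the study compiled much larger curated lists from the directories mentioned above), so this is a sketch of the approach rather than the Webometric Analyst implementation:

```python
from urllib.parse import urlparse

# Illustrative, incomplete curated lists keyed by category name.
CATEGORY_DOMAINS = {
    "News and media": {"bbc.co.uk", "theguardian.com", "telegraph.co.uk", "reuters.com"},
    "UK government": {"gov.uk", "nationalarchives.gov.uk"},
    "UK Parliament": {"parliament.uk", "parliamentlive.tv"},
    "Clinical guidelines": {"nice.org.uk"},
    "Video and photo sharing": {"youtube.com", "vimeo.com", "flickr.com"},
}

def classify_url(url: str) -> str:
    """Match a cited URL's host against curated domain lists by domain suffix."""
    host = urlparse(url if "://" in url else "http://" + url).netloc.lower()
    if host.startswith("www."):
        host = host[4:]
    for category, domains in CATEGORY_DOMAINS.items():
        if any(host == d or host.endswith("." + d) for d in domains):
            return category
    return "Not classified"  # left for manual checking or reported as unclassified

print(classify_url("https://www.bbc.co.uk/news/health-18366437"))  # -> News and media
```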

The systematic classification of the cited URLs may be useful to assess how academics are documenting research impacts in terms of the types of online sources but does not provide contextual evidence about how the cited sources have been used—this needs manual content analysis.

  1. Arts: This broad category includes URLs of art-related websites, such as museums (e.g., the British Museum), galleries (Tate Modern or National Portrait Gallery), film and television (e.g., the British Film Institute or British Academy of Film and Television Arts), theater (e.g., Royal National Theatre), music (e.g., ukmusic.org), dance (e.g., the National Dance CATs), or other relevant websites, such as the Royal Academy of Arts (royalacademy.org.uk), Arts Council England (artscouncil.org.uk), the Internet Movie Database (imdb.com), or The Stage magazine (thestage.co.uk).

  2. Governmental websites: URLs of governmental and parliamentary websites were classified in this broad category.

    • 2a. UK government: This subcategory contains URLs of the main UK government websites, such as the main GOV.UK website (www.gov.uk) and the UK Government Web Archive (nationalarchives.gov.uk), of ministerial departments, such as the Department of Health & Social Care (dh.gov.uk) and the Department for Education (education.gov.uk)3, and of local authorities, such as Birmingham City Council (birmingham.gov.uk).

    • 2b. UK Parliament: URLs from impact case studies mentioning the UK parliament and other relevant parliamentary sources, such as parliamentlive.tv, were classified in this category.

    • 2c. Non-UK governments or parliaments: This includes any other URLs from non-UK governmental or parliamentary websites, such as the U.S. Environmental Protection Agency (epa.gov), the Parliament of Canada (parl.gc.ca), and the U.S. State Department (state.gov).

  3. Organizational websites (other): This includes URLs from organizations not classified elsewhere.

    • 3a. International organizations (including EU): This subcategory includes URLs of international organizations such as the World Health Organization (who.int), the United Nations (un.org), and the World Bank (worldbank.org). Citations to the European Union website (europa.eu) were also classified under this category because of its intergovernmental structure spanning 27 member states.

    • 3b. UK healthcare organizations: URLs of UK medical and healthcare organizations, charities, or nonprofit organizations were classified under this category, including the UK National Health Service (NHS) (nhs.uk), Cancer Research UK (cancerresearchuk.org), and the British Diabetic Association (diabetes.org.uk). For example, 558 URL citations targeted NHS websites.

    • 3c. UK organizations (nonhealthcare): This contains URLs of other UK organizations, charities, or nonprofit organizations, such as the Royal Society (royalsociety.org), English Heritage (english-heritage.org.uk), or the British Council (britishcouncil.org).

    • 3d. Other organizations: This contains URLs of other non-UK organizational websites not classified above (3a, 3b, and 3c) or in other classes (1, 2, 4–8), such as the Organization for Security and Co-operation in Europe (osce.org) and the American Library Association (ala.org).

  4. News and media: This includes the URLs of newspapers (e.g., guardian.co.uk, telegraph.co.uk), news agencies (e.g., bbc.co.uk, reuters.com) and other news sources (e.g., businesswire.com, channel4.com).

  5. Commercial and business: URLs of commercial, product, or technology websites were classified into this category, including apple.com, rolls-royce.com, oracle.com and tripadvisor.co.uk.

  6. Scholarly publications: This category reflects URLs in scholarly or research communication systems.

  7. Universities: URLs of university websites not classified elsewhere are included in this category.

  8. Social networking websites: This contains URL citations from impact case studies to blogs and social networking websites.

    • 8a. Social media and blogs: This includes URLs of social media websites (e.g., Facebook, Twitter, Tumblr, LinkedIn) and blogs (e.g., WordPress and Blogspot).

    • 8b. Video and photo sharing websites: This includes URLs of video or photo sharing websites.

4.2.2. Manual checking of automatically classified URL websites

To check that the URL citations were reasonably classified into the predefined categories by the above automatic domain name-based method, the 20 most cited URLs in each of the 36 UoAs from the initial systematic classification were manually checked and reclassified if necessary (20 × 36 = 720 URLs). For instance, URLs for the National Audit Office (nao.org.uk) were first classified as UK organization URL citations, but the manual checks revealed that this organization is part of the UK government sector. The manual checking was based on visiting the websites and reading relevant sections, including "about us," "our mission," or "contact us," if necessary. Nevertheless, about 12% (3,545 out of 29,830) of the URLs cited in the case studies remained unclassified after the manual checking phase. Unclassified URLs were more common in UoA 12—Aeronautical, Mechanical, Chemical and Manufacturing Engineering (21.4%), UoA 15—General Engineering (19.5%), UoA 11—Computer Science and Informatics (19.2%), and UoA 13—Electrical and Electronic Engineering (17.9%) than in UoA 2—Public Health, Health Services and Primary Care (5.6%), UoA 22—Social Work and Social Policy (5.7%), UoA 18—Economics and Econometrics (6.0%), and UoA 20—Law (6.2%). This may be because engineering researchers cite a wide range of industry, business, or manufacturing company websites as evidence of nonacademic impacts. The 3,545 unclassified URLs were spread across 3,028 different websites, suggesting that each was cited only rarely in impact case studies. For instance, the most common unclassified websites were docs.google.com (cited nine times), thefreelibrary.com (cited seven times), and scribd.com (cited six times). Table 1 gives examples of website reclassifications from this stage, and a minimal sketch of how such a per-UoA top-20 list could be generated follows the table.

Table 1.

Examples of reclassified websites based on manual checks of the top 20 most cited URLs in the impact case studies

Name | URL domain name | Initial classification | Reclassification
Public Health England | www.hpa.org.uk | UK healthcare organization | UK government websites
British Medical Association | www.bma.org.uk | UK organization (nonhealthcare) | UK healthcare organization
People's Trust for Endangered Species | www.ptes.org | Other organizations | UK organization (nonhealthcare)
European Society of Human Genetics | www.eshg.org | Other organizations | International organization (including EU)
Royal Exchange Theatre | www.royalexchange.co.uk | Commercial, industrial and business | Artistic (e.g., music, dance, and film)
Farmers Weekly | www.fwi.co.uk | Commercial, industrial and business | News and media
The Royal Ballet School | www.royalballetschool.org.uk | UK organization (nonhealthcare) | Artistic (e.g., music, dance, and film)
The Nuffield Council on Bioethics | www.nuffieldbioethics.org | Other organizations | UK healthcare organization
The Business of Photonics | optics.org | Other organizations | News and media
Royal Society of Arts | www.thersa.org | Other organizations | UK organization (nonhealthcare)
Internet Engineering Task Force | www.ietf.org | Other organizations | International organization (including EU)
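The per-UoA list of the 20 most cited websites used for this manual check could be generated along the following lines. This is a hedged sketch assuming a tidy table with one cited URL per row and illustrative column names ("uoa" and "url") that are not taken from the published data set:

```python
import pandas as pd
from urllib.parse import urlparse

def website(url: str) -> str:
    """Reduce a URL to its host (without a leading www.) for counting purposes."""
    host = urlparse(url if "://" in url else "http://" + url).netloc.lower()
    return host[4:] if host.startswith("www.") else host

def top_sites_for_review(df: pd.DataFrame, n: int = 20) -> pd.DataFrame:
    """Return the n most cited websites per UoA as a manual-checking worksheet."""
    counts = (df.assign(site=df["url"].map(website))
                .groupby(["uoa", "site"]).size()
                .rename("citations").reset_index())
    return (counts.sort_values(["uoa", "citations"], ascending=[True, False])
                  .groupby("uoa").head(n))
```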

4.2.3. Broad subject classifications

The 36 REF UoAs were combined into seven broad subjects for disciplinary analyses (Table 2). The REF grouping of UoAs into four main panels (A–D) was modified to present results within more uniform broad subject areas. For instance, the artistic fields (including Art and Design: History, Practice and Theory and Music, Drama, Dance, and Performing Arts) were combined into the broad subject Arts, and the related humanities fields (e.g., English Language and Literature, History, Philosophy, and Law) into Humanities. Similarly, the engineering, hard science, and medical fields were merged into Engineering and Computer Science, Hard Sciences, and Medical Sciences and Healthcare, respectively (a minimal mapping sketch follows Table 2).

Table 2.

Broad subject groupings for the 36 REF UoAs

Broad subject | Units of assessment
Arts | Art and Design History, Practice, and Theory; Music, Drama, Dance, and Performing Arts
Biological and Agricultural Sciences | Agriculture, Veterinary, and Food Science; Biological Sciences
Engineering and Computer Science | Aeronautical, Mechanical, Chemical, and Manufacturing Engineering; Civil and Construction Engineering; Computer Science and Informatics; Electrical and Electronic Engineering, Metallurgy, and Materials; General Engineering
Hard Sciences | Chemistry; Earth Systems and Environmental Sciences; Physics; Mathematical Sciences
Humanities | Anthropology and Development Studies; Area Studies; Classics; Communication, Cultural and Media Studies, Library and Information Management; English Language and Literature; History; Law; Modern Languages and Linguistics; Philosophy; Theology and Religious Studies
Medical Sciences and Healthcare | Allied Health Professions, Dentistry, Nursing, and Pharmacy; Clinical Medicine; Psychology, Psychiatry, and Neuroscience; Public Health, Health Services and Primary Care
Social Sciences | Architecture, Built Environment, and Planning; Business and Management Studies; Economics and Econometrics; Education; Geography, Environmental Studies, and Archaeology; Politics and International Studies; Social Work and Social Policy; Sociology; Sport and Exercise Sciences, Leisure, and Tourism
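As referenced above, the Table 2 grouping can be applied as a simple lookup; the fragment below covers only a handful of UoAs for illustration and is not the full mapping:

```python
# Illustrative fragment of the Table 2 grouping; the full mapping covers all 36 UoAs.
UOA_TO_BROAD_SUBJECT = {
    "Music, Drama, Dance, and Performing Arts": "Arts",
    "Biological Sciences": "Biological and Agricultural Sciences",
    "Computer Science and Informatics": "Engineering and Computer Science",
    "Physics": "Hard Sciences",
    "History": "Humanities",
    "Clinical Medicine": "Medical Sciences and Healthcare",
    "Economics and Econometrics": "Social Sciences",
}

def broad_subject(uoa_name: str) -> str:
    """Map a REF UoA name to its broad subject grouping (Table 2)."""
    return UOA_TO_BROAD_SUBJECT.get(uoa_name, "Unmapped")
```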

Considering the indicative maximum of 10 references to support wider impacts of research in the REF 2014 template (Research Excellence Framework, 2014), it is unsurprising that the average number of URL citations is less than seven for all UoAs (Figure 2). Nevertheless, impact case studies in public health and the other allied health professions tended to cite more online sources on average (6.0 to 6.1) than those in other fields, such as most engineering subjects (2.6 to 3.8). This may reflect health information being increasingly public and online, in contrast to the documentation of engineering research innovations.

Figure 2. Average number of unique cited URLs in the UK REF impact case studies in 2014 across 36 UoAs, after data cleaning.

About a third of the cited URLs in the impact case studies were for other organizational websites (30%), with many others directed to news and media (19%) and government (17%) websites. Nevertheless, there are clear disciplinary differences (Figure 3). For instance, in Medical Sciences and Biological and Agricultural Sciences, URL citations of organizational websites were more numerous (40% and 32%, respectively), whereas in the Arts and Humanities, citations to news and media (28% and 26% respectively) were more common. In Engineering and Computer Science, more citations to commercial and business websites (26%) were identified. This confirms that broad fields tend to cite different types of online sources to evidence the impact of their research.
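The percentage shares reported here and in Figures 3 and 4 amount to a normalized cross-tabulation of the classified URLs; a minimal sketch, assuming illustrative column names ("broad_subject" and "category") rather than the published data layout:

```python
import pandas as pd

def category_shares(df: pd.DataFrame) -> pd.DataFrame:
    """Percentage share of each URL category within each broad subject."""
    return (pd.crosstab(df["broad_subject"], df["category"], normalize="index")
              .mul(100)
              .round(1))
```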

Figure 3. Percentage shares of the cited URLs in the impact case studies based on eight broad categories across seven fields.

Figure 4 gives more fine-grained details about the types of websites cited in the impact case studies. For instance, artistic websites (e.g., music, drama, dance, and performing arts) were commonly cited in Arts impact case studies (14.2%), and citations to the UK government and the UK parliament were most common in Social Science impact case studies (20.3% and 6.5%, respectively). UK healthcare organizations (e.g., the NHS) and clinical guidelines (e.g., NICE) were cited more in Medical Sciences and Healthcare impact case studies (15% and 12%, respectively) than in other fields. Perhaps surprisingly, citations to social media websites were relatively common in Humanities (7%) and Arts (5%) impact case studies.

Figure 4. Percentage shares of specific types of cited websites in the impact case studies based on 18 categories across seven fields.

The 10 websites receiving the most citations from all impact case studies also vary in prevalence between broad subjects (Figure 5). In the Arts and Humanities, 6% and 5% of citations from the impact case studies, respectively, were to the BBC website, and 3% and 4% to the Guardian. YouTube videos were also cited more in the Arts (3.5%) and Humanities (2.8%) than in other subjects. Citations to the UK parliament (5%) were more common in the Social Sciences, and citations to the UK NHS (6.7%) and the National Institute for Health and Care Excellence (5.4%) in Medical Sciences and Healthcare. This suggests that discipline-relevant online sources can be used to reflect the societal impacts of research. Tables S1–S4 in the Supplementary Information report similar information for the 36 UoAs.

Figure 5. Percentage shares of the top 10 cited websites in all impact case studies across seven fields.

The results show, for the first time, the main types of website cited in all REF impact case studies. The method can be used to identify the most common online sources provided for societal impact claims in the new REF 2021, informing evaluators about disciplinary norms when judging claims of societal benefit in their own fields. This might be most useful in the arts, humanities, and social sciences, where academics may use nonstandard online sources such as news stories, multimedia information, and social media to evidence research impact. Because many online sources used as evidence of nonacademic impacts are not covered by altmetric platforms, future tools may capture and analyze societal impacts from a wider range of online sources. The results show substantial disciplinary differences in the websites cited, as might have been suspected from prior evidence of disciplinary differences in the types of impact claim made (e.g., Hanna et al., 2020; Marcella et al., 2016). There are many ways in which academics may cite online evidence of the societal impacts of their research in REF impact case studies. In this section, examples of the most frequently cited URLs are given across subjects to provide richer insights into the main quantitative findings above.

In main Panel A (Medicine, health and life sciences), the most common types of claimed evidence were from clinical guidelines or trials (nice.org.uk or clinicaltrials.gov), followed by the World Health Organization (who.int), the UK NHS (nhs.uk), and the UK government (defra.gov.uk, gov.uk). For instance, about a third (32.9%) of the claimed online evidence in Public Health, Health Services, and Primary Care, over a quarter (26.4%) in Clinical Medicine, and less than a fifth (18.1%) in Allied Health Professions, Dentistry, Nursing, and Pharmacy came from the above web sources. In Clinical Medicine, a wide range of clinical documents were used to demonstrate the benefits of medical research, such as changes in drug labels and guidelines (fda.gov/drugs/postmarket-drug-safety-information-patients-and-providers/information-abacavir-marketed-ziagen-and-abacavir-containing-medications), inclusion in NICE clinical guidelines as treatment evidence (https://www.nice.org.uk/guidance/cg71), or citation in WHO guidelines for health policy (who.int/nutrition/publications/guidelines/potassium_intake_printversion.pdf).

In Agriculture, Veterinary, and Food Science, 8.5% of the cited URLs were from European Union websites, including the European Food Safety Authority (efsa.europa.eu), the European Medicines Agency (ema.europa.eu), and other EU pages on food, farming, and fisheries. In Biological Sciences, some cited URLs (4%) were from ClinicalTrials.gov, many of them records of clinical trials testing the safety or efficacy of new drugs or treatments (e.g., clinicaltrials.gov/ct2/show/nct01844986 and clinicaltrials.gov/ct2/show/nct01712074).

In main Panel B (Physical sciences, engineering and mathematics), a combination of news, online videos, and specialized websites was frequently used as impact evidence in case studies. For instance, in Earth Systems and Environmental Sciences, 5% of the cited URLs were from the environment, marine, food, or fishery sections of the European Union website (europa.eu). Similarly, about 3% of the cited URLs in Civil and Construction Engineering and in Chemistry, and about 2% in General Engineering, were from the Office of Rail and Road, Ilika Technologies (a pioneer in solid-state battery technology), and Rolls-Royce, respectively. In Mathematical Sciences (3.5%) and Physics (3%), some cited URLs were YouTube videos, such as a 3D print of a mathematical sculpture (youtube.com/watch?v=MyUfAs30yZk), the origin of the Handbook of Mathematical Functions (youtube.com/watch?v=Exf02R1FnXY), a TEDx talk about the universe (youtube.com/watch?v=oCaR1uE3OV8), and a video about a new type of LCD display screen (youtube.com/watch?v=DdhYPL87LZQ). In Electrical and Electronic Engineering, Metallurgy, and Materials (5%) and in Computer Science and Informatics (4%), BBC News URLs were cited, for example covering the development of Europe's Galileo satellite-navigation system (bbc.co.uk/news/science-environment-17755205), "musical prescriptions" for patients (bbc.co.uk/news/uk-scotland-glasgow-west-11233452), and new software to help children with communication problems speak better (news.bbc.co.uk/1/hi/health/8084422.stm). In Aeronautical, Mechanical, Chemical, and Manufacturing Engineering, more than a third (38%) of the cited URLs in case studies were to technology companies, standards, or patents.

In main Panel C (Social sciences), the most commonly cited URLs in the impact case studies were for the UK parliament (including its Scottish, Welsh, and Northern Ireland counterparts) and UK government websites. For instance, 12% of the URLs in Politics and International Studies, 10.2% in Law, and 6.7% in Business and Management Studies were for UK parliament websites. In Education, Sociology, and Economics and Econometrics, reports by the House of Commons about "Transforming education outside the classroom" (publications.parliament.uk/pa/cm200910/cmselect/cmchilsch/418/418.pdf), "Domestic violence, forced marriage and 'honour'-based violence" (publications.parliament.uk/pa/cm200708/cmselect/cmhaff/263/263i.pdf), and "Principles of tax policy" (publications.parliament.uk/pa/cm201011/cmselect/cmtreasy/753/753.pdf) were cited as influences on policy-making. Moreover, UK government websites (ending with the domain gov.uk) were also highly cited in most social science fields, such as Architecture, Built Environment, and Planning (26.6%), Economics and Econometrics (24.9%), and Social Work and Social Policy (22.7%). The URLs cited from UK government websites may include a range of different content, such as press releases, reports, regulations, statistics, policies, guidelines, analyses, white papers, parliamentary transcripts, or other publications.

In main Panel D (Arts and humanities), the most frequently cited sources were news stories, online videos, and social media and blogs. For instance, in History (25.9%), Theology and Religious Studies (24.5%), and English Language and Literature (22.7%), about a quarter of the cited URLs were to news and media sources such as the BBC, the Guardian, and the Daily Telegraph, with shares in the other Panel D subjects ranging from 14.4% in Classics to 20% in Art and Design History, Practice, and Theory. In English Language and Literature and History, many book reviews were mentioned in the impact case studies, mostly published by the Guardian (theguardian.com/books/2011/dec/07/britains-empire-richard-gott-review), the Daily Telegraph (telegraph.co.uk/culture/books/books-life/7087391/The-Long-Song-by-Andrea-Levy-review.html), and the Independent (independent.co.uk/arts-entertainment/books/reviews/the-arabs-and-the-holocaust-the-arabisraeli-war-of-narratives-by-gilbert-achcar-2305801.html). The highest shares of cited YouTube URLs across all REF subjects were in Classics (7.7%), Music, Drama, Dance, and Performing Arts (5.6%), Area Studies (5.3%), and Modern Languages and Linguistics (4.9%), indicating that multimedia information is important for evidencing impacts in these areas. Examples of the importance of multimedia include a short film about Greek Comedy in Classics (youtube.com/watch?v=H-BvMbfkxcc), an audio lecture in Philosophy (bbc.co.uk/programmes/b00xnxl4), and a picture of international festival participants (festivalpoesianicaragua.com/wp-content/uploads/poesiagranada-319.jpg), all evidencing humanities research impacts. In Modern Languages and Linguistics (7.7%) and English Language and Literature (7.2%), social media sites (WordPress, Blogspot, Facebook, and Twitter) were cited relatively more often than in other subjects.

In Art and Design History, Practice, and Theory, many impact case studies cited information from galleries or museums, such as information about an exhibition (tate.org.uk/whats-on/tate-britain/exhibition/turner-and-masters) or an artistic object (tate.org.uk/art/research-publications/gaudier-brzeska-wrestlers) on the Tate website. Other relevant artistic information was also cited, such as a review of a painting exhibition (youtube.com/watch?v=mlsn4Za5-as) and specific galleries or exhibitions (e.g., whitechapelgallery.org/exhibitions/john-latham-anarchive/). In Music, Drama, Dance, and Performing Arts, 6% of the cited URLs were for online videos, such as a theatre play preview (youtube.com/watch?v=br9tafybBXM), music performances at a festival (youtube.com/watch?v=-Z6H8jpd1fU), an interview with a Professor of Music (youtube.com/watch?v=D1EUurZ4s98), a commercial racing game soundtrack (Need for Speed Shift 2: Unleashed, youtube.com/watch?v=mB6X3LGIT30), and a computer-generated light and sound music performance (youtube.com/watch?v=cysjxHzCoh0).

6.1. Limitations

This study has several limitations. The cited URLs studied here only include online sources used to corroborate impact, ignoring all cited offline or unpublished sources (e.g., letters, emails, reports, statements) that may give different insights into the types of evidence used. Because there is no practical way to manually classify such a large number of URLs, an ad hoc method was used to categorize the broad types of the cited URLs. Although the 720 most cited URLs from the initial systematic classification were manually checked and reclassified when necessary, about 12% of the URLs cited in the case studies were not classified. This was particularly common in Engineering and Computer Science, where a range of different commercial websites could be claimed as evidence of the nonacademic benefits of engineering research. We could not find a practical method to classify these remaining websites, and the less commonly cited URLs may well give a different perspective. Moreover, this study did not assess the contexts or motivations for citing URLs, so it is not clear how the online sources were used by the impact case studies. For instance, a news story cited by a clinical medicine case study might reflect a publicity claim (i.e., the news story is the impact) or may evidence uptake of an invention by a company (i.e., the news story reports the impact). Finally, the classification of the 20 most cited URLs in each of the 36 UoAs was not crosschecked by a second classifier, so there could be disagreement about the characteristics of some websites, such as the National Audit Office (nao.org.uk), an independent watchdog that scrutinizes public spending for Parliament.

The results show that a wide range of nonacademic online sources have been used to corroborate the benefits of research, including news stories, online videos, government publications, parliamentary records, and social media websites, although there were disciplinary differences. In answer to the first research question, clinical guidelines and UK healthcare organizations were most frequently cited in Medical and Health Sciences, whereas news and media were more commonly mentioned in the Arts and Humanities, and government and parliamentary publications in the Social Sciences. In answer to the second and third research questions, there are large disciplinary differences in the websites most commonly cited in impact case studies across the 36 REF UoAs (Supplementary Information, Tables S1–S4). For instance, NICE clinical guidelines in Clinical Medicine (8.3%), the WHO in Public Health, Health Services, and Primary Care (10.5%), and European Union websites in Agriculture, Veterinary, and Food Science (8.5%) and in Economics and Econometrics (5.9%) were frequently mentioned in case studies. Similarly, in History many URLs were from BBC News (9.7%) and the Guardian (4%), whereas in Law the UK Parliament (10.2%) and the Ministry of Justice (5.4%) were more frequently mentioned. In Chemistry, Physics, and Mathematical Sciences, about 3% of the URL citations were to YouTube, indicating widespread disciplinary differences in the online sources used to evidence the wider impacts of research.

In terms of practical implications for people compiling impact case studies or making impact claims for funding applications, promotions, grant summary reports, or other governmental purposes, identifying the common sources that academics use to evidence their nonacademic impacts may point them to possible solutions. Similarly, the results may help scientometricians, research impact officers, and librarians to develop strategies and tools for capturing and analyzing societal impacts from known sources, including those that are not covered by current altmetrics platforms (e.g., Altmetric.com, Plum Analytics, Dimensions, Overton). For instance, this analysis of URLs cited in REF 2014 impact case studies showed that trials and guidelines were the most common online sources used in the medical sciences to corroborate the wider impacts of research. Hence, future altmetric tools and methods could be developed to identify mentions of research in the references of major national and international online clinical trial or medical guideline sources, as well as trial research published in medical journals (e.g., the journals Trials, The American Journal of Clinical Oncology Cancer Clinical Trials, Contemporary Clinical Trials, or Controlled Clinical Trials). Similarly, developing tools to extract mentions of research (including DOIs) from news sources, such as BBC News, which was the most common source of arts and humanities impact evidence, could help to identify research results that could be of interest to the public.
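As a hedged illustration of the last point, a tool scanning news pages or press releases for mentions of research outputs might start from a simple DOI pattern match over fetched page text; the function below is a sketch under that assumption, not a description of any existing altmetric pipeline:

```python
import re

# Conservative pattern for DOIs (a 10.xxxx prefix followed by a suffix).
DOI_PATTERN = re.compile(r'\b10\.\d{4,9}/[^\s"<>]+', re.IGNORECASE)

def find_doi_mentions(page_text: str) -> list:
    """Return candidate DOIs mentioned in a news story or press release."""
    return [doi.rstrip('.,;)') for doi in DOI_PATTERN.findall(page_text)]
```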

Kayvan Kousha: Conceptualization; Investigation; Methodology; Writing—original draft; Writing—review & editing. Mike Thelwall: Software; Writing—review & editing. Mahshid Abdoli: Investigation; Writing—review & editing.

The authors have no competing interests.

No funding was provided for this research.

The shared data provide the categorization of the 29,830 URLs cited in the REF 2014 impact case studies across the 36 UoAs and are available via https://doi.org/10.6084/m9.figshare.14447295.v1. The classification scheme for cited URLs might be used to analyze the REF 2021 impact case studies when they become available online.

1. The REF 2021 case studies will be online at some stage after 2021.

3. The internet addresses of most UK ministerial departments have not changed and are included under the main gov.uk website.

Bornmann, L., Haunschild, R., & Adams, J. (2019). Do altmetrics assess societal impact in a comparable way to case studies? An empirical test of the convergent validity of altmetrics based on data from the UK Research Excellence Framework (REF). Journal of Informetrics, 13(1), 325–340.
Brook, L. (2018). Evidencing impact from art research: Analysis of impact case studies from the REF 2014. Journal of Arts Management, Law, and Society, 48(1), 57–69.
Digital Science. (2016). Publication patterns in research underpinning impact in REF2014: A report to HEFCE by Digital Science. London, UK: Digital Science.
Dinsmore, A., Allen, L., & Dolby, K. (2014). Alternative perspectives on impact: The potential of ALMs and altmetrics to inform funders about research impact. PLOS Biology, 12(11), e1002003.
Greenhalgh, T., & Fahy, N. (2015). Research impact in the community-based health sciences: An analysis of 162 case studies from the 2014 UK Research Excellence Framework. BMC Medicine, 13(1), 232.
Guthrie, S., Krapels, J., Lichten, C. A., & Wooding, S. (2017). 100 metrics to assess and communicate the value of biomedical research: An ideas book. Rand Health Quarterly, 6(4), 14.
Hanna, C. R., Gatting, L. P., Boyd, K. A., Robb, K. A., & Jones, R. J. (2020). Evidencing the impact of cancer trials: Insights from the 2014 UK Research Excellence Framework. Trials, 21, 1–13.
Hughes, T., Webber, D., & O'Regan, N. (2019). Achieving wider impact in business and management: Analysing the case studies from REF 2014. Studies in Higher Education, 44(4), 628–642.
Jordan, K., & Carrigan, M. (2018). How was social media cited in 2014 REF Impact Case Studies? Impact of Social Sciences blog. https://eprints.lse.ac.uk/90751/1/Jordan_How-was-social-media_Author.pdf
King's College London and Digital Science. (2015). The nature, scale and beneficiaries of research impact: An initial analysis of Research Excellence Framework (REF) 2014 impact case studies. HEFCE. https://www.kcl.ac.uk/policy-institute/assets/ref-impact.pdf
Kousha, K. (2019). Web citation indicators for wider impact assessment of articles. In W. Glänzel, H. F. Moed, U. Schmoch, & M. Thelwall (Eds.), Springer handbook of science and technology indicators. Cham: Springer.
Kuruvilla, S., Mays, N., Pleasant, A., & Walt, G. (2006). Describing the impact of health research: A Research Impact Framework. BMC Health Services Research, 6(1), 1–18.
Marcella, R., Lockerbie, H., & Bloice, L. (2016). Beyond REF 2014: The impact of impact assessment on the future of information research. Journal of Information Science, 42(3), 369–385.
Morrow, E. M., Goreham, H., & Ross, F. (2017). Exploring research impact in the assessment of leadership, governance and management research. Evaluation, 23(4), 407–431.
Oliver, K., Innvar, S., Lorenc, T., Woodman, J., & Thomas, J. (2014). A systematic review of barriers to and facilitators of the use of evidence by policymakers. BMC Health Services Research, 14(1), 1–12.
Parks, S., Ioppolo, B., Stepanek, M., & Gunashekar, S. (2018). Guidance for standardising quantitative indicators of impact within REF case studies. Santa Monica and Cambridge: RAND Europe. https://www.rand.org/content/dam/rand/pubs/research_reports/RR2400/RR2463/RAND_RR2463.pdf
Ravenscroft, J., Liakata, M., Clare, A., & Duma, D. (2017). Measuring scientific impact beyond academia: An assessment of existing impact metrics and proposed improvements. PLOS ONE, 12(3), e0173152.
Research Excellence Framework. (2014). REF 2014: Assessment framework and guidance on submissions. https://www.ref.ac.uk/2014/media/ref/content/pub/assessmentframeworkandguidanceonsubmissions/GOS%20including%20addendum.pdf
Research Excellence Framework. (2019). Guidance on submissions to REF 2021. https://www.ref.ac.uk/publications/guidance-on-submissions-201901
Terämä, E., Smallman, M., Lock, S. J., Johnson, C., & Austwick, M. Z. (2016). Beyond academia—Interrogating research impact in the research excellence framework. PLOS ONE, 11(12), e0168533.
Thelwall, M., Haustein, S., Larivière, V., & Sugimoto, C. R. (2013). Do altmetrics work? Twitter and ten other social web services. PLOS ONE, 8(5), e64841.
Thelwall, M., Kousha, K., Dinsmore, A., & Dolby, K. (2015). Alternative metric indicators for funding scheme evaluations. Aslib Journal of Information Management, 68(1), 2–18.
Wilkinson, C. (2019). Evidencing impact: A case study of UK academic perspectives on evidencing research impact. Studies in Higher Education, 44(1), 72–85.
Wilsdon, J., Allen, L., Belfiore, E., Campbell, P., Curry, S., … Johnson, B. (2015). The metric tide: Report of the independent review of the role of metrics in research assessment and management. Bristol: Higher Education Funding Council for England (HEFCE). https://kar.kent.ac.uk/81123/1/Metric_Tide_main_report.pdf
Wooldridge, J., & King, M. B. (2019). Altmetric scores: An early indicator of research impact. Journal of the Association for Information Science and Technology, 70(3), 271–282.

Author notes

Handling Editor: Ludo Waltman

This is an open-access article distributed under the terms of the Creative Commons Attribution 4.0 International License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited. For a full description of the license, please visit https://creativecommons.org/licenses/by/4.0/legalcode.
