Abstract
This study analyzes the adoption and use of researcher profile systems (ORCID, Scopus Author Profiles, Web of Science Researcher Profiles (formerly Publons), Google Scholar Profiles, and ResearchGate) across discipline and rank at the University of Manitoba (Winnipeg, Canada). The purpose of the study is to assess how many faculty members have registered for and use researcher profiles and whether use differs by discipline or academic rank. The adoption rate in the current study is compared with other international studies. At the University of Manitoba, adoption varies between disciplines and ranks. When comparing profile systems by discipline, Google Scholar is the primary profile system for sciences, while ORCID, Publons, and ResearchGate are the primary profile systems for health sciences. Publication counts also vary between disciplines and, unsurprisingly, increase as faculty are promoted. Among the studied profile systems, ORCID is not working as efficiently as it could be. Several recommendations to increase ORCID adoption are made, including mandatory public fields and suggestions for third-party integration. To increase the use of profile systems, we see academic librarians as a key component of instruction and advocacy for graduate students and faculty.
1. INTRODUCTION
An accurate record of scientific output is integral in academia. Maintaining accurate records of scientific output is important for documentation, discoverability, and assessment. Many aspects of academic life depend upon viewing research activity, including promotion, tenure, research grants, and service opportunities (Craft, 2020; HuMetricsHSS, 2022; Kjellberg & Haider, 2019). To maintain a digital academic presence, researchers use researcher profile systems, which fall into different types, including persistent identifiers (PIDs) (e.g., ORCID), researcher profiles (e.g., Web of Science Researcher Profiles/Publons, Google Scholar Profiles), and academic social networks (e.g., ResearchGate).
Researcher profile systems are increasing in prominence among academics and researchers. Reasons for using researcher profile systems include name disambiguation (distinguishing one researcher from another with the same name), promotion of scientific output, collaboration, maintaining an accurate list of research, tracking citations and research metrics (bibliometrics), and increasing the visibility of scholarly work. Researcher profiles increase the visibility of scholarly output, which leads to higher rates of citation, a significant factor—for good or ill—in academic life, especially in the promotion and tenure process (Shanks & Arlitsch, 2016).
It is important to know who is using which researcher profile systems across disciplines, career stages, and geographic regions. This knowledge can help tailor academic library services, instruction, and promotion. Previous studies have documented populations using researcher profiles (Aman, 2018; Boudry & Durand-Barthez, 2020; Heusse & Cabanac, 2022; Mikki, Zygmuntowska et al., 2015; Morgan & Eichenlaub, 2018; Sandberg & Jin, 2016; Zhang & Li, 2020).
This study provides a baseline of researcher profile system use across all disciplines at the University of Manitoba, a midsized Canadian postsecondary institution. This study assesses researcher profile system use of all professors, including analysis of rank. In our analysis, we compare researcher profile adoption rates to previous studies’ rates of adoption.
2. LITERATURE REVIEW
2.1. Proliferation of Researcher Profile Systems
Researcher profile systems have been growing in popularity. As the volume of worldwide scholarship increases, researchers increasingly need to list their scholarly output, improve its discoverability, and track citations and metrics.
Some researcher profile systems are manually created (e.g., Open Researcher and Contributor ID (ORCID), Google Scholar Profiles), whereas others are automatically generated (e.g., Scopus Author Identifier, ResearcherID). Some researchers choose to actively maintain automatically generated profiles, such as a Scopus Author Profile, whereas others simply allow Scopus to manage their online profile. Within Scopus, researchers can change affiliation, link their ORCID, merge duplicate profiles, and add Scopus-indexed work not currently attributed to their profile. This is similar to, but distinct from, manually creating a profile such as ORCID.
There is discussion over the terminology to use to refer to researcher profiles. In the literature, distinctions are made between types of researcher profiles:
identifiers (e.g., the alphanumeric code for ORCID, ResearcherID, Scopus Author Identifier);
profiles, from external (e.g., ORCID, Google Scholar Profiles, Publons, Scopus Author Profile) and internal (e.g., institutional or organizational profiles) entities; and
academic social networks (e.g., ResearchGate, Academia.edu, Loop).
Prior studies use varying terms for these systems:
Author identifier services (Boudry & Durand-Barthez, 2020): ORCID, ResearcherID.
Academic social networks (Boudry & Durand-Barthez, 2020): ResearchGate, Academia.edu.
Academic profile websites (Zhang & Li, 2020): ResearchGate, Google Scholar Profiles, Academia.edu, and ORCID.
Academic profiling sites (Mikki et al., 2015): including, but not limited to, ResearchGate, Academia.edu, Google Scholar Profiles, ResearcherID, and ORCID.
Researcher persistent identifiers: ORCID, Scopus Author Identifier, ResearcherID.
Researcher profile systems (Craft, 2020): ORCID, Scopus Author Profile, Publons, Google Scholar.
2.2. Reasons for Using Researcher Profile Systems
Researchers identify a plethora of reasons why they use researcher profile systems. Shanks and Arlitsch (2016) found that researchers use academic profiles for unique identifiers and name disambiguation, discoverability of scholarly output, and promoting scholarly products. Name disambiguation, as discussed in prior studies (Boudry & Durand-Barthez, 2020; Heusse & Cabanac, 2022; Sandberg & Jin, 2016), is a significant reason for the proliferation of researcher profile systems, as distinguishing one researcher from another with the same name is crucial to establishing an online academic identity.
Gruzd and Goertzen (2013) found that the biggest reason social sciences researchers use researcher profiles is to list scholarly output, followed by collaboration, socializing, and promotion of their work. Wu, Stvilia, and Lee (2017) confirmed these reasons, along with literature-seeking activity, through semistructured interviews with researchers at American institutions hosting institutional repositories. Francke (2022) provides evidence that researchers use profiles for promotion and as a list of scholarly output; Francke argues that when faculty use profiles to list scholarly output, this builds trust and credibility.
In the Canadian context, work is currently under way on a national strategy for persistent identifiers. The Canadian Research Knowledge Network (CRKN) and the Digital Research Alliance of Canada, working in collaboration, recommend developing a road map that outlines a long-term vision for Canadian research and determines which PIDs to adopt and promote (Brown, Jones et al., 2022). Funding from the federal Tri-Agencies (the Canadian Institutes of Health Research (CIHR), the Natural Sciences and Engineering Research Council (NSERC), and the Social Sciences and Humanities Research Council (SSHRC)) currently does not require an applicant to have an ORCID profile; however, this could soon change (Booth-Morrison, 2022). The requirements could resemble national funding requirements in the United States, France, the United Kingdom, Singapore, and Australia, where an ORCID is mandatory when submitting grant applications. Under such a mandate, applications for major Tri-Agency funding would not be accepted unless researchers have an ORCID and associated profile, whether information is entered publicly, privately, or not at all.
It is also becoming commonplace for publishers, such as PLOS (Public Library of Science) and IEEE (Institute of Electrical and Electronics Engineers), to require an author's ORCID prior to publication (ORCID, n.d.). While not a common requirement, some researchers opt to include identifiers such as a ScopusID or Web of Science ResearcherID on their CV or grant applications.
2.3. Assessing Research Impact
Researchers prioritize assessing scholarly activities and research impact, and one way to make scholarly output discoverable is to use researcher profile systems. With researcher profile systems, researchers can track citations and view metrics associated with their work. Research metrics such as the h-index, CiteScore, and SNIP (Source-Normalized Impact per Paper) are used as justification for the impact of a researcher's work. Problematically, researchers and promotion committees favor a single number to show research impact. Roemer and Borchardt (2015, p. 61) note that metrics "[present] for us something of an artificially clean perspective on the identification and calculation of impact, useful for creating rankings (which require clean calculations) and powerful for tenure file evaluations (which value objective data as a shorthand for intradisciplinary values)."
However, metrics are problematic in a number of ways: “true impact is essentially impossible to measure” (Roemer & Borchardt, 2015, p. 61). Nonquantitative qualities of a researcher’s impact are ignored, such as leadership, collaboration, and mentorship. Metrics also favor researchers in high-publishing disciplines and late-stage researchers who have had more time to publish.
Researchers are presented with different values for the same metric in different locations. A researcher may have a high h-index for Google-Scholar-indexed work and a low h-index for Scopus-indexed work, reflected on their respective researcher profiles (Alonso, Cabrerizo et al., 2009). Other challenges include differences in the types of work that are indexed. Google Scholar, for example, often indexes duplicate works and includes unconventional scholarly work, such as LibGuides.
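To make this discrepancy concrete, the sketch below computes the h-index (the largest h such that h works have at least h citations each) from two hypothetical citation lists for the same researcher as indexed by two platforms; all counts are invented for illustration and are not data from this study.

```python
def h_index(citations: list[int]) -> int:
    """Return the largest h such that h works have at least h citations each."""
    ranked = sorted(citations, reverse=True)
    h = 0
    for rank, count in enumerate(ranked, start=1):
        if count >= rank:
            h = rank
        else:
            break
    return h

# Hypothetical citation counts for the same researcher on two platforms.
# Broader indexing (more works, more citing sources) yields a higher h-index.
platform_a_counts = [52, 40, 33, 18, 12, 9, 7, 5, 3, 1]  # e.g., a Google-Scholar-like index
platform_b_counts = [44, 30, 15, 8, 4, 2]                 # e.g., a smaller curated index
print(h_index(platform_a_counts))  # 7
print(h_index(platform_b_counts))  # 4
```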
However, there are alternative research metrics tools, such as Altmetrics, ImpactStory, and HuMetricsHSS (the Humane Metrics Initiative). These tools incorporate a broader range of measurement, including social media mentions, news outlet references, and locally created values-based frameworks (e.g., equity, openness, leadership, and collegiality). There are other tools available to provide more detailed and varied quantitative metrics, such as SciVal or Journal Citation Reports, but these pose financial challenges for some. Other issues can emerge as well, such as using SciVal with multiple unmerged Scopus Author Profiles.
2.4. Researcher Profile Systems
2.4.1. ORCID
ORCID (Open Researcher and Contributor ID) is a nonprofit organization that provides a unique alphanumeric code, or persistent identifier, for every manually created ORCID profile. An ORCID is intended to provide a way for researchers to permanently disambiguate themselves. An ORCID profile displays a researcher's affiliation, service, funding, and scholarly output, among other categories. ORCID profiles are created manually, and works can be imported using a variety of methods, including directly from databases, or added manually. Because a complete ORCID record presents a full picture of a researcher's output, some researchers append their ORCID iD to their correspondence as a convenient way to promote their recent works. Established in 2012, ORCID is a leader in researcher profile systems and is increasingly necessary when applying for funding or submitting for publication. ORCID can be integrated into other systems, such as Scopus or journal editing and peer-review platforms.
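As an illustration of such integration (a minimal sketch, not part of this study's method), ORCID's public API exposes the works on a record as JSON; the iD below is the example record used in ORCID's own documentation.

```python
import requests

ORCID_ID = "0000-0002-1825-0097"  # example iD from ORCID's documentation
url = f"https://pub.orcid.org/v3.0/{ORCID_ID}/works"
resp = requests.get(url, headers={"Accept": "application/json"}, timeout=30)
resp.raise_for_status()

# Each "group" bundles versions of one work; the first work-summary is printed here.
for group in resp.json().get("group", []):
    summary = group["work-summary"][0]
    title = summary["title"]["title"]["value"]
    year = ((summary.get("publication-date") or {}).get("year") or {}).get("value", "n.d.")
    print(year, "-", title)
```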
2.4.2. Scopus Author Profiles
Scopus is a bibliographic database run by Elsevier, containing more than 25,000 active journals and over 82 million works. An author is automatically given a ScopusID, a unique 11-digit identifier used within Scopus to identify researchers, and a Scopus Author Profile when their work is indexed by Scopus. However, it is not until an author's work is indexed two or more times within Scopus that their name is hyperlinked from the author search results page and linked to their Scopus Author Profile. Authors can request changes to their Scopus Author Profile (merging multiple profiles, changing affiliation, adding work that should be indexed), but profiles are automatically generated. Authors can link their ORCID profile to their Scopus Author Profile, further combining and extending their scholarly output's reach.
2.4.3. Publons/Web of Science Researcher Profiles
Publons, now Web of Science Researcher Profiles, is managed by Clarivate and is integrated with the Web of Science bibliographic database platform. Prior to August 2022, Researcher Profiles was called Publons, a company acquired by Clarivate in 2017. On this system, a profile is manually created and used to list peer review and editorial activity; in addition, scholarly works indexed in Web of Science can be added to a researcher's profile. On Publons, an author had an automatically generated ResearcherID, whereas prior to the Publons integration authors had to sign up on Web of Science to receive a ResearcherID and accompanying profile. As with Scopus, when an author has work indexed in Web of Science, a unique ResearcherID is created and listed on a corresponding profile, if applicable. With the advent of Web of Science Researcher Profiles, and similar to Scopus Author Profiles, a profile is automatically generated with an author's works, which the author can then claim.
2.4.4. Google Scholar Profiles
Google Scholar, a Google platform, is a search engine that indexes scholarly work in a wide range of formats, including preprints, theses and dissertations, and websites. Google Scholar Profiles is a researcher profile service for tracking scholarly work, citations, and research metrics. While an author's work is indexed automatically, the author must opt in to create a profile, and must have at least one work indexed in Google Scholar to do so.
2.4.5. ResearchGate
Established in 2008, ResearchGate is a social network-like researcher profile system that allows researchers to document scholarly output, upload their work for others to read, add contacts and collaborators, and message other researchers. ResearchGate provides several metrics, including reads, h-index, and research interest (a proprietary metric). On ResearchGate, some researchers have profiles automatically created because they are listed as a coauthor on an indexed publication. Of the researcher profile systems mentioned, ResearchGate is one of the more controversial: for several years there have been copyright infringement allegations, resulting in a lawsuit barring ResearchGate from hosting papers that violated copyright law (Kwon, 2022).
Other researcher profile systems, such as Dimensions, Academia.edu, Loop, and the Social Science Research Network (SSRN), were not assessed in this study.
2.5. Researcher Profile Systems Case Studies
There are a number of prior studies measuring populations on researcher profile systems, as well as exploring publication counts, citations, and research metrics across platforms. We identified studies that align with our inclusion criteria; a summary of findings is presented in Table 1.
Table 1. Adoption rates of researcher profile systems reported in prior studies.

| Study | Country | Target population | Automatic (A) or Manual (M) data collection | ORCID | Scopus | Google Scholar | ResearchGate | WoS Researcher Profiles/Publons |
|---|---|---|---|---|---|---|---|---|
| Mikki et al. (2015) | Norway | University of Bergen researchers | A | 3% | | 8% | 30% | |
| Sandberg and Jin (2016) | International | Authors from Cataloguing & Classification Quarterly, Perspectives of New Music, and IEEE Intelligent Systems | M | 14.5% | 99.7% | | | |
| Aman (2018) | International | Leibniz Prize laureates (1999–2016) | M | 20.7% | 97.4% | | | |
| Morgan and Eichenlaub (2018) | USA | Florida Southern College faculty | M | 12% | 86% | | | |
| Morgan and Eichenlaub (2018) | Canada | Ryerson University faculty | M | 62% | 96% | | | |
| Boudry and Durand-Barthez (2020) | France | University of Caen Normandy researchers | M | 38.7% (public & private profiles); 17.1% (public profiles) | | | 54.3% | |
| Zhang and Li (2020) | Canada | University of Saskatchewan Science faculty | M | 29.5% | | 48.1% | 62.8% | |
| Heusse and Cabanac (2022) | France | Toulouse scientific area faculty, researchers, and other staff | A | 41.8% | | | | |
Tran and Lyon (2017) surveyed faculty to self-identify which researcher profile systems they are aware of and actively use. They found that the majority (59%) do not use any profiles, and active use was low, with ORCID (15%), ScopusID (9%), and ResearcherID (7%) the top responses. The researchers also examined disciplinary differences in awareness—but not active use—of researcher profile systems. For what the authors term author identifiers (e.g., ORCID, ScopusID, ResearcherID), they found physical sciences, biological sciences, and health sciences faculty the most aware, and arts and humanities and mathematics faculty the least aware. For what the authors term researcher networking systems (e.g., Academia.edu, Google Scholar, ResearchGate), they found social sciences, engineering, and biological sciences faculty the most aware, and mathematics faculty the least aware. The current study does not compare its results with Tran and Lyon (2017) due to differences in data collection: Tran and Lyon surveyed faculty, who self-identified their use of researcher profiles.
3. AIMS
This study assesses researcher profile system use at the University of Manitoba. We answer the following questions:
What percentage of faculty members at the University of Manitoba have accounts on five researcher profile systems (ORCID, ScopusID, Publons, Google Scholar Profiles, and ResearchGate) and how does this compare to other studies worldwide?
What are the differences between disciplines and researcher profile system use?
What are the differences between academic rank and researcher profile system use?
4. METHODS
4.1. Study Setting
The University of Manitoba (UM) is a midsized, public postsecondary institution located in Winnipeg, Manitoba, Canada. With an undergraduate and graduate student population of 31,037 in Fall 2021, the UM offers a wide range of undergraduate, master's, and doctoral programs across arts and humanities, sciences, social sciences, and health sciences (University of Manitoba, 2021a). Indigenous students make up 8.4% of the overall student population, and just under 22% of the student population are international students (University of Manitoba, 2022).
The University of Manitoba is a member of the U15, a group of top Canadian research universities (U15, n.d.). In the 2020/21 fiscal year, the university received a record-high amount of research funding, totaling C$231 million. This includes research areas related to the Covid-19 pandemic, reconciliation, and climate change (University of Manitoba, 2021c).
4.2. Demographic Data
We assessed University of Manitoba professors' use of five researcher profile systems (ORCID, Scopus, Publons, Google Scholar, and ResearchGate). As a public, government-funded postsecondary institution, the University of Manitoba annually publishes a disclosure document of individual employee salaries over a threshold of C$75,000. We used the 2020 Schedule of Public Sector Compensation (University of Manitoba, 2021b). At the University of Manitoba, professors hold one of three ranks: Assistant Professor, Associate Professor, and Professor. In 2020, the lowest salary range, for the Assistant Professor rank, was C$73,038 to C$109,558. Therefore, a small percentage of faculty members who did not earn over C$75,000 in the 2020 calendar year are excluded from this study.
We considered various methods of categorizing disciplines at UM. For example, Heusse and Cabanac (2022) categorize disciplines as (a) astronomy, astrophysics, environment; (b) arts, humanities, social sciences; (c) chemistry, physics; (d) engineering, mathematics, computing; (e) health, biology, agronomy; and (f) law, economics, management, whereas Mikki et al. (2015) use (a) humanities, (b) law, (c) mathematics and natural sciences, (d) medicine and dentistry, (e) psychology, and (f) social sciences.
Ultimately, we used the University of Manitoba Libraries' divisions to categorize disciplines: Arts & Humanities (AH), Social Sciences (SS), Health Sciences (HS), and Sciences (SCI). A full listing of discipline categorization can be found in Table S1 in the Supplementary material. We did not include faculty members who did not fit into our discipline categories, including librarians, counselors, legal counsel, marketing & communications faculty members, and faculty occupying nonprofessor roles at institutes and centers. We used a University of Manitoba staff directory to look up individual faculty members' departmental affiliations, which were then categorized into our four discipline categories.
In total, 1,100 faculty members fell within our inclusion criteria (155 AH, 236 SS, 385 HS, 324 SCI). With a 95% confidence interval (95% CI) and a 5% margin of error on overall percentage of profile use, we needed a total sample of at least 288. We used the SURVEYSELECT procedure with a stratified simple random sampling method in SAS (9.4) to obtain equal numbers of faculty members in each discipline-by-rank stratum (Table 2), selecting the estimated number from each stratum by simple random sampling. To keep the sample balanced, we sampled a higher proportion of faculty from smaller disciplines.
Table 2. Sample by discipline and rank.

| Discipline | Assistant Professor | Associate Professor | Professor | Total |
|---|---|---|---|---|
| Arts & Humanities | 24 | 24 | 24 | 72 |
| Health Sciences | 24 | 24 | 24 | 72 |
| Sciences | 24 | 24 | 24 | 72 |
| Social Sciences | 24 | 24 | 24 | 72 |
| Total | 96 | 96 | 96 | 288 |
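A minimal Python sketch of this sampling design is shown below (the study itself used PROC SURVEYSELECT in SAS 9.4); the file name and column names are hypothetical, and the sample-size arithmetic simply illustrates one common way to arrive at a figure near 288.

```python
import math
import pandas as pd

# Sample size for a proportion (worst case p = 0.5) with a 5% margin of error,
# 95% confidence, and a finite-population correction for N = 1,100 faculty.
z, p, e, N = 1.96, 0.5, 0.05, 1100
n0 = z**2 * p * (1 - p) / e**2      # ~384 for an infinite population
n = n0 / (1 + (n0 - 1) / N)          # ~285 after the correction
print(math.ceil(n))                  # 285; 288 (= 12 strata x 24) comfortably covers this

# Stratified simple random sampling: 24 faculty per discipline-by-rank stratum.
# `faculty` is assumed to have columns "name", "discipline", and "rank".
faculty = pd.read_csv("faculty_2020_disclosure.csv")   # hypothetical file
sample = (
    faculty.groupby(["discipline", "rank"], group_keys=False)
    .apply(lambda stratum: stratum.sample(n=24, random_state=2022))
)
print(sample.groupby(["discipline", "rank"]).size())    # 24 in every stratum
```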
4.3. Data Collection: Matching Professors to Researcher Profiles
We manually collected data (see Table 3) by searching faculty member names in ORCID, Scopus, Publons, Google Scholar, and ResearchGate. Using published work and institutional affiliation, we matched faculty members to each profile. Searching was completed between May and June 2022. Of note, data was collected prior to Clarivate’s switch of Publons to Web of Science Researcher Profiles. Therefore, no scholarly work data was collected on Publons, as at the time it was primarily a peer review and journal editorial board documentation system.
Table 3. Data collected for each researcher profile system.

| Researcher profile system | Profile | Number of scholarly works | Other information |
|---|---|---|---|
| ORCID | ✓ | ✓ | Public or private/empty profile; year last updated |
| Scopus Author Profile | ✓ | ✓ | Linked to ORCID |
| Publons | ✓ | N/A | N/A |
| Google Scholar Profiles | ✓ | ✓ | Verified—Yes/No |
| ResearchGate | ✓ | ✓ | N/A |
It is difficult to disambiguate UM faculty members from researchers who share the same name, especially when little information is available on the profile, as with manually created profiles on ORCID and ResearchGate. ORCID is particularly challenging: because information is entered manually, identifying information may not be available, and even when it is entered, the profile's owner can choose to hide it.
As seen in Figure 1, we used a combination of methods from prior studies (Boudry & Durand-Barthez, 2020; Heusse & Cabanac, 2022). Heusse and Cabanac (2022) count ORCID profiles with evidence of working in their target population (Toulouse scientific area in France) or no employment evidence but an absence of homonyms (i.e., same first and last names) in Google Scholar. Boudry and Durand-Barthez (2020) identify only those ORCID profiles with matching names and with identifying information (e.g., institutional affiliation or scholarly work).
In our study, profiles are coded as "private/empty" if the profile did not contain any information beyond the profile owner's name and their ORCID identifier, and as "public" if additional information was available. It is challenging to know whether such a minimal profile is set to private or is simply empty; either way, an ORCID profile exists for the faculty member, but no additional information is provided.
To disambiguate a UM faculty member on ORCID, first we searched the faculty member’s full name. Each faculty member’s ORCID profile status is coded in one of three ways:
0: no ORCID profile found with name search.
1: evidence of University of Manitoba faculty member (matching name, and (a) affiliation or (b) scholarly works verified via Scopus).
2: ORCID profile(s) with matching name, but no evidence of affiliation with the University of Manitoba.
We recorded when multiple Scopus Author Profiles displayed the same name, as there would be greater chance of a corresponding ORCID profile not being the correct UM faculty member due to academics sharing names. In these cases, attention was paid to the ORCID profile to ensure that only UM faculty were identified. If a researcher only had one Scopus-indexed publication, we coded the researcher as not having a Scopus Author Profile because Scopus requires researchers to have two or more Scopus-indexed publications before being assigned a profile.
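The sketch below restates these coding rules as a small, hypothetical helper (not the instrument used in the study), combining the three-way ORCID code with the two-publication rule for Scopus Author Profiles.

```python
from dataclasses import dataclass

@dataclass
class ProfileEvidence:
    name_matches: bool                 # a profile with the faculty member's name was found
    um_affiliation: bool               # profile lists a University of Manitoba affiliation
    works_verified_in_scopus: bool     # listed works match the faculty member's Scopus output
    scopus_indexed_publications: int   # count of works indexed in Scopus

def code_orcid(e: ProfileEvidence) -> int:
    """0 = no profile found, 1 = confirmed UM faculty member, 2 = name match only."""
    if not e.name_matches:
        return 0
    if e.um_affiliation or e.works_verified_in_scopus:
        return 1
    return 2

def has_scopus_author_profile(e: ProfileEvidence) -> bool:
    """Scopus links an author profile only after two or more indexed publications."""
    return e.scopus_indexed_publications >= 2

example = ProfileEvidence(True, False, True, 5)
print(code_orcid(example), has_scopus_author_profile(example))  # 1 True
```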
We did not record whether a Google Scholar profile was set to private, as there is no way to tell: an author's name is not hyperlinked if no profile exists, and the same is true if the profile is private.
5. RESULTS
5.1. Researcher Profile Systems Adoption
Adoption of researcher profile systems is high overall at the University of Manitoba, with a 60.1% adoption rate for ResearchGate, 45.5% for ORCID, 38.5% for Google Scholar Profiles, 21.2% for Researcher Profiles/Publons, and a presence rate of 85.4% for Scopus Author Profiles (Table 4).
Table 4. Researcher profile system adoption among sampled University of Manitoba faculty.

| Researcher profile system | Number of profiles (N = 288) |
|---|---|
| ORCID | 131 (45.5%) |
| Scopus Author Profile | 246 (85.4%) |
| Publons | 61 (21.2%) |
| Google Scholar Profiles | 111 (38.5%) |
| ResearchGate | 173 (60.1%) |
As mentioned in Section 4, we include only public and indisputable ORCID profiles in our percentage (45.5%). The total number of ORCID profiles, including 85 disputable profiles (29.5% of the sample), is 216 (75%). Of the Scopus Author Profiles (n = 246), 176 profiles (71.5%) are not linked to an ORCID profile, while 70 profiles (28.5%) are linked.
5.2. Researcher Profile Systems Adoption by Discipline
Google Scholar Profiles is used most often by sciences faculty (56.9%), followed by social sciences (43.1%), health sciences (37.5%), and arts and humanities (16.7%) (Figure 2).
ORCID is used most often by health sciences faculty (72.2%), followed by sciences (50%), arts and humanities (34.7%), and social sciences (25%) faculty. Social sciences professors have low ORCID adoption, and almost half (48.6%) of those profiles are marked as having disputable identities due to incomplete or private profiles.
Overall, arts and humanities professors do not use researcher profile systems as much as their peers in other disciplines. Surprisingly, even ResearchGate use by arts and humanities professors has low uptake (36.1%).
Not only do arts and humanities professors have lower uptake of ORCID (36.1%), but the ORCID profiles that do exist are not regularly updated, as evidenced by a mean "last updated" date of 2019.6. This is much earlier than for social sciences (2022), sciences (2021.6), and health sciences (2021.3) faculty. This could possibly be explained by a much lower publication rate in ORCID for arts and humanities faculty (Figure 5).
ResearchGate is adopted by sciences (63.9%), social sciences (68.1%), and health sciences (72.2%) faculty, but not highly adopted by arts and humanities (36.1%) faculty. ORCID "last updated" dates are very similar across career stages: Assistant Professor (2021.1), Associate Professor (2021.3), and Professor (2021.1).
Publons has very low uptake by arts and humanities professors (6.9%), and modest uptake in other disciplines (social sciences, 20.8%; sciences, 20.8%; health sciences, 36.1%). A possible reason for low uptake is disciplinary publication habits: if fewer journal articles are published in a discipline, there is less opportunity for peer review and editorial board positions, a primary reason for using Publons. Another possible reason is that Publons was, prior to its integration into Web of Science Researcher Profiles, primarily a peer-review activity documentation site. Adoption may become more prevalent now that it is integrated into a wider researcher profile, as well as directly into peer review platforms such as ScholarOne and Open Journal Systems.
There is a very high presence of sciences (98.6%) and health sciences (97.2%) faculty on Scopus Author Profiles. Although these profiles are automatically created based on authors' Scopus-indexed publications, 35% of sciences faculty and 49% of health sciences faculty have an ORCID profile linked to their Scopus Author Profile. While Scopus has extensive discipline coverage, it has more health sciences (30.4%) and physical sciences (28%) coverage than social sciences (26.2%) and life sciences (15.4%) (Scopus, 2020). Scopus includes arts and humanities coverage within the social sciences discipline.
A chi-squared test of independence showed a statistically significant association between discipline and researcher profile system use for ORCID (χ2 (3, N = 288) = 33.6, p < .0001), Google Scholar Profiles (χ2 (3, N = 288) = 25.5, p < .0001), Publons (χ2 (3, N = 288) = 18.4, p = .0004), and ResearchGate (χ2 (3, N = 288) = 24, p < .0001).
When we adjusted the p-value for statistically significant associations between profile systems and discipline using Bonferroni calculations, we found several disciplines more likely than others to adopt specific profiles (see Table S2 in the Supplementary material for calculations of Bonferroni-adjusted p-values):
For Google Scholar, sciences faculty are 6.6 times more likely than arts and humanities (p < .0001), 2.2 times more likely than health sciences (p = .02), and 2.2 times more likely than social sciences faculty (p = .0001) to have an account.
For ORCID, health sciences faculty are 8.7 times more likely than arts and humanities (p < .0001) and 3.2 times more likely than social sciences faculty (p = .0002) to have an account. Sciences is 4.8 times more likely than arts and humanities faculty (p = .0003) to have an ORCID profile.
For ResearchGate, health sciences (4.6 times, p < .0001), sciences (3.1 times, p = .001), and social sciences (2.7 times, p < .0001) faculty are all more likely than arts and humanities faculty to have a profile.
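The sketch below illustrates this analysis in Python (the study's statistics were computed in SAS); the 4 × 2 table is reconstructed from the ORCID adoption percentages reported above with 72 faculty per discipline, so it approximates, but will not exactly reproduce, the published values.

```python
from itertools import combinations
import numpy as np
from scipy.stats import chi2_contingency

# Rows: AH, SS, HS, SCI; columns: (has ORCID, no ORCID), reconstructed from the
# reported percentages (34.7%, 25%, 72.2%, 50%) of 72 faculty per discipline.
disciplines = ["AH", "SS", "HS", "SCI"]
counts = np.array([
    [25, 47],
    [18, 54],
    [52, 20],
    [36, 36],
])

chi2, p, dof, _ = chi2_contingency(counts)
print(f"overall: chi2({dof}) = {chi2:.1f}, p = {p:.2g}")

# Pairwise 2 x 2 comparisons with a Bonferroni adjustment (6 comparisons).
pairs = list(combinations(range(len(disciplines)), 2))
for i, j in pairs:
    _, p_pair, _, _ = chi2_contingency(counts[[i, j], :])
    p_adj = min(p_pair * len(pairs), 1.0)
    print(f"{disciplines[i]} vs {disciplines[j]}: adjusted p = {p_adj:.2g}")
```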
5.3. Researcher Profile Systems Adoption by Rank
Overall, there was no statistically significant association between rank and adoption for any of the profile systems included in this study. A chi-squared test of independence was performed, and the relationship between variables was not significant for Google Scholar (χ2 (2, N = 288) = 1.7, p = .43), ORCID (χ2 (2, N = 288) = .9, p = .64), Publons (χ2 (2, N = 288) = 3.2, p = .21), or ResearchGate (χ2 (2, N = 288) = 1.8, p = .41).
As seen in Figure 3, Publons is most used by Assistant professors (26%), compared to Associate (21.9%) and full Professors (15.6%). ORCID profiles do not deviate much from overall adoption among Assistant (51%), Associate (40.6%), and full Professors (45.8%). Scopus and ResearchGate uptake increases as one moves through the ranks (longer career).
Unsurprisingly, Scopus Author Profiles presence increases according to career stage and rank (Figure 3). Scopus is a profile system that is automated based on indexed publications in the Scopus database; thus, it makes sense that with more publications comes more chance of having a profile.
5.4. Number of Publications
Figure 4 shows the mean of the count of scholarly works listed in each of the four profile systems that include publications. Google Scholar Profiles (88.7) has the highest mean, followed by ResearchGate (59.2), Scopus Author Profiles (39.3), and ORCID (32.8).
One explanation for the higher publication counts on Google Scholar and ResearchGate is that these profile systems index a wider range of scholarly work than ORCID and Scopus.
Interestingly, ORCID has the lowest average publication count. One reason for this is that researchers must manually add their scholarly work or import works to ORCID from other databases via a mediated and self-initiated process.
5.5. Number of Publications by Discipline
A one-way analysis of variance (ANOVA) showed a statistically significant association between discipline and publication counts for ORCID (p = .0004), Google Scholar Profiles (p = .0005), and ResearchGate (p < .0001). We calculated differences using pairwise comparisons, with p-values adjusted using Tukey's test.
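A minimal Python sketch of this analysis is shown below (again, not the SAS code used in the study); the file and column names are hypothetical placeholders for a table of per-faculty publication counts on one profile system.

```python
import pandas as pd
from scipy.stats import f_oneway
from statsmodels.stats.multicomp import pairwise_tukeyhsd

# One row per faculty member with a profile on the system being analyzed;
# columns "discipline" and "n_publications" are assumed (hypothetical file).
pubs = pd.read_csv("publication_counts_google_scholar.csv")

# One-way ANOVA: do mean publication counts differ across disciplines?
groups = [g["n_publications"].to_numpy() for _, g in pubs.groupby("discipline")]
f_stat, p_value = f_oneway(*groups)
print(f"one-way ANOVA: F = {f_stat:.2f}, p = {p_value:.4g}")

# Tukey HSD: pairwise mean differences with adjusted p-values and 95% CIs.
tukey = pairwise_tukeyhsd(endog=pubs["n_publications"], groups=pubs["discipline"], alpha=0.05)
print(tukey.summary())
```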
Tukey’s Test for multiple comparisons found that the mean value of publications is significantly different in Google Scholar between health sciences (n = 27) and arts and humanities (n = 12) faculty, with a difference of 85.4 publications (p = .0005, 95% C.I. = 10, 160.7). As well, health sciences (n = 27) (p = .0005, 95% C.I. = 18.9, 133.2) and sciences (n = 41) (p = .0005, 95% C.I. = 10.2, 113.5) faculty have significantly higher publications in Google Scholar compared with social sciences (n = 31) faculty, with a difference of 76 and 61.8 publications respectively.
Tukey’s Test for multiple comparisons found that the mean value of publications is significantly different in ORCID between sciences (n = 36) and arts and humanities (n = 25) faculty, with a difference of 47.3 publications (p = .0004, 95% C.I. = 14.6, 80). As well, there is significant difference between health sciences (n = 51) and arts and humanities faculty, with a difference of 43.3 publications (p = .0004, 95% C.I. = 12.6, 73.9).
Tukey’s Test for multiple comparisons found that sciences (n = 46) (p < .0001, 95% C.I. = 36.9, 116) and health sciences (n = 52) (p < .0001, 95% C.I. = 26.6, 104.1) faculty have significant differences in ResearchGate for the mean value of publications compared to arts and humanities (n = 26) faculty, with a difference of 76.4 and 65.4 publications respectively.
As seen in Figure 5, sciences and health sciences faculty have the highest average publications counts, with health sciences the highest on Scopus (60.8) and Google Scholar (124.4) and Sciences the highest on ORCID (48) and ResearchGate (90). Sciences and health sciences have very similar publication numbers on ORCID and Scopus.
Arts and humanities has the lowest average publication counts across all researcher profile systems, which can be partially attributed to disciplinary publishing norms (i.e., sciences faculty publishing journal articles and humanities faculty publishing monographs) (Yair, Goldstein et al., 2022). As well, arts and humanities faculty are more likely to produce research objects, creative works, and other formats of research that many researcher profile systems cannot accommodate.
5.6. Number of Publications by Rank
A one-way analysis of variance (ANOVA) showed a statistically significant association between rank and publication counts for ORCID (p = .0268), Google Scholar Profiles (p < .0001), and ResearchGate (p < .0001). We calculated differences using pairwise comparisons, with p-values adjusted using Tukey's test.
Unsurprisingly, average publication counts trend upward based on rank for all profile systems (Figure 6).
6. DISCUSSION
6.1. Adoption of Researcher Profile Systems
Our study finds that disciplines differ in their primary researcher profile systems. Google Scholar is the primary profile system for sciences faculty, and ORCID, Publons, and ResearchGate are the primary profile systems for health sciences faculty. Scopus is a popular option for both health sciences and sciences faculty.
It is interesting to compare our results with Heusse and Cabanac (2022) and Boudry and Durand-Barthez (2020), as our sample population and methodology most closely follow their studies. Heusse and Cabanac found 41.8% and Boudry and Durand-Barthez found 17.1% of their populations with ORCID profiles. Boudry and Durand-Barthez’s analysis includes only matched names with public profiles and does not include matched names with private or empty profiles. When Boudry and Durand-Barthez include both public and private profiles, 38.7% of their population are using ORCID.
After collecting data in 2018, Zhang and Li (2020) found that sciences faculty at the University of Saskatchewan had an ORCID adoption rate of 29.5%. This is substantially lower than the University of Manitoba's 50% for the same discipline. We did not explore the difference between these two numbers, but we hypothesize it may be a result of the sciences faculty population size (n = 129) or the four years between the studies. Our institutional adoption rates for Google Scholar (56.9%) and ResearchGate (63.9%) align with Zhang and Li's, at 48.1% and 62.8% respectively. In Morgan and Eichenlaub's (2018) study of Ryerson University faculty, ORCID (62%) and Scopus (96%) adoption is substantially higher than at the University of Manitoba, at 45.5% and 85.4% respectively.
At the University of Manitoba, health sciences and sciences faculty are overrepresented on ORCID, while arts and humanities and social sciences faculty are underrepresented. This aligns with findings from Heusse and Cabanac (2022). Writing about ORCID, Heusse and Cabanac (2022, p. 9) provide reasons why sciences and health sciences are overrepresented on ORCID and arts and humanities and some social sciences disciplines are underrepresented: "STEM publishers are generally large-sized firms (e.g., Elsevier, Springer, Wiley). Most of them are members of the ORCID consortium and they integrated ORCID into their peer-review and production system," something not seen with some humanities publishers, who are typically smaller, decentralized, and not integrated with ORCID.
If the goal is to increase ORCID adoption more widely, attention needs to be given to nonintegrated publishers. It may become necessary for these smaller publishers to require an ORCID profile of submitting authors. ORCID integration also ensures correct metadata for scholarly work and limits errors in personal and professional information.
As identified in prior studies (Aman, 2018; Boudry & Durand-Barthez, 2020; Morgan & Eichenlaub, 2018; Sandberg & Jin, 2016), one drawback to ORCID is empty or private profiles that cannot be disambiguated from those of other researchers. This is especially problematic when many researchers share first and last names. If ORCID profiles are not maintained and contain no identifiable information about the researcher, these profiles may be used solely for access to platforms that require an iD, as is the case with some publishers, grant agencies, and registries. Faculty members who wish to publish with a journal or apply to a granting agency that requires an ORCID may create an ORCID account but not fill out any additional information. It seems that academics and researchers are creating ORCID profiles but not fully utilizing the profile system to maintain an open and public record of research output. This is challenging because it contradicts the primary purpose of ORCID: an open profile with a persistent identifier that unambiguously identifies the researcher and their work (Boudry & Durand-Barthez, 2020).
Our recommendation is to make specific ORCID fields mandatory, such as full name and institutional or professional affiliation, and to make those fields public. Another option would be to require a minimum number of scholarly works to ensure researchers can be disambiguated. In addition, integrating the contents of ORCID profiles into institutional faculty profiles may encourage researchers to more widely adopt and regularly update their ORCID profiles. As long as ORCID does not require a publicly visible affiliation and researchers do not actively update their ORCID profiles, it is difficult to confirm authorship and to improve access to—and the accuracy of—scholarly work and bibliometrics.
6.2. Publications on Researcher Profile Systems
Our results demonstrate the differences in indexing among researcher profile systems. Publication numbers on Google Scholar are high due to its liberal indexing: Google Scholar indexes a wide range of scholarly works, including conference presentations, preprints, and websites (LibGuides, for example).
Similarly, ResearchGate allows researchers to add their own scholarly work, including conference posters and white papers, not included in other databases such as Scopus or Web of Science. ORCID also has functionality to manually add a wide range of scholarly work, although ResearchGate has a higher rate of adoption. As well, ResearchGate allows access to full text, so potential audiences can read the work, a feature not available in ORCID, which only provides metadata. Conversely, Google Scholar and ResearchGate can be contrasted with the controlled and less liberal indexing of the Scopus and Web of Science (Researcher Profiles/Publons) platforms. Overall, average publication counts increase with faculty rank, reflecting the obvious conclusion that the longer one has to publish—over the course of an entire career, for example—the more scholarly output one creates.
Using ORCID to list scholarly work is not yet effective: relatively few people use it, and many of those who do, do not use it for that purpose, as the low average publication counts on ORCID show. In addition, empty ORCID profiles add to the problem of name disambiguation. ORCID is not working as intended due to an abundance of private or empty profiles; if someone cannot differentiate a researcher based on incomplete information, how do they find out who that profile belongs to? This negates the purpose of creating an ORCID profile in the first place. In our study, 47.7% (103 of 216) of public ORCID profiles do not list publications and/or institutional affiliation. This is higher than Boudry and Durand-Barthez (2020), who report 32.4% (58 of 179) empty ORCID profiles, and slightly lower than Heusse and Cabanac (2022), who report 51.7% of their sample without any scholarly work.
ORCID is far less effective than Scopus, Google Scholar, or ResearchGate for listing scholarly work. One of the reasons researchers opt to use profile systems is to make their scholarly work more accessible, and the smaller number of publications listed in ORCID suggests it is more difficult to use than other profile systems. As suggested by Zhang and Li (2020), ORCID should consider an easier way to add scholarly work to profiles. Our recommendation is that, during registration of a new ORCID profile, ORCID suggest—but not mandate—third-party authorization, such as Crossref Metadata Search and Scopus. This would encourage new registrants to add scholarly works upon registration and could decrease the number of empty ORCID profiles.
6.3. Academic Library Services for Researcher Profile Systems
Academic library services often include support for creating and maintaining researcher profiles on various systems. In addition, academic librarians are involved in outreach for researcher profiles, such as providing workshops, one-on-one consultations, tutorials, and web content.
Our results will inform client-focused academic library services at the University of Manitoba and, we hope, beyond. Work has already begun at the University of Manitoba, where we lead library instructional workshops on researcher profile systems geared towards graduate students and faculty, summarized in Monnin and Fuhr (2024). That study assesses discipline-specific use of researcher profile systems before and after attending instructional workshops at the University of Manitoba and found ORCID among the most popular profile systems postworkshop.
Towards our goal of decreasing empty ORCID profiles, not only at our institution but worldwide, instructional sessions should touch on the importance of creating a profile and not leaving it empty. As instructors, we encourage attendees to enter their institutional affiliation and add any scholarly works at the time of ORCID profile registration.
7. CONCLUSION
The objective of our study was to measure researcher profile system use at a midsized Canadian university and compare our results with prior studies around the world, adding to the literature on worldwide adoption figures. Manually searching for University of Manitoba faculty gives this study a high level of precision and accuracy. One limitation of our study is the irregularity of matching identities on profile systems: some identities are more challenging to match than others, given a lack of affiliation in ORCID or the use of different names on various profile systems.
Our overall adoption rates are 60.1% for ResearchGate, 45.5% for ORCID, 38.5% for Google Scholar Profiles, and 21.2% for Web of Science Researcher Profiles/Publons, with an occurrence rate of 85.4% for Scopus Author Profiles; rates vary by discipline and stage of career. When comparing profile systems by discipline, Google Scholar is the primary profile system for sciences, while ORCID, Publons, and ResearchGate are the primary profile systems for health sciences. The University of Manitoba has higher rates of adoption compared to previous studies, although this could be attributed to the growing acceptance of researcher profile systems—albeit at a glacial pace.
Faculty and researchers cannot be expected to create and maintain an ever-growing cache of profiles. It is our opinion that efforts should be focused on ORCID due to its global position providing open infrastructure for persistent identifiers and integration in other systems to capture accurate scholarly work. The Canadian ORCID consortium (ORCID-CA) also has as one of its goals to have a profile for all active Canadian researchers (Canadian Research Knowledge Network, n.d.).
Based on our study, we recommend that:

smaller arts and humanities publishers integrate with ORCID and require submitting authors to have an ORCID profile;

ORCID implement mandatory fields (e.g., full name and institutional or professional affiliation) and/or a minimum number of scholarly works;

institutions integrate the contents of ORCID profiles into institutional or organizational profiles;

ORCID suggest authorization of third-party integrations (e.g., Crossref Metadata Search and Scopus) upon registration; and

academic librarians continue to instruct and advocate for researcher profile systems.
ACKNOWLEDGMENTS
We are indebted to Dr. Rasheda Rabbani of the University of Manitoba’s George & Fay Yee Centre for Healthcare Innovation for providing a data collection sample and performing statistical analysis. Massive thanks to Dr. Guillaume Cabanac for reviewing an early version of our manuscript, offering invaluable suggestions, and making an invisible college more visible.
AUTHOR CONTRIBUTIONS
Justin Fuhr: Conceptualization, Data curation, Formal analysis, Funding acquisition, Investigation, Methodology, Project administration, Resources, Software, Validation, Visualization, Writing—original draft, Writing—review & editing. Caroline Monnin: Conceptualization, Data curation, Formal analysis, Funding acquisition, Investigation, Methodology, Project administration, Resources, Software, Validation, Visualization, Writing—original draft, Writing—review & editing.
COMPETING INTERESTS
The authors have no competing interests.
FUNDING INFORMATION
We acknowledge the generous financial support from The Ada M. Ducas & Nicole Michaud-Oystryk Librarians Research Endowment Fund from the University of Manitoba Libraries.
DATA AVAILABILITY
Research data and analyses are available at https://osf.io/f6kar/?view_only=3e5aa6bdcf3645eaaf8c10f9905fe276.
REFERENCES
Author notes
Handling Editor: Vincent Larivière