At present, performance assessment in science focuses mainly on scientific publications. Teaching is not given the same attention in the reward and recognition system, even though it is an integral part of science. Open Educational Resources (OER) are open teaching and learning materials, the creation and publication of which can, under certain conditions, be considered equivalent to other scientific output. A scientometric analysis of OER is in principle possible. However, as an object of measurement, OER show a number of peculiarities. These include the frequent absence of transparent quality assessment procedures, the treatment of OER versions with regard to a sufficient level of creative contribution, and the handling of different granularities. This work discusses these special OER features in detail and suggests an applicable definition for OER in scientometric contexts. OER can take many different forms. To simplify measurability, the categories Dedicated Learning Content and Learning Design Content are proposed for use in scientometric analysis of OER. A third category, OER Ecosystem, is suggested for capturing services such as consulting, providing funding, or operating OER infrastructures that create an OER-conducive environment. In addition, practical challenges are considered, such as the separate collection of attribution and citation data for OER.

Currently, performance assessment in academia is mainly focused on research output. Other areas of performance, such as academic teaching, are neglected (European Commission, 2017; German Research Foundation, 2022a). There is growing resistance to this one-sided assessment and recognition of performance in the academic world. Activities in nonresearch contexts cost researchers a great deal of time, which they then cannot devote to research projects and the production of scientific outputs relevant to performance evaluation. Thus, a high level of commitment to academic teaching can be detrimental to career development in academia. Movements such as the Room for Everyone’s Talent initiative in Europe (VSNU, NFU et al., 2019) and the Coalition for Advancing Research Assessment (2022) address this issue and aim to consider a broader range of work products. With Academic Careers Understood through Measurement and Norms (ACUMEN Consortium, 2014), the Open Science Career Assessment Matrix (OS-CAM; European Commission, 2017) or the Indicators for the Evaluation of Educational Achievements (Utrecht University, 2019), there are concrete initiatives that also explicitly recognize teaching achievements in academic CVs.

Openness plays an increasingly central role in science (German Research Foundation, 2022b; UNESCO, 2021). The Open Access publication format makes a significant contribution to Open Science by providing free access to scientific publications such as research articles and conference papers. Openness in teaching is expressed in the concept of Open Education or Open Pedagogy. Open Educational Resources (OER) are open teaching and learning resources that can be reused by other teachers and learners. As such, they can be understood as the equivalent of Open Access publications (Mayrberger & Thiemann, 2018). While OER do not represent teaching in its entirety, they represent a part of the teaching services provided and serve to implement open educational practices (Ehlers, 2011; Hegarty, 2015; Wiley & Hilton, 2018).

UNESCO considers the deployment of “appropriate research mechanisms to measure the effectiveness and efficiency of OER policies and incentives against defined objectives” (UNESCO, 2019, p. 10) as an important means of supporting a culture that promotes OER in member states. Qualitative as well as quantitative means are explicitly mentioned. Scientometric indicators are used in academia to quantitatively measure work performance and to explore research structures at different levels. The development of scientometric indicators of teaching performance based on OER responds to UNESCO’s request, as they can contribute to the visualization of teaching performance in a similar way to publication-based indicators in research. A prerequisite for the development of OER indicators is a detailed investigation of OER as an object of measurement.

The subject of OER is examined in the literature from many different perspectives. The discussion is dominated by the view of the pedagogical possibilities (e.g., Ehlers, 2011; Tillinghast, 2020) as well as the potentials and barriers associated with the development and use of OER (e.g., D’Antoni, 2009; Luo, Hostetler et al., 2019; Mishra, 2017; Riar, Mandausch et al., 2020). Particularly in the context of OER infrastructures, categorizations play a role and are the subject of several studies. These mainly focus on quality aspects (Atenas & Havemann, 2013; Santos-Hermosa, Ferran-Ferrer, & Abadal, 2017) as well as on the functional scope of OER infrastructures (Heck, Kullmann et al., 2020; Hiebl, Kullmann et al., 2023). Admiraal (2022), Schröder and Donat (2022), and Weller, de los Arcos et al. (2018) have investigated the reuse behavior of teachers and developed different user typologies. Typologies for OER have been proposed by various authors. The focus here has been particularly on the legal openness of OER, which grants third parties different reuse rights for individual OER depending on their characteristics. Worth highlighting is Wiley (2021), who vividly describes different reuse scenarios with the 5 Rs (as the rights to Retain, Reuse, Revise, Remix, and Redistribute). The characteristics of OER have been studied particularly in relation to quality issues. Zawacki-Richter and Mayrberger (2017), building on extensive theoretical and methodological work by others, compiled a total of 161 quality criteria for OER and organized them into a quality criteria model. The model has been incorporated into the Instrument for Quality Assurance of OER (IQOer; Müskens, Zawacki-Richter, & Dolch, 2022). Austrian higher education institutions are able to obtain OER certifications. The certification process is carried out by the Verein Forum Neue Medien in der Lehre Austria (FNMA). 
OER activities at higher education institutions and continuing education programs on OER are eligible for certification. A personal certificate for OER authors is also offered (Verein Forum Neue Medien in der Lehre Austria, 2022). Scientometric methods have been used to analyze OER-related literature (Shettar, Hadagali, & Bulla, 2019; Shettar, Hadagali, & Shokeen, 2021). However, OER have not yet been considered as an object of measurement for scientometrics.

OER are a complex object of measurement and are as diverse as academic teaching itself. Due to their specific characteristics, OER are more difficult to grasp than, for example, scientific publications, such as research articles or conference papers. In the context of this work, the following questions are therefore investigated:

  1. Which conceptual definition of OER is most effective in representing teaching performance for scientometric purposes?

  2. Which categories of OER can be distinguished for scientometric purposes?

  3. What are the challenges of OER as scientometric measurement objects?

The methodological approach of this work is presented in Section 2. Sections 3 and 4 discuss common OER definitions and OER categories in relation to OER as a scientometric measurement object and thus answer research questions (1) and (2). Research question (3) is discussed in Section 4. Section 5 summarizes the findings and provides an outlook.

To develop a definition of OER that is sufficiently specific for scientometric purposes, that distinguishes OER from other measurement objects known in scientometrics, and that is at the same time suitable for achieving scientometric measurement goals (in particular, making teaching performance visible), the following steps were taken.

First, established OER definitions were reviewed with regard to the relevant properties and characteristics of OER. In a second step, a literature review was conducted to identify the state of research in the field of scientometric OER indicators on the one hand, and further characteristics and properties of OER on the other. The databases used were FIS Bildung1, ERIC2, BASE3, and Google Scholar4. English and German search terms were chosen: Open Educational Resources in combination with character* or Charakter*, definition*, typ*, and property* or Eigenschaft*. In addition, Open Educational Resources was searched together with scientomet* or Szientomet*, and measur* or Mess*. The specific search queries, using Boolean operators, were adapted to the requirements of the databases. The search results were examined, based on titles and abstracts, for their relevance to the questions about existing indicators for OER and about the characteristics and properties of OER. In addition, the AI Research Assistant Elicit5 was used to search for relevant literature. The search queries were What types of Open Educational Resources (OER) do exist? and What are scientometric indicators for OER? Third, to investigate the characteristics and properties of OER in practice, OER repositories were included in the analysis. OER repositories serve to store OER and make them openly available. The analysis focused on metadata used to describe the essential characteristics and properties of OER. The search for suitable repositories was carried out via Google. German and U.S.-American OER repositories providing OER for higher education and different disciplines were the focus of interest. The choice of these two regions is due to the significant OER movements in each of these countries (Marín & Villar-Onrubia, 2022). The search terms were Open Educational Resources together with repository, Germany or Deutschland, and USA.
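The combination logic of these search terms can be sketched as follows. The snippet is only an illustration: the term lists mirror those named above, but the exact truncation and operator syntax differed between databases and had to be adapted manually.

```python
# Illustrative sketch of how the Boolean queries were combined; the exact
# syntax differs between FIS Bildung, ERIC, BASE, and Google Scholar.

BASE_TERM = '"Open Educational Resources"'

# English/German variants of each search facet (truncated with *)
FACETS = [
    ["character*", "Charakter*"],
    ["definition*"],
    ["typ*"],
    ["property*", "Eigenschaft*"],
    ["scientomet*", "Szientomet*"],
    ["measur*", "Mess*"],
]

def build_queries(base: str, facets: list[list[str]]) -> list[str]:
    """Combine the base term with each facet via AND, variants via OR."""
    queries = []
    for variants in facets:
        facet = " OR ".join(variants)
        if len(variants) > 1:
            facet = f"({facet})"
        queries.append(f"{base} AND {facet}")
    return queries

for query in build_queries(BASE_TERM, FACETS):
    print(query)
```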
The search led to the cross-state working group of OER repositories in higher education, named OER Repo-AG6. One of the work areas of the OER Repo-AG is the development of overarching OER metadata profiles. The website provides links to a metadata profile based on the Learning Object Metadata Standard (LOM), the LOM for Higher Education OER Repositories (OER Repo-AG, 2021), and another profile based on schema.org/Learning Resource Metadata Initiative (LRMI), the General Metadata Profile for Educational Resources (Allgemeines Metadatenprofil für Bildungsressourcen (AMB); Kompetenzzentrum Interoperable Metadaten, 2023). The profiles have been developed in collaboration with the OER Metadata Group of the Competence Centre for Interoperable Metadata (Kompetenzzentrum Interoperable Metadaten (KIM)) and are used by national OER repositories operated by the federal states in Germany. No initiative comparable to OER Repo-AG could be identified in the United States. However, the search led to large OER repositories in the higher education sector. The OER Commons7 and Merlot8 websites, which ranked high in the search results, are comprehensive OER repositories that collect a wide range of materials from different educational institutions in the United States and around the world. As such, they are comparable to German national OER repositories. For this reason, they were selected for further investigation of metadata for OER.
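For illustration, a metadata record along the lines of the schema.org/LRMI-based AMB profile might look roughly as follows. The field selection is a plausible sketch using common schema.org properties, not an authoritative rendering of the AMB profile; the identifier and content values are invented.

```python
# Plausible sketch (not the authoritative AMB profile) of a schema.org/
# LRMI-style metadata record for an OER, expressed as a Python dict.
import json

oer_metadata = {
    "@context": "https://schema.org/",
    "@id": "https://example.org/oer/123",  # hypothetical identifier
    "@type": "LearningResource",
    "name": "Introduction to Scientometrics",
    "creator": {"@type": "Person", "name": "Jane Doe"},
    "license": "https://creativecommons.org/licenses/by/4.0/",
    "inLanguage": "de",
    "learningResourceType": "course",
    "datePublished": "2023-05-01",
}

print(json.dumps(oer_metadata, indent=2))
```

Records of this kind allow repositories such as OER Commons or Merlot to expose characteristics relevant for scientometric analysis (author, license, resource type) in machine-readable form.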

The steps listed served to obtain a comprehensive picture of existing definitions and important characteristics of OER from the literature. OER practice was considered by analyzing OER infrastructures and, in particular, the relevant metadata standards used there. The descriptions of OER characteristics and features found in this way were incorporated into an OER definition suitable for scientometric purposes. The procedure also aimed to capture the various representations of OER in order to develop a categorization, which simplifies the handling of the large number of OER representations in the context of scientometric analysis.

It should be noted that the literature search also yielded numerous sources in languages other than German or English; Spanish-language literature in particular was excluded due to the language barrier. However, as this work focuses specifically and exclusively on the scientometrically relevant characteristics of OER as a measurement object, and not on political or educational influences or on the effects of OER in individual countries or educational systems, no significant distortion of the results is to be expected.

The term OER was coined in 2002 as part of the UNESCO Forum on Open Courseware and documented in the 2012 Paris Declaration on OER. It defines OER as:

teaching, learning and research materials in any medium, digital or otherwise, that are in the public domain or have been released under an open license that permits free access, use, adaptation and redistribution by others with no or limited restrictions. Open licenses are built within the existing framework of intellectual property rights as defined by relevant international conventions and respect the authorship of the work (UNESCO, 2012, p. 1).

OER are therefore materials created explicitly for teaching/learning purposes. However, they also include all other resources that are in principle suitable for teaching and learning and have been made available under an open license. This comprises, for example, research articles and conference papers, as well as research data or software products.

The William and Flora Hewlett Foundation also defines OER as “teaching, learning and research resources that are in the public domain or have been released under an intellectual property license that permits their free use or repurposing by others” (Atkins, Brown, & Hammond, 2007, p. 4). Compared to the UNESCO definition, this definition includes an explicit enumeration of possible types of materials: “Open educational resources include full courses, course materials, modules, textbooks, streaming videos, tests, software and any other tools, materials or techniques used to support access to knowledge” (Atkins et al., 2007, p. 4). By including tools and techniques used to access knowledge, the concept of OER is broadened to, for example, infrastructure components such as Learning Management Systems (LMS).

As defined by the OECD Centre for Educational Research and Innovation (CERI), OER are

digital learning resources made freely and openly available online (although sometimes in print) for teachers, educators, students and independent learners to use, share, combine, adapt and extend in teaching, learning and research. They include learning content, software tools for development, use and distribution, and implementation resources such as open licenses. Learning content is educational material of a wide variety, from complete courses to smaller units such as diagrams or test questions. It may include text, images, audio, video, simulations, games, portals and the like (Orr, Rimini, & van Damme, 2015, p. 17).

In contrast to the other two definitions, only learning resources are subsumed under OER here. Infrastructure components, however, are explicitly included. In addition to legal openness, the OECD-CERI definition also considers technical openness a necessity for OER (Orr et al., 2015).

In all three definitions, openness is seen as an essential characteristic of OER. In terms of use as teaching/learning objects, the dominant position is that all learning and research materials can be potential OER, provided they are openly licensed. The inclusion of research materials leads to a very broad definition of OER. According to this understanding, OER are created at the point of use in a teaching/learning context. The concept of resources is similarly broad. Thus, in addition to typical teaching/learning materials (such as lecture slides and instructional videos), research data, software products, or scientific publications can also be OER. According to the definitions of The William and Flora Hewlett Foundation and OECD-CERI, the ecosystem around OER is also classified as OER.

Based on these definitions, the relevant characteristics of OER are identified below to arrive at a scientometrically usable concept and, at the same time, to identify the challenges of OER as an object of measurement in scientometrics.

3.1. Degrees of Openness

The aspect of openness is central in the context of OER. Kerres and Heinen (2015) divide OER into weak and strong OER, based on the degree of openness of the resources. Weak OER are resources that meet the so-called 2 As (Availability and Accessibility). Availability in this context means that a resource (or its content) is publicly available and can be used free of charge. Accessibility focuses on the absence of technical barriers to practical use. Strong OER are resources that, in addition to the 2 As, grant third parties the rights to reuse, revise, remix with other resources, and redistribute (4 Rs). Legal openness is not a new issue, as the Open Source and Open Science movements show. For the OER sector, Creative Commons (CC) licensing models play a particularly important role in the realization of strong OER. Strong OER can be reused to a very high degree.

CC-0 (all rights granted to third parties), CC-BY (attribution required), and CC-BY-SA (in addition to attribution, a modified/remixed resource may only be licensed under the license of the original resource) are considered the “most open.” The least open CC license is CC-BY-NC-ND, which does not allow subsequent users to use a resource commercially or to make any modifications and is therefore very close to full copyright protection (copyright “All rights reserved”). This form of licensing corresponds to weak OER, which are freely available and accessible, but do not grant further reuse rights. The spectrum of licenses shown in Figure 1 can also be represented as classes of OER openness (see Table 1).

Figure 1.

License spectrum of Creative Commons (Shaddim, 2016).

Table 1.

Classes of OER openness

Openness class              Licensing
High (strong OER)           CC-0, CC-BY, CC-BY-SA
Medium                      CC-BY-NC, CC-BY-NC-SA
Low                         CC-BY-ND, CC-BY-NC-ND
Free of charge (weak OER)   Full copyright protection, but free-of-charge usage (2 As)
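For automated analyses, the mapping in Table 1 can be encoded as a small lookup. This is a sketch of my own, assuming license identifiers are normalized to the short CC form; it is not part of the original classification work.

```python
# Sketch: classify a Creative Commons license identifier into the
# openness classes of Table 1. Anything without a recognized open
# license falls back to the weak-OER class (free of charge at best).

OPENNESS_CLASS = {
    "CC-0": "high (strong OER)",
    "CC-BY": "high (strong OER)",
    "CC-BY-SA": "high (strong OER)",
    "CC-BY-NC": "medium",
    "CC-BY-NC-SA": "medium",
    "CC-BY-ND": "low",
    "CC-BY-NC-ND": "low",
}

def openness_class(license_id: str) -> str:
    return OPENNESS_CLASS.get(license_id.upper(), "free of charge (weak OER)")

print(openness_class("cc-by"))        # high (strong OER)
print(openness_class("CC-BY-NC-ND"))  # low
```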

OER with extensive reuse rights (strong OER) differ in their life cycle in several ways from other scientific output, such as publications in the form of research articles or conference papers. The result is a typical OER life cycle that can be described by the 5 Rs formulated by Wiley (2021). After the creation and publication of an OER, it is ideally first adopted by others as a copy in their own collections of teaching/learning resources (Retain). In a further step, the original OER can be reused unchanged (Reuse) or modified (Revise and Remix). Modifications always create a new version of the original OER, whether through simple corrections of content errors or through more extensive content revisions, adaptations, and redesigns. New versions also arise through remixing, where several modified or unmodified OER are combined into a new OER (Heck et al., 2020; Schröder & Pfänder, 2020). Glahn, Kalz et al. (2010) provide a graphical representation of this life cycle (see Figure 2). To continue the life cycle, new OER versions should in turn be published and made available to others (Redistribute).

Figure 2.

OER Life cycle from Glahn et al. (2010).

Editing and/or remixing creates new versions of the original resource (see Table 2). Except for public domain or CC-0 (or equivalently licensed) material, third parties must explicitly acknowledge the authors of the reused OER, which is comparable to the citation of scientific publications. In the case of CC licenses, this acknowledgment is called attribution and should be structured according to the so-called TASL rule. TASL requires that the title, author, source, and license of the reused OER be identifiable, including a link to the original work, information about the author, and the chosen license terms. The attribution should also state whether the original work has been modified; however, no detailed information about the modifications is necessary. In principle, the recommended structure is the same for all types of material, but there are specific features for the individual use cases of reuse, revise, and remix, as well as for work types and license types (Creative Commons, 2022). The American Psychological Association (APA) recommends a different way of referencing OER: the author, year, title, and source or reference must be given, together with the exact date of retrieval (American Psychological Association, 2020).
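A TASL-conformant attribution line could be assembled as follows. The helper and its field names are a hypothetical illustration of the rule, not an official Creative Commons tool, and the example values are invented.

```python
# Hypothetical helper sketching a TASL attribution: Title, Author,
# Source, License, plus a note when the original was modified.

def tasl_attribution(title: str, author: str, source_url: str,
                     license_name: str, license_url: str,
                     modified: bool = False) -> str:
    note = ", modified from the original" if modified else ""
    return (f'"{title}" by {author} ({source_url}), '
            f"licensed under {license_name} ({license_url}){note}")

print(tasl_attribution(
    "Example Lecture Slides", "Jane Doe",
    "https://example.org/oer/123",
    "CC BY 4.0", "https://creativecommons.org/licenses/by/4.0/",
    modified=True,
))
```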

Table 2.

OER reuse scenarios according to Wiley (2021)

Type of reuse   Description                                                                New versions?
Retain          Make, own, and control copies of the content                               No
Reuse           Use the content in a wide range of ways                                    No
Revise          Adapt, adjust, modify, or alter the content itself                         Yes
Remix           Combine the original or revised content with other open content            Yes
                to create something new
Redistribute    Share copies of the original content, revisions, or remixes with others    No

For weak OER, the OER life cycle does not apply. Copies of a weak OER can still be stored (retained). For these openly accessible but very restrictively licensed learning objects, revision is allowed only to a limited extent: instead of making direct changes to the content, citations must be used, analogous to scientific publications. The remixing of weak OER is also very limited. Under German copyright law, up to 15% of a work may be used for teaching purposes (UrhG, §60a); proper citation is a prerequisite. Remixing is also difficult when OER are open but carry different, mutually incompatible licenses. When redistributing a remix as an OER, the license terms must be checked very carefully. The same applies to licensing a new version after revising an OER: the license of the OER underlying the revision may prohibit certain licensing of the new version as a whole. In this case, the relevant license must be indicated separately for each incompatibly licensed part.
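The license restrictions discussed above can be sketched as a simple permission check. This is my own, deliberately simplified encoding covering only the NC and ND license modules; real compatibility checking (e.g., SA interactions between remixed parts) is considerably more involved.

```python
# Simplified sketch: does a CC license permit a planned form of reuse?
# Only the ND (NoDerivatives) and NC (NonCommercial) modules are
# modeled; SA compatibility between remixed parts is not covered.

def allows(license_id: str, action: str) -> bool:
    """action is one of 'revise', 'remix', or 'commercial'."""
    modules = set(license_id.upper().split("-"))
    if action in ("revise", "remix"):
        return "ND" not in modules   # NoDerivatives forbids new versions
    if action == "commercial":
        return "NC" not in modules   # NonCommercial forbids commercial use
    return True                      # retain/reuse/redistribute are allowed

print(allows("CC-BY-NC-ND", "remix"))  # False: only citation-style reuse
print(allows("CC-BY-SA", "revise"))    # True: strong OER
```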

In addition to legal openness, technical openness is important for the free reuse of OER (Orr et al., 2015; UNESCO, 2021). The Gold Standard for OER Materials (Fabri, Flecks et al., 2020) details for different types of OER what needs to be considered in technical detail when developing and making available different representations, such as slides, podcasts, or entire online courses, to facilitate technical reuse. Technical reusability is also discussed in relation to quality issues (Müskens et al., 2022).

3.2. Resource Type

OER are as diverse in their representations as teaching is across disciplines. Depending on the underlying definition of OER, this results in a wide range of resource types, including research outputs. Margulies (2005, as cited in OECD, 2007, p. 31; see Figure 3) focuses on teaching/learning contexts and has identified several components of OER: tools, content, and implementation resources. Tools include software for developing and delivering resources (content and learning management systems (CMS/LMS), social software such as wikis, and development tools). Content is the actual learning content. It may take the form of learning objects or courseware, or of simple references to learning content (e.g., Google Scholar or wikis).

Figure 3.

Types of OER (Margulies, 2005, as cited in OECD, 2007, p. 31).


Margulies’ representation has a technical character and is reminiscent of standards such as the Sharable Content Object Reference Model (SCORM)9 of the Advanced Distributed Learning Initiative (ADL)10, a reference model for exchangeable digital learning objects. The actual learning content is one category. Additional infrastructure components such as LMS (tools) and technical, legal, and administrative frameworks (implementation resources) are considered enablers of OER, which also reflects the view of others (Atkins et al., 2007; UNESCO, 2019). Transferred to the world of scientometric indicators for OER, the basic idea of these three categories can be adopted. In most cases, scientometric assessments are interested not in the individual representations of the objects of analysis but in performance classes. The aim of OER indicators is to make teaching performance visible and assessable through the creation of teaching/learning materials, thus putting it on an equal footing with scientific output such as research articles. For this reason, the categories Dedicated Learning Content, for didactically prepared teaching/learning materials, and Learning Design Content, for materials that support teachers in designing courses, are proposed. A third category, the OER Ecosystem, includes all the tools and support services that enable the creation, delivery, and reuse of OER and that create an OER-friendly environment for OER authors and users. Figure 4 presents these three overarching OER categories.

Figure 4.

Overarching OER categories for scientometric analysis.


Specific representations of OER were identified above. Table 3 maps typical examples of OER representations to the three suggested overarching categories.

Table 3.

OER representations

OER category: Dedicated Learning Content
Description: Concrete learning content; didactically prepared material that can be used for independent learning
Representation (examples):
  • Textbook (Open Textbook, Open Access Textbook): book series, book, book chapter
  • Course/course unit
  • Instructional text: lecture notes, tutorial, set of slides, comic, other text
  • Learning checks with solutions: task collections/pools, exercise units/sheets, quizzes, multiple/single choice tasks, drill and practice
  • Video: instructional video, lecture recording
  • Visualizations (pictures/photos/illustrations/graphics): simulation, animation, 3D model
  • Audio object: audiobook, podcast, audio collection
  • Case study
  • ePortfolio
  • Glossary
  • Wiki
  • Website
  • Software application
  • Sheet music
  • Educational game
  • Augmented/virtual reality learning
  • Experiment

OER category: Learning Design Content
Description: Material for the development of teaching/learning units by teachers
Representation (examples):
  • Didactic recommendation/best practice
  • Teaching sketch/lesson outline
  • Syllabus
  • Literature list
  • H5P object
  • Assessment tool
  • Data set prepared for teaching purposes
  • Exam material such as exam question collections/pools and solutions
  • Maker space template

OER category: OER Ecosystem (e.g., infrastructure, frameworks, services, funding)
Description: Support services that enable the development, publication, and subsequent use of OER
Representation (examples):
  • OER infrastructure: infrastructure development (repositories, referatories), infrastructure operations, development tools for OER, software systems for the management of OER (content management systems, learning management systems, etc.), development environments for teaching/learning offers
  • Standards: licensing systems, technical standards, indexing techniques (e.g., metadata)
  • OER policies
  • OER funding: projects, individual support services (help with technical implementations, metadata generation, etc.), OER research
  • OER services: information offers, training offers, consulting services, editorials, quality assessments
  • Community work/networking: conferences, workshops

3.3. Authorship

An author is the person responsible, as originator, for creating and publishing a work. Authorship can be shared if more than one person has worked on an artifact (multiauthorship). In scientific publications, the main author responsible for the publication is usually listed first. Coauthors are placed second, third, or elsewhere in the list of authors according to the extent of their contributions. Deviations are possible if, for example, the work was divided equally among all authors and the names are therefore listed alphabetically. Authorship under a group name also exists but is rare. In terms of the original resource, authorship for OER is handled in a similar way. However, there are special features for new versions of the original resource resulting from subsequent use.

Camilleri, Ehlers, and Pawlowski (2014) have identified different types of authorship for OER. User-generated or individually authored OER are resources that can, in principle, be created and shared by anyone, not just field experts. Authorship can be clearly attributed to specific originators, at least in their original form. Organizationally produced OER are developed in professionalized contexts, such as projects, with a comparatively high input of human and technical resources. This type of OER typically involves the creation of several OER of low granularity (e.g., modules) that are bundled into one or more large units, such as courses. Often, individual authorship is also indicated for specific parts of the overall (big) OER, which is published under the name of the funding organization. Organizationally produced big OER must be distinguished from crowd-sourced or peer-produced OER. These are created and developed by a larger community over an unlimited period of time. The involvement of a large number of authors and the ongoing, mostly asynchronous, process of further editing leads to a large number of versions created by many different authors, which can be individually attributed only with enormous effort. Weller (2010) has divided OER into individually produced little OER and institutionally produced big OER. Little OER can also be developed by nonteaching individuals with limited resources, which is why they may not always be of sufficient quality. Big OER are produced as highly granular teaching/learning objects in professional development contexts, such as dedicated projects. Due to the high level of human and material resources involved in their development and production, Weller attributes a higher quality of content and design to them.

In the case of weak OER, authorship remains clearly identifiable. With strong OER, authors assign rights to the original resource through an open license. This allows new versions to be created through revisions or remixes, with the creator of these versions being identified as the new author. The creator of the original resource receives attribution (except under CC0 licensing). With regard to the OER life cycle, there are some peculiarities concerning the minimum level of creation required by copyright law and by CC licenses for authorship to be recognized. Theoretically, it is possible that only minimal changes are made to the original resource in the course of a revision (e.g., correction of spelling mistakes). In a remix, unchanged OER can be combined into a new resource without any further personal contribution. In this case, the minimum level of creation required by copyright law may be lacking, which can be seen as problematic for the fair attribution of services rendered. With the advent of AI-based applications such as ChatGPT, it should be noted that no human authorship can be claimed for teaching/learning materials created solely by an AI. These materials are considered to be in the public domain under German copyright law (Hoeren, 2023).
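The attribution rules sketched above (derivatives carry a new author; upstream creators keep an attribution claim unless they used CC0) can be modeled as a small derivation chain. This is an illustrative sketch, not a legal tool; the class and function names are hypothetical.

```python
from dataclasses import dataclass


@dataclass
class OERVersion:
    """One node in an OER life cycle: an original resource or a derivative."""
    author: str
    license: str                         # e.g., "CC BY", "CC BY-SA", "CC0"
    parent: "OERVersion | None" = None   # the resource this version was derived from


def required_attributions(version: OERVersion) -> list[str]:
    """Walk up the derivation chain and collect authors who must be attributed.

    CC0 waives attribution, so CC0-licensed ancestors are skipped; other
    CC licenses keep the original creator's attribution claim alive.
    """
    names = []
    node = version.parent
    while node is not None:
        if node.license != "CC0":
            names.append(node.author)
        node = node.parent
    return names


original = OERVersion(author="Creator A", license="CC BY")
revision = OERVersion(author="Reviser B", license="CC BY", parent=original)
remix = OERVersion(author="Remixer C", license="CC BY-SA", parent=revision)

print(required_attributions(remix))  # ['Reviser B', 'Creator A']
```

Note that the model deliberately says nothing about the minimum level of creation: "Remixer C" becomes the new author even for trivial changes, which is exactly the attribution problem discussed above.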

Another peculiarity of OER is the consideration of contributions that are not related to the authorship of the content. For example, the German LOM metadata profile for higher education OER repositories (OER Repo-AG, 2021) distinguishes between authors, publishers, and other contributors who have been involved in the creation of a teaching/learning object. These roles include, for example, graphic designers, technical implementers, educational validators, and subject matter experts. German OER repositories using the LOM metadata profile for higher education OER repositories (OER Repo-AG, 2021) sometimes make an even finer classification of contributions. For example, a distinction is made between contributors to the OER itself and contributors to the metadata describing an OER, each in a different role (see Figure 5).
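The role distinctions just described can be sketched as a metadata record that separates contributions to the OER itself from contributions to its describing metadata. Field and role names here are illustrative, not the exact LOM element names.

```python
# Hypothetical record following the LOM-style separation of contribution types.
record = {
    "title": "Introduction to Scientometrics",
    "lifecycle_contributions": [            # contributions to the OER itself
        {"role": "author", "entity": "Jane Doe"},
        {"role": "graphical designer", "entity": "John Roe"},
        {"role": "technical implementer", "entity": "Uni Media Lab"},
        {"role": "publisher", "entity": "Example University"},
    ],
    "metametadata_contributions": [         # contributions to the describing metadata
        {"role": "creator", "entity": "Repository Editorial Team"},
    ],
}


def entities_in_role(rec: dict, role: str) -> list[str]:
    """List everyone credited in a given role for the OER itself."""
    return [c["entity"] for c in rec["lifecycle_contributions"] if c["role"] == role]


print(entities_in_role(record, "author"))  # ['Jane Doe']
```

For scientometric purposes, such a structure makes it possible to count authorship separately from supporting contributions such as design or technical implementation.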

Figure 5.

Details of contributions based on LOM (OER Repo-AG, 2021) (source: ZOERR14).


3.4. Quality

In theory, OER can be created by anyone and made available to others for use. The competence of the creators may be unclear. Teaching/learning objects are subject to high quality requirements, as learners cannot be expected to assess them themselves. Content quality in the form of correctly presented learning material is the minimum requirement. In university teaching, lecturers are generally responsible for the selection and creation of teaching/learning materials. There is usually no external review of the individually selected and self-produced materials, as it is assumed that lecturers have sufficient expertise in their teaching subject. Based on this practice, sufficient quality can be indirectly inferred from the authorship of an OER. Concerning OER, more formal approaches to direct verification are also discussed. These include, for example, the Open Educational Resources (OER) Quality Framework11. The topic of quality also plays a major role for OER infrastructures. In some cases, they have their own peer review procedures, as is the case, for example, with Merlot12. In Germany, a working group has been set up within the Standing Conference of the Ministers of Education and Cultural Affairs of the Länder in the Federal Republic of Germany (Kultusministerkonferenz, KMK)13 to look at quality assurance from systemic, procedural, and, in particular, content perspectives. The systemic perspective deals with the optimal framework conditions for quality assurance assessments. The procedural view deals with the promotion of quality through accompanying measures for the qualification and support of OER authors. The content perspective refers to the quality of the OER themselves and can be considered from different angles. A very comprehensive quality assurance tool and a maximum requirement that considers all aspects of OER is the Instrument for Quality Assurance of OER (IQOer; Müskens et al., 2022).
IQOer includes a pedagogical-didactic dimension with the subcategories Content and Didactic design, and a technical dimension with the subcategories Accessibility and Usability. The subcategory Content includes the criteria Scientific basis, Target group orientation, and Reusability of content. The subcategory Didactic design includes the criteria Alignment, Collaboration and Interaction, Application and Transfer, Assistance and Support, as well as Assessment and Motivation. Accessibility is measured by the criteria CC license (legal openness), accessibility for people with disabilities, reliability and compatibility, and technical reusability. Usability is measured by the criteria structure, navigation and orientation, design and readability, and interactivity. There is no distinction between mandatory and optional criteria. In practice, IQOer has not yet been widely applied. The materials available in higher education OER repositories are usually created by lecturers at universities or by dedicated OER development projects. Before publication, most of these materials undergo editorial review, including a formal quality check that takes authorship into account. By checking authorship in this way, it can be assumed that the quality of OER made available through institutional OER repositories is at least equivalent in terms of content to other materials used in university teaching. Another option is to take user ratings into account. However, in the case of a completely open-ended quality assessment (e.g., through the awarding of stars), the assessment criteria used and the competence of the assessors remain unclear.
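The IQOer structure described above can be sketched as a nested mapping. The criterion names are paraphrased from the text; the published instrument defines them in more detail.

```python
# IQOer dimensions, subcategories, and criteria as named in the text (abridged).
IQOER = {
    "pedagogical-didactic": {
        "Content": ["Scientific basis", "Target group orientation",
                    "Reusability of content"],
        "Didactic design": ["Alignment", "Collaboration and Interaction",
                            "Application and Transfer", "Assistance and Support",
                            "Assessment and Motivation"],
    },
    "technical": {
        "Accessibility": ["CC license (legal openness)",
                          "Accessibility for people with disabilities",
                          "Reliability and compatibility",
                          "Technical reusability"],
        "Usability": ["Structure", "Navigation and orientation",
                      "Design and readability", "Interactivity"],
    },
}


def criteria_count(instrument: dict) -> int:
    """Total number of criteria across all dimensions and subcategories."""
    return sum(len(crits) for subs in instrument.values() for crits in subs.values())


print(criteria_count(IQOER))  # 16
```

Since IQOer draws no line between mandatory and optional criteria, a scientometric application would have to decide separately which criteria count as the minimum requirement.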

3.5. Granularity

An important feature of OER is granularity. Kerres and Heinen (2015) introduce four different levels of granularity: Textbook, Learning Unit, Learning Material, and Learning Asset. The Textbook represents the most complex resource, which can consist of different Learning Units and additional material. The Learning Unit covers a wider range of topics and includes, for example, the content to be taught in an entire semester. The Learning Material level represents material that can be used for a specific learning objective; it might be a worksheet or a collection of tasks on a particular topic. Learning Assets are artifacts that are primarily created for teaching/learning purposes (e.g., videos, photos, and documents). This level also includes scientific publications and research data.

The LOM metadata profile for Higher Education OER repositories (OER Repo-AG, 2021) includes four different levels of granularity. Aggregation level 1 is for single atomic materials. Level 2 is for multiple atomic materials that together form a unit. Level 3 contains larger collections of Level 2 learning objects. Level 4 is the highest level and represents an entire set of courses leading to a degree.

For scientometric purposes, the four LOM levels of aggregation can be adopted (see Table 4). Textbooks should be included at Level 3.

Table 4.

Comparison of granularities

Kerres & Heinen | LOM aggregation level | Examples
Learning asset | Single, atomic learning objects (Level 1) | Photo, graphic
Learning material | Units of atomic learning objects from Level 1 (Level 2) | Exercise sheet
Learning unit / Textbook | Collections of units from Level 2 (Level 3) | Chapter, learning module, textbook as a collection of chapters
– | Set of courses leading to a degree (Level 4) | Certificate course
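The correspondence in Table 4 can be expressed as a small lookup. This is a sketch; the term-to-level assignment follows the table, with textbooks placed at Level 3 as suggested above.

```python
# Mapping of the Kerres & Heinen granularity terms to LOM aggregation levels,
# following Table 4 (textbooks are treated as Level 3).
KERRES_HEINEN_TO_LOM = {
    "learning asset": 1,
    "learning material": 2,
    "learning unit": 3,
    "textbook": 3,
}


def lom_level(granularity: str) -> int:
    """Return the LOM aggregation level for a Kerres & Heinen granularity term."""
    return KERRES_HEINEN_TO_LOM[granularity.lower()]


print(lom_level("Textbook"))  # 3
```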

3.6. Pedagogical Characteristics

Pedagogical characteristics can be represented by different types of information about an educational resource that teachers find helpful (Tischler, Heck, & Rittberger, 2022). In Germany, the very sophisticated subject classification system of the Statistisches Bundesamt, the federal statistical office of Germany, is often used to assign OER to a specific subject area (Statistisches Bundesamt, 2023). International repositories such as Merlot or OER Commons use categories with varying levels of detail. There is no general international standard for pedagogical characteristics.

The General Metadata Profile for Educational Resources (Allgemeines Metadatenprofil für Bildungsressourcen (AMB); Kompetenzzentrum Interoperable Metadaten, 2023) provides OER authors with a wide range of options for the pedagogical description of educational resources through categories on audience, educational level, and competences. For example, the category audience includes roles (values) administration, general public, mentor, parent, peer tutor, professional, student, and teacher. Under Educational Level, information about the intended educational level can be provided. The vocabulary includes the values early childhood education, primary education, lower or upper secondary education, postsecondary nontertiary education, short-cycle tertiary education, university (bachelor or equivalent, master or equivalent, doctorate or equivalent), preparatory service, and advanced training. In addition, information can be provided on competence requirements (competencyRequired), competences to be taught (teaches), or competences to be assessed (assesses). Elsewhere, pedagogical information is much leaner. The LOM metadata profile for Higher Education OER repositories (OER Repo-AG, 2021) provides only the resource type in addition to the subject matter. The international repository OER Commons provides information under Primary Use with the values Student, Teacher, Administrator, Parent, Librarian, and Other, which are comparable to the AMB Audience. The U.S. repositories also have educational use categories.
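An AMB-style description using the categories named above might look like the following sketch. The property names follow the categories mentioned in the text (audience, educational level, competences) but are not necessarily the profile's exact JSON keys.

```python
# Illustrative AMB-style record; keys are paraphrased, values use the
# controlled vocabularies named in the text.
amb_record = {
    "name": "Statistics Exercise Collection",
    "audience": ["student", "teacher"],
    "educationalLevel": ["university (bachelor or equivalent)"],
    "teaches": ["descriptive statistics"],
    "competencyRequired": ["basic algebra"],
}

# Audience vocabulary as listed in the text.
AUDIENCE_VOCAB = {"administration", "general public", "mentor", "parent",
                  "peer tutor", "professional", "student", "teacher"}


def audience_is_valid(record: dict) -> bool:
    """Check that all audience values come from the controlled vocabulary."""
    return set(record["audience"]) <= AUDIENCE_VOCAB


print(audience_is_valid(amb_record))  # True
```

Such vocabulary checks matter for scientometric harvesting: only values from a shared controlled vocabulary can be aggregated consistently across repositories.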

3.7. Usage Data

The downloading and reuse of OER in new teaching/learning contexts (storage and/or unmodified use), or a revision or remix with other OER by third parties generates usage data in the form of views and downloads, citations, or attributions. In terms of scientometric analysis, these data are particularly interesting for evaluation purposes. OER infrastructures should allow users to provide feedback on OER (Atenas & Havemann, 2013; Heck et al., 2020; Motz & Tansini, 2014). From the perspective of scientometrics, user feedback is of interest, especially with regard to resonance indicators and altmetrics. Not all OER repositories offer such facilities. The U.S. repository OER Commons can be referenced as a good example for user feedback functions. Views and downloads are displayed for each OER. Users also have the opportunity to comment and rate quality through ratings or structured reviews through a standardized process. However, there is a great deal of variation between repositories in terms of user feedback features.
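Aggregating the usage data named above (views, downloads, ratings) per OER could look like the following sketch. The event format is hypothetical; real repositories export usage data in their own formats.

```python
from collections import defaultdict

# Hypothetical usage events as a repository might export them;
# event types follow the categories named in the text.
events = [
    {"oer_id": "oer-1", "type": "view"},
    {"oer_id": "oer-1", "type": "view"},
    {"oer_id": "oer-1", "type": "download"},
    {"oer_id": "oer-2", "type": "view"},
    {"oer_id": "oer-1", "type": "rating", "stars": 4},
]


def usage_summary(event_list):
    """Aggregate views, downloads, and ratings per OER."""
    summary = defaultdict(lambda: {"views": 0, "downloads": 0, "ratings": []})
    for e in event_list:
        s = summary[e["oer_id"]]
        if e["type"] == "view":
            s["views"] += 1
        elif e["type"] == "download":
            s["downloads"] += 1
        elif e["type"] == "rating":
            s["ratings"].append(e["stars"])
    return dict(summary)


print(usage_summary(events)["oer-1"])
```

Because repositories vary in which feedback features they offer, any cross-repository aggregation of this kind will contain systematic gaps.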

As outlined in the previous sections, OER are complex objects with peculiarities that impose challenges for scientometric assessments. Owing to the varying definitions, it needs to be clarified which understanding of OER will form the basis for further work on scientometric OER indicators. OER indicators are intended to make performance in academic teaching visible. This requires that the objects of measurement were created specifically for teaching/learning purposes, not merely that they can, in principle, be used in teaching/learning situations. An overly broad interpretation of OER that includes research outputs would not achieve this measurement objective. The focus should therefore be on materials created explicitly for teaching/learning purposes.

The minimum requirement for OER should be free access and free availability of a resource (weak OER according to Kerres & Heinen, 2015), which makes a wider range of teaching efforts accessible through OER indicators. Openly licensed teaching/learning objects that are only available to a limited group of people with specific access rights (e.g., university members with LMS accounts) cannot therefore be considered OER. OER are to be equated with scientific publications, which are only recognized if they are published.

The type of resource is a key feature of OER. In addition to teaching/learning resources, a supporting OER Ecosystem is important and should be recognized in scientometric assessments. From a definitional point of view, it is not the individual representation of OER that is of interest, but the categories of types. A suggestion for three OER categories (Dedicated Learning Content, Learning Design Content, OER Ecosystem) is given in Table 3 and is used below.

OER are often not aimed at a specific target group. However, they are always also aimed at learners, who cannot be expected to provide independent quality control. Sufficient quality of OER is therefore crucial. The minimum requirement is the accuracy of the content of the teaching/learning material. This mandatory criterion can be ensured in different ways. First, it can be assumed that all materials published by an educational institution have undergone some form of quality assurance process and therefore meet the minimum requirement. Second, quality can be verified and documented in a traceable way through a formal quality assurance process (e.g., IQOer). In addition to or instead of such a quality assurance process, user feedback on resources can be taken into account. However, because the competences of the evaluators and, in most cases, the evaluation criteria are not traceable, this approach has limitations. In the context of OER quality, it needs to be clarified whether positive results of a formal quality assessment should have an impact on the scientometric recording of OER. A higher value for high-quality OER could be realized through higher weighting. In this case, an appropriate factor would have to be defined.
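A weighted count along the lines suggested above could look like the following sketch. The weighting factor is an assumption for illustration only; the text deliberately leaves its value open.

```python
# Assumed weighting factor for OER with a documented formal quality assessment
# (e.g., IQOer); the appropriate value would have to be defined.
QUALITY_WEIGHT = 1.5


def weighted_oer_score(oer_list) -> float:
    """Count OER, weighting formally quality-assured resources more heavily."""
    return sum(QUALITY_WEIGHT if oer["formally_assessed"] else 1.0
               for oer in oer_list)


portfolio = [
    {"title": "Module A", "formally_assessed": True},
    {"title": "Worksheet B", "formally_assessed": False},
]
print(weighted_oer_score(portfolio))  # 2.5
```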

Authorship is of central importance for scientometric purposes, as it is the only way to attribute services to the actual service providers. The OER life cycle refers to openly licensed OER that can be modified or remixed with other OER by third parties as part of reuse. In these cases, a new resource is created under the authorship of the subsequent user. This is the case even if minimal changes are made to the original resource. To ensure fair attribution, a sufficient level of creation for resources is mandatory. This requires an appropriate qualitative process that examines versions of OER created through revision or remixing. Such verification is necessary to prevent the mass production of OER without significant personal contribution. Criteria need to be defined for assessing when there is a sufficient level of authorship.

Against this background, the following OER definition for scientometric assessments is proposed:

OER are publicly available, freely accessible materials that have been created specifically for teaching/learning purposes and are of sufficient quality and level of creation. OER are divided into the categories of Dedicated Learning Content for learning materials primarily intended for learners, and Learning Design Content for supporting materials for teachers. OER also include contributions to a supportive OER Ecosystem that facilitates the creation, use and dissemination of OER. This includes, for example, infrastructure elements developed for open teaching/learning purposes, such as OER repositories, OER-supportive working environments, editorial work, consulting services, training and other support services for OER authors.

With the identified characteristics of OER come challenges and necessary decisions on how scientometric assessments should be designed in detail. The legal openness of OER can vary due to different licensing options. The CC licenses, which are already divided into four classes of openness (see Table 1), are suitable for capturing the different degrees of openness. The four classes may be of interest in the context of exploratory analyses of the distribution of OER across different degrees of legal openness. In evaluative measurements, the question of whether OER with a higher degree of openness should be classified as having a higher value needs to be clarified. If so, a weighting factor should be determined. It should be noted that when other licensing models are used, there is a need to ensure consistent treatment of differently licensed OER. This can be done by mapping licenses that are comparable in their degree of openness.
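The license mapping suggested above can be sketched as a lookup that first resolves non-CC licenses to a CC license of comparable openness. The assignment of licenses to the four classes is an assumption for illustration, since Table 1 is not reproduced here.

```python
# Hypothetical assignment of CC licenses to four openness classes
# (1 = most open); the actual classes are defined in Table 1.
OPENNESS_CLASS = {
    "CC0": 1, "CC BY": 1,
    "CC BY-SA": 2,
    "CC BY-NC": 3, "CC BY-NC-SA": 3,
    "CC BY-ND": 4, "CC BY-NC-ND": 4,
}

# Non-CC licenses mapped to a CC license of comparable openness (assumed).
LICENSE_MAPPING = {
    "GNU FDL": "CC BY-SA",
}


def openness_class(license_name: str) -> int:
    """Resolve a license to its openness class, mapping non-CC licenses first."""
    return OPENNESS_CLASS[LICENSE_MAPPING.get(license_name, license_name)]


print(openness_class("GNU FDL"))  # 2
```

The design point is that the mapping table, not the scoring logic, carries the judgment about comparability, so it can be reviewed and adjusted independently.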

The type of authorship is very interesting from a scientometric point of view for both exploratory and evaluative purposes. In addition to the authors responsible for the content, there may be other contributors, such as quality managers or programmers, depending on the context of creation and the complexity of an OER. In the case of professionally produced OER, this may also include funding institutions. How these different groups of contributors should be addressed in the context of exploratory and evaluative analyses needs to be clarified. In this context, the OER sustainability models identified by Fahrer and Heck (2023), which include 12 different funding options, are interesting. These include the government model, the institutional model, the contract model, the network model, and the foundation model, in which funders at different levels contribute to the funding of OER, either individually or together with other actors. Funding models based on donations (donation model, crowdfunding model) are also relevant, as are those that are partly commercial (freemium model, subscription model, service model, sponsorship model). These models can be helpful in classifying institutional donors.

Different granularities of OER raise the question of the level of production effort. Although it is not possible to draw a general conclusion from granularity to production effort, it can usually be assumed that higher granularity OER were more time-consuming to produce than lower granularity OER. As OER indicators are intended to make teaching performance visible, it makes sense to capture different levels of granularity in order to recognize higher levels of effort in OER production. It is therefore useful to categorize OER according to their granularity. In relation to granularity, it should be clarified whether highly granular OER should be weighted more heavily. If so, a weighting factor should be defined. Granularity also raises the question of whether OER or parts of OER can be taken into account more than once in a scientometric assessment. High-granularity OER, such as whole courses (e.g., massive open online courses, MOOCs), are usually made up of many smaller elements that can also be used individually as low-granularity OER. There are two options for measuring them: teaching/learning objects can be counted only once (either as low-granularity OER or as part of a larger OER), or all OER can be counted regardless of their potential multiple uses, as long as they are openly accessible. Given that open learning resources are intended for reuse, and that it is time-consuming in practice to check whether a low-granularity OER is included in a higher granularity OER, counting all published, openly accessible OER is the best approach.

For consistent measurement, it makes sense to develop an overarching subject classification system to which the various subject specifications in individual metadata standards can be mapped. This also applies to pedagogical data, which is particularly interesting for research purposes.

Teaching varies greatly between subject areas and even within a subject area in terms of specific teaching areas. This is expressed in characteristic resource types and granularities, as well as in different update cycles for teaching/learning materials. A scientometric assessment of OER needs to take into account the specificities of teaching between and within subjects, so as not to disadvantage teachers of certain subjects in the case of cross-curricular areas.

Reuse of OER requires attribution, similar to citation. In the case of scientific publications, citation is an ingrained practice. The citation culture actually practiced in the case of teaching/learning materials has not yet been researched. It is therefore unclear whether reused teaching/learning materials are also attributed or cited to the same extent. The actual practice of attribution/citation of OER is an important issue that needs to be clarified, especially for the collaboration and resonance indicators.

In developing OER indicators, different levels of assessment (e.g., personal achievement of individuals or groups of individuals and institutional achievement) should be distinguished against the background of the three proposed categories Dedicated Learning Content, Learning Design Content, and OER Ecosystem.

Currently, OER are almost exclusively published via specific OER infrastructures. The metadata used there does not yet include the separate representation of attributions and/or citations. As a result, analyses of relationships between OER and/or other scientific artifacts such as research articles are not possible without considerable effort. This deficiency in the metadata profiles, as well as the lack of indexing in the reference systems generally used for scientometric analyses, such as Scopus or Web of Science, poses significant practical challenges for scientometric analysis, for which a solution is needed.

OER as achievements in teaching can basically be understood as the equivalent of scientific publications (research articles, conference papers) in research. They can be equally recorded via scientometric analysis within the scientific reward and recognition system. OER are defined in different ways in the literature. In most cases, a very broad understanding of the term is used, classifying OER as any reusable material that can be used for teaching/learning purposes, including research material. In addition, services that support the creation, provision, and reuse of OER (OER Ecosystem) are predominantly counted as OER. For scientometric purposes aimed at the assessment of teaching performance as an equivalent to performance assessment in research, a narrower definition of OER is needed that includes the surrounding ecosystem but focuses on materials created specifically for teaching/learning purposes. A definition of OER suitable for scientometric purposes is suggested in Section 4.

From a scientometric perspective, legal openness is of particular interest. In the field of OER, the open licenses of CC have prevailed. Depending on the license, there are different reuse rights that reflect the degree of openness. Four classes of openness can be defined for OER (see Table 1). In the course of scientometric analysis it has to be clarified if a higher degree of openness should have positive consequences.

OER can have many different representations. Cross-cutting classes can be formed for scientometric measurements. At the top level, the categories Dedicated Learning Content, Learning Design Content, and OER Ecosystem are proposed. A collection of typical OER representations in these three categories is presented in Table 3.

OER are complex artifacts. The OER life cycle results in special features for scientometric measurements compared to scientific publications. An important point is a sufficient level of creation for OER. Only authors who have invested personal effort in the creation of new OER or in the revision or remixing of existing OER should be rewarded. Regarding authorship, it needs to be clarified how to deal with different forms of contribution (authorship vs. supporting tasks). Different levels of contribution (individual vs. institutional) should also be taken into account.

OER do not always have a clear target audience and in many cases can be useful for both teachers and learners. As learners cannot be expected to carry out independent quality control, there is a need to ensure sufficient quality, at least in terms of content accuracy. Only OER that meet a minimum level of quality and are therefore suitable for use as learning materials should be included in scientometric analysis.

Depending on their design, OER can have different granularities, from which the scope of an OER and the efforts for its creation can be derived. To ensure comparable measurement, it is useful to consider granularity in scientometric studies. The LOM level of aggregation (see Table 4) provides a manageable classification that can be applied.

Usage data such as views, downloads, and citations or attributions are important for both evaluative (resonance) and explorative (collaboration structures) analyses.

Depending on the objectives of a scientometric analysis, other properties of OER may be of interest. These include, for example, subject classifications or pedagogical characteristics, such as the main target group. Figure 6 summarizes the main OER characteristics.

Figure 6.

Main OER features of interest for scientometric analysis.


As an established method of performance assessment, scientometrics is currently under criticism. At the same time as there are calls for a broadening of the portfolio of performance assessment, there is a move towards qualitative methods of measurement. It is undisputed that performance recording and measurement should not be exclusively quantitative. It is also agreed that scientometric methods need to be applied and interpreted with expertise, which requires metrics literacy (see the Leiden Manifesto: Hicks, Wouters et al., 2015). However, it is equally clear that the science system cannot achieve a fair assessment of scientific performance through qualitative assessments alone, especially in the context of the ever-increasing volume of output. Quantitative methods will therefore continue to play an essential role in performance assessment and research contexts. Against this background, the development of indicators for OER as a teaching-related scholarly output class is worthwhile. In this way, teaching efforts can be valorized and contribute to a better visibility of teaching performance. OER could be given a place in existing reward and recognition systems alongside other forms of scientific output such as research articles or conference papers. This should also be reflected in the fact that OER are considered as artifacts in the data models of citation databases, in order to make them easily indexable and accessible for scientometric analysis. For a stable foundation, the development of scientometric OER indicators must take into account the manifold specificities of OER.

The author has no competing interests.

No funding was received for this research.

Not applicable.

2. Education Resources Information Center (ERIC): https://eric.ed.gov/.
3. Bielefeld Academic Search Engine (BASE): https://www.base-search.net/.
4. Google Scholar: https://scholar.google.de/.
5. Elicit: https://elicit.com.
6. OER Repo AG: https://www.oer-repo-ag.de/.
7. OER Commons: https://oercommons.org/.
9. Sharable Content Object Reference Model (SCORM): https://adlnet.gov/past-projects/scorm/.
10. Advanced Distributed Learning (ADL): https://adlnet.gov/.
11. Open Educational Resources (OER) Quality Framework: https://oercommons.org/courseware/lesson/89736/overview?section=2.
13. KMK Working Group OER Quality: https://www.oer-repo-ag.de/oer-qualitaet/.

Admiraal
,
W.
(
2022
).
A typology of educators using open educational resources for teaching
.
International Journal on Studies in Education
,
4
(
1
),
1
23
.
American Psychological Association (APA)
. (
2020
).
Open educational resource references
. https://apastyle.apa.org/style-grammar-guidelines/references/examples/open-educational-resource-references
Atenas, J., & Havemann, L. (2013). Quality assurance in the open: An evaluation of OER repositories. International Journal for Innovation and Quality in Learning, 1(2), 22–34. https://core.ac.uk/download/pdf/141219719.pdf

Atkins, D., Brown, J., & Hammond, A. (2007). A review of the open educational resources (OER) movement: Achievements, challenges, and new opportunities. The William and Flora Hewlett Foundation. https://www.hewlett.org/wp-content/uploads/2016/08/ReviewoftheOERMovement.pdf

Camilleri, A., Ehlers, U., & Pawlowski, J. (2014). State of the art review of quality issues related to open educational resources (OER). Publications Office of the European Union.

Creative Commons. (2022). Best practices for attribution. https://wiki.creativecommons.org/wiki/Best_practices_for_attribution#Attributing_text

D’Antoni, S. (2009). Open educational resources: Reviewing initiatives and issues. Open Learning, 24(1), 3–10.

Ehlers, U. (2011). Extending the territory: From open educational resources to open educational practices. Journal of Open, Flexible, and Distance Learning, 15(2), 1–10.

European Commission. (2017). Evaluation of research careers fully acknowledging Open Science practices: Rewards, incentives and/or recognition for researchers practicing Open Science. Publications Office of the European Union.

Fabri, B., Flecks, J., Koruca, N., & Nguyen, P. (Eds.). (2020). Der Gold-Standard für OER-Materialien [The gold standard for OER materials]. Verlag ZLL21.

Fahrer, S., & Heck, T. (2023). E 12 Open educational resources. In R. Kuhlen, D. Lewandowski, W. Semar, & C. Wormser-Hacker (Eds.), Grundlagen der Informationswissenschaft [Foundations of information science] (pp. 735–744). De Gruyter Saur.

German Research Foundation (Deutsche Forschungsgemeinschaft, DFG). (2022a). Academic publishing as a foundation and area of leverage for research assessment: Challenges and fields of action. Zenodo.

German Research Foundation (Deutsche Forschungsgemeinschaft, DFG). (2022b). Open Science as part of research culture: Positioning of the German Research Foundation. Zenodo.
Glahn, C., Kalz, M., Gruber, M., & Specht, M. (2010). Supporting the reuse of open educational resources through open standards. In T. Hirashima, A. F. M. Ayub, L.-F. Kwok, S. L. Wong, S. C. Kong, & F.-Y. Yu (Eds.), Workshop Proceedings of the 18th International Conference on Computers in Education. https://oerknowledgecloud.org/content/supporting-reuse-open-educational-resources-through-open-standards

Heck, T., Kullmann, S., Hiebl, J., Schröder, N., Otto, D., & Sander, P. (2020). Designing open informational ecosystems on the concept of open educational resources. Open Education Studies, 2(1), 252–264.

Hegarty, B. (2015). Attributes of open pedagogy: A model for using open educational resources. Educational Technology, 55(4), 3–13. https://www.jstor.org/stable/44430383

Hicks, D., Wouters, P., Waltman, L., de Rijcke, S., & Rafols, I. (2015). Bibliometrics: The Leiden Manifesto for research metrics. Nature, 520, 429–431.

Hiebl, J., Kullmann, S., Heck, T., & Rittberger, M. (2023). Reflecting open practices on digital infrastructures: Functionalities and implications of knowledge. In D. Otto, G. Scharnberg, M. Kerres, & O. Zawacki-Richter (Eds.), Distributed learning ecosystems: Concepts, resources, and repositories (pp. 203–225). Springer VS.

Hoeren, T. (2023). Rechtsgutachten zum Umgang mit KI-Software im Hochschulkontext [Legal opinion on the use of AI software in the higher education context]. In P. Salden & J. Leschke (Eds.), Didaktische und rechtliche Perspektiven auf KI-gestütztes Schreiben in der Hochschulbildung [Didactic and legal perspectives on AI-supported writing in higher education] (pp. 22–41). Ruhr-University Bochum.

Kerres, M., & Heinen, R. (2015). Open informational ecosystems: The missing link for sharing resources for education. International Review of Research in Open and Distributed Learning, 16(1).
Kompetenzzentrum Interoperable Metadaten (KIM). (2023). Allgemeines Metadatenprofil für Bildungsressourcen (AMB) [General metadata profile for educational resources]. https://dini-ag-kim.github.io/amb/draft/

Luo, T., Hostetler, K., Freeman, C., & Stefaniak, J. (2019). The power of open: Benefits, barriers, and strategies for integration of open educational resources. Open Learning: The Journal of Open, Distance and e-Learning, 35(2), 140–158.

Margulies, A. H. (2005, September). MIT OpenCourseWare—A new model for open sharing. Presentation at the OpenEd Conference at Utah State University.

Marín, V. I., & Villar-Onrubia, D. (2022). Online infrastructures for open educational resources. In Handbook of open, distance and digital education (pp. 1–20). Springer.

Mayrberger, K., & Thiemann, S. (2018). Jenseits von Selbstreferenzialität: Awareness for Openness @ UHH [Beyond self-referentiality: Awareness for Openness @ UHH]. Synergie – Fachmagazin für Digitalisierung in der Lehre, #5. https://www.synergie.uni-hamburg.de/de/media/ausgabe05/synergie05-beitrag18-mayrberger-thiemann.pdf

Mishra, S. (2017). Open educational resources: Removing barriers from within. Distance Education, 38(3), 369–380.

Motz, R., & Tansini, L. (2014). Evaluating OER repositories. In Interacción ’14: Proceedings of the XV International Conference on Human Computer Interaction (Article No. 95, pp. 1–4).

Müskens, W., Zawacki-Richter, O., & Dolch, C. (2022). Quality assurance tool for OER – IQOer – Development version 17.

OECD. (2007). Giving knowledge for free: The emergence of open educational resources. OECD Publishing.

OER Repo-AG. (2021). LOM for higher education OER repositories. https://dini-ag-kim.github.io/hs-oer-lom-profil/latest

Orr, D., Rimini, M., & van Damme, D. (2015). Open educational resources: A catalyst for innovation. Educational Research and Innovation, OECD Publishing.

Riar, M., Mandausch, M., Henning, P., & Voss, H.-P. (2020). Incentives and barriers to the use and publication of OER in higher education: A literature review and empirical investigation. In Higher education didactics as a professional link between research, policy and practice (pp. 109–123).
Santos-Hermosa, G., Ferran-Ferrer, N., & Abadal, E. (2017). Repositories of open educational resources: An assessment of reuse and educational aspects. International Review of Research in Open and Distributed Learning, 18(5).

Schröder, N., & Donat, S. (2022). Practices of university teachers in dealing with open educational resources. MedienPädagogik: Zeitschrift für Theorie und Praxis der Medienbildung, 96–112.

Schröder, N., & Pfänder, P. (2020). Using GitHub for open educational resources. In Gesellschaft für Informatik e. V. (Ed.), DELFI 2020 – The 18th Educational Technologies Conference of the Gesellschaft für Informatik e. V. (pp. 337–342). https://dl.gi.de/handle/20.500.12116/34180

Shaddim. (2016). Creative commons license spectrum [Graphic]. https://commons.wikimedia.org/wiki/File:Creative_commons_license_spectrum.svg

Shettar, I., Hadagali, G. S., & Bulla, S. D. (2019). A scientometric analysis on the world literature on MOOCs. In Tumkur University (Ed.), Library in the life of the user: Proceedings of 9th KSCLA National Conference (pp. 582–587).

Shettar, I., Hadagali, G. S., & Shokeen, A. (2021). A scientometric analysis of global literature on open educational resources. In B. R. Babu, C. Krishnamurthy, & G. S. Hadagali (Eds.), Libraries and resource management in the knowledge society (pp. 341–356). Shree Publishers.

Statistisches Bundesamt. (2023). Bildung und Kultur. Studierende an Hochschulen. Fächersystematik [Education and culture. Students at higher education institutions. Subject classification]. Fachserie 11, Reihe 4.1. https://www.destatis.de/DE/Methoden/Klassifikationen/Bildung/studenten-pruefungsstatistik.pdf?__blob=publicationFile

Tillinghast, B. (2020). Developing an open educational resource and exploring OER-enabled pedagogy in higher education. IAFOR Journal of Education: Technology in Education, 8(2).

Tischler, F., Heck, T., & Rittberger, M. (2022). Nützlichkeit und Nutzbarkeit von Metadaten bei der Suche und Bereitstellung von offenen Bildungsressourcen [Usefulness and usability of metadata in the search for and provision of open educational resources]. Information – Wissenschaft & Praxis, 73(5–6), 253–263.

UNESCO. (2012). 2012 Paris Declaration: World Open Educational Resources (OER) Congress. https://unesdoc.unesco.org/ark:/48223/pf0000246687

UNESCO. (2019). Recommendation on open educational resources (OER). https://unesdoc.unesco.org/ark:/48223/pf0000373755/PDF/373755eng.pdf.multi.page=10

UNESCO. (2021). UNESCO Recommendation on open science. https://unesdoc.unesco.org/ark:/48223/pf0000379949.locale=en

Verein Forum Neue Medien in der Lehre Austria (FNMA). (2022). OER-Zertifikate [OER certificates]. https://www.oer-zertifikat.at/oer/de/zertifizierung

VSNU, NFU, KNAW, NWO, & ZonMw. (2019). Room for everyone’s talent: Towards a new balance in the recognition and rewards of academics. https://www.nwo.nl/sites/nwo/files/media-files/2019-Recognition-Rewards-Position-Paper_EN.pdf

Weller, M. (2010). Big and little OER. In Open Ed 2010 Proceedings. https://hdl.handle.net/10609/4851
Weller, M., de los Arcos, B., McAndrew, P., & Pitt, R. (2018). Identifying categories of open educational resource users. International Journal of Open Educational Resources, 1(1).

Wiley, D. (2021). Defining the “Open” in open content and open educational resources. In Y. Arts, H. Call, M. Cavan, T. P. Holmes, R. Rogers, S. Tuiloma, L. West, & R. Kimmons (Eds.), An introduction to open education. EdTech Books. https://edtechbooks.org/open_education/defining_the_open

Wiley, D., & Hilton, J. L. (2018). Defining OER-enabled pedagogy. International Review of Research in Open and Distance Learning, 19(4).

Zawacki-Richter, O., & Mayrberger, K. (2017). Qualität von OER: Internationale Bestandsaufnahme von Instrumenten zur Qualitätssicherung von Open Educational Resources (OER) – Schritte zu einem deutschen Modell am Beispiel der Hamburg Open Online University [Quality of OER: An international survey of instruments for the quality assurance of open educational resources (OER) – steps toward a German model using the example of the Hamburg Open Online University]. Synergie – Fachmagazin für Digitalisierung in der Lehre. https://www.synergie.uni-hamburg.de/media/sonderbaende/qualitaet-von-oer-2017.pdf

Author notes

Handling Editor: Vincent Larivière

This is an open-access article distributed under the terms of the Creative Commons Attribution 4.0 International License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited. For a full description of the license, please visit https://creativecommons.org/licenses/by/4.0/legalcode.