Comments to Jean-Claude Burgelman's article Politics and Open Science: How the European Open Science Cloud Became Reality (the Untold Story)

Is the evolution of how science is done primarily heuristic, or is it driven by policy makers? Reading the intriguing story of the inception of the European Open Science Cloud (EOSC) by J-C Burgelman, one could assume the latter. But the story also reveals that the policy makers' shaping of concepts and planning of funding models is largely a reaction to the turns of the broad stream of research. Seen from the science side, the trend towards data-intensive science started already in the 1990s, with advancements particularly in sensor technologies rapidly increasing the amount of data collected. The development of charge-coupled devices (CCDs) and their impact on astronomy is the often-cited example, but the trend was building up across experimental research, which benefited from developments in fast electronics and communication and from leaps in efficiency when shifting experimental technology, e.g. going from chromatography methods to fluorescence-based methods for genome sequencing. On top of this, the introduction of parallel supercomputers built almost entirely from commercial off-the-shelf components made it possible to produce and analyse data in an unprecedentedly cost-efficient way.
The resulting, obvious need to combine and interoperate various networked resources was met by computer science, and perhaps the most successful work was concluded in "The Grid: Blueprint for a New Computing Infrastructure". This result of work during the 1990s by Foster and Kesselman became immensely popular, even in industry. Yet, on the other side of the IT bubble, industry took another route: the introduction of Service-Oriented Architecture (SOA), web services, and cloud computing became the industry-standard implementation of the ubiquitous-computing promise of "The Grid". Soon this also became part of the researchers' toolbox, meaning that more complex workflows could be realised by connecting

remote experiments, data resources, computing facilities, and whatever else was needed to reach the objectives of, and advance, the research enterprise.
The European Commission supported a large number of Grid projects during the first decade of the millennium, and the technology and concepts were particularly adopted by the high-energy physics community, which still uses them for analysing and simulating data related to the Large Hadron Collider (LHC) through a global network, or grid, of sites.
It deserves more careful investigation, but it is my impression that the life-science community in particular led in taking another route, jumping on the train of SOA, web services, and cloud computing when it started to move with some speed somewhere after 2005. The inherent needs of the specific research and the smart utilisation of the available technologies have resulted in a very well organised community when it comes to gathering common open core data resources; sharing results that are findable, accessible, interoperable, and reusable; and much of everything else that we are seeking with EOSC. In this respect, a life-science, or certainly a bioinformatics, EOSC implementation is already in place and is also heavily used by the pharmaceutical industry, which can access many of the core data resources on the same terms and conditions as publicly funded research.
What is the point of looking back at advances more than a decade old? Simply to show that where we are now is not a new place. Yes, we are experiencing another turn in how research is done, and we are now looking for a societal response to that change. How can society at large support, but also take optimal benefit of, where research is right now? EOSC is, if we succeed, part of the answer, but we need to make sure that it stays creative and innovative in enabling opportunities for research and society at large to thrive together.
High-energy physics and the life sciences are mentioned here as areas driving the development of concepts for enabling research, but examples can be found elsewhere. More important is which research areas and research needs will show the direction for the next leap: maybe the humanities or the environmental sciences? I would place my bet somewhere there, but the important message is that inspiration and innovation will, and must, come from research needs. Only then can growth-minded policy makers, in their best moments, accelerate the development by introducing new frameworks and steering of funding that will ultimately benefit research and society as a whole.

AUTHOR BIOGRAPHY Per Öster is Director at CSC – IT Center for Science Ltd, where he leads the Business Insights and Growth area with the aim of creating long-term value through the discovery, analysis, and facilitation of growth opportunities. An important part of this activity is engagement in European and other international and national initiatives advancing ICT and policies for research; examples are the European Open Science Cloud (EOSC), EUDAT, EuroHPC, and the Research Data Alliance (RDA). Öster represents Finland and CSC on the board of the bioinformatics research infrastructure ELIXIR, and he is the Chair of the Board of Directors of EUDAT Ltd. He further represents CSC in Knowledge Exchange, a partnership to promote open scholarship and improve services for higher education and research in Europe. Öster is one of the founders of the European Grid Initiative (EGI) and was its first Council chair. He has more than 20 years of experience in computational science in both academia and industry. He has a background in theoretical atomic physics and received, in 1990, a doctorate in physics from the Department of Physics, University of Gothenburg/Chalmers University of Technology. ORCID: 0000-0001-5836-8850