Reflections on the Impact of Science Conference 2019
How is impact perceived and evaluated in national and regional science systems today? Which stakeholders are involved, and what examples exist at national and international levels? In this post, Grischa Fraumann reports on the Impact of Science conference, held from 5 to 7 June 2019 in Berlin.
The impact of science is now discussed in many formats and venues, and one example that stands out is the Impact of Science conference, organised by the Network for Advancing and Evaluating the Societal Impact of Science (AESIS Network). The importance of the topic is evident from the fact that the conference had already been held six times before coming to Berlin.
The conference offered plenary sessions and several smaller parallel sessions, such as those on international collaboration (chaired by Beverley Damonse) and social media (chaired by Tamika Heiden, Knowledge Translation Australia, and Ger Hanley, Write Fund, Ireland). More than 250 participants from over thirty countries came from a wide range of institutions: researchers in several academic disciplines from research institutes and universities, librarians, publishers, and representatives of public foundations, research funders, industry, and governmental institutions.
What I liked about the Impact of Science conference was that examples from Europe were accompanied by many non-European ones, such as initiatives in Australia, Canada, Egypt, Japan, Kenya, South Africa, Uganda, and the US. For example, presentations covered case studies from the South African (Beverley Damonse, South Africa's National Research Foundation, NRF), American (Toby Smith, Association of American Universities, AAU), and Australian (Sarah Howard, Australian Research Council, ARC) science systems.
Some impactful presentations
Sarah Foxen (UK Parliament) discussed knowledge exchange within Parliament, a project that brings together researchers and policy makers. A related initiative was presented by Susanne Baltes (Citizen-Centred Government, Federal Chancellery of Germany), who leads studies on evidence-based policy making together with citizens in Germany, so that policies are better aligned with citizens' needs. Volker Meyer-Guckel (Stifterverband) presented an approach to Strategic Openness. Research impact tools, such as Researchfish, were discussed in several sessions. Vera Hazelwood (Researchfish) noted that Researchfish began as an initiative of the UK's Medical Research Council and is now used by research funders around the world to track the impact of funded research. Ongoing projects funded by Horizon 2020, the EU Framework Programme for Research and Innovation, were also presented, such as Data4Impact by Vilius Stančiauskas (Public Policy and Management Institute, PPMI). The consortium plans to use big data to assess the societal impact of EU and national research projects relating to health, demographic change and wellbeing.
The conference also included funders' perspectives, such as that of Wolfgang Rohe (Stiftung Mercator), who examined the impact of funded projects through case studies on successful interactions with policy makers, among others. This formed part of a broader session reflecting on Key Performance Indicators (KPIs), chaired by Paul Wouters (Leiden University). Research funders, such as the German Research Foundation (DFG), also took part in other sessions. Roland A. Fischer (DFG), for example, described the DFG's position on research funding: a focus on excellence, without impact as an assessment criterion. Discipline-specific examples formed part of a presentation by James Wilsdon (University of Sheffield) on the social sciences and humanities. In the same panel, Richard van de Sanden (Eindhoven University of Technology) presented an advisory report by the Royal Netherlands Academy of Arts and Sciences (KNAW) on how impact is assessed within the Dutch science system.
Some challenging topics were also discussed, such as blockchain for science, technology and innovation funding, presented by Luc Soete (Maastricht University). Additionally, the role of open access in generating impact came up several times during the conference and was addressed in a separate session on ‘Open Science and Governance’, chaired by Benedikt Fecher (Alexander von Humboldt Institute for Internet and Society, HIIG) and Hans de Jonge (Dutch Research Council, NWO). Some sessions were held in a workshop format, with recommendations formulated for the final panel; the audience was then asked to rank these recommendations in a wrap-up session. The top-ranked recommendation came from the ‘Understanding Impact’ session by Dietmar Harhoff (Max Planck Institute for Innovation and Competition), Wiljan van den Akker (Utrecht University), Isabel Roessler (Centre for Higher Education, CHE) and Toby Smith: ‘Speak the language of the group that you want to reach’.
Reflecting on the conference
What can we learn from such a conference? I first heard about AESIS while conducting a research internship at the Centre for Science and Technology Studies (CWTS), Leiden University, in 2015, and it is interesting to see how the network has developed over time. To sum up, the Impact of Science conference was an educational event where delegates could gain an overview of current opinions on the concept of impact in the German science system and of best practices at an international level, as well as network with peers. It remains to be seen how the impact of science will be evaluated in Germany and internationally in future, but the topic is definitely on the agenda.
The final discussion was chaired by Luc Soete, with panellists Matthias Graf von Kielmansegg (German Federal Ministry of Education and Research, BMBF), Dietmar Harhoff, Sarah Howard and Paul Wouters. For the German science system, it was suggested that Germany should find its own way to experiment with an approach that reflects the characteristics of its national and regional system(s). At the same time, there are many critical opinions on this kind of new assessment criterion, and differences between types of higher education institutions and academic disciplines need to be taken into account. How researchers engage with wider society is an area that should be investigated, for example in case studies on open science; this is an approach that could gain momentum.
During the panel discussions, questions were also raised about the limits of impact assessments, for example by David Kaldewey (University of Bonn). In that vein, Gabi Lombardo (European Alliance for Social Sciences and Humanities, EASSH) noted that the influence also runs the other way: researchers are themselves impacted, for example, by policies. This might also be related to academic research that generates negative impact, a concept known as ‘Grimpact’ that was introduced at the 23rd International Conference on Science and Technology Indicators (STI) 2018 in Leiden. The statement ‘science is an endless frontier’, referenced during the conference by Paul Wouters, goes back to Vannevar Bush’s 1945 report Science, The Endless Frontier, which led to the creation of the US National Science Foundation (NSF). I believe this principle of science is still valid today, and hopefully it will remain so in the future. Reflecting on the conference, I would recommend that everybody with an interest in the impact of science attend next time if possible.
Acknowledgement: I would like to thank Humboldt-Universität zu Berlin and the VolkswagenStiftung for providing me with a fellowship to attend the Impact of Science conference. For another blog article about the conference, see Ger Hanley's report as part of EARMA (European Association of Research Managers and Administrators).