If you have followed the discussions about science policy in recent years, you might think that criticism of rewarding researchers based on publications and citations is broadly supported. The oft-heard complaint of 'publish or perish' has led to influential initiatives such as Science in Transition, Open Science and the San Francisco Declaration on Research Assessment. At NWO, all references to the h-index are currently being removed from grant applications and criteria. It is therefore interesting that just 34 percent of researchers at Dutch knowledge institutions seem to want to move away from the system that uses bibliometric data to assess researchers. One third are positive about the current system, and a further third are neutral about it. Among men and researchers in the natural sciences, seven out of ten are positive or neutral about the current system. It is mainly women (41 percent) and researchers in the social sciences and humanities (46 percent) who are critical of the current evaluation and reward system.
Allocating 100 percent
We also asked the researchers how they wanted to be assessed. They could allocate 100 percent across the following aspects: scientific quality, societal impact, education, leadership and talent development, and other. At present, scientific quality is pivotal for being awarded a grant. Contributions to education as well as leadership and talent development are mainly considered at universities and sometimes for indirect government funding (for example, leadership is a condition for the Vici grant). The societal impact of knowledge usually plays only a small role in the awarding of grants, with the exceptions of the NWO Domain Applied and Engineering Sciences and the Dutch National Research Agenda.
Contribution to education counts
What does the average researcher think? According to the survey, the quality of the research should carry a weighting of 50 percent in the assessment of researchers. For the evaluation by the employer, this should be 47 percent, and for the evaluation by NWO or other granting bodies the figure is higher: 60 percent. That seems logical as, after all, it concerns an application for research funding. For the assessment by the employer, the respondents consider education (22 percent) and leadership (14 percent) to be more important than in an evaluation for a grant application (8 and 10 percent respectively). Nevertheless, it is interesting that when researchers – from all fields – apply for a research grant, they also want to be assessed on their contribution to education (16 percent) and leadership (12 percent).
Relevance for society
For a grant application, societal impact – described in the questionnaire as the solving of a societal issue, the utilisation and/or sharing of knowledge, valorisation, science communication and opinion formation – should count for 19 percent, according to the respondents. Researchers in the social and behavioural sciences and the applied and engineering sciences put the most emphasis on importance for society (21 and 26 percent respectively). The demand for research with societal impact has steadily increased since the 1990s (for example, the Societal Challenges from Brussels, the Top Sectors policy, and the Dutch National Research Agenda), and this demand appears to enjoy researchers' approval. It is not just the government or society asking academics for useful knowledge; researchers themselves find it so important that they want it to carry a weighting of around 20 percent in their assessment, and presumably they would like to be rewarded for it as well.
Team science gaining ground
In the category "other criteria" (3 percent), resistance to the Matthew effect is striking (the phenomenon in which a lot of funding is awarded to a limited number of top researchers). Respondents wrote: ‘Place a limit on the number of NWO-funded positions within a group to correct the current imbalance’; ‘Consider grants already awarded and give newcomers more chance’ and even: ‘Every assessment system should avoid the rich get richer phenomena’.
Furthermore, there was clearly a growing demand for acknowledging work done in teams. Science is increasingly teamwork, and the current system is not considered to pay sufficient attention to that. We therefore also submitted the following proposition to respondents: ‘Besides individual performances, collaboration and performances of the team must count as well.’ For the assessment by the employer, 78 percent of respondents agree with this. Two-thirds believe that collaboration should also play a role in the evaluation of a grant application.
Valid preferred to objective
As NWO and ZonMw are currently exploring types of assessment that are not based on publications, we asked researchers what they consider important for the development of new criteria. Should the criteria mainly measure what you want to measure (be valid), should they be objective and mutually comparable, or are other aspects important? To force respondents to set priorities, they were allowed to give a maximum of two answers. Of the 748 respondents, 441 believe that the assessment must above all be valid. That is considerably more than objective (292) or mutually comparable (254). Here, an interesting correlation was found: respondents who are positive about the current system consider objectivity to be more important (50 percent) than respondents who are negative about it (26 percent). This matches the evaluations of, and experiences with, the current system: it does not measure exactly what you want to measure, but it is highly objective.
Distinction between disciplines
One-sixth of the respondents stated that other aspects were important too. In the explanations, a call for tailored approaches was the most frequent response. The assessment must be "contextual": dependent on the field, the discipline and the researcher’s profile. ‘Match the assessment to the specific route and performances of the applicant. It is important to realise a diversity of selected applicants, which reflects the pluralism of academic work and society as a whole’. These spontaneous answers closely match the proposition in the last question: ‘I believe that the assessment criteria may differ per field’. No less than 86 percent agree with that. A clear signal.
Which assessment criteria deserve more attention? Four perspectives
Christine Espin, Professor of Learning Problems and Specialised Interventions in Education, Leiden University
‘Sometimes students come to me and say: you have changed my way of thinking or from now on I'll tackle it differently. That is fantastic because then I notice I have an influence. Such a response reflects the translation of my research and my teaching experience, and hopefully, the students will take that with them later when they work with pupils or teachers.’
‘I believe that education and teachers need to improve continuously. Two colleagues and I recently started a study into how education can be improved for students with learning problems, such as dyslexia, autism and ADHD. We still know too little about students who learn in a different way. That is a challenge for the university. I would like more attention to be paid to this.’
For researchers, education is important because it is a way of translating knowledge into practice
‘I think that education is very important for researchers. It is a way of translating knowledge into practice. Over the past 15 to 20 years, I have noticed growing attention to educational quality. I think that universities are increasingly showing that both research and education are important. For example, there is talk of promoting people from assistant to associate professorships based on their educational achievements. Of course, at a university research will continue to be important. Nevertheless, the balance between the two could differ per person.’
Monica Wagner, PhD student, Donders Institute for Brain, Cognition and Behaviour, Nijmegen
‘Family and friends often ask what I do and what the point of it is. Some people don't really understand the profession of researcher. They think that all science is applied research to solve problems. I always try to explain that people are curious and that research can be fascinating. Take a photo of a black hole, for example: everybody finds that fascinating even if they do not know exactly why it is useful.’
Fundamental research also has value without concrete objectives
‘I investigate why some people can speak a second language more fluently than others and without an accent. Some people sound like a native speaker, whereas others will never achieve that. Experience is important, but talent also plays a role. A question that then arises, for example, is whether people with a good pronunciation are mainly good imitators and why that is the case. It is a complex problem and to be honest, too big for four years. That’s the nature of the beast. You find answers and, in particular, more questions.’
‘When I submitted my PhD grant application I also had to state possible applications. It is good to think about that, but fundamental research also has value without those concrete goals because applications of knowledge can also arise later. You cannot predict that.’
Bart Knols, Medical entomologist, Science, Management and Innovation, Radboud University
‘Besides my work at the university, I am a consultant and also a co-owner of a social enterprise in Uganda, with printed patterns on mosquito nets. If you print a logo from the football club Manchester United on a mosquito net, then thousands of Ugandan boys will beg their parents to buy one. You do not need any hard-core science for that, but it does provide unprecedented opportunities in malaria control.’
‘It is fantastic to meet young people in Nijmegen, Tanzania and Uganda who are enthusiastic about using the knowledge they have to make a difference in the world. That motivation arose from my frustration about the fact that thousands of malaria papers are published each year, but in Africa, they still use DDT to control mosquitoes. I no longer wanted to stand on the sidelines. I could not stomach the idea of a career of many publications and honorary prizes that failed to make a difference in practice. I want to come up with solutions.’
I could not stomach the idea of a career of many publications and honorary prizes that failed to make a difference in practice. I want to come up with solutions.
‘Within science, we still look far too much at publications and citations. I would much rather see a relevance or impact index instead of a citation index. Though I think you should not judge researchers at all on the basis of indexes or milestones. I used to think about industry being full of get rich quick types, but now I often prefer business-oriented thinking to academic thinking. In the business world, everybody is on the same page, whereas at a university, little kingdoms compete with each other. I go back and forth between those two worlds, and I don’t think that’s a bad thing.’
Marcel van den Hout, Professor of Clinical Psychology, Utrecht University
‘Supervising PhDs is one of the most pleasant aspects of my work. You work with talented, motivated people who dedicate several years of their lives to science. They enter the field with a lot to learn, but if all goes well, they no longer need you by the time they finish. They develop into colleagues. That is highly satisfying and worth celebrating.’
‘As far as I know, research management and supervising PhD students are not things you gain a lot of credit for. Yet whichever criteria you choose – publications, PhDs, education – when you reward achievements, you do not want people to start behaving accordingly. In the past, things got a little out of hand. Too much depended on publications when recruiting personnel and during research assessments. Counting papers is straightforward, but it has a range of side effects if you build careers solely on that basis. I think that in recent years, a mixture of other achievements and indices is also being examined to obtain a more nuanced picture.’
When you reward achievements, you do not want people to start behaving accordingly
‘I would like to see a premium on quality for a new generation of researchers. Let researchers write as much as they want, but make clear in advance that decisions about an appointment or promotion will be taken based on the best publication per two years. That is the opposite of the tactic young researchers are currently seduced into adopting.’
In April 2019, the agency Markteffect carried out an online survey on behalf of NWO. It approached researchers who had submitted an application to NWO during the first three months of 2019, for example for Veni and Vici grants, the open competition of NWO Science and NWO Applied and Engineering Sciences, and PhDs in the Humanities. On 5 April, 1,940 researchers received an email inviting them to participate. By 16 April, 748 of them had completed the online survey anonymously. The response rate was therefore 39 percent, which makes the answers generalisable to the target group with 95 percent confidence and a margin of error of 2.8 percent.
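The stated figures can be checked directly. Assuming the standard 95 percent margin-of-error formula with a finite population correction and the worst-case proportion p = 0.5 (the article does not say how Markteffect computed its accuracy, so this is an assumption), the numbers reproduce the reported 39 percent response rate and 2.8 percent margin:

```python
import math

# Survey figures from the article
N = 1940   # researchers invited
n = 748    # completed responses

# Assumed calculation: 95% margin of error with finite population
# correction, worst-case proportion p = 0.5
z = 1.96   # z-score for 95% confidence
p = 0.5

response_rate = n / N                        # fraction who responded
se = math.sqrt(p * (1 - p) / n)              # standard error of a proportion
fpc = math.sqrt((N - n) / (N - 1))           # finite population correction
margin = z * se * fpc                        # margin of error

print(f"response rate: {response_rate:.0%}")   # 39%
print(f"margin of error: {margin:.1%}")        # 2.8%
```

Under these assumptions the computed margin of error comes out at 2.8 percent, consistent with the accuracy reported above.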