Main outcomes

International peer review conference, 29-30 June 2017

Almost all research councils worldwide are confronted with an increase in the number of applications and a decrease in success rates, which undermines their effectiveness and credibility. On 29–30 June 2017 about 95 representatives from 25 countries participated in an international conference, hosted by the Dutch research council NWO (Netherlands Organisation for Scientific Research). Our collective aim was to discuss challenges related to the peer-review process and work towards improving it by sharing our experiences and ideas.

Topics discussed

1 | Effectiveness of current assessment procedures: high numbers of applications
2 | Efficiency of current assessment procedures: the workload of applicants and reviewers and the duration of the peer-review process
3 | Evaluating and adjusting the peer-review process
4 | Sandpit and alternative review methods

For more detailed information on the discussions and outcomes, see the appendix.

Peer review still necessary

These are some of the conclusions reached at the conference:

  • Most participants think that peer review is still necessary. Review procedures are very similar across countries, indicating a strong consensus that the process works well for selecting on academic quality.
  • In general, participants of the conference, like academics at large, have faith in the peer-review system. They have no faith in any kind of lottery, for instance, or in procedures with large error bars; these feel ‘unfair’. Lotteries are used only on a few occasions to simplify selection, for example when applications differ so much that they are difficult to compare. Panels are then more likely to be biased, and a lottery may be the better option. Drawing lots in the ‘grey zone’ would save the reviewers time, but not the applicants. An impediment to using a lottery as part of the peer-review process is that it has to be accepted by the academic community; judging by the outcomes of the Dutch national conference, this could be a challenge. Furthermore, in certain countries it is legally forbidden to decide this kind of selection by lottery.
  • A multi-step procedure is seen as a guarantee of quality, by academics and policy officers alike.
  • Researchers do not consider writing a proposal a waste of time; it helps them to organise their ideas and to receive valuable feedback. The selection process therefore has merit in itself.

Challenges to the peer-review process

  • There are many criteria for success, and we do not have sufficient data on how to measure it. What exactly is ‘impact’, for instance? We constantly have to ask ourselves whether we are funding the best proposals or the most innovative projects.
  • The peer-review system is thought to be more challenging for certain types of research, such as very innovative, high-risk or multidisciplinary research. In these cases, novelty should certainly be one of the criteria.
  • The duration of peer-review processes seems reasonable in general and does not affect academics too much, with the exception of young researchers in a delicate phase of their career; a pre-award system could help them. Moreover, it seems impossible to shorten the process significantly without compromising on quality.
  • For a trustworthy system it is important to allow applicants to respond to peer-review reports, either in interviews or in the form of a written rebuttal. Interviews are particularly important for young researchers, because discussing their motivation helps them to confirm their ownership of the proposal.
  • To prevent certain groups – mainly white middle-aged males – from dominating the process, a diversity of reviewers is crucial. Under-represented groups carry an extra burden in the selection process. Gender was mostly discussed in this context; in some disciplines women are severely under-represented and the few women that do work in such fields are already overwhelmed with committee work.

Make reviewing more efficient and gratifying

The conference concluded that reviewers should be given more attention. As volunteers, they deserve to have
their work made as efficient and gratifying as possible.

  • It would help to make evaluation forms more user-friendly. Reviewers are often uncertain about what is expected of them. Keep the forms as brief as possible and have policy officers available for questions.
  • The questions posed to reviewers by different research councils could, and perhaps should, be standardised.
  • Reviewers are often untrained, and some would like a short course on how to deliver a fair and well-written report.
  • Reviewers like to receive feedback, whether or not the proposals they reviewed were funded. They also appreciate a ‘thank you’ note from the president of the research council.
  • Transparent communication with applicants, referees and panel members alike creates trust in the review process.

Share

Many participants expressed the desire to share knowledge and information about the peer-review process.
They applauded the idea of an international online platform for sharing data.

It helps to look at your own data and methodology, and to learn from each other.

  • There is a need for more research on methodologies. Most councils only evaluate research projects, rather than their own peer-review processes. Is there any kind of unconscious bias? Were projects funded that turned out not to be successful?
  • More collaboration and sharing is needed between funding agencies. It would be helpful to share good practices and look at cultural differences. Is it possible to redistribute the burden of peer-reviewing among countries? Meanwhile, research councils should complement rather than compete with each other.
  • Standardisation of procedures is recommended, for instance regarding the payment of peer-reviewers.
  • An important role for research councils could be to identify similar ideas held by different researchers, connect them and encourage them to collaborate.

Potential improvements

Lastly, the participants discussed potential ways of improving the system. These are the most prominent
findings:

  • Many ideas involve pre-selection or application restrictions, and many examples already in practice were given, such as a maximum number of applications allowed per university (imposed in the UK and Ireland, for example). Another possibility is a quarantine, which prohibits reapplication for those who ended up in the bottom xx% in a previous round. Alternatively, in a two-step process, proposals that receive low scores from external reviewers are not sent on to the committee or panel. Pre-selection can also be based on the CV or an outline. All these solutions help to decrease the workload for both reviewers and applicants.
  • Research councils are advised to engage in active dialogue with universities and research institutes. These carry a responsibility for the selection of proposals that are submitted from their ranks and should act accordingly.
  • Someone suggested using artificial intelligence to determine ranking.
  • Someone else suggested having an interview instead of a pre-proposal, or even a pre-proposal in the form of a video pitch.
  • Another suggestion was to give members of the public a part in the selection process, such as patients, target groups or future users of the knowledge generated by the research project.
  • Besides adjusting traditional methods, many alternative methods were discussed as well. We should experiment more, it was said, for instance with sandpits. However, introducing alternatives should never come at the expense of trust and confidence. Drastic changes to the process require clear communication and timely announcements.

Building trust

Ultimately, the treasure to be cherished most is trust. A dialogue with applicants, reviewers, universities and research institutes is therefore essential: as long as there is trust, applicants will not feel as if they are being subjected to some obscure process, which is a frequently heard criticism. Hopefully, developing an international platform and maintaining a positive exchange between research councils will contribute to building trust. The International Peer-Review Conference in Amsterdam could be the beginning of a fruitful tradition.


Contact

Mr O.R. (Olivier) Morot, t: +31 (0)6 53175 377, o.morot@nwo.nl