1
Düzgüneş N. 'Science by consensus' impedes scientific creativity and progress: A simple alternative to funding biomedical research. F1000Res 2024; 11:961. PMID: 38798304; PMCID: PMC11126901; DOI: 10.12688/f1000research.124082.3.
Abstract
The very low success rates of grant applications to the National Institutes of Health (NIH) and the National Science Foundation (NSF) are highly detrimental to the progress of science and the careers of scientists. The peer review process that evaluates proposals has been claimed arbitrarily to be the best there is. This consensus system, however, has never been evaluated scientifically against an alternative. Here we delineate the 15 major problems with the peer review process. We challenge the Science Advisor to the President, and the leadership of NIH, NSF, the U.S. National Academy of Sciences and other funding agencies throughout the world to refute each of these criticisms. We call for the implementation of more equitable alternatives that will not constrain the progress of science. We propose a system that will fund at least 80,000 principal investigators, including young scientists, with about half the current NIH budget, seven times as many as the current number of NIH "research project grants," and that will forego the cumbersome, expensive, and counterproductive "peer" review stage. Further, we propose that the success of the two systems over 5-10 years be compared scientifically.
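Editor's note: the proposal's arithmetic can be sanity-checked with a rough calculation. The NIH budget figure below is an outside assumption for illustration, not a number from the abstract.

```python
# Back-of-envelope check of the proposed scheme (80,000 PIs on half the NIH budget).
# Assumed figure (NOT from the abstract): NIH budget of roughly $47 billion/year.
nih_budget = 47_000_000_000
funded_pis = 80_000

per_pi = (nih_budget / 2) / funded_pis  # implied average award per PI per year
rpg_equivalent = funded_pis / 7         # implied current number of NIH research project grants

print(round(per_pi))          # average award under the assumed budget
print(round(rpg_equivalent))  # "seven times as many" worked backwards
```

Under that assumed budget the scheme implies awards of roughly $294,000 per PI per year, and "seven times as many" implies about 11,400 current research project grants.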
Affiliation(s)
- Nejat Düzgüneş
- Department of Biomedical Sciences, University of the Pacific - San Francisco Campus, San Francisco, CA, 94103, USA
2
Dresler M. FENS-Kavli Network of Excellence: Postponed, non-competitive peer review for research funding. Eur J Neurosci 2023; 58:4441-4448. PMID: 36085597; DOI: 10.1111/ejn.15818.
Abstract
Receiving research grants is among the highlights of an academic career, affirming previous accomplishments and enabling new research endeavours. Much of the process of acquiring research funding, however, belongs to the less favourite duties of many researchers: It is time consuming, often stressful and, in the majority of cases, unsuccessful. This resentment towards funding acquisition is backed up by empirical research: The current system to distribute research funding, via competitive calls for extensive research applications that undergo peer review, has repeatedly been shown to fail in its task to reliably rank proposals according to their merit, while at the same time being highly inefficient. The simplest, fairest and broadly supported alternative would be to distribute funding more equally across researchers, for example, by an increase of universities' base funding, thereby saving considerable time that can be spent on research instead. Here, I propose how to combine such a 'funding flat rate' model-or other efficient distribution strategies-with quality control through postponed, non-competitive peer review using open science practices.
Affiliation(s)
- Martin Dresler
- Donders Institute for Brain, Cognition and Behaviour, Nijmegen, The Netherlands
- Radboud University Medical Center, Nijmegen, The Netherlands
3
Kneale D, Kjaersgaard A, de Melo M, Joaquim Picardo J, Griffin S, French RS, Burchett HED. Can cash transfer interventions increase contraceptive use and reduce adolescent birth and pregnancy in low and middle income countries? A systematic review and meta-analysis. PLOS Glob Public Health 2023; 3:e0001631. PMID: 37943721; PMCID: PMC10635429; DOI: 10.1371/journal.pgph.0001631.
Abstract
Becoming pregnant and giving birth under the age of 20 is associated with a range of adverse social, socioeconomic and health outcomes for adolescent girls and their children in low- and middle-income countries. Cash transfers are an example of a structural intervention that can change the local social and economic environment, and have been linked with positive health and social outcomes across several domains. As part of a wider review of structural adolescent contraception interventions, we conducted a systematic review on the impact of cash transfers on adolescent contraception and fertility. Fifteen studies were included in the review, with eleven studies providing evidence for meta-analyses on contraception use, pregnancy and childbearing. The evidence suggests that cash transfer interventions are generally ineffective in raising levels of contraceptive use. However, cash transfer interventions did reduce levels of early pregnancy (OR 0.90, 95% CI 0.81 to 1.00). There was suggestive evidence that conditional, but not unconditional, cash transfers reduce levels of early childbearing. Given that much of the evidence is drawn from interventions providing cash transfers conditional on school attendance, supporting school attendance may enable adolescent girls and young women to make life choices that do not involve early pregnancy.
Affiliation(s)
- Dylan Kneale
- EPPI-Centre, UCL Social Research Institute, University College London, London, United Kingdom
- Abel Kjaersgaard
- EPPI-Centre, UCL Social Research Institute, University College London, London, United Kingdom
- Malica de Melo
- International Centre for Reproductive Health Mozambique (ICRH-M), Maputo, Mozambique
- Sally Griffin
- International Centre for Reproductive Health Mozambique (ICRH-M), Maputo, Mozambique
- Rebecca S. French
- Department of Public Health, Environments and Society, Faculty of Public Health & Policy, London School of Hygiene & Tropical Medicine, London, United Kingdom
- Helen E. D. Burchett
- Department of Public Health, Environments and Society, Faculty of Public Health & Policy, London School of Hygiene & Tropical Medicine, London, United Kingdom
4
Qussini S, MacDonald RS, Shahbal S, Dierickx K. Blinding Models for Scientific Peer-Review of Biomedical Research Proposals: A Systematic Review. J Empir Res Hum Res Ethics 2023; 18:250-262. PMID: 37526052; DOI: 10.1177/15562646231191424.
Abstract
Objective: The aim of this systematic review is to estimate: (i) the overall effect of blinding models on bias; (ii) the effect of each blinding model; and (iii) the effect of un-blinding on reviewers' accountability in biomedical research proposals. Methods: Systematic review of prospective or retrospective comparative studies that evaluated two or more peer review blinding models for biomedical research proposals/funding applications and reported outcomes related to peer review efficiency. Results: Three studies that met the inclusion criteria were included in this review and assessed using the QualSyst tool by two authors. Conclusion: Our systematic review is the first to assess peer review blinding models in the context of funding. While only three studies were included, this highlights the dire need for further RCTs that generate validated evidence. We also discuss multiple aspects of peer review, such as peer review of manuscripts vs proposals and peer review in other fields.
Affiliation(s)
- Seba Qussini
- Medical Research Center, Hamad Medical Corporation, Doha, Qatar
- Ross S MacDonald
- Distributed eLibrary, Weill Cornell Medicine - Qatar, Education City, Doha, Qatar
- Saad Shahbal
- Department of Medicine, Hamad Medical Corporation, Doha, Qatar
- Kris Dierickx
- Centre for Biomedical Ethics and Law, Faculty of Medicine, KU Leuven, Leuven, Belgium
5
Recio-Saucedo A, Crane K, Meadmore K, Fackrell K, Church H, Fraser S, Blatch-Jones A. What works for peer review and decision-making in research funding: a realist synthesis. Res Integr Peer Rev 2022; 7:2. PMID: 35246264; PMCID: PMC8894828; DOI: 10.1186/s41073-022-00120-2.
Abstract
Introduction Allocation of research funds relies on peer review to support funding decisions, and these processes can be susceptible to biases and inefficiencies. The aim of this work was to determine which past interventions to peer review and decision-making have worked to improve research funding practices, how they worked, and for whom. Methods Realist synthesis of peer-review publications and grey literature reporting interventions in peer review for research funding. Results We analysed 96 publications and 36 website sources. Sixty publications enabled us to extract stakeholder-specific context-mechanism-outcomes configurations (CMOCs) for 50 interventions, which formed the basis of our synthesis. Shorter applications, reviewer and applicant training, virtual funding panels, enhanced decision models, institutional submission quotas, applicant training in peer review and grant-writing reduced interrater variability, increased relevance of funded research, reduced time taken to write and review applications, promoted increased investment into innovation, and lowered cost of panels. Conclusions Reports of 50 interventions in different areas of peer review provide useful guidance on ways of solving common issues with the peer review process. Evidence of the broader impact of these interventions on the research ecosystem is still needed, and future research should aim to identify processes that consistently work to improve peer review across funders and research contexts. Supplementary Information The online version contains supplementary material available at 10.1186/s41073-022-00120-2.
Affiliation(s)
- Alejandra Recio-Saucedo
- Wessex Institute, National Institute for Health Research Evaluation, Trials and Studies Coordinating Centre, University of Southampton, Alpha House, Enterprise Road, Southampton, SO16 7NS, UK
- Ksenia Crane
- Wessex Institute, National Institute for Health Research Evaluation, Trials and Studies Coordinating Centre, University of Southampton, Alpha House, Enterprise Road, Southampton, SO16 7NS, UK
- Katie Meadmore
- Wessex Institute, National Institute for Health Research Evaluation, Trials and Studies Coordinating Centre, University of Southampton, Alpha House, Enterprise Road, Southampton, SO16 7NS, UK
- Kathryn Fackrell
- Wessex Institute, National Institute for Health Research Evaluation, Trials and Studies Coordinating Centre, University of Southampton, Alpha House, Enterprise Road, Southampton, SO16 7NS, UK
- Hazel Church
- Wessex Institute, National Institute for Health Research Evaluation, Trials and Studies Coordinating Centre, University of Southampton, Alpha House, Enterprise Road, Southampton, SO16 7NS, UK
- Simon Fraser
- Wessex Institute, National Institute for Health Research Evaluation, Trials and Studies Coordinating Centre, University of Southampton, Alpha House, Enterprise Road, Southampton, SO16 7NS, UK
- School of Primary Care, Population Sciences and Medical Education, Faculty of Medicine, University of Southampton, Southampton, SO17 1BJ, UK
- Amanda Blatch-Jones
- Wessex Institute, National Institute for Health Research Evaluation, Trials and Studies Coordinating Centre, University of Southampton, Alpha House, Enterprise Road, Southampton, SO16 7NS, UK
6
Aczel B, Szaszi B, Holcombe AO. A billion-dollar donation: estimating the cost of researchers' time spent on peer review. Res Integr Peer Rev 2021; 6:14. PMID: 34776003; PMCID: PMC8591820; DOI: 10.1186/s41073-021-00118-2.
Abstract
BACKGROUND The amount and value of researchers' peer review work is critical for academia and journal publishing. However, this labor is under-recognized, its magnitude is unknown, and alternative ways of organizing peer review labor are rarely considered. METHODS Using publicly available data, we provide an estimate of researchers' time and the salary-based contribution to the journal peer review system. RESULTS We found that the total time reviewers globally worked on peer reviews was over 100 million hours in 2020, equivalent to over 15 thousand years. The estimated monetary value of the time US-based reviewers spent on reviews was over 1.5 billion USD in 2020. For China-based reviewers, the estimate is over 600 million USD, and for UK-based, close to 400 million USD. CONCLUSIONS By design, our results are very likely to be under-estimates as they reflect only a portion of the total number of journals worldwide. The numbers highlight the enormous amount of work and time that researchers provide to the publication system, and the importance of considering alternative ways of structuring, and paying for, peer review. We foster this process by discussing some alternative models that aim to boost the benefits of peer review, thus improving its cost-benefit ratio.
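Editor's note: the headline hours-to-years conversion is simple arithmetic. The sketch below uses illustrative input assumptions, not the paper's own derived figures (which come from publication and review-invitation data).

```python
# Illustrative inputs (assumptions for demonstration only, NOT the paper's figures).
reviews_per_year = 22_000_000  # assumed number of journal peer reviews worldwide in 2020
hours_per_review = 6           # assumed average time spent per review

total_hours = reviews_per_year * hours_per_review  # total reviewing time, in hours
years_equivalent = total_hours / (24 * 365)        # same time expressed in calendar years

print(total_hours, round(years_equivalent))
```

With inputs of that order, the total comfortably exceeds 100 million hours, i.e. more than 15 thousand calendar years, matching the scale reported in the abstract.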
Affiliation(s)
- Balazs Aczel
- Present address: Institute of Psychology, ELTE, Eotvos Lorand University, Izabella u. 46, Budapest, 1064, Hungary.
- Barnabas Szaszi
- Present address: Institute of Psychology, ELTE, Eotvos Lorand University, Izabella u. 46, Budapest, 1064, Hungary
- Alex O Holcombe
- School of Psychology, University of Sydney, Sydney, Australia
7
De Peuter S, Conix S. The modified lottery: Formalizing the intrinsic randomness of research funding. Account Res 2021; 29:324-345. PMID: 33970719; DOI: 10.1080/08989621.2021.1927727.
Abstract
Competition for research funds has, in the recent decade, become hypercompetitive. Commonly, to determine which proposals receive funding, a system of peer review is used, which is broadly accepted, easily understood, and broadly trusted among researchers. It is often considered the best system in use, but it suffers from important shortcomings, and adaptations to overcome these shortcomings have small and often short-lived effects. Hence, the preference for peer review does not mean it necessarily outperforms all other systems. In fact, it is time for an open discussion about alternative allocation mechanisms. Random allocation of research funding may be a viable alternative to the current peer review system. In particular the "organized randomness" of a modified lottery is interesting, combining the benefits of randomization with some of the most valuable aspects of peer review. Still, many questions remain and this is certainly not a plea to allocate all research funds using lotteries without further research. But we need to be prepared to consider alternatives, even though they are not perfect, and modified lotteries should be part of the solution.
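Editor's note: the "organized randomness" of a modified lottery can be sketched concretely. The function, cutoffs, and scores below are hypothetical illustrations, not the authors' specification: peer review still sets the thresholds, and chance decides only among proposals it cannot reliably rank.

```python
import random

def modified_lottery(proposals, budget, direct_cutoff, lottery_cutoff, seed=0):
    """Allocate `budget` grants among (id, score) pairs.

    Proposals scoring >= direct_cutoff are funded outright; those in
    [lottery_cutoff, direct_cutoff) enter a random draw for the remaining
    slots; proposals below lottery_cutoff are declined.
    """
    rng = random.Random(seed)  # seeded for a reproducible, auditable draw
    direct = [p for p, s in proposals if s >= direct_cutoff]
    pool = [p for p, s in proposals if lottery_cutoff <= s < direct_cutoff]
    remaining = max(budget - len(direct), 0)
    drawn = rng.sample(pool, min(remaining, len(pool)))
    return direct[:budget] + drawn

# Hypothetical round: six proposals with peer review scores, budget for three.
proposals = [("A", 9.1), ("B", 8.7), ("C", 7.9), ("D", 7.5), ("E", 7.2), ("F", 5.0)]
funded = modified_lottery(proposals, budget=3, direct_cutoff=8.5, lottery_cutoff=7.0)
```

Here A and B are funded on merit, one of C/D/E is drawn by lot, and F is declined, so randomness operates only within the band where review scores are too noisy to discriminate.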
Affiliation(s)
- Steven De Peuter
- Faculty of Psychology and Educational Sciences, KU Leuven, Leuven, Belgium
- S Conix
- Centre for Logic and Philosophy of Science, Institute of Philosophy, KU Leuven, Leuven, Belgium
8
Pina DG, Buljan I, Hren D, Marušić A. A retrospective analysis of the peer review of more than 75,000 Marie Curie proposals between 2007 and 2018. eLife 2021; 10:59338. PMID: 33439120; PMCID: PMC7806263; DOI: 10.7554/elife.59338.
Abstract
Most funding agencies rely on peer review to evaluate grant applications and proposals, but research into the use of this process by funding agencies has been limited. Here we explore whether two changes to the organization of peer review for proposals submitted to various funding actions by the European Union have an influence on the outcome of the peer review process. Based on an analysis of more than 75,000 applications to three actions of the Marie Curie programme over a period of 12 years, we find that the changes – a reduction in the number of evaluation criteria used by reviewers and a move from in-person to virtual meetings – had little impact on the outcome of the peer review process. Our results indicate that other factors, such as the type of grant or area of research, have a larger impact on the outcome.
Affiliation(s)
- David G Pina
- Research Executive Agency, European Commission, Brussels, Belgium
- Ivan Buljan
- Department for Research in Biomedicine and Health, University of Split School of Medicine, Split, Croatia
- Darko Hren
- Department of Psychology, University of Split School of Humanities and Social Sciences, Split, Croatia
- Ana Marušić
- Department for Research in Biomedicine and Health, University of Split School of Medicine, Split, Croatia
9
Meadmore K, Fackrell K, Recio-Saucedo A, Bull A, Fraser SDS, Blatch-Jones A. Decision-making approaches used by UK and international health funding organisations for allocating research funds: A survey of current practice. PLoS One 2020; 15:e0239757. PMID: 33151954; PMCID: PMC7644005; DOI: 10.1371/journal.pone.0239757.
Abstract
Innovations in decision-making practice for allocation of funds in health research are emerging; however, it is not clear to what extent these are used. This study aims to better understand current decision-making practices for the allocation of research funding from the perspective of UK and international health funders. An online survey (active March-April 2019) was distributed by email to UK and international health and health-related funding organisations (e.g., biomedical and social), and was publicised on social media. The survey collected information about decision-making approaches for research funding allocation, and covered assessment criteria, current and past practices, and considerations for improvements or future practice. A mixed methods analysis provided descriptive statistics (frequencies and percentages of responses) and an inductive thematic framework of key experiences. Thirty-one responses were analysed, representing government-funded organisations and charities in the health sector from the UK, Europe and Australia. Four themes were extracted and provided a narrative framework. 1. The most reported decision-making approaches were external peer review, triage, and face-to-face committee meetings; 2. Key values underpinned decision-making processes. These included transparency and gaining perspectives from reviewers with different expertise (e.g., scientific, patient and public); 3. Cross-cutting challenges of the decision-making processes faced by funders included bias, burden and external limitations; 4. Evidence of variations and innovations from the most reported decision-making approaches, including proportionate peer review, number of decision-points, virtual committee meetings and sandpits (interactive workshop). Broadly similar decision-making processes were used by all funders in this survey. 
Findings indicated a preference for funders to adapt current decision-making processes rather than using more innovative approaches; however, there is a need for more flexibility in decision-making and support for applicants. Funders indicated the need for information and empirical evidence on innovations which would help to inform decision-making in research fund allocation.
Affiliation(s)
- Katie Meadmore
- Wessex Institute, University of Southampton, Southampton, United Kingdom
- Kathryn Fackrell
- Wessex Institute, University of Southampton, Southampton, United Kingdom
- Abby Bull
- Wessex Institute, University of Southampton, Southampton, United Kingdom
- Simon D. S. Fraser
- Wessex Institute, University of Southampton, Southampton, United Kingdom
- School of Primary Care, Population Sciences and Medical Education, Faculty of Medicine, University of Southampton, Southampton, United Kingdom
10
Abstract
BACKGROUND Fifty years ago, the groundbreaking British sketch series Monty Python's Flying Circus premiered on BBC One and forever changed the world of comedy. The humour transcended mere absurdity by poking a subversive finger in the eye of buttoned-up British society. Here, we commemorate this cultural milestone and simultaneously call attention to an emerging concept in the health sciences, termed simplified peer review. The union of these disparate subjects motivates a formal gait analysis based on one of the troupe's most iconic sketches, "The Ministry of Silly Walks", a satire of bureaucratic inefficiency. RESEARCH QUESTION The sketch portrays peer review as exceedingly efficient, lasting all of 20 s. But was it fair? The answer depends on how one measures silliness. If silly walking can be defined as deviations from typical walking, then it can be quantified using video-based gait analysis. METHODS To assess the quality of peer review at the Ministry of Silly Walks, we measured knee flexion in the sagittal plane of motion and calculated the Gait Variable Score (GVS) for three gait cycles, those of the Minister (n = 2) and Mr. Pudey (n = 1), an applicant for a Research Fellowship. RESULTS For the Minister, we found large deviations from typical walking across two gait cycles (GVSknee(1) = 33.6, GVSknee(2) = 23.3), whereas the gait of Mr Pudey produced an intermediate score (GVSknee = 16.3). By this measure, Mr Pudey's walk is 3.3 times more variable than typical walking, whereas an exemplary silly walk is 6.7 and 4.7 times more variable, respectively, than typical walking. SIGNIFICANCE Our analysis corroborates the Minister's assessment: Mr Pudey is a promising applicant and deserving of a Research Fellowship to advance his silly walk. We suggest that the sketch holds special resonance and uncanny prescience for researchers in the health sciences today.
11
Frampton GK, Shepherd J, Pickett K, Griffiths G, Wyatt JC. Digital tools for the recruitment and retention of participants in randomised controlled trials: a systematic map. Trials 2020; 21:478. PMID: 32498690; PMCID: PMC7273688; DOI: 10.1186/s13063-020-04358-3.
Abstract
BACKGROUND Recruiting and retaining participants in randomised controlled trials (RCTs) is challenging. Digital tools, such as social media, data mining, email or text-messaging, could improve recruitment or retention, but an overview of this research area is lacking. We aimed to systematically map the characteristics of digital recruitment and retention tools for RCTs, and the features of the comparative studies that have evaluated the effectiveness of these tools during the past 10 years. METHODS We searched Medline, Embase, other databases, the Internet, and relevant web sites in July 2018 to identify comparative studies of digital tools for recruiting and/or retaining participants in health RCTs. Two reviewers independently screened references against protocol-specified eligibility criteria. Included studies were coded by one reviewer with 20% checked by a second reviewer, using pre-defined keywords to describe characteristics of the studies, populations and digital tools evaluated. RESULTS We identified 9163 potentially relevant references, of which 104 articles reporting 105 comparative studies were included in the systematic map. The number of published studies on digital tools has doubled in the past decade, but most studies evaluated digital tools for recruitment rather than retention. The key health areas investigated were health promotion, cancers, circulatory system diseases and mental health. Few studies focussed on minority or under-served populations, and most studies were observational. The most frequently-studied digital tools were social media, Internet sites, email and tv/radio for recruitment; and email and text-messaging for retention. One quarter of the studies measured efficiency (cost per recruited or retained participant) but few studies have evaluated people's attitudes towards the use of digital tools. 
CONCLUSIONS This systematic map highlights a number of evidence gaps and may help stakeholders to identify and prioritise further research needs. In particular, there is a need for rigorous research on the efficiency of the digital tools and their impact on RCT participants and investigators, perhaps as studies-within-a-trial (SWAT) research. There is also a need for research into how digital tools may improve participant retention in RCTs which is currently underrepresented relative to recruitment research. REGISTRATION Not registered; based on a pre-specified protocol, peer-reviewed by the project's Advisory Board.
Affiliation(s)
- Geoff K. Frampton
- Southampton Health Technology Assessments Centre (SHTAC), Wessex Institute, Faculty of Medicine, University of Southampton, Alpha House, Southampton Science Park, Southampton, SO16 7NS UK
- Wessex Institute, Faculty of Medicine, University of Southampton, Alpha House, Southampton Science Park, Southampton, SO16 7NS UK
- Jonathan Shepherd
- Southampton Health Technology Assessments Centre (SHTAC), Wessex Institute, Faculty of Medicine, University of Southampton, Alpha House, Southampton Science Park, Southampton, SO16 7NS UK
- Wessex Institute, Faculty of Medicine, University of Southampton, Alpha House, Southampton Science Park, Southampton, SO16 7NS UK
- Karen Pickett
- Southampton Health Technology Assessments Centre (SHTAC), Wessex Institute, Faculty of Medicine, University of Southampton, Alpha House, Southampton Science Park, Southampton, SO16 7NS UK
- Wessex Institute, Faculty of Medicine, University of Southampton, Alpha House, Southampton Science Park, Southampton, SO16 7NS UK
- Gareth Griffiths
- Southampton Clinical Trials Unit, University of Southampton and Southampton University Hospital NHS Foundation Trust, Southampton General Hospital, Southampton, SO16 6YD UK
- Jeremy C. Wyatt
- Wessex Institute, Faculty of Medicine, University of Southampton, Alpha House, Southampton Science Park, Southampton, SO16 7NS UK
12
What makes an effective grants peer reviewer? An exploratory study of the necessary skills. PLoS One 2020; 15:e0232327. PMID: 32401806; PMCID: PMC7219739; DOI: 10.1371/journal.pone.0232327.
Abstract
This exploratory mixed methods study describes skills required to be an effective peer reviewer as a member of review panels conducted for federal agencies that fund research, and examines how reviewer experience and the use of technology within such panels impact reviewer skill development. Two specific review panel formats are considered: in-person face-to-face and virtual video conference. Data were collected through interviews with seven program officers and five expert peer review panelists, and surveys from 51 respondents. Results include the skills reviewers consider necessary for effective review panel participation, their assessment of the relative importance of these skills, how they are learned, and how review format affects skill development and improvement. Results are discussed relative to the peer review literature and with consideration of the importance of professional skills needed by successful scientists and peer reviewers.
13
Morgan B, Yu LM, Solomon T, Ziebland S. Assessing health research grant applications: A retrospective comparative review of a one-stage versus a two-stage application assessment process. PLoS One 2020; 15:e0230118. PMID: 32163468; PMCID: PMC7067561; DOI: 10.1371/journal.pone.0230118.
Abstract
Background Research funders use a wide variety of application assessment processes yet there is little evidence on their relative advantages and disadvantages. A broad distinction can be made between processes with a single stage assessment of full proposals and those that first invite an outline, with full proposals invited at a second stage only for those which are shortlisted. This paper examines the effects of changing from a one-stage to a two-stage process within the UK's National Institute for Health Research's (NIHR) Research for Patient Benefit (RfPB) Programme, which made this change in 2015. Methods A retrospective comparative design was used to compare eight one-stage funding competitions (912 applications) with eight two-stage funding competitions (1090 applications). Comparisons were made between the number of applications submitted, number of peer and lay reviews required, the duration of the funding round, average external peer review scores, and the total costs involved. Results There was a mean number of 114 applications per funding round for the one-stage process and 136 for the two-stage process. The one-stage process took a mean of 274 days and the two-stage process 348 days to complete, although those who were not funded (i.e. the majority) were informed at a mean of 195 days (mean 79 days earlier) under the two-stage process. The mean peer review score for full applications using the one-stage process was 6.46 and for the two-stage process 6.82, a 5.6% difference on a 1–10 scale (with 10 being the highest), but there was no significant difference between the lay reviewer scores. The one-stage process required a mean of 423 peer reviews and 102 lay reviewers and the two-stage process required a mean of 208 peer reviews and 50 lay reviews (mean difference of 215 peer reviews and 52 lay reviews) per funding round. 
Overall cost per funding round changed from £148,908 for the one-stage process to £105,342 for the two-stage process saving approximately £43,566 per round. Conclusion We conclude that a two-stage application process increases the number of applications submitted to a funding round, is less burdensome and more efficient for all those involved with the process, is cost effective and has a small increase in peer reviewer scores. For the addition of fewer than 11 weeks to the process substantial efficiencies are gained which benefit funders, applicants and science. Funding agencies should consider adopting a two-stage application assessment process.
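The derived figures in the abstract above (savings per round, extra duration, reduced review load) all follow from the quoted raw numbers. A minimal sketch recomputing them, using only values stated in the abstract:

```python
# Figures quoted from the Morgan et al. (2020) abstract; this sketch
# recomputes the derived numbers to show how they relate.

one_stage_cost, two_stage_cost = 148_908, 105_342  # GBP per funding round
saving = one_stage_cost - two_stage_cost           # reported as ~GBP 43,566

one_stage_days, two_stage_days = 274, 348
extra_weeks = (two_stage_days - one_stage_days) / 7  # "fewer than 11 weeks"

peer_reviews_saved = 423 - 208  # fewer peer reviews per round, two-stage
lay_reviews_saved = 102 - 50    # fewer lay reviews per round, two-stage

print(saving, round(extra_weeks, 1), peer_reviews_saved, lay_reviews_saved)
```

Running this reproduces the abstract's derived numbers (£43,566 saved; roughly 10.6 extra weeks; 215 and 52 fewer peer and lay reviews per round).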
Affiliation(s)
- Ben Morgan
- National Institute for Health Research Central Commissioning Facility, Twickenham, England, United Kingdom
| | - Ly-Mee Yu
- Nuffield Department of Primary Care Health Sciences, University of Oxford, Oxford, England, United Kingdom
| | - Tom Solomon
- Faculty of Health and Life Sciences, University of Liverpool, Liverpool, England, United Kingdom
| | - Sue Ziebland
- Nuffield Department of Primary Care Health Sciences, University of Oxford, Oxford, England, United Kingdom
| |
|
14
|
The CTSA External Reviewer Exchange Consortium (CEREC): Engagement and efficacy. J Clin Transl Sci 2019; 3:325-331. [PMID: 31827906 PMCID: PMC6885993 DOI: 10.1017/cts.2019.411] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/30/2019] [Revised: 08/26/2019] [Accepted: 08/28/2019] [Indexed: 11/05/2022] Open
Abstract
Introduction Many institutions evaluate applications for local seed funding by recruiting peer reviewers from their own institutional community. Smaller institutions, however, often face difficulty locating qualified local reviewers who are not in conflict with the proposal. As a larger pool of reviewers may be accessed through a cross-institutional collaborative process, nine Clinical and Translational Science Award (CTSA) hubs formed a consortium in 2016 to facilitate reviewer exchanges. Data were collected to evaluate the feasibility and preliminary efficacy of the consortium. Methods The CTSA External Reviewer Exchange Consortium (CEREC) has been supported by a custom-built web-based application that facilitates the process and tracks the efficiency and productivity of the exchange. Results All nine of the original CEREC members remain actively engaged in the exchange. Between January 2017 and May 2019, CEREC supported the review process for 23 individual calls for proposals. Out of the 412 reviews requested, 368 were received, for a fulfillment ratio of 89.3%. The yield on reviewer invitations has remained consistently high, with approximately one-third of invitations being accepted, and of the reviewers who agreed to provide a review, 88.3% submitted a complete review. Surveys of reviewers and pilot program administrators indicate high satisfaction with the process. Conclusions These data indicate that a reviewer exchange consortium is feasible, adds value to participating partners, and is sustainable over time.
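The CEREC throughput figures quoted above are internally consistent; a one-line check, using only numbers stated in the abstract:

```python
# Recomputing the CEREC fulfillment ratio from the quoted raw counts.
requested, received = 412, 368
fulfillment = received / requested  # reported in the abstract as 89.3%
print(f"{fulfillment:.1%}")
```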
|
15
|
Abstract
The return on research investment resulting from new breakthrough scientific discoveries may be decreasing over time due to the law of diminishing returns, the relative decrease of research funding in terms of purchasing power parity, and various activities gaming the system. By altering the grant-review process, the scientific community may directly address the third problem. There is evidence that peer reviews of research proposals may lack reliability and may produce invalid or inconsistent ratings. In addition, extreme focus on grantsmanship threatens to uproot a cornerstone principle: that scientific value should be the key driver in funding decision-making. This opinion (1) provides a justification of the need to consider alternative strategies to boost the impact of public investment in innovative scientific discovery, (2) proposes a framework for flipping the traditional front-loaded peer-review approach to allocating research funding into a new back-loaded assessment of scholarly return on investment, and (3) provokes the scientific community to accelerate the debate on alternative funding mechanisms, as the stakes of inaction may be very high.
Affiliation(s)
- Ivo D. Dinov
- Statistics Online Computational Resource, University of Michigan, Ann Arbor, MI, USA
- Department of Health Behavior and Biological Sciences, School of Nursing, University of Michigan, Ann Arbor, MI, USA
- Department of Computational Medicine and Bioinformatics, School of Medicine, University of Michigan, Ann Arbor, MI, USA
- Institute for Health Policy and Innovation, University of Michigan, Ann Arbor, MI, USA
| |
|
16
|
Barnett AG, Glisson SR, Gallo S. Do funding applications where peer reviewers disagree have higher citations? A cross-sectional study. F1000Res 2018; 7:1030. [PMID: 30345025 PMCID: PMC6171721 DOI: 10.12688/f1000research.15479.2] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Accepted: 10/10/2018] [Indexed: 01/08/2023] Open
Abstract
Background: Decisions about which applications to fund are generally based on the mean scores of a panel of peer reviewers. As well as the mean, a large disagreement between peer reviewers may also be worth considering, as it may indicate a high-risk application with a high return. Methods: We examined the peer reviewers' scores for 227 funded applications submitted to the American Institute of Biological Sciences between 1999 and 2006. We examined the mean score and two measures of reviewer disagreement: the standard deviation and range. The outcome variable was the relative citation ratio, which is the number of citations from all publications associated with the application, standardised by field and publication year. Results: There was a clear increase in relative citations for applications with a better mean. There was no association between relative citations and either of the two measures of disagreement. Conclusions: We found no evidence that reviewer disagreement was able to identify applications with a higher than average return. However, this is the first study to empirically examine this association, and it would be useful to examine whether reviewer disagreement is associated with research impact in other funding schemes and in larger sample sizes.
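The study above relates a panel's mean score to two disagreement measures, the standard deviation and the range. A minimal sketch of how these three statistics are computed, using hypothetical reviewer scores (the actual AIBS data are not reproduced here):

```python
import statistics

# Hypothetical scores from a four-person review panel for one application.
scores = [5.0, 6.5, 8.0, 7.5]

mean = statistics.mean(scores)            # basis for most funding decisions
sd = statistics.stdev(scores)             # disagreement measure 1
score_range = max(scores) - min(scores)   # disagreement measure 2

print(round(mean, 2), round(sd, 2), score_range)
```

The study found that citations tracked the mean but neither disagreement measure, i.e. only the first of these three statistics predicted the relative citation ratio.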
Affiliation(s)
- Adrian G Barnett
- Institute of Health and Biomedical Innovation & School of Public Health and Social Work, Queensland University of Technology, Brisbane, QLD, 4059, Australia
| | - Scott R Glisson
- American Institute of Biological Sciences, Reston, VA 20191, USA
| | - Stephen Gallo
- American Institute of Biological Sciences, Reston, VA 20191, USA
| |
|