1
Roche DG, O'Dea RE, Kerr KA, Rytwinski T, Schuster R, Nguyen VM, Young N, Bennett JR, Cooke SJ. Closing the knowledge-action gap in conservation with open science. Conservation Biology 2022; 36:e13835. PMID: 34476839; PMCID: PMC9300006; DOI: 10.1111/cobi.13835.
Abstract
The knowledge-action gap in conservation science and practice occurs when research outputs do not result in actions to protect or restore biodiversity. Among the diverse and complex reasons for this gap, three barriers are fundamental: knowledge is often unavailable to practitioners, challenging to interpret, or difficult to use. Problems of availability, interpretability, and usability are solvable with open science practices. We considered the benefits and challenges of three open science practices for use by conservation scientists and practitioners. First, open access publishing makes the scientific literature available to all. Second, open materials (detailed methods, data, code, and software) increase the transparency and use of research findings. Third, open education resources allow conservation scientists and practitioners to acquire the skills needed to use research outputs. The long-term adoption of open science practices would help researchers and practitioners achieve conservation goals more quickly and efficiently and reduce inequities in information sharing. However, short-term costs for individual researchers (insufficient institutional incentives to engage in open science and knowledge mobilization) remain a challenge. We caution against a passive approach to sharing that simply involves making information available. Instead, we advocate a proactive stance toward transparency, communication, collaboration, and capacity building that involves seeking out and engaging with potential users to maximize the environmental and societal impact of conservation science.
Affiliation(s)
- Dominique G. Roche
- Canadian Centre for Evidence-Based Conservation, Department of Biology and Institute of Environmental and Interdisciplinary Science, Carleton University, Ottawa, Ontario, Canada
- Institut de Biologie, Université de Neuchâtel, Neuchâtel, Switzerland
- Rose E. O'Dea
- Evolution & Ecology Research Centre and School of Biological and Environmental Sciences, University of New South Wales, Sydney, New South Wales, Australia
- Kecia A. Kerr
- Canadian Parks and Wilderness Society (CPAWS) - Northern Alberta, Edmonton, Alberta, Canada
- Trina Rytwinski
- Canadian Centre for Evidence-Based Conservation, Department of Biology and Institute of Environmental and Interdisciplinary Science, Carleton University, Ottawa, Ontario, Canada
- Richard Schuster
- Nature Conservancy of Canada, Vancouver, British Columbia, Canada
- Department of Biology, Carleton University, Ottawa, Ontario, Canada
- Vivian M. Nguyen
- Canadian Centre for Evidence-Based Conservation, Department of Biology and Institute of Environmental and Interdisciplinary Science, Carleton University, Ottawa, Ontario, Canada
- Nathan Young
- School of Sociological and Anthropological Studies, Faculty of Social Sciences, University of Ottawa, Ottawa, Ontario, Canada
- Joseph R. Bennett
- Canadian Centre for Evidence-Based Conservation, Department of Biology and Institute of Environmental and Interdisciplinary Science, Carleton University, Ottawa, Ontario, Canada
- Steven J. Cooke
- Canadian Centre for Evidence-Based Conservation, Department of Biology and Institute of Environmental and Interdisciplinary Science, Carleton University, Ottawa, Ontario, Canada
2
Holst MR, Faust A, Strech D. Do German university medical centres promote robust and transparent research? A cross-sectional study of institutional policies. Health Res Policy Syst 2022; 20:39. PMID: 35413846; PMCID: PMC9004041; DOI: 10.1186/s12961-022-00841-2.
Abstract
Background: In light of replication and translational failures, biomedical research practices have recently come under scrutiny. Experts have pointed out that the current incentive structures at research institutions do not sufficiently incentivise researchers to invest in robustness and transparency, and instead incentivise them to optimize their fitness in the struggle for publications and grants. This cross-sectional study aimed to describe whether and how relevant policies of university medical centres in Germany support the robust and transparent conduct of research, and how prevalent traditional metrics are. Methods: For 38 German university medical centres, we searched for institutional policies for academic degrees and academic appointments, as well as websites for their core facilities and research in general, between December 2020 and February 2021. We screened the documents for mentions of indicators of robust and transparent research (study registration; reporting of results; sharing of research data, code and protocols; open access; and measures to increase robustness) and for mentions of more traditional metrics of career progression (number of publications; number and value of awarded grants; impact factors; and authorship order). Results: While open access was mentioned in 16% of PhD regulations, other indicators of robust and transparent research were mentioned in less than 10% of institutional policies for academic degrees and academic appointments. These indicators were mentioned more frequently on the core facility and general research websites. Institutional policies for academic degrees and academic appointments frequently mentioned traditional metrics. Conclusions: References to robust and transparent research practices are, with a few exceptions, generally uncommon in institutional policies at German university medical centres, while traditional criteria for academic promotion and tenure still prevail.
Supplementary Information The online version contains supplementary material available at 10.1186/s12961-022-00841-2.
Affiliation(s)
- M R Holst
- Berlin Institute of Health at Charité - Universitätsmedizin Berlin, QUEST Center for Responsible Research, Charitéplatz 1, 10117 Berlin, Germany
- Medizinische Hochschule Hannover, Institute of Ethics, History and Philosophy of Medicine, Carl-Neuberg-Str. 1, 30625 Hannover, Germany
- A Faust
- Berlin Institute of Health at Charité - Universitätsmedizin Berlin, QUEST Center for Responsible Research, Charitéplatz 1, 10117 Berlin, Germany
- Medizinische Hochschule Hannover, Institute of Ethics, History and Philosophy of Medicine, Carl-Neuberg-Str. 1, 30625 Hannover, Germany
- D Strech
- Berlin Institute of Health at Charité - Universitätsmedizin Berlin, QUEST Center for Responsible Research, Charitéplatz 1, 10117 Berlin, Germany
- Medizinische Hochschule Hannover, Institute of Ethics, History and Philosophy of Medicine, Carl-Neuberg-Str. 1, 30625 Hannover, Germany
3
Rice DB, Raffoul H, Ioannidis JP, Moher D. Academic criteria for promotion and tenure in faculties of medicine: a cross-sectional study of the Canadian U15 universities. Facets (Ott) 2021. DOI: 10.1139/facets-2020-0044.
Abstract
Background: The objective of this study was to determine the presence of a set of prespecified criteria used to assess scientists for promotion and tenure within faculties of medicine among the U15 Group of Canadian Research Universities. Methods: Each faculty guideline for assessing promotion and tenure was reviewed, and the presence of five traditional criteria (peer-reviewed publications, authorship order, journal impact factor, grant funding, and national/international reputation) and seven nontraditional criteria (citations, data sharing, publishing in open access mediums, accommodating leaves, alternative ways for sharing research, registering research, and using reporting guidelines) was recorded by two reviewers. Results: Among the U15 institutions, four of the five traditional criteria (80.0%) were present in at least one promotion guideline, whereas only three of the seven nontraditional criteria (42.9%) were present in any promotion guideline. When assessing full professors, guidelines listed a median of three traditional criteria versus one nontraditional criterion. Conclusion: This study demonstrates that faculties of medicine among the U15 Group of Canadian Research Universities base assessments for promotion and tenure on traditional criteria. Some of these metrics may reinforce problematic practices in medical research. These faculties should consider incentivizing criteria that can enhance the quality of medical research.
Affiliation(s)
- Danielle B. Rice
- Department of Psychology, McGill University, Montreal, QC H3A 1G1, Canada
- Ottawa Hospital Research Institute, Ottawa, ON K1H 8L6, Canada
- Hana Raffoul
- Ottawa Hospital Research Institute, Ottawa, ON K1H 8L6, Canada
- Faculty of Engineering, University of Waterloo, Waterloo, ON N2L 3G1, Canada
- John P.A. Ioannidis
- Department of Medicine, Department of Health Research and Policy, Department of Biomedical Data Science, and Department of Statistics, Stanford University, Stanford, CA 94305-5101, USA
- Meta-Research Innovation Center at Stanford (METRICS), Stanford University, Stanford, CA 94305-5101, USA
- David Moher
- Centre for Journalology, Clinical Epidemiology Program, Ottawa Hospital Research Institute, Ottawa, ON K1H 8L6, Canada
4
Forero DA, Lopez-Leon S, Perry G. A brief guide to the science and art of writing manuscripts in biomedicine. J Transl Med 2020; 18:425. PMID: 33167977; PMCID: PMC7653709; DOI: 10.1186/s12967-020-02596-2.
Abstract
Publishing articles in international scientific journals is the primary method for communicating validated research findings and ideas, and journal articles are commonly used as a major input for evaluations of researchers and institutions. Few articles have previously been published about the different aspects of writing high-quality articles. In this manuscript, we provide an updated, brief guide to the multiple dimensions of writing manuscripts in the health and biological sciences, from current, international, and interdisciplinary perspectives and from our expertise as authors, peer reviewers, and editors. We offer key suggestions for writing the major sections of a manuscript (e.g., title, abstract, introduction, methods, results, and discussion) and for submitting it, and we provide an overview of the peer review process and of the post-publication impact of articles.
Affiliation(s)
- Diego A Forero
- Health and Sport Sciences Research Group, School of Health and Sport Sciences, Fundación Universitaria del Área Andina, Bogotá, Colombia
- MSc Program in Epidemiology, School of Health and Sport Sciences, Fundación Universitaria del Área Andina, Bogotá, Colombia
- Sandra Lopez-Leon
- Global Drug Development, Novartis Pharmaceuticals Corporation, East Hanover, NJ, USA
- George Perry
- Department of Biology and Neurosciences Institute, The University of Texas at San Antonio, San Antonio, TX, USA
5
Rice DB, Raffoul H, Ioannidis JPA, Moher D. Academic criteria for promotion and tenure in biomedical sciences faculties: cross sectional analysis of international sample of universities. BMJ 2020; 369:m2081. PMID: 32586791; PMCID: PMC7315647; DOI: 10.1136/bmj.m2081.
Abstract
OBJECTIVE To determine the presence of a set of pre-specified traditional and non-traditional criteria used to assess scientists for promotion and tenure in faculties of biomedical sciences among universities worldwide. DESIGN Cross sectional study. SETTING International sample of universities. PARTICIPANTS 170 randomly selected universities from the Leiden ranking of world universities list. MAIN OUTCOME MEASURE Presence of five traditional (for example, number of publications) and seven non-traditional (for example, data sharing) criteria in guidelines for assessing assistant professors, associate professors, and professors and the granting of tenure in institutions with biomedical faculties. RESULTS A total of 146 institutions had faculties of biomedical sciences, and 92 had eligible guidelines available for review. Traditional criteria of peer reviewed publications, authorship order, journal impact factor, grant funding, and national or international reputation were mentioned in 95% (n=87), 37% (34), 28% (26), 67% (62), and 48% (44) of the guidelines, respectively. Conversely, among non-traditional criteria, only citations (any mention in 26%; n=24) and accommodations for employment leave (37%; 34) were relatively commonly mentioned. Mention of alternative metrics for sharing research (3%; n=3) and data sharing (1%; 1) was rare, and three criteria (publishing in open access mediums, registering research, and adhering to reporting guidelines) were not found in any guidelines reviewed. Among guidelines for assessing promotion to full professor, traditional criteria were more commonly reported than non-traditional criteria (traditional criteria 54.2%, non-traditional items 9.5%; mean difference 44.8%, 95% confidence interval 39.6% to 50.0%; P=0.001). 
Notable differences were observed across continents in whether guidelines were accessible (Australia 100% (6/6), North America 97% (28/29), Europe 50% (27/54), Asia 58% (29/50), South America 17% (1/6)), with more subtle differences in the use of specific criteria. CONCLUSIONS This study shows that the evaluation of scientists emphasises traditional criteria as opposed to non-traditional criteria. This may reinforce research practices that are known to be problematic while insufficiently supporting the conduct of better quality research and open science. Institutions should consider incentivising non-traditional criteria. STUDY REGISTRATION Open Science Framework (https://osf.io/26ucp/?view_only=b80d2bc7416543639f577c1b8f756e44).
Affiliation(s)
- Danielle B Rice
- Department of Psychology, McGill University, Montreal, QC, Canada
- Ottawa Hospital Research Institute, Ottawa, ON, Canada
- Hana Raffoul
- Ottawa Hospital Research Institute, Ottawa, ON, Canada
- Faculty of Engineering, University of Waterloo, Waterloo, ON, Canada
- John P A Ioannidis
- Department of Medicine, Stanford University, Stanford, CA, USA
- Department of Health Research and Policy, Stanford University, Stanford, CA, USA
- Department of Biomedical Data Science and Department of Statistics, Stanford University, Stanford, CA, USA
- Meta-Research Innovation Center at Stanford (METRICS), Stanford University, Stanford, CA, USA
- David Moher
- Centre for Journalology, Clinical Epidemiology Program, Ottawa Hospital Research Institute, Ottawa, ON, Canada
- School of Epidemiology and Public Health, University of Ottawa, Ottawa, ON, Canada
6
Raju R, Claassen J, Pietersen J, Abrahamse D. An authentic flip subscription model for Africa: library as publisher service. Library Management 2020. DOI: 10.1108/lm-03-2020-0054.
Abstract
Purpose: This paper investigates the fitness for purpose of the flip model proposed by the Max Planck Society and Plan S for the African environment. This flipped model is examined against the backdrop of African imperatives, which are much broader than simply flipping a journal pricing model from subscription to open access. The paper also seeks a viable alternative model that supports the growth and dissemination of African scholarship. Design/methodology/approach: This paper adopts a descriptive research methodology, which allows for an in-depth analysis of a phenomenon. Using this method, the paper describes a flip model proposed by global north entities that does not augur well for the growth of the OA movement in Africa. Findings: The findings demonstrate that the global north centric flipped model exacerbates inequality in the publishing landscape by further marginalizing research voices from the global south. Africa is in dire need of an alternative that denorthernizes the publishing landscape, promotes equity and equality, and is more inclusive of research voices from Africa. South African academic libraries have demonstrated their willingness to experiment with and roll out library publishing services. This proof of concept has been extended into a continental platform for the publication and dissemination of African scholarship. Originality/value: This paper will be of interest to those who are grappling with viable alternatives to the current flip models, including, inter alia, university leadership. It will also be of interest to global north libraries that are embarking on library publishing without the social justice obligation but are committed to the OA movement.
7
8
Bakkum MJ, Tichelaar J, Wellink A, Richir MC, van Agtmael MA. Digital Learning to Improve Safe and Effective Prescribing: A Systematic Review. Clin Pharmacol Ther 2019; 106:1236-1245. PMID: 31206612; PMCID: PMC6896235; DOI: 10.1002/cpt.1549.
Abstract
With the aim to modernize and harmonize prescribing education, the European Association for Clinical Pharmacology and Therapeutics (EACPT) Working Group on education recommended the extensive use and distribution of digital learning resources (DLRs). However, it is unclear whether the complex task of prescribing medicine can be taught digitally. Therefore, the aim of this review was to investigate the effect of diverse DLRs in clinical pharmacology and therapeutics education. Databases PubMed, EMBASE, CINAHL, ERIC, and CENTRAL were systematically searched. Sixty-five articles were included in the analyses. Direct effects on patients were studied, but not detected, in six articles. Skills and behavior were studied in 11 articles, 8 of which reported positive effects. Knowledge acquisition was investigated in 19 articles, all with positive effects. Qualitative analyses yielded 10 recommendations for the future development of DLRs. Digital learning is effective in teaching knowledge, attitudes, and skills associated with safe and effective prescribing.
Affiliation(s)
- Michiel J Bakkum
- Department of Internal Medicine, Section Pharmacotherapy, Amsterdam UMC, Vrije Universiteit Amsterdam, Amsterdam, The Netherlands
- Research and Expertise Centre in Pharmacotherapy Education (RECIPE), Amsterdam, The Netherlands
- Jelle Tichelaar
- Department of Internal Medicine, Section Pharmacotherapy, Amsterdam UMC, Vrije Universiteit Amsterdam, Amsterdam, The Netherlands
- Research and Expertise Centre in Pharmacotherapy Education (RECIPE), Amsterdam, The Netherlands
- European Association for Clinical Pharmacology and Therapeutics (EACPT) Education Working Group, Frankfurt, Germany
- Anne Wellink
- Department of Internal Medicine, Section Pharmacotherapy, Amsterdam UMC, Vrije Universiteit Amsterdam, Amsterdam, The Netherlands
- Research and Expertise Centre in Pharmacotherapy Education (RECIPE), Amsterdam, The Netherlands
- Milan C Richir
- Department of Internal Medicine, Section Pharmacotherapy, Amsterdam UMC, Vrije Universiteit Amsterdam, Amsterdam, The Netherlands
- Research and Expertise Centre in Pharmacotherapy Education (RECIPE), Amsterdam, The Netherlands
- Michiel A van Agtmael
- Department of Internal Medicine, Section Pharmacotherapy, Amsterdam UMC, Vrije Universiteit Amsterdam, Amsterdam, The Netherlands
- Research and Expertise Centre in Pharmacotherapy Education (RECIPE), Amsterdam, The Netherlands
- European Association for Clinical Pharmacology and Therapeutics (EACPT) Education Working Group, Frankfurt, Germany
9
Das R, Keep B, Washington P, Riedel-Kruse IH. Scientific Discovery Games for Biomedical Research. Annu Rev Biomed Data Sci 2019; 2:253-279. PMID: 34308269; DOI: 10.1146/annurev-biodatasci-072018-021139.
Abstract
Over the past decade, scientific discovery games (SDGs) have emerged as a viable approach for biomedical research, engaging hundreds of thousands of volunteer players and resulting in numerous scientific publications. After describing the origins of this novel research approach, we review the scientific output of SDGs across molecular modeling, sequence alignment, neuroscience, pathology, cellular biology, genomics, and human cognition. We find compelling results and technical innovations arising in problem-oriented games such as Foldit and Eterna and in data-oriented games such as EyeWire and Project Discovery. We discuss emergent properties of player communities shared across different projects, including the diversity of communities and the extraordinary contributions of some volunteers, such as paper writing. Finally, we highlight connections to artificial intelligence, biological cloud laboratories, new game genres, science education, and open science that may drive the next generation of SDGs.
Affiliation(s)
- Rhiju Das
- Department of Biochemistry and Department of Physics, Stanford University, Stanford, California 94305, USA
- Benjamin Keep
- Department of Learning Sciences, Stanford University, Stanford, California 94305, USA
- Peter Washington
- Department of Bioengineering, Stanford University, Stanford, California 94305, USA
10
Dougherty MR, Slevc LR, Grand JA. Making Research Evaluation More Transparent: Aligning Research Philosophy, Institutional Values, and Reporting. Perspectives on Psychological Science 2019; 14:361-375. DOI: 10.1177/1745691618810693.
Abstract
There is a growing interest in changing the culture of psychology to improve the quality of our science. At the root of this interest is concern over the reproducibility of key findings. A variety of large-scale replication attempts have revealed that several previously published effects cannot be reproduced, whereas other analyses indicate that the published literature is rife with underpowered studies and publication bias. These revelations suggest that it is time to change how psychological science is carried out and increase the transparency of reporting. We argue that change will be slow until institutions adopt new procedures for evaluating scholarly activity. We consider three actions that individuals and departments can take to facilitate change throughout psychological science: the development of individualized research-philosophy statements, the creation of an annotated curriculum vitae to improve the transparency of scholarly reporting, and the use of a formal evaluative system that explicitly captures behaviors that support reproducibility. Our recommendations build on proposals for open science by enabling researchers to have a voice in articulating (and contextualizing) how they would like their work to be evaluated and by providing a mechanism for more detailed and transparent reporting of scholarly activities.
11
Katz DS, Allen G, Barba LA, Berg DR, Bik H, Boettiger C, Borgman CL, Brown CT, Buck S, Burd R, de Waard A, Eve MP, Granger BE, Greenberg J, Howe A, Howe B, Khanna M, Killeen TL, Mayernik M, McKiernan E, Mentzel C, Merchant N, Niemeyer KE, Noren L, Nusser SM, Reed DA, Seidel E, Smith M, Spies JR, Turk M, Van Horn JD, Walsh J. The principles of tomorrow's university. F1000Res 2018; 7:1926. PMID: 30687499; PMCID: PMC6338243; DOI: 10.12688/f1000research.17425.1.
Abstract
In the 21st Century, research is increasingly data- and computation-driven. Researchers, funders, and the larger community today emphasize the traits of openness and reproducibility. In March 2017, 13 mostly early-career research leaders who are building their careers around these traits came together with ten university leaders (presidents, vice presidents, and vice provosts), representatives from four funding agencies, and eleven organizers and other stakeholders in an NIH- and NSF-funded one-day, invitation-only workshop titled "Imagining Tomorrow's University." Workshop attendees were charged with launching a new dialog around open research - the current status, opportunities for advancement, and challenges that limit sharing. The workshop examined how the internet-enabled research world has changed, and how universities need to change to adapt commensurately, aiming to understand how universities can and should make themselves competitive and attract the best students, staff, and faculty in this new world. During the workshop, the participants re-imagined scholarship, education, and institutions for an open, networked era, to uncover new opportunities for universities to create value and serve society. They expressed the results of these deliberations as a set of 22 principles of tomorrow's university across six areas: credit and attribution, communities, outreach and engagement, education, preservation and reproducibility, and technologies. Activities that follow on from workshop results take one of three forms. First, since the workshop, a number of workshop authors have further developed and published their white papers to make their reflections and recommendations more concrete. These authors are also conducting efforts to implement these ideas, and to make changes in the university system. Second, we plan to organise a follow-up workshop that focuses on how these principles could be implemented. 
Third, we believe that the outcomes of this workshop support and are connected with recent theoretical work on the position and future of open knowledge institutions.
Affiliation(s)
- Daniel S. Katz
- Department of Computer Science, University of Illinois at Urbana-Champaign, Urbana, IL, USA
- Department of Electrical and Computer Engineering, University of Illinois at Urbana-Champaign, Urbana, IL, USA
- National Center for Supercomputing Applications, University of Illinois at Urbana-Champaign, Urbana, IL, USA
- School of Information Sciences, University of Illinois at Urbana-Champaign, Urbana, IL, USA
- Gabrielle Allen
- Department of Computer Science, University of Illinois at Urbana-Champaign, Urbana, IL, USA
- National Center for Supercomputing Applications, University of Illinois at Urbana-Champaign, Urbana, IL, USA
- Department of Astronomy, University of Illinois at Urbana-Champaign, Urbana, IL, USA
- College of Education, University of Illinois at Urbana-Champaign, Urbana, IL, USA
- Devin R. Berg
- Engineering and Technology Department, University of Wisconsin-Stout, Menomonie, WI, USA
- Holly Bik
- Department of Nematology, University of California-Riverside, Riverside, CA, USA
- Carl Boettiger
- Department of Environmental Science, Policy and Management, University of California, Berkeley, Berkeley, CA, USA
- C. Titus Brown
- School of Veterinary Medicine, University of California, Davis, Davis, CA, USA
- Stuart Buck
- Laura and John Arnold Foundation, Houston, TX, USA
- Randy Burd
- Long Island University, Brookville, NY, USA
- Brian E. Granger
- California Polytechnic State University, San Luis Obispo, CA, USA
- Bill Howe
- University of Washington, Seattle, WA, USA
- May Khanna
- Department of Pharmacology, University of Arizona, Tucson, AZ, USA
- Center for Innovation in Brain Science, University of Arizona, Tucson, AZ, USA
- Erin McKiernan
- Departamento de Física, Universidad Nacional Autónoma de México, Mexico City, Mexico
- Chris Mentzel
- Gordon and Betty Moore Foundation, Palo Alto, CA, USA
- Nirav Merchant
- UA Data Science Institute, University of Arizona, Tucson, AZ, USA
- Kyle E. Niemeyer
- School of Mechanical, Industrial, and Manufacturing Engineering, Oregon State University, Corvallis, OR, USA
- Matt Turk
- School of Information Sciences, University of Illinois at Urbana-Champaign, Urbana, IL, USA
- John D. Van Horn
- USC Mark and Mary Stevens Neuroimaging and Informatics Institute, Keck School of Medicine, University of Southern California, Los Angeles, CA, USA
- Jay Walsh
- Northwestern University, Evanston, IL, USA
12
Nüst D, Granell C, Hofer B, Konkol M, Ostermann FO, Sileryte R, Cerutti V. Reproducible research and GIScience: an evaluation using AGILE conference papers. PeerJ 2018; 6:e5072. PMID: 30013826; PMCID: PMC6047504; DOI: 10.7717/peerj.5072.
Abstract
The demand for reproducible research is on the rise in disciplines concerned with data analysis and computational methods. We therefore reviewed current recommendations for reproducible research and translated them into criteria for assessing the reproducibility of articles in the field of geographic information science (GIScience). Using these criteria, we assessed a sample of GIScience studies from the Association of Geographic Information Laboratories in Europe (AGILE) conference series, and we collected feedback about the assessment from the study authors. The author feedback indicates that although authors support the concept of performing reproducible research, the incentives for doing so in practice are too small. We therefore propose concrete actions for individual researchers and the GIScience conference series to improve transparency and reproducibility. For example, to support researchers in producing reproducible work, the GIScience conference series could offer awards and paper badges, provide author guidelines for computational research, and publish articles in Open Access formats.
Affiliation(s)
- Daniel Nüst
- Institute for Geoinformatics, University of Münster, Münster, Germany
- Carlos Granell
- Institute of New Imaging Technologies, Universitat Jaume I de Castellón, Castellón, Spain
- Barbara Hofer
- Interfaculty Department of Geoinformatics - Z_GIS, University of Salzburg, Salzburg, Austria
- Markus Konkol
- Institute for Geoinformatics, University of Münster, Münster, Germany
- Frank O. Ostermann
- Faculty of Geo-Information Science and Earth Observation (ITC), University of Twente, Enschede, The Netherlands
- Rusne Sileryte
- Faculty of Architecture and the Built Environment, Delft University of Technology, Delft, The Netherlands
- Valentina Cerutti
- Faculty of Geo-Information Science and Earth Observation (ITC), University of Twente, Enschede, The Netherlands

13
Pearce JM. Sponsored Libre Research Agreements to Create Free and Open Source Software and Hardware. INVENTIONS 2018. [DOI: 10.3390/inventions3030044] [Citation(s) in RCA: 9] [Impact Index Per Article: 1.5] [Reference Citation Analysis] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 11/16/2022]
14
Abstract
In this article, we describe our views on the benefits, and possible downsides, of openness in engineering research. We attempt to examine the issue from multiple perspectives, including reasons and motivations for introducing open practices into an engineering researcher's workflow and the challenges faced by scholars looking to do so. Further, we present our thoughts and reflections on the role that open engineering research can play in defining the purpose and activities of the university. We have made some specific recommendations on how the public university can recommit to and push the boundaries of its role as the creator and promoter of public knowledge. In doing so, the university will further demonstrate its vital role in the continued economic, social, and technological development of society. We have also included some thoughts on how this applies specifically to the field of engineering and how a culture of openness and sharing within the engineering community can help drive societal development.
Affiliation(s)
- Devin R. Berg
- Engineering & Technology Department, University of Wisconsin-Stout, Menomonie, WI, 54751, USA
- Kyle E. Niemeyer
- School of Mechanical, Industrial, and Manufacturing Engineering, Oregon State University, Corvallis, OR, 97331, USA

15
Moher D, Naudet F, Cristea IA, Miedema F, Ioannidis JPA, Goodman SN. Assessing scientists for hiring, promotion, and tenure. PLoS Biol 2018; 16:e2004089. [PMID: 29596415 PMCID: PMC5892914 DOI: 10.1371/journal.pbio.2004089] [Citation(s) in RCA: 175] [Impact Index Per Article: 29.2] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Revised: 04/10/2018] [Indexed: 11/24/2022] Open
Abstract
Assessment of researchers is necessary for decisions of hiring, promotion, and tenure. A burgeoning number of scientific leaders believe the current system of faculty incentives and rewards is misaligned with the needs of society and disconnected from the evidence about the causes of the reproducibility crisis and suboptimal quality of the scientific publication record. To address this issue, particularly for the clinical and life sciences, we convened a 22-member expert panel workshop in Washington, DC, in January 2017. Twenty-two academic leaders, funders, and scientists participated in the meeting. As background for the meeting, we completed a selective literature review of 22 key documents critiquing the current incentive system. From each document, we extracted how the authors perceived the problems of assessing science and scientists, the unintended consequences of maintaining the status quo for assessing scientists, and details of their proposed solutions. The resulting table was used as a seed for participant discussion. This resulted in six principles for assessing scientists and associated research and policy implications. We hope the content of this paper will serve as a basis for establishing best practices and redesigning the current approaches to assessing scientists by the many players involved in that process.
Affiliation(s)
- David Moher
- Centre for Journalology, Clinical Epidemiology Program, Ottawa Hospital Research Institute, Ottawa, Canada
- Meta-Research Innovation Center at Stanford (METRICS), Stanford University, Stanford, California, United States of America
- Florian Naudet
- Meta-Research Innovation Center at Stanford (METRICS), Stanford University, Stanford, California, United States of America
- INSERM CIC-P 1414, Clinical Investigation Center, CHU Rennes, Rennes 1 University, Rennes, France
- Ioana A. Cristea
- Meta-Research Innovation Center at Stanford (METRICS), Stanford University, Stanford, California, United States of America
- Department of Clinical Psychology and Psychotherapy, Babeş-Bolyai University, Cluj-Napoca, Romania
- Frank Miedema
- Executive Board, UMC Utrecht, Utrecht University, Utrecht, the Netherlands
- John P. A. Ioannidis
- Meta-Research Innovation Center at Stanford (METRICS), Stanford University, Stanford, California, United States of America
- Department of Medicine, Stanford University, Stanford, California, United States of America
- Department of Health Research and Policy, Stanford University, Stanford, California, United States of America
- Department of Biomedical Data Science, Stanford University, Stanford, California, United States of America
- Department of Statistics, Stanford University, Stanford, California, United States of America
- Steven N. Goodman
- Meta-Research Innovation Center at Stanford (METRICS), Stanford University, Stanford, California, United States of America
- Department of Medicine, Stanford University, Stanford, California, United States of America
- Department of Health Research and Policy, Stanford University, Stanford, California, United States of America