1. Nielsen SB, Lemire S, Bourgeois I, Fierro LA. Mapping the evaluation capacity building landscape: A bibliometric analysis of scholarly communities and themes. Eval Program Plann 2023;99:102318. PMID: 37257358. DOI: 10.1016/j.evalprogplan.2023.102318.
Abstract
Evaluation capacity building (ECB) continues to attract attention. Over the past two decades, a broad literature has emerged, covering the dimensions, contexts, and practices of ECB. This article presents findings from a bibliometric analysis of ECB articles published in six evaluation journals from 2000 to 2019. The findings shed light on the communities of scholars that contribute to the ECB knowledge base, the connections between these communities, and the themes they cover. Informed by the findings, future directions for ECB scholarship and how bibliometric analysis can supplement more established approaches to literature reviews are discussed.
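
A hedged illustration of the method this abstract names: bibliometric studies of scholarly communities commonly build a co-authorship (or co-citation) network and apply community detection. The Python sketch below uses invented records and networkx's greedy modularity routine; the authors' actual corpus, software, and analytic choices are not given in this abstract.

    from itertools import combinations

    import networkx as nx
    from networkx.algorithms.community import greedy_modularity_communities

    # Hypothetical bibliographic records; the real study drew on articles
    # published in six evaluation journals between 2000 and 2019.
    records = [
        {"title": "Article A", "authors": ["Nielsen", "Lemire"]},
        {"title": "Article B", "authors": ["Bourgeois", "Buetti", "Jafary"]},
        {"title": "Article C", "authors": ["Nielsen", "Bourgeois"]},
    ]

    G = nx.Graph()
    for rec in records:
        # Link every pair of co-authors; the edge weight counts joint papers.
        for a, b in combinations(sorted(set(rec["authors"])), 2):
            weight = G.get_edge_data(a, b, default={"weight": 0})["weight"]
            G.add_edge(a, b, weight=weight + 1)

    # Greedy modularity maximization yields candidate scholarly communities.
    for i, community in enumerate(greedy_modularity_communities(G), start=1):
        print(f"Community {i}: {sorted(community)}")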

2. Buetti D, Bourgeois I, Jafary M. Examining the competencies required by evaluation capacity builders in community-based organizations. Eval Program Plann 2023;97:102242. PMID: 36736193. DOI: 10.1016/j.evalprogplan.2023.102242.
Abstract
Increasing demand for evidence generated through program evaluation has led many community-based organizations (CBOs) to seek external support for evaluation capacity building (ECB). However, studies have yet to explore the essential competencies required by evaluation capacity builders working in the community sector. Our qualitative study aimed to examine the perceptions of ECB practitioners (n = 12) regarding essential competencies for building evaluation capacity in this sector. Our findings reveal that ECB practice requires competencies not found in known evaluation competency frameworks, such as instructional design, knowledge of organizational change models, motivating stakeholders, and understanding of the community sector. Our findings provide valuable information to help guide future education and training related to building the evaluation capacity of community organizations.

Affiliations:
- David Buetti: Faculty of Education, University of Ottawa, Canada.
- Maziar Jafary: School of Sociological and Anthropological Studies, Faculty of Social Sciences, University of Ottawa, Canada.

3. Srivastava A. Challenges for evaluation practices and innovative approaches: Lessons during COVID-19 pandemic. Eval Program Plann 2022;92:102095. PMID: 35500477. PMCID: PMC9020492. DOI: 10.1016/j.evalprogplan.2022.102095.
Abstract
The COVID-19 pandemic affected every country, developed and developing economies alike. It led to a dramatic loss of human life worldwide and presented an unprecedented challenge to public health, food systems, and the world of work. Conducting evaluation during the pandemic was even more challenging than evaluation in conflict areas. Sudden lockdowns and sustained restrictions were unexpected and disrupted evaluators' plans for ongoing as well as forthcoming evaluation activities. Secondary research was hampered alongside primary data collection, as knowledge resource centres and libraries closed. Primary data collection was not only halted outright; even for evaluations where it had been completed, electronic entry of filled-in survey schedules stalled for a time. The paper discusses the critical components of evaluation that are affected in pandemic-like situations: the use of participatory evaluation techniques, missed evidence-based policy decisions, external and internal validity not being ensured, and ethical norms being compromised. To overcome such situations, the evaluation community should be ready with solutions such as artificial intelligence, computer-assisted interviews, capacity building of community members for participatory evaluation, and mandatory ethical review of evaluation protocols.

Affiliations:
- Alok Srivastava: Centre for Media Studies, 34-B Community Centre, Saket, New Delhi 110017, India.

4. LaMarre A, Riley B, Jain R, Zupko B, Buetti D. Chronic disease prevention evaluation in Ontario's public health system: a qualitative needs assessment. Can J Public Health 2020;111:1002-1010. PMID: 32504307. DOI: 10.17269/s41997-020-00317-2.
Abstract
OBJECTIVES: Building evaluation capacity for chronic disease prevention (CDP) is a critical step in ensuring the effectiveness of CDP programming over time. In this article, we highlight the findings of the qualitative arm of a mixed-methods needs assessment designed to assess the gaps and areas of strength within Ontario's public health system with respect to CDP evaluation.
METHODS: We conducted 29 interviews and focus groups with representatives from 25 public health units (PHUs) and analyzed the data using thematic analysis. We sought to understand what gaps and challenges exist in the Ontario public health system around CDP evaluation.
RESULTS: Challenges facing Ontario's PHUs in CDP evaluation include variation and centralization of capacity to evaluate, as well as competing priorities limiting the development of evaluative thinking. Participating PHUs identified the need for evaluation capacity building (ECB) strategies grounded in an understanding of the unique contexts in which they work and a desire for guidance in conducting complex and thoughtful evaluation. Moving forward, PHUs noted a desire for a strong system of knowledge sharing and consultation across the public health system, including through strengthening existing partnerships with community collaborators.
CONCLUSION: These results support the case for ECB strategies that are adaptive and context-sensitive and equip PHUs with the skills required to evaluate complex CDP programming.

Affiliations:
- Andrea LaMarre: Massey University, Albany Campus, Dairy Flat Highway (SH17), Auckland 0632, New Zealand; Propel Centre for Population Health Impact, University of Waterloo, Waterloo, Canada.
- Barbara Riley: Propel Centre for Population Health Impact, University of Waterloo, Waterloo, Canada; University of Waterloo and Renison University College, Waterloo, Canada.
- Ruchika Jain: Propel Centre for Population Health Impact, University of Waterloo, Waterloo, Canada.
- Barbara Zupko: Propel Centre for Population Health Impact, University of Waterloo, Waterloo, Canada.

5. Nylen K, Sridharan S. Experiments in evaluation capacity building: Enhancing brain disorders research impact in Ontario. Eval Program Plann 2020;80:101442. PMID: 28578855. DOI: 10.1016/j.evalprogplan.2017.05.003.
Abstract
This paper is the introductory paper for a forum on evaluation capacity building for enhancing the impact of research on brain disorders. It describes the challenges and opportunities of building evaluation capacity among community-based organizations in Ontario involved in enhancing brain health and supporting people living with a brain disorder. Using the example of a capacity building program called the "Evaluation Support Program", which is run by the Ontario Brain Institute, this forum discusses multiple themes, including evaluation capacity building, evaluation culture, and evaluation methodologies appropriate for evaluating complex community interventions. The goal of the Evaluation Support Program is to help community-based organizations build the capacity to demonstrate the value that they offer in order to improve, sustain, and spread their programs and activities. One feature of this forum is that perspectives on the Evaluation Support Program are provided by multiple stakeholders, including the community-based organizations, evaluation team members involved in capacity building, thought leaders in the fields of evaluation capacity building and evaluation culture, and the funders.

Affiliations:
- Kirk Nylen: Ontario Brain Institute, University of Toronto, Department of Pharmacology, Canada.
- Sanjeev Sridharan: The Evaluation Centre for Complex Health Interventions, St. Michael's Hospital; Institute for Health Policy, Management and Evaluation, University of Toronto, Canada.

6. Nakaima A, Sridharan S. Reflections on experiential learning in evaluation capacity building with a community organization, Dancing With Parkinson's. Eval Program Plann 2020;80:101441. PMID: 28619459. DOI: 10.1016/j.evalprogplan.2017.05.002.
Abstract
This paper discusses what was learned about evaluation capacity building with community organizations that deliver services to individuals with neurological disorders. Evaluation specialists engaged by the Ontario Brain Institute Evaluation Support Program were paired with community organizations, such as Dancing With Parkinson's. Some of the learning included: relationship building is key for this model of capacity building; community organizations often have had negative experiences with evaluation, and the idea that evaluations can be friendly tools for implementing meaningful programs is one key mechanism by which such an initiative can work; community organizations often need evaluation most to be able to demonstrate their value; a strength of this initiative was that the focus was not just on creating products but mostly on developing a learning process in which capacities would remain; evaluation tools and skills that organizations found useful included developing a theory of change and the concept of heterogeneous mechanisms (informed by a realist evaluation lens).

Affiliations:
- April Nakaima: The Evaluation Centre for Complex Health Interventions, St. Michael's Hospital, Canada.
- Sanjeev Sridharan: The Evaluation Centre for Complex Health Interventions, St. Michael's Hospital, Canada; University of Toronto, Canada.

7. Arasanz C, Nylen K. The theory of change of the evaluation support program: Enhancing the role of community organizations in providing an ecology of care for neurological disorders. Eval Program Plann 2020;80:101451. PMID: 28571607. DOI: 10.1016/j.evalprogplan.2017.05.012.
Abstract
This paper discusses the Ontario Brain Institute's theory of change for the Evaluation Support Program, a program designed to enhance the role of community organizations in providing care and services for people living with a brain disorder. This is done by helping community organizations build evaluation capacity and foster the use of evidence to inform their activities and services. Helping organizations to build capacities to track the 'key ingredients' of their successes will help ensure that successes are replicated and services can be improved to maximize the benefit that people receive from them. This paper describes the hypothesized outcomes and early impacts of the Evaluation Support Program, as well as how the program will contribute to the field of evaluation capacity building.

Affiliations:
- Kirk Nylen: Ontario Brain Institute, University of Toronto, Department of Pharmacology, Canada.

8. Sridharan S, Nakaima A. Valuing and embracing complexity: How an understanding of complex interventions needs to shape our evaluation capacities building initiatives. Eval Program Plann 2020;80:101440. PMID: 28559153. DOI: 10.1016/j.evalprogplan.2017.05.001.
Abstract
This paper describes some of the main challenges of evaluating complex interventions, as well as the implications of such challenges for evaluation capacity building. It discusses lessons learned from a case study of an evaluation of Dancing with Parkinson's, an organization that provides dance classes to people with Parkinson's disease in Toronto, Canada. These implications are developed from a realist evaluation lens. Key lessons include the need to develop skills to understand program mechanisms and contexts, recognize multiple models of causality, apply mixed method designs, and ensure the successful scaling up and spread of an intervention.

Affiliations:
- Sanjeev Sridharan: The Evaluation Centre for Complex Health Interventions, St. Michael's Hospital, Canada; Institute for Health Policy, Management and Evaluation, University of Toronto, Canada.
- April Nakaima: The Evaluation Centre for Complex Health Interventions, St. Michael's Hospital, Canada.

9. Gibson R, Robichaud S. Evaluating Dancing With Parkinson's: Reflections from the perspective of a community organization. Eval Program Plann 2020;80:101449. PMID: 28578854. DOI: 10.1016/j.evalprogplan.2017.05.010.
Abstract
In 2015, Dancing With Parkinson's (DWP), a Toronto-based community organization, participated in the Ontario Brain Institute's (OBI) newly launched Evaluation Support Program. This paper reflects on that experience. In particular, we identify the key lessons derived from the OBI initiative, discuss how these lessons have informed DWP practice going forward, and highlight what we consider to be the most valuable aspects of the Evaluation Support Program. While we now recognize the need to establish an evaluation culture within DWP, we find that there are significant challenges associated with both building and sustaining evaluation capacity in the context of a small community-based organization. DWP has built considerable strengths in informal evaluation capacity, but on its own such capacity is insufficient to, for example, demonstrate DWP's impact to outside audiences or successfully scale up the program.

Affiliations:
- Rachael Gibson: The Evaluation Centre for Complex Health Interventions (TECCHI), Dancing With Parkinson's (DWP), Canada.

10. Wade J, Kallemeyn L. Evaluation capacity building (ECB) interventions and the development of sustainable evaluation practice: An exploratory study. Eval Program Plann 2020;79:101777. PMID: 31881418. DOI: 10.1016/j.evalprogplan.2019.101777.
Abstract
Evaluation capacity building (ECB) is a practice that can help organizations conduct and use evaluations; however, there is little research on the sustainable impact of ECB interventions. This study provides an empirical inquiry into how ECB develops sustained evaluation practice. Interviews were conducted with 15 organizational leaders from non-profits, higher education institutions, and foundations that "bought in" to ECB and were at least six months removed from an ECB contract. The results of this work highlight how sustained evaluation practice developed over time and what these practices looked like in real-world settings. A developmental, iterative cycle for how ECB led organizations to sustain evaluation practice emerged around key components of sustainability. First, leadership supported ECB work and resources were dedicated to evaluation. Staff then began to conduct and use evaluation, which led to understanding the benefits of evaluation and promoted value and buy-in to evaluation among staff. Common barriers are described, along with emerging sustainability supports not previously identified in the ECB literature: the "personal" factor and ongoing contact with the ECB practitioner. Practical tips for ECB practitioners to promote sustainability are also detailed.

Affiliations:
- Jay Wade: Planning, Implementation & Evaluation Org (PIE Org), 401 N. Michigan Ave., Chicago, IL 60611, United States.
- Leanne Kallemeyn: School of Education, Loyola University Chicago, 820 N. Michigan, Chicago, IL 60611, United States.

11. Kumar Chaudhary A, Diaz J, Jayaratne KSU, Assan E. Evaluation capacity building in the nonformal education context: Challenges and strategies. Eval Program Plann 2020;79:101768. PMID: 31958716. DOI: 10.1016/j.evalprogplan.2019.101768.
Abstract
Policymakers' demand for increased accountability has compelled organizations to pay more attention to internal evaluation capacity building (ECB). The existing literature about ECB has focused on capacity building experiences and organizational research, with limited attention to the challenges that internal evaluation specialists face in building organizational evaluative capacity. To address this knowledge gap, we conducted a Delphi study with evaluation specialists in the United States' Cooperative Extension Service and developed a consensus on the most pervasive ECB challenges as well as the most useful strategies for overcoming them. Challenges identified in this study include limited time and resources, limited understanding of the value of evaluation, evaluation considered as an afterthought, and limited support and buy-in from administrators. Strategies found in the study include a shift to an organizational culture in which evaluation is appreciated, buy-in and support from administration, clarifying the importance of quality over quantity of evaluations, and a strategic approach to ECB. The challenges identified in this study have persisted for decades, meaning administrators must understand the persistence of these issues and make an earnest investment (financial and human resources) to make noticeable progress. The Delphi approach could be used more often to prioritize ECB efforts.

Affiliations:
- Anil Kumar Chaudhary: The Pennsylvania State University, Department of Agricultural Economics, Sociology, and Education, 209C Ferguson Building, University Park, PA 16802, USA.
- John Diaz: University of Florida, Department of Agricultural Education and Communication, 1200 N Park Road, Plant City, FL 33563, USA.
- K S U Jayaratne: North Carolina State University, Department of Agricultural and Human Sciences, 200 Ricks Hall, Raleigh, NC 27695-7607, USA.
- Elsie Assan: The Pennsylvania State University, Department of Agricultural Economics, Sociology and Education, 012 Ferguson Building, University Park, PA 16802, USA.

12. Schwarzman J, Bauman A, Gabbe BJ, Rissel C, Shilton T, Smith BJ. Understanding the factors that influence health promotion evaluation: The development and validation of the evaluation practice analysis survey. Eval Program Plann 2019;74:76-83. PMID: 30928767. DOI: 10.1016/j.evalprogplan.2019.03.002.
Abstract
The demand for improved quality of health promotion evaluation and greater capacity to undertake evaluation is growing, yet evidence of the challenges and facilitators to evaluation practice within the health promotion field is lacking. A limited number of evaluation capacity measurement instruments have been validated in government or non-government organisations (NGOs); however, there is no instrument designed for health promotion organisations. This study aimed to develop and validate an Evaluation Practice Analysis Survey (EPAS) to examine evaluation practices in health promotion organisations. Qualitative interviews, existing frameworks and instruments informed the survey development. Health promotion practitioners from government agencies and NGOs completed the survey (n = 169). Principal components analysis was used to determine scale structure and Cronbach's α was used to estimate internal reliability. Logistic regression was conducted to assess the predictive validity of selected EPAS scales. The final survey instrument included 25 scales (125 items). The EPAS demonstrated good internal reliability (α > 0.7) for 23 scales. Dedicated resources and time for evaluation, leadership, organisational culture and internal support for evaluation showed promising predictive validity. The EPAS can be used to describe elements of evaluation capacity at the individual, organisational and system levels and to guide initiatives to improve evaluation practice in health promotion organisations.
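
For readers unfamiliar with the reliability criterion reported above (Cronbach's α > 0.7 for 23 of the 25 scales), here is a minimal Python sketch of how α is computed for one scale from a respondent-by-item matrix. The ratings are made up; the EPAS items themselves are not reproduced in this abstract.

    import numpy as np

    def cronbach_alpha(items: np.ndarray) -> float:
        """alpha = k/(k-1) * (1 - sum of item variances / variance of total score)."""
        k = items.shape[1]
        item_vars = items.var(axis=0, ddof=1)
        total_var = items.sum(axis=1).var(ddof=1)
        return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

    # Rows are respondents, columns are the items of a single survey scale.
    scale = np.array([
        [4, 5, 4, 4],
        [2, 3, 3, 2],
        [5, 5, 4, 5],
        [3, 3, 2, 3],
        [4, 4, 5, 4],
    ])
    print(f"Cronbach's alpha = {cronbach_alpha(scale):.2f}")  # > 0.7 is the usual cutoff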

Affiliations:
- Joanna Schwarzman: School of Public Health and Preventive Medicine, Monash University, 553 St Kilda Road, Melbourne, VIC 3004, Australia.
- Adrian Bauman: Prevention Research Collaboration, School of Public Health, The University of Sydney, Sydney, NSW 2006, Australia.
- Belinda J Gabbe: School of Public Health and Preventive Medicine, Monash University, 553 St Kilda Road, Melbourne, VIC 3004, Australia; Health Data Research UK, Swansea University Medical School, Swansea University, Singleton Park, Swansea, SA2 8PP, Wales, UK.
- Chris Rissel: Prevention Research Collaboration, School of Public Health, The University of Sydney, Sydney, NSW 2006, Australia.
- Trevor Shilton: National Heart Foundation of Australia, 334 Rokeby Road, Subiaco, WA 6008, Australia.
- Ben J Smith: School of Public Health and Preventive Medicine, Monash University, 553 St Kilda Road, Melbourne, VIC 3004, Australia; Prevention Research Collaboration, School of Public Health, The University of Sydney, Sydney, NSW 2006, Australia.

13. Lindeman PT, Bettin E, Beach LB, Adames CN, Johnson AK, Kern D, Stonehouse P, Greene GJ, Phillips G. Evaluation capacity building-Results and reflections across two years of a multisite empowerment evaluation in an HIV prevention context. Eval Program Plann 2018;71:83-88. PMID: 30223173. DOI: 10.1016/j.evalprogplan.2018.09.001.
Abstract
As the need for rigorous evidence of program efficacy increases, integrating evaluation activities into program implementation is becoming crucial. As a result, external evaluators are placing increased focus on evaluation capacity building as a practice. However, empirical evidence of how to foster evaluation capacity in different contexts remains limited. This study presents findings from an evaluation capacity survey conducted within a multisite Empowerment Evaluation initiative, in which an external evaluator worked with 20 project teams at diverse community agencies implementing HIV prevention projects. Survey results revealed that representatives from project teams (n = 33) reported significantly higher overall evaluation capacity after engaging with the external evaluator on planning and implementing their evaluation. Improvements differed across organization type, intervention type, staff position, and reported engagement in various activities throughout the course of the evaluation. Results indicated that empowerment evaluation and other stakeholder-focused evaluation approaches are broadly applicable when evaluation capacity building is a desired outcome, particularly when evaluators are able to engage project staff in planning the evaluation and in delivering technical assistance services. Accordingly, efforts should be made by program funders, staff, and evaluators to encourage active engagement starting in the early stages of program and evaluation planning.
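
The capacity gains reported here rest on comparing the same respondents' self-ratings before and after the engagement. A sketch of that kind of pre/post comparison, with invented scores (the study's instrument and exact statistical test are not detailed in this abstract):

    from scipy import stats

    # Invented evaluation capacity self-ratings for the same eight
    # respondents before and after working with the external evaluator.
    pre = [2.8, 3.1, 2.5, 3.0, 2.9, 3.3, 2.7, 3.2]
    post = [3.4, 3.6, 3.1, 3.5, 3.2, 3.8, 3.3, 3.7]

    # Paired t-test: are post-engagement ratings significantly higher?
    t_stat, p_value = stats.ttest_rel(post, pre)
    print(f"t = {t_stat:.2f}, p = {p_value:.4f}")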

Affiliations:
- Peter T Lindeman, Emily Bettin, Lauren B Beach, Christian N Adames, George J Greene, Gregory Phillips: Department of Medical Social Sciences, Feinberg School of Medicine, and Institute for Sexual and Gender Minority Health and Wellbeing, Northwestern University, 625 N. Michigan Ave., 14th Floor, Chicago, IL 60611, USA.
- Amy K Johnson: Center for Gender, Sexuality and HIV Prevention, Division of Adolescent Medicine, Ann & Robert H. Lurie Children's Hospital of Chicago, 225 E. Chicago Ave., Chicago, IL 60611, USA; AIDS Foundation of Chicago, 200 W. Jackson Blvd. #2100, Chicago, IL 60606, USA.
- Dave Kern, Patrick Stonehouse: HIV/STI Bureau, Chicago Department of Public Health, 333 S. State Street, Chicago, IL 60604, USA.

14. Vengrin C, Westfall-Rudd D, Archibald T, Rudd R, Singh K. Factors affecting evaluation culture within a non-formal educational organization. Eval Program Plann 2018;69:75-81. PMID: 29730544. DOI: 10.1016/j.evalprogplan.2018.04.012.
Abstract
While research has been done on many aspects of evaluation within a variety of contexts and organizations, there is a lack of research surrounding the culture of evaluation. This study set out to examine this evaluative culture in one of the world's largest non-formal educational organizations through the use of an online survey and quantitative methodology. A path model was developed to examine the factors affecting evaluation culture. Results show that perceptions regarding evaluation, program area, college major, location, training in evaluation, degree level, and years of experience explained 28% of the variance in evaluation culture. The results also show that the culture of evaluation is greatly affected by leadership. By taking a closer look at the evaluation culture of a large non-formal educational organization, much can be learned about how to better develop and support evaluative work in other similar organizations and programs.

Affiliations:
- Courtney Vengrin: Virginia Polytechnic Institute and State University, 175 West Campus Dr., 240 Litton-Reaves Hall, Blacksburg, VA 24061, United States.
- Donna Westfall-Rudd: Virginia Polytechnic Institute and State University, 175 West Campus Dr., 264 Litton-Reaves Hall, Blacksburg, VA 24061, United States.
- Thomas Archibald: Virginia Polytechnic Institute and State University, 175 West Campus Dr., 284 Litton-Reaves Hall, Blacksburg, VA 24061, United States.
- Rick Rudd: Virginia Polytechnic Institute and State University, 175 West Campus Dr., 214 Litton-Reaves Hall, Blacksburg, VA 24061, United States.
- Kusum Singh: Virginia Polytechnic Institute and State University, 1759 Kraft Dr., VT CRC, Blacksburg, VA 24061, United States.

15. Lawrenz F, Kollmann EK, King JA, Bequette M, Pattison S, Nelson AG, Cohn S, Cardiel CLB, Iacovelli S, Eliou GO, Goss J, Causey L, Sinkey A, Beyer M, Francisco M. Promoting evaluation capacity building in a complex adaptive system. Eval Program Plann 2018;69:53-60. PMID: 29704777. DOI: 10.1016/j.evalprogplan.2018.04.005.
Abstract
This study provides results from an NSF-funded, four-year case study about evaluation capacity building in a complex adaptive system, the Nanoscale Informal Science Education Network (NISE Net). The results of the Complex Adaptive Systems as a Model for Network Evaluations (CASNET) project indicate that complex adaptive system concepts help to explain evaluation capacity building in a network. The NISE Network was found to be a complex learning system that supported evaluation capacity building through feedback loops that provided for information sharing and interaction. Participants in the system had different levels and sources of evaluation knowledge. To be successful at building capacity, the system needed a balance of centralized and decentralized control, coherence, redundancy, and diversity. The embeddedness of individuals within the system also provided support and moved the capacity of the system forward. Finally, success depended on attention being paid to the control of resources. Implications of these findings are discussed.

Affiliations:
- Frances Lawrenz: University of Minnesota, Educational Psychology, 174 EdSciB, 56 East River Rd, Minneapolis, MN 55455, USA.
- Elizabeth Kunz Kollmann, Stephanie Iacovelli, Juli Goss, Marta Beyer: Museum of Science, Boston, Research and Evaluation Department, 1 Science Park, Boston, MA 02114, USA.
- Jean A King: University of Minnesota, 178 Pillsbury Dr SE, 242 Burton Hall, Minneapolis, MN 55455, USA.
- Marjorie Bequette, Amy Grack Nelson, Sarah Cohn, Gayra Ostgaard Eliou, Lauren Causey: Science Museum of Minnesota, Department of Evaluation and Research, 120 W. Kellogg Blvd, Saint Paul, MN 55102, USA.
- Scott Pattison, Christopher L B Cardiel, Anne Sinkey, Melanie Francisco: Oregon Museum of Science and Industry, Research and Evaluation Department, 1945 SE Water Ave, Portland, OR 97214, USA.

16. Gagnon F, Aubry T, Cousins JB, Goh SC, Elliott C. Validation of the evaluation capacity in organizations questionnaire. Eval Program Plann 2018;68:166-175. PMID: 29605761. DOI: 10.1016/j.evalprogplan.2018.01.002.
Abstract
The purpose of this study was to test the construct validity of the Evaluation Capacity in Organizations Questionnaire (ECOQ). Conceptually, the ECOQ examines the role of evaluation in organizational development and, most notably, in organizational learning. In this model, evaluation capacity building (ECB) initiatives are assumed to contribute to the development of a culture of systematic self-assessment and reflection, which, in turn, leads to increased organizational learning. Our sample consisted of internal evaluators within the federal, provincial or municipal government, not-for-profit organizations, private firms, and colleges or universities in Canada. Exploratory factor analysis (EFA) and latent path analysis (LPA) were conducted to better understand the underlying structure of the 'organizational capacity to do and use evaluation' construct as measured by the ECOQ. The results of our study indicate that the ECOQ effectively assesses an organization's capacity to do and use evaluation. Furthermore, evidence from the LPA suggests that an organization's capacity to learn is enhanced by the relationships among the various factors. Implications of using a validated model of an organization's capacity to do and use evaluations in both research and practice are discussed.
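
As a rough illustration of the EFA step (the ECOQ's actual items, sample, and software are not described in this abstract), the sketch below fits a two-factor model to synthetic survey data with scikit-learn. The two assumed latent factors mirror the paper's "capacity to do" and "capacity to use" constructs.

    import numpy as np
    from sklearn.decomposition import FactorAnalysis

    rng = np.random.default_rng(0)
    n = 200
    # Two latent factors drive six observed items, three items each.
    do_factor, use_factor = rng.normal(size=(2, n))
    X = np.column_stack([
        do_factor + rng.normal(scale=0.5, size=n),   # items loading on factor 1
        do_factor + rng.normal(scale=0.5, size=n),
        do_factor + rng.normal(scale=0.5, size=n),
        use_factor + rng.normal(scale=0.5, size=n),  # items loading on factor 2
        use_factor + rng.normal(scale=0.5, size=n),
        use_factor + rng.normal(scale=0.5, size=n),
    ])

    fa = FactorAnalysis(n_components=2, rotation="varimax").fit(X)
    print(np.round(fa.components_.T, 2))  # item loadings on the two factors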

Affiliations:
- France Gagnon, Tim Aubry, J Bradley Cousins, Swee C Goh, Catherine Elliott: University of Ottawa, 75 Laurier Ave E, Ottawa, ON K1N 6N5, Canada.

17. Bourgeois I, Simmons L, Buetti D. Building evaluation capacity in Ontario's public health units: promising practices and strategies. Public Health 2018;159:89-94. PMID: 29599056. DOI: 10.1016/j.puhe.2018.01.031.
Abstract
OBJECTIVES: This article presents the findings of a project focusing on building evaluation capacity in 10 Ontario public health units. The study sought to identify effective strategies that lead to increased evaluation capacity in the participating organizations.
STUDY DESIGN: This study used a qualitative, multiple case research design.
METHODS: An action research methodology was used to design customized evaluation capacity building (ECB) strategies for each participating organization, based on its specific context and needs. This methodological approach also enabled monitoring and assessment of each strategy, based on a common set of reporting templates. A multiple case study was used to analyze the findings from the 10 participating organizations and derive higher level findings.
RESULTS: The main findings of the study show that most of the strategies used to increase evaluation capacity in public health units are promising, especially those focusing on developing the knowledge, skills, and attitudes of health unit staff and managers. Facilitators to ECB strategies were the engagement of all staff members, the support of leadership, and the existence of organizational tools and infrastructure to support evaluation. It is also essential to recognize that ECB takes time and resources to be successful.
CONCLUSIONS: The design and implementation of ECB strategies should be based on organizational needs. These can be assessed using a standardized instrument, as well as interviews and staff surveys. The implementation of a multicomponent approach (i.e. several strategies implemented simultaneously) is also linked to better ECB outcomes in organizations.

18. Janzen R, Ochocka J, Turner L, Cook T, Franklin M, Deichert D. Building a community-based culture of evaluation. Eval Program Plann 2017;65:163-170. PMID: 28889041. DOI: 10.1016/j.evalprogplan.2017.08.014.
Abstract
In this article we argue for a community-based approach as a means of promoting a culture of evaluation. We do this by linking two bodies of knowledge that are seldom intersected within the evaluation capacity building literature: the 70-year theoretical tradition of community-based research and the trans-discipline of program evaluation. We use the three hallmarks of a community-based research approach (community-determined; equitable participation; action and change) as a conceptual lens to reflect on a case example of an evaluation capacity building program led by the Ontario Brain Institute. This program involved two community-based groups (Epilepsy Southwestern Ontario and the South West Alzheimer Society Alliance) who were supported by evaluators from the Centre for Community Based Research to conduct their own internal evaluation. The article provides an overview of a community-based research approach and its link to evaluation. It then describes the featured evaluation capacity building initiative, including reflections by the participating organizations themselves. We end by discussing lessons learned and their implications for future evaluation capacity building. Our main argument is that organizations that strive towards a community-based approach to evaluation are well placed to build and sustain a culture of evaluation.

Affiliations:
- Rich Janzen: Centre for Community Based Research, 190 Westmount Road North, Waterloo, Ontario, N2L 3G5, Canada.
- Joanna Ochocka: Centre for Community Based Research, 190 Westmount Road North, Waterloo, Ontario, N2L 3G5, Canada.
- Leanne Turner: Alzheimer Society of Oxford, 575 Peel St, Woodstock, Ontario, N4S 1K6, Canada.
- Tabitha Cook: Epilepsy Southwestern Ontario, 690 Hale Street, London, Ontario, N5W 1H4, Canada.
- Michelle Franklin: Epilepsy Southwestern Ontario, 690 Hale Street, London, Ontario, N5W 1H4, Canada.
- Debbie Deichert: Alzheimer Society Perth County, 1020 Ontario Street, Unit 5, Stratford, Ontario, N5A 6Z3, Canada.

19. Chen KHJ. Contextual influence on evaluation capacity building in a rapidly changing environment under new governmental policies. Eval Program Plann 2017;65:1-11. PMID: 28601737. DOI: 10.1016/j.evalprogplan.2017.06.001.
Abstract
Evaluation capacity building (ECB) is a context-dependent process. Contextual factors affecting ECB implementation have been explored theoretically and practically, but their influence within a changing environment has seldom been discussed. This study examined essential context-sensitive parameters, particularly those involved in implementing new governmental policies regarding higher education. Taiwan was used as a case study for exploring the effect of contextual change on ECB attributes from the perspectives of training receivers and providers. Surveys and interviews were used for data collection, and importance-performance analysis was applied for data analysis. Four prominent features were identified. First, the ECB attributes perceived as important by receivers were performed adequately, whereas those perceived as less important were performed less well. Second, under the new policies, the training provider designed training covering a wide range of ECB topics, whereas receivers focused on those that could be directly applied in the evaluation process. Third, in a small education system such as Taiwan's, the complexity of peer review is high and ethical issues become important. Fourth, because the evaluation structure changed from single- to dual-track, receivers expect more training for institutional staff, whereas providers insist on hierarchical training. Aligning ECB supply with needs is paramount for adaptation to new policies.
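
Importance-performance analysis, the method named above, can be stated compactly: each ECB attribute's mean importance and mean performance ratings place it in one of four quadrants split at the means of the two axes. A small Python sketch with hypothetical attributes and ratings (the study's actual attribute list is not given in this abstract):

    ratings = {  # attribute: (mean importance, mean performance)
        "peer-review ethics training": (4.6, 4.2),
        "evaluation report writing": (4.4, 3.1),
        "indicator development": (3.2, 4.0),
        "hierarchical training design": (3.0, 2.8),
    }

    imp_mean = sum(i for i, _ in ratings.values()) / len(ratings)
    perf_mean = sum(p for _, p in ratings.values()) / len(ratings)

    for attr, (imp, perf) in ratings.items():
        if imp >= imp_mean and perf >= perf_mean:
            quadrant = "keep up the good work"
        elif imp >= imp_mean:
            quadrant = "concentrate here"  # important but underperformed
        elif perf >= perf_mean:
            quadrant = "possible overkill"
        else:
            quadrant = "low priority"
        print(f"{attr}: {quadrant}")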

Affiliations:
- Karen Hui-Jung Chen: Department of Education, National Taipei University of Education, Taiwan; Higher Education Evaluation and Accreditation Council of Taiwan, Taiwan.

20. Valenti M, Campetti R, Schoenborn N, Quinlan K, Dash K. Building evaluation capacity of local substance abuse prevention programs serving LGBQ populations. Eval Program Plann 2017;63:101-108. PMID: 28456016. DOI: 10.1016/j.evalprogplan.2017.04.003.

Affiliations:
- Kim Dash: Education Development Center, Inc., United States.

21. Archibald T, Sharrock G, Buckley J, Cook N. Assumptions, conjectures, and other miracles: The application of evaluative thinking to theory of change models in community development. Eval Program Plann 2016;59:119-127. PMID: 27324286. DOI: 10.1016/j.evalprogplan.2016.05.015.
Abstract
Unexamined and unjustified assumptions are the Achilles' heel of development programs. In this paper, we describe an evaluation capacity building (ECB) approach designed to help community development practitioners work more effectively with assumptions through the intentional infusion of evaluative thinking (ET) into the program planning, monitoring, and evaluation process. We focus specifically on one component of our ET promotion approach involving the creation and analysis of theory of change (ToC) models. We describe our recent efforts to pilot this ET ECB approach with Catholic Relief Services (CRS) in Ethiopia and Zambia. The use of ToC models, plus the addition of ET, is a way to encourage individual and organizational learning and adaptive management that supports more reflective and responsive programming.

Affiliations:
- Thomas Archibald: Virginia Tech, Agricultural, Leadership, & Community Education (0343), Litton-Reaves Hall, Rm. 284, 175 West Campus Drive, Blacksburg, VA 24061, United States.
- Guy Sharrock: Catholic Relief Services, 228 W. Lexington Street, Baltimore, MD 21201, United States.
- Jane Buckley: JCB Consulting, Evaluativethinkingcapacity.com, 68 Chesapeake Landing, West Henrietta, NY 14586, United States.
- Natalie Cook: Virginia Tech, Agricultural, Leadership, & Community Education (0343), Litton-Reaves Hall, Rm. 284, 175 West Campus Drive, Blacksburg, VA 24061, United States.

22. Rorrer AS. An evaluation capacity building toolkit for principal investigators of undergraduate research experiences: A demonstration of transforming theory into practice. Eval Program Plann 2016;55:103-111. PMID: 26788814. DOI: 10.1016/j.evalprogplan.2015.12.006.
Abstract
This paper describes the approach and process undertaken to develop evaluation capacity among the leaders of a federally funded undergraduate research program. An evaluation toolkit was developed for Computer and Information Sciences and Engineering Research Experiences for Undergraduates (CISE REU) programs to address the ongoing need for evaluation capacity among principal investigators who manage program evaluation. The toolkit was the result of collaboration within the CISE REU community, with the purpose of providing targeted instructional resources and tools for quality program evaluation. The challenge was to balance the desire for standardized assessment with the responsibility to account for individual program contexts. Toolkit contents included instructional materials about evaluation practice, a standardized applicant management tool, and a modulated outcomes measure. Benefits of toolkit deployment included cost-effective, sustainable evaluation tools, a community evaluation forum, and aggregate measurement of key program outcomes for the national program. Lessons learned included the imperative of understanding the evaluation context, engaging stakeholders, and building stakeholder trust. Results from project measures are presented along with a discussion of guidelines for facilitating evaluation capacity building that will serve a variety of contexts.

Affiliations:
- Audrey S Rorrer: University of North Carolina at Charlotte, 9201 University City Blvd., Charlotte, NC 28223, USA.

23. Campbell R, Townsend SM, Shaw J, Karim N, Markowitz J. Can a workbook work? Examining whether a practitioner evaluation toolkit can promote instrumental use. Eval Program Plann 2015;52:107-117. PMID: 25996627. DOI: 10.1016/j.evalprogplan.2015.04.005.
Abstract
In large-scale, multi-site contexts, developing and disseminating practitioner-oriented evaluation toolkits is an increasingly common strategy for building evaluation capacity. Toolkits explain the evaluation process, present evaluation design choices, and offer step-by-step guidance to practitioners. To date, there has been limited research on whether such resources truly foster the successful design, implementation, and use of evaluation findings. In this paper, we describe a multi-site project in which we developed a practitioner evaluation toolkit and then studied the extent to which the toolkit and accompanying technical assistance were effective in promoting successful completion of local-level evaluations and fostering instrumental use of the findings (i.e., whether programs directly used their findings to improve practice; see Patton, 2008). Forensic nurse practitioners from six geographically dispersed service programs completed methodologically rigorous evaluations; furthermore, all six programs used the findings to create programmatic and community-level changes to improve local practice. Implications for evaluation capacity building are discussed.

Affiliations:
- Rebecca Campbell: Department of Psychology, Michigan State University, 127 C Psychology Building, East Lansing, MI 48824-1116, United States.
- Stephanie M Townsend: Townsend Consulting & Evaluation, 8 Locke Drive, Pittsford, NY 14534, United States.
- Nidal Karim: CARE USA, 151 Ellis Street, NE, Atlanta, GA 30303, United States.
- Jenifer Markowitz: Forensic Nurse Consultant, 2308 Mt. Vernon Avenue, Suite 238, Alexandria, VA 22301, United States.

24. Bourgeois I, Whynot J, Thériault É. Application of an organizational evaluation capacity self-assessment instrument to different organizations: similarities and lessons learned. Eval Program Plann 2015;50:47-55. PMID: 25757074. DOI: 10.1016/j.evalprogplan.2015.01.004.
Abstract
Organizational evaluation capacity (EC) has received significant attention in the evaluation research literature in the past decade. Much of the focus has been on defining organizational evaluation capacity, which can be thought of as the competencies and structures required to conduct high-quality evaluation studies (capacity to do), as well as the organization's ability to integrate evaluation findings into its decision-making processes (capacity to use). This paper seeks to contribute to this growing body of knowledge through a multiple case study of EC across three different organizations, one each from the non-profit, provincial government, and federal government sectors; the novelty of this particular study is that each case study is based on the use of a common measurement tool developed by Bourgeois, Toews, Whynot and Lamarche (2013). The cross-case analysis presented in the paper reveals that evaluation capacity tends to be higher, both in terms of capacity to do and capacity to use, in organizations that have developed systematic mechanisms to institute an evaluation culture within their walls. Interestingly, however, we also found that capacity to use does not first require capacity to do, as evidenced in the non-profit organization under study.

Affiliations:
- Isabelle Bourgeois: École nationale d'administration publique, University of Québec, 283 boulevard Alexandre-Taché, Gatineau, QC, Canada J8X 3W7.
- Jane Whynot: Whynot & Associates Evaluation and Research Consulting, 824 Dickens Avenue, Ottawa, ON, Canada.
- Étienne Thériault: École nationale d'administration publique, University of Québec, 283 boulevard Alexandre-Taché, Gatineau, QC, Canada J8X 3W7.

25. Ensminger DC, Kallemeyn LM, Rempert T, Wade J, Polanin M. Case study of an evaluation coaching model: exploring the role of the evaluator. Eval Program Plann 2015;49:124-136. PMID: 25677616. DOI: 10.1016/j.evalprogplan.2015.01.002.
Abstract
This study examined the role of the external evaluator as a coach. More specifically, using an evaluative inquiry framework (Preskill & Torres, 1999a; Preskill & Torres, 1999b), it explored the types of coaching that an evaluator employed to promote individual, team and organizational learning. The study demonstrated that evaluation coaching provided a viable means for an organization with a limited budget to conduct evaluations through the support of a coach. It also demonstrated how the coaching processes supported the development of evaluation capacity within the organization. By examining coaching models outside of the field of evaluation, this study identified two forms of coaching, results coaching and developmental coaching, that promoted evaluation capacity building and have not been previously discussed in the evaluation literature.

Affiliations:
- David C Ensminger, Leanne M Kallemeyn, Tania Rempert, James Wade, Megan Polanin: Loyola University Chicago, School of Education, 820 N Michigan Ave, Chicago, IL 60611, United States.

26. Cousins JB, Goh SC, Elliott C, Aubry T, Gilbert N. Government and voluntary sector differences in organizational capacity to do and use evaluation. Eval Program Plann 2014;44:1-13. PMID: 24462833. DOI: 10.1016/j.evalprogplan.2013.12.001.
Abstract
Research on evaluation capacity is limited, although a recent survey article on integrating evaluation into the organizational culture (Cousins, Goh, Clark, & Lee, 2004) revealed that interest in the topic is increasing. While knowledge about building the capacity to do evaluation has developed considerably, less is understood about building the organizational capacity to use evaluation. This article reports on the results of a pan-Canadian survey of evaluators working in organizations (internal evaluators or organization members with evaluation responsibility) conducted in 2007. Reliability across all constructs was high. Responses from government evaluators (N=160) were compared to responses from evaluators who work in the voluntary sector (N=89). The former were found to self-identify more strongly as 'evaluators' (specialists), whereas the latter tended to identify as 'managers' (non-specialists). As a result, government evaluators had significantly higher self-reported levels of evaluation knowledge (both theory and practice), and they spent more time performing evaluation functions. However, irrespective of role, voluntary sector respondents rated their organizations more favorably than did their government sector counterparts with respect to the antecedents or conditions supporting evaluation capacity, and the capacity to use evaluation. Results are discussed in terms of their implications for evaluation practice and ongoing research.

Affiliations:
- J Bradley Cousins: Faculty of Education, Centre for Research on Educational and Community Services (CRECS), University of Ottawa, Canada.
- Swee C Goh: Telfer School of Management, University of Ottawa, Canada.
- Tim Aubry: School of Psychology, CRECS, University of Ottawa, Canada.