Achieving our potential?
May 8 - 12, 2021
The program is under review.
Evaluation in the context of fragility, conflict and violence (EvalFCV)
Keiko Kuji-Shikatani and Hur Hassnain
Join us as we strategize utilizing evaluation to promote good governance, peace, resilience, effective institutions and inclusive societies in development and humanitarian assistance. The Afghanistan Evaluation Society, the Pakistan Evaluation Association, and CES, supported by EvalPartners and the IDEAS EvalFCV TIG, began our journey in 2018 to increase access to knowledge, expertise, lessons and best practices related to utilizing evaluations in the context of fragility, conflict and violence in achieving SDG16. Admittedly we faced various challenges, but we remained hopeful and focused on facilitating an understanding of the challenges faced by evaluators working in situations that are fluid, complex and volatile, and on sharing learning to inform ways to be effective in such trying contexts. Opportunities to advocate for EvalFCV challenged us to further our thinking and be intentional in continuing our partnerships going forward, so that VOPEs can partner towards the realization of the SDGs.
The presenter will facilitate discussions on whose utilization we, as evaluators, are focusing on, drawing upon and sharing experiences from evaluations in international development. The presentation/discussion will contemplate the question of whose utilization evaluators are looking at: is it the funder (development partner), the implementer, and/or the beneficiaries (communities/organizations)? The presenter will also share notes on whether evaluators can balance utilization at all three levels of stakeholders. The presentation/discussion will cover, from practical experience, what role the evaluator can play, what approaches and/or ingredients are required, and how these vary with different types of evaluations. Do context and needs affect who uses the evaluation? Experiences from Canada will be solicited from participants to compare and contrast.
Increasing the utility of evaluation findings/recommendations: widening the participation of communities to promote transparency and accountability (Liberia as a case study)
Swaliho F. Kamara and Musa K. Sanoe
Evaluation is a tool to promote accountability, improve decision making and continue learning. What significance will evaluations have if the results are not utilized? In this thematic breakfast, we will focus on how to change the paradigm of evaluation from being exclusively a donor requirement to a powerful tool for increasing community participation. In many low- and middle-income countries where the majority of the population is illiterate, evaluation findings will not make sense based on how the results are communicated in the reports. In most cases, only the formal sectors would understand and interpret figures resulting from evaluation, but not the rural communities. This clearly undermines our leadership in ensuring that we remain accountable to our stakeholders, including the beneficiaries and the marginalized populations. We will demonstrate to our professional colleagues how to increase the understanding of and participation in evaluation results by illiterate populations.
Do you like Green Eggs and Ham? Scaling up an evaluative rubric to achieve CES sustainability goals.
Andrealisa Belzer and Matt Jacques
No matter what you like for breakfast, the CES Sustainability Working Group (SWG) would love your input on how to adapt the CES Conference Greening Rubric for use as a tool to manage the footprint of other CES activities. Since 2018, the Canadian Evaluation Society has taken steps to improve the integration of natural systems outcomes into our operational and professional responsibilities, including: establishment of the Sustainability Working Group, updates to core competencies to integrate natural systems outcomes, and adoption of sustainability as an overarching CES strategic principle. The Greening Rubric was first developed for C2019 and was strengthened for use in C2020. Consistent with Principles-Focused Evaluation (Patton, 2018), three operational principles are implemented and monitored through an evaluative rubric. By invitation from the National Board and Chapters, the SWG is now offering webinar orientation and exploring adaptation of the Greening Rubric for broader influence.
Using Evaluation Experiences and Evidence from the South to Advance the Global Gender Agenda through active collaboration
Bintou Nimaga, Rituu Nanda, Fabiola Amariles and Svetlana Negroustoueva
(members of EvalGender & Plus)
The 2030 Agenda for Sustainable Development, the global development compact adopted by UN Member States in 2015, provides a blueprint for shared efforts towards global peace, prosperity, and a better, more sustainable future for all. The moral imperative of ‘No One Left Behind’ impels all development actors to work to end all forms of discrimination. Without a doubt, the most pressing, widespread and intractable form of inequality is gender-based. The pervasiveness of gender injustice is not a result of the absence of effort but of resistance by the globally dominating privileged. This thematic breakfast meeting is for the gender and evaluation tribe gathered at CES 2020 to meet and genuinely plan for global collaboration rooted in Canada with the rest of the world through the sharing of experiences and resources.
Can Mentoring for CES Evaluators Support Competency Development? Evaluation Utilization?
Brenda Stead and Christopher Cameron
‘Mentors’ can develop leadership skills, acquire new insights, and gain a personal sense of satisfaction from knowing that they have helped someone. ‘Mentees’ can grow their competencies, gain valuable advice from a more experienced person, and build their professional networks. The current bilingual CES Mentoring Initiative was launched in 2016 for the benefit of CES members, including affiliates of the American and Australasian Evaluation Societies. This session will engage participants on questions related to the utilization and potential utilization of mentoring (the CES Mentoring Initiative) to support CES members in developing professional competencies and applying them in diverse and changing contexts, and in producing high-quality and useful evaluations.
Return on Investment Analyses of Prevention Initiatives: Achieving credibility and utilization at low cost?
We’ve conducted “return on investment” analyses (ROI and SROI) of prevention initiatives in several areas: chronic illness, violence against women, crime prevention, and supporting victims of violence. We found that achieving credible analyses of prevention initiatives with NGOs and intersectoral teams was challenging, and the resources allocated were typically inadequate. Considerable time was required in many areas: deeply engaging stakeholders, understanding contributions, planning for utilization and mobilization, accurately articulating and adjusting outcomes, refining indicators, improving data collection tools and processes, determining and developing appropriate proxies, and establishing appropriate discount factors. Participants will hear key lessons we’ve learned on improving approaches to achieve credibility and utility. We invite others conducting ROI and SROI to exchange and discuss ideas with potential to optimize analyses and utilization within the resources available.
Let’s utilize all this – What can your local Chapter do for you?
Emily Brennan and Marie-Philippe Lemoine
Are you enjoying engaging with your fellow evaluators during the conference, but not sure where to go from here? CES-NCC President Emily and Secretary Marie-Philippe invite you to this thematic breakfast, where we will discuss what evaluators would like to see from their Chapter after the conference is over. What can we do to build on the relationships and insights from the conference? Let’s brainstorm ways we can continue to engage and share in the immediate future, to drive evaluation utilization through a dynamic community, in the National Capital Region and beyond. We invite all to share their experiences, thoughts on upcoming opportunities, and their wildest dreams and ambitions for full utilization of your Chapter’s potential. The hosts will also pick your brains for some creative suggestions!
Infographics for Evaluation
The pressure is increasing to use more visuals in evaluation, and for good reason – people absorb this information more readily, and therefore the utility of the evaluation is increased. Infographics are a great way to incorporate data visualization into your evaluation – and a lot of fun to make to boot. That said, they do come with challenges, many of which relate to the fact that it is a less precise medium for conveying findings. In this breakfast session, I will pass around a number of (printed) infographics I have done, and discuss the lessons I have learned along the way. My goal is for participants to leave this discussion with a sense of how to most effectively use these visuals, and convinced that they are easier to create than many people think.
The erosion of actual use: Challenges and options for evaluators
Utilization-focused evaluation brings excitement at the start. Primary intended users are able to steer a process that is often controlled by outsiders, especially funders. Enjoying the opportunity to own or collaborate in the design of an evaluation can lead to a deluge of evaluation uses and associated Key Evaluation Questions. The evaluator needs to help the client prioritize what is realistic, always with a focus on actual use. In this session I will share experiences where the level of interest began to wane, as organizational momentum led to an increase in defensiveness. I will share examples of recent evaluations where the erosion was driven by power structures that resisted some of the emerging findings, or where, as the evaluator, I faced a client that did not adhere to the evaluation standards. The session will focus on the challenges and options for evaluators to respond, both to maximize use in practical ways and to remain committed to evaluation standards.
Enhancing Confidence in Evaluation Findings when Assessing Effectiveness with Qualitative Methods
Evaluators will discuss strategies to achieve the utilization of findings on program effectiveness when the findings arise from qualitative methods. In order to give decision-makers the confidence to use the findings, evaluators can discuss how they ensure and communicate that their findings are reliable and based on robust methods. Evaluators can also discuss specific practices they use to enable funding decisions to be made based on findings from qualitative methods. If there are non-believers, this discussion can facilitate dialogue on whether it is desirable for evaluators to assess effectiveness even when administrative data is not available or a large-scale survey is not feasible.
Implementing a sex, gender and diversity lens for evaluation: challenges and resources
Peter Czerny and Robert Tkaczyk
Having introduced a sex, gender and diversity lens for the Public Health Agency of Canada and Health Canada evaluation team over a year ago, we are ready to share our tools and experiences in applying them. The breakfast will facilitate a conversation and information exchange about challenges and solutions for ensuring that sex, gender and diversity are meaningfully taken into account when designing, conducting and reporting on evaluations. This is especially important in terms of meeting the utilization-focused needs of decision-makers while highlighting the voices of those who may not be well represented in data or research, policies or programs on a given issue.
No such thing as free advice: incorporating expert advice to enhance utilization in evaluation
Janice Remai and Michael Goodyer
Expert input is often sought in evaluations to provide technical expertise, peer review or broader contextual insights. For evaluators, who are not subject matter experts, these perspectives can be valuable in strengthening conclusions and recommendations and enhancing the credibility of the evaluation product with program representatives, thereby increasing utilization. However, the feedback of experts, who are not evaluators, must also be gathered and shaped so that it informs the evaluation in a relevant way. This thematic breakfast presentation will explore how the Canadian Institutes of Health Research (CIHR) and Goss Gilroy have sought to incorporate expert input over several evaluations. A placemat of possible options, including their advantages and disadvantages, will be distributed as the basis for sharing practical lessons on how to optimize the incorporation of subject matter experts in evaluations.
Stuck in the Middle: Balancing Social and Entrepreneurial Goals as Independent Consultants
Independent consultants utilize evaluation for two main purposes: contributing to informed decision-making for our clients (and thus assisting them in achieving their mission), and earning an income for ourselves. Although these two aims are usually complementary, there can be tensions between making a difference and making a living, especially when supporting organizations working to address equity issues that can benefit from well-conducted evaluations but who may lack the resources to engage in the process. This thematic breakfast roundtable, informed by conversations that the presenter participated in at AEA 2019, invites anyone interested in this issue to share their reflections, critical questions, and promising practices on how to keep our consulting practices financially healthy while supporting organizations working for social change.
Switching gears: What do you do when the user(s) of your evaluation changes?
This session will explore the all-too-common situation in which the expected user or users of your evaluation change mid-way through the evaluation. Will the new client for your evaluation be interested in the evaluation questions you are exploring? Will they have different expectations for how the evaluation will inform decision-making? What if they don’t buy in to the value of evaluation at all? Join me and your peers to discuss strategies for getting new clients/senior managers on board and minimizing the impact on your evaluation’s schedule, budget and overall usefulness to the organization.
Innovative Approaches to Program Evaluation in Sport and Physical Activity
The facilitators will share their wealth of experience designing and implementing highly engaging program evaluations in a fast-paced youth sport setting. Conversation will touch on strategies for involving coaches and other front line staff, getting participants excited about evaluation, and managing relationships with corporate partners. The facilitators will also discuss the use of technology and digital platforms to increase the effectiveness of program evaluations, and the art of building an organizational culture of evaluation in the world of sport and physical activity. Participants will walk away with new ideas on how to 1) apply innovative strategies and tactics to measure outcomes in sport and physical activity programs, 2) engage participants in evaluation processes in a sport/physical activity setting, and 3) optimize the volume and quality of data collected in a sport/physical activity setting – a context that presents several unique challenges to high-quality evaluation.