Workshops

We are currently reviewing the program.

Saturday, June 13, morning

Examining Performance Measurement Systems

Simon Roy, Natalie Kishchuk
Beginner-intermediate level

Although performance measurement is one of the pillars of contemporary management, its practice often leaves much to be desired: it frequently amounts to heaps of indicators of varying validity and usefulness, analyzed in silos and used more to satisfy accountability requirements than to genuinely manage program performance. This workshop aims to help participants develop a critical eye for every component of performance measurement systems: logic models and program profiles, indicators, calculation and interpretation operations, and performance reports. The content will focus primarily, but not exclusively, on the federal context; it will therefore also be applicable to the monitoring systems used by non-governmental organizations for accountability purposes.

Saturday, June 13, afternoon

The ins and outs of utilization-focused developmental evaluation

Ricardo Ramirez, Dal Brodhead
Intermediate level

The goal of the workshop is to illustrate the 'practical wisdom' needed to design and manage utilization-focused developmental evaluation (UFDE). Participants will enhance or gain competencies associated mainly with situational practice, as well as associated management and interpersonal practice. We will cover the main steps of UFE and the variations needed to embed developmental elements within them. We will review the mentoring strategies we use in coaching primary intended users through the process, and the types of projects and evaluation uses for which UFDE turned out to be the best approach. We will offer several case studies and a couple of published articles illustrating recent examples. Participants will be encouraged to engage actively in responding to real cases through facilitated group discussions. We will also include the option of reviewing current or upcoming evaluation assignments that participants are preparing for and assisting them in that preparation.

Saturday, June 13, full day

Engaging Audiences Through Slide Design and Interaction: Your New Presentation Superpower!

Sheila Robinson
Beginner level

Presentations are about audience learning. The goal is for participants to walk away with new understandings and abilities they can use in their own professional practice. Successful presenters work in service to this goal, whether presenting an evaluation report, giving a keynote, facilitating a workshop, or even running a meeting with stakeholders. In this immersive session, participants will learn and practice audience engagement and effective slide design. They will experience more than a dozen ways to engage audiences and even more slide design techniques to enhance visual communication (bring a laptop for hands-on practice) – all to enable audiences to put new knowledge into practice in their own contexts. These skills are easily learned and guided by simple principles, but they have powerful potential to strengthen key connections between presenter and audience. The guiding principle here is Seth Godin's: "Every presentation worth doing has just one purpose: to make change happen."

 

Indigenous Evaluation: Foundations, Relationships, Application, and Situational Responsiveness

Carolee Dodge Francis, Nicole Bowman
Beginner level

Indigenous evaluation (IE) is a member of the culturally responsive evaluation family and is thus framed within an Indigenous intersectionality as it relates to western evaluation. This practical and interactive workshop will facilitate learning and understanding of the theories and methodologies associated with Indigenous evaluation for applied practice. Workshop participants will be shown strategies for building authentic relationships, professional practice matrices, and examples of how to co-design effective evaluation studies of systems, programs, or collaborative projects. Specifically, workshop participants will be instructed on how to conduct evaluation with (not on) Tribal nations and Indigenous communities. The session is based upon the four directions contained within the Lunaape Medicine Wheel. This pedagogical style aligns and integrates theoretical, methodological, and traditional knowledge, along with cultural and community protocols, for applied Indigenous evaluation.

 

Introduction to Statistics for Better Decision-Making Discussions

Carolyn Hoessler
Beginner level

With better statistical conversations and designs comes better-informed decision-making. Join us for a half-day practical professional workshop to brush up on what statistics is really about. This workshop offers a basic introduction to what statistics can and cannot be used for, and to how to raise, discuss, and address concerns about the quality of statistical designs, using a four-part alignment framework that will help you describe statistics better and feel empowered to discuss them. Evaluators and partnering organizations often need to read and interpret reports, critique designs, and engage in decision-making discussions where statistics are held up, accurately or not, as decisive evidence. This workshop will help you engage more fully in these discussions and take an informed approach to assessing the quality of a statistical design, for greater utilization of meaningful results.

 

Consulting Boot Camp: What You Need to Know to Get Started

Gail Vallance Barrington
Intermediate level

Are you dreaming about becoming an independent consultant? For many, this prospect is both exciting and daunting. You ask: How do I get started? What kind of business skills do I need? Will there be enough work? Can I afford to do this? These and other unanswered questions may be holding you back. Join Gail Barrington, an independent consultant with more than 30 years of experience, for this full-day workshop. Explore the most important start-up issues and reflect on your own consulting skills. Determine if consulting is an appropriate career choice for you and come away with an understanding of the simple but important skills you need to be successful. Key topics include personal characteristics; marketing; business planning; setting up shop; managing fees, time, and getting paid; and maintaining work-life balance. Valuable examples, worksheets, and stories will be shared. Through lecture, small group discussion, and personal reflection, participants will plan a start-up strategy.

 

Economic Foundations of Theories of Change

Gregory Mason
Intermediate level

Economics can enrich and inform the theory of change and create more informative results chains. The economics covered in this session reaches back to Adam Smith and the classic Wealth of Nations and moves forward to the most recent developments in behavioural economics and game theory. Using ideas such as public goods, sunk costs, market failure, moral hazard, externalities, asymmetric information, and prospect theory, participants will gain insight into the economic foundations of the interventions governments use to guide the invisible hand. The workshop combines lectures with Socratic-style seminar interactions. Two examples anchor the main ideas: 1) a vaccination program, starting with the 1968 Axnick study on measles and continuing to current HPV vaccination programs; and 2) implementing a basic income, using the Ontario Basic Income Pilot as the example. Participants need no prior knowledge of economics.

 

A Hands-On Introduction to Arts-Based Data Collection Tools for Evaluators

Jennica Nichols, Maya Lefkowich
Intermediate level

Evaluators need diverse tools to engage with stakeholders, measure impacts, and maximize utilization. Arts-based methods are accessible, engaging, and flexible, and they create new opportunities for evaluators in data collection and use. This workshop provides a window into how arts-based data collection can be used in evaluation. It will teach foundational concepts and foster understanding of the unique aspects of arts-based data collection, including key ethical and practical considerations. Participants will then take part in three hands-on exercises in small groups to gain practical experience with three arts-based tools (body mapping, drawing, photo elicitation). We will discuss analysis and how to assess credibility in arts-based work. Lastly, we will share strategies for overcoming potential barriers to integrating arts-based data collection into your evaluation practice. The workshop will include short lectures, individual and small group activities, and group discussion.

 

Gender-Sensitive and Feminist MEAL: Best Practices, Challenges and Tips for Making Evaluation Meaningful and Using Data to Drive Decision-making

Deborah Simpson, Rotbah Nitia, Robert Vandenberg, Elaine Stavnitzky, Andres Goulsborough
Intermediate level

With increased demand to incorporate GBA+ into our evaluative work and a renewed mandate for the Feminist International Assistance Policy, development and non-development actors alike are challenged to ensure that evaluations provide intersectional analysis incorporating a feminist lens. This workshop will introduce feminist evaluation and consider how it moves gender-sensitive evaluation (which allows the application of GBA+ analysis) toward being gender-transformative, i.e., changing, through the process of evaluation itself, the power relations that determine who benefits from the purpose, collection, and use of data. The workshop introduces concepts and realities behind the gender-sensitive and feminist evaluative practices of Oxfam, MEDA, ADRA Canada & Salanga, and WUSC & Plan International; explores real-life examples; and catalyzes collaborative discussion so that participants can dive into their own organizational practices and apply best practices, with a focus on improving utilization of findings for all stakeholders.

 

Introduction to Outcome Harvesting

Jérôme Leblanc
Intermediate level

This training aims to give participants the basic elements needed to implement an outcome harvesting process. Specifically, the training will: present the theory and thinking behind the method; present the principles that guide it; present its generic steps; and provide practical exercises to improve participants' ability to identify outcomes and write effective outcome statements. Particular emphasis will be placed on the different ways of facilitating the use of outcome harvesting results, from traditional presentations to "data parties" and digital facilitation tools. Since this is a participatory evaluation method, emphasis will also be placed on how participation throughout the evaluation, "the right people at the right time," fosters ownership of the results and, consequently, their use.

 

Sustainability-Ready Evaluation: Moving from Theory to Practice

Andy Rowe, Francois Dumaine, Debbie DeLancey
Advanced level

The world is facing unprecedented environmental threats, yet evaluation is not achieving its potential: we focus on human systems, paying little attention to natural systems or to the impact of human activities. A recent stock-taking indicates that evaluation is unprepared to incorporate natural systems into our work. This workshop will provide a brief overview of the current state of sustainability-ready evaluation and engage participants in adapting existing evaluations into nexus evaluations by incorporating natural system effects and contributions into the evaluation approach. Small group and plenary sessions will identify challenges, resources, and opportunities, including client and political challenges, identifying appropriate methods and approaches, and working with multiple types of knowledge and worldviews. The workshop will set the stage for a community of practice to generate guidance documents and publications to continue developing best practices and approaches.

 

Sunday, June 14, morning

Analyzing Qualitative Data Using Qualitative and Quantitative Analysis Techniques

Simon Roy, Leah Simpkins
Intermediate level

This intermediate-level workshop presents qualitative and quantitative techniques for analyzing textual data such as interview findings. By the end of the workshop, participants will be able to conduct and present these analyses in a clear and systematic fashion. The workshop makes a clear distinction between the two techniques, beginning with a review of qualitative analysis techniques using Excel, followed by quantitative analysis techniques using computer-assisted qualitative analysis software. The presenters will demonstrate, step by step, how to conduct qualitative analyses using a live case, and will discuss the pros and cons of quantifying the evidence using percentages and proportions. In the second hour, participants will learn the main steps involved in the quantitative analysis of qualitative findings, including identifying key codes and word clusters. The strengths and limitations of such analyses will also be discussed.
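
Purely as an editorial illustration of the kind of quantification discussed above (not part of the workshop materials), a minimal Python sketch of turning coded interview excerpts into counts and proportions might look like the following; the respondents and codes are hypothetical.

# Illustrative sketch: quantifying coded qualitative data (hypothetical codes and respondents).
from collections import Counter

# Each interview excerpt has been tagged with one or more analyst-assigned codes.
coded_excerpts = [
    {"respondent": "R01", "codes": ["access_barriers", "staff_support"]},
    {"respondent": "R02", "codes": ["access_barriers"]},
    {"respondent": "R03", "codes": ["staff_support", "program_reach"]},
]

# Count how many respondents mention each code, then report proportions.
code_counts = Counter(code for item in coded_excerpts for code in set(item["codes"]))
n_respondents = len({item["respondent"] for item in coded_excerpts})

for code, count in code_counts.most_common():
    print(f"{code}: {count}/{n_respondents} respondents ({count / n_respondents:.0%})")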

 

Policy Evaluation Using Simulation

Gregory Mason
Intermediate level

Program evaluation is typically ex-post, or after the fact. This can date advice on program design and implementation, leading to underutilization. Ex-ante evaluation methods, notably cost-benefit analysis, experiments, and simulations, offer evaluators an opportunity to "get out in front" of programs and policies. This workshop explores policy simulations using Excel. Participants will review the main ideas of policy simulation through two case studies: vaccination programs and modelling the impact of electricity rate increases on household incomes. Starting from first principles, the workshop will build each case study into a functioning policy evaluation study. Key assumptions anchor the development of policy models, and the workshop will demonstrate techniques to validate model results. Participants will download the models in advance of the workshop and will have an opportunity to run the simulations and alter their underlying assumptions.
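
The electricity-rate case above is described for Excel; purely as a hedged illustration of the same modelling idea (not the workshop's actual model), a minimal Python sketch follows, with household data, rates, and the demand elasticity as hypothetical placeholders.

# Illustrative policy-simulation sketch: effect of an electricity rate increase on
# household budgets. All figures are hypothetical placeholders, not workshop data.
households = [
    {"id": 1, "income": 30_000, "annual_kwh": 7_000},
    {"id": 2, "income": 60_000, "annual_kwh": 9_000},
    {"id": 3, "income": 120_000, "annual_kwh": 12_000},
]

current_rate = 0.12      # dollars per kWh (assumed)
rate_increase = 0.20     # scenario: a 20% rate increase
price_elasticity = -0.1  # assumed short-run demand response to price

for hh in households:
    new_rate = current_rate * (1 + rate_increase)
    # Consumption adjusts slightly downward in response to the higher price.
    new_kwh = hh["annual_kwh"] * (1 + price_elasticity * rate_increase)
    old_bill = current_rate * hh["annual_kwh"]
    new_bill = new_rate * new_kwh
    burden_change = (new_bill - old_bill) / hh["income"]
    print(f"Household {hh['id']}: annual bill {old_bill:.0f} -> {new_bill:.0f} "
          f"({burden_change:.2%} of income)")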

 

Reflective Practice and Innovation: Making Creativity Part of Your Life

Gail Vallance Barrington
Intermediate level

Reflective Practice bridges theory and practice. It helps us interpret the complexity, uncertainty, unpredictability, and disruption we encounter every day. If we think creatively, we can allow ourselves to experiment, innovate, and refocus, expanding our skills and adding value to our work. This workshop will explore how we can incorporate reflection more fully into our working lives. What barriers and issues stand in the way of innovation? What questions should we be asking? What reflective strategies can we use, and if we use them, what are the implications? Five reflective strategies will be described, and participants will have an opportunity to experiment with several of them and discuss their experiences. We will reflect on the links between reflection, innovation, and action and will leave with a personal strategy to incorporate creativity into our daily lives.

 

Selecting Among Evaluator Stances: A Reflective Practice Workshop

Natalie Kishchuk
Intermediate level

Stance refers to the evaluator's general attitudinal orientation vis-à-vis stakeholders and the evaluand. Many stances are possible; they vary in their legitimacy and appropriateness in different contexts. To enhance stakeholder relations and responsiveness, this workshop aims to increase participants' stance self-awareness. Participants will be invited to reflect on stances they unconsciously or deliberately adopt, reject, and/or have imposed upon them. Six stances identified in the literature (Disinterested Scientist; Passionate Participant; Transformative Intellectual; Critical Friend; Servant-Leader; Recording Secretary) will be reviewed. In a "sorting hat" exercise, participants will identify the stances they most and least often adopt or have imposed on them, and those they find most and least comfortable. Discussion will focus on how and why these stances have and have not been helpful. Participants will then role-play unfamiliar stances for two key practice moments: choosing methods and interpreting findings.

 

Integrating Gender-Based Analysis Plus (GBA+) into the Evaluation Process: The Essential Progression from Theory to Practice

François Dumaine
Beginner-intermediate level

All federal evaluations are expected to consider Gender-Based Analysis Plus (GBA+). Outside the federal context, many organizations are also interested in questions of equity, diversity, and inclusion. This workshop has two objectives. First, to allow participants to explore the interaction between GBA+ and program evaluation: in many respects these are two analytical processes, and we now need to understand where they can meet and complement each other. Second, participants will have the opportunity to explore practical strategies for integrating GBA+ into their evaluation studies. While integrating GBA+ raises significant challenges, it also offers valuable avenues for program evaluation to fulfill its fundamental role of documenting and understanding every dimension of a program's impact.

 

Use of Evaluation Theory in Practice

Kaireen Chaytor
Intermediate level

Shadish's AEA presidential address was entitled "Theory is Who We Are". That was 1992. The question in 2020 may be: without theory, who are we? This workshop will explore Evaluation Theory – a separate entity from program theory – and its potential use in evaluation practice. We will look at the significance of the absence of evaluation theory in teaching and practice. Concepts from the literature of Shadish, Chelimsky, and Alkin will be used to show how evaluation theory interacts with practice. Chelimsky (2012) argues that a theory-practice combination will produce both the strongest possible evaluative information and better use of that information in government or elsewhere. Evaluation theory must be used at the design phase, answering questions such as what kind of evaluation may be feasible, what types of evaluation questions are required, and what methods are then appropriate. Exercises will confront the challenge of bringing abstract theory to the real world of practice problems.

 

Sunday, June 14, afternoon

Partnership-Based Evaluation of Partnerships in Projects and Programs

Jean-Marie Loncle
Beginner level

Partnerships are playing an ever-larger role in the design and implementation of projects and programs, given the complex and uncertain nature of these initiatives and of the environments in which they are developed. They require bringing together many stakeholders. This raises questions of "working together," of culture, of sharing skills and resources, and of pooling means so as to improve the effectiveness of the actions undertaken. Partnership is a principle of action in its own right, no longer merely a means in service of action. Partnerships are an indispensable condition that can themselves be evaluated in order to determine their added value and their future prospects. In this workshop, we will therefore look at how to treat partnership as an object of evaluation.

 

Exploring How Evaluation Planning, Processes, and Communication Influence Utilization

Erica McDiarmid
Beginner level

In practice, the level of evaluation utilization is often determined before the evaluation is completed due to improper planning or ineffective processes. Therefore, a purposeful evaluation planning process is essential in achieving a product that will result in appropriate utilization. This interactive workshop will present the key considerations during the evaluation process that will meaningfully engage key stakeholders, increase evaluation buy-in, and optimize utilization. The workshop will dive into the role of communication with key stakeholders and highlight best practices in communicating evaluation results to promote utilization. Participants will be provided with a template that they can work through as each essential planning component is discussed, creating individual plans for evaluation utilization. Through this workshop, participants will be able to identify key evaluation planning process considerations that could be applied to optimize evaluation utilization.

 

Knowledge Translation and Dissemination: The End of the Research and Evaluation Cycle – or is it? Evaluation of Knowledge Uptake and Utilization

Kelly Skinner, Jasmin Bhawra, Steve Montague
Beginner level

Knowledge transfer and exchange (KTE) has become an integral part of organizational practice, whereby knowledge generated through research and other activities is synthesized and disseminated to relevant stakeholders in the form of reports, workshops, and other knowledge products. However, knowledge products and processes are seldom evaluated. Given the vast amount of resources, money, and effort that organizations and researchers spend on knowledge translation and dissemination, an obvious question is: what is the uptake and utilization of your work? To meet this need, Skinner developed the 44-item Knowledge Uptake and Utilization Tool (KUUT) in 2007. Over the past decade, the KUUT has been used by numerous governmental and non-profit organizations to assess knowledge uptake and utilization of a range of knowledge products (e.g., reports, websites) and KTE processes (e.g., workshops, training events). This workshop will explore utilization measurement using the KUUT.

 

Achieving and Exceeding Utility Standards for Program Evaluation

Sandra Sellick
Intermediate level

The utility standards for program evaluation, originally developed by the Joint Committee on Standards for Educational Evaluation (JCSEE) and adopted by the Canadian Evaluation Society in 2012, are intended to increase the extent to which program stakeholders find evaluation processes valuable in meeting their needs. Do your practices as an evaluator achieve or exceed these standards? This workshop is designed to take participants on a deep dive into the standards, with multiple opportunities for active learning through a variety of modes. It will be a place where work and fun meet to create a climate for learning. Participants can expect to be engaged in individual reflective activities, case study analysis, group discussions, role play (if they feel comfortable doing so within the workshop setting), and more. Participants who have a copy of the guide to the 3rd edition of the JCSEE program evaluation standards are encouraged to bring it to personalize at the workshop.

 

Designing Mixed Methods Evaluations

Cheryl Poth
Intermediate level

Have you heard about mixed methods evaluations but aren't sure how to design one? If so, this workshop, led by an evaluation practitioner with more than a decade of experience teaching program evaluation and mixed methods research, is for you! Taking a practical and learner-responsive approach, it will guide you through the fundamentals of designing mixed methods evaluations. The workshop is organized around three key questions, which can also be considered the learning objectives: (a) What distinguishes mixed methods evaluations from other types of evaluation designs? (b) How do evaluators position the suitability, value, and feasibility of mixed methods evaluations? (c) How can evaluators avoid common pitfalls seen in mixed methods evaluations?

 

Realist Interviewing

Emma Williams
Intermediate level

Realist evaluation aids utilization by explaining why and how policies/programmes work (or not) in different circumstances and with different groups. A key technique is the realist 'teacher-learner interview', which is designed to test and refine programme theory. Workshop participants will learn how and why realist interviews differ in important ways from other interview methods, and by the end of the session will know how to: select the right interviewees; design a realist interview; elicit stories in the interview that can explain the different ways the programme works; explore key elements of programme theory with interviewees; and identify and address common challenges in realist interviewing. Some theory will be presented, but much of the session will be spent practising and receiving feedback on realist interviewing techniques. Structured exercises are based on real-life realist evaluations, but participants are also encouraged to bring questions from their own projects.

 

Sunday, June 14, full day

Evaluative Thinking: Principles and Practices to Enhance Evaluation Capacity, Quality, and Utilization

Thomas Archibald, Jane Buckley
Beginner level

How does one "think like an evaluator"? How can program implementers learn to think like evaluators? Recent years have witnessed increased use of the term "evaluative thinking," yet this particular way of thinking, reflecting, and reasoning is not always well understood. Patton warns that as attention to evaluative thinking has increased, we face the danger that the term "will become vacuous through sheer repetition and lip service" (2010, p. 162). This workshop can help avoid that pitfall. Drawing from our research and practice in evaluation capacity building, we use discussion and hands-on activities to address: (1) what evaluative thinking (ET) is and how it pertains to your context; (2) how to promote and strengthen ET among the individuals and organizations with whom you work; and (3) how to use ET to identify assumptions, articulate program theory, and conduct evaluation with an emphasis on learning and adaptive management.

 

Designing for Impact: How to Apply Design Thinking for Better Programs and Evaluations

Cameron Norman, John Gargani
Intermediate level

This workshop is based on four years of experience introducing evaluators to design in the Design Loft, part of the annual conference of the American Evaluation Association. In this workshop, you will: learn how design is both a way of thinking and a strategy for collaboration; apply practical tools and techniques to the design of programs and evaluations; and discover how to harness feedback, teamwork, and iteration to create better programs and evaluations. We believe in hands-on learning in a studio format, so from the start you and other participants will be challenged to work on design problems that evaluators commonly face, such as describing programs in ways that make them more reliable and evaluable, basing programs and evaluations on a deep understanding of the people who are affected, integrating evaluative thinking into program design, and integrating design thinking into evaluation.

 

Evaluating Coalitions and Community Collaboratives

Susan Wolfe, Ann Price
Intermediate level

This workshop provides practical steps for conducting community-based participatory evaluation, using data, and facilitating evaluation use with coalitions and collaboratives. We stress the importance of employing a utilization-focused approach. The goal is to provide attendees with frameworks, information, techniques, and tools that will guide and support their work. Content includes an introduction to collaboratives; frameworks, models, and principles; the basics of evaluating collaboratives (approaches, methods, levels of evaluation, stages of development, measures, and tools); overcoming collaborative evaluation challenges; using evaluation results with collaborative members; best practices to ensure success; and resources. The workshop will also offer cultural humility guidance, guidelines for incorporating measures of equity and justice into evaluations, and guidance for developing genuine partnerships with communities throughout the evaluation process.

 

Learning as You Go in Becoming Part of the Solution as a Blue Marble Evaluator (BME) in the Public Sector

Keiko Kuji-Shikatani
Intermediate level

This workshop is for internal and external evaluators who work directly or indirectly for or with social innovators in the public or not-for-profit sector, and who find themselves dealing with problems, trying out strategies, and striving to achieve goals that emerge from their engagement with the change process. BME is a global initiative focused on training the next generation of evaluators, urging us to know and face the realities of the Anthropocene and act accordingly. Various levels of government in countries such as Canada are by far the largest stakeholders in transformative engagements, where public servants, guided by principles of democracy, respect for people, integrity, stewardship, and excellence, are entrusted to serve the people. Hands-on learning will show how BME can provide a framework for developing, adapting, and evaluating major system-change initiatives and transformations involving complex networks, rooted in utilization-focused, developmental, and principles-focused evaluation.
