2012 MESI Spring Training
Evaluation in a Complex World: Changing Expectations, Changing Realities
Training dates: March 28-30, 2012
Pre-Conference Workshops: March 26-27, 2012
Post-Conference Workshops: March 30-31, 2012
University of Minnesota, St. Paul, MN
Featured speakers include:
- Jan Malcolm, Chief Executive Officer of Courage Center, former Minnesota Commissioner of Health
- Foundation Leadership Panel, featuring leaders from the Twin Cities’ and Minnesota’s top foundations
- Michael Quinn Patton, Founder & Director, Utilization-Focused Evaluation and Independent Evaluation Consultant, author of Utilization-Focused Evaluation and Developmental Evaluation: Applying Complexity Concepts to Enhance Innovation and Use
- Richard Krueger, Professor Emeritus and Senior Fellow in Evaluation Studies, University of Minnesota
- Jean A. King, Professor and Director, Minnesota Evaluation Studies Institute, University of Minnesota
- Vanessa McKendall-Stephens, President, Face Valu Evaluation Consulting & Associates, Minneapolis, MN
- And others from the Twin Cities community and across the University of Minnesota
Conference Sessions and Handouts
List of conference sessions with links to handouts made available from 2012 presenters.
Jean A. King, Director, Minnesota Evaluation Studies Institute and Professor, Organizational Leadership Policy and Development, U of MN
This introductory session will orient new or novice evaluators, as well as those responsible for evaluation within their organizations, to processes for evaluative thinking, evaluation planning and implementation, and ways to report an evaluation across a variety of contexts. Topics will include evaluation purposes, framing questions, data collection and analysis, evaluation standards and principles, and the many constraints associated with conducting evaluations in complex settings.
Scott Chazdon, Evaluation and Research Coordinator, U of MN Extension; Judy Temple, Associate Professor, Humphrey School of Public Affairs and Department of Applied Economics, U of MN; Stuart Yeh, Associate Professor, Organizational Leadership Policy and Development, College of Education and Human Development, U of MN
Understanding the explicit outcomes of social programs has become a topic of increasing importance as governments and foundations continue to spend billions of dollars to address persistent societal problems. Impact evaluation seeks to identify the changes that can be attributed to specific interventions (projects, programs, or policies) so that funders can use future resources wisely. The three speakers in this session will provide their personal perspectives on how to measure the impact of social programs, leaving time for questions and audience discussion about the practice of impact evaluation.
Cost-Effectiveness Analysis [Google Doc]
Ripple Effect Mapping [Google Doc]
International Development Evaluation
Elizabeth Hutchinson and Edna Ogwangi, Land O’ Lakes Foundation; Joan DeJaeghere, Assistant Professor, Organizational Leadership Policy and Development, College of Education and Human Development, U of MN; Shirley J. Miske, Witt & Associates Inc., St. Paul, MN
International development focuses on creating an improved quality of life for human beings, wherever they live, and encompasses a variety of practices in a diverse array of fields. The evaluation of international development programs is necessarily complex and hugely challenging. This session will highlight the practice of three international development evaluators who have worked all over the world.
International Monitoring and Evaluation [Google Doc]
Connecting Evaluation Theory and Practice
Vanessa McKendall-Stephens, President, Face Valu Evaluation Consulting & Associates Minneapolis, MN; Laura J. Pejsa, Ph.D., Associate Director, Minnesota Evaluation Studies Institute, U of MN
Logic models, theory of change, theory of action, program theory… these are terms that funders and evaluators often use when working with programs. Unfortunately, the introduction to and experience with logic models and program theory can become a confusing and/or laborious process for program staff. In this session, we will break down these concepts and explore how to make meaningful connections between theory and the everyday, real-world practice of programs. We will also discuss how thoughtfully designed logic models and program theories can aid in planning quality evaluations.
Accessible and Inspiring: Presenting Evaluation Results in Ways that are Useful to Clients
Laura Bloomberg, Executive Director of the Center for Integrative Leadership and Graduate Faculty Member, Humphrey School of Public Affairs, U of MN
Here’s an unfortunate reality: “Written evaluation reports are nearly as varied as those who write them, but the great majority share a common characteristic: They make tedious and tiresome reading” (Fitzpatrick, Sanders, & Worthen, 2011). This workshop will explore skills, strategies, and insights for communicating evaluation findings in ways that are useful, relevant, and even inspiring to stakeholders.
Meaningful and Accessible [Google Doc]
Evaluating Programs with Vulnerable Populations
Vidhya Shanker, College of Education and Human Development, U of MN; Jarrett Gupton, Assistant Professor, Organizational Leadership, Policy and Development, College of Education and Human Development; Timothy B. Zuel, Hennepin County Human Services and Public Health
How can an evaluator conduct effective studies for programs that serve vulnerable populations, knowing the multiple challenges such evaluations face? The presenters in this session will discuss their extensive experience engaging with different groups of individuals placed at risk: homeless youth (Jarrett Gupton), immigrant and refugee women (Vidhya Shanker), and youth served by a county social service system (Timothy Zuel).
Culture and Evaluation
Vanessa McKendall-Stephens, President, Face Valu Evaluation Consulting & Associates, Minneapolis, MN
“Cultural competency” has increasingly become a buzzword in many fields of practice, including evaluation. But what does it really mean to be “culturally competent”? In this introductory session, we will discuss the meaning of culture in our personal and professional lives and how we attend to culture in evaluation practice. The session is facilitated by a seasoned independent evaluator who specializes in multicultural evaluation and capacity building in community contexts. *Note: This session is recommended in conjunction with “Adapting Evaluation for Specific Cultures: A Hmong Case Study.”
Using GIS in Evaluation
Francis Harvey, Professor, Department of Geography, College of Liberal Arts, U of MN; Mark Herzfeld, Senior Program Evaluator, Ramsey County Research & Evaluation; Luther Krueger, Crime Prevention, Minneapolis Police Department
This session offers evaluators an opportunity to explore the potential of GIS as a key tool in evaluation processes.
MESI Cafe: Facilitated Table Discussions
Join others in talking about issues in evaluation. Table discussion topics include: independent evaluation consulting; GLBT issues in evaluation; academic program advice; evaluator competencies; MESI “time out” (an open table for networking and discussions of your choice); and more TBA.
Adapting Evaluation for Specific Cultures: A Hmong Case Study
Kalue Her and Anna Martin, Neighborhood House, St. Paul, MN; Mao Thao, Educational Psychology, College of Education and Human Development, U of MN
The AEA Cultural Competency Statement reminds evaluators of the importance of the many types and influences of culture present in every setting—race and ethnicity, power dynamics, geographic locale, relationships, organizational size, and so on. Each culture has unique characteristics that demand that evaluators pay attention and respond accordingly. This session will focus on Hmong culture as an example case, illustrating how an effective evaluation is influenced, changed, and nuanced as a result of culture. The session will have three parts: (1) a chance to learn specific features of Hmong culture, (2) a brief presentation by evaluators who have worked in Hmong settings, and (3) an interactive discussion of the cultural implications for evaluators working on culturally specific studies. *Note: This session is recommended in conjunction with “Culture and Evaluation.”
Program Evaluation with the Hmong Community: Reflections of a Hmong Evaluator [Google Doc]
Internet Focus Group Interviewing
Richard Krueger, Professor Emeritus and Senior Fellow in Evaluation Studies, Organizational Leadership Policy and Development, College of Education and Human Development, U of MN and Graduate Students
Focus groups can be conducted in a variety of ways. A recent trend is to conduct these groups over the Internet using either a bulletin board format or a real-time audio or video conversation. The Internet allows the researcher to connect with participants who otherwise would be unable to come together in person. Professor Richard Krueger and a team of graduate students have been examining and evaluating low-cost and practical ways of conducting focus groups on the Internet. They will share what they have learned about various platforms and demonstrate how these systems operate.
Participatory Approaches to Community-based Public Health Evaluation
Cathy Jordan, Director, University of MN Extension Children, Youth and Family Consortium; Anne Belcher, Environments for Health, School of Nursing, Indiana University
Community-based public health utilizes equitable partnerships between community organizations, academic institutions and/or public health agencies to address public health issues within communities. This approach involves sharing power, benefits, risks, credit and responsibility equally among partners. Partners collaborate to identify issues, mobilize resources and implement programs. This participatory approach can, and should, also apply to the evaluation phase of programs and projects. The participatory approach to public health evaluation is not about seeking “input” or “feedback” from community members. It’s about radically changing who asks the questions, who answers the questions and who benefits from the findings. In this session we will explore the reasons for undertaking participatory evaluation, guiding principles, challenges, methods and examples from the experiences of the presenters in Minnesota and Indiana.
25 No-cost/Low-cost Tools for Program Evaluation
Susan Kistler, Executive Director of the American Evaluation Association
This session will provide a review of over 25 low-cost/no-cost tools that are useful, used, and user-friendly. Who isn’t short on time, short on funds, and short on the patience needed to decide which tools are worth the effort to access and use? Drawing on contributions from over 20 colleagues in different contexts, we’ll show examples of tools that practicing evaluators use to conduct background research; create and document evaluation plans and logic models; facilitate data cleaning, exploration, and analysis; listen to and learn from online exchanges; and promote and enhance collaboration.
Using Stories in Evaluation
Richard Krueger, Professor Emeritus and Senior Fellow in Evaluation Studies, Organizational Leadership Policy and Development, College of Education and Human Development, U of MN
Stories are powerful. A well-told story grabs us and keeps us interested. It can help us understand the experiences of others. It helps us communicate emotions. It can help us share what we know with others. Stories can breathe life into numbers. Stories help lay people understand research—whether in education, health, economics, psychology, business, or science. But these short narrative accounts are sometimes dismissed as fiction: unreliable, untrue, and merely anecdotal. This session will present strategies that allow the evaluator to use stories in a credible, defensible manner using accepted principles of qualitative research.
Evaluation Capacity Building
Michael Baizerman, Professor, School of Social Work and Director, Youth Studies, U of MN and Ross Velure Roholt, Assistant Professor, Youth Studies, School of Social Work, U of MN
This workshop will provide participants an interactive overview of the theory and practice of evaluation capacity building (ECB), a process that works with people to increase the ability of their organization to independently conduct evaluations. Topics will include a definition and rationale of ECB and a discussion of its underlying assumptions; how to plan, implement, and evaluate the components of effective ECB; and how to recognize and overcome likely challenges to the capacity building process.
Revised August 13, 2014