Evaluating Collaboration
National Extension Family Life Specialists Conference, April 28, 2005
Ellen Taylor-Powell, Ph.D., Evaluation Specialist, University of Wisconsin-Extension

Types of evaluation questions
• The outcomes are broad and complex. How do we get started?
• Is evaluating process good enough, or do we have to evaluate outcomes?
• Who should be involved in evaluating a collaborative program?
• I'm not in charge. How do I evaluate it?
• How do I take credit for something that we've done together?

Issues and challenges
• Power and control
• Process of the evaluation
• Data
• Standards and quality of the evaluation
• Cross-cultural issues
• Measurement issues
• Attribution
• Taking credit

Collaborative evaluation (not evaluation of collaboration)
• Since the mid-1970s, a new paradigm of participatory evaluation: "applied social research that involves trained evaluation personnel…and practice-based decision makers working in partnership" (Cousins and Earl, 1992)
• Multiple approaches, from broadening decision making (practical) to emancipation and social change (transformation)
• Emphasis on using data collection and feedback to strengthen and monitor collaboration, and thus increase overall effectiveness and efficiency
• Value in the process of evaluation, process use (Patton, 1997), as much as in the product

Who controls? Who participates? How much?
[Diagram: continua of participation, adapted from Cousins and Whitmore, 1998]
• Control of the evaluation: from researcher control to practitioner/participant control
• Stakeholder selection: from primary users to all legitimate groups
• Depth of participation: from consultation to deep participation

First…
• Who wants to know what?
• For what purpose?
• How will the information be used?

Building a logic model of collaboration
[Diagram: logic model showing Situation, Inputs, Outputs, and Outcomes for both the collaborative relationship and the collaborative product, with assumptions and external factors noted and evaluation of collaborative effectiveness spanning the model.]

Collaboration: theory of change
[Diagram: inputs (partners; key stakeholders such as clientele, users, policy makers, and publics; research base; funding) feed activities (implement the action plan, monitor and evaluate, communicate, advocacy/policy, capacity building, technical assistance (TA), and relationship building among individual members and the group). The outcome chain runs from change in knowledge, attitudes, skills, motivation, intent, and self-efficacy, to change in behaviors and decision making, to policy, system, and community changes, to changes in conditions. Value-added outcomes of the collaborative relationship include an effectively functioning partnership and member satisfaction. Caption: WHAT DO YOU WANT TO KNOW?]

Evaluating the Collaborative Relationship

1. Process evaluation
• How is it functioning? How effective is the group work? Are we likely to achieve our desired results? How satisfied are members?
• Questions about capacities, operations, climate, context
• Factors influencing success
• Projected tasks/activities relative to stages of development
• Milestones and critical events (the journey)

Milestones
• Significant points along the way
• Examples: key stakeholders on board; vision statement established; grant secured; action plan (plan of work) formulated; project implemented/service provided; project evaluated

Critical events
• Unexpected events, positive and negative
• Progress markers
• Evidence of accomplishments
• Disruptions or obstacles
• Examples: change in membership; policy change; new donor added

2. Outcomes (process outcomes)
• What difference has being part of this group made for the individual? Knowledge, skills, motivation, behaviors, etc.; human capital development
• What difference is there for the group? Group functioning, identity, resource pooling, etc.
• Note: outcomes can be positive, negative, or neutral

Methods
• Informal feedback
• Member (partner) survey
• Member (partner) interviews
• Group discussions
• Key informant interviews
• Observation
• Identification and use of indicators
• Network analysis; sociogram (see the sketch after this list)
• Use existing materials (integrate into ongoing operations): minutes of meetings; logs (telephone, event, registration forms); management charts
• When? Periodic review; points of particular concern
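The network analysis and sociogram method above can be illustrated with a small script. This is a minimal sketch, not part of the original presentation: the partner names, tie counts, and the choice of the networkx library are illustrative assumptions, and the interaction tallies would in practice come from meeting minutes, contact logs, or the member survey.

```python
# Minimal sociogram sketch for the "network analysis; sociogram" method.
# Partner names and tie counts below are hypothetical illustration data.
import networkx as nx

# Each tuple: (partner A, partner B, number of recorded joint activities)
ties = [
    ("Extension", "Health Dept", 6),
    ("Extension", "School District", 4),
    ("Health Dept", "School District", 2),
    ("Extension", "Food Bank", 1),
]

G = nx.Graph()
G.add_weighted_edges_from(ties)

# Degree centrality summarizes how connected each partner is
# within the collaboration.
centrality = nx.degree_centrality(G)
for partner, score in sorted(centrality.items(), key=lambda x: -x[1]):
    print(f"{partner}: degree centrality {score:.2f}")
```

Drawing the same graph (for example with nx.draw and matplotlib) yields the sociogram itself; low-centrality or isolated partners can flag members who are not yet engaged, which is useful material for group discussion.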
Tools and techniques
• Community Group Member Survey
• Collaborative relationship scales
• Internal collaborative functioning scales
• Plan Quality Index
• Meeting effectiveness inventories
• Stage of readiness
• On-line Wilder Collaboration Factors Inventory (Amherst H. Wilder Foundation)
• On-line Partnership Self-Assessment Tool (Center for Advancement of Collaborative Strategies in Health)

Evaluating programs/products created or implemented by the collaboration

1. Process or implementation evaluation (focus: program delivery vs. coordination or support role)
• How is the program being implemented? Fidelity to plan? Extent of delivery? Participation?
• What is happening, or has happened, that wasn't planned?

2. Outcome evaluation
• What is different? For whom? How? To what extent?
• For individuals, groups/families, agencies, systems, communities

Changes in…
• Individuals: attitudes, perceptions, knowledge, competence, skills, abilities, behaviors, actions, lifestyles
• Groups/families: interactions, behaviors, actions, values, culture
• Agencies/organizations: number and type of services/programs delivered, access, practices, resource generation, resource use, policies
• Systems: relationships, interaction patterns, linkages, networks, practices, policies, resource use, institutionalization of changes
• Communities: values, attitudes, relations, support systems, civic action, social norms, policies, laws, practices, conditions

Tools and techniques
• Monitor implementation: logs, management charts, interviews, observations
• Achievement of outcomes: clientele surveys, clientele interviews, observations
• Mixed methods

Evaluating self - taking credit
• Mutual (reciprocal) accountability
• How do I take credit for my part? How does Extension gain visibility and recognition?
• What is your contribution? What role did you play? What value did you bring?
• Document the role you play, your activities and contributions, the inputs you bring, the resources you make available, your niche, your value…

Your contribution
• Log of activities and roles played
• Record of inputs and resources contributed
• Management chart; analysis of minutes
• Independent assessment: survey, interviews

Your (partner) performance
• Most important indicator: other partners' satisfaction with your performance (Brinkerhoff, 2002)
• Mutual assessment among partners of each partner's performance; the resulting discussion of discrepancies is powerful information sharing and trust building. (We aren't very good at this type of thing.) A small worked sketch appears at the end of these notes.

Web address
http://www.uwex.edu/ces/pdande
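As a closing illustration of the mutual-assessment idea above, here is a minimal sketch of a partner-performance discrepancy analysis. The rating scale, partner names, and scores are assumptions for illustration only; in practice the ratings would come from a partner survey such as those listed under Tools and techniques.

```python
# Sketch of a mutual partner-performance assessment.
# Each partner rates every partner (including itself) on a 1-5 scale;
# all names and scores below are hypothetical illustration data.
ratings = {
    # rater: {rated partner: score}
    "Extension": {"Extension": 4, "Health Dept": 5, "School District": 3},
    "Health Dept": {"Extension": 3, "Health Dept": 5, "School District": 4},
    "School District": {"Extension": 4, "Health Dept": 4, "School District": 5},
}

partners = list(ratings)
for partner in partners:
    others = [ratings[r][partner] for r in partners if r != partner]
    received = sum(others) / len(others)
    self_rating = ratings[partner][partner]
    gap = self_rating - received
    print(f"{partner}: self {self_rating}, from others {received:.1f}, gap {gap:+.1f}")

# Large gaps between self-ratings and the ratings received from other
# partners are the discrepancies worth discussing as a group; that
# conversation is the information-sharing and trust-building step.
```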