Syracuse University
Instructional Design, Development, & Evaluation
EERS

Proper Questions to Address: A Fundamental Issue for Program Evaluation
Ye Chen
[email protected]

Fundamental issues
• Fundamental issues "are those underlying concerns, problems, or choices that continually resurface in different guises throughout our evaluation work" (Smith & Brandon, 2008)
• Two such issues: the questions to address and the methodology
[Diagram: fundamental issues: question to address (relevance), methodology (rigor)]

Evaluation questions to address
The Heart of Innovation. (February 15, 2014). How Einstein Would Solve a Problem If He Only Had an Hour To Do It. Retrieved from: http://www.ideachampions.com/weblogs/archives/2014/02/post_118.shtml

Seven Theorists
• Scriven
• Campbell
• Stake
• Cronbach
• Rossi
• Weiss
• Wholey

Michael Scriven - Make judgements of program value
• Purpose of evaluation:
- Serve the public interest
- Produce judgements of value, worth, or merit, and sum the results into a final evaluative judgement
• Value:
- Values should be investigated and justified empirically
"Bad is bad and good is good and it is the job of evaluators to decide which is which" --Scriven
Rogers P. (March 29, 2012). [Photograph of Scriven]. Retrieved from: http://genuineevaluation.com/strategies-for-improving-the-quality-of-evaluation-the-independent-evaluation-advisor

Scriven - Make judgements of program value
• Central questions to address:
- Is the evaluand good? How good is it? How much is it worth? What components of it are good? In what respects is it good? Is it good compared to alternatives? What combination of it and its alternatives is worth most?

Scriven - Make judgements of program value
• Philosophy of viewing reality:
- Pre-evaluative questions (What is the evaluand? Who values the evaluand?)
- Quasi-evaluative questions (What makes the evaluand good? What will make it better?)
- Evaluative questions
- Multi-dimensional questions: description, client, background, resources, function, delivery system, consumer, needs & values, standards, process, outcomes, generalizability, costs, comparisons, significance, recommendations, report, metaevaluation

Multi-dimensional questions

Donald T. Campbell - Causal questions & bias control
• Reality distortion and bias control
• "Evaluators [are] to play a servant-methodologist rather than an advisory role" --Campbell
Wikipedia. (February 2014). Donald T. Campbell. Retrieved from: http://en.wikipedia.org/wiki/Donald_T._Campbell

Campbell - Causal questions & bias control
• Priority on internal validity:
- Experiments and quasi-experiments: When and how are treatments delivered? When and how are control groups formed? When are outcomes observed?
- Random assignment: Does manipulating A (intervention, cause) bring about B (outcome, effects)?

Campbell - Causal questions & bias control
• Central questions:
- How can we achieve dependable knowledge of the program, especially about the consequences of our operations in this program? How might such a process be improved?
- How well is an intervention/evaluation implemented? What unintended effects does the intervention have? What explains the causal process/mechanism?

Robert E. Stake - Case study & descriptive questions
• Goal of evaluation:
- More about service than critical/scientific analysis
• Role of evaluators:
- To tell the story of what is happening
• Value:
- All evaluation is value oriented, and a program has no single value
College of Education at Illinois. [Photograph of Stake]. Retrieved from: http://education.illinois.edu/people/stake

Stake - Case study & descriptive questions
• Central questions:
- What is happening in the program? What are the actual activities there?
• Responsive evaluation:
- Should let questions emerge and change
- What kinds of activities or events happened in the context? What are the audience's requirements for information? What different value perspectives of stakeholders are considered?

Lee J. Cronbach - Formative questions
• Goal of evaluation:
- To improve programs rather than to certify their worth
- Formative evaluation contributes more than summative evaluation
• Role of evaluators:
- To collect facts practitioners can/will use to do a better job, or to develop a deeper understanding
Alexander P. M. (October 10, 2001). [Photograph of Lee J. Cronbach]. Retrieved from: http://news.stanford.edu/news/2001/october10/cronbach-obit-1010.html

Cronbach - Formative questions
• Central questions:
- How does the program produce its effects? What parameters influence its effectiveness? In what aspects of the program is revision desirable?
• Discussion of social action:
- How is a social program understood? What is known about the program? How do political, social, and organizational contexts influence program functioning? What are the processes within the program? What program processes contribute to program outcomes?

Peter H. Rossi - Questions situated in a policy context
• Goal of evaluation:
- "To better the lot of humankind by improving social conditions and community life"
- The key to this betterment: knowing when, where, and why to use the intervention in productive policy research
• Give a special place to the stakeholders most involved in the "decision making process"
American Sociological Association. (January 8, 2005). Peter Henry Rossi. Retrieved from: http://www2.asanet.org/governance/rossip.html

Rossi - Questions situated in a policy context
• Examples of questions to ask:
- Planning stage: What is the extent and severity of the problem requiring social intervention? What design of the program could ameliorate the problem?
- Ongoing and innovative stage: Is the program reaching its intended target populations, and is it providing the resources, services, and benefits envisioned?
- Implementation stage: Is the program effective? What are the magnitudes of its impacts?
- Decision-making stage: What are the costs in relation to the benefits? How does its cost-efficiency compare to that of alternative programs?

Carol H. Weiss - Questions to improve policy
• Purpose of evaluation:
- To influence
"Evaluation should be continuing education for program managers, planners, and policymakers" --Weiss
Weiss C. [Photograph of Carol Weiss]. Retrieved from: http://www.carolweissmft.com/

Weiss - Questions to improve policy
• Central questions:
- How can public policy making be improved? What role can the social sciences play in that improvement?
- How do policymakers make policy? What role does research play in doing so? What kind of research could facilitate that process?

Weiss - Questions to improve policy
• Stages in policy research:
- Research formulation: What is the policy issue? What is the need for knowledge about the issue?
- Conducting the study: How is the data source selected? How are measures developed? How is sampling done? How are data collected and analyzed?
- Drawing policy implications: What is the resolution of the policy issues?

Joseph S. Wholey - Evaluation in governmental programs
• Goal of evaluation:
- To make certain that policies and programs meet the needs of society
• Justification for evaluation:
- Its "usefulness to policymakers and program managers"
University of Southern California. [Photograph of Joseph S. Wholey]. Retrieved from: https://pressroom.usc.edu/joseph-s-wholey/

Wholey - Evaluation in governmental programs
• Central question:
- What are the successes and failures of the program in meeting the nation's goals?

Wholey - Evaluation in governmental programs
• Four evaluation tools (first two):
- Evaluability assessment (program intent, reality, decisions): What is the program intent? What is the program reality? How can the assessment assist policy, management, and evaluation decisions? Are the expected impacts well enough defined to be measurable? Is the logic laid out clearly enough to be tested? Is anyone clearly in charge of the problem? Who? What are the constraints on his ability to act? What range of actions might he take?
- Rapid feedback evaluation (preliminary assessment of program performance): What are the program objectives? What are the performance indicators?

Wholey - Evaluation in governmental programs
• Four evaluation tools (last two):
- Performance monitoring: What are the process measures and outcome measures?
- Intensive evaluation: What is the validity of the causal assumptions linking program activities to program outcomes? What activities are designed to improve the management and performance of agencies and programs?

Comparative Analysis of Different Theorists' Positions (see Table 1 in the handout)
• Purpose of evaluation
• Whose concerns are reflected in the questions
• Whether the program is worth evaluating
• Context of evaluation
• Nature of the questions
• Method of answering the questions