What should be evaluated

The steps in managing an evaluation are:

1. Decide how decisions about the evaluation will be made
2. Scope the evaluation
3. Develop the Terms of Reference (ToR)
4. Engage the evaluation team
5. Manage development of the evaluation methodology
6. Manage development of the evaluation work plan, including logistics
7. Manage implementation of the evaluation
8. Guide production of quality report(s)
9. Disseminate reports and support use of the evaluation

Firstly, check the implications of the stage of development of the project or program that is being evaluated. Is it still being planned?

Is it part-way through implementation? Is it near the end, or has it in fact already ended? Secondly, consider whether there are important aspects that are either complicated (with many components) or complex (emergent) that should be addressed in the evaluation design.

How objectives are shared, and the implications:
- Everyone shares a single set of objectives. Implications: impacts to be included can be readily identified from the beginning.
- There are different objectives valued by different stakeholders.

The organisations involved, and the implications:
- Single organisation. Implications: primary intended users and uses are easy to identify and address in the development of Key Evaluation Questions and endorsement of the design.
- Multiple organisations which can be identified with specific, formalized responsibilities. Implications: likely to need to negotiate access to data and ways to link and co-ordinate data; might need to negotiate the parameters of a joint impact evaluation, including its scope and focus.

- Adaptive: an evolving and personalised program that responds to specific and changing needs.

To what extent is this exact initiative needed to solve the problem? Implications:
- There is only one way to achieve the intended impacts: counterfactual reasoning is appropriate.
- Possibly one of several ways of achieving the intended impacts (uncertain).

Sufficiency of the intervention, and the implications:
- The intervention is enough to produce the intended impacts, and works the same for everyone.

The nature of the change process, and the implications:
- Simple cause-and-effect relationship: measurement of change can be done at a convenient time and confidently extrapolated.
- Complicated relationship that needs expertise to understand and predict.

Unintended impacts, and the implications:
- Easily predictable, and therefore can be readily included in the data collection plans.
- Need to draw on previous research and common sense to identify potential unintended impacts and gather data about them.
- Need expertise to predict and address.
- Unpredictable: can only be identified and addressed when they occur.

Thirdly, identify whether any of these other issues are present and will need to be addressed:

Issue: Long time until impacts will be evident.
Possible implications: might need to gather data about intermediate outcomes that will be evident during the timeframe of the evaluation, and use other research and evaluation evidence to predict the likely achievement of impacts.

Issue: Difficulty observing implementation activities (e.g. conflict-affected or remote areas).
Possible implications: might need to gather data through remote sensing, key informants, big data or crowdsourcing.

Issue: Difficulty observing results, i.e. outcomes and impacts (e.g. sensitive issues, private behaviour).
Possible implications: might need to gather data through key informant interviews, unobtrusive measures (for example, looking at patterns of wear from foot traffic), or techniques for gathering sensitive data (for example, a polling booth).

What is evaluation? Experts stress that evaluation can:
- Improve program design and implementation.
- Demonstrate program impact.

Within the categories of formative and summative, there are different types of evaluation. Which of these evaluations is most appropriate depends on the stage of your program:

Formative
1. Needs Assessment: determines who needs the program, how great the need is, and what can be done to best meet the need. For more information, Needs Assessment Training uses a practical training module to lead you through a series of interactive pages about needs assessment.
2. Process or Implementation Evaluation: examines the process of implementing the program and determines whether the program is operating as planned. Can be done continuously or as a one-time assessment. Results are used to improve the program.

Summative
1. Outcome Evaluation: investigates to what extent the program is achieving its outcomes. These outcomes are the short-term and medium-term changes in program participants that result directly from the program.
2. Impact Evaluation: determines any broader, longer-term changes that have occurred as a result of the program. These impacts are the net effects, typically on the entire school, community, organization, society, or environment. EE impact evaluations may focus on the educational, environmental quality, or human health impacts of EE programs.

These summative evaluations build on data collected in the earlier stages.

Questions they can answer include: To what extent is the need being met? What can be done to address this need? What predicted and unpredicted impacts has the program had? The corresponding evaluation types are Needs Assessment, Outcome Evaluation, and Impact Evaluation.

Evans' Short Course on Evaluation Basics: good evaluation is tailored to your program and builds on existing evaluation knowledge and resources.

Good evaluation is inclusive. Good evaluation is honest. Good evaluation is replicable, and its methods are as rigorous as circumstances allow.

Developing and implementing such an evaluation system has many benefits, including helping you to:
- better understand your target audiences' needs and how to meet them
- design objectives that are more achievable and measurable
- monitor progress toward objectives more effectively and efficiently
- learn more from evaluation
- increase your program's productivity and effectiveness

To build and support an evaluation system:
- Couple evaluation with strategic planning.
- Revisit and update your evaluation plan and logic model (see Step 2) to make sure you are on track.
- Build an evaluation culture by rewarding participation in evaluation, offering evaluation capacity-building opportunities, providing funding for evaluation, communicating a convincing and unified purpose for evaluation, and celebrating evaluation successes.

When planning the data collection, it works best to explicitly ask these types of general questions rather than infer findings on these issues from more specific data.

Doing the evaluation — what should it cover?
- Objectives of the engagement process
- Context
- Levels of involvement
- Methods and techniques used
- Who was involved
- Inputs (costs)


