Contents: Getting started: preparing for monitoring and evaluation; Situation analyses/Needs assessments (formative research); Monitoring and Evaluation Frameworks (3 parts); Evaluation: Monitoring, Outcome and Impact; Monitoring and evaluation for specific areas of work; Glossary of Terms from the Programming Essentials and Monitoring and Evaluation sections; random selection, control and comparison groups; Teacher/Staff Evaluation of School Presentation.

Evaluation involves collecting and analyzing information about a program's activities, characteristics, and outcomes (Norland, 2004). It asks, among other things: is this the most efficient way to use limited resources? The guide presents information for planning and conducting evaluations; guidance on linking programme goals, objectives, activities, outcomes, and evaluation strategies; sources and techniques for data gathering; and tips on analyzing and interpreting the data collected and sharing the results. Update these documents on a regular basis, adding new strategies, changing unsuccessful strategies, revising relationships in the model, and adding unforeseen impacts of an activity (EMI, 2004). See MEERA's searchable database of EE evaluations to get started. The key issue is to think about the evaluation question and adopt the data and methods that will provide the most robust answer to that question. Embedding the sort of evaluative thinking described above into activity across education requires everyone to be an evaluative thinker in one way or another. Monitoring and evaluation can help you:

- better understand your target audiences' needs and how to meet them;
- design objectives that are more achievable and measurable (e.g. to improve students' ability to read);
- monitor progress toward objectives more effectively and efficiently; and
- increase your program's productivity and effectiveness.

Timperley, H & Parr, J 2009, 'Chain of influence from policy to practice in the New Zealand literacy strategy', Research Papers in Education, vol. 24, no. 2, pp. 135–154.
Putting the IPPF Monitoring and Evaluation Policy into Practice: A Handbook on Collecting, Analyzing and Utilizing Data for Improved Performance (International Planned Parenthood, 2009). A good evaluation is likely to be replicable: someone else should be able to conduct the same evaluation and get the same results. Some strategies involve specialist areas (e.g. speech pathology) and some simply reflect the way good teachers organise their classroom. An evaluation of the impact of a campaign to raise awareness of the provisions of a recently enacted law on violence against women, for example, would need to incorporate:

- baseline data on awareness of the law's provisions prior to the campaign for the intervention group;
- endline data on awareness of the law's provisions after the campaign for the intervention group;
- baseline data on awareness of the law's provisions prior to the campaign for a closely matched control group not exposed to the campaign; and
- endline data on awareness of the law's provisions after the campaign for the control group.

This is important because very few programs work for everyone. If continuation of the program is not in question, it may be better to focus on process questions bearing on program efficiency or quality improvement. See also the UN Women online guide to gender equality and human rights responsive evaluation, available in English, French and Spanish. Stakeholders need to understand the questions the evaluation sought to answer, the methods employed to answer them, any assumptions that were made, what the evaluation found, and the consequences of those findings. It is important to periodically assess and adapt your activities to ensure they are as effective as they can be. For additional monitoring and evaluation reports by sector, see the following sections: M&E Fundamentals: A Self-Guided Minicourse (Frankel and Gage/MEASURE Evaluation, 2007). This requires taking baseline and follow-up measures and comparing them over time.
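The four measures listed above support a simple difference-in-differences estimate: the change in awareness in the intervention group, minus the change in the matched control group. The sketch below illustrates the arithmetic only; the group names and awareness figures are invented for illustration, not real survey data.

```python
# Hypothetical survey results: the proportion of respondents aware of the
# law's provisions, measured at baseline (before the campaign) and endline
# (after the campaign). All figures below are illustrative assumptions.

def difference_in_differences(intervention, control):
    """Estimated campaign effect: intervention change minus control change."""
    change_intervention = intervention["endline"] - intervention["baseline"]
    change_control = control["endline"] - control["baseline"]
    return change_intervention - change_control

intervention = {"baseline": 0.30, "endline": 0.55}  # exposed to the campaign
control = {"baseline": 0.32, "endline": 0.38}       # matched, not exposed

effect = difference_in_differences(intervention, control)
print(f"Estimated campaign effect: {effect:.2f}")  # 0.25 - 0.06 = 0.19
```

Subtracting the control group's change nets out background trends (e.g. media coverage of the new law) that would have shifted awareness even without the campaign.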
For example, people who didn't respond well to the intervention might also be less likely to participate in interviews or focus groups. These designs are referred to as 'quasi-experiments' in Figure 4. Within the categories of formative and summative, there are different types of evaluation. This is where reference to benchmarks or comparison groups is critical. These summative evaluations build on data collected in the earlier stages. The challenge for schools and systems is to work out whether they have been achieved. Strategic planning is also a good time to create a list of questions you would like your evaluation to answer. A Place to Start: A Resource Kit for Preventing Sexual Violence (Sexual Violence Prevention Programme of the Minnesota Department of Health). Outcome evaluation usually identifies average effects: were the recipients better off under this program than they would have been in its absence? Implement remedial measures to get programmes back on track and remain accountable to the expected results the programme is aiming to achieve. Evidence that your EE program is not achieving all of its ambitious objectives can be hard to swallow, but it can also help you learn where best to put your limited resources. If the study design does not involve a randomly assigned control group, it is not possible to make a definitive statement about any differences in outcome between areas with the programme and areas without it.
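The "average effect" idea above can be made concrete: with random assignment, the difference in mean outcomes between program recipients and the control group estimates the program's average effect. This is a minimal sketch with invented scores; an average effect can of course mask the fact that a program works well for some recipients and not at all for others.

```python
from statistics import mean

def average_effect(program_scores, control_scores):
    """Difference in mean outcomes between program and control groups."""
    return mean(program_scores) - mean(control_scores)

# Hypothetical outcome scores (e.g. a reading assessment); randomly
# assigned groups, so the difference in means estimates the average effect.
program_scores = [72, 68, 75, 80, 70]
control_scores = [65, 70, 60, 66, 64]

print(f"Average effect: {average_effect(program_scores, control_scores):.1f}")
```

Without random assignment (the quasi-experimental designs mentioned above), this same difference in means may instead reflect pre-existing differences between the groups, which is why a definitive causal statement is not possible in that case.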