Monitoring and Evaluation of Regional Programs

Contents

Slide 2

Outline

Introduction
Overview of Evaluation in Developed and Developing Countries
Results-Based Monitoring and Evaluation (M&E)
Approaches to Evaluation

Slide 3

Introduction: Importance of evaluation

There are growing pressures in developing countries to improve the performance of their public sectors
Involves reform by tracking results of government or organizational actions over time
Is a management tool

Slide 4

The Power of Measuring Results

If you do not measure results, you cannot tell success from failure
If you cannot see success, you cannot reward it
If you cannot reward success, you are probably rewarding failure
If you cannot see success, you cannot learn from it
If you cannot recognize failure, you cannot correct it
If you can demonstrate results, you can win public support

Slide 5

Overview of Evaluation in Developed and Developing Countries

Slide 6

Evaluation in Developed Countries

Most of the 32 OECD countries have mature M&E systems
Earliest adopters had:
democratic political systems
strong empirical traditions
civil servants trained in social science
efficient administrative systems and institutions

Slide 7

A Strong Evaluation Culture Exists when:

Evaluation takes place in many policy domains
Supply of evaluators on staff who have mastered methods of different specialized disciplines
National discourse exists on evaluation
Profession exists with own societies or meetings with discussion of norms and ethics
(continued on next slide - 1 of 3)

Slide 8

A Strong Evaluation Culture Exists when: (cont.)

Institutional arrangements exist in government for conducting evaluations and disseminating them to decision makers
Institutional arrangements present in legislative bodies for conducting evaluations and disseminating them to decision makers
(continued on next slide - 2 of 3)

Slide 9

A Strong Evaluation Culture Exists when: (cont.)

7. An element of pluralism exists within each policy domain
Evaluation activities also take place within the supreme audit institution
Evaluations focus not only on technical production or relation between inputs and outputs but also on program or policy outcomes


Slide 10

Approaches

Whole-of-Government
Enclave
Mixed

Slide 11

Whole-of-Government Approach

Adopted in some early M&E pioneer countries
Broad-based, comprehensive M&E at all levels of government
Millennium Development Goals created impetus
Challenging where different ministries are at different stages

Slide 12

Enclave Approach

More limited, focus on one part or sector of government (a ministry or the cabinet)
Strategy:
begin at local, state, or regional governmental level
pilot evaluation systems in a few key ministries or agencies

Slide 13

Mixed Approach

Blends whole-of-government and enclave approaches
Some areas have a comprehensive approach; others receive more sporadic attention

Slide 14

Evaluation in Developing Countries

Face challenges both similar to and different from those in developed countries
Weak political will slows progress
Difficulties in inter-ministerial cooperation and coordination can impede progress

Slide 15

Evaluation Systems in Developing Countries

New evaluation systems need:
political will in the government
highly placed champions willing to assume political risks
credible institutions

Slide 16

Developing Countries Need to:

Establish a foundation for evaluation
statistical systems and data, as well as budgetary systems
Routinely collect baseline information
Train officials in data collection, monitoring methods, and analysis

Slide 17

Development Assistance Committee (DAC) Criteria for Evaluating Development Assistance

Relevance
Effectiveness
Efficiency
Impact
Sustainability

Slide 18

Results-Based Monitoring and Evaluation

Slide 19

Results-Based Monitoring

Results-based monitoring (what we call “monitoring”) is a continuous process of collecting and analyzing information on key indicators, and comparing actual results to expected results

Slide 20

Results-Based Evaluation

Results-based evaluation is an assessment of a planned, ongoing, or completed intervention to determine its relevance, efficiency, effectiveness, impact, and/or sustainability

Slide 21

Difference between Results-Based Monitoring and Results-Based Evaluation

Monitoring: tracks movement of indicators towards the achievement of specific, predetermined targets
Evaluation: takes a broader view, considering progress toward stated goals, the logic of the initiative, and its consequences
Both are needed to better manage policies, programs, and projects

Slide 22

Brief Introduction to Theory of Change

A theory of change is a representation of how a project, program, or policy initiative is expected to lead to its outcomes and impacts. It also identifies the underlying assumptions about how the change will occur.

Slide 23

Components of Theory of Change

Inputs – financial, human, and material resources
Activities – tasks undertaken
Outputs – products and services
Outcomes – behavioral changes
Impacts – long-term, widespread improvement in society
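To make the results chain concrete, here is a minimal sketch in Python of how the five components (plus the underlying assumptions) might be recorded for a hypothetical teacher-training program; the program and all entries are invented for illustration and are not part of the IPDET material.

```python
# Minimal sketch of a results chain for a hypothetical teacher-training
# program; entries are invented for illustration.
from dataclasses import dataclass, field
from typing import List

@dataclass
class ResultsChain:
    inputs: List[str] = field(default_factory=list)       # financial, human, and material resources
    activities: List[str] = field(default_factory=list)   # tasks undertaken
    outputs: List[str] = field(default_factory=list)      # products and services delivered
    outcomes: List[str] = field(default_factory=list)     # behavioral changes
    impacts: List[str] = field(default_factory=list)      # long-term, widespread improvement
    assumptions: List[str] = field(default_factory=list)  # how the change is expected to occur

teacher_training = ResultsChain(
    inputs=["budget for trainers", "curriculum materials"],
    activities=["run in-service training workshops"],
    outputs=["2,000 teachers trained"],
    outcomes=["teachers apply new reading-instruction methods"],
    impacts=["higher literacy rates in the region"],
    assumptions=["trained teachers remain in their schools"],
)

for stage in ("inputs", "activities", "outputs", "outcomes", "impacts"):
    print(f"{stage}: {getattr(teacher_training, stage)}")
```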

Slide 24

Theory of Change and Types of Monitoring

Slide 25

Performance Indicators

A variable that tracks changes in the development intervention or shows results relative to what was planned
The cumulative evidence of a cluster of indicators is used to see if an initiative is making progress
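As a rough illustration of tracking a cluster of indicators against plan, the sketch below compares actual values with planned values for a few hypothetical indicators; the indicator names and figures are invented for the example.

```python
# Hypothetical illustration: compare actual indicator values with planned
# values to gauge whether an initiative appears to be making progress.
indicators = {
    # name: (planned, actual) for the current reporting period (invented numbers)
    "primary school enrollment rate (%)": (85.0, 83.5),
    "children fully immunized (%)": (90.0, 92.0),
    "households with safe drinking water (%)": (60.0, 55.0),
}

for name, (planned, actual) in indicators.items():
    gap = actual - planned
    status = "on track" if gap >= 0 else "behind plan"
    print(f"{name}: planned {planned}, actual {actual} ({gap:+.1f}, {status})")
```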

Slide 26

Step 1: Conducting a Readiness Assessment

Ten Steps to Building a Results-Based M&E System

Slide 27

What Is a Readiness Assessment?

A systematic approach to determine the capacity and willingness of a government or organization to construct a results-based M&E system
The approach focuses on: presence or absence of champions, incentives, roles and responsibilities, organizational capacity, and barriers to getting started

Slide 28

Incentives

Sort out the answers to these questions:
What is driving the need for building an M&E system?
Who are the champions for building and using an M&E system?
What is motivating those who champion building an M&E system?
Who will benefit from the system?
Who will not benefit?

Slide 29

Barriers to M&E

Do any of the following present barriers to building an M&E system?
lack of fiscal resources
lack of political will
lack of a champion for the system
lack of an outcome-linked strategy or experience
How do we confront these barriers?

Slide 30

Step 2: Agreeing on Outcomes to Monitor and Evaluate

Slide 31

Why an Emphasis on Outcomes?

Makes explicit the intended objectives of government action
Outcomes are what produce benefits
Clearly setting outcomes is key to designing and building a results-based M&E system
Important! Budget to outputs, manage to outcomes!
(“Know where you are going before you get moving”)

Slide 32

Developing Outcomes for One Policy Area: Education

Slide 33

Outcomes:

Outcomes are usually not directly measured — only reported on
Outcomes must be translated to a set of key indicators
When choosing outcomes, “Do not go it alone!” – agreement is crucial

Slide 34

Step 3: Selecting Key Indicators to Monitor Outcomes

Slide 35

Results Indicator

A specific variable that, when tracked systematically over time, indicates progress (or lack thereof) toward an outcome or impact
for new M&E systems, all indicators should be numerical
qualitative indicators can come later with mature M&E systems
Indicators ask: How will we know success when we see it?

Slide 36

Indicator Development

“CREAM”
Clear
Relevant
Economic
Adequate
Monitorable

Slide 37

Matrix for Building/Using Indicators

Slide 38

Developing a Set of Outcome Indicators for One Policy Area: Education

Slide 39

Developing Indicators

Develop your own indicators to meet your needs
Developing good indicators usually takes more than one try
State all indicators neutrally – not “increase in…” or “decrease in…”
Pilot, Pilot, and Pilot!

Slide 40

Step 4: Gathering Baseline Data on Indicators

Slide 41

Baseline Data and Sources

Baseline data:
Measurements to find out: where are we today?
Primary source:
gathered specifically for the project
Secondary source:
collected for another purpose
can save money but be careful to ensure that it is truly the information you need

Slide 42

Slide 43

Continuing Example, Developing Baseline Data for One Policy Area: Education

Slide 44

Step 5: Planning for Improvement: Selecting Realistic Targets

Slide 45

Targets:

The quantifiable levels of the indicators that a country or organization wants to achieve at a given point in time
Example:
Agricultural exports will increase in the next three years by 20% over the baseline
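Taken at face value, the target implied by "20% over the baseline" is simple arithmetic; the baseline figure below is invented purely to show the calculation.

```python
# Worked version of the example target: agricultural exports rise 20% over
# the baseline within three years. The baseline value is invented.
baseline_exports = 150.0   # e.g., million USD in the baseline year (hypothetical)
improvement = 0.20         # desired improvement: 20% over the baseline
years = 3

target_exports = baseline_exports * (1 + improvement)
annual_step = (target_exports - baseline_exports) / years  # if progress is reviewed yearly

print(f"Target after {years} years: {target_exports:.1f}")   # 180.0
print(f"Implied average annual gain: {annual_step:.1f}")      # 10.0
```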

Slide 46

Identifying Expected or Desired Level of Improvement Requires Targets

Slide 47

Caution:

It takes time to observe the effects of improvements, therefore:
- Be realistic when setting targets
- Avoid promising too much and thus programming yourself to fail

Slide 48

Continuing Example, Setting Performance Targets for One Policy Area: Education

Slide 49

Step 6: Monitoring for Results

Slide 50

Key Types of Monitoring

Slide 51

Implementation Monitoring Links to Results Monitoring

Slide 52

Slide 53

Successful Monitoring Systems

To be successful, every monitoring system needs the following:
ownership
management
maintenance
credibility

Slide 54

Step 7: Using Evaluation Information

Slide 55

Evaluation Means Info on:

Slide 56

Evaluation — When to Use?

Any time there is an unexpected result or performance outlier that requires further investigation
When resource or budget allocations are being made across projects, programs, or policies
When a decision is being made whether or not to expand a pilot
When there is a long period with no improvement, and the reasons for this are not clear
When similar programs or policies are reporting divergent outcomes

Slide 57

Step 8: Reporting Findings

Slide 58

Reporting Findings

Provides information on status of projects, programs, and policies
Yields clues to problems
Creates opportunities to consider changes
Provides important information over time on trends and directions
Helps confirm or challenge theory of change

Slide 59

When Analyzing and Presenting Data:

Compare indicator data with the baseline and targets, and provide this information in an easy-to-understand visual display
Compare current information with past data and look for patterns and trends
Be careful about drawing sweeping conclusions based on small amounts of information. The more data points you have, the more certain you can be that trends are real
(continued on next slide)

Slide 60

When Analyzing and Presenting Data: (cont.)

Protect the messenger: people who deliver bad news should not be punished. Uncomfortable findings can indicate new trends or notify managers of problems early on, allowing them the time needed to solve these problems
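One way to act on the guidance about baselines, targets, and trends on the previous slide is to summarize the indicator series rather than single readings; this sketch (with invented values) estimates a simple least-squares trend and flags how little a handful of data points can support.

```python
# Hypothetical sketch: compare an indicator series with its baseline and
# target, and estimate a simple linear trend. All values are invented.
baseline, target = 62.0, 75.0
years = [2008, 2009, 2010, 2011]
values = [62.0, 64.5, 63.8, 67.0]   # annual indicator readings

n = len(values)
mean_x, mean_y = sum(years) / n, sum(values) / n
slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(years, values))
         / sum((x - mean_x) ** 2 for x in years))

print(f"Change since baseline: {values[-1] - baseline:+.1f} (remaining gap to target: {target - values[-1]:.1f})")
print(f"Estimated trend: {slope:+.2f} per year from only {n} data points -- interpret with caution")
```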

Slide 61

Step 9: Using Findings

Slide 62

Ten Uses of Results Findings

Responds to elected officials’ and the public’s demands for accountability
Helps formulate and justify budget requests
Helps in making operational resource allocation decisions
Triggers in-depth examinations of what performance problems exist and what corrections are needed
Helps motivate personnel to continue making program improvements
(continued on next slide)

Slide 63

Ten Uses of Results Findings (cont.)

Monitors project or program performance against outcome targets
Provides data for special, in-depth program evaluations
Helps track service delivery against precise outcome targets
Supports strategic and other long-term planning efforts
Communicates with the public to build public trust

Slide 64

Step 10: Sustaining the M&E System within the Organization

Slide 65

Components Critical to Sustaining the M&E System

Demand
Clear roles and responsibilities
Trustworthy and credible information
Accountability
Capacity
Incentives

Slide 66

Concluding Comments

The demand for capacity building never ends! The only way an organization can coast is downhill
Keep your champions on your side and help them!
Establish the understanding with the Ministry of Finance and the Parliament that an M&E system needs sustained resources
Look for every opportunity to link results information to budget and resource allocation decisions
(continued on next slide)

Slide 67

Concluding Comments (cont.)

Begin with pilot efforts to demonstrate effective results-based monitoring and evaluation
Begin with an enclave strategy (e.g., islands of innovation) as opposed to a whole-of-government approach.
Monitor both implementation progress and results achievements
Complement performance monitoring with evaluations to ensure better understanding of public sector results

Slide 68

Approaches to Evaluation

Slide 69

What is the evaluation approach?

“The systematic application of social research procedures for assessing the conceptualization, design, implementation, and utility of programs” (Rossi and Freeman 1993)
“Evaluation identifies the impacts of an intervention program by analyzing cause and effect” (Ezemenari et al., 2001)

Slide 70

Key Points

There is no silver-bullet approach
Approaches answer different research questions
Intrinsically connected to the design of the project/program/policy
Have different data requirements
Not all projects/programs/policies can be evaluated
Usefulness of triangulation methods

Slide 71

Four Main Evaluation Approaches

Impact Evaluation
Outcome-Based Evaluation
Monitoring/Process Evaluation
Participatory Evaluation

Slide 72

Impact Evaluation

Impact evaluation is intended to determine more broadly:
- whether the program had the desired effects on individuals, households, and institutions
- whether those effects are attributable to the program intervention
Relevant Research Question: Is the intervention causally effective in attaining the desired goals or benefits?

Slide 73

The Evaluation Problem

Slide 74

The Evaluation Problem

Slide 75

The Evaluation Problem

When participation in the program is related to unmeasured characteristics that are themselves related to the program outcomes, it is difficult to disentangle the causal effect of the intervention.
If the same individual could be observed at the same point in time with and without the program, the evaluation problem would not arise.
But, we cannot observe the same individual in both states at the same time: This is the evaluation problem.
The key to disentangling project impacts from any intervening variables is determining what would have happened in the absence of the program at the same point in time: THE COUNTERFACTUAL.

Slide 76

Thinking About The Problem At Hand

Slide 77

Defining Counterfactuals

Determining the counterfactual is at the core of impact evaluation
Use control or comparison groups (those who do not participate in a program or receive benefits), which are subsequently compared with the treatment group (individuals who do receive the intervention).
Control or comparison groups consist of a group of individuals who do not receive the intervention but have similar characteristics to those receiving the intervention.
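A small simulation can make the counterfactual logic concrete. Under the invented data-generating process below, comparing participants with a randomized control group recovers the true program effect, while comparing self-selected participants with non-participants does not, because participation depends on an unmeasured characteristic that also drives the outcome.

```python
# Illustrative simulation (all numbers invented): a randomized control group
# approximates the counterfactual; a self-selected comparison group does not.
import random

random.seed(0)
TRUE_EFFECT = 5.0
N = 20_000

def outcome(ability, treated):
    # Unmeasured "ability" raises the outcome; the program adds TRUE_EFFECT.
    return 50 + 10 * ability + (TRUE_EFFECT if treated else 0) + random.gauss(0, 2)

# (1) Self-selection: higher-ability individuals are more likely to participate.
self_selected = []
for _ in range(N):
    ability = random.random()
    treated = random.random() < ability        # participation depends on ability
    self_selected.append((treated, outcome(ability, treated)))

# (2) Randomized assignment: participation is unrelated to ability.
randomized = []
for _ in range(N):
    ability = random.random()
    treated = random.random() < 0.5            # coin flip
    randomized.append((treated, outcome(ability, treated)))

def diff_in_means(data):
    t = [y for is_t, y in data if is_t]
    c = [y for is_t, y in data if not is_t]
    return sum(t) / len(t) - sum(c) / len(c)

print(f"True effect:               {TRUE_EFFECT:.2f}")
print(f"Self-selected comparison:  {diff_in_means(self_selected):.2f}  (biased upward)")
print(f"Randomized control group:  {diff_in_means(randomized):.2f}  (close to the true effect)")
```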

Slide 78

Why It Matters?

We want to know if the program had an impact, the average size, and the distribution of that impact
Understand if policies work
Justification for program
Scale up or not
Compare different policy options within a program
Understand the net benefits of the program
Understand the distribution of gains and losses

Slide 79

Key Steps in Designing and Implementing

Determining whether or not to carry out an impact evaluation
Clarifying objectives of the evaluation
Exploring data availability
Designing the evaluation
Forming the evaluation team
If data will be collected: sample design and selection, surveys, training fieldwork personnel, pilot testing, data collection, data management and access
Ongoing data collection
Analyzing the data
Writing the report

Slide 80

Determining Whether Or Not To Carry Out An Impact Evaluation

Costs and benefits should be assessed
Strong political and financial support
Program is suitable for evaluation

Slide 81

Clarifying Objectives of Evaluation

Establishing clear objectives
Use and analysis of the program’s logical framework helps
Example: The evaluation is about the “effect of the PROBECAT training program on labor market outcomes”
Example: The evaluation is about the “effect of the PROBECAT training program on subsequent labor hourly earnings of beneficiaries”

Slide 82

Data Availability

Know the institutions of the program well.
Collect information on the relevant “stylized facts”
Ensure that there is data on the outcome indicators and relevant explanatory variables

Slide 83

Designing The Evaluation

Know the institutions of the program well.
Define the evaluation question(s) (unit of analysis, outcomes, time frame, etc.)
Timing and budget concerns (short-, medium-, or long-term evaluation)
Implementation capacity: a big issue in developing countries.

Slide 84

Impact Evaluation Example: PROGRESA

PROGRESA is the principal antipoverty strategy of the Mexican government
Large program
-By 2003 4.2 million families were receiving benefits
-72,000 localities
-40% of all rural families
-1/9 of all families
Annual budget: 46% of Federal poverty alleviation budget

Slide 85

PROGRESA’S Goals

Long-run poverty alleviation
- Investment in human capital
- Education
- Health
- Nutrition
Short-run poverty alleviation
- cash transfers

Slide 86

Features & Institutions

Conditional cash transfers given to mothers (why?)
Simultaneous and targeted intervention in 3 key sectors (synergies)
Experimental evaluation of the intervention
Uses existing school and health facilities

Slide 87

Overall Program Benefits

Beneficiary households receive on average 200 pesos per month
- 22% increase in the income level
About 50% of the 200 pesos are cash transfers for food
The rest are cash transfers for school-related items
Heterogeneous benefits depending on family size and schooling needs

Slide 88

Evaluation Framework

Slide 89

Evaluation Framework

This is a three-step process:
Identification of marginalized localities using a marginality index (geographic targeting)
Selection of treatment and control localities within rural localities (random assignment of poor localities)
Selection of beneficiary households within rural localities with high marginality index (non-random assignment)
Good geographic targeting of rural areas
Accurate method of selecting poor households within localities (7% under coverage)

Slide 90

Evaluation Framework

Program randomized at the locality level
Sample of 506 localities
- 186 control (no PROGRESA)
- 320 treatment (PROGRESA)
24,077 Households (hh)
-78% beneficiaries
The experiment lasted only for a year and a half because the control group families started to receive benefits in December 1999
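A minimal sketch of locality-level (cluster) random assignment of the kind described above: localities are split at random into treatment and control, and every household inherits its locality's status. The locality IDs and split are placeholders that only mirror the counts on this slide, not the actual PROGRESA assignment.

```python
# Sketch of cluster (locality-level) random assignment, loosely mirroring the
# counts above. Locality IDs are placeholders, not PROGRESA data.
import random

random.seed(42)
localities = [f"locality_{i:03d}" for i in range(506)]   # eligible rural localities

random.shuffle(localities)
treatment = set(localities[:320])    # offered the program
control = set(localities[320:])      # no program during the experiment

def household_status(locality_id):
    # Every household inherits its locality's assignment.
    return "treatment" if locality_id in treatment else "control"

print(len(treatment), "treatment localities,", len(control), "control localities")
print("locality_007 ->", household_status("locality_007"))
```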

Slide 91

PROGRESA Evaluation Surveys/Data

BEFORE initiation of program
October/November 97: Household census used for selecting program beneficiaries
March 98: Consumption, school attendance, and health
AFTER initiation of program
Included survey of beneficiary households regarding operations
November 1998
June 1999
November/December 1999

Slide 92

Evaluation Research Questions: Education Component

Are more children attending school because of PROGRESA?
Does PROGRESA have more impact in certain grades?
Any effects on drop-out rates, grade progression, repetition, reentry?

Slide 93

Evaluation Results: Education

Positive effect on school attendance of boys and girls in primary and secondary school
- Boys in secondary: increased 8%
- Girls in secondary: increased 14%
Negative impact on children’s labor market participation (especially boys)
10% increase in overall educational attainment (8% higher earnings)

Slide 94

Evaluation Research Questions: Health

Does PROGRESA increase visits to public health clinics?
Does PROGRESA have an effect on child health?
Does PROGRESA have an effect on the health of adults?

Slide 95

Evaluation Results: Health

Significant increase in visit rates
- Nutrition monitoring visits
- Immunization rates
- Prenatal care in 1st trimester (8% increase)
No substitution between private and public facilities
Incidence of illness fell 12% in children between ages 0-5.
Significantly positive effects on adult health

Slide 96

Evaluation Research Questions: Nutrition

Does PROGRESA impact child growth?
Does PROGRESA impact household consumption and diet?

Slide 97

Evaluation Results: Nutrition

Significant effect in increasing child growth (1 cm higher growth)
Significant effect in reducing the probability of stunting
-Children 12-36 months
Household total consumption increases
PROGRESA households “eat better”
-Higher expenditures on fruits, vegetables, meats, and animal products

Slide 98

(2) Outcome-Based Evaluation

Slide 99

Basic Definitions

Outcome-based evaluation is a systematic way to assess the extent to which a program has achieved its intended results
How has the program made a difference?
Is the welfare of participants better after the program?

Slide 100

Why It Matters?

Contribute to program effectiveness
Provide a logical framework for program development
Generate information for decision-making
Communicate program value

Slide 101

But Unlike Impact Evaluation

It does not prove cause and effect, only suggests a cause-and-effect relationship.
It shows contribution, not attribution

Slide 102

(3) Monitoring/Process Evaluation

Helps to assess whether a program is being implemented as planned.
Is a particular intervention reaching its target population?
What activities and services are provided?
Is there consistency between the activities and the program’s goals?
It is also concerned with how the program operates and focuses on problems in service delivery.
Who is served?
When and how long?
Are the intended services being provided?

Slide 103

Why It Matters?

Helps determine how a program’s potential impact is related to its implementation (evaluator’s perspective)
Provides information that stakeholders need to judge the appropriateness of program activities and to decide whether a program should be continued, expanded, or contracted (accountability perspective)
Provides information to incorporate corrective measures as a regular part of program operations (management perspective)

Slide 104

(4) Participatory Evaluation

Representatives of agencies and stakeholders (including beneficiaries) work together in designing, carrying out, interpreting, and reporting an evaluation
Departs from the audit ideal of independence
Departs from scientific detachment
Partnership based on dialogue and negotiation

Slide 105

Principles of Participatory Evaluation

Evaluation involves building participants’ skills
Participants commit to the evaluation and make decisions and draw their own conclusions
Participants ensure evaluation focuses on methods and results they consider important
People work together promoting group unity
Participants understand and find meaningful all aspects of the evaluation
Self-accountability is highly valued
Evaluators/Facilitators act as resources

Slide 106

Participatory Process

No single right way
Commitment to the principles of participation and inclusion
- those closest to the situation have valuable and necessary information
Develop strategies to build trust and honest communication
-information sharing and decision-making
-create “even ground”

Slide 107

Benefits of Participatory Evaluation

Increased buy-in, less resistance
Results are more likely to be used
Increased sustainability
Increased credibility of results
More flexibility in approaches
Can be systematic way of learning from experience

Slide 108

Challenges of Participatory Evaluation

Concern that evaluation will not be objective
Those closest to the intervention may not be able to see what is actually happening if it is not what they expect
Participants may be fearful of raising negative views
Time consuming
Clarifying roles, responsibilities, and process
Skilled facilitation is required
Just-in-time training