All evaluation projects are guided by the Evaluation Template. Organized around 4 questions, the template provides a simple yet powerful tool for working with clients and organizations. Figure 1 provides an overview of the Evaluation Template.
Question 1: Is the program in the implementation stage or operational stage?
Implementation Phase (Also: Start-up or Investment Phase):
- Program start based on original funding
- System is oriented towards generating output
- Procedures evolving
- Strategy/plan focused on practical considerations of being operational
- Service delivery may be inconsistent, but the array of service options is evolving
- Evaluation focuses on success of implementation process and output indicators
Operational Phase (Also: Maintenance, Continuation, or Service-Delivery Phase):
- Program maturing based on continuation funding
- System is oriented towards delivering impact
- Procedures stable/routine/institutionalized
- Strategy/plan focused on issues of sustainability
- Service delivery routine and array of services largely complete
- Evaluation focuses on operational efficiency and impact indicators
Question 2: What was done?
Evaluation Component: Process Evaluation
Purpose & Problems*
- What was the purpose of the program?
- What specific problem(s) was/were addressed?
- How was that/were those problem(s) identified?
Context & Partners*
- Who was involved in the planning of the program?
- Who should have been involved in the planning of the program?
- Who was involved in the implementation of the program?
- Who should have been involved in the implementation of the program?
- What were the roles of those who participated in the planning and implementation?
- Were these roles consistent with the capacity of the participating individuals or agencies?
- Was the response strategy used reasonable?
- Was the response strategy relevant for the problem(s) addressed?
- Did the response strategy replicate another program?
- How closely did it follow the original program?
- Was the replication cognizant of the local context for the program?
- Was the response strategy appropriate to the available resources, including data, money, personnel, legal authority, and inter-agency cooperation?
- Did the agency/agencies believe the response strategy would achieve its purpose?
- Was the response strategy implemented as it was designed?
- Was the response strategy timed and funded to optimize “treatment dosage” levels?
Obstacles & Changes
- What obstacles or problems were encountered?
- How were those obstacles or problems met?
- Was the strategy to meet the obstacles or problems successful?
- Are there other strategies that could have been used to meet those obstacles or problems?
- What changes occurred in purpose, problem identification, partners, or response strategies over the course of the program?
- Why did the changes occur?
- What was the response to those changes?
- Was data collection part of the program design?
- Does the data collected reliably reflect what occurred?
- Can data be accessed in a timely manner?
Issues to Consider:
- Researcher’s own biases
- Ethical implications of program design and implementation
- Use of a control group
Question 3: How much did it cost?
Evaluation Component: Cost/Benefit Evaluation
- How much money was spent?
- What was the money spent on? (Administration, Program delivery, Infrastructure, Outsourcing, Pass-through funds, etc.)
- What benefits are perceived by partners from the money spent?*
- What benefits are actually observed from the money spent?
- What indicators will be used to determine benefits?* (The indicators will be different depending on the arena in which a particular program operates.)
Direct Economic Benefits: benefits derived from solid data about actual cost savings or capital maximization
Indirect Economic Benefits: benefits that tend to be more speculative, for example:
- Increased earning capacity
- Decreased expenses
- Increased quality of life
- Better family dynamics
- Savings to public sector
- Benefits to private sector
- Community involvement
- Community improvement
Issues to Consider:
- Indirect economic benefits are hard to calculate
- Funder should be clear on benefits expected from investment
- Distinguishing between direct and indirect economic benefits enables policy makers to better gauge the value of a program’s impact
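To make that distinction concrete, the comparison of direct and indirect benefits against total cost can be sketched in a few lines of Python. The `benefit_cost_summary` helper and all figures and category names below are purely illustrative assumptions, not data from any actual program:

```python
# Illustrative sketch only: separating direct from indirect economic
# benefits when gauging a program's value. All numbers are invented.

def benefit_cost_summary(costs, direct_benefits, indirect_benefits):
    """Return total cost plus benefit-cost ratios computed with and
    without the more speculative indirect benefits."""
    total_cost = sum(costs.values())
    direct = sum(direct_benefits.values())
    indirect = sum(indirect_benefits.values())
    return {
        "total_cost": total_cost,
        "direct_ratio": direct / total_cost,           # conservative view
        "combined_ratio": (direct + indirect) / total_cost,
    }

# Hypothetical spending categories (mirroring the list above) and benefits.
costs = {"administration": 40_000, "program_delivery": 120_000,
         "infrastructure": 25_000, "outsourcing": 15_000}
direct_benefits = {"documented_cost_savings": 150_000}
indirect_benefits = {"savings_to_public_sector": 60_000,
                     "increased_earning_capacity": 90_000}

summary = benefit_cost_summary(costs, direct_benefits, indirect_benefits)
print(summary)  # direct_ratio 0.75 vs combined_ratio 1.5
```

Reporting both ratios lets policy makers see how much of a program's apparent value rests on speculative indirect benefits rather than documented savings.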
Question 4: What difference did it make?
Evaluation Component: Impact Evaluation
- What were the direct results of the program?
- Did the direct results match the desired outcomes?
- What enhancements might make the response strategy more effective either in this or future programs?
- Do statistical analyses indicate that the response strategy had a significant effect on the problem(s) addressed?
- Are contextual factors such as politics, geography, demographics, culture, etc., a limitation to observable impacts?
- Are there methodological limitations to the impact evaluation?
Issues to Consider:
- Difficulty of measuring non-quantifiable factors
- Use of a control group
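Where a control group is available, one common statistical check (by no means the only one) is a two-sample comparison of outcome means. The sketch below computes Welch's t statistic from invented, illustrative outcome scores; a real impact evaluation would pair this with degrees of freedom, a p-value, and effect-size measures:

```python
import statistics

def welch_t(treatment, control):
    """Welch's t statistic for the difference in group means
    (does not assume equal variances)."""
    m1, m2 = statistics.mean(treatment), statistics.mean(control)
    v1, v2 = statistics.variance(treatment), statistics.variance(control)
    n1, n2 = len(treatment), len(control)
    standard_error = (v1 / n1 + v2 / n2) ** 0.5
    return (m1 - m2) / standard_error

# Invented outcome scores (e.g., a problem-behavior index where lower
# is better) for program participants versus a control group.
treatment = [12, 10, 9, 11, 8]
control = [15, 14, 13, 16, 12]

t_stat = welch_t(treatment, control)
print(t_stat)  # negative: treatment group scored lower than control
```

A large-magnitude t statistic suggests the response strategy had a significant effect, but the contextual and methodological limitations listed above still apply to any such analysis.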