Planning Program Evaluation

A program evaluation starts with a proposal, which describes the program to be evaluated, the purpose and scope of the evaluation, the questions that will guide the investigation, the research design, the instruments of data collection, the data-analysis procedures, an evaluation management plan, and a budget.

Description of the Program

A program evaluation plan must describe the program to be evaluated in as much detail as possible. The description presents the justification for starting the program; the goals and objectives of the program; the activities and strategies the program uses; the target population; and the anticipated short-, medium-, and long-term outcomes of the program.

Purpose and Scope

The purpose and scope explain the justification for the program evaluation and which facets of the program the evaluation will cover (Box 19.2). For example, the purpose and scope should answer questions such as

- Will the evaluation assist in future program planning and provide information to stakeholders?

- Will it judge the overall value, worth, and merit of the program for participants?

- Will it determine whether program goals or objectives are being met, and to what extent, for the stakeholders?

- Will it support, reinforce, and enhance the attainment of desired program outcomes?

Box 19.2 Sample Purpose and Scope Paragraph

The purpose of the evaluation is to support, reinforce, and enhance the attainment of the outcomes of the Parent-Connections program. Also, the evaluation aims to engage the key partners of the program in holistic thinking about best practices that can further help fulfill the vision to foster a safe, caring, and nurturing environment for children in our communities.

Evaluation Questions

The evaluation questions are the main questions that will guide the overall evaluation endeavor. What are the issues to be addressed by the evaluation? What are the specific questions to be answered by the evaluation? The evaluation questions should be related to the main goals (for a large program) or to the goals and objectives of the program (for a small program). The evaluation questions should not be confused with the survey questions. The survey questions are the specific questions asked of individual respondents or interviewees. The evaluation questions are the broad questions about the overall purpose or worth of the program that the evaluation process is trying to answer. A survey question is answered during an interview or once a respondent has completed a questionnaire; an evaluation question can be answered only after the evaluation process has been completed. One effective way to write good evaluation questions is to reformulate the goals of the program as questions. For example, if the goal of a program is to "increase the academic performance of participating children," the evaluation question can be "To what extent has the program contributed to improving the academic performance of participating children?" Note that the evaluation question focuses on assessing or measuring whether the program achieved its goal. See Box 19.3 for sample evaluation questions.

Research Design

The research design describes the form in which the evaluation data will be collected and the general plan for data collection and analysis. The data can be collected through qualitative methods (interviews, focus groups, program records) or quantitative methods (e.g., sampling-based surveys). The design for data collection and analysis can use one of four common types of design:

Box 19.3 Sample Evaluation Questions

To what extent did the Oshkosh Community Foundation after-school program contribute to increasing the academic performance of the participating children?

What are the impacts of the program on the integration of the refugee families into the Appleton community?

1. Exploratory: Usually conducted at the beginning of a program to identify the best approaches to service delivery and appropriate outcomes to measure. An exploratory design tends to focus on the situation and priorities.

2. Descriptive: Usually a formative evaluation used to assess whether a program is being implemented as planned. A descriptive design provides feedback that can help improve a program. A descriptive design focuses on input and output.

3. Experimental and quasi-experimental: Aims to find evidence of a causal or correlational relationship between the outputs and the outcomes of a program. An experimental or quasi-experimental design focuses on outcomes.

4. Mixed: A combination of the above-mentioned designs. A mixed approach may focus on a combination of input, output, and outcomes.

See Box 19.4 for a sample research design paragraph.

Instruments of Data Collection

The evaluation plan should specify which instruments of data collection (e.g., questionnaire, interview guide, program records) will be used in conducting the evaluation, as well as the type of information (activity indicator) that the instruments will help collect (Box 19.5). Instruments of data collection can include secondary data (e.g., program reports and program records), interview questionnaires, standardized tests, focus group guides, observation guides, photography, and others.

Box 19.4 Sample Research Design Paragraph

The evaluation will combine qualitative and quantitative methods of data collection and analysis, including teacher and staff interviews, student pretests/post-tests, and parent surveys, in order to assess the perceptions of participants about the programs, the benefits for the target population, and the attainment of short-term and midterm outcomes described in the program proposal.

Box 19.5 Sample Instruments of Data Collection

Data will be collected through:

An English proficiency test to assess the participants' progress in learning basic English.

A participant satisfaction questionnaire to assess the perception of the participants about their experience in the program.

Use of level-of-functioning scales before, during, and after each client is involved in the program.

Box 19.6 Sample Paragraph of Data Analysis

Descriptive statistics and paired-sample t-tests for pre- and post-test variables will indicate the level of change in students' science knowledge due to program participation. The program coordinator will compare data from 2012-2013 with information collected in 2011-2012 to explore whether the program has improved.

Data Analysis

Data analysis explains how the data will be analyzed and the criteria that will be used to perform the analysis. For example, do you plan to analyze the outputs, the outcomes, or both? Will you use statistical analysis? See Box 19.6 for a sample paragraph of data analysis.
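To make the kind of analysis described in Box 19.6 concrete, the short sketch below shows one way descriptive statistics and a paired-sample t-test could be run on pretest/post-test scores. It is a minimal illustration in Python, assuming the scores are stored in two simple lists; the variable names and sample values are hypothetical and are not drawn from any program mentioned in this chapter.

```python
# Minimal sketch: descriptive statistics and a paired-sample t-test
# for pretest/post-test scores. All values below are hypothetical.
from statistics import mean, stdev
from scipy import stats

# Hypothetical knowledge-test scores for the same ten students,
# measured before and after program participation
pretest = [62, 70, 55, 68, 74, 60, 66, 71, 58, 65]
posttest = [70, 75, 63, 72, 80, 68, 71, 78, 64, 73]

# Descriptive statistics summarize the level of change
print(f"Pretest mean:   {mean(pretest):.1f} (SD {stdev(pretest):.1f})")
print(f"Post-test mean: {mean(posttest):.1f} (SD {stdev(posttest):.1f})")

# Paired-sample t-test: were scores significantly higher
# after participation than before?
t_stat, p_value = stats.ttest_rel(posttest, pretest)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
```

A small p-value (e.g., below .05) would indicate that the post-test scores are significantly higher than the pretest scores; as the discussion of research designs above notes, however, only an experimental or quasi-experimental design would justify attributing that change to the program itself.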

Evaluation Management Plan

The evaluation management plan provides an overall plan for conducting the evaluation (Box 19.7). It describes a work plan detailing the evaluation questions, the indicators, and the data-collection procedures used (source, method, sample, timing, and the person responsible).

Program Evaluation Report

A program evaluation report is a document that summarizes the findings from a program evaluation (Box 19.8). An evaluation report includes most of the items in an evaluation plan, plus the findings and recommendations. It is important to underline that the evaluation plan describes what will be done during the evaluation process (future), whereas the evaluation report describes what was done (past).

Box 19.7 Template for an Evaluation Management Plan

| Questions | Indicators | Data Collection: Sources | Data Collection: Methods | Data Collection: Sample | Data Collection: Timing | Person Responsible |
| --- | --- | --- | --- | --- | --- | --- |
| What was the level of satisfaction of the participants? | Percentage of participants who were satisfied | Questionnaire | Survey/interview | 100% of participants | At the end of the last training session | External evaluator |

Box 19.8 Sample Evaluation Report Outline

- Executive summary

- Program description

- Evaluation methodology

- Findings

- Interpretation and reflection

- Recommendations

 