Evaluation helps you understand which practices work, in what settings, and why. Formally, evaluation involves the systematic collection of information about the activities and outcomes of practices in order to improve their effectiveness, inform decision making, and maintain accountability.

The MCH Navigator offers trainings based on the CDC’s Framework for Program Evaluation that can help you evaluate your practice.

Here you will find examples of practices from the Innovation Hub that demonstrate each of the six steps in the CDC’s Framework for Program Evaluation. You can also search our MCH Innovations Database to identify other practices that employ the evaluation framework.

Explore each step below:

Step 1: Engage Stakeholders

Definition: Stakeholders are those involved in or affected by the program, as well as those interested in the outcomes of the evaluation. Stakeholders can take part in developing the evaluation and should be informed of its results (CDC, 1999).

Innovation Hub Example: Using Barbershops to teach Period of PURPLE Crying/Infant Development. This practice engaged stakeholders, specifically barbershop owners and customers, in evaluating program delivery. The team tested the Period of PURPLE Crying curriculum to make sure it resonated with barbershop customers and worked with barbershop owners to see how delivering the curriculum could fit into their existing workflow.

Step 2: Describe the Program

Definition: As outlined by CDC, describing the program involves stating the need for the program, its expected effects, the activities and resources needed to conduct it, the stage it is in (early development or later phases), the setting or context in which it is delivered, and the program’s logic model.

Innovation Hub Example: North Carolina’s Innovative Approaches: Community Systems Building Grants for Children and Youth with Special Health Care Needs includes clear objectives and a model outlining the program’s theory of change. Objectives include: (1) to thoroughly examine the community system of care for CYSHCN; (2) to facilitate community identification of sustainable system changes and promising practices; and (3) to coordinate the implementation of these practices with agencies, providers, and families in the community.

Step 3: Focus the Evaluation Design

Definition: This step involves thoughtfully planning an evaluation strategy, including the evaluation’s purpose, users, and uses; the evaluation questions; the methods; and the partners in the evaluation process, along with the agreements necessary to work with those partners (CDC, 1999).

Innovation Hub Example: Pono Choices: A Culturally Responsive Teen Pregnancy and STI Prevention Program uses a carefully planned randomized controlled trial evaluation strategy, developed in collaboration with internal partners and external contractors.

Step 4: Gather Credible Evidence

Definition: What counts as credible evidence depends on your stakeholders and the audience for your evaluation results. Some stakeholders may see quantitative data as most valuable, while others may value stories of success, or both (CDC, 1999).

Innovation Hub Example: The South Carolina PASOS Program collected evaluation data from pre- and post-tests, as well as qualitative data in the form of women’s stories, to demonstrate its impact.

Step 5: Justify Conclusions

Definition: Justifying conclusions can include comparing evaluation results to standards, analyzing and synthesizing results, interpreting meaning from synthesized findings, and making judgments about a program’s success or need for improvement (CDC, 1999).

Innovation Hub Example: The Pathways Community HUB program uses evaluation results to draw conclusions about program performance and has put in place a continuous quality improvement process to implement changes. See the practice’s evaluation and lessons learned section.

Step 6: Ensure Use and Share Lessons Learned

Definition: Evaluation findings should be shared and translated into action. The five elements of ensuring use and sharing lessons are: (1) evaluation design, to understand who the target users of the results will be and how stakeholders will use the evaluation; (2) preparation, to consider how findings might affect the program (positively or negatively); (3) feedback from stakeholders and partners throughout; (4) follow-up, to ensure that evaluation results and lessons learned are not lost and that findings are not misused; and (5) dissemination of the results to stakeholders (CDC, 1999).

Innovation Hub Example: The Improving Oral Health Outcomes for Pregnant Women and Infants by Educating Home Visitors practice in Virginia conducted pilot studies and used positive evaluation findings from those pilots to spur policy change and expand training programs.