Program Evaluation

In this lesson, you will learn about the importance of program evaluation as a source for determining which services to children and families are working well and which aspects of the program need improvement. All staff and families may participate in the program evaluation. Training and curriculum specialists and program managers serve as the leadership team that guides the collection and analysis of information needed to conduct a thorough program evaluation.

Objectives
  • To learn about the importance of working together with staff and families to evaluate the program.
  • To learn about the importance of using program evaluation data to improve program outcomes for children and families.
  • To learn about the importance of modeling continuous quality improvement so that staff focus on program goals and outcomes.

Learn

Program Evaluation

In this lesson, you will learn about the importance of program evaluation as a source for determining which services to children and families are working well and which aspects of the program need improvement. All staff and families may participate in the program evaluation. Program evaluation is typically completed on an annual basis to determine whether the program is effectively meeting its goals. The findings of the program evaluation are typically shared with all of the program's stakeholders: families, staff, and, in some cases, the community. In military child and youth programs, these reports are also shared with military leadership or higher headquarters within your Service.

The management team works with staff and families to collect relevant data, analyze it, and use the findings to make changes that will improve overall program quality. They also use this information to create a formal report or description of the program for those outside the program who are interested partners (families, advisory board members, funders, inspection teams, accrediting bodies, etc.). Program leaders will want to use both formative and summative evaluation methods when organizing the program evaluation process.

Formative Evaluation

Formative evaluation is used during the daily operation of the program to examine ongoing processes and to help improve the program. A formative evaluation examines day-to-day successes and deficits. It is often used when programs are just starting or a new policy has just been put into effect. Formative evaluation provides a fast feedback loop to influence program decisions and make necessary changes.

Formative Example: Tori, the program's T&CS, created a new schedule for herself in which she observed two of the toddler classrooms each morning during their snack routine and left a brief note summarizing her observation in each teacher's staff mailbox. Both toddler rooms were starting a new snack-time routine, and the teachers wanted Tori to observe snack several times to provide feedback about how it was working. At the end of two weeks, Tori met with the toddler teachers to discuss her observations and to help them decide whether the new routine was working well for the children and staff.

Summative Evaluation

A summative evaluation is typically conducted at the end of a program or after a program has been in existence for some time. The summative evaluation is often shared with those outside the program (advisory board, interagency council, funders) to provide data about the effectiveness of the program.

Summative Example: Each July, Maria, the program manager, writes an evaluation report. In it, she summarizes child and youth data, parent event data, child and parent satisfaction data, and the overall program budget report. The report also highlights progress on the program's goals (e.g., increasing the number of volunteers in the after-school program). A summative evaluation report may be of interest to anyone affiliated with the program, but it is also important to individuals outside the program.

Although a summative evaluation report may only be written and shared once per year, the information that is used to create the report is collected across the year at many different points and includes the input of many different stakeholders (e.g., staff, parents, coaches, advisory board members).

Evaluations may be conducted by a staff member who is internal to the program (e.g., the T&CS or program manager), or the evaluation may be conducted by someone who is external to the program (e.g., a paid consultant, a university professor with expertise in child care program management, an assessor for accreditation, or a state licensing representative with expertise in conducting program evaluations). The program manager is typically responsible for conducting summative program evaluations, or for coordinating and working in collaboration with external program evaluators; however, T&Cs may assist at the request of the manager (e.g., by helping to gather relevant data).

Planning the Program Evaluation

When planning a program evaluation, the main question is "What is the purpose of the evaluation?"

  • Do you want to demonstrate why the program needs more after-care staff?
  • Is the purpose to show children's growth and development?
  • Is the evaluation being done to gather feedback from others about the program?

The purpose of the evaluation can be different for different stakeholders and will drive the types of questions the evaluation will answer:

  • Are we (the leadership team and the staff) doing what we are saying we're doing?
  • What is the program's impact on children and their families?
  • Do all families feel welcome and part of the program community?
  • Does the program comply with DoD certification requirements?

There are typically three major areas of consideration in child care and youth program evaluations:

  1. Program quality
  2. Program outcomes (for children and youth)
  3. Consumer and stakeholder satisfaction with the program

It is important that those participating in an evaluation understand the purpose of the evaluation and how their responses will be reported. Planning and designing an evaluation includes creating a matrix in which the major aspects of the evaluation are organized. If the evaluation is not carefully designed, the results may not be accurate or useful to report. The resources in the Explore section will help you more deeply understand the program evaluation process, including what documentation or data may be most appropriate based on the questions you seek to answer, as well as different classroom- or program-quality assessment tools (e.g., environment rating scales) that can offer additional data about the program.

Data used to answer the evaluation questions may be collected through online surveys, focus groups, one-on-one interviews, use of classroom- or program-quality assessment tools, and document reviews (e.g., teachers' written lesson plans, children's progress reports or assessment data, budget summaries, program supply orders).

Sharing and Using Program Evaluation Data

When doing a formative or summative evaluation, it is important to plan how the results will be reported and used. Parents will want to know whether the time and attention they spend responding to survey questions result in higher-quality after-school programming for their children. Program managers must decide the best way to share the evaluation results with those who participated in the data collection. Results may be posted on the program's website, in the newsletter, and in other places where parents may access them. Hard copies may be made available to staff and advisory board members. It is also important for the program manager to discuss the program evaluation findings with stakeholders. Discussions may take place during a regularly scheduled staff meeting or advisory board meeting. How the evaluation results will be used to improve the program should be clear to stakeholders, and staff and other relevant stakeholders should know how progress will be monitored.

Some programs are asked to share program evaluation data during interagency council meetings or with other groups. For example, if the after-care staff and the program manager apply for and receive a substantial grant from an organization (e.g., the Rotary Club) to enhance the science activities offered during the after-care program, then the staff and program manager have a responsibility to evaluate the activities and students' learning and to report how the money was spent. Again, choosing the most relevant evaluation questions is critical to planning the evaluation, data collection methods, and timeline. The evaluation report should be written, shared, and discussed with the funders.

Watch the video below to hear how T&Cs and program managers use program evaluations to document successes and guide changes for future improvement.

Program Management: Program Evaluation

Program evaluations provide important information which can be used to document successes and guide changes for future improvement.

Model an Approach to Continuous Quality Improvement

The staff looks to the program manager and T&Cs for leadership. As the leadership team, they maintain a continuous quality improvement outlook. Programs that are committed to consciously improving their services to children, youth, and families demonstrate an openness to learning new knowledge and skills. They have a clear understanding of, and commitment to, quality. They know what a high-quality program should provide for children, youth, and families. They are aware of what the program is doing well and what could be done better. This awareness is based on observation, data collection, and use of valid and reliable program evaluation tools.

A sample continuous quality improvement plan is provided at the end of the Learn section (see ISBE Continuous Quality Improvement Plan-ECERS-R). This sample plan is based on a preschool program observation using the Early Childhood Environment Rating Scale-Revised (ECERS-R). This sample program chose to focus on improving all aspects of its program environment that fell below a "5" on the ECERS-R scale. The specific areas needing improvement are listed in the plan. The sample plan includes a timeline, activities and professional development, and the person responsible. Writing a continuous quality improvement plan is a way for the leadership team to engage staff in sharing a vision of how they all may work together to attain higher-quality services for children and their families.

Your Service, licensing agency, or accrediting body may have a particular format in which it wants to receive your continuous program improvement plan (which is one aspect of program evaluation). At the end of the Learn section, we provide an additional blank example of how Ohio asks child care and youth programs to structure their continuous improvement plans based on the state's quality rating and improvement system. Although each agency, state, or Service may differ slightly in how it wants the information formatted or delivered, they are typically seeking the same kinds of information:

  1. What are your goals for the program?
    1. How did you derive these goals? In other words, what is the current state of the program and how do you know this? What tools, documentation or data did you use to understand how your program currently functions?
  2. What are the specific action items to help address these goals?
    1. What activities or professional development training will help you meet these goals?
  3. What resources do you need?
  4. Who is responsible for the particular action items listed? For many goals, there may be multiple people responsible: direct care staff members may be responsible for enacting new practices in their classrooms, while T&Cs are responsible for providing training related to the new practice, along with observation and coaching to ensure it is implemented appropriately.
  5. What is the timeline for implementation of the specific action items?
  6. How will you know your goal was achieved?
    1. You need to anticipate the tools, documents, and data you will need to monitor progress toward your goal. These may be similar to the forms of information you used to identify the goals in the first place. For example, in the ISBE example from the Learn section, the program may re-administer the ECERS-R in four months to see if its scores in the identified areas are now above 5. However, they may also be different. For example, if you develop a goal to offer more family educational workshops based on feedback from family surveys, you may use the number of families who attend the workshops as one point of data to monitor progress.

Continuous quality improvement entails:

  • A commitment to lifelong learning
  • Program changes that result in better-quality caregiving
  • Self-reflection
  • Shared leadership
  • Embedded job supports

When T&Cs and program managers consistently articulate and demonstrate a commitment to continuous quality improvement, they model for all staff what it looks like to strive for excellence. The leadership team is always ready to celebrate the program's successes, but it also keeps striving to reach the next goal.

Explore

There are a variety of program evaluations available through websites and publishers. You will want to explore these resources to become familiar with program evaluation questions, documentation, data, and summaries to share with staff and families. Many states have a quality rating and improvement system (QRIS) to improve child care services. In some states, a child care program's QRIS rating may be tied to funding, licensing, etc. You may want to do a web search to see if your state has a QRIS.

For more information about QRIS, program accreditation, and after-school program evaluation tools, follow these links: https://www.naeyc.org/our-work/public-policy-advocacy/research-and-evaluation and http://www.naeyc.org/files/naeyc/file/positions/CAPEexpand.pdf.

Your service branch may offer online trainings for program managers and T&Cs to learn more about how to plan for continuous quality improvement within a child and youth program.

Apply

The families you serve are major stakeholders in any program evaluation. Some programs conduct written surveys at least once per year (these can be paper or electronic surveys). Some program managers like to interview parents when their children leave the program (an exit interview) to learn how the program can improve in its mission to create a caring community among the children, staff, and families.

First, try writing down the questions or writing prompts that you would include on a family feedback survey. For ideas, you can look at the following two samples of family surveys from the book Winning Ways for Early Childhood Professionals: Partnering with families (Schweikert, 2012).

Next, try writing a brief continuous improvement plan based on these surveys. Imagine that when reviewing the survey results (in the second example with the 1-5 scale), you notice that many families consistently rate items 9 and 10 on the survey (i.e., “I feel informed about how my child is doing in the program,” and “I know about program activities or events.”) at a 3 or below. What would you do with this information? What goal(s) would you set, and what specific action items would you write? What resources might you need to address this? Who are the responsible people? What is your timeline? And, finally, how will you assess whether you have moved forward in meeting this goal?

Glossary

Continuous quality improvement:
A system that seeks to improve the provision of services with an emphasis on future results

Demonstrate

True or false? Program managers do not share the results of program evaluations with parents.
A continuous quality improvement plan consists of…
When planning a program evaluation, the main question to ask yourself is...
References & Resources

Carran, D.T. (2009). Early childhood program evaluation. In J. M. Taylor, J. R. McGowan, & T. Linder (Eds.), The Program Administrator's Guide to Early Childhood Special Education (pp. 307-335). Baltimore: Paul H. Brookes Publishing Co.

Illinois State Board of Education, Early Childhood Division. Continuous Quality Improvement Plan for the Early Childhood Environment Rating Scale-Revised (ECERS-R). Retrieved from https://www.isbe.net/Documents/cqip-ecers-sample-plan.pdf

Schweikert, G. (2012). Winning Ways for Early Childhood Professionals: Partnering with families. St. Paul, MN: Redleaf Press.