
VFC Evaluation

Importance of Evaluating Your VFC and AFIX Programs

Overview

Given the considerable funding and resources invested in implementing and managing programs such as VFC and AFIX at both the federal and awardee levels, it is important to ensure that these programs are managed appropriately and achieve their desired outcomes. Program evaluation is an essential organizational practice in public health and is an awardee requirement as stated in the 2008 Program Announcement and the Immunization Program Operations Manual. Evaluation is also an integral component of the VFC and AFIX programs. This module is intended to provide guidance to grantees about program evaluation.

Evaluation provides objective insight into a program and identifies opportunities to assess its impact, make improvements, or build program capacity. Evaluation enables programs to identify components that are achieving their desired effect as well as those that are not functioning adequately. In addition, program evaluation provides documentation for funding agencies such as state legislatures or the federal government that funds are being used appropriately and the desired effect or outcomes are being achieved.

For the VFC program, it is important to evaluate program processes and outcomes. Program processes that should be evaluated periodically include provider recruitment, enrollment, communications, provider satisfaction with the VFC program, provider storage and handling practices, site visits, and building collaborations with other organizations. The desired outcome of the VFC program is viable vaccine administered to eligible children.

The Assessment, Feedback, Incentive and eXchange of information (AFIX) program must be evaluated on a regular basis as well. AFIX is a natural companion to VFC. The desired outcome of the AFIX program is that viable vaccine is administered to children according to the recommended immunization schedule. The process to achieve this outcome is based on a Continuous Quality Improvement (CQI) strategy: identifying clinical and behavioral practices that affect immunization coverage levels among patients served by the office, providing feedback regarding the identified practices, and working with the staff to develop a realistic plan to change practices or behaviors. An important but frequently overlooked informal step is supporting the staff in making changes that will improve immunization coverage levels. This step requires the AFIX staff to make contact and follow up with the provider or clinic staff between AFIX visits.


Steps Involved in the Evaluation Process

Determine the Focus of the Evaluation

The evaluation should be specific and focused on questions that are most relevant and important. You will achieve the best evaluation focus by understanding where the questions fit into the overall program. A program evaluation can focus on the program implementation/process and/or effectiveness/outcome. The type of evaluation selected should be based in part on the maturity of the program that is being evaluated. A process/implementation evaluation focuses on answering questions regarding program planning and implementation. (For example, what resources were required to implement the program? What program activities were accomplished and implemented as planned?) Effectiveness/outcome evaluation measures the program's success in producing the desired outcomes or measures progress toward the program's objectives and goals, which may include short-term, intermediate, and long-term outcomes. As a program matures, outcome evaluation must be added to process measures.

Determine Who Should be Involved in the Evaluation Process

Stakeholders need to be part of the design, implementation and evaluation of any program for it to make a difference. Stakeholders are individuals or agencies that have a vested interest in the success of your programs. They may include those involved in program operations, those who participate in the program, or anyone with a particular interest or expertise in the program activity being evaluated. Stakeholders are much more likely to support the evaluation and act on the results and recommendations if they are involved in the evaluation process. Conversely, without stakeholder support, your evaluation may be ignored, criticized, or resisted.

Staff from within the immunization program, both field and management staff, should be included in the entire process to obtain a complete picture of the program. External stakeholders should include staff from the state Medicaid agency and other collaborating agencies or organizations. Planners should think comprehensively and identify partners at the outset in order to build momentum and assess their willingness to participate. Because the evaluation process is fluid, it is important to monitor communications and relationships throughout.

Apply Evaluation Framework

CDC has developed a framework to assist programs in the evaluation process. An immunization program may decide to use the following steps to conduct an evaluation of the VFC and AFIX programs or as part of their overall programmatic evaluation efforts.

  1. Establish the workgroup, its objectives, and timeframe to achieve the objectives.
  2. Describe the program to the workgroup. Present a complete picture of the program to the workgroup members. A comprehensive program description clarifies all the components and intended outcomes of the program, which helps focus the evaluation on the most central and important questions. A comprehensive description includes the following components: need, targets, outcomes, activities, outputs, resources/inputs, and the relationship of activities and outcomes. One way of developing this description is by using a logic model, which draws a "soup-to-nuts" picture of the program. Every activity and outcome related to the program is written on an individual piece of paper, note card, or adhesive note. Once all activities and outcomes are identified and documented, they are arranged in a way that depicts the causal relationship between activities and their intended outcomes. By using this method to describe the program, each workgroup member can visualize the complete program. The logic model can be created by working backwards from the intended outcomes, or forward by asking, “What happens next as a result of this activity?” (A simple electronic representation of a logic model is sketched after this list.)
  3. Focus the evaluation. What are the desired outcomes, and did the intended outcomes occur? At what cost were the activities implemented and the outcomes achieved? Identify and list all possible evaluation questions that could be asked, and then determine the key questions to be evaluated. If the list is long, reduce it to only one or two questions so that the evaluation process will be manageable.
  4. Determine what information is needed to answer the questions.
  5. Gather credible information for the evaluation. Determine how to collect the information and implement data collection.
  6. Justify conclusions. Review and interpret data/evidence to determine successes or failures of the program.
  7. Use the lessons learned from the evaluation. Use results in a meaningful way to improve program areas that need to be strengthened. As appropriate, promote and distribute the findings that show intended outcomes are being met and/or how changes were made to the program based on the evaluation. Evaluation results should be shared with all interested stakeholders and appropriate CDC staff.
  8. Repeat the process on a regular basis. Evaluation is an ongoing process. Depending upon the results of the initial evaluation, there may be a need to evaluate changes that are made to the program. Future evaluation activities might focus on aspects of the program that were not initially evaluated.
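
The logic model described in step 2 can also be captured in a simple data structure so the workgroup can review, revise, and share it electronically. The following Python sketch is a minimal, hypothetical illustration; the activity and outcome names are placeholders invented for the example, not prescribed program elements.

```python
# Minimal sketch of a program logic model as a directed graph.
# Each entry maps an activity or intermediate outcome to the
# outcome(s) it is intended to produce. All names are placeholders.
logic_model = {
    "Recruit and enroll VFC providers": ["Providers stock VFC vaccine"],
    "Train providers on storage and handling": ["Vaccine remains viable"],
    "Providers stock VFC vaccine": ["Viable vaccine administered to eligible children"],
    "Vaccine remains viable": ["Viable vaccine administered to eligible children"],
}

def trace_forward(node, depth=0):
    """Walk the model forward, asking "What happens next as a
    result of this activity?", and print the causal chain."""
    print("  " * depth + node)
    for outcome in logic_model.get(node, []):
        trace_forward(outcome, depth + 1)

for activity in ("Recruit and enroll VFC providers",
                 "Train providers on storage and handling"):
    trace_forward(activity)
```

Working backwards from the intended outcomes, as the text suggests, would simply mean inverting the mapping and tracing the chains in the opposite direction.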

Data Sources for Use in Evaluation

CDC requires grantees to gather information from each enrolled provider visited regarding various aspects of both the VFC and AFIX programs. Specific data from individual providers are aggregated and used to complete the VFC Management Survey that is submitted to CDC annually. Data from VFC and AFIX provider site visits, as well as the aggregated information in the VFC Management Survey, are excellent sources of evaluation data for an awardee. These and other sources of data, and their use in evaluating VFC and AFIX programs, are described below.

VFC Provider Site Visit Questionnaire

The VFC Provider Site Visit Questionnaire is completed during the VFC site visit. The information collected provides the awardee with a measurement of how successfully the VFC program is being implemented at an individual provider's office. Information from the VFC Provider Site Visit Questionnaire should be used to provide feedback to the office on how well it is meeting the requirements for participation in the VFC program. The high-priority questions in the questionnaire are easily identified by the red exclamation mark (!) in front of the question. If a provider's response to one of these questions is unacceptable, a corrective action plan must be developed and implemented to correct the situation. (For more information about the VFC Site Visit Questionnaire, see Module 9 of this manual.) CDC requires grantees to aggregate the responses to selected high-priority questions and report the results annually in the VFC Management Survey.

At the awardee level, the aggregated information on the high-priority questions can be analyzed to determine the strengths and weaknesses of an awardee's VFC program. The high-priority questions are categorized into those focusing on administrative practices and those having to do with vaccine storage and handling. The questions related to administrative practices include maintaining administrative fees within the maximum regional charge, conducting VFC eligibility screening, using the most current VISs, maintaining written policies on vaccine management, and conducting monthly physical inventories of VFC vaccine. The remaining high-priority questions focus on vaccine storage and handling practices. Areas of weakness can become the focus of quality improvement projects for the awardee. For example, if the aggregate data indicate that a significant portion of the providers who received a VFC site visit in a given year did not rotate their vaccine stock, specific educational interventions could be developed for providers on incorporating vaccine stock rotation into everyday office practices. Repeat site visits could be conducted at predetermined times after the intervention to evaluate its success.
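
As a rough illustration of how site-visit responses might be aggregated to flag areas of weakness, the Python sketch below tallies the share of visited providers with an unacceptable response to each high-priority question. The record layout and question keys are assumptions made for the example, not the actual questionnaire format.

```python
from collections import Counter

# Hypothetical site-visit records: one dict per visited provider,
# mapping high-priority question keys to True (acceptable response)
# or False (unacceptable response). Keys are illustrative only.
site_visits = [
    {"screens_eligibility": True,  "rotates_stock": False, "monthly_inventory": True},
    {"screens_eligibility": True,  "rotates_stock": False, "monthly_inventory": False},
    {"screens_eligibility": False, "rotates_stock": True,  "monthly_inventory": True},
]

failures = Counter()
for visit in site_visits:
    for question, acceptable in visit.items():
        if not acceptable:
            failures[question] += 1

# Report the percentage of visited providers failing each question;
# high percentages suggest candidates for quality improvement projects.
total = len(site_visits)
for question, count in failures.most_common():
    print(f"{question}: {count}/{total} providers ({100 * count / total:.0f}%) unacceptable")
```

In this invented data, stock rotation would surface as the weakest area, which is the kind of signal that could prompt the educational intervention and follow-up site visits described above.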

VFC Management Survey

The VFC Management Survey is a web-based questionnaire that grantees are required to complete and submit annually by March 1 for the previous calendar year's activities. The survey has several different sections. The first section requires information on the number of enrolled providers and how the VFC program operates within the awardee's geographic area. The second section of the Survey covers VFC/AFIX activities from the previous year, and the final section requires grantees to report on VFC accountability activities conducted in the previous calendar year. The aggregated information collected on the high-priority questions in the VFC questionnaire is reported in this section of the Survey. Thoughtful review of the answers to the questions in the first section can assist grantees with identifying program activities that would benefit from intervention and evaluation.

Awardee Progress Reports

Each awardee is required by the terms of its grant to submit progress reports to CDC on a routine basis; the notice of award provides the specific due dates for these reports. Progress reports can identify program areas that could benefit from more in-depth evaluation to determine why a given objective was or was not met.

VFC-Enrolled Providers

Providers enrolled in the VFC program can be some of the best sources of information for evaluating which aspects of the VFC program are or are not working optimally. One method for collecting provider opinions is a provider satisfaction survey; a generic example is in Appendix 7. If a survey is done, the awardee will need to determine which VFC providers to include, what questions to ask, how to conduct the survey, and how to analyze the results. In addition to evaluating operational components, provider surveys can be used to gather information on the educational needs of enrolled providers or their response to education already provided. Findings can help determine which quality improvement projects the awardee undertakes. For example, if survey results indicate that a significant portion of the providers who received education on completing a VFC accountability form are not completing the form because it is too complicated, a quality improvement project might be needed to simplify the form.
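
If survey responses are collected on a simple rating scale, a first-pass analysis can be as basic as tabulating the average rating per question. The sketch below assumes a hypothetical five-point satisfaction scale and illustrative question names; the actual survey content and analysis plan would be determined by the awardee.

```python
from statistics import mean

# Hypothetical survey responses: each dict maps a question to a
# rating from 1 (very dissatisfied) to 5 (very satisfied).
responses = [
    {"vaccine_ordering": 4, "accountability_form": 2, "program_communication": 5},
    {"vaccine_ordering": 5, "accountability_form": 1, "program_communication": 4},
    {"vaccine_ordering": 3, "accountability_form": 2, "program_communication": 4},
]

# Average rating per question; consistently low averages (here, for
# the accountability form) point to candidate improvement projects.
for question in sorted(responses[0]):
    avg = mean(r[question] for r in responses)
    print(f"{question}: mean rating {avg:.1f} across {len(responses)} providers")
```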

Evaluation Resources

Several resources are available to grantees requiring assistance or further information on program evaluation. The best place to start is with the CDC project officer assigned to each immunization awardee. The project officer can direct the awardee to specific individuals who can assist in developing, implementing, or interpreting evaluation measures for the VFC or AFIX programs.

CDC has a website to assist programs and individuals in learning more about program evaluation. The website includes tools to assist with the evaluation process as well as links to other evaluation websites.
