
PROGRAM EVALUATION

Reflection:

Program evaluation was one of the classes I was most nervous about taking. I had no idea what it would cover, how I would use it, or what jobs called for it. All I knew for sure was that I did not want to take it, use it, or think about it. At its simplest, evaluation is a critique. I assumed this class would teach us how to critique programs in public health, but it has also helped me as I navigated changing classes during a pandemic.


A program evaluation involves much more than a simple critique. You must use both quantitative and qualitative data to make sure nothing is overlooked. Stakeholders who deliver the program and those who receive it all play different roles, and it is up to you to identify them. Creating a logic model helps ideas flow and lays out the program's timeline. Essentially, a program evaluation asks whether a program is running efficiently and, if not, how it can be improved. This is done by assessing and documenting the program's implementation, its outcomes, and the efficiency and cost-effectiveness of its activities.


Program evaluation has been put to heavy use in the past few weeks of my graduate teaching assistantship as the University switched from on-campus to online learning. In a matter of days, I had to help move two undergraduate courses and one graduate-level class into an entirely online format and adjust three months' worth of curriculum and assignments. I found myself making timelines of how assignments could flow better, adjusting deadlines, providing new resources, and weighing student performance more heavily. Flexibility and constant monitoring of this new online format have been the largest factors in making sure students can successfully complete the courses. With a pandemic unfolding around us, being expected to also stay on top of coursework is a daunting task. Focus groups were established with students to gather feedback on how the courses needed to change. Assessments of grades and missing assignments have been crucial in setting a pace for the material. You cannot create an online course out of the blue and stay rigid about how it will operate. You must evaluate it at every step and turn to ensure the users have the most success and the courses themselves succeed.


Background:

I chose to do my program evaluation on the HEALTHY trial and childhood obesity. Type 2 diabetes has become increasingly common in the pediatric and adolescent population. Prior to 1994, type 2 diabetes was unusual in children; some clinics now report that up to one-third of children and youth presenting with diabetes between the ages of 10 and 19 years have type 2 diabetes (Dean, 1992). Type 2 diabetes in youth, as in adults, results from a combination of insulin resistance and relative beta cell failure, and is seen almost exclusively in those with excess adipose tissue and obesity. Insulin resistance is most often the first abnormality to develop, driven by a combination of genetic factors and environmental triggers. The rising prevalence creates additional health care costs and demands on services (McCarthy, 2010).


The HEALTHY trial was created as a primary prevention approach that tackled risk factors for type 2 diabetes in youth, with a primary focus on obesity. Over 6,000 students across 42 middle schools in the United States were involved in the study. The schools were split evenly, with half receiving an integrated intervention program that addressed the school food environment, lifestyle behavior, physical education, and promotional messaging, and that was more comprehensive and better funded than previous attempts. The primary objective was to reduce the percentage of overweight and obese youth in the intervention schools.

 

Methods:

The following quantitative and qualitative questions were asked:

QUALITATIVE

  • How likely does the individual feel they can maintain the healthy habits learned?

  • What limitations at home does the individual face that hinder continuing these healthier habits?

  • What emotions does the individual feel about their health and wellness at the beginning of the program compared to the end?

QUANTITATIVE

  • How many schools saw a decrease in BMI after the program?

  • How many minutes of exercise, on average, did individuals continue to do after the program ended?

  • What rate of satisfaction did individuals report after the program was completed?

​

Key stakeholders were identified as follows:

(Stakeholder chart from the original page not reproduced in this text version.)
An evaluation plan takes a broad approach to gauge the extent to which interventions are delivered and received as intended. It assesses fidelity of intervention delivery, which is the extent to which the intervention is delivered as designed; intervention dose, which describes how much of the intended intervention is delivered; and reach, which is the proportion of intended recipients who actually participate in the intervention. By monitoring the delivery of key intervention components and providing timely feedback to the intervention staff, the data from the evaluation plan can be used to help ensure fidelity of delivery. Monitoring and providing feedback on the intervention helps ensure satisfactory implementation of the intervention components, while reach provides information on the ability to penetrate the intervention targets. In combination, process evaluation data can be used to help explain the outcome of the study. Process evaluation uses both quantitative and qualitative methods, which can include structured observations, semi-structured interviews, focus groups, questionnaires, and logs.
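Dose and reach, as defined above, reduce to simple proportions. The short sketch below illustrates the arithmetic; every number in it is hypothetical and is not drawn from the HEALTHY trial's data:

```python
# Illustrative sketch only -- hypothetical numbers, not HEALTHY trial data.
# Dose  = proportion of the intended intervention actually delivered.
# Reach = proportion of intended recipients who actually participate.

planned_sessions = 40          # sessions the protocol called for (assumed)
delivered_sessions = 34        # sessions staff actually ran (assumed)
eligible_students = 150        # students targeted by the intervention (assumed)
participating_students = 120   # students who actually took part (assumed)

dose = delivered_sessions / planned_sessions
reach = participating_students / eligible_students

print(f"dose:  {dose:.0%}")    # → dose:  85%
print(f"reach: {reach:.0%}")   # → reach: 80%
```

Tracking these two percentages over time is what lets evaluators feed timely warnings back to intervention staff when delivery drifts from the plan.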

Findings:

Data are collected each semester for a total of three years, from the 6th grade through the 8th grade. The first three sections focus on the students, with quantitative data collected at the beginning and end of the entire program. The next section covers budget collection, followed by three sections of individual intervention collections. The final section covers the evaluation data collection.

The program’s goals will be answered using these specific data collection methods for each intervention area:
Physical education

  • structured observation of participating PE teachers in class

  • % of students in the grade who are participating in the study PE classes

  • interview with the coordinator of physical activity about PE intervention delivery

Nutrition/food environment

  • structured observation of changes that are occurring in the total school food service environment

  • interview with the research dietitian about nutrition/food environment intervention delivery

Behavior

  • structured observation of in-class FLASH sessions

  • % of students who are participating in FLASH

  • proportion of FLASH delivered

  • interview with health promotion coordinator about behavior intervention delivery

 Overall program

  • interview with each school administrator about progress of the program

  
Comparisons are made between students recruited in 6th grade who are still present in the cohort in 8th grade and those who are no longer in the cohort in 8th grade. Another comparison is between students in the 8th grade cohort and students newly enrolled in 8th grade solely for end-of-study data collection. Odds ratios and 95% confidence intervals are obtained from generalized estimating equation (GEE) models to analyze the differences between intervention and control schools. GEE provides a method for analyzing data when the responses are correlated, whether the data are continuous, discrete, or counts. The GEE models allow for fixed covariates like gender and race and time-varying covariates like Tanner stage and waist circumference. Adjustment for individual-level and cluster-level covariates, such as baseline values, is also included. Clustering of observations within schools is accounted for by fitting a random effect for school, and correlation among observations within a school is handled through the choice of covariance structure.

 


Recommendations:

To help improve this program, I would recommend more focus on physical activity and nutrition outside of school. Students should leave the program with more insight into leading healthy, wellness-focused lives beyond the walls of an educational building.
Benefits:

  • Better breakfast and dinner choices leading to healthier diets

  • After school resources that utilize physical activity

  • Less reliance on technology to keep children occupied
