Public Health Workforce Training

Constraints

ShapeTracker Intervention Stakeholders

  • Commissioner of Middleton County (MC)
  • Middleton County (MC) Health Department (HD) ShapeTracker (ST) team
  • MC HD Information Technology staff
  • MC HD marketing/communications staff
  • Other MC HD staff
  • CDC
  • Group that developed ST for CDC
  • MC community members
  • Wellness advocates in MC
  • Obesity prevention programs in MC
  • Healthy eating related resources in MC
  • Exercise resources in MC
  • Smartphone providers in MC
  • California communities that implemented ST
  • Smartphone companies
  • MC government

Logic Model

Inputs:
  • ShapeTracker app
  • Funding from HD and CDC
  • Team of experienced staff
  • High smartphone use in MC community
  • Marketing & communication resources in HD
  • IT resources in HD
  • Local obesity prevention resources in MC

Activities:
  • Mobilize local resources and customize ST for MC
  • Market ST in MC
  • Keep app updated and maintain content
  • Provide technical assistance to users

Outputs:
  • # of times ST app downloaded in MC
  • # of MC residents actively using ST
  • Amount of use for different ST functions
  • User satisfaction with ST
  • # of friends referred to ST
  • # of MC residents actively using ST long-term

Short-term Outcomes:
  • Decrease perceived barriers to healthy eating and physical activity
  • Increase self-monitoring of healthy eating and physical activity
  • Increase self-efficacy for healthy eating and physical activity
  • Increase social support for healthy eating and physical activity

Intermediate Outcomes:
  • Increase healthy eating in MC
  • Increase physical activity in MC

Long-term Outcomes:
  • Reduce obesity in MC
  • Reduce obesity-related health problems in MC

ShapeTracker Intervention Evaluation Questions

Process Questions
  1. Was a broad range of local resources successfully engaged in the project?
  2. Was ST successfully disseminated to a broad segment of the MC community?
  3. What proportion of MC residents have:
    • downloaded the app?
    • become active users of the app?
    • become active long-term users of the app?
  4. For those who stopped using ST, why did they discontinue?
  5. How much are ST users using different functions in ST?
  6. How satisfied are MC residents with the ST app and its specific features?
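Several of the process questions above, particularly Question 3, reduce to simple proportions of the county population. As a minimal sketch, the calculation might look like the following; all counts and the population figure are invented for illustration, and real values would come from MC HD records and ShapeTracker usage logs.

```python
# Hypothetical sketch of the process indicators in Question 3.
# Every number below is invented for illustration only.

mc_population = 50_000     # hypothetical MC resident population
downloads = 12_000         # hypothetical # of ST app downloads in MC
active_users = 7_500       # hypothetical # of residents actively using ST
long_term_users = 3_000    # hypothetical # of long-term active users

def proportion(count, total):
    """Return a count as a percentage of the total, to one decimal place."""
    return round(100 * count / total, 1)

print(f"Downloaded the app:     {proportion(downloads, mc_population)}%")
print(f"Became active users:    {proportion(active_users, mc_population)}%")
print(f"Long-term active users: {proportion(long_term_users, mc_population)}%")
```

With these made-up numbers, the indicators would be 24.0%, 15.0%, and 6.0% of residents, respectively; the point is only that each process question maps to a clearly defined numerator and denominator.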
Outcome Questions
  1. Does use of ST increase:
    • self-monitoring of healthy eating and physical activity?
    • self-efficacy for healthy eating and physical activity?
    • social support for healthy eating and physical activity?
  2. Does ST decrease perceived barriers to healthy eating and physical activity?
  3. Does use of ST increase healthy eating and physical activity among county residents?
  4. Does use of ST lead to lower BMI among those who are obese or overweight?
  5. Does use of ST help those with normal BMI avoid becoming overweight?

These are the evaluation questions we have chosen to address in our evaluation plan.

Budget, Time, Data and Political Constraints

Most evaluations are conducted in environments with serious budget, time, data, and political constraints
(Bamberger, Rugh & Mabry, 2006).

This means that evaluators are constrained by:

  • Timelines and time pressures
  • Limited budgets and resources
  • Not having a control group, comparison group, and/or baseline data available
  • Challenging expectations from stakeholders

Below are some general strategies for addressing these constraints suggested by Bamberger and colleagues. Note that some of them will increase threats to scientific validity of the evaluation.

  • Simplify the design (Budget, Time)
  • Clarify information needs and cut non-essentials (Budget, Time)
  • Look for useful secondary data (Budget, Time)
  • Reduce sample size (Budget, Time)
  • Use economical data collection methods (Budget)
  • Use rapid data collection methods (Time)
  • Hire more staff (Time)
  • Reconstruct baseline data if not available (Data)
  • Reconstruct comparison groups/data if not available (Data)
  • Understand stakeholder perspectives (Political)
  • Use a participatory process (Political)
  • Provide frequent feedback to stakeholders (Political)

For example, if the ShapeTracker evaluation team faced severe time and/or budget constraints, the only feasible design might be a pre-post assessment of ShapeTracker users in Middleton County. This would sacrifice scientific rigor relative to the comparison group design we selected above, limiting the validity of the findings.
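The trade-off can be made concrete with hypothetical numbers: a one-group pre-post design measures only the change among ShapeTracker users, while a comparison group design subtracts out the change that would likely have happened anyway. The sketch below uses invented values standing in for, say, mean weekly minutes of physical activity.

```python
# Hypothetical illustration of why a comparison group matters.
# All values are invented for illustration only.

st_users_pre, st_users_post = 120.0, 150.0      # ShapeTracker users
comparison_pre, comparison_post = 118.0, 140.0  # comparison community

# One-group pre-post design: attributes the entire change to ST.
pre_post_change = st_users_post - st_users_pre

# Comparison group design: subtracts the background trend observed in
# the comparison community (a simple difference-in-differences).
comparison_change = comparison_post - comparison_pre
attributable_change = pre_post_change - comparison_change

print(f"Pre-post change among ST users: {pre_post_change:.1f}")
print(f"Change in comparison community: {comparison_change:.1f}")
print(f"Change attributable to ST:      {attributable_change:.1f}")
```

With these made-up numbers, the pre-post design would credit ShapeTracker with a change of 30.0, when 22.0 of that change also occurred in the comparison community, leaving only 8.0 plausibly attributable to the intervention.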

Let’s say the Commissioner is questioning the comparison group design we selected, saying that it will be too expensive and take too long to complete.

What would you do and why?

Question 1:

How would you adjust the evaluation plan? - or -
How would you make the argument for keeping the evaluation plan we selected?

Please write your answer in 1-3 sentences.

Feedback:

Depending on how strong the pressure from the Commissioner is, you may have to adjust the evaluation plan. There are many ways to simplify the design, such as using a one-group pre-post design with ShapeTracker users only.

But you may want to remind the Commissioner why a strong evaluation of ShapeTracker is needed. Many other stakeholders may question the investment in ShapeTracker, and the Commissioner will need rigorous evaluation data to justify it. You may also need to argue that pre-post data from a single group has limited validity, because a change in an outcome cannot be confidently attributed to the intervention.