Public Health Workforce Training

Instrument Design and Sampling

ShapeTracker Intervention Stakeholders

  • Commissioner of Middleton County (MC)
  • Middleton County (MC) Health Department (HD) ShapeTracker (ST) team
  • MC HD Information Technology staff
  • MC HD marketing/communications staff
  • Other MC HD staff
  • CDC
  • Group that developed ST for CDC
  • MC community members
  • Wellness advocates in MC
  • Obesity prevention programs in MC
  • Healthy eating related resources in MC
  • Exercise resources in MC
  • Smartphone providers in MC
  • California communities that implemented ST
  • Smartphone companies
  • MC government

Logic Model

Inputs:
  • ShapeTracker app
  • Funding from HD and CDC
  • Team of experienced staff
  • High smartphone use in the MC community
  • Marketing & communication resources in HD
  • IT resources in HD
  • Local obesity prevention resources in MC

Activities:
  • Mobilize local resources and customize ST for MC
  • Market ST in MC
  • Keep the app updated and maintain content
  • Provide technical assistance to users

Outputs:
  • # of times the ST app is downloaded in MC
  • # of MC residents actively using ST
  • Amount of use for different ST functions
  • User satisfaction with ST
  • # of friends referred to ST
  • # of MC residents actively using ST long-term

Short-term Outcomes:
  • Decrease perceived barriers to healthy eating and physical activity
  • Increase self-monitoring of healthy eating and physical activity
  • Increase self-efficacy for healthy eating and physical activity
  • Increase social support for healthy eating and physical activity

Intermediate Outcomes:
  • Increase healthy eating in MC
  • Increase physical activity in MC

Long-term Outcomes:
  • Reduce obesity in MC
  • Reduce obesity-related health problems in MC

ShapeTracker Intervention Evaluation Questions

Process Questions
  1. Was a broad range of local resources successfully engaged in the project?
  2. Was ST successfully disseminated to a broad segment of MC community?
  3. What proportion of MC residents have:
    • downloaded the app?
    • become active users of the app?
    • become long-term active users of the app?
  4. For those who stopped using ST, why did they discontinue?
  5. How much are ST users using different functions in ST?
  6. How satisfied are MC residents with the ST app and its specific features?
Outcome Questions
  1. Does use of ST increase:
    • self-monitoring of healthy eating and physical activity?
    • self-efficacy for healthy eating and physical activity?
    • social support for healthy eating and physical activity?
  2. Does ST decrease perceived barriers to healthy eating and physical activity?
  3. Does use of ST increase healthy eating and physical activity among county residents?
  4. Does use of ST lead to lower BMI among those who are obese or overweight?
  5. Does use of ST help those with normal BMI prevent overweight?

These are the evaluation questions we have chosen to address in our evaluation plan.

Instrument Design

Designing the data collection instruments, whether they are surveys, interview guides, or something else, is often a major evaluation task. Stakeholder input is especially valuable during instrument design.

Sampling

Evaluators also need to define their sampling strategies and sample size requirements. For our primary outcome evaluation design, we are lucky to have access to county-wide household surveys that are designed to reach representative samples of county residents.

Major types of sampling strategies include:

Sampling Strategy: Simple random sampling
Definition: Every unit has an equal probability of being selected for the sample.
Sampling Strategy: Stratified random sampling
Definition: Random sample that ensures representative samples of subgroups (strata) within the overall sample. For example, an ethnic minority group may be oversampled to obtain a representative subsample.
Sampling Strategy: Cluster sampling
Definition: Clusters of units are randomly selected. Clusters may be, for example, schools or census tracts. For example, if the county has 20 elementary schools, 5 of the 20 would be randomly selected to represent all students in the county.
Sampling Strategy: Census sampling
Definition: All units within a population are included.
Sampling Strategy: Convenience sampling
Definition: A form of non-probability sampling in which units are selected because they are readily available rather than at random. Results may not generalize to the full population.
Sampling Strategy: Snowball sampling
Definition: Participants provide referrals to other potential participants. This is useful, for example, when recruiting hard-to-reach populations or experts on a topic.
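To make the probability-based strategies above concrete, here is a minimal sketch in Python. The simulated sampling frame (500 resident records with invented neighborhood and ethnicity labels) and all sample sizes are hypothetical, chosen only to illustrate how each strategy selects units:

```python
import random

# Hypothetical sampling frame: 500 simulated county residents, each
# tagged with a neighborhood (a potential cluster) and an ethnic group
# (a potential stratum). Group "B" is the minority group in this example.
random.seed(42)
residents = [
    {"id": i,
     "neighborhood": random.choice(["North", "South", "East", "West"]),
     "ethnicity": random.choice(["A", "A", "A", "B"])}
    for i in range(500)
]

# Simple random sampling: every resident has an equal chance of selection.
simple_sample = random.sample(residents, 50)

# Stratified random sampling: draw a fixed number from each ethnic group,
# oversampling minority group B to obtain a usable subsample.
by_stratum = {}
for r in residents:
    by_stratum.setdefault(r["ethnicity"], []).append(r)
stratified_sample = (random.sample(by_stratum["A"], 30)
                     + random.sample(by_stratum["B"], 20))

# Cluster sampling: randomly select 2 of the 4 neighborhoods and keep
# every resident who lives in a selected neighborhood.
chosen = random.sample(["North", "South", "East", "West"], 2)
cluster_sample = [r for r in residents if r["neighborhood"] in chosen]

print(len(simple_sample), len(stratified_sample), len(cluster_sample))
```

Note that in the stratified sample, group B makes up 40% of the sample even though it is roughly 25% of the population; survey weights would be needed to correct for this oversampling in any population-level estimates.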

A power analysis can be conducted to calculate a required sample size for an evaluation. Many tools are available for sample size calculations, although they require some understanding of the principles of statistical inference. This is an area where many evaluators reach out to biostatisticians for help.
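As an illustration of the kind of calculation a power analysis involves, here is a sketch using the standard normal-approximation formula for comparing two independent proportions. The function name and the 40%-to-50% example targets are assumptions for illustration, not figures from the ST evaluation:

```python
from math import ceil, sqrt
from statistics import NormalDist

def two_proportion_sample_size(p1, p2, alpha=0.05, power=0.80):
    """Approximate sample size per group needed to detect a difference
    between two proportions (two-sided test, normal approximation)."""
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)  # critical value for alpha
    z_beta = NormalDist().inv_cdf(power)           # value for desired power
    p_bar = (p1 + p2) / 2                          # pooled proportion
    numerator = (z_alpha * sqrt(2 * p_bar * (1 - p_bar))
                 + z_beta * sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
    return ceil(numerator / (p1 - p2) ** 2)

# Illustrative scenario: detect an increase in the share of residents
# meeting a healthy-behavior target from 40% to 50%, with 80% power
# at a two-sided significance level of 0.05.
n = two_proportion_sample_size(0.40, 0.50)
print(n)
```

Note how quickly required sample sizes grow as the detectable difference shrinks; halving the expected effect roughly quadruples the required n, which is one reason biostatistical input early in planning is so valuable.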

A final note about the Evaluation Design section: It is also important to have clear written protocols for data collection activities. These may need to be separate documents outside of the Evaluation Plan. They will list all steps that need to be carried out with each data collection activity, including instructions for recruitment and informed consent.

Now, let’s review what we have just discussed about indicators, data collection methods, and sampling.

Question 1:

Which of the following is true?


The correct answer is D:
Program evaluation is applied social science research, and evaluators should consider the full range of social science research methods when designing evaluations. There are many potential data collection methods that could be used to answer any given evaluation question. Design of data collection instruments is typically a challenging and time-consuming task. Among sampling strategies, the most rigorous is simple random sampling, which means that every unit has an equal probability of being selected.