
Targeted Surveillance and Biometric Study for Enhanced Evaluation of

Community Transformation Grants






New







Supporting Statement


Part B—Collection of Information Employing Statistical Methods








Original Submission: July 23, 2012

Revised August 6, 2013




Robin Soler, Ph.D.

Contracting Officer Representative (COR)

Division of Community Health

Centers for Disease Control and Prevention (CDC)

4770 Buford Hwy, N.E. MS K-45

Atlanta GA 30341

Telephone: (770) 488-5103

E-mail: [email protected]



TABLE OF CONTENTS

PART B. COLLECTION OF INFORMATION EMPLOYING STATISTICAL METHODS



LIST OF ATTACHMENTS


Attachment Number



Attachment 1A Authorizing Legislation: Public Health Service Act

Attachment 1B Authorizing Legislation: ACA Section 4201

Attachment 2 60-day Federal Register Notice

Attachment 3A IRB Approval Letter, Standard Protocol

Attachment 3B IRB Approval Letter, Enhanced Protocol

Attachment 4A Strategic Directions and Examples of CDC-Recommended Evidence- and Practice-Based Strategies

Attachment 4B CTG Evaluation Plan

Attachment 4C List of CTG Awardees Included in the Targeted Surveillance and Biometric Study

Attachment 5 Other Data Sources Consulted

Attachment 6A Standard Protocol: Consent to Participate in Research (Paper)

Attachment 6A-S Standard Protocol: Consent to Participate in Research (Paper) – Spanish

Attachment 6B Standard Protocol: Consent to Participate in Research (Phone)

Attachment 6B-S Standard Protocol: Consent to Participate in Research (Phone) – Spanish

Attachment 6C Enhanced Protocol: Youth Assent Forms

Attachment 6C-S Enhanced Protocol: Youth Assent Forms – Spanish

Attachment 6D Enhanced Protocol: Consent to Participate in Research (Adults Only)

Attachment 6D-S Enhanced Protocol: Consent to Participate in Research (Adults Only) – Spanish

Attachment 6E Enhanced Protocol: Parental Permission to Participate in Research (Children Ages 3-17)

Attachment 6E-S Enhanced Protocol Parental Permission to Participate in Research (Children Ages 3-17) – Spanish

Attachment 7A Adult Targeted Surveillance Survey – Paper Booklet

Attachment 7A-S Adult Targeted Surveillance Survey – Paper Booklet – Spanish

Attachment 7B Adult Targeted Surveillance Survey – Paper Booklet FAQ Guide

Attachment 7B-S Adult Targeted Surveillance Survey – Paper Booklet FAQ Guide – Spanish

Attachment 7C Adult Targeted Surveillance Survey – Telephone

Attachment 7C-S Adult Targeted Surveillance Survey – Telephone – Spanish

Attachment 7D Adult Targeted Surveillance Survey – Documentation of Question Provenance

Attachment 8A ATSS Gift Form

Attachment 8A-S ATSS Gift Form – Spanish

Attachment 8B Letter Sent with Gift for Completing ATSS

Attachment 8B-S Letter Sent with Gift for Completing ATSS – Spanish

Attachment 9A Caregiver Survey

Attachment 9A-S Caregiver Survey – Spanish

Attachment 9B Youth Survey

Attachment 9B-S Youth Survey – Spanish

Attachment 9C Youth and Caregiver Survey – Documentation of Question Provenance

Attachment 10A Adult Biometric Measures Recruitment Screener (ATSS CATI Completes)

Attachment 10A-S Adult Biometric Measures Recruitment Screener (ATSS CATI Completes) – Spanish

Attachment 10B Invitation to Participate in Enhanced Protocol, Included with the Mailed ATSS

Attachment 10B-S Invitation to Participate in Enhanced Protocol, Included with the Mailed ATSS – Spanish

Attachment 10C Adult Biometric Measures Recruitment Screener (Paper Invitation Call-Ins)

Attachment 10C-S Adult Biometric Measures Recruitment Screener (Paper Invitation Call-Ins) – Spanish

Attachment 10D Enhanced Protocol: Paper Telephone Information Sheet

Attachment 10D-S Enhanced Protocol: Paper Telephone Information Sheet – Spanish

Attachment 11A Lead Letter Sent to Standard Protocol Sample in Advance of Telephone Contact

Attachment 11A-S Lead Letter Sent to Standard Protocol Sample in Advance of Telephone Contact – Spanish

Attachment 11A1 Lead Letter Sent to Enhanced Protocol Oversample

Attachment 11A1-S Lead Letter Sent to Enhanced Protocol Oversample – Spanish

Attachment 11A2 Letter Sent with First Mailing of Paper Questionnaire

Attachment 11A2-S Letter Sent with First Mailing of Paper Questionnaire – Spanish

Attachment 11A3 Letter Sent to Households Attempted by Telephone and Then Sent a Paper Questionnaire

Attachment 11A3-S Letter Sent to Households Attempted by Telephone and then Sent a Paper Questionnaire – Spanish

Attachment 11A4 Letter Sent with Second Mailing of Paper Questionnaire

Attachment 11A4-S Letter Sent with Second Mailing of Paper Questionnaire – Spanish

Attachment 11B ATSS Reminder Postcard

Attachment 11C Enhanced Protocol: Field Interviewer Script for Parent/Guardian of Youth Ages 12–17

Attachment 11C-S Enhanced Protocol: Field Interviewer Script for Parent/Guardian of Youth Ages 12–17 – Spanish

Attachment 11D Enhanced Protocol: Field Interviewer Script for Youth Ages 12-17

Attachment 11D-S Enhanced Protocol: Field Interviewer Script for Youth Ages 12-17 – Spanish

Attachment 11E Enhanced Protocol: Field Interviewer Script for Caregivers of Children Ages 3-11

Attachment 11E-S Enhanced Protocol: Field Interviewer Script for Caregivers of Children Ages 3-11 – Spanish

Attachment 11F Enhanced Protocol: Field Interviewer Script for Adult Participants

Attachment 11F-S Enhanced Protocol: Field Interviewer Script for Adult Participants - Spanish

Attachment 12A Adult Biometric Measures

Attachment 12A-S Adult Biometric Measures – Spanish

Attachment 12B Youth Biometric Measures (Ages 3–17)

Attachment 12B-S Youth Biometric Measures (Ages 3–17) – Spanish

Attachment 12C Adult Biometric Measures – Documentation of Question Provenance

Attachment 13A Accelerometry Instructions for Participants

Attachment 13A-S Accelerometry Instructions for Participants – Spanish

Attachment 13B Adult Activity Diary

Attachment 13B-S Adult Activity Diary – Spanish

Attachment 13C Youth Activity Diary

Attachment 13C-S Youth Activity Diary – Spanish

Attachment 13D Accelerometry Reminder Scripts

Attachment 13D-S Accelerometry Reminder Scripts – Spanish


B. Collection of Information Employing Statistical Methods

B.1 Respondent Universe and Sampling Methods

Respondent Universe

The respondent universe for the Targeted Surveillance and Biometric Study started with the 61 Community Transformation Grants (CTG) Program awardees that were initially funded by the Centers for Disease Control and Prevention (CDC) in 2011 to implement the program. From these 61 awardees, the geographic areas surrounding 20 CTG awardees were purposively selected to provide the respondent universe for this study, using the following criteria:

  • Only implementation (and not capacity-building) awardee areas were considered for inclusion.a

  • Some awardee areas were then excluded because they were too geographically or demographically dissimilar to other areas of the country (e.g., tribes, territories).

  • Awardee areas were grouped by geographic area to ensure adequate representation by U.S. region, type of awardee (e.g., states, state minus large counties, counties), predominant racial and ethnic subgroups, and rural vs. urban location.

  • Awardees needed to be planning interventions that would be implemented jurisdiction-wide, expecting to affect a large proportion of the local population.

Once the 20 awardees were identified and approved by CDC, the contractor began working with the selected awardees to determine the areas within their designated geographies where they planned to implement the interventions with the greatest potential for impact (e.g., jurisdiction-wide reach, a high number of residents reached) on the key outcomes (e.g., proper nutrition). Of these 20, 12 awardees will participate exclusively in the Standard Protocol, in which adults complete the Adult Targeted Surveillance Survey (ATSS) (Track A-1 or A-2 in Exhibit A.1.1). The remaining eight awardees have been selected to participate in both the Standard Protocol and an Enhanced Protocol, which involves additional in-home data collection of adult and youth biometric measures and a Youth Survey completed by a selected child aged 12–17 years or a Caregiver Survey completed by a parent or an identified caregiver on behalf of a selected child aged 3–11 years (Track B or C of Exhibit A.1.1). A list of the 20 CTG awardees selected for the Targeted Surveillance and Biometric Study is provided in Attachment 4C.

The contractor will use an address-based sampling (ABS) approach to select a stratified simple random sample of households in the areas targeted for interventions by the 20 selected CTG awardees. The source of the ABS frame is the Computerized Delivery Sequence (CDS) file, a list of addresses maintained by the United States Postal Service (USPS). The CDS file contains more than 97% of all delivery addresses, including post office boxes and rural-route addresses. Although the CDS file also contains business addresses, only the residential portion of the file will be used for sampling. Geographic information systems technology will be used to match each household address to a census block, which will facilitate construction of a sampling frame linked to the intervention geographies of each awardee.

The geographic areas surveyed for each awardee will be restricted to the regions where the awardee is planning intensive intervention activities. We restricted the eligible population to intensive intervention areas because these are the areas where we expect the greatest program impact and, therefore, the largest potential for change in health outcomes within each awardee area. To link the targeted areas to the sample frame, the areas are defined by census geographies: counties, census tracts, block groups, or blocks. In the two largest awardee areas, the targeted areas are further restricted to sampled geographies totaling approximately one million residents in order to minimize design effects. As an example, Exhibit B.1.1 displays the targeted areas for Maryland, an awardee designated as "state minus large counties" (i.e., all counties except Baltimore), whose targeted areas are defined in terms of counties. Exhibit B.1.2 displays the targeted areas for Denver County, an awardee funded at the county level, whose targeted areas are defined by census blocks.

Exhibit B.1.1. Map of Survey Areas for the State of Maryland


Exhibit B.1.2. Map of Survey Areas for Denver County



Residents from rural areas and African American and Hispanic individuals will be oversampled to allow monitoring of the CTG Program intervention effects on reducing health disparities in populations that historically have exhibited a greater burden of chronic diseases. The oversampling strategy will ensure that sample sizes for the rural, African American, and Hispanic subpopulations will have adequate power to detect changes in means and prevalences for the data collected in the Standard Protocol across the data collection periods. Additionally, in the eight awardee areas selected to receive the Enhanced Protocol, households with available telephone numbers and children 3–17 years of age will be oversampled to facilitate recruitment into the Enhanced Protocol and to achieve the sample-size goals for children.

The following are the target sample-size goals (number of completions) for the Standard Protocol during each data collection period:

  • 20,000 total respondents (i.e., 1,000 respondents in each of 20 geographical areas)

  • 4,000 total rural respondents

  • 4,000 total African American respondents

  • 4,000 total Hispanic respondents

The following are the target sample-size goals (number of completions) for each of the eight awardee areas selected for the Enhanced Protocol during each data collection year:

  • 500 adult respondents

  • 300 child respondents (3–17 years old)

To achieve these goals, we will initially include greater numbers in the sample to allow for noncontacts and refusals.

For each data collection period (Exhibit B.1.3), the target sample sizes are 20,000 Standard Protocol respondents plus an additional 3,695 adults in the Enhanced Protocol areas who complete the Standard Protocol (i.e., the ATSS), and 4,000 adult and 2,400 child respondents who complete the in-home visit.

Exhibit B.1.3. Target Sample Size for Each Data Collection Period

Type of Awardee | # of Awardees | Standard Protocol | Enhanced Protocol
Awardees receiving Standard Protocol only | 12 | 12,000 (1,000 per awardee) | 0
Awardees receiving both Standard and Enhanced Protocols | 8 | 11,695 (1,461 per awardee) | 4,000 adults (500 per awardee); 2,400 children (300 per awardee)
Total across all awardees | 20 | 23,695 | 4,000 adults; 2,400 children


Sample Selection

To achieve the target sample size, we will subdivide the sampling frame into strata consisting of all combinations of the characteristics listed below. The sample selection for the Standard and Enhanced Protocols begins with the same frame:

  • Awardee

  • Rural/urban designation

  • African American density (high, medium, and low)

  • Hispanic surname (yes/no)—from a list of the 650 most common Hispanic surnames

  • Presence of child (yes/no)

  • Telephone number match (yes/no)

Standard Protocol

For selection of the sample to be recruited for the Standard Protocol, we start by stratifying the frame on the above-mentioned characteristics. We stratify by rural/urban, African American density, and Hispanic surname to control the sample size of the subpopulations. The proportion of sample that responds in each stratum will be monitored throughout the course of data collection, and sampling probabilities for each stratum will be adjusted if necessary to ensure that we achieve our sample-size targets.

For each of the 20 awardee areas, we will geocode the frame addresses and assign them to census blocks, block groups, tracts, and counties. To identify an address as rural or urban, we will apply the National Center for Health Statistics (NCHS) Urban-Rural Classification Scheme for Counties.b

We assign addresses to high, medium, or low African American density using 2010 census data for each address's block group. An address is defined as high-density African American if it is in a block group with 75% or more African American residents; medium-density if it is in a block group with 50% to less than 75% African American residents; and low-density if it is in a block group with less than 50% African American residents.

Addresses in the frame are flagged for Hispanic surname (yes/no), presence of a child in the household (yes/no), and telephone number match (yes/no) by merging the Acxiom InfoBase consumer databasec with the sample frame, matching on address. We anticipate that 75% or more of the addresses will have a corresponding record in the Acxiom InfoBase consumer database; addresses without a match will be assigned the "no child," "no Hispanic surname," and "no telephone number" flags. Preliminary work with the Acxiom database suggests that approximately 10% of households in the United States will be assigned a Hispanic surname flag and 20% of households will be flagged as having a child resident. We also estimate that approximately 50% of each sample will have a telephone flag, so that those households can be among those selected initially for a telephone interview.

The stratification divides households on the frame into mutually exclusive and exhaustive strata. Every stratum will be sampled, albeit at different sampling fractions. Consequently, every frame member has a chance of being selected. For example, a household that has one or more Hispanic individuals but that was not flagged as having a Hispanic surname still has a probability of selection. The household- and individual-level sampling weights are based in large part on this probability of selection. By taking the sampling weights into account when analyzing the data, we ensure that statistics computed from the sample data accurately reflect (or are unbiased estimates of) the entire target population from which we have sampled.
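As a concrete illustration of how base weights follow from the stratum selection probabilities and enter a design-weighted estimate, the minimal sketch below uses hypothetical strata, counts, and outcome values (not the study's actual strata or data):

```python
import pandas as pd

# Illustrative sketch: base weights are the inverse of each stratum's selection
# probability, and design-weighted estimates use them. Strata and counts are hypothetical.
strata = pd.DataFrame({
    "stratum": ["child_flag", "no_child_flag"],
    "frame_households": [20_000, 80_000],
    "sampled_households": [2_000, 2_000],
})
# Selection probability = sampled / frame; base weight is its inverse.
strata["base_weight"] = strata["frame_households"] / strata["sampled_households"]

# Toy respondent file: one row per completed interview, with a binary outcome.
respondents = pd.DataFrame({
    "stratum": ["child_flag", "child_flag", "no_child_flag", "no_child_flag"],
    "exercised_past_24h": [1, 0, 1, 1],
})
respondents = respondents.merge(strata[["stratum", "base_weight"]], on="stratum")

# Design-weighted prevalence: sum(w_i * y_i) / sum(w_i).
weighted_prevalence = ((respondents["base_weight"] * respondents["exercised_past_24h"]).sum()
                       / respondents["base_weight"].sum())
print(round(weighted_prevalence, 3))
```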

When we select the sample, we will use a nonlinear optimization procedure (SAS OPTMODEL) in which an objective function (unequal weighting effect) is minimized subject to various constraints (sample-size requirements). That is, we are minimizing the increase in variance due to the unequal selection probabilities subject to the sample-size requirements described previously. At the onset of the study, we assume that the different strata have the same yield rate (ratio of completed interviews to sample fielded). However, we know from experience that there will be a range in yield rates. In each wave, we will monitor the yield rate and adjust the allocation of subsequent waves based on these data.
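A simplified sketch of this allocation idea appears below. The study's actual allocation uses the SAS OPTMODEL procedure noted above; the Python version here is for illustration only, and the stratum frame counts, subgroup constraint, and total sample size are hypothetical. It minimizes the unequal weighting effect, computed for stratified sampling with weights N_h/n_h, subject to sample-size constraints:

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical inputs: frame households per stratum, total completes needed in an
# awardee area, and a minimum number of completes required from the third stratum.
N = np.array([50_000, 30_000, 20_000], dtype=float)
n_total = 1_000
min_stratum_3 = 250

def unequal_weighting_effect(n):
    """UWE = n_total * sum(N_h^2 / n_h) / N_total^2 for stratified sampling with weights N_h/n_h."""
    return n.sum() * np.sum(N**2 / n) / N.sum()**2

constraints = [
    {"type": "eq", "fun": lambda n: n.sum() - n_total},       # fixed total sample size
    {"type": "ineq", "fun": lambda n: n[2] - min_stratum_3},  # subgroup sample-size target
]
result = minimize(unequal_weighting_effect,
                  x0=np.full(3, n_total / 3),
                  bounds=[(10, None)] * 3,
                  constraints=constraints,
                  method="SLSQP")
allocation = np.round(result.x)  # stratum sample sizes that minimize the UWE
print(allocation)
```

Under these hypothetical inputs, the binding subgroup constraint keeps the third stratum at its minimum, and the remaining sample is allocated roughly proportionally to stratum size.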

Enhanced Protocol

The Enhanced Protocol includes an oversample from the ABS sample frame (see Exhibit A.1.1) to facilitate obtaining the required sample size of households with children. Children are critical to include in this study because data on outcomes specific to their health behaviors are limited, particularly when trying to assess objective changes in body mass index (BMI) and other core outcomes specified in the ACA legislation. The additional sample will primarily come from the stratum consisting of addresses identified as containing a child in the household (i.e., a "child flag"). Oversampling households with children in the eight awardee areas will accommodate the sample-size goal of completing a child interview in 60% of the households that participate in the Enhanced Protocol.

In addition to the standard components of the Enhanced Protocol, households in four of the eight awardee areas will be asked to participate in collection of physical activity data through an accelerometer worn by the respondent for a period of 7 days (Attachments 13B, 13C, 13B-S, and 13C-S). A total of 500 adult/child pairs (125 dyads per awardee area) will be recruited to wear an accelerometer, record their activities in an Activity Diary, and return the materials and device by mail after the data collection period.

Recruitment Strategies

Once the sample has been selected, we will recruit households by both mail and telephone. Households for initial telephone interviews will be randomly selected from sampled households with a "phone append" (i.e., a telephone number associated with the address). Throughout the recruitment process, we will monitor response rates so that recruitment strategies can be adjusted to maximize response. The recruitment strategies for the Standard and Enhanced Protocols are detailed below. Regardless of mode, once a household has been contacted, the adult with the upcoming birth date closest to the interview date will be selected as the respondent (a minimal sketch of this selection rule follows).
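For illustration, the "next birthday" within-household selection rule could be implemented as in the sketch below; the household roster is hypothetical, and leap-day birthdays are ignored for brevity:

```python
from datetime import date

# Minimal sketch of the "next birthday" selection rule described above.
def days_until_next_birthday(month: int, day: int, today: date) -> int:
    """Days from `today` until the next occurrence of the given birthday."""
    bday = date(today.year, month, day)
    if bday < today:
        bday = date(today.year + 1, month, day)
    return (bday - today).days

adults = [{"person": "Adult 1", "month": 3, "day": 14},
          {"person": "Adult 2", "month": 11, "day": 2}]
interview_date = date(2013, 10, 1)

selected = min(adults, key=lambda a: days_until_next_birthday(a["month"], a["day"], interview_date))
print(selected["person"])  # Adult 2: the upcoming birthday closest to the interview date
```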

Standard Protocol

Exhibit B.1.4 depicts the strategy for recruiting households into the Standard Protocol. We will send an initial mailing packet to all sampled households, asking them to participate. Some households will receive a paper questionnaire packet and others will receive a telephone lead letter packet. (Please see section B.2 for details about the contents of these packets.) Paper questionnaire packet recipients will not receive a phone call and are labeled in the figure as “mail contact exclusively.” Households with an associated telephone number will receive a telephone lead letter packet and are labeled in the figure as “phone contact attempted.” Exhibit B.1.4 shows initial targets for each of these types of mailings, split out by awardee type. The proportions of mail and telephone contacts may vary over time as we adjust for yield rates.

Exhibit B.1.4. Recruitment (per awardee) into the Standard Protocol

Enhanced Protocol

Exhibit B.1.5 depicts the strategy for recruitment into the Enhanced Protocol. Because recruitment of households for in-home visits is likely to be challenging, we have designed two tracks through which to recruit a sufficient number of households into the Enhanced Protocol. These two tracks are (as described in Section A.1):

1. Invitation to households in the eight CTG awardee areas selected for the Enhanced Protocol to participate in this protocol (Track B of Exhibit A.1.1 and B.1.5), which involves an in-home visit by a trained field interviewer

2. Oversample of households with child flags (from our sampling frame) in the eight CTG awardee areas and obtain their consent by telephone to participate in both the Standard Protocol and Enhanced Protocol (Track C of Exhibit A.1.1 and B.1.5)

Invitations to participate in the Enhanced Protocol will be given to adults in the eight CTG awardee areas once they complete the ATSS (Exhibit A.1.1). This will involve simply asking the adult whether he or she would be willing to participate in an in-home visit so that the contractor can collect biometric measures. If a child lives in the home, the adult will be asked to consent for the child to participate in the in-home visit, or the interviewer will ask a parent or caregiver to provide consent over the telephone so that a visit can be scheduled. Regardless of the household's recruitment method, if more than one child in a household is eligible to participate, the child with the upcoming birth date closest to the interview date will be selected. Using this process, however, we estimate that fewer than 15% of Track B households (or approximately 145 adults and 85 children per awardee area) will consent to an in-home visit. To increase the number of Enhanced Protocol respondents, we will field an oversample (Track C) from strata containing households with children aged 3–17 ("child flag"), yielding an initial sample of approximately 9,000 households. This oversample will enable us to meet our Enhanced Protocol sample-size targets of 500 adults and 300 children per awardee area (see Exhibit B.1.5).
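A back-of-the-envelope check of these figures, using only the per-awardee targets and estimates stated above, shows the contribution expected from each track:

```latex
\text{Track C adults needed per awardee area} \approx 500 - 145 = 355, \qquad
\text{Track C children needed per awardee area} \approx 300 - 85 = 215 .
```

That is, roughly 145/500 ≈ 30% of Enhanced Protocol adults would come from Track B and about 70% from Track C, consistent with the split cited in Section B.2.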

Exhibit B.1.5. Recruitment (per awardee) into the Enhanced Protocol



Statistical Power

Our sample-size goals were developed to give us adequate statistical power to test for changes in key outcomes over the study period. This power analysis will quantify the effect size we will be able to detect with 80% power for various domains. For the Standard Protocol, we powered the study for a change in a binomial proportion because most of the key outcomes are categorical. For the Enhanced Protocol, we powered the study for a change in a continuous outcome because most of the biometric measurements in the Enhanced Protocol are continuous.

Standard Protocol

In keeping with the Specific Aims of the CTG Targeted Surveillance and Biometric Study (Section A.1), we performed power calculations to quantify the effect sizes we can detect with 80% power for three sample domains: (1) all 20 awardees combined, (2) each health disparities population, and (3) a single awardee. The effect sizes are expressed as an absolute change in the prevalence of a health outcome. For example, if the smoking rate is 20%, then a reduction to 15% would be a 5% absolute change; a reduction to 19% would be a 5% relative change. Exhibit B.1.6 displays the target sample sizes. The nominal sample size refers to the number of respondents. The effective sample size is the nominal sample size divided by the unequal weighting effect, which we estimate to be 2. The effective sample size is the size of a simple random sample that would produce estimates with the same variance as our design achieves with the nominal sample size. The difference between the effective and nominal sample sizes is primarily a consequence of the unequal probabilities of selection and differential response rates. To account for the sample design, we use the effective sample size (with an estimated unequal weighting effect of 2) when computing power, but we report power in terms of the nominal sample size.
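In symbols, the relationship described above is:

```latex
n_{\mathrm{eff}} = \frac{n_{\mathrm{nominal}}}{\mathrm{UWE}}, \qquad \mathrm{UWE} \approx 2
\;\Longrightarrow\;
n_{\mathrm{eff}} = \frac{20{,}000}{2} = 10{,}000 \;\text{(all awardees)}, \quad
\frac{4{,}000}{2} = 2{,}000 \;\text{(each subpopulation)}, \quad
\frac{1{,}000}{2} = 500 \;\text{(one awardee)}.
```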

Exhibit B.1.6. Target Sample Sizes for the Standard Protocol

Domain | Nominal Sample Size | Effective Sample Size
All 20 awardees | 20,000 | 10,000
Black/African American; Hispanic; Rural | 4,000 of each | 2,000 of each
One awardee | 1,000 | 500


The rationale for the proposed sample-size targets is based on power calculations for the repeated cross-sectional design with pre- and post-intervention measures. We used one-tailed tests in our calculations because primary evaluation questions relate to whether there have been improvements in health behaviors and health outcomes. The traditional convention of using a two-tailed test to protect against an unexpected effect in the opposite direction was considered, but the increase in the sample size needed to be able to detect an intervention effect in the unexpected direction was not justifiable given evidence that the interventions being implemented by CTG awardees will lead to positive effects. Thus, we assume that each of the key outcomes can be described as having a direction considered to be an improvement. As an example, a reduction in the prevalence of adult smoking would be an “improvement.” For simplicity, we only present findings for the case where a positive change corresponds to improvement. The following sections provide details on the power calculations for each of the three domains listed previously for the Standard Protocol.

We begin by considering a test of whether the prevalence of a binary outcome has changed between the first and last data collection period. Power for this test was based on comparing two independent proportions.d Exhibit B.1.7 presents the minimum improvement in the prevalence of a binary outcome (vertical axis) between two time intervals detectable with 80% power. The detectable change in proportion at 80% power is a function of the prevalence of the first measurement, which is designated on the horizontal axis. The three curves correspond to three target sample sizes:

  • The red line corresponds to all study respondents with a nominal target sample size of 20,000.

  • The blue line corresponds to the racial/ethnic and rural subpopulations, each with a nominal target sample size of 4,000.

  • The black line corresponds to a particular awardee with a nominal target sample size of 1,000.

In all power calculations we assume a Type I error rate of 0.05 (alpha = 0.05) and an unequal weighting effect of 2.0.

The following is an example of how to read the graph in Exhibit B.1.7 to obtain the detectable effect size at 80% power for a given baseline percentage. Suppose that at the first time point, 33% of the rural population exercised in the last 24 hours, as indicated by the vertical gray line at 33% on the horizontal axis. We would have 80% power to detect a 3.8% absolute increase in the percentage of individuals in the rural domain who exercised in the last 24 hours, as indicated by the horizontal gray line at 3.8% (i.e., a change in exercise prevalence from 33.0% to 36.8%).
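This figure-reading example can be checked with a standard normal-approximation power calculation. The sketch below is illustrative only (it is not the software used for the official calculations) and assumes the design parameters stated above:

```python
from scipy.stats import norm

# Normal-approximation check of the rural-domain example above: one-sided alpha = 0.05,
# unequal weighting effect = 2, nominal n = 4,000 per data collection period.
alpha, uwe = 0.05, 2.0
n_eff = 4_000 / uwe                   # effective sample size per period
p1, p2 = 0.33, 0.368                  # baseline prevalence and a 3.8-point increase

z_alpha = norm.ppf(1 - alpha)
p_bar = (p1 + p2) / 2
se_null = (2 * p_bar * (1 - p_bar) / n_eff) ** 0.5          # SE of the difference under H0
se_alt = ((p1 * (1 - p1) + p2 * (1 - p2)) / n_eff) ** 0.5   # SE of the difference under H1
power = norm.cdf(((p2 - p1) - z_alpha * se_null) / se_alt)
print(round(power, 2))  # ~0.81, consistent with the ~3.8-point detectable change read from the graph
```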

The red line in the figure shows the power for analyses of the domain of all awardees combined. We have 80% power to detect a change in proportions between two time intervals of less than 1.8% in the most conservative case, where the initial proportion is 50%. Even smaller differences could be detected for other baseline proportions.

The blue line in the figure shows the power for the health disparity subpopulations of size 4,000: African American, Hispanic, and rural. We have 80% power to detect a change in proportions between two time intervals of less than 4.0% in the most conservative case, where the initial proportion is 50%. Even smaller differences could be detected for other baseline proportions.

The black line in the figure shows the power for analyses of the domain of an individual awardee. We have 80% power to detect a change in proportions between two time intervals of less than 7.8% in the most conservative case, where the initial proportion is 50%. Even smaller differences could be detected for other baseline proportions.

Exhibit B.1.7. Population Percentage Difference Detectable with 80% Power as a Function of the Percentage at First Measurement Occasion and the Sample Size

Enhanced Protocol

For the Enhanced Protocol, we powered the study for a change in a continuous outcome because most of the biometric measurements in the Enhanced Protocol are continuous (e.g., BMI, waist circumference, or cotinine level). In this repeated cross-sectional design, changes in the sample means, such as mean BMI, will be computed for each measurement occasion. Power measures the probability of detecting differences in means between measurements at two time points. Detecting differences in means is a function of both the difference between the sample means and the variance of the samples, which we assume to be equal. Cohen defines effect size, measured in standard deviation units, as the ratio of the absolute difference between two means to the standard deviation of the samples.35 Cohen defines effect sizes of 0.2 as small, 0.5 as medium, and 0.8 as large.

For this power analysis, we used a two-sample t-test assuming equal variances. We used a one-sided test for the reasons described previously. We assumed a Type I error rate of 0.05 (alpha = 0.05) and an unequal weighting effect of 2. Exhibit B.1.8 presents sample sizes for four domains and the effect size detectable at 80% power expressed in standard deviation units. We have a target nominal sample size of 4,000 adults across the eight awardees. This provides an effective sample size of 2,000. This gives us the ability to detect an effect size of 0.08 standard deviation units, which is small by Cohen’s definition. Hence, our study is well powered to detect changes in means for adults in all eight awardees. For the domain of children across the eight awardees we have an effect size of 0.1, which is also small. When we consider adults and children within a single awardee, the detectable effect sizes are 0.22 and 0.29 standard deviation units, respectively. This is modestly larger than Cohen’s cutoff of 0.2 for small effect sizes. Consequently, we consider the study adequately powered, even for domains comprising an individual awardee, to observe awardee-level impacts on childhood obesity- and tobacco-related outcomes.

Exhibit B.1.8. Target Sample Sizes for the Enhanced Protocol and Effect Size Detectable at 80% Power

Domain | Nominal Sample Size | Effective Sample Size | Effect Size (Standard Deviation Units)
Adults in all eight awardees | 4,000 | 2,000 | 0.08
Children in all eight awardees | 2,400 | 1,200 | 0.10
Adults in one awardee | 500 | 250 | 0.22
Children in one awardee | 300 | 150 | 0.29


Exhibit B.1.9 presents a graph of the effect size expressed in standard deviation units (vertical axis) detectable with 80% power as a function of the sample size (horizontal axis) assuming a Type I error rate of 0.05 and an unequal weighting effect of 2. This graph is useful for the reader to determine the detectable effect size at 80% power for domains with sample sizes other than the four described in Exhibit B.1.8. The gray lines highlight the detectable effect size expressed in standard deviation units for the four domains in the table (nominal sample sizes: 300, 500, 2,400, and 4,000).

Although the effect size is expressed in standard deviation units, it may be translated into relevant units of biometric measurements. For example, in a previous study, the difference in waist circumference between adolescents with high metabolic syndrome scores and those with low scores was 2.4 cm, with a standard deviation of 8.4 cm.36 Our study has enough power to detect differences in this range (0.29 × 8.4 = 2.44 cm).

For salivary cotinine, a change of 0.29 standard deviation units is approximately 0.94 ng/mL,37 a difference that is smaller than differences between homes with total smoking bans and homes with no smoking restrictions.
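The detectable effect sizes in Exhibit B.1.8 can be reproduced with standard power routines. The sketch below is illustrative only (not the official calculation) and assumes the two-sample t-test, one-sided alpha of 0.05, 80% power, and unequal weighting effect of 2 described above:

```python
from statsmodels.stats.power import TTestIndPower

# Sketch reproducing the detectable effect sizes in Exhibit B.1.8: two-sample t-test,
# one-sided alpha = 0.05, 80% power, unequal weighting effect of 2 applied to the
# nominal sample sizes, equal effective n at both measurement occasions.
analysis = TTestIndPower()
nominal_sizes = {"Adults in all eight awardees": 4_000,
                 "Children in all eight awardees": 2_400,
                 "Adults in one awardee": 500,
                 "Children in one awardee": 300}
for domain, n in nominal_sizes.items():
    n_eff = n / 2  # effective sample size at each measurement occasion
    d = analysis.solve_power(effect_size=None, nobs1=n_eff, alpha=0.05,
                             power=0.80, ratio=1.0, alternative="larger")
    print(f"{domain}: detectable effect size = {d:.2f} SD units")
# Prints approximately 0.08, 0.10, 0.22, and 0.29, matching the exhibit.
```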



Exhibit B.1.9. Population Mean Difference Detectable with 80% Power as a Function of Sample Size


B.2 Procedures for the Collection of Information

The Targeted Surveillance and Biometric Study uses multiple modes: mail or telephone initially, followed by an in-home visit for eligible households. The Standard Protocol interview, the Adult Targeted Surveillance Survey (ATSS), will be conducted by mail questionnaire and CATI (computer-assisted telephone interview). Mail is often a less expensive mode of data collection and ensures that all households have a probability of selection, including those without a telephone number match. CATI is more efficient for ATSS completions that must be coupled with an Enhanced Protocol in-home visit because (a) telephone recruitment to an in-home survey yields more participants than pure mail recruitment and (b) telephone interviewers can schedule an in-home visit immediately after participants complete the ATSS over the telephone, thus minimizing the time gap between collection of ATSS and in-home visit data. The Standard Protocol will be conducted in all 20 CTG awardee areas. An Enhanced Protocol will be conducted in respondent homes in 8 of the 20 awardee areas.

All components of the CTG Targeted Surveillance and Biometric Study will be conducted in English or Spanish. The instrument for the Standard Protocol, the ATSS, is shown in Attachments 7A in English and 7A-S in Spanish. The Enhanced Protocol instruments are also shown in English and Spanish: the Youth Survey (Attachments 9B, 9B-S), Caregiver Survey (Attachments 9A, 9A-S), and instructions for completing the Adult Biometric Measures (Attachments 12A, 12A-S) and Youth Biometric Measures (Attachments 12B and 12B-S).

Who Collects the Data: Data for the Standard Protocol will be collected through the mail from a printed questionnaire or by trained telephone interviewers under the contractor’s project supervisors. Field interviewers will collect the Enhanced Protocol data with oversight by the contractor’s data collection project manager.

Where/What: The Standard Protocol will be administered biennially to a representative sample of residents living in geographic areas targeted for interventions by the 20 CTG awardees. A target of 23,695 adults will be recruited to complete the ATSS across the Standard and Enhanced Protocols. As described in Section A.1 and illustrated in Exhibit A.1.1, respondents will be recruited into the study through one of four possible tracks: as part of the Standard Protocol through either mail (Track A-1) or telephone (Track A-2); by invitation to participate in the Enhanced Protocol after completion of the ATSS (Track B, for eligible households only); or from the start as a household invited to participate in both the Standard and Enhanced Protocols (Track C). We estimate that approximately 30% of the households completing the Enhanced Protocol will be recruited from Track B and approximately 70% from Track C. Data from all households participating in the Enhanced Protocol will include information about an adult participant; up to 2,400 of the 4,000 households (300 of the 500 in each awardee area) will also include data collected from an eligible child.

Frequency: Data collection will be conducted biennially (starting in 2013, upon Office of Management and Budget [OMB] approval), with both the Standard Protocol and Enhanced Protocol being applied during each data collection period (2013, 2015, 2017).

Procedures:
Sample Frame Construction

The United States Postal Service (USPS) CDS file of addresses will be used to create the ABS frame. A complete description of the sampling procedures to be used in both the Standard Protocol and Enhanced Protocol appears in Section B.1 (Sample Selection). Addresses in the sample frame will be geocoded and assigned to census blocks, block groups, tracts, and counties. Census data at the tract and block-group levels will be appended to the frame; these data include total population, percentage African American, percentage Hispanic, and other demographic variables. The NCHS county-level urban or rural designation will be appended to the frame. The frame will also be matched to the Acxiom marketing database to obtain indicators for households and household members, including Hispanic surname, date of birth and other age variables, and flags indicating the availability of landline and cell phone numbers. Although some of the flags and indicators may be incomplete, they nevertheless provide a useful way of stratifying the frame to sample certain subpopulations more efficiently. The sample will be selected from all strata and weighted appropriately to produce unbiased estimates of the targeted population totals. The selected sample will then be sent to two vendors to have actual telephone numbers appended to sample records. The sample will be divided into replicates and into recruitment modes for data collection. The proportions assigned to mail or telephone mode may vary over time as we evaluate the costs and yields of the modes.
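A schematic sketch of these frame-assembly steps appears below; the identifiers, column names, and values are hypothetical, and the geocoding itself is assumed to have been completed upstream:

```python
import pandas as pd

# Illustrative sketch of the frame-assembly steps described above (hypothetical data).
frame = pd.DataFrame({"address_id": [1, 2],
                      "county_fips": ["24005", "24005"],
                      "block_group_id": ["240054011001", "240054011002"]})
census = pd.DataFrame({"block_group_id": ["240054011001", "240054011002"],
                       "pct_black": [80.0, 20.0], "pct_hispanic": [5.0, 30.0]})
nchs = pd.DataFrame({"county_fips": ["24005"], "urban_rural_code": [2]})
consumer = pd.DataFrame({"address_id": [1],          # address 2 has no match
                         "hispanic_surname": [0], "child_flag": [1], "phone_flag": [1]})

frame = (frame.merge(census, on="block_group_id", how="left")
              .merge(nchs, on="county_fips", how="left")
              .merge(consumer, on="address_id", how="left"))

# Addresses without a consumer-database match default to "no" on each flag.
flags = ["hispanic_surname", "child_flag", "phone_flag"]
frame[flags] = frame[flags].fillna(0).astype(int)

# Density stratum from block-group composition, as defined in Section B.1.
frame["aa_density"] = pd.cut(frame["pct_black"], bins=[0, 50, 75, 100.001],
                             right=False, labels=["low", "medium", "high"])
print(frame)
```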

Modes of Contact

Some data collection procedures will vary depending on type of community and attributes of the household. Three flow charts (Exhibits B.2.1a, B.2.1b, B.2.1c) presented in this section depict the data collection procedures for each of four tracks (described in Section A.1 and illustrated in Exhibit A.1.1). These tracks include:

  1. Standard Protocol — Data collection procedure in the 20 CTG awardee areas in which we invite one adult in each sampled household to complete the ATSS (Exhibit B.2.1a).

      • Mail (Track A-1, Exhibit B.2.1a). In this track, we invite participation in ATSS by sending paper questionnaire packets. (Contents of these packets are described below.)

      • Telephone (Track A-2, Exhibit B.2.1a). In this track, we invite participation in ATSS by sending telephone lead letter packets. We then follow up by attempting to contact households by telephone.



  2. Enhanced Protocol — Data collection procedure in the eight CTG awardee areas in which we invite some adults in sampled households to complete both (a) the ATSS and (b) an in-home visit for biometric data collection.

      • Enhanced Protocol Invitation After ATSS (Track B, Exhibit B.2.1b). In this track, we invite in-home visit participation only after the selected adult has completed the ATSS.

      • Enhanced Protocol Invitation Before ATSS (Track C, Exhibit B.2.1c). This track will boost the total number of in-home biometric visits, especially those that include children aged 3–17 years. In this track, we invite participation in both the ATSS and the in-home visit and only start the ATSS if the selected adult respondent expresses interest in both. We also oversample households containing children aged 3–17 years and may screen out some households lacking children.



  1. Standard Protocol

In all 20 awardee areas, the Standard Protocol will be followed to recruit most of the adult participants into the study. Exhibit B.2.1a provides a visual depiction of the data collection flow for these Standard Protocol awardee areas. Track A-1 shows the progression of attempts to contact sample members whose first mailing includes a paper questionnaire (Attachments 11A2 and 11A2-S). Track A-2 shows the progression of attempts to contact sample members whose first mailing includes an invitation to complete the ATSS over the telephone (Attachments 11A and 11A-S). The following describes the step-by-step process for recruiting households into the Targeted Surveillance and Biometric Study through initial contact with a sampled adult in the Standard Protocol.

First Contact Attempt. All selected addresses will be mailed a tailored letter (based on the track to which they are assigned) introducing the study, inviting the household to participate, and giving the household the option of calling a toll-free number to either ask questions about the study or complete the ATSS over the telephone. In this first mailing to households, all envelopes will also contain a $2 bill as a gift for helping with the screening process.

  • Paper Questionnaire Packets—Track A-1, Exhibit B.2.1a. Some households will receive packets with the materials needed to participate on paper:

      • A letter inviting participation using the enclosed questionnaire, in addition to offering the option of calling a toll-free number to participate (Attachment 11A2)

      • A $2 bill

      • A printed questionnaire (Attachment 7A)

      • A frequently asked questions pamphlet (Attachment 7B)

      • A thank-you gift information form (Attachment 8A)

      • Postage-prepaid return envelopes

      • For households in the Enhanced Protocol awardee areas, a recruitment flyer inviting them to call in about completing the Enhanced Protocol in a home visit (Tracks A-1 and A-2; Attachment 10B)

      • For households with a Spanish surname or located in a high-density Hispanic area, Spanish versions of the materials (Attachments 11A2-S, 7A-S, 7B-S, 8A-S, 10B-S)

  • Telephone Lead Letter Packets—Track A-2, Exhibit B.2.1a. Some households will receive packets focusing on participation by telephone:

      • A letter that focuses only on details of participation by telephone (Attachment 11A)

      • A $2 bill

      • For households with a Spanish surname or located in a high-density Hispanic area, Spanish versions of the letters (Attachments 11A-S and 11A1-S)

We estimate that around 92% of the first-class mailings will be delivered successfully.

Intermediate Contact Attempts.


Mail — Track A-1


  • Postcards. All addresses that were sent a paper questionnaire packet (Track A-1 in Exhibit B.2.1a) will also be sent a reminder postcard (Attachment 11B). All postcards sent will have a line in Spanish, letting Spanish-speakers know that Spanish-speaking staff members are available to receive their calls, but no Spanish-only postcards will be sent. The postcard will reiterate the initial invitation to respond via mail or by calling a toll-free number to complete the ATSS via CATI.

  • Second Paper Questionnaire Packets. Approximately 4 weeks after an initial paper questionnaire has been sent (Track A-1), a list of nonrespondents will be generated. All deliverable addresses on this list will be sent a second questionnaire packet that is the same as the first, with the exception of slightly altered wording in the cover letters (Attachments 11A4 and 11A4-S) and lack of a $2 bill. The $2 gift is not planned for enclosure in this mailing because a $2 bill will have already been sent to these households in the initial mailing.

Telephone — Track A-2

  • Telephone Call-outs. For selected telephone-matched addresses, interviewers will attempt to contact the household by telephone, select an adult respondent, and then complete the ATSS by CATI (Attachment 7C). During telephone contacts, connection of the telephone number with the sampled address will be verified. If the telephone number listed for a case does not belong to anyone living at the sampled address, the telephone number will not be called again.

  • Paper Questionnaire Packets for Telephone Noncontacts. If a telephone number is invalid for a sampled address, or if eight call attempts fail to result in a contact, up to two paper questionnaire packets will be sent to the sampled address. These paper questionnaire packets will contain the same materials sent to households never attempted by telephone, except that the first packet will not contain $2 or make reference to it in either its English or Spanish cover letter (Attachments 11A3, 11A3-S), because a $2 bill was already sent to the household with the telephone lead letter.

Final Contact—In addition to receiving a packet in the mail at the start of the study, each participant will receive a final mailing that includes a gift for their time (Attachments 8B and 8B-S).

  2. Enhanced Protocol

Invitation After ATSS. (Track B, Exhibit B.2.1b)

In the eight CTG awardee areas selected for collection of the Enhanced Protocol, adults who complete the ATSS will be invited to participate in the Enhanced Protocol over the telephone (Attachments 10A and 10A-S) or via a pamphlet (Attachments 10B and 10B-S) inserted in the paper questionnaire packets described previously (Track B). If they agree to participate, the telephone interviewer (TI) will schedule a time for an in-home visit for the additional data collection by a field interviewer (FI). If the respondent resides with at least one child aged 3–17 years, the child’s parent or guardian will also be asked to grant permission for the child’s participation in the Enhanced Protocol (Attachments 10A and 10A-S). The scheduled appointment will be confirmed by the FI through a reminder call prior to visiting the home.

Invitation Before ATSS. (Track C, Exhibit B.2.1c)

Recognizing the difficulty of recruiting households to participate in an in-home visit, particularly households with children, the Targeted Surveillance and Biometric Study includes an oversample (drawn from the same ABS frame as the Standard Protocol) of households thought more likely to contain at least one child aged 3–17 years. This oversample (Track C) will provide additional eligible households to reach targets for participation of children. Track C will be initiated by sending either a telephone lead letter packet or a paper information packet.

  • Telephone Lead Letter Packets—Track C, Exhibit B.2.1c. The letter sent to Enhanced Protocol oversample members differs from the one sent to Standard Protocol sample members, but the packets are otherwise the same:

      • A letter that invites participation in both protocols and lets potential participants know that we will call them, or that they may call in if they are interested in participating (Attachment 11A1)

      • A $2 bill

      • For households with a Spanish surname or located in a high-density Hispanic area, Spanish versions of the letter (Attachments 11A-S and 11A1-S)

  • Paper Information Packets—Track C, Exhibit B.2.1c. Households in the Enhanced Protocol oversample that do not have a telephone match will receive a recruitment letter that enables them to express interest by providing a telephone number where they can be reached or by calling in:

      • A letter that invites participation in both protocols and offers the option of either calling in to complete the initial screening or providing a telephone number on paper so that we can call them instead (Attachment 10D)

      • A $2 bill

      • For households with a Spanish surname or located in a high-density Hispanic area, Spanish versions of the letter (Attachment 10D-S)

If an adult in the home agrees to participate in both protocols (i.e., to complete the ATSS and grant permission for an in-home visit), the ATSS will be completed by telephone, and the in-home visit will be scheduled. If a child resides in the household, he or she will be invited to participate in the in-home visit, and his or her parent or caregiver will be asked to grant permission for participation. If a child participates, the interviewer will administer the Caregiver Survey to the child's adult caregiver if the child is under age 12, or the Youth Survey to the child directly if the child is aged 12–17 years.

For collection of biometric measures from adults and children, the FI will bring all the necessary equipment to the in-home visit. The FI will begin by explaining each measure to be collected and administering a short survey (to the adult, and then to the youth if aged 12–17 or to the caregiver if the youth is aged 3–11) to obtain data relevant to the validity of the biometric measures (e.g., current hypertension medications, exposure to secondhand smoke). The FI will first collect the biometric measures from the adult (weight, height, waist circumference, blood pressure, and a saliva sample). The FI will then collect the biometric measurements (height, weight, waist circumference, and a saliva sample) from the selected child.

In four of the awardee areas, the interviewer will invite adult-child pairs to participate in the accelerometry substudy. Consenting households will be given one accelerometer for the adult and one for the child, each to be worn for one week (Attachments 13A and 13A-S). Adults will be asked to fill out a daily Activity Diary for both people (Attachments 13B and 13C or 13B-S and 13C-S). Participants will be called on the third or fourth day of wear to ensure that the device is being worn correctly and to answer any questions; the call also serves as a reminder to wear the device and establishes a pattern of frequent communication between the participant and the recruiter (Attachments 13D and 13D-S). At the end of the week, the adult will be asked to mail back the accelerometers and diaries.

Exhibit B.2.1a. Mode and Order of Contacts in the 12 Standard Protocol–only Awardee Areas


Exhibit B.2.1b. Mode and Order of Contacts in the 8 Enhanced Protocol Awardee
Areas (Track B)


Exhibit B.2.1c. Mode and Order of Contacts in the 8 Enhanced Protocol Awardee
Areas (Track C)


B.3 Methods to Maximize Response Rate and Minimize Nonresponse

For the Targeted Surveillance and Biometric Study, some nonresponse can be expected. Nonresponse may arise from noncontact, refusals, and inability to schedule the in-home examination during the data collection window (for the Enhanced Protocol). Nonresponse is a potentially serious methodological threat to the interpretation of the study findings, particularly if it occurs differentially across the years of data collection or across subpopulations (i.e., nonignorable nonresponse). To reduce the potential for nonresponse bias, several strategies will be used; these are presented subsequently.

Methods to Maximize Response

1. Minimizing Noncontacts

Standard Protocol

Every sample member contacted by telephone for a CATI interview (see left side of Exhibit B.1.4) will be sent a lead letter (Attachments 11A, 11A-S) explaining the survey and letting them know that they will be called and asked to participate. Every sample member contacted by mail for a paper interview (see right side of Exhibit B.1.4) will receive a $2 bill as a gift in the first mailing. Sample members who do not respond to the first paper questionnaire mailing will receive a second mailing to invite one adult in the household to complete the survey. Lead letters and follow-up letters have both been shown to increase survey response rates.38,39

Interview staff will make at most eight attempts to contact sample members with a valid, working telephone number if no person answers the telephone. Interview staff will make at most 16 attempts to contact sample members if a contact has been made but the interview was not completed. Exceptions to the maximum attempts rule will only be made under appropriate circumstances, such as fewer calls when a sample member completes an interview, fewer calls when a sample member refuses participation, or when a sample member requests an appointment that requires additional calls.

Enhanced Protocol

Every potential household to be included in the Enhanced Protocol will already have had an adult participate in the ATSS (CATI or mailings) and agree to the in-home examination. FIs will make up to five attempts to schedule the in-home visit.

2. Avoidance of Refusals

Participation rates will be maximized through several means: gifts, interviewer training, and administration of the Standard and Enhanced Protocols in Spanish or English.

Standard Protocol

Gifts. The lead letter sent to every potential Standard Protocol respondent will state that $20 will be given as a token of appreciation for completing the ATSS. The lead letter sent to potential Enhanced Protocol respondents will refer to the total of gifts offered for participation in all the data collection, which totals $60 for adults and $10 for children. CATI staff will also mention the relevant gift(s) as they introduce the study. Offering a gift will help gain cooperation from a larger proportion of the sample and compensate respondents on cell phones for the air time used. Promised gifts have been found to be an effective means of increasing response rates in telephone surveyse (e.g., Cantor, Wang, and Abi-Habib40) and reducing nonresponse bias by gaining cooperation from those less interested in the topic.41-43 All sampled households will also receive $2 along with their first invitation letter, as an additional gift to participate (Exhibit B.2.1). Small prepaid incentives have been found to produce modest improvements in screener response rates.44

Interviewer Training and Contact Procedures (CATI). Response rates tend to vary greatly across interviewers.45 Improving interviewer training has been found effective in increasing response rates, particularly among interviewers with lower response rates.46 For this reason, extensive interviewer training is a key aspect of the success of this data collection effort. The following interviewing procedures will be used to maximize response rates for the ATSS CATI survey:

1. Interviewers will be briefed on the potential challenges of administering this survey. Well-defined conversion procedures will be established.

2. If a sample household member initially refuses to participate or gives a noncommittal response (i.e., neither refusing nor agreeing to participate), a member of the interviewing staff will call back in an attempt to recontact the household and explain the importance of participation. Exceptions will be made when appropriate, such as when a sample household member's initial refusal was especially intense or when a specific category of recontact attempts is forbidden by an Institutional Review Board.

Any call backs to cases with prior refusals will be made by conversion staff. Conversion staff are highly experienced interviewers who have demonstrated success in eliciting cooperation. The main purpose of this contact is to ensure that the sample household members understand the importance of the survey and to determine whether anything can be done to make the survey process easier (e.g., schedule a more convenient contact time). At no time will staff be allowed to pressure or coerce sample household members to change their mind about participation in the survey, and this will be carefully monitored throughout survey administration to ensure that no undue pressure is placed on potential respondents.

3. Should a respondent interrupt an interview for reasons such as needing to tend to a household matter, the respondent will be given one or both of these two options: (1) the interviewer will reschedule the interview for completion at a later time; or (2) the respondent will be given a toll-free number, designated specifically for this project, for him or her to call back and complete the interview at his or her convenience.

4. Interviewing staff will be able to provide reluctant respondents with the name and telephone number of the contractor’s project manager who can provide them with additional information regarding the importance of their participation.

5. The contractor will establish a toll-free number, dedicated to the project, so potential respondents may call to confirm the study’s legitimacy.

Special attention will be given to scheduling callbacks and refusal procedures. The contractor will work closely with CDC to set up these rules and procedures. Examples include the following:

  • A detailed definition of when a refusal is considered final.

  • Monitoring of hang-ups that occur during the interview, and categorizing a case as a refusal after three hang-ups.

  • Calling at a wide variety of times, on both weekdays and weekends. Calls will be made up to 9 PM in the sample household's time zone, except on days that the contractor's call center closes earlier.

Refusal avoidance training will take place approximately two to four weeks after data collection begins. During the early period of fielding the survey, supervisors, monitors, and project staff will observe interviewers to evaluate their effectiveness in dealing with respondent objections and overcoming barriers to participation. They will select a team of refusal avoidance specialists from among the interviewers who demonstrate special talents for obtaining cooperation and avoiding initial refusals. These interviewers will be given additional training in specific techniques tailored to the interview, with an emphasis on gaining cooperation, overcoming objections, addressing concerns of gatekeepers, and encouraging participation. If a respondent does refuse to be interviewed or terminates an interview in progress, interviewers will attempt to determine their reason(s) for refusing to participate, by asking the following question: “Could you please tell me why you do not wish to participate in the study?” The interviewer will then code the response and any other additional relevant information. Particular categories of interest include “Don’t have the time,” “Not interested,” “Don’t participate in any surveys,” and “Opposed to government intrusiveness into my privacy.”

Languages of Survey Administration. Both the CATI and paper versions of the ATSS will be offered in Spanish (Attachments 7A-S and 7C-S) or English (Attachments 7A and 7C). Thus, Spanish-speaking sample members who might otherwise have refused to participate because of their inability to complete the surveys in English may instead complete them in Spanish. Spanish-speaking interviewers will be trained to inform Spanish-speaking sample members about the survey and engage them in the process of participation, thereby reducing refusals in this population.

Enhanced Protocol

Gifts. A gift of $40 will be given to adults completing the Adult Biometric Measures; $10 will be given to children aged 12–17 completing the Youth or Caregiver Survey and Youth Biometric Measures; and $10 will be given to caregivers of children aged 3–11 who complete the Caregiver Survey (no additional gift is provided to the 3–11-year-old child completing the Youth Biometric Measures). Proposed gifts are based on both the age of the participant (child vs. adult) and the level of participation. The gifts are slightly lower than those provided to participants in the longer (5.9-hour) National Health and Nutrition Examination Survey (NHANES) examination, for which a gift of $70 is given to persons aged 16 and older and $30 is given to children 2–15 years of age. A gift of $20 will be given to adults who complete accelerometry procedures; $10 will be given to children aged 3–17 who complete accelerometry procedures.

Appointment Procedures. ATSS CATI interviewers will recruit and schedule appointments for the in-home data collection at the end of the ATSS call. Field staff will reschedule appointments if necessary. Respondents who complete the paper ATSS will be encouraged to call a toll-free number to schedule an in-home visit, if eligible. A toll-free number will be given to the recruited households if they need to reschedule.

FIs will meet the sample members at the sampled address, at the appointed time. If a respondent is unavailable when the FI visits, another appointment will be scheduled. If a potential respondent refuses participation at the time of the examination, the FI will leave the premises and code the respondent as having refused participation in the Enhanced Protocol.

Field Interviewer Training. Refusals will be mitigated through a wide array of methods, including hiring high-quality bilingual scheduling staff and FIs; implementing quality assurance procedures, such as close supervision of the FIs by the Data Collection Supervisor; and offering comprehensive training. FIs will attend a centralized training on participant rescheduling, the interview and examination protocol, and handling and field storage procedures for samples. The goal of training will be to prepare staff to perform field survey tasks successfully, in a consistent and standardized fashion, as described in a manual of procedures. FIs will be required to show competency in general interviewing techniques (e.g., asking questions and recording answers appropriately; contacting participants correctly; demonstrating professional behavior, standards, and ethics) and in gaining cooperation and refusal conversion.

Languages of Survey Administration or Conduct of Examination. Every participant (adult or child) will be given the option of completing the Youth or Caregiver Survey or following instructions for biometric measurements in Spanish or English. Thus, Spanish-speaking sample members who might otherwise have refused to participate because of their inability to complete the Youth or Caregiver Survey or Adult or Youth Biometric Measures in English may complete them in Spanish instead. Adults who provide accelerometry data will maintain an Activity Diary for themselves, recording when they got up in the morning and went to bed, as well as the time and reason whenever the device was removed for five minutes or more. Activity Diaries for children aged 3–11 years will be maintained by caregivers, and older children (aged 12–17) will complete their own diary. These diaries will be provided in either English or Spanish.

Methods for Investigating the Impact of Nonresponse. Simple descriptive statistics, such as counts and frequencies, will be tabulated for respondents and nonrespondents at relevant stages of the sampling process (e.g., from telephone contact to completion of ATSS, from completion of ATSS to participation in the Enhanced Protocol). Nonrespondent statistics will be tabulated overall and by subtype (refusal vs. not contacted). Response rates will be calculated and comparisons will be made between respondents and nonrespondents on sociodemographic characteristics (e.g., age, sex, and race/ethnicity) and other relevant factors. Techniques to minimize the potential bias resulting from nonresponse will be considered. If changes in protocol are warranted, plans will be developed for implementation after Institutional Review Board (IRB) and OMB review.
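
For illustration only, the following minimal sketch (in Python with pandas) shows the kind of tabulation described above; the case-level fields, dispositions, and values are hypothetical placeholders rather than actual study variables.

    import pandas as pd

    # Hypothetical case-level file: one row per sampled case, with a disposition
    # at each stage and basic frame demographics appended for comparison.
    cases = pd.DataFrame({
        "stage":     ["ATSS", "ATSS", "ATSS", "Enhanced", "Enhanced"],
        "status":    ["respondent", "refusal", "not_contacted", "respondent", "refusal"],
        "age_group": ["18-34", "35-49", "65+", "18-34", "50-64"],
        "sex":       ["F", "M", "F", "M", "F"],
    })

    # Counts by stage and disposition (nonrespondents split into refusal vs. not contacted).
    disposition = cases.groupby(["stage", "status"]).size().unstack(fill_value=0)
    print(disposition)

    # Response rate per stage: respondents divided by all cases at that stage.
    response_rate = disposition["respondent"] / disposition.sum(axis=1)
    print(response_rate)

    # Compare respondents and nonrespondents on a sociodemographic characteristic.
    cases["respondent"] = cases["status"].eq("respondent")
    print(pd.crosstab(cases["age_group"], cases["respondent"], normalize="columns"))

In practice these tabulations would be run against the full case-management database and extended to the other sociodemographic characteristics named above.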

Management of Missing Data and Other Issues

1. Missing Data

Missing data will be handled with a number of strategies to minimize their impact on the final analysis, including ongoing monitoring of data collection to identify and immediately address problem areas of each instrument, monitoring of individual interviewer performance to ensure that items are not skipped or inaccurately coded, and imputation of missing data once the analysis begins.

During the data collection period, interviewers will be closely monitored by Data Collection Supervisors (on the telephone or in the field) who will review data as they are entered and oversee data collection by listening to or attending interviews to ensure that interviewers are using their training to maximize complete responses. In addition, as data are entered into the surveillance system created by the contractor, statisticians will review responses on a monthly basis to identify patterns of nonresponse for particular items and address problem areas of the instruments. These findings will be used to refine questions on the instruments, where warranted, or to identify areas where skip patterns can be improved to maximize responses.

Of particular concern is the break-off phenomenon, in which respondents tire and quit before the end of the questionnaire. Break-offs could lead to higher nonresponse for items in the latter part of the questionnaire, particularly for the ATSS. If we find that a large number of respondents are dropping out of the study before ATSS completion, we will explore the extent to which we can transfer some questions from the ATSS to the data collection conducted in the home. This strategy would be helpful because break-offs are far less common in person. However, not all respondents would have this option because some will not be invited to participate in the Enhanced Protocol or will refuse to complete an in-home visit. We anticipate that households that agree to schedule an in-home visit for the Enhanced Protocol will have fewer missing data in their responses because the FI will be collecting the data in person. Several steps will be taken to ensure that these appointments are kept, including having an FI call the respondent within one day of completion of the ATSS to confirm the date of the in-home visit, and then calling the respondent prior to the visit as an appointment reminder. Once in the home, the FI will have received training on the various scenarios they may encounter in collecting data for each instrument and will be able to attend to missing responses.

For households invited to participate in the accelerometry data collection, procedures will be implemented to minimize missing data: reminder calls to participants will be made twice during the week they have been asked to wear the monitor, and participants will be instructed that receiving the gift depends on providing at least four days with at least 10 hours of data, following the NHANES accelerometry protocol.47 Imputation techniques will be considered to address intermittently missing data.48 If data are not complete according to these criteria, participants will be asked to wear the accelerometer for an additional seven days. Implementation of “rewear” strategies has been found to increase the completeness of the physical activity database.49
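
As a minimal sketch of how the completeness criterion could be checked, the following fragment (in Python, using hypothetical per-day wear-time values; it is not the NHANES processing code itself) flags whether a participant has provided at least four days with at least 10 hours of wear.

    # Hypothetical wear-time minutes per calendar day for one participant,
    # after nonwear periods have been removed (e.g., per the NHANES protocol).
    wear_minutes_by_day = [655, 702, 431, 615, 720, 90, 680]

    MIN_HOURS_PER_DAY = 10   # a "valid" day requires at least 10 hours of wear
    MIN_VALID_DAYS = 4       # completeness criterion: at least 4 valid days

    valid_days = sum(1 for minutes in wear_minutes_by_day
                     if minutes >= MIN_HOURS_PER_DAY * 60)

    meets_criterion = valid_days >= MIN_VALID_DAYS
    print(f"Valid days: {valid_days}; meets criterion: {meets_criterion}")
    # If the criterion is not met, the participant would be asked to rewear
    # the accelerometer for an additional seven days.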

Once data analysis begins, an imputation strategy may be applied to variables with less than 10% missing data, estimating the missing values from the distribution of the entire sample in each awardee area on specific variables such as age, sex, race/ethnicity, education, and household income.
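
The imputation strategy is not specified beyond the description above. As one illustrative possibility, the following sketch performs a simple hot-deck imputation within cells defined by demographic variables; all column names and values are hypothetical.

    import numpy as np
    import pandas as pd

    rng = np.random.default_rng(2013)

    # Hypothetical respondent-level data with a small amount of item nonresponse.
    df = pd.DataFrame({
        "age_group":      ["18-34", "18-34", "35-49", "35-49", "50-64", "50-64"],
        "sex":            ["F", "F", "M", "M", "F", "F"],
        "fruit_servings": [2.0, np.nan, 1.0, 3.0, np.nan, 4.0],
    })

    def hot_deck(group: pd.Series) -> pd.Series:
        """Fill missing values with random draws from observed values in the same cell."""
        observed = group.dropna().to_numpy()
        if observed.size == 0:
            return group  # no donors in this cell; leave values missing
        fills = rng.choice(observed, size=group.isna().sum())
        out = group.copy()
        out[out.isna()] = fills
        return out

    df["fruit_servings"] = (
        df.groupby(["age_group", "sex"], group_keys=False)["fruit_servings"]
          .apply(hot_deck)
    )
    print(df)

Other model-based approaches (e.g., regression imputation using the same demographic covariates) would follow the same overall pattern of estimating missing values from the observed distribution within each awardee area.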

2. Seasonality

Both the Standard and Enhanced Protocols will be executed over a 12-month data collection period in each data collection year. We do not anticipate seasonal effects to be a methodological issue because data collection in all awardees will occur over a 12-month period in 2013 (upon OMB approval, for the 12 months thereafter), 2015 (repeating the same time period as 2013), and 2017 (upon OMB renewal, for the 12 months thereafter, matching the same time periods as in 2013 and 2015). Nevertheless, the date of data collection will be recorded for the Youth or Caregiver Surveys as well as the Adult and Youth Biometric Measures to account for potential seasonality effects, if necessary.

Description of Sample Weighting

In all analyses, data will be weighted to account for unequal probabilities of selection and response. At least two sets of sample weights will be developed for specific sets of respondents: one for use in analyses of ATSS questionnaire data and the other for use in analyses of data collected during the in-home examination. Because the methodology for creating the two sets of weights is very similar, we describe the creation of the weights once, highlighting where the two sets differ, rather than reproducing the methodology twice.

Weighting Overview

There are four steps in creating the sampling weights:

  1. Calculate the initial weights as the inverse of the probability of selection, with an adjustment for unknown eligibility.

  2. Adjust for nonresponse.

  3. Adjust for household size.

  4. Poststratify.

Step 1: Calculate the Initial Weights

The following formula defines the initial weight, the inverse of the probability of selection of the address for the jth frame member, with an adjustment for addresses of unknown eligibility based on the known eligibility status of resolved cases:

w^{(1)}_{ij} = \frac{N_i}{n_i} \times \frac{n_i}{r_i + nr_i + in_i} = \frac{N_i}{r_i + nr_i + in_i},

where

w^{(1)}_{ij} = the initial weight of the jth address in stratum i,

N_i = the quantity of addresses in stratum i,

r_i = the quantity of sample respondents in stratum i,

nr_i = the quantity of nonresponding addresses in stratum i,

in_i = the quantity of ineligible addresses in stratum i, and

n_i = the quantity of addresses selected in stratum i.

(For clarification, in this notation, superscripts refer to the step in weighting, not a power.) After defining the initial weights, ineligible frame members are removed. The sum of the remaining initial weights is an estimate of the eligible population in the targeted surveillance area.
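
As a minimal illustration of Step 1, the following sketch (in Python with pandas) applies the initial-weight formula as reconstructed above to hypothetical stratum counts; none of the values are actual frame data.

    import pandas as pd

    # Hypothetical stratum-level counts: N = frame addresses, n = addresses selected,
    # r = respondents, nr = nonrespondents, inelig = ineligible addresses.
    strata = pd.DataFrame({
        "stratum": [1, 2],
        "N":       [12000, 8000],
        "n":       [600, 400],
        "r":       [210, 150],
        "nr":      [300, 180],
        "inelig":  [60, 40],
    })

    # Initial weight: inverse probability of selection (N/n) inflated by the
    # unknown-eligibility adjustment n / (r + nr + inelig), which simplifies to
    # N / (r + nr + inelig). Every known-status address in the stratum receives
    # the same initial weight.
    strata["w1"] = strata["N"] / (strata["r"] + strata["nr"] + strata["inelig"])

    # After removing ineligibles, the weighted count of the remaining sample
    # addresses estimates the eligible population in the stratum.
    strata["eligible_pop_est"] = strata["w1"] * (strata["r"] + strata["nr"])
    print(strata[["stratum", "w1", "eligible_pop_est"]])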

Step 2: Adjust for Nonresponse

The sample receives a model-based nonresponse adjustment. Selection occurs in two stages: the household is selected in the first stage, and the individual is selected in the second stage. In some studies, adjustments for nonresponse occur at both stages of selection. In this study we will adjust for these two stages of nonresponse using one model. The rationale for combining the two stages is that, for mail contacts, we will not be able to distinguish between household- and individual-level nonresponse. For telephone contacts, we expect 95% of the nonresponse to occur at the household level. If we used two models for nonresponse, one at the household level and one at the individual level, we would not have enough data to produce a good model at the individual level. Consequently, we adjust for household- and individual-level nonresponse in one model.

American Community Survey (ACS) data are appended to the sample frame. A logistic regression model is fit predicting the probability of response using ACS data and stratification variables as predictors.

The following independent variables are used in the model that predicts nonresponse.

From the sample frame:

  • Awardee

  • Rural/Urban—based on the National Center for Health Statistics (NCHS) Urban-Rural Classification Scheme for Counties

  • African American density category

  • Hispanic surname indicator

  • Indicator that a child is in the household

  • Indicator that the household has a telephone number

From the ACS:

  • Proportion Hispanic in the block group in which the address is located

  • Ratio of households that are owner occupied to rentals in the block group in which the address is located

  • Proportion of the population with a bachelor’s degree or higher in the block group in which the address is located

  • Proportion of the population in poverty in the block group in which the address is located

Each continuous variable from the ACS is converted into a four-level categorical variable by collapsing it at the quartiles of its distribution.
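
For illustration, a minimal sketch of this quartile collapsing in Python with pandas; the column name pct_poverty and its values are hypothetical stand-ins for an appended ACS block-group measure.

    import pandas as pd

    # Hypothetical block-group proportion appended from the ACS.
    frame = pd.DataFrame({"pct_poverty": [0.02, 0.05, 0.08, 0.11, 0.15, 0.22, 0.30, 0.41]})

    # Collapse the continuous ACS measure into a four-level categorical variable
    # using the quartiles of its distribution.
    frame["pct_poverty_q"] = pd.qcut(frame["pct_poverty"], q=4,
                                     labels=["Q1", "Q2", "Q3", "Q4"])
    print(frame)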

We will fit two weighted logistic models (one for each set of weights described above), incorporating the weights calculated up to this step and applying SAS software Proc SURVEYLOGISTIC to fit the following model:

\log\left(\frac{p_{ij}}{1 - p_{ij}}\right) = \beta_0 + \beta_1 x_{1,ij} + \dots + \beta_{10} x_{10,ij}.

The independent variables x_{1,ij}, \dots, x_{10,ij} are the 10 variables in the previous list. We will test for collinearity and potentially reduce the number of independent variables based on model-fitting diagnostics. Once we fit the final models, we calculate the predicted probability of response for each sample member:

\hat{p}_{ij} = \frac{\exp(\hat{\beta}_0 + \hat{\beta}_1 x_{1,ij} + \dots + \hat{\beta}_{10} x_{10,ij})}{1 + \exp(\hat{\beta}_0 + \hat{\beta}_1 x_{1,ij} + \dots + \hat{\beta}_{10} x_{10,ij})}.

The index i refers to one of the strata, and the index j refers to one of the sample members within stratum i. A new weight is calculated:

w^{(2)}_{ij} = \frac{w^{(1)}_{ij}}{\hat{p}_{ij}},

where

w^{(2)}_{ij} = the nonresponse-adjusted weight for the jth frame member in the ith stratum and

\hat{p}_{ij} = the predicted probability of response for the jth respondent in the ith stratum from the logistic model.

Because the sum of the nonresponse-adjusted weights (\sum w^{(2)}_{ij}) is not exactly equal to the sum of the initial weights (\sum w^{(1)}_{ij}), we make the following ratio adjustment:

w^{(2)}_{ij} \leftarrow w^{(2)}_{ij} \times \frac{\sum_{i}\sum_{j} w^{(1)}_{ij}}{\sum_{i}\sum_{j} w^{(2)}_{ij}}.
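
The text above specifies SAS Proc SURVEYLOGISTIC for model fitting. For illustration only, the following Python sketch (using statsmodels) shows the same general sequence under simplifying assumptions: all variable names and data are hypothetical stand-ins, and only two of the ten predictors are included.

    import numpy as np
    import pandas as pd
    import statsmodels.api as sm

    rng = np.random.default_rng(7)
    n = 500

    # Hypothetical frame file: initial weights w1, a response indicator, and two
    # categorical predictors standing in for the full list of 10 model variables.
    frame = pd.DataFrame({
        "w1":          rng.uniform(50, 150, size=n),
        "responded":   rng.integers(0, 2, size=n),
        "rural_urban": rng.choice(["rural", "urban"], size=n),
        "poverty_q":   rng.choice(["Q1", "Q2", "Q3", "Q4"], size=n),
    })

    # Dummy-coded design matrix; weighted logistic (binomial GLM) fit using the
    # weights calculated up to this step, as a rough analogue of the SAS fit.
    X = pd.get_dummies(frame[["rural_urban", "poverty_q"]], drop_first=True).astype(float)
    X = sm.add_constant(X)
    fit = sm.GLM(frame["responded"], X, family=sm.families.Binomial(),
                 freq_weights=frame["w1"]).fit()

    # Predicted response propensity for every frame member, then the
    # nonresponse-adjusted weight w2 = w1 / p_hat for respondents only.
    frame["p_hat"] = fit.predict(X)
    respondents = frame[frame["responded"] == 1].copy()
    respondents["w2"] = respondents["w1"] / respondents["p_hat"]

    # Ratio adjustment so the adjusted weights sum to the sum of the initial weights.
    ratio = frame["w1"].sum() / respondents["w2"].sum()
    respondents["w2"] = respondents["w2"] * ratio
    print(respondents[["w1", "p_hat", "w2"]].head())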

Step 3: Adjust for Number of Eligible Household Members

Each subject has a weight that reflects the inverse of the probability of selection and a nonresponse adjustment. In this step we adjust for the number of eligible residents in the household, because one eligible resident is selected from each sampled household:

w^{(3)}_{ij} = w^{(2)}_{ij} \times m_{ij},

where

m_{ij} = the number of eligible residents (adults or children, as appropriate) in the household of the selected respondent for the jth listed address in stratum i.

Step 4: Poststratify

The last step of weighting is poststratification to the latest population estimates. We poststratify to the following domains:

  • Awardee by age category (3–9, 10–14, 15–17, 18–34, 35–49, 50–64, 65+)

  • Awardee by sex

  • Awardee by race (white, black, other)

  • Awardee by Hispanic status

The population totals for the poststratification domains will come from the latest available version of the ACS five-year summary file.
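
Because the poststratification domains listed above are overlapping margins rather than a full cross-classification, one common way to implement this step is raking (iterative proportional fitting) to the control totals; that choice is an assumption here, not a statement of the study's exact procedure. The following sketch is illustrative only, with hypothetical data, two margins, and a single awardee, and it also shows the Step 3 household-size adjustment for completeness.

    import numpy as np
    import pandas as pd

    rng = np.random.default_rng(11)
    n = 200

    # Hypothetical respondent file for one awardee: nonresponse-adjusted weight w2,
    # number of eligible household members m, and two demographic margins.
    resp = pd.DataFrame({
        "w2":      rng.uniform(80, 200, size=n),
        "m":       rng.integers(1, 4, size=n),
        "age_cat": rng.choice(["18-34", "35-49", "50-64", "65+"], size=n),
        "sex":     rng.choice(["F", "M"], size=n),
    })

    # Step 3: multiply by the number of eligible residents to account for
    # selecting one eligible person per household.
    resp["w3"] = resp["w2"] * resp["m"]

    # Hypothetical ACS control totals for the two margins (both sum to the same total).
    age_totals = {"18-34": 30000, "35-49": 28000, "50-64": 25000, "65+": 17000}
    sex_totals = {"F": 52000, "M": 48000}

    # Step 4: rake (iterative proportional fitting) the weights to the margins.
    resp["w4"] = resp["w3"].copy()
    for _ in range(25):
        for margin, totals in (("age_cat", age_totals), ("sex", sex_totals)):
            current = resp.groupby(margin)["w4"].sum()
            factors = resp[margin].map({k: totals[k] / current[k] for k in totals})
            resp["w4"] = resp["w4"] * factors

    print(resp.groupby("age_cat")["w4"].sum())  # approximately matches age totals
    print(resp.groupby("sex")["w4"].sum())      # approximately matches sex totals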

B.4 Test of Procedures or Methods to be Undertaken

Internal tests were conducted with all the instruments and screeners to be used in both the Standard and Enhanced Protocols, and adjustments were made according to the results. The purpose of the internal testing was to:

  • test and revise materials for participant recruitment and survey administration;

  • ensure clarity of survey language; and

  • identify timing, skip patterns, and other complex conceptual issues that may not be readily obvious from simple reading of the survey.

Given that the majority of the items were drawn from previously fielded surveillance instruments (Attachment 5) that have been shown to be valid and reliable with the appropriate age groups, we expected that the majority of the questions would be easily understood and accurately answered by the target group of respondents; problems resulting from vocabulary and complex sentence structure, and validity problems resulting from misinterpretation of the questions, were minimal. Accordingly, perceived instances of misunderstanding, incomplete concept coverage, and inconsistent interpretation were rare, and only a few words and answer choices were altered to address these concerns. The pilot testing focused on:

  • Consistency—We tested to ensure that the instrument was applicable for all modes of administration and allowed maximal comparison to data from the source instruments from which the questions were drawn. For example, the paper ATSS needed to be adapted slightly from the CATI format, and questions were aligned between the Youth or Caregiver Surveys to allow aggregation of data at the time of analysis. In addition, wording or answer choices were adjusted to permit the best comparisons between the instrument and the source instruments from which they were drawn.

  • Length—To avoid excessive respondent burden and to meet the approximate time estimates provided in the 60-day Federal Register Notice (Attachment 2) for this Information Collection Request (ICR), we deleted questions based on the pilot results.

  • Question sequencing and overall flow—We pilot tested the full process of the computer-assisted personal interviewing (CAPI), including introduction, respondent selection method, and questionnaire flow. Based on the results, we eliminated redundancy and shifted the ordering of items to ensure a smooth flow of the data collection process to maximize the efficiency of collecting accurate responses. Skip patterns were adjusted to reflect changes in the ordering of items that were made to improve flow and eliminate potential errors.

  • Salience—Based on our pilot findings, we modified the recall periods to ensure as much consistency as possible, while permitting comparison to the source instruments. We also inserted the exact dates for recall as autofill (e.g., “During the past 12 months, that is since January 1, 2011”). We also confirmed that allowing respondents to choose their reporting period was helpful (e.g., offering day, week, or month for reporting foods eaten). We also modified the ordering of certain questions to better assist respondents with recall.

  • Ease of administration and response—Interviewers did not note any difficulty in administering the instruments. As expected, a few respondents did struggle to complete the food frequency items. Instruments were modified to provide consistent recall periods and reference dates, as noted above. Based on the pilot test results, we offered additional examples within questions; for example, we updated the computer time use question to include time spent using an iPad.

  • Acceptability to respondents—Results from the pilot test suggested that participants were comfortable answering questions and that the range of response options was generally comprehensive. However, additional response choices were added to certain questions as a result of the pilot. Importantly, respondents did not report that questionnaire items were too sensitive to answer.

After review by external experts in each of the content areas specific to the CTG Program evaluation, the ATSS (both CATI and mail versions) and the Youth and Caregiver Surveys were revised and programmed for administration via CATI or CAPI, respectively (Exhibit B.4.1). Since the initial OMB submission, survey questions that were changed or adapted by the contractor have been reviewed by the experts identified in Section A.8. Questions have been programmed and tested for accuracy, flow, correct implementation of skip patterns, and the other features and content described above.

Exhibit B.4.1. Survey Instruments and Materials for Pilot Testing

Standard Protocol

ATSS Telephone Survey (CATI) pilot testing data components:

  1. Adult Targeted Surveillance Survey Telephone Screener

  2. ATSS Telephone Survey

  3. Adult Biometric Measures Recruitment Screener (Phone)

ATSS Paper (Mail) pilot testing data components:

  1. ATSS Paper (Mail) Survey

  2. Adult Biometric Measures Recruitment Invitation (Paper)

Enhanced Protocol

Caregiver Survey pilot testing data components:

  1. Caregiver Survey

  2. Youth Biometric Measures

Youth Survey pilot testing data components:

  1. Youth Survey

  2. Youth Biometric Measures


B.5 Individuals Consulted on Statistical Aspects or Analyzing Data

Robin Soler, PhD (770-488-5103), Division of Community Health, CDC, is the Principal Investigator and Technical Monitor for the study. She has overall responsibility for overseeing the design and administration of the surveys, and she will be responsible for analyzing the survey data.

RTI International is the project contractor responsible for developing the instruments and data collection protocols; providing training to interviewers; and collecting and analyzing data from the Standard and Enhanced Protocols. Diane Catellier, DrPH (919-541-6447), is the primary contact with the Technical Monitor and oversees work on all tasks related to the Targeted Surveillance and Biometric Study.

The survey instruments, sampling and data collection procedures, and analysis plan were designed in collaboration with researchers at the Department of Health and Human Services (HHS), CDC, and RTI (Exhibits B.5.1 and B.5.2). The following personnel have been involved in the design of the protocol and data collection instrument (note that additional experts will be asked to review instruments before they are pilot tested and finalized but the respondent burden will not change):

Exhibit B.5.1. List of Individuals and Organizations That Were Consulted for the Study

Each entry lists the consultant’s name, organization, and contact information.

  • Seraphine Pitt Barnes, PhD, MPH, CHES. Division of Population Health; National Center for Chronic Disease Prevention and Health Promotion (NCCDPHP). Phone: (770) 488-6115; E-mail: [email protected]

  • Nilka Burrows, MPH. Division of Diabetes Translation; NCCDPHP. Phone: (770) 488-1057; E-mail: [email protected]

  • Dan Chapman, PhD. Division of Population Health; NCCDPHP. Phone: (770) 488-5463; E-mail: [email protected]

  • Pyone Cho, MD. Division of Diabetes Translation; NCCDPHP. Phone: (770) 488-2041; E-mail: [email protected]

  • Kristine Day, MPH. Division of Community Health; NCCDPHP. Phone: (770) 488-5446; E-mail: [email protected]

  • Martha Engstrom, MS. Office of Smoking and Health; NCCDPHP. Phone: (770) 488-5749; E-mail: [email protected]

  • Charlotte Kent, PhD. Division of Community Health; NCCDPHP. Phone: (770) 488-6471; E-mail: [email protected]

  • Brian King, PhD. Office of Smoking and Health; NCCDPHP. Phone: (770) 488-5107; E-mail: [email protected]

  • Rosemarie Kobau, MPH. Office of Noncommunicable Diseases, Injury and Environmental Health. Phone: (770) 488-6087; E-mail: [email protected]

  • Youlian Liao, MD. Division of Community Health; NCCDPHP. Phone: (770) 488-5299; E-mail: [email protected]

  • Fleetwood Loustalot, PhD. Division of Heart Disease and Stroke Prevention; NCCDPHP. Phone: (770) 488-5198; E-mail: [email protected]

  • Louise Murphy, PhD. Division of Population Health; NCCDPHP. Phone: (770) 488-5102; E-mail: [email protected]

  • Rashid Njai, PhD. Division of Community Health; NCCDPHP. Phone: (770) 588-5215; E-mail: [email protected]

  • Tatiana Nwankwo, MS. Division of Health and Nutrition Examination Surveys; National Center for Health Statistics (NCHS). Phone: (301) 458-4813; E-mail: [email protected]

  • Diane Orenstein, PhD. Division of Community Health; NCCDPHP. Phone: (770) 488-8003; E-mail: [email protected]

  • Yechiam Ostchega, PhD, RN. Division of Health and Nutrition Examination Surveys; NCHS. Phone: (301) 458-4408; E-mail: [email protected]

  • Paul Siegel, MD, MPH. Division of Community Health; NCCDPHP. Phone: (770) 488-5296; E-mail: [email protected]

  • Robin Soler, PhD. Division of Community Health; NCCDPHP. Phone: (770) 488-5103; E-mail: [email protected]

  • Matthew Zack, MD, MPH. Division of Population Health; NCCDPHP. Phone: (770) 488-5460; E-mail: [email protected]


Exhibit B.5.2. Leads in Data Collection, Research/Sampling Design, and Data Analysis

Each entry lists the task, the RTI lead, the reviewer(s), and contact information.

Data Collection

  • All Data Collection Activities for Standard Protocol. Lead: Brenna Muldavin, MS (RTI). Reviewer: Kristina Peterson, PhD, MA. Phone: (919) 541-6389; E-mail: [email protected]

  • All Data Collection Activities for Enhanced Protocol. Lead: Jane Hammond, PhD (RTI). Reviewers: Dan Zaccaro, MS; Brenna Muldavin, MS; Kristina Peterson, PhD, MA. Phone: (301) 770-8207; E-mail: [email protected]

Study Design

  • Standard Protocol Sampling Design. Lead: Burton Levine, MS, MA (RTI). Reviewers: Rachel Harter, PhD; Diane Catellier, DrPH; Debra Holden, PhD; Todd Rogers, PhD. Phone: (919) 541-1252; E-mail: [email protected]

  • Standard Protocol Survey Design. Lead: Andrea Anater, PhD (RTI). Reviewers: Debra Holden, PhD; Matthew Farrelly, PhD; James Nonnemaker, PhD, MSPH; Carol Schmitt, PhD, MA; Todd Rogers, PhD. Phone: (919) 541-6977; E-mail: [email protected]

  • Enhanced Protocol Sampling Design. Lead: Burton Levine, MS, MA (RTI). Reviewers: Diane Catellier, DrPH; Debra Holden, PhD; Todd Rogers, PhD. Phone: (919) 541-1252; E-mail: [email protected]

  • Enhanced Protocol Survey Design. Lead: Andrea Anater, PhD (RTI). Reviewers: Debra Holden, PhD; Dan Zaccaro, MS; Todd Rogers, PhD. Phone: (919) 541-6977; E-mail: [email protected]

  • Sample Weighting Design. Lead: Burton Levine, MS, MA (RTI). Reviewers: Diane Catellier, DrPH; Rachel Harter, PhD; Dan Zaccaro, MS. Phone: (919) 541-1252; E-mail: [email protected]

  • Enhanced Protocol Biometric Sample Collection. Lead: Jane Hammond, PhD (RTI). Reviewer: Aten Solutions, Inc. (A10). Phone: (301) 770-8207; E-mail: [email protected]

Data Analysis

  • Data Analysis for Standard Protocol. Lead: Diane Catellier, DrPH (RTI). Reviewers: Rachel Harter, PhD; Debra Holden, PhD; Todd Rogers, PhD. Phone: (919) 541-6447; E-mail: [email protected]

  • Data Analysis for Enhanced Protocol. Lead: Jane Hammond, PhD (RTI). Reviewers: Diane Catellier, DrPH; Dan Zaccaro, MS; Rachel Harter, PhD; Debra Holden, PhD; Todd Rogers, PhD. Phone: (301) 770-8207; E-mail: [email protected]



B.6 References

  1. Rosner, B. (1995). Fundamentals of biostatistics (4th ed.; p. 384). Belmont, CA: Duxbury Press.

  2. Cohen, J. (1988). Statistical power analysis for the behavioral sciences (2nd ed.). Hillsdale, NJ: Lawrence Erlbaum Associates, Inc., Publishers.

  3. Kwon, J. H., Jang, H. Y., Rho, J. S., Jung, J. H., Yum, K. S., & Han, J. W. (2011). Association of visceral fat and risk factors for metabolic syndrome in children and adolescents. Yonsei Medical Journal, 52(1), 39–44.

  4. Halterman, J. S., Borreli, B., Tremblay, P., Conn, K. M., Fagnano, M., Montes, G., & Hernandez, T. (2008). Screening for environmental tobacco smoke exposure among inner-city children with asthma. Pediatrics, 122(6), 1277–1283.

  5. Link, M., & Mokdad, A. (2005). Advance letters as a means of improving respondent cooperation in RDD studies: A multi-state experiment. Public Opinion Quarterly, 69(4), 572–587.

  6. De Leeuw, E., Callegaro, M., Hox, J., Korendijk, E., & Lensvelt-Mulders, G. (2007). The influence of advance letters on response in telephone surveys. A meta-analysis. Public Opinion Quarterly, 71(3), 413–443.

  7. Cantor, D., Wang, K., & Abi-Habib, N. (2003). Comparing promised and pre-paid incentives for an extended interview on a random digit dial survey. Proceedings of the Survey Research Methods Section of the ASA, Nashville, TN.

  8. Singer, E., Van Hoewyk, J., & Maher, M. P. (2000). Experiments with incentives in telephone surveys. Public Opinion Quarterly, 64(2), 171–188.

  9. Groves, R. M. (2006). Nonresponse rates and nonresponse bias in household surveys. Public Opinion Quarterly, 70(5), 646–675.

  10. Groves, R. M., Singer, E., & Corning, A. (2000). Leverage-saliency theory of survey participation—description and an illustration. Public Opinion Quarterly, 64, 299–308.

  11. Cantor, D., O’Hare, B. C., & O’Connor, K. S. (2007). The use of monetary incentives to reduce non-response in random digit dial telephone surveys. In J.M. Lepkowski, C. Tucker, J. M. Brick, E. de Leeuw, L. Japec, P. J. Lavrakas, M. W. Link, & R. L. Sangster (eds.), Advances in telephone survey methodology. John Wiley & Sons.

  12. O’Muircheartaigh, C., & Campanelli, P. (1999). A multilevel exploration of the role of interviewers in survey non-response. Journal of the Royal Statistical Society, 162, 437–446.

  13. Groves, R. M., & McGonagle, K. A. (2001). A theory-guided interviewer training protocol regarding survey participation. Journal of Official Statistics, 17, 249–265.

  14. Troiano, R. P., Berrigan, D., Dodd, K. W., Mâsse, L. C., Tilert, T., & McDowell, M. (2008). Physical activity in the United States measured by accelerometer. Medicine and Science in Sports and Exercise, 40, 181–188.

  15. Catellier, D. J., Hannan, P. J., Murray, D. M., Addy, C. L., Conway, T. L., Yang, S. & Rice, J. C. (2005). Imputation of missing data when measuring physical activity by accelerometry. Medicine and Science in Sports and Exercise, 37(11 Suppl), S555–S562.

  16. Tudor-Locke, C., Camhi, S. M., & Troiano, R. P. (2012). A catalog of rules, variables, and definitions applied to accelerometer data in the National Health and Nutrition Examination Survey, 2003–2006. Preventing Chronic Disease, 9.





a CTG Program Implementation awards were made to 35 communities to use proven programs and strategies to improve their community’s health and wellness. CTG Program Capacity-building awards were made to 26 communities to build a solid foundation for community prevention efforts to ensure long-term success.


b NCHS Urban-Rural Classification Scheme for Counties is described here: http://www.cdc.gov/nchs/data_access/urban_rural.htm.

c Acxiom’s Web site states, “Acxiom’s InfoBase is the world’s largest compilation of timely, up-to-date consumer intelligence with 176 million consumers and 111 million households” (http://lists.nextmark.com/market;jsessionid=B728DEBDA6B7648AD977CE27304780A5?page=order/online/datacard&id=131838).

d The test for the difference of two proportions is described in many places including Rosner.34

e Singer and colleagues41 have been cited as providing evidence toward the ineffectiveness of promised incentives to increase survey response rates. However, approximately 200 sample cases were assigned to each condition (with or without incentive) in their experiments, so very large differences would have been required to reach statistical significance. The pattern nevertheless supported the effectiveness of promised incentives: in all four of their experiments, the response rate was higher in the incentive condition. Furthermore, the experiments were conducted in 1996, when response rates were close to 70% and presumably harder to increase through incentives than the lower response rates seen currently (below 50% on that same survey).
