
National Survey of Community-Based Policy and Environmental Supports for Healthy Eating and Active Living

OMB Control Number: 0920-1327

New



Supporting Statement B















Program Official/Contact

Deborah Galuska, MPH, PhD

Associate Director of Science

National Center for Chronic Disease Prevention and Health Promotion

Centers for Disease Control and Prevention

P: 770-488-6017

F: 404-235-1802

[email protected]



June 15, 2020



ATTACHMENTS


1. Public Health Service Act [42 U.S.C. 241]

2a. 60-Day Federal Register Notice

2b. Response to public comments from 60-Day Federal Register Notice

3a. Web questionnaire

3b. Self-administered hardcopy questionnaire

4. Pre-notification letter

5a. Survey invitation letter

5b. Frequently asked questions

6a. Email invitation letter

6b. Reminder email template

6c. SAQ cover letter

6d. Last chance email

7. Telephone contacting scripts

8. Critical Item Survey

9. Web questionnaire screenshots

10. Privacy Narrative

11. Human subjects document non-research determination

12. Privacy impact assessment form




B. COLLECTIONS OF INFORMATION EMPLOYING STATISTICAL METHODS

B1. Respondent Universe and Sampling Methods

The respondent universe for this nationally representative data collection consists of all municipalities within the 50 United States. Municipal governments, as defined by the U.S. Census Bureau, are political subdivisions within which a municipal corporation has been established to provide general local government for a specific population concentration in a defined area; they include all active governmental units officially designated as cities, boroughs (except in Alaska), towns (except in the six New England States and in Minnesota, New York, and Wisconsin), and villages. Municipal governments are therefore the primary sampling unit (PSU) for this data collection effort.


The sampling frame is constructed using the same methodology as the 2014 survey (National Survey of Community-Based Policy and Environmental Supports for Healthy Eating and Active Living, OMB Control Number 0920-1007). Specifically, it will be built from the most recent U.S. Census of Governments (COG) file, which lists the municipalities and townships in each State. COG data on the name, type (e.g., city, town, village), and total population of each municipality will be used to construct the sampling frame. The following COG variables will be used, organized into two groups by their purpose: (1) the sampling design and (2) recruitment:

  1. Variables useful for the design:

    1. Community name

    2. Population

    3. County name

    4. Place as indicated by Federal Information Processing Standards (FIPS)

    5. Political description

  2. Variables useful for recruitment to the study:

    1. Community name

    2. Title of political leader (e.g., city or town manager or planner, or a person with similar responsibilities for the sampled municipality)

    3. Physical address of government office

    4. Web address of government site (if applicable, not available for all communities)


The development of the sample frame requires a classification process to ensure there is no geographic overlap between municipalities and towns or townships. This step prevents double-counting populations that fall under the jurisdiction of both forms of government, which would inflate the numerators of the population estimates the study will develop. To eliminate the potential for geographic overlap in the sample frame, we will apply the same eligibility criteria as the previous survey and include only governments classified in the COG data as a Municipality (rather than a Township). Using municipalities as the PSU, the finest-grained unit available, makes it possible to provide a representative sample of policies passed at the municipal level.


Threshold for inclusion in the sample frame. The original project's pilot study (OMB No. 0920-0934, "Pilot Study of Community-Based Surveillance and Supports for Healthy Eating/Active Living", expiration 5/31/2013), which did not impose a population size threshold, investigated the tradeoff involved in adopting a threshold below which communities are considered too small to be included in the frame. The pilot findings confirmed that municipalities with populations under 1,000 were unlikely to respond because they were too small to have many of the policies the survey asks about. The pilot also confirmed that completing the survey would place undue burden on the city managers or planners of these small communities, whose positions are often part-time. While the use of a threshold leads to some coverage loss, it also yields gains in efficiency, in overall and item response rates, and in reaching the subset of communities that can meaningfully provide the kind of data envisioned by the study. These findings informed the sample frame of the 2014 baseline study, and the same population size restriction remains in effect for the current study.
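To make the eligibility rules concrete, the sketch below applies the two frame criteria described above, the Municipality classification and the 1,000-population threshold, to a toy extract of the COG file. The column names and records are illustrative assumptions, not the actual COG layout.

```python
import pandas as pd

# Toy stand-in for the COG file; real column names and layout will differ.
cog = pd.DataFrame({
    "community_name":        ["Springfield", "Smallville", "Oak Township"],
    "political_description": ["Municipality", "Municipality", "Township"],
    "population":            [25_000, 600, 4_000],
})

# Frame eligibility: classified as a Municipality (avoids geographic overlap
# with townships) and population of at least 1,000 (pilot-informed threshold).
frame = cog[
    (cog["political_description"] == "Municipality")
    & (cog["population"] >= 1_000)
].copy()

print(frame["community_name"].tolist())  # ['Springfield']
```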


The sample size for the national study, 4,417 communities, was developed to ensure precise estimation and sufficient statistical power for subgroup comparisons. It was determined by two criteria: (1) the ability to ensure 95% confidence intervals within ± 3 percentage points for all study estimates, taking into account the anticipated design effect of 1.1 and an anticipated response rate of 60%; and (2) the ability to detect differences between subgroup percentages as small as 5 percentage points (alpha = 0.05) with 80% power. Relative to the previous survey, this sample should also be sufficient to detect a 3-percentage-point national change in an arbitrary proportion with 85% power.


Sample Selection. The sample for the national survey will be stratified along several relevant dimensions: region, population size, and urban status. The design combines explicit and implicit stratification. Explicit stratification by region will ensure a sufficient number of sample communities in each region, and additional explicit stratification by urban status will ensure sufficient urban and nonurban (rural) communities in the sample. Implicit stratification by size, which allows continuous variables such as population size to be used in stratification, will be implemented by sorting the frame by population size within each explicit stratum, ensuring that the sample is distributed across varying community size categories.


The sampling frame will be explicitly stratified by Census Region and by urban status. The urban versus rural dichotomy will follow the Census Bureau classification. Table B1.a provides an overview of the States by their specific Census Region.


Table B1.a. Definition of Census Regions in the United States

Region       | States
1. Northeast | Connecticut, Maine, Massachusetts, New Hampshire, Rhode Island, Vermont, New Jersey, New York, Pennsylvania
2. Midwest   | Indiana, Illinois, Michigan, Ohio, Wisconsin, Iowa, Kansas, Minnesota, Missouri, Nebraska, North Dakota, South Dakota
3. South     | Delaware, District of Columbia, Florida, Georgia, Maryland, North Carolina, South Carolina, Virginia, West Virginia, Alabama, Kentucky, Mississippi, Tennessee, Arkansas, Louisiana, Oklahoma, Texas
4. West      | Arizona, Colorado, Idaho, New Mexico, Montana, Utah, Nevada, Wyoming, Alaska, California, Hawaii, Oregon, Washington


To quantify urban status, we will link Census SF-1 data for each Census Place. Both the SF-1 file and the COG file provide a Place FIPS code (ID) that can be used to link the two data sets. We will use a classification of urban and rural areas consistent with the Urban Area to Place Relationship File.1 The Census Bureau's urban areas represent densely developed territory and encompass residential, commercial, and other nonresidential urban land uses. For the 2010 Census, an urban area comprises a densely settled core of census tracts and/or census blocks that meet minimum population density requirements, along with adjacent territory.


Rural areas encompass all population, housing, and territory not included within an urban area. We will use the proportion of a Census place's population that resides within a Census-designated urban area (the variable PLPOPPCT, whose calculation is described in the reference cited above), classifying eligible places by whether they fall above or below the 30th percentile of the national (not region-specific) distribution of this measure. Within each stratum defined by region and urban status, the frame will be sorted by size, using 2017 population data from the COG files as the community size variable.
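As a rough illustration of this classification step, the sketch below splits a frame on the 30th percentile of PLPOPPCT computed over the national distribution. The simulated data and the direction of the cut (higher PLPOPPCT treated as urban) are assumptions for illustration only.

```python
import numpy as np
import pandas as pd

# Illustrative frame with a simulated PLPOPPCT (percent of each place's
# population living inside a Census-designated urban area).
rng = np.random.default_rng(7)
frame = pd.DataFrame({"PLPOPPCT": rng.uniform(0, 100, size=10_300)})

# Cut point: 30th percentile of the national (not region-specific) distribution.
cut = np.percentile(frame["PLPOPPCT"], 30)
frame["urban_status"] = np.where(frame["PLPOPPCT"] >= cut, "Urban", "Rural")

print(frame["urban_status"].value_counts())  # roughly 70% Urban, 30% Rural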


The sample will be allocated to the explicit strata defined by region and urban status to meet precision requirements for stratum comparisons. Implicit stratification by size also helps to ensure proportional representation of communities of varying sizes. In addition, we will sample communities systematically with equal probabilities within strata. Table B1.b presents the proposed sample sizes for the current data collection by stratum.


Table B1.b. Proposed Sample Sizes by Stratum

Stratum   | Municipalities on 2017 frame | Proposed sample size (respondents) | Margin of Error (MOE) | Initial sample (assuming 60% response) | Fraction sampled
Urban, NE |  1,281 |   400 | 4.1% |   667 | 52%
Urban, MW |  2,371 |   500 | 3.9% |   833 | 35%
Urban, S  |  2,374 |   500 | 3.9% |   833 | 35%
Urban, W  |  1,179 |   400 | 4.0% |   667 | 57%
Rural, NE |    178 |   100 | 6.5% |   167 | 94%
Rural, MW |  1,239 |   300 | 4.9% |   500 | 40%
Rural, S  |  1,354 |   300 | 5.0% |   500 | 37%
Rural, W  |    324 |   150 | 5.9% |   250 | 77%
TOTAL     | 10,300 | 2,650 | 2.9% | 4,417 | 43%
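The sketch below shows one way to draw the equal-probability systematic sample within a single explicit stratum, sorting by population first so that the sort acts as implicit stratification by size. The function and the toy stratum (sized like the Urban, NE row above) are illustrative, not the production sampling code.

```python
import numpy as np
import pandas as pd

def systematic_sample(stratum: pd.DataFrame, n: int,
                      rng: np.random.Generator) -> pd.DataFrame:
    """Equal-probability systematic sample of n units from one stratum,
    sorted by population (implicit stratification by size)."""
    stratum = stratum.sort_values("population").reset_index(drop=True)
    k = len(stratum) / n                               # sampling interval
    start = rng.uniform(0, k)                          # random start in [0, k)
    picks = np.floor(start + k * np.arange(n)).astype(int)
    return stratum.iloc[picks]

rng = np.random.default_rng(2020)
# Toy stratum sized like "Urban, NE" (1,281 frame units, 667 selections).
urban_ne = pd.DataFrame({"population": rng.integers(1_000, 500_000, 1_281)})
print(len(systematic_sample(urban_ne, n=667, rng=rng)))  # 667
```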


Estimation Precision. To ensure precision for percentages of all magnitudes, we assumed the worst-case prevalence (50%, where variances and standard errors reach their maximum). Our calculations took into account the design effect and the finite population correction (FPC). For the single-stage sampling design proposed for the study, design effects (DEFFs) very near 1.0 can be expected. The DEFF is defined as the variance under the actual sampling design divided by the variance of a simple random sample (SRS) of the same size; values near 1.0 indicate efficient designs whose variance is comparable to that under SRS designs. The sampling approach assumes a DEFF of 1.2 to account for minor effects of unequal weighting (e.g., due to stratified sampling). The COG lists about 19,500 municipalities; after the exclusion of very small municipalities, this is reduced to about 10,300 units. To achieve a 95% confidence interval of ± 3 percentage points or less for estimated proportions nationally, a minimum sample size of 1,332 participants is necessary, corresponding to 1,903 selections when inflated for the anticipated response rate of 60%. This calculation takes the finite population correction into account; without the FPC, the initial sample size would have been 1,677 selections.
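A minimal sketch of this precision calculation follows, using the standard formula for a proportion with a design effect and FPC. The exact published figures (1,332 completes, 1,903 selections) may reflect rounding conventions or additional assumptions not detailed in the text.

```python
from math import ceil

# Worst-case precision calculation: 95% CI of +/- 3 points at p = 0.5.
z, p, e = 1.96, 0.5, 0.03            # critical value, prevalence, half-width
deff, N, rr = 1.2, 10_300, 0.60      # design effect, frame size, response rate

n_srs = z**2 * p * (1 - p) / e**2    # SRS size, infinite population (~1,067)
n_deff = n_srs * deff                # inflated for the design effect (~1,281)
n_fpc = n_deff / (1 + n_deff / N)    # finite population correction (~1,139)
selections = ceil(n_fpc / rr)        # inflated for 60% response (~1,899)

print(ceil(n_fpc), selections)       # lands close to the 1,903 selections cited
```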


Detecting Small Differences. The study's sampling approach develops conservative sample sizes that allow the detection of differences of 5 percentage points (two-tailed test) with 80% power under worst-case conditions, incorporating the FPC. Table B1.c shows two example scenarios comparing percentage estimates between two sample subgroups: 25% versus 20%, and 52.5% versus 47.5%. Under the worst scenario, percentages of 52.5% and 47.5%, sample sizes of 1,308 participants per group (2,616 in total) are needed to achieve 80% power. Assuming a 60% participation rate, 3,738 communities must be selected to accommodate a final completion count of 2,616 communities.


Table B1.c. Example Sample Sizes Needed to Detect 5% Differences Between Groups for 80% Power (Two-Tailed Test)

Group 1 Percentage | Group 2 Percentage | N per Group | Total Completes (Both Groups) | Selections (Municipalities) Required Assuming 60% Completion Rate
20.0% | 25.0% |   914 | 1,829 | 2,616
47.5% | 52.5% | 1,308 | 2,616 | 3,738


As noted earlier in this section, the proposed total sample of 4,417 selections for this study exceeds the 3,738 selections required and thus meets the precision requirements.
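The per-group sample sizes in Table B1.c can be approximated with the standard two-proportion formula, sketched below. This version omits the finite population correction that the study applies, so it yields a somewhat larger per-group n (about 1,566) than the table's FPC-adjusted 1,308.

```python
from math import ceil

# Two-tailed test of two proportions, alpha = 0.05, power = 0.80.
z_alpha, z_beta = 1.96, 0.8416       # z for alpha/2 = 0.025 and for 80% power
p1, p2 = 0.475, 0.525                # worst-case scenario around 50%

variance = p1 * (1 - p1) + p2 * (1 - p2)
n_per_group = (z_alpha + z_beta)**2 * variance / (p1 - p2)**2

print(ceil(n_per_group))             # ~1,566 per group before the FPC reduction
```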


B2. Procedures for the Collection of Information

This one-time national survey requires a study recruitment approach that can support the achievement of a 60% response rate. The targeted survey respondent is the city or town manager, planner, or a person with similar responsibilities for the sampled municipality.


Data Collection Instrument. The National Survey of Community-Based Policy and Environmental Supports for Healthy Eating and Active Living is a multi-mode instrument, administered as a Web questionnaire (Attachment 3a), a self-administered hardcopy questionnaire (Attachment 3b), or a telephone interview. It consists of 60 items divided into seven sections.

  • Section 1: Structure of Your Local Government asks if the local government contains different departments or offices (e.g., transportation, parks and recreation, housing).

  • Section 2: Communitywide Planning Efforts for Healthy Eating and Active Living asks questions about the planning documents local municipalities may have in place that support healthy eating and active living.

  • Section 3: The Built Environment and Policies That Support Physical Activity asks respondents to indicate what policies, standards, and practices they have in place to support aspects of the built environment that support physical activity.

  • Section 4: Zoning That Supports Healthy Eating and Active Living contains questions related to zoning or development codes that support healthy eating and active living.

  • Section 5: Public Transportation Policies that Support Healthy Eating and Active Living covers policies and procedures related to public transportation in relation to healthy eating and active living.

  • Section 6: Other Policies and Practices That Support Access to Healthy Food and Healthy Eating asks questions related to policies that affect access or support an increase in access to healthy food options within the community. This section also asks questions on healthy food access within convenience stores and access to farmers markets.

  • Section 7: Policies That Support Employee Breastfeeding covers policies that support employee breastfeeding.


Data Collection Procedures. Data collection will involve a series of mailings and nonresponse follow-up activities. All communications to the sampled municipalities – including the pre-notification letter (Attachment 4), invitation letter (Attachment 5a), and reminder letters/emails (Attachments 6a-6d) – will be personalized with the respondent's name, web survey link, and individual Personal Identification Number (PIN). The letters will also contain a project-specific email address and a toll-free number that respondents can use to contact project staff.


Pre-notification and Survey Invitation Letter Mailings. All selected municipalities will be sent an initial pre-notification letter on CDC letterhead, which announces the upcoming data collection and the importance of participation (Attachment 4).


Following the pre-notification, each sampled municipality will be sent a survey invitation letter via USPS (Attachment 5a). The letter will include the survey URL and a unique PIN for the sampled respondent to access the survey. The survey invitation letter will be sent on CDC letterhead and will include a list of frequently asked questions on the reverse side (Attachment 5b). A letter of support from a supporting association, such as the National Association of County and City Health Officials (NACCHO) or the American Planning Association (APA), will be included in the invitation mailing. The mailing envelope will carry NORC's return address so that undeliverable returns can be processed. NORC receipt control clerks will be trained on proper receipt control procedures and on recording case status within the case management system.


Reminder letters and emails. Personalized reminder letters or emails will be sent to all survey non-responders throughout the field period (Attachments 6a-6d). Each reminder, whether paper or email, will be personalized and contain the web URL and PIN needed to access the survey. Beginning in week 3 after the survey invitation is sent, reminders will alternate weekly: a reminder email in week 3, a reminder letter in week 4, a reminder email in week 5, and so on. Near the end of data collection (approximately week 14), the final reminder will be a last-chance letter to all non-respondents stressing the importance of the survey and alerting respondents that their opportunity for inclusion is ending.


Telephone follow-up. Starting in week 6 of data collection, telephone prompting will begin for remaining non-responders. NORC telephone interviewers will be trained on general interviewing techniques and project-specific protocols. Interviewers will call respondents to remind them to complete the survey and will also offer the option of completing the survey by telephone if respondents are able to respond immediately (Attachment 7).


Critical item survey. The critical item survey is a shortened version of the full survey, to be used only during the final weeks of data collection. It will collect data on 7 key survey questions from the remaining non-responders (Attachment 8).


B3. Methods to Maximize Response Rates and Deal with Nonresponse

This study anticipates a response rate of 60%, which is higher than the response rate achieved in the 2014 iteration of the study (approximately 45%). Web-based surveys targeting this population of respondents generally achieve response rates of 30%–40%; however, those studies made only one to three contacts with respondents over the fielding period.2 This data collection effort seeks a higher overall response rate of 60% based on the proposed recruitment strategies and nonresponse follow-up techniques, including the introduction of a critical items survey. As in 2014, we will contact respondents through a variety of modes (i.e., email, mail, telephone) to prompt and encourage response, and each contact will convey the importance of participation.


The national survey will use four methods to maximize the response rate and address nonresponse: (1) sample validation, (2) study recruitment using a study invitation letter, (3) a nonresponse follow-up strategy consisting of e-mail, mail, and telephone contacts, and (4) a critical item survey.


Sample Validation Activities. Sample validation activities will occur after the municipalities are selected for inclusion. NORC will verify the structure of the municipal government and identify one office for each selected municipality to serve as the primary respondent. NORC will also partner with the National Association of County and City Health Officials (NACCHO) and use the 2017 COG file and internet searches to identify/verify the name of the office holder.


These sample validation operations will result in an up-to-date sample, minimizing the likelihood of nonresponse due to outdated information. Additional validation will occur over the course of data collection as needed, such as when e-mails are undeliverable. The initial contact with the selected individual within the municipality will be a postal and email pre-notification letter to establish legitimacy and to alert him/her to the presence and purpose of the survey.


Study Recruitment Activities. To achieve the desired response rate of 60%, the study will use a survey invitation packet to recruit the city or town manager/planner or a person with similar responsibilities for the sampled municipality. The packet is designed to emphasize the importance of each key informant's participation as a senior staff member for his or her municipality while respecting the time commitment involved in completing the national survey. The invitation letter presents the survey as an important feedback mechanism for the sampled municipality, as survey data will be shared with the sampled communities. In this initial letter (and subsequent letters), municipalities will be informed that they will have access to a visual representation of their data after data collection is complete.


This letter also communicates the low burden of completing the survey. The study invitation letter will use CDC letterhead and explain the study's objectives, the instructions, and the information necessary for accessing the survey over the web. The invitation packet will provide a dedicated toll-free telephone line and an e-mail address that participants can use to request technical assistance or make inquiries about the survey or the survey process. Study staff will monitor both the toll-free line and the e-mail account daily to provide assistance as needed.


Nonresponse Follow-Up Activities. The total data collection window for the study is 16 weeks, with a variety of nonresponse follow-up activities and prompts starting in week 3 and concluding with a critical items survey and a final reminder letter/email prompt. Two weeks after the study packet is sent, study staff will send an e-mail reminder about the survey; this reminder will be reissued every two weeks to those who have not completed the survey (i.e., in weeks 3, 5, 7, 9, and so on). The data management system will update the status of completed surveys.


Starting in week 6 of data collection, trained telephone interviewers will begin calling non-respondents to prompt completion of the survey over the web or to offer the option of completing it by telephone. After the survey has been in the field for 8 weeks, a paper version of the survey with a specially tailored reminder letter will be mailed to respondents who have been unresponsive to follow-up calls and e-mails and have not started the survey.


Critical Item Survey. At week 12, a critical items survey will be sent to non-responding agencies. The critical items survey is a shortened version of the full survey, used to capture the most important survey items from municipalities that have not yet responded. The items included are a subset needed for analysis; fielding them near the end of data collection helps mitigate nonresponse bias.


Table B.3.a. Proposed Data Collection Nonresponse Follow-up Activities and Schedule

Data Collection Nonresponse Follow-Up Activity      | Schedule
Pre-notification letter mailed/emailed              | Week 1
Initial web invitation letter mailed                | Week 2
Email reminders sent to non-respondents             | Weeks 3, 5, 7, 9, 11, 13
Reminder letters sent to non-respondents            | Weeks 4, 6, 8, 10, 12
Telephone non-response follow-up                    | Weeks 6-7, 12
Paper version of the survey sent to non-respondents | Week 8
Critical items survey                               | Week 12
Final reminder letter/email                         | Week 14


Project staff will work to convert refusals in a sensitive manner that respects the voluntary nature of the study. Refusal conversion techniques will remind municipalities of the importance of their response and of the final report they will receive, and will troubleshoot any barriers to responding, without unduly pressuring individuals.


When telephone follow-up calls are made and the key informant is not available, data collection staff will leave a voicemail message indicating that the call concerned this specific study. Once a municipality has submitted its data, all reminders and prompts (i.e., emails, letters, telephone calls) will cease.


Monitoring and mitigating nonresponse. The survey instrument for the current data collection employs content similar to the survey from the initial data collection. However, the content has been simplified and reorganized to optimize the flow between sections. Additionally, the survey items have been grouped into sections that allow the sampled city planners to identify which sections they can readily answer and which may require assistance from a more knowledgeable individual in the department (e.g., a transportation director).


In addition to the more streamlined survey instrument and the nonresponse follow-up activities described above, a brief critical items survey will be sent to non-responders at week 12 to help mitigate nonresponse bias.


Nonresponse will also be mitigated by the use of an automated case management system that provides real-time data on whether a respondent has accessed, started, or submitted the survey. All undeliverable e-mails (bounce-backs) will be tracked and followed up to obtain updated information, and project staff will try to identify an alternative viable e-mail address to which the reminder can be sent. Similarly, all undeliverable invitation and reminder letters will be tracked and captured within the case management system. The system will monitor the rate at which the survey is being accessed and completed, and on the basis of these data, nonresponse follow-up telephone calls and personalized e-mail reminders will be tailored to the specific circumstances of the respective municipality. Once a municipality has completed the survey, it will be removed from further contact.


Estimation Procedure. Following the survey data collection, we will develop survey weights that account for unequal probabilities of selection and varying rates of nonresponse. Nonresponse adjustments will capitalize on known population totals available from the COG files for community groupings defined by size and geography. A sampling weight (W1) reflecting the probability of selection will be computed for all communities.


While every effort will be made to obtain a completed interview from the municipalities in our selected sample using the methods described above, the final response rate may fall below 80%. If it does, a nonresponse bias analysis will be conducted to determine the nonresponse adjustments needed to minimize potential bias, based on factors such as region, urban status, and community size. Responses to the critical items survey from harder-to-reach municipalities may serve as proxies for non-respondents; these responses will provide insight into how both response rates and the survey answers themselves differ across the factors examined.


While the categorical region and urban status variables can be used directly to define weighting classes, size categories will be created for this purpose. We plan to use two size categories based on the median size within each cell defined by region and urban status, yielding cells with roughly equal numbers of sample communities.


Within weighting classes informed by the nonresponse analysis, a nonresponse adjustment factor f_NR will be computed as the sum of the sampling weights W1 of all sampled communities in the cell divided by the sum of the weights of all participating communities in the cell:

f_{NR} = \frac{\sum_{i \in S_c} W_{1,i}}{\sum_{i \in R_c} W_{1,i}},

where S_c denotes all sampled communities and R_c the participating communities in weighting cell c.


The final adjusted weight (W2) can then be expressed in terms of the sampling weight W1:

W_{2,i} = W_{1,i} \times f_{NR}

Final adjusted survey weights for each community will be assigned so that national estimates can be computed with minimal bias.
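A minimal sketch of these weighting steps follows, assuming hypothetical column names: W1 is the base weight, cells are the weighting classes, and f_NR is computed and applied within each cell. Note that the adjustment preserves each cell's weighted total.

```python
import pandas as pd

# Toy sample: two weighting cells, base weights W1, and response indicators.
sample = pd.DataFrame({
    "cell":      ["A", "A", "A", "B", "B"],
    "W1":        [2.0, 2.0, 2.0, 3.0, 3.0],
    "responded": [True, True, False, True, False],
})

# f_NR per cell: sum of W1 over all sampled units / sum over respondents.
sampled_totals = sample.groupby("cell")["W1"].sum()
respondent_totals = sample[sample["responded"]].groupby("cell")["W1"].sum()
f_nr = sampled_totals / respondent_totals

# Final weight W2 = W1 * f_NR, carried by respondents only.
respondents = sample[sample["responded"]].copy()
respondents["W2"] = respondents["W1"] * respondents["cell"].map(f_nr)
print(respondents[["cell", "W1", "W2"]])
```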


B4. Tests of Procedures or Methods to be Undertaken

The data sampling and collection methods are similar to those used in the 2014 survey, and over 60% of the questions in this survey are the same as or similar to those used in 2014. Those questions were cognitively tested in 2014. For this subset, we reviewed questions that had a high percentage of "don't know" responses and modified them to improve comprehension. For the proposed national study, a set of cognitive interviews was conducted to confirm the comprehension and clarity of old and new items and to estimate the time burden associated with completing the survey. This pretest was conducted within the required OMB guidelines with a sample of 7 randomly selected respondents who are city managers and planners or equivalent. The cognitive interviews assessed how respondents interpreted items; evaluated the adequacy of response options, definitions, and other descriptions provided within the questionnaire; and assessed the appropriateness of specific terms or phrases. Empirical estimates of respondent burden were also obtained through the cognitive testing.


B5. Individuals Consulted on Statistical Aspects and Individuals Collecting and/or Analyzing Data

Within the agency, the following individuals will be responsible for receiving and approving contract deliverables and will have primary responsibility for the data collection and analysis:


Deborah Galuska, MPH, PhD, Technical Monitor
Associate Director of Science
Division of Nutrition, Physical Activity, and Obesity
National Center for Chronic Disease Prevention and Health Promotion
Centers for Disease Control and Prevention
4770 Buford Highway NE, MS/K-24
Atlanta GA, 30341-3717
770-488-6017

[email protected]


The representatives of the contractor responsible for conducting the planned data collection are:


Stephanie Poland, MS, MA

Project Director

NORC at the University of Chicago

55 East Monroe Street, 30th Floor

Chicago, IL 60603

312-759-4261

[email protected]


Christopher Johnson, MS

Statistician

NORC at the University of Chicago

1447 Peachtree Street NE

Atlanta, GA 30309

470-898-3856

[email protected]



REFERENCES

1. U.S. Census Bureau. Explanation of the 2010 Urban Area to Place Relationship File. Retrieved from https://www2.census.gov/geo/pdfs/maps-data/data/rel/explanation_ua_place_rel_10.pdf.


2. Hollander, M., Levin Martin, S., & Vehige, T. (2008). The surveys are in! The role of local government in supporting active community design. Journal of Public Health Management Practice, 14(3), 228–237.
