OMB Control No. 0920-1007

OMB SUPPORTING STATEMENT:


PART B

COLLECTIONS OF INFORMATION EMPLOYING STATISTICAL METHODS


National Survey of Community-Based Policy and Environmental Supports for Healthy Eating and Active Living





Submitted by:


Division of Nutrition, Physical Activity, and Obesity

National Center for Chronic Disease Prevention and Health Promotion


Centers for Disease Control and Prevention

Department of Health and Human Services





Technical Monitor: Deborah Galuska, MPH, PhD

Associate Director of Science
Division of Nutrition, Physical Activity, and Obesity

National Center for Chronic Disease Prevention and Health Promotion

4770 Buford Highway NE

MS K-24

Atlanta, GA 30341



December 4, 2013



Table of Contents

B.1. Respondent Universe and Sampling Methods
B.2. Procedures for Collection of Information
B.3. Methods to Maximize Response Rates and Address Nonresponse
B.4. Tests of Procedures and Methods to Be Used
B.5. Consultation on Statistical Aspects of the Study Design

Exhibits

Exhibit B.1. States With Geographic Overlap Between Municipalities and Townships (or Villages)
Exhibit B.2. Sample Sizes Needed to Detect 5% Differences Between Groups for 80% Power (Two-Tailed Test)
Exhibit B.3. Definition of Census Regions in the United States
Exhibit B.4. Data Collection Nonresponse Follow-Up Activities and Schedule

Appendices

Appendix A. Authorizing Legislation

Appendix B1. Federal Register Notice

Appendix B2. Summary of Public Comments

Appendix C1. National Survey of Community-Based Policy and Environmental Supports for Healthy Eating and Active Living

Appendix C2. Screen Shots of Survey Questionnaire

Appendix D. Study Invitation Materials

D1. Study Invitation Letter

D2. Study Question and Answers

D3. Instructions for Accessing the Web Survey

D4. Telephone Nonresponse Follow-Up Contact Script

D5. E-mail Nonresponse Follow-Up

Appendix E. Example Tables for Analysis

Appendix F. References

B.1. Respondent Universe and Sampling Methods

B.1.a. Universe and Sampling Frame

The respondent universe for this data collection consists of all municipalities within the 50 United States, because the study is designed to be nationally representative. Municipal governments, as defined by the U.S. Census Bureau, are the primary sampling unit (PSU) for the data collection.1 The sampling frame will be constructed from the most recent U.S. Census of Governments (COG), which provides a listing of municipalities and townships for each State. COG data for the name, type (e.g., city, town, village), and total population of each municipality or township will be used to construct the sampling frame. The following COG variables will be used, organized into two groups by their purpose: (1) sample design and (2) recruitment:

  1. Variables useful for the design:

  • Community name

  • Population

  • County name

  • Place as indicated by Federal Information Processing Standards (FIPS)

  • Political description (e.g., municipality)

2. Variables useful for recruitment to the study:

  • Community name

  • Title of political leader (e.g., city or town manager or planner or a person with similar responsibilities for the sampled municipality)

  • Physical address of government office

  • Web address of government site (if applicable—not available for all communities)

The development of the sample frame requires a tailored decision process to ensure that there is no geographic overlap between municipalities and towns or townships. This process prevents double-counting populations that fall under the jurisdiction of both forms of government, which would inflate numerators in the population estimates the study will develop.2 To eliminate the potential for geographic overlap in the sample frame, the COG data will be edited as follows: any geographic areas within States that are under the jurisdictional authority of both a municipal and a township government will be identified, and in those instances the COG data will be edited to exclude the township. For the purposes of this study, the sample will focus on municipalities as the geographic level where relevant decisions are made. Using municipalities as the PSU (the finest-grained unit) makes it possible to provide a representative sample of policies passed at the municipal level.

Thirty States have only communities identified as municipalities. In the other 20 States, located primarily in the Northeast and Midwest, the term “town or township governments” is also applied to organized governments, and the COG file must be reviewed to confirm that no geographic overlap exists between the two types of local governing bodies (i.e., municipalities and towns or townships). Of these States, 10 present no challenges from overlapping units;3 in those 10 States, both municipalities and townships can be used as sampling units without any double counting after careful review of the COG file data. The 10 remaining States, listed in Exhibit B.1, present some overlap between the two types of units. In some of these States, the term “villages” is used for the second type of unit.

Exhibit B.1. States With Geographic Overlap Between Municipalities and Townships (or Villages)

Connecticut

Illinois

Indiana

Kansas

Michigan

Minnesota

Missouri

Nebraska

New York

Vermont

The 10 States in Exhibit B.1 may be further divided into two subgroups according to the type of unit they include other than municipalities. The first subgroup includes townships; the second includes villages (or “third-class” municipalities). For the six States listed below, the overlap will be corrected by deleting all townships from the frame:

  • Connecticut

  • Illinois

  • Indiana

  • Minnesota

  • Missouri

  • Nebraska

In the remaining four States, listed below, the overlap will be corrected by removing villages or, in Kansas, municipalities classified as third class:

  • Kansas: remove third-class municipalities from the municipal government listing

  • Michigan: remove villages from the municipal government listing

  • New York: remove villages from the municipal government listing

  • Vermont: remove villages from the municipal government listing

Threshold for Inclusion in the Sample Frame. The project's previous pilot study (Pilot Study of Community-Based Surveillance and Supports for Healthy Eating/Active Living, OMB No. 0920-0934, exp. 5-31-2013), which did not impose a population size threshold for inclusion in the sample frame, investigated the tradeoff involved in adopting a threshold below which communities are considered too small to be included. The pilot findings confirmed that municipalities with populations under 1,000 were unable to respond to the study because they were too small to have the policies the survey asks about. The pilot study also confirmed that completing the survey would place undue burden on the city managers or planners of these small communities, who are typically part-time staff. While the use of a threshold leads to some coverage loss, it also yields gains in efficiency, in response rates (overall and item-level), and in reaching the subset of communities that can meaningfully provide the kind of data envisioned by the study.
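To make these frame edits concrete, a minimal sketch in Python follows. It assumes the COG listing has been loaded into a pandas DataFrame; the column names (state, poli_desc, population) and the two-letter State codes are illustrative assumptions, not actual COG field names.

```python
import pandas as pd

# States whose townships are dropped entirely to remove geographic overlap
DROP_TOWNSHIPS = {"CT", "IL", "IN", "MN", "MO", "NE"}
# States whose villages (or, in Kansas, third-class municipalities) are dropped
DROP_VILLAGES = {"KS", "MI", "NY", "VT"}

def build_frame(cog: pd.DataFrame) -> pd.DataFrame:
    """Apply the overlap edits and the population threshold to the COG listing."""
    frame = cog.copy()

    # Drop townships in the six States where they overlap municipalities
    is_township = frame["poli_desc"].str.lower().eq("township")
    frame = frame[~(frame["state"].isin(DROP_TOWNSHIPS) & is_township)]

    # Drop villages (third-class municipalities in Kansas) in the other four
    is_village = frame["poli_desc"].str.lower().isin(["village", "third-class city"])
    frame = frame[~(frame["state"].isin(DROP_VILLAGES) & is_village)]

    # Apply the population threshold supported by the pilot study findings
    return frame[frame["population"] >= 1000]
```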

B.1.b. Sample Sizes and Estimation Precision

The sampling strategy for the study is designed to yield 95% confidence intervals within ±3 percentage points for all study estimates, taking into account an anticipated design effect of 1.1 and a 70% response rate. It is also designed to detect a 5-percentage-point difference in prevalence estimates between two subgroups of approximately equal sample sizes with an alpha of 0.05. Example subgroups of interest are urban versus nonurban (rural) communities and small versus large communities.

Determining the Sample Size. The sample size for the national study, 4,484 communities, was developed to ensure precise estimation and sufficient statistical power for comparisons across subgroups. The sample size was determined by two criteria: (1) ensuring 95% confidence intervals within ±3 percentage points for all study estimates and (2) ensuring detection of differences between subgroup percentages as small as 5 percentage points (alpha = 0.05) with 80% power.

Estimation Precision. Sample sizes were developed to achieve 95% confidence intervals of ±3 percentage points or less for estimated proportions or percentages. To ensure precision for percentages of all magnitudes, we assumed the worst-case prevalence (50%, where the variances and standard errors reach their maximum). Our calculations took into account the design effect and the finite population correction (fpc). For the single-stage sampling design proposed for the study, design effects (DEFFs) very near 1.0 can be expected. The DEFF is defined as the variance under the actual sampling design divided by the variance of a simple random sample (SRS) of the same size; DEFFs near 1.0 indicate efficient designs whose variance is comparable to that of an SRS design. The sampling approach assumes a DEFF of 1.1 to account for minor effects of unequal weighting (e.g., due to nonresponse adjustments). The COG lists about 36,000 units (municipalities and townships); after the exclusion of very small communities, this will be reduced to about 20,000 units. To achieve a 95% confidence interval of ±3 percentage points or less for estimated proportions, a minimum of 1,109 participating communities is necessary, corresponding to 1,584 selections when inflated for the anticipated response rate of 70%. This calculation takes the finite population correction into account.4
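As a check on these figures, a minimal sketch follows, assuming z = 1.96, worst-case p = 0.5, a ±3-point margin, DEFF = 1.1, a frame of roughly N = 20,000 units, and a 70% response rate:

```python
import math

z, p, margin, deff = 1.96, 0.5, 0.03, 1.1
N, response_rate = 20_000, 0.70

n0 = deff * z**2 * p * (1 - p) / margin**2   # DEFF-inflated SRS sample size
n_fpc = n0 / (1 + (n0 - 1) / N)              # finite population correction

print(math.ceil(n_fpc))                      # 1109 completed surveys needed
print(math.ceil(n_fpc / response_rate))      # 1584 selections with the fpc
print(math.ceil(n0 / response_rate))         # 1677 selections without the fpc (footnote 4)
```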

Detecting Small Differences. The study's sampling approach develops conservative sample sizes that allow the detection of 5-percentage-point differences (two-tailed test) with 80% power under worst-case conditions. Exhibit B.2 shows two example scenarios comparing subgroup percentage estimates of 25% versus 20% and of 52.5% versus 47.5%. Under the worst-case scenario (52.5% versus 47.5%), 1,569 participating communities per group, or 3,139 in total, are needed to achieve 80% power. Assuming a 70% participation rate, 4,484 communities must be selected to yield 3,139 completed surveys.





Exhibit B.2. Sample Sizes Needed to Detect 5% Differences Between Groups for 80% Power (Two-Tailed Test)

Group 1 Percentage | Group 2 Percentage | N per Group | Total Completes (Both Groups) | Total Selections (Municipalities) for a 70% Completion Rate
20.0% | 25.0% | 1,094 | 2,188 | 3,126
47.5% | 52.5% | 1,569 | 3,139 | 4,484
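The power calculation behind Exhibit B.2 can be sketched as follows. The pooled-variance formula for comparing two independent proportions is our assumption; it reproduces the tabled per-group sizes, with the totals agreeing up to one-unit rounding differences.

```python
import math

Z_ALPHA, Z_BETA = 1.96, 0.8416               # 95% two-tailed test; 80% power

def n_per_group(p1: float, p2: float) -> int:
    pbar = (p1 + p2) / 2                      # pooled proportion
    return int((Z_ALPHA + Z_BETA) ** 2 * 2 * pbar * (1 - pbar) / (p1 - p2) ** 2)

for p1, p2 in [(0.20, 0.25), (0.475, 0.525)]:
    n = n_per_group(p1, p2)                   # 1,094 and 1,569 per group
    completes = 2 * n                         # 2,188 and 3,138 total completes
    selections = math.ceil(completes / 0.70)  # 3,126 and 4,483 selections
    print(n, completes, selections)
```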



B.1.c. Statistical Methodology for Stratification and Sample Selection

The sample for the national survey will be stratified along several relevant dimensions, including region, population size, and urban status. The sampling design will use a mix of explicit and implicit stratification. Explicit stratification by region will ensure a sufficient number of sample communities in each region, and additional explicit stratification by urban status will ensure a sufficient number of urban and nonurban (rural) communities in the sample. Implicit stratification by size will ensure that the sample is distributed across varying community size categories. Implicit stratification, which allows continuous variables such as size to be used in stratification, will be implemented by sorting the frame by the size variable within each explicit stratum.

The sampling frame will be explicitly stratified by Census Region and by urban status. The urban-versus-rural dichotomy will follow the Census Bureau classification. Exhibit B.3 lists the States in each Census Region.




Exhibit B.3. Definition of Census Regions in the United States

1. Northeast: Connecticut, Maine, Massachusetts, New Hampshire, Rhode Island, Vermont, New Jersey, New York, Pennsylvania

2. Midwest: Indiana, Illinois, Michigan, Ohio, Wisconsin, Iowa, Kansas, Minnesota, Missouri, Nebraska, North Dakota, South Dakota

3. South: Delaware, District of Columbia, Florida, Georgia, Maryland, North Carolina, South Carolina, Virginia, West Virginia, Alabama, Kentucky, Mississippi, Tennessee, Arkansas, Louisiana, Oklahoma, Texas

4. West: Arizona, Colorado, Idaho, New Mexico, Montana, Utah, Nevada, Wyoming, Alaska, California, Hawaii, Oregon, Washington

To quantify urban status, we will link Census Summary File 1 (SF-1) data for each Census Place. Both the SF-1 file and the COG file provide a Place FIPS code that can be used to link the two data sets. We will use a direct classification of urban and rural areas consistent with the 2010 Census Urban and Rural Area Classification.5 The Census Bureau's urban areas represent densely developed territory and encompass residential, commercial, and other nonresidential urban land uses. For the 2010 Census, an urban area comprises a densely settled core of census tracts and/or census blocks that meet minimum population density requirements, together with adjacent territory. The Census Bureau identifies two types of urban areas:

  • Urbanized Areas (UAs) of 50,000 or more people

  • Urban Clusters (UCs) of at least 2,500 and less than 50,000 people

Rural areas encompass all population, housing, and territory not included within an urban area. Within each stratum defined by region and urban status, the frame will be sorted by size. Population data, available from the COG files, will be used as the community size variable.
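A brief sketch of this linkage, assuming illustrative column names (place_fips, ua_type) rather than the actual file layouts:

```python
import pandas as pd

def classify_urban(frame: pd.DataFrame, sf1: pd.DataFrame) -> pd.DataFrame:
    """Flag each frame community as urban (UA or UC) or rural via Place FIPS."""
    merged = frame.merge(sf1[["place_fips", "ua_type"]], on="place_fips", how="left")
    # Urbanized Areas (UA) and Urban Clusters (UC) are urban; all else is rural
    merged["urban"] = merged["ua_type"].isin(["UA", "UC"])
    return merged
```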

The sample will be allocated proportionally to the explicit strata defined by region and urban status. Proportional allocation maximizes the precision of overall survey estimates by producing a nearly self-weighting sample (i.e., approximately equal probabilities for sample communities). Implicit stratification by size also helps to ensure proportional representation of communities of varying sizes. Within each explicit stratum, sample communities will be selected with equal probability via systematic random sampling from the sorted frame, as sketched below.
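A minimal sketch of the selection step, assuming the frame carries the region and urban columns from the linkage above. The proportional allocation below uses simple rounding; a production design would reconcile rounding so that stratum sizes sum exactly to the target.

```python
import numpy as np
import pandas as pd

def systematic_sample(stratum: pd.DataFrame, n: int, rng: np.random.Generator) -> pd.DataFrame:
    """Equal-probability systematic sample from a size-sorted stratum."""
    stratum = stratum.sort_values("population")    # implicit stratification by size
    interval = len(stratum) / n                    # fractional sampling interval
    start = rng.uniform(0, interval)               # random start
    picks = (start + interval * np.arange(n)).astype(int)
    return stratum.iloc[picks]

def select_sample(frame: pd.DataFrame, total_n: int, seed: int = 1) -> pd.DataFrame:
    """Proportional allocation to region-by-urban strata, then systematic selection."""
    rng = np.random.default_rng(seed)
    samples = []
    for _, stratum in frame.groupby(["region", "urban"]):
        n_h = round(total_n * len(stratum) / len(frame))   # proportional allocation
        if n_h > 0:
            samples.append(systematic_sample(stratum, n_h, rng))
    return pd.concat(samples)
```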

Computing the Sample Weights. Following the survey data collection, we will develop survey weights that account for unequal probabilities of selection and for varying rates of nonresponse. Nonresponse adjustments will capitalize on known population totals, available from the COG files, for community groupings defined by size and geography. A sampling weight reflecting the probability of selection (W1) will be computed for all communities. A nonresponse analysis will determine the adjustments needed to minimize the potential for bias. Nonresponse adjustment classes will be based on region, community size, and urban status. While the categorical region and urban status variables can be used directly to define weighting classes, size categories must be created for this purpose; we plan to use two size categories, split at the median size within each region-by-urban-status cell, so that the weighting classes contain roughly equal numbers of sample communities.

The weight will be adjusted for nonresponse by multiplying the sampling weight, W1, by the inverse of the proportion of sample municipalities that respond to the data collection. The nonresponse adjustment factor (fNR) is computed as the sum of the weights over all selected communities divided by the sum of the weights over participating communities only. The final adjusted weight (W2) can then be expressed in terms of the sampling weight W1:

W2 = W1 × fNR, where fNR = (Σ W1 over all selected communities) / (Σ W1 over participating communities)

When the sampling weight is constant, the adjustment factor simplifies to the ratio of selections to respondents, that is, the reciprocal of the response rate within a stratum. Final adjusted survey weights for each community will be assigned so that national estimates can be computed with minimal bias.
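These weighting steps can be sketched as follows, assuming one record per selected community with illustrative columns (prob_selection, responded, region, urban, population):

```python
import pandas as pd

def add_weights(sample: pd.DataFrame) -> pd.DataFrame:
    """Compute the base weight W1 and the nonresponse-adjusted weight W2."""
    out = sample.copy()
    out["W1"] = 1.0 / out["prob_selection"]        # base sampling weight

    # Two size categories split at the median within each region x urban cell
    out["size_cat"] = out.groupby(["region", "urban"])["population"].transform(
        lambda s: (s > s.median()).astype(int)
    )

    # f_NR per weighting class: sum of W1 over all selections divided by the
    # sum of W1 over responding communities only; W2 = W1 * f_NR
    keys = [out["region"], out["urban"], out["size_cat"]]
    total_w = out["W1"].groupby(keys).transform("sum")
    resp_w = out["W1"].where(out["responded"], 0.0).groupby(keys).transform("sum")
    out["W2"] = out["W1"] * total_w / resp_w

    return out[out["responded"]]                   # respondents carry the final weight
```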



B.2. Procedures for Collection of Information

The national survey requires a study recruitment approach that can support a 70% response rate for a Web-based survey. The literature on Web-based data collection indicates that this is a challenging response rate to achieve. However, the recently concluded pilot study suggests that this rate is feasible with intensive outreach to confirm participation after a study invitation is sent, combined with nonresponse follow-up via telephone and e-mail.

The targeted survey respondent is the city or town manager or planner, or a person with similar responsibilities for the sampled municipality. After sampling is completed, the sample will be validated to confirm the name and address of the city or town manager or planner, who will then receive the study recruitment invitation packet (Appendices D1–D3) on behalf of the sampled municipality. All municipalities will receive hard-copy correspondence that includes an invitation letter from the Centers for Disease Control and Prevention (CDC) describing the purposes of the study, instructions for accessing the Web-based questionnaire, an informed consent document, and additional study background materials.

Data Collection Instrument. The National Survey of Community-Based Policy and Environmental Supports for Healthy Eating and Active Living is a self-administered, Web-based questionnaire (Appendix C1; screen shots in Appendix C2) that consists of 38 items.

Data Collection Procedures. To initiate the data collection, each sampled municipality will be contacted to validate the sample and to confirm the name and contact information for a key informant—usually the city manager or planner—to whom all correspondence will be directed.

The city or town manager or planner, or a person with similar responsibilities for the sampled municipality, is designated as the primary respondent for the survey because he or she typically has the broadest knowledge of a municipality's policies. The previous pilot study and an expert panel affirmed that the city manager or planner is the most appropriate point of contact. After sample validation is complete, the sampled municipalities will receive via Federal Express an invitation packet containing the following items:

  • An invitational letter from CDC (Appendix D1) that explains the study

  • A project fact sheet in question-and-answer format (Appendix D2) covering how municipalities can participate, the expected burden, and the timeline for participation

  • Instructions on how to access and complete the questionnaire (Appendix D3)

The invitation packet will serve to recruit the sampled community to participate in the study and will provide the information needed to access the Web-based survey system. Receipt of Federal Express packets will be tracked online at www.FedEx.com. Project staff will place telephone calls to municipalities where packets cannot be delivered to obtain updated mailing information, and the packets will be re-sent.

After the invitation packets are sent, designated project staff (the study recruiters) will make follow-up telephone calls to confirm receipt of the packet and carefully review the intent of the study to secure an initial commitment to complete the survey. The role of the recruiters is to identify and overcome any barriers to participation through this immediate follow-up call. The recruiters will contact sampled municipalities within 1–3 days after package tracking shows that the invitation packet has arrived, confirming receipt and reviewing the packet's contents. This conversation will also confirm that respondents can use their unique access code to log into the survey Web site.

Respondents will complete the self-administered survey through the Web-based data collection system. Each sampled municipality will be assigned a unique identifier, or token, which provides the key informant with security-enabled access to the system where the questionnaire can be completed and submitted. Alternatively, the respondent can elect to complete the questionnaire on paper and return it by mail to the study headquarters. Instructions for the paper option are provided in the invitation letter from CDC: respondents can print the specially formatted paper questionnaire from the survey's Web link, complete it, and return it by mail. Few municipalities are expected to choose this option; in the pilot study, fewer than 1% of sampled municipalities did so.

The data collection window for the study is 12 weeks, with an additional 4 weeks allotted for nonresponse follow-up activities. Formal e-mail reminders and scheduled telephone follow-up will occur at 2-week intervals, and study recruiters will conduct additional telephone follow-up on a case-by-case basis for the duration of the study. If a survey has not been started after 2 weeks, the study recruiters will place telephone reminder calls to the sampled communities. Key informants will submit completed surveys via the Web-based system or by mailing the printed survey to the contractor's headquarters. Data from paper questionnaires will be entered directly into the Web-based data collection system by the contractor's trained field staff.

Experienced survey field managers and study recruiters will be trained on the Web-based system and survey questionnaire and will also receive a 1-day refresher training in refusal-conversion techniques. All study staff will be required to comply with the data security protocols established for the national survey.


B.3. Methods to Maximize Response Rates and Address Nonresponse

B.3.a. Expected Response Rates

This study anticipates a response rate of 70%. Web-based surveys targeting this population of respondents generally achieve response rates of 30%–40%; however, those studies made only one to three contacts with respondents over the fielding period.6 This data collection seeks a higher overall response rate of 70% through more intensive study recruitment strategies and nonresponse follow-up techniques. The 70% assumption is based on a comparison with similar methodological studies, such as the 2010 New York Physical Activity and Nutrition Study and the most recent cycle of the national School Health Policies and Programs Survey (OMB No. 0920-0445, exp. 8/31/2016), which achieved response rates closer to 75%. To reach the proposed 70% response rate, the national survey data collection protocol includes sample validation activities, study recruitment activities (Section B.2), and a more intensive approach to nonresponse follow-up (Section B.3.b). The pilot study, which used similar strategies and procedures, achieved a 68% response rate over a 9-week data collection period. The anticipated response rate in the present study can reasonably be expected to exceed the pilot's for two reasons: (1) the data collection period is longer, and (2) communities with fewer than 1,000 persons are excluded from the study.

B.3.b. Methods for Maximizing Responses and Handling Nonresponse

The national survey will use three methods for maximizing the response rate and addressing nonresponse: (1) sample validation, (2) study recruitment using a study invitation letter, and (3) an intense nonresponse follow-up strategy consisting of telephone, e-mail, and mail contacts.

Sample Validation Activities. Sample validation activities will occur before the study invitation materials are sent. Sample validation will confirm the contact name, title, work address, work e-mail, and work telephone number for each key informant, that is, the city or town manager or planner or a person with similar responsibilities for the sampled municipality. These operations will result in an up-to-date sample, minimizing the likelihood of nonresponse due to outdated information. Additional validation will occur as needed over the course of the data collection, such as when e-mails are undeliverable.

Study Recruitment Activities. To achieve the desired 70% response rate, the study will use a survey invitation packet to recruit the city or town manager or planner, or a person with similar responsibilities for the sampled municipality. The packet was designed to emphasize the importance of each key informant's participation as a senior staff member for his or her municipality while respecting the time commitment required to complete the national survey. The invitation letter presents the survey as an important feedback mechanism for the sampled municipality, as survey data will be shared with the sampled communities, and communicates the low burden of completing the survey. The packet will consist of a formal invitation letter from CDC that explains the study's objectives, instructions, and the token for accessing the survey, along with a letter of support from a State public health department. The packet will also provide a dedicated toll-free telephone line and e-mail address that participants can use to request technical assistance or make inquiries about the survey or the survey process. Study staff will monitor the toll-free line and the e-mail account daily to provide assistance as needed.

Nonresponse Follow-Up Activities. The total data collection window for the study is 16 weeks: 12 weeks of data collection, with an additional 4 weeks for more intense nonresponse follow-up activities. Exactly 2 weeks after the study packet is sent, study staff will e-mail a reminder about the survey to the entire study sample. This e-mail reminder will be reissued to those who have not completed the survey every 2 weeks, in weeks 2, 4, 6, 8, 10, and 12 of the initial data collection period. The data management system will update the status of completed surveys. The study's recruiters will review the data management system and will place telephone calls every 2 weeks to municipalities that have not completed and submitted their surveys; this calling activity will occur in weeks 3, 5, 9, and 11 of the initial data collection period. After the survey has been in the field for 6 weeks, a paper version of the survey with a specially tailored reminder letter will be mailed to respondents who have not responded to follow-up calls or e-mails and have not started the survey.

Exhibit B.4. Data Collection Nonresponse Follow-Up Activities and Schedule

Activity | Schedule
Invitation packets sent | Week 1
Initial study recruitment follow-up completed | Week 1
Telephone reminder call to nonrespondents, based on case status | Weeks 3, 5, 9, 11
E-mail reminder sent to nonrespondents | Weeks 2, 4, 6, 8, 10, 12
Paper version of survey mailed to nonrespondents with reminder letter | Weeks 6, 10
E-mail reminder to nonrespondents | Weeks 13, 14, 15, 16
Nonresponse follow-up telephone call, based on case status | Weeks 13, 14, 15, 16

If nonresponse continues at week 10, a second paper survey mailing will occur. Over the last 4 weeks of the data collection window, weekly e-mail reminders and weekly personalized telephone follow-up calls will occur. Project staff will work to convert refusals in a sensitive manner that respects the voluntary nature of the study: the refusal conversion techniques will remind municipalities of the importance of their response and troubleshoot any barriers to responding, but they will not unduly pressure individuals. When a telephone follow-up call is made and the key informant is not available, data collection staff will leave a voicemail message indicating that the call concerns this specific research study. Once a municipality has submitted its data, e-mail and telephone reminders to that respondent will cease.

Methods to address nonresponse also include procedures for tracking and delivering survey reminders. All undeliverable e-mails, or bounce-backs, will be tracked and followed up to obtain updated information. Project staff will attempt to identify another viable e-mail address to which the reminder can be sent and, if needed, place a follow-up telephone call to confirm the new address.

Nonresponse will also be mitigated by an automated case management system that provides real-time data on whether a respondent has accessed, started, or submitted the survey. This data management system will monitor the rate at which the survey is being accessed and completed. On the basis of these data, nonresponse follow-up telephone calls and personalized e-mail reminders will be tailored to the specific circumstances of each municipality. The system will also send a confirmatory e-mail to respondents who have completed the survey; these respondents will then be removed from further contact.

B.4. Tests of Procedures and Methods to Be Used

Over a period of 2½ years, from 2010 to early 2013, CDC conducted the Pilot Study of Community-Based Surveillance and Supports for Healthy Eating/Active Living (OMB No. 0920-0934, exp. 5-31-2013), which was designed to address methodological issues affecting the design and implementation of the national survey. The pilot study was conducted in two States with a sample of 400 communities. Its findings clarified the need for a population threshold for the sample, confirmed the city manager or planner as the correct point of contact for sampled communities, and confirmed that the majority of questionnaire items could be answered by the city manager or planner. The results also highlighted questionnaire items requiring further revision to provide meaningful data. Before the pilot was fielded, the instrument was cognitively tested, within OMB guidelines, with nine randomly selected city managers and planners. The pretest sample was diversified by municipality size using the sampling strategy and simulated the conditions of a Web survey.

For the proposed national study, another set of cognitive interviews was conducted to confirm the additional questionnaire revisions made in response to the pilot test findings. This pretest was conducted within the required OMB guidelines with a sample of nine randomly selected respondents who are city managers, planners, or persons with equivalent responsibilities. The cognitive interviews assessed how respondents interpreted items; evaluated the adequacy of response options, definitions, and other descriptions provided within the questionnaire; and assessed the appropriateness of specific terms and phrases. Empirical estimates of respondent burden were also obtained through the cognitive testing.

B.5. Consultation on Statistical Aspects of the Study Design

Statistical aspects of the study have been reviewed by:

Ronaldo Iachan, PhD

Senior Sampling Statistician

ICF International

11785 Beltsville Drive, Suite 300

Calverton, MD 20705

301-572-0538

Within the agency, the following individuals will be responsible for receiving and approving contract deliverables and will have primary responsibility for the data collection and analysis:

Deborah Galuska, MPH, PhD, Technical Monitor
Associate Director of Science
Division of Nutrition, Physical Activity, and Obesity
National Center for Chronic Disease Prevention and Health Promotion
Centers for Disease Control and Prevention
4770 Buford Highway NE, MS/K-24
Atlanta GA, 30341-3717
770-488-6017

[email protected]

The representatives of the contractor responsible for conducting the planned data collection are:

Erika Gordon, PhD

Project Director

ICF International

11785 Beltsville Drive, Suite 300

Calverton, MD 20705

301-572-0881

[email protected]








Alice M. Roberts, MS

Technical Specialist

ICF International

11785 Beltsville Drive, Suite 300

Calverton, MD 20705

301-572-0290

[email protected]


Renee Ray, MA

Technical Specialist and Content Expert

ICF International

3 Corporate Square NE, Suite 370
Atlanta, GA 30329

404-592-2241

[email protected]

1 As defined by U.S. Census Bureau statistics on governments, the term “municipal governments” refers to political subdivisions within which a municipal corporation has been established to provide general local government for a specific population concentration in a defined area; it includes all active governmental units officially designated as cities, boroughs (except in Alaska), towns (except in the six New England States and in Minnesota, New York, and Wisconsin), and villages.

2 “Municipal” and “township” governments are distinguished primarily by the historical circumstances surrounding their incorporation. In many States, most notably in the Northeast, municipal and township governments have similar powers and perform similar functions; the difference between a municipality and a township within a particular State is merely a semantic one. However, the scope of governmental services provided by these two types of governments varies widely from one State to another, as does the amount of geographic overlap between these two units, thus requiring a tailored decision on inclusion or exclusion of townships in the sampling frame on a state-by-state basis.

3 These States are Maine, Massachusetts, New Hampshire, New Jersey, North Dakota, Ohio, Pennsylvania, Rhode Island, South Dakota, and Wisconsin.

4 If the finite population correction had not been taken into account, the initial sample would have been 1,677 selections.

5 See reference at http://www.census.gov/geo/reference/ua/urban-rural-2010.html.

6 Hollander, M., Levin Martin, S., & Vehige, T. (2008). The surveys are in! The role of local government in supporting active community design. Journal of Public Health Management and Practice, 14(3), 228–237.


