OMB: 0920-0993


Supporting Statement B



Public Health Systems, Mental Health and Community Recovery Project





New






Centers for Disease Control and Prevention

Office of Public Health Preparedness and Response

Division of State and Local Readiness

Applied Science and Evaluation Branch

Asha Z. Ivey-Stephenson, MA, PhD

(404) 639-7581

[email protected]

August 15, 2013
















Table of Contents


Part B. Collections of Information Employing Statistical Methods



B1. Respondent Universe and Sampling Method

B2. Procedures for the Collection of Information

B3. Methods to Maximize Response Rates and Deal with Nonresponse

B4. Tests of Procedures or Methods to Be Undertaken

B5. Individuals Consulted on Statistical Aspects and Individuals Collecting and/or Analyzing Data




List of Attachments




Attachment A

Section 301 of the Public Health Service Act (42 U.S.C. 241)

Attachment B

60 Day FRN

Attachment C

Key Informant Interview: Introductory Letter

Attachment D

Household Survey for General Public and Consent

Attachment E

Key Informant Interview Guide_ PH/MH Agency Staff

Attachment F

Key Informant Interview Guide_ Community Organization Respondents

Attachment G

Key Informant Interview Guide_ Consent Form

Attachment H

Household Survey for General Public_ Study Screener

Attachment I

ICF IRB Approval Form

Attachment J

Household Survey for General Public_ Letter for Addresses with Matched Phone Numbers

Attachment K


Household Survey for General Public_ Letter for Addresses without Matched Phone Numbers

Attachment L

Household Survey for General Public_ Postcard to Indicate Interest








The proposed qualitative study component will not use statistical methods for selection of study respondents or for analysis. For more information about the sampling plan and data collection methodology for the qualitative aspect of this study, please refer to Part A: Justification.


B1. Respondent Universe and Sampling Method


Quantitative data will be gathered through a household survey conducted by telephone in the four disaster-exposed regions, supported by extant data on health outcomes, service use, demographics, and economic conditions available for these regions. Within each region, all households within a set number of miles from the Super Outbreak of 2011 tornado track will make up the household sampling frame (Table 1).


Table 1. Regions to be Surveyed

Region Number | County FIPS1 | County Names | Occupied Housing Units | Approximate Width Around Track | Estimated Frame Households
1 | 01073 | Jefferson, AL | 236,568 | 10 miles | 93,353
2 | 01125 | Tuscaloosa, AL | 76,141 | 10 miles | 49,918
3 | 01093, 01059 | Marion, AL and Franklin, AL | 24,937 | 15.5 miles | 13,000
4 | 28095, 28017 | Monroe, MS and Chickasaw, MS | 22,514 | 16 miles | 13,000

1 Federal Information Processing Standard (FIPS)



Selection of households will be achieved through an Address-Based Sample (ABS) design. The sample of households will be selected from a list of addresses that has been verified against and updated from the U.S. Postal Service’s (USPS) computerized delivery sequence file. The sampling frame will include all residential delivery points within the storm-affected areas within each of the four regions, including post office boxes and rural routes.


An ABS design will provide coverage of landline and wireless-only households to ensure a representative sample of the household population, with the geographic specificity required to reach residents within a defined radius of the tornados. A probability sample of respondents will be selected using random sampling; every person with a listed address and a telephone (either landline or cellular phone) has a known probability of being selected for the study.

Because of the data collection methodology, the approximately 2.2 % of households that have no telephone will be excluded from data collection, as will the roughly 2 % of residents not covered by the address sampling frame.

In each of the four regions, we intend to achieve 860 completed interviews, for a total of 3,440 interviews. This sample size is based on detecting statistically significant differences in perceived health between any two regions. It will provide 80 % power to detect a minimum difference of 8 percentage points in the proportion of adults who report 7 or more unhealthy days in the past month (one potential outcome that could be estimated using an existing Behavioral Risk Factor Surveillance System [BRFSS] question), for which the baseline prevalence is estimated to be 20 %. The calculation assumes testing at the 5 % significance level and a design effect of 2.0. We expect to achieve adjusted response rates of 50 % or better, on the basis of our experience conducting similar surveys.
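
As a rough check on these figures, the sketch below applies a standard two-proportion sample size formula to the stated inputs (20 % baseline prevalence, an 8 percentage point detectable difference, 80 % power, a two-sided 5 % significance level, and a design effect of 2.0). The formula variant shown is an assumption; the design team's exact calculation may differ slightly.

# Rough per-region sample size check (assumed formula; not necessarily the
# exact method used by the design team).
from statistics import NormalDist

p1, p2 = 0.20, 0.28                      # baseline prevalence and prevalence 8 points higher
alpha, power, deff = 0.05, 0.80, 2.0

z_alpha = NormalDist().inv_cdf(1 - alpha / 2)   # ~1.96
z_beta = NormalDist().inv_cdf(power)            # ~0.84

p_bar = (p1 + p2) / 2
numerator = (z_alpha * (2 * p_bar * (1 - p_bar)) ** 0.5
             + z_beta * (p1 * (1 - p1) + p2 * (1 - p2)) ** 0.5) ** 2
n_srs = numerator / (p2 - p1) ** 2       # per-region n under simple random sampling (~446)
n_design = deff * n_srs                  # inflated for the design effect (~892)

print(f"per-region n (SRS):      {n_srs:.0f}")
print(f"per-region n (DEFF=2.0): {n_design:.0f}")

This lands near, though somewhat above, the 860 completed interviews per region; small differences in the formula variant, rounding, or the treatment of the design effect would account for the gap.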

We used our experience in conducting BRFSS and other telephone surveys to estimate that 13,000 households are needed to achieve 860 completed surveys in a region. After selecting addresses within each region, we will match addresses to phone numbers. On the basis of prior experience, we expect 50 % of addresses to have a phone number. Addresses that have a matching phone number will receive a letter in advance, followed by Computer-Assisted Telephone Interviewing (CATI) to complete the interview. Among those with matched phone numbers, we expect 40 % to be valid, active numbers. Among households with valid phone numbers, approximately 35 % will result in no contact (e.g., busy, voicemail); of households contacted, 55 % will refuse to be screened; and of households screened, 12 % will not have an eligible respondent. Among screened households with an eligible respondent, we expect approximately 90 % to complete a survey. Addresses that are not matched to a phone number will receive a letter (Attachment K) inviting them to either call us or provide us with their phone number in order to complete the household survey via CATI. We expect approximately 8.5 % of these households to return a postcard with a phone number (Attachment L), based on a typical mail response rate of 10 % and accounting for some invalid addresses in the address file. Among those returning a postcard, we expect at least 50 % to complete a survey.


Overall, we expect that 70 % of completed surveys will come from the sample with matched phone numbers, and 30 % will come from households without matching phone numbers. We expect to achieve adjusted response rates of 50 % or higher, on the basis of our experience conducting similar surveys such as the BRFSS.
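
To make the yield assumptions in the two preceding paragraphs easier to audit, the sketch below chains the stated rates from 13,000 sampled addresses to expected completed interviews. It treats the no-contact, screening refusal, and no-eligible-respondent percentages as sequential (conditional) rates, which is the reading that reproduces the roughly 70 %/30 % split described above.

# Expected yield per region, chaining the rates stated in the text.
# Assumption: the contact, screening, and eligibility rates are each
# conditional on the previous step, not shares of a single base.
addresses = 13_000

# Matched-phone-number path
matched = addresses * 0.50           # 50% of addresses match to a phone number
valid = matched * 0.40               # 40% of matched numbers are valid and active
contacted = valid * (1 - 0.35)       # 35% result in no contact
screened = contacted * (1 - 0.55)    # 55% refuse to be screened
eligible = screened * (1 - 0.12)     # 12% have no eligible respondent
completes_matched = eligible * 0.90  # 90% of eligible respondents complete

# Unmatched path (letter plus postcard)
unmatched = addresses * 0.50
postcards = unmatched * 0.085        # ~8.5% return a postcard with a phone number
completes_unmatched = postcards * 0.50

total = completes_matched + completes_unmatched
print(f"matched completes:   {completes_matched:.0f}")            # ~602
print(f"unmatched completes: {completes_unmatched:.0f}")          # ~276
print(f"total completes:     {total:.0f}")                        # ~879
print(f"matched share:       {completes_matched / total:.0%}")    # ~69%

Under this reading, the 13,000 sampled addresses yield roughly 880 completed interviews, above the 860 target, with about 70 % coming from the matched sample.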



B2. Procedures for the Collection of Information


Quantitative household survey data will be collected through a series of telephone interviews in the four identified disaster-exposed regions (Table 1). Within these regions, all households within a set number of miles of the tornado track will be included in the household sampling frame. Data collection will be achieved through ABS—households will be selected from a list of addresses that have been verified by the USPS’s computerized delivery sequence file. This design will provide coverage of landline households and wireless-only households.

In each region, a sample of 13,000 addresses will be drawn at random to yield 860 completed interviews. Based on prior experience, we expect 50 % of addresses to have a matching phone number. These households will receive a letter in advance of the phone call (Attachment J). Households without a corresponding phone number will be sent a letter (Attachment K) inviting them to call us or provide us with a phone number via postcard (Attachment L) in order to complete the survey.

Prior to conducting the CATI, all interviewers will attend the PHSMHCR-specific, one-day training, conducted by the CATI survey coordinator for the PHSMHCR project. The household survey training manual will provide the foundation for training CATI interviewers on survey-specific issues to ensure each interviewer is sufficiently skilled and knowledgeable about the household survey and can respond without hesitation to questions about the survey purpose, sponsor, and other common respondent questions. Specific training topics will include: survey goals and data usage; informed consent; CATI program nuances; review of unique or challenging terminology; study dialing and refusal protocols; special considerations for a survey about sensitive topics; methods for dealing with uncooperative respondents and maximizing response rates; methods to ensure privacy and minimize bias; and appropriate responses to frequently asked questions. The training also includes extensive practice using the CATI program, conducting mock interviews, and role-playing a variety of scenarios. As part of role-playing, the trainer will break the training group into smaller groups of three to four interviewers. Each group will be presented with several scenarios in which the respondent is reluctant to participate, or reluctant to transfer the phone to the selected respondent. Members of each group will discuss and role-play the methods they would use to gain respondent cooperation. At the end, the groups will convene as a whole to share ideas and receive feedback from the trainers. The sensitive nature of the household survey and content area predispose the project to the possibility of distressed respondents and adverse events. The training session will cover crisis protocol, respondent cues to ascertain distress levels, and appropriate actions for each distress level.

When an eligible household is reached, an eligible respondent will be selected randomly.

  • An eligible household is a housing unit that has a separate entrance, where occupants eat separately from other persons on the property, and that is occupied by its members as their principal or secondary place of residence. Noneligible households are (1) vacation homes not occupied by household members for more than 30 days per year, (2) group homes, and (3) institutions.

  • Eligible household members include all related adults (ages 18 years or older), unrelated adults, roomers, and domestic workers who consider the household their home, even though they may not be home at the time of the call. Household members do not include adult family members who are currently living elsewhere.
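
The within-household selection technique is not specified here, so the sketch below is only an illustration of one simple approach consistent with the criteria above: enumerate the eligible adults and draw one uniformly at random. The roster structure and field names are hypothetical, and the actual CATI program may use a different method (for example, a Kish grid or last-birthday selection).

# Illustrative within-household respondent selection (hypothetical roster
# fields; the production CATI program may implement selection differently).
import random

def select_respondent(household_members):
    """Return one randomly selected eligible adult, or None if none is eligible."""
    eligible = [
        m for m in household_members
        if m["age"] >= 18                        # adults only
        and m["considers_household_home"]        # household is their home
        and not m["currently_living_elsewhere"]  # exclude members living elsewhere
    ]
    return random.choice(eligible) if eligible else None

# Example roster collected during screening: only the first member is eligible.
roster = [
    {"age": 44, "considers_household_home": True, "currently_living_elsewhere": False},
    {"age": 17, "considers_household_home": True, "currently_living_elsewhere": False},
    {"age": 21, "considers_household_home": True, "currently_living_elsewhere": True},
]
print(select_respondent(roster))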


Over a 30-day period, interviewers will make a minimum of 15 attempts to reach an eligible household and interview an eligible adult for each telephone number in the sampling frame. These attempts will be spread over three calling periods: weekday days, weekday evenings, and weekends. At least three attempts will be made in each period; the remaining six attempts will be made at the times determined to be most productive, while maintaining about 20 % of calling during the weekday daytime period. Each call attempt will allow a minimum of five rings. Eligible persons initially refusing to participate will be recontacted a minimum of one additional time for attempted conversion. Proxy interviews will not be conducted. An interview is considered complete if data are collected for age, race, and sex. If values for age or race are not entered, imputed values will be generated and used only to assign post-stratification weights.
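
As a quick consistency check on the calling protocol, the sketch below allocates the minimum 15 attempts across the three calling periods under the stated constraints (at least three attempts per period, about 20 % of calling in the weekday daytime period). How the six discretionary attempts are split between evenings and weekends is an assumption for illustration only.

# Illustrative allocation of the minimum 15 call attempts per sampled number.
# Assumption: the six discretionary attempts are split between evenings and
# weekends, which the protocol treats as the more productive periods.
MIN_ATTEMPTS = 15
allocation = {"weekday_day": 3, "weekday_evening": 3, "weekend": 3}  # required minimums
discretionary = MIN_ATTEMPTS - sum(allocation.values())              # 6 remaining attempts

allocation["weekday_evening"] += discretionary // 2
allocation["weekend"] += discretionary - discretionary // 2

assert sum(allocation.values()) == MIN_ATTEMPTS
assert all(v >= 3 for v in allocation.values())
print(allocation)                                         # {'weekday_day': 3, 'weekday_evening': 6, 'weekend': 6}
print(f"{allocation['weekday_day'] / MIN_ATTEMPTS:.0%}")  # 20% of attempts in the weekday daytime period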

Quality Assurance (QA) Assistants in the call center monitor at least 10 % of all interviews by unobtrusively tapping into an interviewer’s telephone line while simultaneously observing responses as the interviewer enters them into the CATI system. Each interviewer is monitored at least weekly. In addition, we have a remote monitoring system that allows visual and auditory monitoring of any interview. CDC staff will be able to hear interviews, while a “dead” line permits them to communicate with the QA assistant during the monitoring session without interrupting the interview. A web-based interface will allow CDC staff to see the data being entered into the CATI program as the interview is being conducted. We keep a recording database of all CATI calls occurring during the prior 15 days of fielding. The database houses the majority of attempts, which include everything from completed interviews and introductions to no-answers (e.g., answering machines, privacy managers). Recorded interviews allow both ICF and CDC team members to conduct additional monitoring and QA tasks.



B3. Methods to Maximize Response Rates and Deal with Nonresponse


For the household survey, we will be drawing on our BRFSS data collection infrastructure and expect to achieve response rates comparable to the BRFSS. CDC guidelines require a minimum Council of American Survey Research Organizations (CASRO) response rate (RR) of 40 % for the BRFSS, a response rate that we exceed in every State for which we administer the survey (Table 2).

Table 2. ICF BRFSS Response Rates by State

State | 2012 CASRO RR
Arkansas | 53%
Arizona | 50%
Connecticut | 49%
District of Columbia | 50%
New Hampshire | 54%
Rhode Island | 48%
Vermont | 61%
Washington | 48%
Wyoming | 53%
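
For readers unfamiliar with the CASRO rate cited above and in Table 2, the sketch below shows one common formulation: completed interviews divided by the estimated number of eligible units, where cases of unknown eligibility are assumed to be eligible at the same rate observed among resolved cases. The counts are hypothetical, and the operational definitions used for BRFSS reporting may differ in detail.

# Illustrative CASRO-style response rate calculation with made-up counts.
def casro_response_rate(completes, eligible_nonrespondents, ineligible, unknown_eligibility):
    resolved_eligible = completes + eligible_nonrespondents
    e = resolved_eligible / (resolved_eligible + ineligible)      # eligibility rate among resolved cases
    estimated_eligible = resolved_eligible + e * unknown_eligibility
    return completes / estimated_eligible

# Example: 860 completes, 700 eligible non-respondents, 400 known-ineligible
# sample records, and 300 records of unknown eligibility.
print(f"{casro_response_rate(860, 700, 400, 300):.0%}")  # about 48% under these counts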

















We have responded to the industry-wide decline in response rates by refining and optimizing our interviewing protocol and interviewer training and by implementing leading-edge call center technology. We expect to achieve a 50 % CASRO response rate for the household survey based on our experience conducting BRFSS data collection. Suggested protocol enhancements to increase response rates include the following:

  • Sending advance notification letters to the addresses matched to phone numbers to alert households to the survey effort underway.

  • Making first attempt calls on the listed landline sample during the evening and weekend shifts, as evenings and weekends are more productive than weekdays.

  • Making up to two additional landline refusal conversion attempts in cases meeting specific criteria in order to obtain a more representative sample and increase the response rate.

  • Developing an Interactive Voice Response (IVR) system for the PHSMHCR household survey, and including the IVR phone number in the pre-notification letter, to promote informed survey response and provide 24-hour survey information to respondents.

  • Displaying caller identification linked to the IVR system; caller ID is critical to reaching respondents who use call-blocking and privacy manager devices, and notifying respondents of the PHSMHCR research effort is critical to achieving a representative survey sample.



B4. Tests of Procedures or Methods to Be Undertaken


During the instrument development process, the CATI coordinator will review the survey’s internal logic: making sure the questionnaire is consistent from question to question and that each question is asked of the appropriate respondents, adding questions to confirm the accuracy of certain responses, and formatting the questionnaire. The CATI programming team will conduct a final review of the survey logic and skip patterns and then program and test the survey, subjecting it to several rounds of testing and quality control. The programmer will review the survey logic and resolve any discrepancies with the project team. A separate programmer will check the program to confirm accuracy. The program management team will conduct a thorough review to confirm accurate wording and screen layout, and CDC staff will review an electronic version of the final survey.

The CATI training includes extensive practice using the CATI program, conducting mock interviews, and role-playing a variety of scenarios.



B5. Individuals Consulted on Statistical Aspects and Individuals Collecting and/or Analyzing Data

Name | Telephone Number | Email Address | Role | Organization
Bhuvana Sukumar | 404-592-2122 | [email protected] | Designed the data collection; Collect the data; Analyze the data | ICF International
Megan Brooks | 651-330-6085 | [email protected] | Designed the data collection; Collect the data; Analyze the data | ICF International
Tamara Lamia | 404-592-2248 | [email protected] | Designed the data collection; Collect the data; Analyze the data | ICF International
Melissa Scardaville | 404-321-3211 | [email protected] | Designed the data collection; Collect the data; Analyze the data | ICF International
Lisle Hites* | 205-975-8980 | [email protected] | Designed the data collection; Collect the data; Analyze the data | University of Alabama
Jessica Wakelee* | 205-975-8963 | [email protected] | Designed the data collection; Collect the data; Analyze the data | University of Alabama
Jessie Rouder | 646-695-8138 | [email protected] | Designed the data collection; Collect the data; Analyze the data | ICF International
Naomi Freedner | 802-264-3730 | [email protected] | Designed the data collection; Collect the data; Analyze the data | ICF International
William Robb | 646-695-8182 | [email protected] | Designed the data collection; Collect the data; Analyze the data | ICF International
Asha Z. Ivey-Stephenson | 404-639-7581 | [email protected] | Designed the data collection; Collect the data; Analyze the data | Centers for Disease Control and Prevention (CDC)
Dale Rose | 404-639-5115 | [email protected] | Designed the data collection; Collect the data; Analyze the data | Centers for Disease Control and Prevention (CDC)
Sara Vagi | 404-639-0879 | [email protected] | Designed the data collection; Collect the data; Analyze the data | Centers for Disease Control and Prevention (CDC)
Epidemic Intelligence Service Officer TBD | TBD | TBD | Designed the data collection; Collect the data; Analyze the data | Centers for Disease Control and Prevention (CDC)




