National Study of Title IV-E Child Welfare Waiver Demonstrations

OMB: 0970-0495


OMB Information Collection Request

New Collection


Supporting Statement

Section B

September 23, 2016


Submitted by:

Children’s Bureau

Administration for Children and Families

U.S. Department of Health and Human Services

330 C St SW

Washington, DC 20201


Project Officer:

Liliana Hernandez

202-205-8086

[email protected]


Statistical Methods (Used for Collection of Information Employing Statistical Methods)


1. Respondent Universe and Sampling Method


The universe of respondents is up to 333 representatives from 25 jurisdictions implementing demonstration projects (i.e., Arkansas, Arizona, Colorado, Hawaii, Illinois, Kentucky, Maine, Maryland, Massachusetts, Michigan, Nebraska, Nevada, New York, Oklahoma, Oregon, Pennsylvania, Port Gamble S’Klallam Tribe, Rhode Island, Tennessee, Texas, Utah, Washington, Washington DC, West Virginia, Wisconsin). Nonprobability sampling strategies will be used.


The respondents to the Web-Based Survey will make up a purposive sample of 250 waiver jurisdiction representatives and evaluators drawn from the 25 waiver jurisdictions. The sample will include evaluators (lead evaluators/principal investigators), jurisdiction demonstration project leaders, jurisdiction fiscal leaders, and representatives from the local county/regional child welfare agency(ies) in the areas of practice, program, policy, fiscal, and research/evaluation. The respondents will be identified by the 25 jurisdiction demonstration project leaders using the Web-Based Survey Sampling Form. The expected response rate for the Web-Based Survey Sampling Form is 100 percent. The Web-Based Survey will be administered once during the National Study. The expected response rate is 80–90 percent.


The respondents to the Measuring Well-Being Telephone Survey will constitute a census of the 23 evaluators (lead evaluator/principal investigator/designee) from the 23 jurisdictions involved with the assessment of child and caregiver well-being in their waiver jurisdictions. The Measuring Well-Being Telephone Survey will be administered once during the National Study. The expected response rate is 80–90 percent.


The respondents to the Practice- and Systems-Level Change Telephone Survey will make up a purposive sample of up to 60 respondents identified from 14 waiver jurisdictions. Approximately four respondents will be identified from each jurisdiction. Respondents will include the evaluator (lead evaluator/principal investigator), the jurisdiction demonstration project leader, and representatives from the local county/regional child welfare agency(ies) knowledgeable about practice, policy, and organizational changes in their respective agency. The jurisdictions represent the 2012 and 2013 cohorts and thus have been implementing their demonstrations long enough for the respondents to address the survey questions. The respondents will be identified by the 14 jurisdiction demonstration project leaders using the Practice- and Systems-Level Change Telephone Survey Sampling Form. The expected response rate for this sampling form is 100 percent. The Practice- and Systems-Level Change Telephone Survey will be administered once during the National Study. The expected response rate is 80–90 percent.


The number of jurisdictions from which the samples will be drawn for the two telephone surveys deviates from what was reported in the Federal Register, Volume 81, Number 114, Tuesday, June 14, 2016, pages 38709–38710. The deviations reduced the burden estimates. The number of jurisdictions for the Measuring Well-Being Telephone Survey was changed from 12 to 23 to accommodate all the jurisdictions measuring child and caregiver well-being, but the number of individuals interviewed was reduced from 60 to 23. The number of jurisdictions for the Practice- and Systems-Level Change Telephone Survey was changed from 12 to 14 to include all the jurisdictions in the 2012 and 2013 cohorts, but the total possible number of individuals interviewed remained at 60.


The waiver jurisdictions are invited to participate in the National Study, including information collection described in this request. Representatives from the jurisdictions played a key role in the feasibility study that determined the viability of a national project. This data collection has not been conducted previously.


2. Procedures for Collection of Information


No statistical methodology for stratification and sample selection will be used for the information collection activities in this request. The information collected is descriptive and will not be used for formal hypothesis testing. The nonprobability sampling approach is the most efficient and appropriate method for generating the respondent groups for these information collection activities.


Because the Web-Based Survey is an electronic survey, advance appointments will not be made. The contractor will distribute a standard email communication (Appendix D) to all 25 jurisdictions’ demonstration project leaders describing the purpose and process of the survey and requesting a list of respondents for each jurisdiction. The contractor will also send an email communication to all proposed survey respondents, copying the jurisdiction demonstration project leader, prior to the distribution of the surveys to the respondents.

Quality control procedures for the Web-Based Survey will be implemented by the contractor in regard to the following: (1) accuracy and completeness of survey distribution lists; (2) accuracy and completeness of electronic survey programming, including internal testing of all versions of the Web-Based Survey; (3) monitoring response rates and completeness of returned survey data; (4) reminders to nonresponders; (5) descriptive analysis; (6) completeness and accuracy of coding processes for qualitative responses to open-ended survey questions; and (7) completeness and accuracy in all reporting in consideration of respondent confidentiality.

Advance appointments will be made for the Measuring Well-Being Telephone Survey. The contractor will distribute a standard email communication (Appendix D) to the evaluator (lead evaluator/principal investigator) from each of the 23 jurisdictions measuring child and caregiver well-being to describe the purpose and process of the telephone survey. The contractor will follow up with each evaluator to schedule an appointment to conduct the telephone survey.


Advance appointments will also be made for the Practice- and Systems-Level Change Telephone Survey. The contractor will distribute a standard email communication (Appendix D) to the 14 jurisdictions’ demonstration project leaders from the 2012 and 2013 cohorts describing the purpose and process of the survey and requesting a list of respondents for the jurisdiction. The contractor will also send an email to all proposed survey respondents, copying the jurisdiction demonstration project leader, to schedule an appointment to conduct the telephone survey.


Quality control procedures for both telephone surveys will be implemented by the contractor in regard to the following: (1) accuracy and completeness of inclusion of appropriate respondents; (2) accuracy and completeness of questions and training of interviewers, including pilot testing of all versions of telephone surveys; (3) completeness and accuracy of coding processes for qualitative responses to open-ended questions; and (4) completeness and accuracy in all reporting in consideration of respondent confidentiality.


3. Methods To Maximize Response Rates and Deal With Nonresponse


A feasibility study for the National Study was conducted with representatives from the waiver demonstration projects participating as members of the primary work group. The findings of the feasibility study, along with the enthusiasm expressed by the work group, led to initiation of the National Study. High response rates are therefore anticipated, given the respondents’ expected investment in the National Study. Calculation of the estimated response rates is shown in Exhibit B-3.



Exhibit B-3. Calculation of Estimated Response Rates

Survey | Respondent | # Respondents/# Sampled | Response Rate (%)
Web-Based Survey Sampling Form | Jurisdiction demonstration project leaders | 25/25 | 100
Web-Based Survey | Evaluators, jurisdiction, and local county/regional child welfare agency personnel | 200–225/250 | 80–90
Measuring Well-Being Telephone Survey | Evaluators | 18–21/23 | 80–90
Practice- and Systems-Level Change Telephone Survey Sampling Form | Jurisdiction demonstration project leaders | 14/14 | 100
Practice- and Systems-Level Change Telephone Survey | Evaluators, jurisdiction, and local county/regional child welfare agency personnel | 48–54/60 | 80–90
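The respondent ranges in Exhibit B-3 are straightforward arithmetic: the expected number of completed surveys equals the number sampled multiplied by the assumed 80–90 percent response rate. A minimal sketch of that calculation, assuming rounding to the nearest whole respondent (the helper name is illustrative, not part of the study materials):

```python
# Illustrative check of the Exhibit B-3 figures: expected completed
# surveys = number sampled x assumed response rate, rounded to whole
# respondents. The 100-percent sampling forms are omitted (25/25, 14/14).
def expected_completes(sampled, low=0.80, high=0.90):
    """Return the (low, high) range of completed surveys for a sample size."""
    return round(sampled * low), round(sampled * high)

surveys = {
    "Web-Based Survey": 250,
    "Measuring Well-Being Telephone Survey": 23,
    "Practice- and Systems-Level Change Telephone Survey": 60,
}

for name, n in surveys.items():
    lo, hi = expected_completes(n)
    print(f"{name}: {lo}-{hi}/{n}")
```

Applying the 80–90 percent assumption reproduces the ranges in the exhibit: 200–225 of 250 for the Web-Based Survey, 18–21 of 23 for the Measuring Well-Being Telephone Survey, and 48–54 of 60 for the Practice- and Systems-Level Change Telephone Survey.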

Maximizing response rates is crucial to the administration of the surveys, although logistical issues such as scheduling interviews can arise. The content and format of the three surveys were developed in consultation with key stakeholders from the jurisdictions and CB. Strategies that emphasize flexibility, confidentiality, and respect for respondents’ time facilitate timely participation. No incentives are provided for participation in any of the surveys.


The following strategies will be implemented to maximize participation in the information collection and achieve the desired response rates. The various forms of correspondence are located in Appendix D.


General Introduction and Notification


CB will send an introductory email (National Study Data Collection Introductory email) via the waiver listserv to all jurisdiction respondents and evaluators describing the National Study and inviting them to participate in the survey data collection.


Web-Based Survey


Introduction and Notification

a. The contractor will send an introductory email (WBS Introduction and Request Email) along with the Web-Based Survey Sampling Form to the jurisdiction demonstration project leaders.

b. The contractor will send an introductory email (WBS Respondent Introductory Email) to each of the identified respondents. Respondents will be assured that data will be aggregated and not attributable to individuals.

Pre-Interview Preparation

None


Administration

a. The contractor will send an email (WBS Link and Completion Request Email) to all sampled respondents with a request to complete the survey (i.e., by accessing a Web link to an online version of the survey).

b. Approximately 1 week after sending this initial email, the contractor will send a reminder email (WBS Completion Reminder Email) to those respondents who have not yet completed the survey.

c. Approximately 1 week after sending this reminder email, the contractor will call (WBS Reminder Call Telephone Script) nonrespondents to remind them of the survey. During these follow-up calls, the contractor will administer the WBS by telephone if the respondent agrees.

d. Alternative Response Method: If the respondent prefers to submit written responses instead of responding online, the contractor will provide him or her with a paper version of the Web-Based Survey upon request. The Web-Based Survey can also be administered through a telephone interview if requested.


Thank You and Follow-Up

a. The online survey will conclude by thanking the respondent for his or her participation and providing contact information for any questions.


Measuring Well-Being Telephone Survey


Introduction and Notification

a. The contractor will send an introductory email (MWBS Introduction and Request Email) to the jurisdiction demonstration project evaluators.

b. The contractor will send an introductory email (MWBS Respondent Introductory Email) to each of the identified respondents. The email will request dates and times available for the interview. Respondents will be assured that data will be aggregated and not attributable to individuals. This email will include the telephone survey.

c. The contractor will schedule the interview (MWBS Scheduled Interview Email).


Pre-Interview Preparation

a. Background information for some survey items will be “pre-filled” when applicable.

b. Interviewers will be familiar with the demonstration project in each jurisdiction to expedite administration of the interview.


Administration

a. The contractor will confirm the interview via email (MWBS Confirmation-Reminder Email) 2 to 3 days before the interview and reschedule any interviews as necessary to accommodate changes in the respondent’s schedule.

b. The contractor will administer the telephone survey.

c. Alternative Response Method: If the respondent prefers to submit written responses instead of participating in a telephone interview, the contractor will provide him or her with a paper version to submit via email, mail, or fax.


Thank You and Follow-Up

a. The contractor will send a follow-up email (MWBS Thank You Email) thanking the respondent for his or her participation and providing contact information for any questions.


Practice- and Systems-Level Change Telephone Survey


Introduction and Notification

a. The contractor will send an introductory email (PSLCS Introduction and Request Email) to the jurisdiction demonstration project leaders and evaluators along with the Practice- and Systems-Level Change Survey Sampling Form.

b. The contractor will send an introductory email (PSLCS Respondent Introductory Email) to each of the identified respondents. The email will request dates and times available for the interview. Respondents will be assured that data will be aggregated and not attributable to individuals. This email will include the telephone survey.

c. The contractor will schedule the interview (PSLCS Scheduled Interview Email).


Pre-Interview Preparation

a. Background information for some survey items will be “pre-filled” when applicable.

b. Interviewers will be familiar with the demonstration project in each jurisdiction to expedite administration of the interview.


Administration

a. The contractor will confirm the interview via email (PSLCS Confirmation-Reminder Email) 2 to 3 days before the interview and reschedule any interviews as necessary to accommodate changes in the respondent’s schedule.

b. The contractor will administer the telephone survey.

c. Alternative Response Method: If the respondent prefers to submit written responses instead of participating in a telephone interview, the contractor will provide him or her with a paper version to submit via email, mail, or fax.


Thank You and Follow-Up

a. The contractor will send a follow-up email (PSLCS Thank You Email) thanking the respondent for his or her participation and providing contact information for any questions.


4. Tests of Procedures or Methods To Be Undertaken


The three survey instruments contained herein were subject to review and feedback from key stakeholders including CB and a group of jurisdiction demonstration project evaluators. Pilot tests were conducted for each instrument using a sample of no more than nine respondents (i.e., similar respondents from jurisdictions not subject to the National Study). Respondents made wording recommendations and suggested reordering the questions so that positive questions preceded questions inquiring about something negative. The survey instruments were refined accordingly to minimize burden and improve utility. The pilot tests were instrumental in determining the amount of time required to complete the surveys and forms and develop burden estimates.


User access and responsiveness to the Web-based methodology of the Web-Based Survey have been pilot-tested. A change in question format (from drop-down menus to radio buttons) was made in response to feedback from the pilot test.


5. Individuals Consulted on Statistical Aspects and Individuals Collecting and/or Analyzing Data


The contractor will collect and analyze the information for the Children’s Bureau.

Author: Jan Rothstein
