

Sexual Risk Avoidance Education
National Descriptive Study—Early Implementation Study (NDS-EIS)

OMB Information Collection Request

0970-New Collection

Supporting Statement

Part B

March 2020

Submitted By:

The Family and Youth Services Bureau
and
The Office of Planning, Research, and Evaluation


Administration for Children and Families

U.S. Department of Health and Human Services

4th Floor, Mary E. Switzer Building

330 C Street, SW

Washington, DC 20201

Project Officers:
Jessica Johnson
Calonie Gray



Contents

Introduction

B1. Respondent Universe and Sampling Methods

B2. Procedures for Collection of Information

B3. Methods to Maximize Response Rates and Deal with Nonresponse

B4. Tests of Procedures or Methods to Be Undertaken

B5. Individual(s) Consulted on Statistical Aspects and Individuals Collecting and/or Analyzing Data



INSTRUMENTS

Instrument #1 - Grantee Web Survey

Instrument #2 - Grantee Telephone Interview Protocol

ATTACHMENTS

Attachment A Research Question-Instrument Crosswalk

Attachment B Web Survey Advance Email

Attachment C FAQ

Attachment D Web Survey Invitation Email

Attachment E Web Survey Reminder Email

Attachment F Telephone Interview Invitation Email

Attachment G 60-Day Federal Register Notice

This information collection request (ICR) focuses on the National Descriptive Study’s Early Implementation Study (NDS-EIS). This is a new ICR. See Supporting Statement A for background information.


B1. Respondent Universe and Sampling Methods

The respondent universe for the NDS-EIS is all 122 SRAE grantees—39 State grantees, 26 Competitive grantees, and 57 Departmental grantees. The target respondents for the web survey and telephone follow-up interviews are the state grantee administrators (State SRAE) or organization grantee administrators (Competitive and Departmental SRAE). ACF will supply the contractor, Mathematica Policy Research, with the contact information of the grant administrators. The contractor will conduct the document review and field the Grantee Web Survey (Instrument 1) as a census of the grantees. A census is appropriate because the population is small, each grantee is unique, and a sample of the grantees might miss some key practices. In addition, it would be more costly to design and implement a sampling strategy than to attempt to reach all of the grantees.


The overall response rate to the web survey is expected to be greater than 90 percent. The 65 State and Competitive grantees are required to participate in evaluation activities. Although the Funding Opportunity Announcement for the 57 Departmental grantees did not require them to participate in evaluation activities, we will make several efforts to encourage grantees to take part in the data collection effort (see Section B3).


After grantees have completed the Grantee Web Survey, the contractor will follow up with in-depth telephone interviews with grantees using the Grantee Telephone Interview Protocol (Instrument 2). Depending on contract resources at that time, the interviews may cover a subset of grantees rather than all of them. If a sample is selected, ACF will purposefully select the respondents. Key themes and questions that emerge from analysis of the web survey findings will, in part, inform the selection strategy. For instance, the contractor could use telephone interviews to clarify exactly how Title V State and Competitive SRAE grantees plan to meet the requirements, such as providing education on the “A–F” topics specified in the Title V legislation. In this case, only State and Competitive SRAE grantees would be chosen. Alternatively, ACF may elect to learn more about how grantees are integrating multiple program components (for example, SRAE curricula with additional content modules, activities, and parent engagement strategies). In this case, the interview sample would be limited to grantees that are implementing multiple components.


B2. Procedures for Collection of Information

Document Review

The contractor will collect much of the needed information on grant structure, target populations, geographic location and context, and program components through a review of the grant applications and post-award plans of each SRAE grantee. The contractor will also review the first semiannual reports if they are available. Because the document review relies upon extant sources that ACF will provide to the contractor, no burden is associated with this activity.


Grantee Web Survey

Prior to the survey, the contractor will send each grant administrator a notification email that briefly describes the NDS-EIS and the Grantee Web Survey and explains how the survey relates to the broader SRAE National Evaluation efforts (Attachment B). The email will attach the FAQ (Attachment C) and provide the contractor’s contact information in case grantees have any questions. The email will remind State and Competitive grantees that their participation is required, and it will inform Departmental grantees that although their participation is not required, it is highly valuable to ACF. The email will also state the time needed to complete the survey and provide the deadline for completing it. The contractor will follow up by email with step-by-step instructions for accessing and completing the survey (Attachment D). The contractor will field the survey for six weeks. Nonrespondents will receive periodic reminders to complete the survey (Attachment E), and the contractor will inform ACF about response rates so that federal project officers can follow up with nonrespondents. The survey will take about 90 minutes to complete.


The contractor will field the survey on the web to enable grantees to respond on their own time using their preferred electronic devices, such as a smartphone, tablet, laptop, or desktop computer. Respondents can pause the survey, with their responses saved, and resume it later, using the same or a different device. They can also go back and change responses, if needed, before final submission. These features will be particularly useful if respondents have to delegate some parts of the survey to other staff. The contractor will program the survey with checks to prevent missing, inconsistent, or implausible responses.
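The Supporting Statement does not specify how these checks will be programmed, and the survey platform and item names are not part of this ICR. The brief Python sketch below is purely illustrative: the item names (for example, youth_served_total) are hypothetical placeholders used to show the kinds of missing-data, implausibility, and consistency checks described above.

# Illustrative sketch only; item names are hypothetical placeholders,
# not items from the Grantee Web Survey.

def check_response(answers: dict) -> list[str]:
    """Return a list of issues to surface to the respondent before submission."""
    issues = []

    # Missing-data check on a critical item: prompt before allowing the
    # respondent to proceed.
    if answers.get("youth_served_total") is None:
        issues.append("Please report the total number of youth you plan to serve.")

    # Implausibility check: counts cannot be negative.
    for field in ("youth_served_total", "num_facilitators"):
        value = answers.get(field)
        if value is not None and value < 0:
            issues.append(f"{field} cannot be negative.")

    # Consistency check: a subgroup count cannot exceed the total.
    total = answers.get("youth_served_total")
    rural = answers.get("youth_served_rural")
    if total is not None and rural is not None and rural > total:
        issues.append("Youth served in rural areas cannot exceed total youth served.")

    return issues


if __name__ == "__main__":
    example = {"youth_served_total": 200, "youth_served_rural": 250, "num_facilitators": 4}
    for issue in check_response(example):
        print(issue)

In practice, such checks run each time the respondent tries to advance to the next screen, so that problems are flagged while the relevant question is still in view.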


In-Depth Telephone Interviews

After the web survey closes, the contractor, on behalf of ACF, will conduct in-depth telephone interviews with grantees; each interview could last up to 1.5 hours.


The contractor will send an email inviting administrators or program directors of selected grantees to participate in an interview (Attachment F). The invitation will describe the purpose of the interview and suggest some potential dates and times for the call. The contractor will then follow up by telephone to schedule the interview at a time that is convenient to the respondent.


B3. Methods to Maximize Response Rates and Deal with Nonresponse

Expected Response Rates

ACF expects a high web survey response rate, greater than 90 percent, for several reasons. First, the 65 State and Competitive grantees are required to participate in evaluation activities per the conditions of their grant awards. Second, the administrators and program directors who will respond to the survey are heavily invested in the issues surrounding sexual risk avoidance education, so we expect that they will be motivated to participate. Third, the contractor will use several strategies to contact nonrespondents and encourage their participation (see Maximizing Response Rates below). The expected response rate is adequate for the survey purpose of describing SRAE grantees’ plans. We expect that grantees asked to participate in subsequent interviews will respond similarly, and for similar reasons. The State and Competitive grantees are required to participate in the national evaluation activities. Furthermore, the interviews are designed to allow grantees to share more details about their program plans than the web survey can capture, and sharing those plans will be viewed as a contribution to the field of SRAE programming.


Dealing with Nonresponse

Maximizing participation will help to minimize unit nonresponse, and programming the web survey to prompt respondents to answer critical questions before proceeding will help to minimize item nonresponse. If the survey response rate is below 90 percent, the contractor will conduct a nonresponse analysis to identify any systematic differences between respondents and nonrespondents.
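The Supporting Statement does not prescribe a specific method for the nonresponse analysis. As a minimal sketch, assuming the comparison is made on a characteristic known for the full universe of 122 grantees (grantee type) and that Python with scipy is available, a chi-square test of independence could flag differential response. The counts below are hypothetical placeholders, not study data.

# Illustrative sketch only: compares respondents and nonrespondents on grantee
# type, a characteristic known for the full universe of 122 grantees.
from scipy.stats import chi2_contingency

# Rows: grantee type (State, Competitive, Departmental); columns: responded, did not respond.
# Hypothetical counts; row totals match the universe of 39, 26, and 57 grantees.
table = [
    [38, 1],   # State
    [25, 1],   # Competitive
    [46, 11],  # Departmental
]

chi2, p_value, dof, expected = chi2_contingency(table)
print(f"chi-square = {chi2:.2f}, df = {dof}, p = {p_value:.3f}")
# A small p-value would suggest that response propensity differs by grantee type,
# flagging a potential source of bias in the descriptive findings.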


Maximizing Response Rates

The contractor will use several strategies to achieve a high response rate to the survey. An email (Attachment D) will remind State and Competitive grantees that their participation is required, per the Funding Opportunity Announcement. Although the Funding Opportunity Announcement for the 57 Departmental grantees did not require them to participate in evaluation activities, we will make several efforts to encourage those grantees to take part in the data collection effort. After publication of the 60-Day FRN, we introduced the proposed EIS data collection effort to all grantees, including the Departmental grantees, during the January 2019 and October 2019 SRAE grantee orientation meetings, and included a discussion of why participation would matter for each grantee and the larger field. After approval of this ICR, the federal project officers and the contractor will also disseminate a frequently asked questions (FAQ) document about evaluation efforts (Attachment C) and reminder emails (Attachment E). These efforts should maximize participation among the Departmental grantees. To minimize item nonresponse on critical questions, the contractor will program the web survey with prompts to ensure respondents enter answers before proceeding.


The contractor will follow up with nonrespondents to encourage their participation by highlighting the importance of the survey. Federal project officers may also contact nonrespondents to encourage their participation.


B4. Tests of Procedures or Methods to Be Undertaken

The information collection instruments are similar to surveys and interview protocols that prior studies have used successfully with similar populations, including the PREP Multi-Component Evaluation.


B5. Individual(s) Consulted on Statistical Aspects and Individuals Collecting and/or Analyzing Data

Mathematica Policy Research, the contractor for this study, will collect the information for the NDS-EIS on behalf of ACF. The following individuals provided key input on the current data collection instruments:


Seth Chamberlain

Branch Chief (Formerly Project Officer in the Office of Planning, Research, and Evaluation)

Office of Family Assistance

Administration for Children and Families

U.S. Department of Health and Human Services


Jessica Johnson

Project Officer

Family and Youth Services Bureau

Administration for Children and Families

U.S. Department of Health and Human Services


Calonie Gray

Senior Social Science Research Analyst

Office of Planning, Research, and Evaluation

Administration for Children and Families

U.S. Department of Health and Human Services



Sarah Forrestal

Senior Survey Researcher

Mathematica Policy Research


Brian Goesling

Senior Researcher

Mathematica Policy Research


Kimberly McDonald

Survey Researcher

Mathematica Policy Research


Alicia Meckstroth

Senior Researcher

Mathematica Policy Research


Matthew Stagner

Vice President

Mathematica Policy Research


Melissa Thomas

Senior Survey Researcher

Mathematica Policy Research


Susan Zief

Senior Researcher

Mathematica Policy Research

