
Alternative Supporting Statement for Information Collections Designed for Research, Public Health Surveillance, and Program Evaluation Purposes





Success Sequence Interviews



OMB Information Collection Request

0970 - New Collection



Supporting Statement

Part B

January 2022


Submitted by:

Office of Planning, Research and Evaluation

Administration for Children and Families

U.S. Department of Health and Human Services


4th Floor, Mary E. Switzer Building

330 C Street, SW

Washington, DC 20201


Project Officer:

Caryn Blitz


Part B

B1. Objectives

Study Objectives

The objective of the Success Sequence Interviews study is to collect data from adults ages 30 through 35 about factors associated with economic self-sufficiency and the sequencing of four milestones: high school graduation, employment, marriage, and childbearing.

The data collected from the interviews will:

  • Provide the Administration for Children and Families (ACF) with a deeper contextual understanding of the success sequence and the factors that influence the order in which respondents complete the milestones and pathways to achieve economic self-sufficiency.

  • Help provide ACF’s Family and Youth Services Bureau (FYSB) with greater insight into current program content and strategies related to the success sequence that could best resonate with Sexual Risk Avoidance Education (SRAE) program participants.

Generalizability of Results

Given that the interviews involve a nonprobability sample, the interview data will not be used to generate generalizable statistics for all adults ages 30 through 35. The resulting data will be used to improve SRAE program content and strategies related to the success sequence, informing future trainings and technical assistance needs of the grant recipients conducting the programs.

Appropriateness of Study Design and Methods for Planned Uses

The success sequence economic analysis indicates that individuals fall into 64 milestone sequence pathways.1 The qualitative interviews are central to understanding the complex decisions and circumstances youth face as they transition into adulthood. The data collected via the interviews will provide a more in-depth perspective on the factors that may be associated with changes in economic self-sufficiency, as well as contextual information about the factors influencing the order in which adults achieve the milestones. Collaboration with external market research vendors will allow us to recruit a diverse interview sample (see Supporting Statement Part A, Section A2). We will conduct virtual one-on-one asynchronous interviews in which participants log in to an online platform and respond to discussion questions moderated by the study research team at Mathematica. This study design lends itself well to the population of interest, as adults in this age range often face competing demands and find it difficult to set aside time for a telephone interview during the day owing, for example, to work schedules or childcare responsibilities. The chat-board format allows respondents to complete activities at a convenient time over a three-day period. In addition, compared with interviewer-administered methods such as face-to-face or telephone interviews, the design is similar to a self-administered survey and is likely more appropriate for limiting social desirability bias on sensitive questions.2 Key limitations, described in Supporting Statement Part A, Section A2, will be detailed in written products associated with the study.

As noted in Supporting Statement Part A, the survey information is not intended for use as the principal basis for public policy decisions and is not expected to meet the threshold of influential or highly influential scientific information.

B2. Methods and Design

Target Population

The target population for this study includes English- and Spanish-speaking adults ages 30 through 35 living in the United States. The Success Sequence Economic Analysis report,3 discussed in Supporting Statement Part A (Section A2), indicates that individuals fall into 64 of 65 possible life milestone sequence pathways. The analysis finds a complexity of paths taken by a national sample of over 7,000 survey respondents, with over 65 percent of survey participants following the 10 most common pathways and the remainder falling into 54 other pathways. Table C.1, in Appendix C, shows the percent of National Longitudinal Survey of Youth 1997 Cohort (NLSY97) survey participants by the order in which the key success sequence model variables of high school graduation, employment, marriage, and childbearing were or were not attained.

To meet the goal of this research study, understanding the complex decisions and circumstances among youth transitioning into adulthood as they relate to the success sequence milestones and the order of those milestones, it is necessary to purposively seek study participants with variation in income and in each of the four milestones (high school graduation, employment, marriage, and childbearing). We propose using an interview sample size proportional to the percent of cases observed in the NLSY97 data used for the economic analysis, such that up to 110 interviews are conducted with those who report the 10 most common pathways, discussed in Section A2 and shown in the first 10 rows of Table C.1 in Appendix C, and an additional 115 interviews are conducted with those who report the remaining, less common pathways found in the NLSY97 analysis. The total of 225 includes a goal of conducting up to 11 interviews with individuals who report an uncommon ordering of the key success sequence variables, which represents 0.05 percent of observed cases in the NLSY97 data analysis.
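
To illustrate the proportional allocation described above, the following sketch shows how per-pathway interview targets could be computed from the pathway shares in Table C.1. This is a hypothetical illustration in Python, not part of the approved study materials; the function name, the one-interview floor for rare pathways, and the adjustment note are assumptions.

# Hypothetical sketch: allocate a fixed number of interviews across milestone
# pathways in proportion to the share of NLSY97 cases observed for each pathway,
# with a floor so rare orderings are still represented. The floor is an
# illustrative assumption, not a value taken from Table C.1.

def allocate_interviews(pathway_shares, total_interviews=225, minimum_per_pathway=1):
    """Return a target interview count for each pathway.

    pathway_shares maps a pathway label to its proportion of observed NLSY97
    cases (values summing to roughly 1.0), as reported in Table C.1.
    """
    targets = {}
    for pathway, share in pathway_shares.items():
        # Scale the observed share to the planned total, then apply the floor.
        targets[pathway] = max(minimum_per_pathway, round(share * total_interviews))
    # In practice, targets would be adjusted so they sum to the planned total.
    return targets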

Mathematica will collaborate with market research vendors, which maintain large standing panels of research participants, to purposively recruit study members using the sample size targets by milestone combination shown in Table C.1. All adults ages 30 through 35 in the research panels who speak English or Spanish will be eligible for recruitment into the study, but they will undergo screening at recruitment to collect data on household income, life milestones achieved, geographic location, and demographics (gender, race, and ethnicity). Given that the sample is a nonprobability sample, the interview data will not be used to generate generalizable statistics for adults ages 30 through 35.

Respondent Recruitment

The research team at Mathematica will use nonprobability, purposive respondent recruitment to identify potential participants for up to 225 interviews; all participants must meet a set of criteria. First, as noted, to be eligible for the study, participants must be between the ages of 30 and 35 and speak either English or Spanish. Next, the research team will ensure that those recruited into the study have a range of income levels and experiences across the many pathways formed by the four milestones (high school graduation, employment, marriage, and childbearing), as shown in Appendix C, Table C.1. Lastly, the research team will seek participant variation in demographics (gender, race, and ethnicity) and across the 10 U.S. Department of Health and Human Services geographic regions of the United States. Table C.1 shows that the planned 225 interviews will provide sufficient diversity of responses across the various possible milestone combinations to inform and contextualize the qualitative interview analyses.

The panel vendors will begin recruiting interview participants with a prescreen email to panel members (Appendix A: Success Sequence Recruitment Materials). Interested panel members will respond to the prescreen email by calling the vendor to complete a five-minute telephone screener (Instrument 1: Success Sequence Screener) that collects data on demographics (race, ethnicity, and gender), household income, U.S. geographic location, and the milestones achieved by each prospective participant. Due to higher rates of teen pregnancy among African American and Hispanic youth4 and the targeting of programming to these populations by FYSB grantees, the sample will overrepresent individuals from these racial and ethnic groups. Each week, the Mathematica research team will carefully review screener data and monitor the proportion of participants in each gender category, race and ethnic group, geographic region, income category, and set of milestones achieved. If the team notices imbalances, it will target the next week's recruitment to the groups with counts lower than desired; this targeted recruitment will require the vendors to call panel members directly, as needed, for screening. In advance of starting the interview, participants will read the consent form and acknowledge consent (Appendix B: Consent Form). Study participants will consist of a convenience sample of panel members who provide consent to participate.
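
As a complement to the weekly monitoring described above, the following sketch illustrates one way the comparison of screener counts against recruitment targets could be carried out. It is a hypothetical Python illustration under assumed data structures (a list of screener records and a dictionary of targets), not a description of the vendors' or Mathematica's actual systems.

# Hypothetical sketch: flag recruitment categories (for example, gender, region,
# or milestone pathway) whose screened counts fall short of their targets so the
# next week's outreach can focus on those groups. Data structures are assumed.

from collections import Counter

def flag_under_target(screened_participants, targets, key):
    """Return {category: shortfall} for categories still below target.

    screened_participants: list of dicts, one per screened panel member,
        for example {"gender": "female", "region": "Region 4"}.
    targets: dict mapping each category of the characteristic to a target count.
    key: the characteristic to check, such as "gender" or "region".
    """
    counts = Counter(p.get(key) for p in screened_participants)
    return {category: target - counts.get(category, 0)
            for category, target in targets.items()
            if counts.get(category, 0) < target}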

B3. Design of Data Collection Instruments

Development of Data Collection Instruments

ACF and Mathematica staff collaborated on development of the Success Sequence Screener and Interview Protocol (Instruments 1 and 2) by drawing on the findings from a literature review and secondary data analysis described in Supporting Statement Part A, Section A1. The screener includes questions on each of the success sequence milestones, income levels, and demographics to ensure variation in study participant background, geographic location, and economic self-sufficiency. Items in the screener on educational attainment and income are based on those used in the National Longitudinal Survey of Youth. The data collection instrument (Instrument 2: Success Sequence Interview Protocol) starts with a short set of closed-ended questions that allow more detailed follow-up questions to be tailored to individual respondents based on the relevant milestones in their lives.

We pilot tested the Success Sequence Interview Protocol with 4 adults to gain a clear understanding of participants’ comprehension, recall, judgment, and response strategies for each item. We then conducted a usability test with an additional 5 adults within the secure chat room, QualBoard, to test the questions and procedures within the platform. To ensure that the instrument was tested on a broad range of participants, we conducted a pre-test with 40 adults under the umbrella generic clearance, Pre-testing of Evaluation Data Collection Activities (OMB #0970-0355). During the pre-test, the research team carefully moderated each interview, paying special attention to respondents’ creative uses of text, such as punctuation and emojis, and adding follow-up questions where necessary to ensure common understanding and interpretation of responses. Participants were forthcoming and open, even on potentially sensitive and personal questions.


Both instruments have been translated into Spanish (Instruments 1 & 2, following the respective English versions) to ensure that Spanish speakers are recruited and included in the study.

The study design minimizes potential measurement error in three ways: (1) by providing anonymity for respondents on potentially sensitive items compared with in-person or telephone interviews; (2) by permitting study leads to oversee all interview moderators in the chat board, ensuring consistency in probes and follow-ups and clarifying unclear or ambiguous responses; and (3) by structuring the online platform to export a transcript of all comments to interviewers, limiting potential error in notetaking and transcription.

B4. Collection of Data and Quality Control

ACF is contracting with Mathematica for this data collection effort. The Mathematica research team has extensive experience in conducting qualitative data collection efforts with adult populations, particularly in earlier studies sponsored by the U.S. Department of Health and Human Services and other federal agencies.5

To conduct the interviews, we will use QualBoard, an online bulletin board that allows participants to see and respond to questions. Because it is not a real-time chat, participants may log on at any time that is convenient for them. Their responses are stored and used as an interview transcript that is accessible only to the research team.

The market research vendors will begin recruitment with a prescreen email to their panel members in both English and Spanish (Appendix A: Success Sequence Recruitment Materials). Interested participants will be screened over the phone using Instrument 1. Mathematica will monitor and select the study participants to ensure that participants vary by the key variables of interest: high school graduation, employment, marriage, and childbearing, with a targeted sample size for each combination of milestones completed that is proportional to the number of cases observed in the economic analysis (Appendix C, Table C.1).

Participants successfully recruited into the study will receive login credentials via email to access the QualBoard site (Appendix D: QualBoard Invitation Email Sample Screen Shot). When participants enter the QualBoard site, they will be asked to enter their first name, initials, or an alias, and not to provide their full name, in order to protect their privacy. The name they enter will be used as their display name in the chat board and can be seen only by the research team interviewers. An electronic consent form will be displayed upon login for completion by the participant before they begin their online interview (Appendix B: Consent Form). Participants will not be able to access any interview questions until they complete the consent form.

Interview questions will be programmed onto the chat board for study participants to answer at their convenience during a given data collection window of up to 3 days. The interview questions are grouped into four sections. Participants will type their responses to the posed questions. Mathematica interviewers will be trained to moderate the interviews and will follow an interview protocol guide for uniformity. Throughout the sessions, moderators will review responses. The project lead will review all moderators’ sessions for consistency in engagement with participants. When necessary, interviewers will encourage more in-depth responses and probe for elaboration on responses. When the moderator asks a follow-up question, the participant will receive a notification in QualBoard or an email notification (depending on set notification preferences) indicating a request for follow-up on his or her interview.

The interview protocol will be self-administered and can be completed in several sessions on QualBoard. We expect that responses and interviewer follow-up questions will take participants about 45 minutes to complete. After participants complete the initial set of questions, they will be encouraged to log back in at least once to review probes from the interviewer; they will receive an on-platform or email notification when interviewers leave probes. The board will remain open for three days. We will send participants their gift card for completing the interview questions.

Transcripts of all responses logged by participants will be exported from QualBoard as Excel files and/or PDFs at the end of the sessions to prepare for coding and analysis.

B5. Response Rates and Potential Nonresponse Bias

Response Rates

The interviews are not designed to produce statistically generalizable findings, and participation is wholly at the respondent’s discretion. Response rates will not be calculated or reported.

Nonresponse

As study participants will not be randomly sampled and findings are not intended to be representative, non-response bias will not be calculated. The research team will work with the vendors to purposively select a diverse group of study participants to ensure all interview questions are tested across varying participant life milestones, and that participants are demographically and geographically diverse. Respondent demographics will be documented and reported in written materials associated with the data collection effort.

The research team will examine item non-response as an indicator of the sensitivity of interview questions, and any item with non-response of 10 percent or higher will be reported in written materials.
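
The following sketch illustrates the item non-response check described above, flagging any item answered by fewer than 90 percent of the participants who received it. It is a hypothetical Python illustration under an assumed data structure (a dictionary of responses per item), not a prescribed analysis program.

# Hypothetical sketch: compute the item non-response rate for each interview
# question and flag items at or above the 10 percent reporting threshold.

def flag_high_item_nonresponse(responses_by_item, threshold=0.10):
    """responses_by_item maps an item label to a list of responses, one per
    participant who received the item, with None or "" indicating no response.
    Returns {item: nonresponse_rate} for items meeting or exceeding the threshold.
    """
    flagged = {}
    for item, responses in responses_by_item.items():
        if not responses:
            continue  # item was not asked of any participant
        missing = sum(1 for r in responses if r is None or str(r).strip() == "")
        rate = missing / len(responses)
        if rate >= threshold:
            flagged[item] = rate
    return flagged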

B6. Production of Estimates and Projections

The data will not be used to generate population estimates, for either internal use or dissemination.

B7. Data Handling and Analysis

Data Handling

Screener data will be shared by the vendors with the Mathematica research team during recruitment via a secure file transfer website that users access with individual usernames and passwords. The vendor data will include the screener responses for each participant and will not contain any personally identifiable information (PII); all participants will be identified by a unique ID number.

Interview data retrieved from QualBoard will be saved on a secure drive accessible only to the Mathematica research team. Direct export of the QualBoard data to the secure drive will result in minimal processing. Interviewers will review transcripts to correct spelling and grammar errors, fill in missing words, and explain unclear terms or phrases in preparation for qualitative coding and analysis.

Data Analysis

The research team will review the qualitative interview transcripts for overarching themes and lessons across each of the key topics explored through the virtual asynchronous discussions. The Mathematica research team will develop a coding scheme based on the research objectives and interview topics, which the team of interviewers who moderated the discussions will apply to the transcripts. The task lead for the interviews will monitor the coding across the team to ensure accuracy and consistency.

Data Use

The interview data will help provide ACF with a fuller understanding of the success sequence by uncovering what factors may be associated with economic self-sufficiency. ACF can use the information to guide current success sequence programming, introducing content and strategies that can best resonate with youth and allowing ACF to provide updated guidance to SRAE grantees as they work to incorporate the success sequence into their programs. As noted in Supporting Statement Part A, ACF will develop a memorandum that summarizes key themes and lessons, including a description of the study methods and of the limitations regarding generalizability and policy decisions. We may also develop a public-facing document that will make our findings available to grantees or stakeholders, as the findings may indicate new or differing ideas about factors related to the order in which the milestones are achieved and their relationship to economic self-sufficiency, shaping and improving grantees’ understanding of promising pathways for youth. Any shared information will include a discussion of the limitations (see Supporting Statement Part A, Section A2) and guidance on how the information should be used and interpreted.

B8. Contact Persons

In Table B.1, we list the federal and contract staff responsible for the study, including each individual’s affiliation and email address.

Table B.1. Individuals Responsible for Study

Name

Affiliation

Email address

Caryn Blitz

Office of Planning, Research, and Evaluation

Administration for Children and Families

U.S. Department of Health and Human Services

[email protected]

Kathleen McCoy

VPD Government Solutions

Office of Planning, Research, and Evaluation

Administration for Children and Families

U.S. Department of Health and Human Services

[email protected]

Susan Zief

Mathematica

[email protected]

Tiffany Waits

Mathematica

[email protected]

Jennifer Walzer

Mathematica

[email protected]

Maya Reid

Mathematica

[email protected]

Elizabeth Mugo

Mathematica

[email protected]





Attachments

Appendices

Appendix A: Success Sequence Recruitment Materials (English and Spanish)

Appendix B: Consent Form (English and Spanish)

Appendix C: Success Sequence Interview Study Target Sample

Appendix D: QualBoard Invitation Email Sample Screen Shot (English and Spanish)



Instruments

Instrument 1: Success Sequence Screener (English and Spanish)

Instrument 2: Success Sequence Interview Protocol (English and Spanish)

1 Inanc, H., A. Spitzer, and B. Goesling. “Assessing the Benefits of the Success Sequence for Economic Self-Sufficiency and Family Stability.” OPRE Report #2021-41. Washington, DC: Office of Planning, Research and Evaluation, Administration for Children and Families, U.S. Department of Health and Human Services.

2 Kreuter, F., S. Presser, and R. Tourangeau. “Social Desirability Bias in CATI, IVR, and Web Surveys: The Effects of Mode and Question Sensitivity.” Public Opinion Quarterly, vol. 72, no. 5, December 2008, pp. 847–865. https://doi.org/10.1093/poq/nfn063.


3 Inanc, H., A. Spitzer, and B. Goesling. “Assessing the Benefits of the Success Sequence for Economic Self-Sufficiency and Family Stability.” OPRE Report #2021-148. Washington, DC: Office of Planning, Research and Evaluation, Administration for Children and Families, U.S. Department of Health and Human Services.

4 https://www.cdc.gov/teenpregnancy/about/index.htm.

5 Pregnancy Assistance Fund Study (OMB Control Number 0990-0424), The Strengthening Relationship Education and Marriage Services Evaluation (OMB Control Number 0970-0481), Head Start Family and Child Experiences Survey 2019 (OMB Control Number 0970-0151), Middle Grades Longitudinal Study of 2017–18 (OMB Control Number 1850-0911), Evaluation of Demonstration Projects to End Childhood Hunger (OMB Control Number 0584-0603), Regional Partnership Grants National Cross-Site Evaluation and Evaluation Technical Assistance (OMB Control Number 0970-0444).
