

National Youth in Transition Database (NYTD)



OMB Information Collection Request

0970-0340



Supporting Statement Part B –

Statistical Methods

MARCH 2022

Submitted By:

Telisa Burt

Children’s Bureau

Administration for Children and Families

U.S. Department of Health and Human Services



  1. Respondent Universe and Sampling Methods

The NYTD Data File instrument collects semi-annual information on all NYTD data elements regarding youth services, demographics, characteristics, and outcomes. Fifty-two respondent states collect this information on an ongoing basis. No statistical methods are used or required for this instrument other than those used for the Youth Outcome Survey, which is a component of the State Data File. The potential respondent universe for the Youth Outcome Survey instrument consists of 17-year-olds who are in state foster care systems during a federal fiscal year, beginning in FY 2011, with a new cohort selected every three years thereafter (see 45 CFR 1356.81(b) and 1356.82(a)(2)). Youth who are incarcerated or institutionalized in a psychiatric facility or hospital are not part of the baseline population because they are not in foster care consistent with the definition found in 45 CFR 1355.20. Based on past trends in NYTD data, we anticipate approximately 23,000 youth will join the next cohort’s baseline population in FFY 2023.


Depending on the number of actual baseline respondents in a state, the state may opt to sample respondents for the follow-up population after completing the baseline year of data collection. The sampling universe consists of youth in a state’s baseline population who participated in the outcomes survey collection at age 17. Simple random sampling based on random numbers generated by a computer program is the required method unless ACF approves another accepted methodology. It is important to note that ACF will not accept proposals for non-probability sampling methodologies, but will consider stratified random sampling and other probability samples that generate reliable state estimates. NYTD Technical Bulletin #5 provides further guidance on the sampling process, consistent with the regulatory requirements found at 45 CFR 1356.84.
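To make the required procedure concrete, the sketch below illustrates how a state might draw a simple random sample of its baseline survey participants for the follow-up population using computer-generated random numbers. It is an illustration only, not an ACF-prescribed tool; the record identifiers, sample size, and seed are hypothetical.

```python
# Illustrative sketch only: simple random sampling of the follow-up population
# from the list of baseline survey participants, using computer-generated
# random numbers. Record identifiers, sample size, and seed are hypothetical.
import random

def select_followup_sample(baseline_participant_ids, sample_size, seed=None):
    """Return a simple random sample (without replacement) of youth record IDs."""
    rng = random.Random(seed)  # seed can be recorded so the draw is reproducible
    if sample_size >= len(baseline_participant_ids):
        return list(baseline_participant_ids)  # sample everyone if the frame is small
    return rng.sample(list(baseline_participant_ids), sample_size)

# Example usage with fictitious record numbers:
baseline_ids = [f"youth-{i:05d}" for i in range(1, 501)]  # 500 baseline respondents
followup_sample = select_followup_sample(baseline_ids, sample_size=300, seed=2023)
print(len(followup_sample), "youth selected for the follow-up population")
```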



  2. Procedures for the Collection of Information

The NYTD Data File instrument collects semi-annual information on all NYTD data elements regarding youth services, demographics, characteristics, and outcomes. No statistical methods are used or required for this instrument other than those used for the Youth Outcome Survey, which is a component of the State Data File. As stated in the response to B.1, states conduct the Youth Outcome Survey on a three-year wave basis, with a new universe of 17-year-olds every three years. Follow-up surveys are administered to baseline youth at ages 19 and 21 to assess long-term outcomes and experiences for youth currently and formerly in care.


States are required to use the survey questions as outlined in regulation; however, some states have also added clarifying questions to their surveys. Those questions are not reported to the program office through NYTD. States have the discretion to conduct the surveys via in-person interviews, computer-aided devices, tablets, phone interviews, or other methods as suits their particular needs and populations. There are no dedicated resources under 42 USC 677 for states to devote to this data collection effort, and funds will likely come from a combination of funds that would otherwise be used for youth independent living services and other existing resources. Given these limited state resources, and our need for the data primarily as an administrative database and oversight tool, we declined to prescribe a particular survey method as is common in research practice. Through technical assistance, ACF has encouraged states to use methods that are likely to achieve high response rates. Most states use in-person interviews or a combination of in-person and computer-aided techniques.


Through conference calls, site visits, national meetings, and written publications, ACF has provided technical assistance to states to encourage best practices in tracking youth and administering the survey, regardless of the method chosen. Technical assistance on sampling is provided by ACF statisticians, while assistance with tracking youth and administering the survey is provided through the Children’s Bureau or our technical assistance partners, such as the Child Welfare Capacity Building Center for States.



  3. Methods to Maximize Response Rates and Deal with Nonresponse

Our original expected response rates, devised in 2007, were modeled on the RR2 response rate definition (American Association for Public Opinion Research, 2006) and reflected our analysis of information from data collection efforts on former foster youth sponsored by ACF and states. While we had anticipated a 90% response rate from our baseline population of 17-year-old youth in foster care, only about half (53%) of such youth participated in the NYTD survey in FY 2011. Although this population is easy to locate because they are in the placement and care responsibility of a state agency, states reported that, in that first year of data collection, some youth could not be located in time to take the NYTD baseline survey because they had run away from foster care, did not respond to an invitation to participate in the survey, or because a “gatekeeper” such as a foster parent or group home delayed the state in gaining access to the youth. Since FY 2011, response rates have improved for both the baseline and follow-up populations. The most recent complete cohort data (FY 2014, FY 2016, FY 2018) reveal that 69% of 17-year-olds, 72% of 19-year-olds, and 64% of 21-year-olds participated in the NYTD survey. While states continue to improve their survey methodology based on lessons learned from surveying the first NYTD Cohort, we believe our anticipated response rates remain suitable for our purpose, which is to have outcomes information that meets the statutory mandates at 42 USC 677, provides a perspective on how youth are faring as they prepare to leave foster care, and supports assessment of state performance of their independent living programs.
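For readers unfamiliar with the AAPOR definition, RR2 counts both complete and partial interviews as respondents and divides by all eligible and unknown-eligibility cases. The sketch below illustrates the calculation in a simplified form (without the estimated-eligibility adjustment used in other AAPOR rates); the counts are hypothetical and are not NYTD data.

```python
# Illustrative sketch of the AAPOR RR2 response rate: complete plus partial
# interviews divided by all eligible and unknown-eligibility cases.
# The counts below are hypothetical, not NYTD data.
def rr2(complete, partial, refusals, non_contacts, other, unknown=0):
    respondents = complete + partial
    denominator = respondents + refusals + non_contacts + other + unknown
    return respondents / denominator

# Example: 530 completes and 10 partials out of 1,000 youth in the baseline frame
print(f"RR2 = {rr2(530, 10, 120, 300, 40):.0%}")  # prints "RR2 = 54%"
```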


Approaches to collecting the Youth Outcome Survey data vary as states generally have selected the most appropriate approach to meet the needs of the state and the particular characteristics of the state's youth population (collection of data in-person, by telephone, using computer-aided devices, etc.). While each approach to data collection has the potential for non-response bias, response bias, and measurement error, there also are standards of practice for collecting data to address potential bias. The Children's Bureau has addressed these data quality threats through technical assistance products such as national conference calls, webinars and publications like “Planning for the Mode of Administration for the Youth Outcome Survey” and “Surveying Youth with Special Needs or Limited English Proficiency.”

Survey researchers can and do use information on differential response rates to create weights that correct for bias in the data due to non-respondents (Holt and Elliot, 1991; Berg, 2002). In our analysis of Cohort 1 youth outcomes data, we employed a weighting methodology to correct for potential non-response bias in youth outcomes reported at ages 17 and 19. This weighting ensures that groups that differ in response behavior are represented by the members of those groups who did respond. Percentages reported in our latest data brief are weighted estimates. However, based on our analyses, the weighted results did not vary dramatically from the unweighted results, and the non-response bias corrections were small.
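As a general illustration of this weighting-class approach (not the specific procedure used in our Cohort 1 analysis), each respondent can be assigned a weight equal to the inverse of the response rate observed in his or her group, so that groups with lower participation are not under-represented in the weighted estimates. The group labels and counts below are hypothetical.

```python
# Illustrative weighting-class (non-response) adjustment: each respondent is
# weighted by the inverse of the response rate observed in his or her group,
# so groups that responded at lower rates still contribute their full share
# to weighted estimates. Group labels and counts are hypothetical.
def nonresponse_weights(eligible_by_group, respondents_by_group):
    """Return a weight per group: eligible count divided by respondent count."""
    weights = {}
    for group, eligible in eligible_by_group.items():
        respondents = respondents_by_group.get(group, 0)
        if respondents:  # groups with no respondents would need to be collapsed
            weights[group] = eligible / respondents
    return weights

eligible = {"in care, urban": 400, "in care, rural": 200, "runaway status": 100}
responded = {"in care, urban": 320, "in care, rural": 150, "runaway status": 40}
for group, w in nonresponse_weights(eligible, responded).items():
    print(f"{group}: weight = {w:.2f}")
```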

Measurement error also can occur due to the respondent's inability to understand certain questions. Because of the likelihood of a wide range of comprehension levels in the target population, the federal staff have provided states with advice and technical assistance in dealing with this issue to ensure the most accurate collection of the information from the target youth. We will continue to provide technical assistance, for example, on the use of prompts by interviewers to clarify the meaning of particular terms on the survey. We have also implemented a federal onsite review protocol that comprehensively evaluates the state’s NYTD implementation including its survey methodology and instrument.



  4. Test of Procedures or Methods to be Undertaken

The NYTD Youth Outcome Survey was developed in consultation with practitioners, youth, and researchers in the field and was included as part of our rule-making for public comment. A pilot test conducted in August 2001 served as a field test of the draft data elements, definitions, and procedures and provided valuable information for assessing the data collection burden on the states. In each of the seven pilot states, caseworkers collected data about several older youth, identified any unclear definitions, and described any difficulties encountered while collecting data. Each pilot state also was asked to report the amount of effort required to collect the information. We used these responses to assess the burden for workers in our original proposed information collection request and to learn whether the capacity to report data varied significantly across agencies or states. Since the pilot, we have begun to fully administer the NYTD youth outcomes survey in 52 states, and we anticipate the addition of the US Virgin Islands in the near future. We have used the response rates recorded over the past few years to inform the FY 2021 information collection request.


Based on this input, we proposed a survey in our rule-making process that we believed was useful to the states and balanced the burden placed on the youth with the statutory mandates for data collection. Furthermore, related studies of youth aging out of foster care, including the Multi-Site Evaluation of Foster Youth Programs, the Midwest Evaluation of Adult Functioning, and the Northwest Foster Care Alumni Study, conducted much more extensive surveys and typically used more personal and sensitive questions while maintaining high response rates. On the basis of these studies and the public’s input on our rule-making, we expected that the survey as presented in the NYTD regulation would be easily understood and that its content and level of burden would not discourage participation. After 10 years of data collection and consultation with the field, we feel that most youth have no trouble taking the NYTD survey when it is administered by a trained adult.



  5. Individuals Consulted on Statistical Aspects and Individuals Collecting and/or Analyzing Data

Tammy White (Office of Data, Research and Evaluation, Administration for Children, Youth and Families, 215-861-4004) was consulted on the statistical aspects of this information collection request. She also is primarily responsible for the analysis of the data associated with this information collection request.


