Supporting Statement 2018, Part B: NYTD

National Youth in Transition Database (NYTD) and Youth Outcomes Survey - Final Rule

OMB: 0970-0340


B. Statistical Methods (used for collection of information employing statistical methods)

The agency should be prepared to justify its decision not to use statistical methods in any case where such methods might reduce burden or improve accuracy of results. When item 16 is checked "Yes," the following documentation should be included in the supporting statement to the extent that it applies to the methods proposed:


  1. Respondent Universe and Sampling Methods


The NYTD Data File instrument collects semi-annual information on all NYTD data elements regarding youth services, demographics, characteristics, and outcomes. Fifty-two respondent states collect this information on an ongoing basis. No statistical methods are used or required for this instrument other than those used for the Youth Outcome Survey, which is a component of the Data File. The potential respondent universe for the Youth Outcome Survey instrument consists of 17-year-olds who are in state foster care systems during a federal fiscal year, beginning in FY 2011, with a new cohort selected every three years thereafter (see 45 CFR 1356.81(b) and 1356.82(a)(2)). Youth who are incarcerated or institutionalized in a psychiatric facility or hospital are not part of the baseline population because they are not in foster care consistent with the definition found in 45 CFR 1355.20. According to NYTD data from FFY 2011, approximately 25,000 youth met the definition for baseline population membership in Cohort 1, and the national participation rate was 53%. In FFY 2017, approximately 24,500 youth were reported in the baseline population for NYTD Cohort 3; of these youth, 67% participated.


Depending on the number of actual baseline respondents in a state, the state may opt to sample respondents for the follow-up population after completing the baseline year of data collection. The sampling formula is regulated in 45 CFR 1356.84. The sampling universe consists of youth in a state’s baseline population who participated timely in outcomes data collection at age 17. Simple random sampling based on random numbers generated by a computer program is the required method unless another accepted methodology is approved by ACF. NYTD Technical Bulletin #5 specifies that ACF will draw the sample for each state that opts to sample, consistent with the requirements at 45 CFR 1356.84.
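To illustrate the required method, the sketch below shows how a follow-up sample might be drawn with computer-generated random numbers. It is a minimal example only: the record identifiers, cohort size, and sample size are hypothetical, and the actual sample size is determined by the formula in 45 CFR 1356.84, not by this code.

```python
import random

def draw_followup_sample(baseline_participants, sample_size, seed=None):
    """Draw a simple random sample of baseline participants for follow-up.

    baseline_participants: IDs of youth who participated timely in the
        baseline (age 17) outcomes data collection.
    sample_size: number of youth to follow at ages 19 and 21; illustrative
        here, since the actual size is set by the formula in 45 CFR 1356.84.
    """
    rng = random.Random(seed)  # computer-generated random numbers
    return rng.sample(baseline_participants, sample_size)

# Hypothetical usage: 1,200 baseline participants, 400 drawn for follow-up.
baseline_ids = [f"YOUTH{i:05d}" for i in range(1, 1201)]
followup_ids = draw_followup_sample(baseline_ids, 400, seed=2018)
print(len(followup_ids), "youth selected for the follow-up population")
```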


  2. Procedures for the Collection of Information


The NYTD Data File instrument collects semi-annual information on all NYTD data elements regarding youth services, demographics, characteristics, and outcomes. No statistical methods are used or required for this instrument other than those used for the Youth Outcome Survey, which is a component of the Data File. As stated in the response to B.1, states will conduct the Youth Outcome Survey on a three-year wave basis, with a new universe of 17-year-olds every three years. After states establish their baseline population cohorts, states that choose to sample will employ simple random sampling, or they may request ACF approval of another accepted sampling methodology. ACF will not accept proposals for non-probability sampling methodologies, but will consider stratified random sampling and other probability samples that generate reliable state estimates. The sampling universe will consist of the total number of youth in the baseline population who participated timely in the data collection at age 17.
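For states considering an alternative probability design, the following sketch outlines a proportionate stratified random sample of the general kind ACF would consider. The stratification variable, sampling fraction, and record layout are hypothetical and are shown only to make the distinction from non-probability sampling concrete.

```python
import random
from collections import defaultdict

def draw_stratified_sample(participants, strata_key, fraction, seed=None):
    """Draw a proportionate stratified random sample.

    participants: list of dicts describing baseline participants.
    strata_key:   field used to form strata (a hypothetical attribute
                  such as 'region'); must be known for every youth.
    fraction:     sampling fraction applied within each stratum.
    """
    rng = random.Random(seed)
    strata = defaultdict(list)
    for youth in participants:
        strata[youth[strata_key]].append(youth)

    sample = []
    for members in strata.values():
        n = max(1, round(len(members) * fraction))  # at least one per stratum
        sample.extend(rng.sample(members, n))
    return sample

# Hypothetical usage: stratify by an illustrative 'region' field.
participants = [{"id": i, "region": region}
                for i, region in enumerate(["North", "South", "East", "West"] * 300)]
sample = draw_stratified_sample(participants, "region", fraction=0.33, seed=7)
print(len(sample), "of", len(participants), "baseline participants sampled")
```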


States will administer to youth the survey located in Appendix B to the regulation. States have the discretion to conduct the survey via in-person interviews, computer-aided devices, phone interviews, or other methods as best suits their particular needs and populations. There are no dedicated resources under 42 USC 677 for states to devote to this data collection effort, and funds will likely come from a combination of funds that would otherwise be used for youth independent living services and other existing resources. Given these limited state resources, and our need for the data primarily as an administrative database and oversight tool, we declined to prescribe a particular survey method as is commonly done in research practice. Through technical assistance, ACF has encouraged states to use methods that are likely to achieve high response rates. Most states use in-person interviews or a combination of in-person and computer-aided techniques.


Through conference calls, site visits, national meetings, and written publications, ACF has provided technical assistance to states to encourage best practices in tracking youth and administering the survey regardless of the method chosen. Attached to this request package is ACF’s current guidance to states on administering the survey (see “Practical Strategies for Planning and Conducting the National Youth in Transition Database (NYTD) Youth Outcome Survey”). Technical assistance on sampling is provided by ACF statisticians, while assistance with tracking youth and administering the survey is provided through the Children’s Bureau or our technical assistance partner, the Child Welfare Capacity Building Center for States.



  3. Methods to Maximize Response Rates and Deal with Nonresponse


Our original expected response rates, devised in 2007, were modeled on the RR2 response rate (American Association for Public Opinion Research, 2006) and reflected our analysis of information from data collection efforts on former foster youth sponsored by ACF and states. While we had anticipated a 90% response rate from our baseline population of 17-year-old youth in foster care, only about half of such youth participated in the NYTD survey in FY 2011, and nearly 70% participated in FY 2014. Although this population is relatively easy to locate because the youth are in the placement and care responsibility of a state agency, some youth could not be located in time to take the NYTD baseline survey because they had run away from foster care, did not respond to an invitation to participate in the survey, or because a “gatekeeper” such as a foster parent or group home delayed the state in gaining access to the youth. In addition, a small percentage of youth declined to participate in the survey. In FY 2013, the only full year of outcomes data available from youth over age 17, states garnered the participation of approximately 70% of 19-year-olds. This response rate was in line with our initial estimates regarding the surveying of this highly mobile population of young people. While states continue to improve their survey methodology based on lessons learned from surveying the first NYTD Cohort, we believe our anticipated response rates remain suitable for our purpose, which is to have outcomes information that meets the statutory mandates at 42 USC 677, provides a perspective on how youth are faring as they prepare to leave foster care, and allows us to assess state performance of their independent living programs.
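For reference, AAPOR’s RR2 counts both complete and partial interviews as respondents. The sketch below computes RR2 from illustrative disposition counts; the counts are hypothetical (chosen only to echo the roughly 53% FY 2011 baseline participation) and do not reproduce actual NYTD disposition data.

```python
def rr2(complete, partial, refusals, non_contacts, other, unknown=0):
    """AAPOR Response Rate 2 (RR2): partial interviews count as respondents.

    RR2 = (I + P) / ((I + P) + (R + NC + O) + (UH + UO))
    where I = complete interviews, P = partial interviews, R = refusals,
    NC = non-contacts, O = other non-respondents, and UH + UO = cases of
    unknown eligibility.  All counts below are illustrative, not actual
    NYTD figures.
    """
    respondents = complete + partial
    return respondents / (respondents + refusals + non_contacts + other + unknown)

# Hypothetical baseline cohort of 25,000 eligible 17-year-olds.
rate = rr2(complete=12_500, partial=750, refusals=1_000,
           non_contacts=9_750, other=1_000)
print(f"RR2 = {rate:.1%}")  # -> RR2 = 53.0%
```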

Approaches to collecting the Youth Outcome Survey data vary, as states generally have selected the approach best suited to their needs and to the particular characteristics of their youth populations (collection of data in person, by telephone, using computer-aided devices, etc.). While each approach to data collection has the potential for non-response bias, response bias, and measurement error, there are also standards of practice for collecting data that address potential bias. The Children's Bureau has addressed these data quality threats through technical assistance products such as national conference calls, webinars, and publications like “Planning for the Mode of Administration for the Youth Outcome Survey” and “Surveying Youth with Special Needs or Limited English Proficiency.”

Survey researchers can and do use information on differential response rates to create weights that correct for bias in the data due to non-respondents (Holt & Elliot, 1991; Berg, 2005). In our analysis of Cohort 1 youth outcomes data, we employed a weighting methodology to correct for potential non-response bias in youth outcomes reported at ages 17 and 19. This weighting ensures that groups that differ in response behavior are represented by the members of those groups who did respond. Percentages reported in our latest data brief are weighted estimates. However, based on our analyses, the weighted results did not vary dramatically from the unweighted results, and the non-response bias corrections were small.
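The sketch below illustrates a weighting-class (cell) adjustment of the general kind discussed by Holt and Elliot (1991): respondents in each class receive a weight equal to the inverse of the class response rate, so they stand in for non-respondents in that class. The class labels and counts are hypothetical and are not the actual NYTD weighting specification.

```python
def weighting_class_adjustments(class_counts):
    """Compute non-response adjustment weights within weighting classes.

    class_counts maps each class label to (eligible, responded) counts.
    Each respondent in a class receives weight = eligible / responded,
    i.e., the inverse of the class response rate.
    """
    return {label: eligible / responded
            for label, (eligible, responded) in class_counts.items()}

# Illustrative classes (e.g., a hypothetical placement-type grouping).
counts = {"family_foster_home": (10_000, 7_000),
          "group_home":         (8_000,  4_000),
          "other_placement":    (7_000,  2_250)}
for label, weight in weighting_class_adjustments(counts).items():
    print(f"{label}: weight = {weight:.2f}")
```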

Measurement error also can occur due to a respondent's inability to understand certain questions. Because of the likelihood of a wide range of comprehension levels in the target population, federal staff have provided states with advice and technical assistance on this issue to ensure the most accurate collection of information from the target youth. We will continue to provide technical assistance, for example, on the use of prompts by interviewers to clarify the meaning of particular terms on the survey. We have also implemented a federal review protocol, the NYTD Assessment Review, which comprehensively evaluates a state’s NYTD implementation, including its survey methodology and instrument. To date, we have conducted nearly 18 such reviews and plan to continue visiting three to four states each year.


  4. Test of Procedures or Methods to be Undertaken


The NYTD Youth Outcome Survey was developed in consultation with practitioners, youth, and researchers in the field and was included as part of our rule-making for public comment. A pilot test conducted in August 2001 served as a field test of the draft data elements, definitions, and procedures. This test provided valuable information for assessing the data collection burden on the states. In each of the seven pilot states, caseworkers collected data about several older youth, identified any unclear definitions, and described any difficulties encountered while collecting data. Each pilot state also was asked to report the amount of effort required to collect the information. We used these responses to assess the burden for workers in our original proposed information collection request and to learn whether the capacity to report data varied significantly across agencies or states. Since the pilot, we have fully implemented the NYTD Youth Outcome Survey across 52 states and anticipate the addition of the US Virgin Islands in the near future. We have used the response rates recorded over the past few years to inform the FY 2018 information collection request.


Based on this input, we proposed a survey in our rule-making process that we believed was useful to the states and balanced the burden placed on the youth with the statutory mandates for data collection. Furthermore, related studies of youth aging out of foster care, including the Multi-Site Evaluation of Foster Youth Programs, the Midwest Evaluation of the Adult Functioning of Former Foster Youth, and the Northwest Foster Care Alumni Study, conducted much more extensive surveys and typically used more personal and sensitive questions while maintaining high response rates. On the basis of these studies and the public’s input on our rule-making, we expected that the survey as presented in the NYTD regulation would be easily understood and that its content and level of burden would not discourage participation. After nearly eight years of data collection and consultation with the field, we find that most youth have no trouble taking the NYTD survey when it is administered by a trained adult.


  5. Individuals Consulted on Statistical Aspects and Individuals Collecting and/or Analyzing Data


Tammy White (Office of Data, Research and Evaluation, Administration for Children, Youth and Families, 215-861-4004) was consulted on the statistical aspects of this information collection request. She also is primarily responsible for the analysis of the data associated with this information collection request.


References


Australian Institute of Family Studies (2007). "Growing Up in Australia: The Longitudinal Study of Australian Children." LSAC Discussion Paper No. 5, ISSN 1447-1566 (online).


Bureau of Labor Statistics (May 2014). National Industry-Specific Occupational Employment and Wage Estimates, NAICS 999200 - State Government (OES Designation). http://www.bls.gov/oes/current/naics4_999200.htm


Berg, N. (2005). Non-Response Bias. http://papers.ssrn.com/sol3/papers.cfm?abstract_id=1691967.


Courtney, M.E., Dworsky, A., Cusick, G.R., Keller, T., Havlicek, J., Perez, A., Terao, S. & Bost, N. (2007). Midwest evaluation of the adult functioning of former foster youth: Outcomes at age 21. Chicago, IL: Chapin Hall Center for Children at the University of Chicago.


Courtney, M.E., Dworsky, A., Cusick, G.R., Keller, T., Havlicek, J., Perez, A., Terao, S. & Bost, N. (2005). Midwest evaluation of the adult functioning of former foster youth: Outcomes at age 19. Chapin Hall Working Paper. Chicago, IL: Chapin Hall Center for Children at the University of Chicago.


Fowler, F. J. (1986). Survey Research Methods. Beverly Hills: Sage Publications.


Holt, D., & Elliot, D. (1991). Methods of Weighting for Unit Non-Response. The Statistician, 40, pp. 333-342.


Meyers, K., Webb, A., Frantz, J., & Randall, M. (2003). What does it take to retain substance-abusing adolescents in research protocols? Delineation of effort required, strategies undertaken, costs incurred, and 6-month post-treatment differences by retention difficulty. Drug and Alcohol Dependence, 69, pp. 73-85.


Pergamit, M.R. (2012). Locating and Engaging Youth after They Leave Foster Care: Experiences Fielding the Multi-Site Evaluation of Foster Youth Programs. Washington, DC: Urban Institute.


Statistics Canada (2006). National Longitudinal Survey of Children and Youth, Cycle 6 – User Guide. Ottawa, Canada.


Turner, C.F., & Martin, E. (eds). (1984). Surveying Subjective Phenomena. New York: Basic Books.

U.S. Department of Health and Human Services, Administration for Children and Families. (2007). Multi-Site Evaluation of Foster Youth Programs: Evaluation of the Life Skills Training Program. Washington, D.C.

