NYTD final suppt stmt FEB 2008 (II)

National Youth in Transition Database (NYTD) and Youth Outcomes Survey - Final Rule

OMB: 0970-0340


SUPPORTING STATEMENT

National Youth in Transition Database

Final Rule


A. Justification. Requests for approval shall:


1. Circumstances Making the Collection of Information Necessary



Public Law 106-169, enacted December 14, 1999, established the John H. Chafee Foster Care Independence Program (CFCIP) at section 477 of the Social Security Act (the Act). The law provides States with flexible funding to carry out programs that assist young people in making the transition from foster care to self-sufficiency. The law requires the Administration for Children and Families (ACF) to develop a data collection system to track the independent living services States provide to youth and develop outcome measures that may be used to assess States’ performance in operating their independent living programs. We are implementing this requirement by creating a collection of information entitled the National Youth in Transition Database (NYTD).


2. Purpose and Use of the Information Collection



This is a new collection of information. We will use this information to track independent living services, assess the collective outcomes of youth, and potentially to evaluate State performance with regard to those outcomes consistent with the law’s mandate. The services that States will report are those that are traditionally offered to young adults in preparation for emancipation. States will be able to use the information to design their independent living programs to better help youth transition to independence.


3. Use of Improved Information Technology and Burden Reduction


States will submit NYTD data to the Department in an electronic format. Although ACF has not determined the exact method of transmission, ACF will explore the possibility of using Extensible Markup Language (XML) and other recent technologies for file transmission, consistent with the E-Government Act of 2002 (Public Law 107-347).
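Because ACF has not finalized a transmission format, the following Python sketch is purely illustrative of what an XML-based youth record could look like; the element names and layout are assumptions for illustration, not a prescribed NYTD schema.

```python
import xml.etree.ElementTree as ET

# Hypothetical NYTD youth record; element names are illustrative only,
# since ACF has not finalized the transmission format.
record = ET.Element("youthRecord")
ET.SubElement(record, "recordNumber").text = "A1B2C3"    # encrypted ID
ET.SubElement(record, "reportingPeriod").text = "2011A"  # semi-annual period
services = ET.SubElement(record, "independentLivingServices")
ET.SubElement(services, "service").text = "academic support"

# Serialize for electronic transmission.
xml_bytes = ET.tostring(record, encoding="utf-8")
```

A State system could generate one such record per youth per reporting period and bundle them into a single submission file.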


4. Efforts to Identify Duplication and Use of Similar Information


Congress specifically mandated that we collect data on independent living services and youth outcomes. We have analyzed the sources and reporting instruments already in use in the collection of independent living information and concluded that the type of data in the NYTD final rule is not collected elsewhere. Specifically, we examined the data sources of both Federal and non-Federal agencies such as:

  • Federal—Adoption and Foster Care Analysis and Reporting System (AFCARS), and Runaway and Homeless Youth Management Information System (RHYMIS)

  • State, county, and local governments with integrated and/or complementary data systems such as Statewide Automated Child Welfare Information System (SACWIS)

  • Non-governmental organizations—Casey Family Programs, United Way, Lutheran Social Services, Catholic Social Services, Child Welfare League of America, and Public/Private Ventures.

AFCARS is a Federal system for data collection on youth who are in foster care or were adopted under the auspices of State child welfare agencies. Many youth who will be reported by States to the NYTD will also be reported to AFCARS. However, youth receiving independent living services may not be in foster care, and the same is true of some youth for whom the State is required to report outcome information to NYTD. Also, NYTD and AFCARS have separate and different authorizing statutes and penalty structures that do not lend themselves to combining the databases.


5. Impact on Small Businesses or Other Small Entities


This information collection is required of State agencies only and does not impact small businesses or other small entities.


6. Consequences of Collecting the Information Less Frequently



This data collection is mandated by law; therefore, we would be out of compliance with the statutory requirements if we did not collect independent living service and youth outcome information. We believe this approach advances our goal of understanding how young people who are likely to age out of foster care, or who have already aged out, are faring.


In the final rule, we require that States submit NYTD data to ACF every six months, which is the same reporting frequency as AFCARS. We believe that any less frequent reporting may increase the risk of States reporting inaccurate or missing data. Further, we chose a semi-annual reporting period to preserve our ability to analyze NYTD data along with AFCARS data for the same youth.


We received a number of comments suggesting a move to an annual reporting cycle to reduce the burden on States. Some commenters believed that an annual report period would ease the burden of reporting data for States and simplify penalty and outcome calculations for Federal officials. To keep the cycles aligned with AFCARS, some commenters suggested moving AFCARS to an annual report period as well.


We declined to make the suggested change for several reasons. First, as stated in the preamble to the proposed rule, we considered a 12-month reporting period, but believed that a longer period increases the risk of inaccurate or missing data.  Second, since we want to preserve our ability to analyze NYTD data along with AFCARS data, we want comparable reporting periods.  The six-month report period for AFCARS is integral to a number of ACF priorities and legislative requirements.  Further, the six-month reporting period is required in regulation (see section 1355.40) and changing this timeframe is outside the scope of this regulation.




7. Special Circumstances Relating to the Guidelines of 5 CFR 1320.5



There are no special circumstances that require the collection of this information in a manner other than that prescribed by OMB.


8. Comments in Response to the Federal Register Notice and Efforts to Consult Outside the Agency


We published the information requirement in the Notice of Proposed Rulemaking (NPRM) published in the Federal Register on July 14, 2006 (71 FR 40346 - 40382). We included the information collections in the NPRM specifically to allow for public notice and comment. We received many comments that disagreed with our burden estimates and asserted that we underestimated the actual amount of staff time required to collect and record services, demographic, and characteristics data; track the whereabouts of the youth in the follow-up population; garner youth consent; and collect and report outcome survey data. A couple of commenters believed that a more realistic estimate for collecting outcomes data from youth would be between 30 and 60 minutes. Another commenter estimated it would take one quarter hour for staff just to report outcome data, with the collection of such data requiring additional time. A final commenter believed that we should pilot test the final NYTD to develop better estimates of hour burden.


We have reexamined our initial estimates related to collecting and recording services, demographic, and characteristics data, as well as the outcome data. The estimate of 30 minutes of staff time per youth per reporting period to collect and record services, demographic, and characteristics data is based on a pilot test and on experience with AFCARS and other data systems. Most, if not all, of the information required for these three areas should be readily available either in the case file or through the case worker. We do not expect that the time spent collecting and recording this information for each youth will take longer than 30 minutes on average. We do agree with the commenters that the NPRM estimate for completing the youth outcomes survey is low and have adjusted our estimate upward to account for the time necessary for the youth to complete the survey and for the State to incorporate that information into its database. In reexamining the number of questions on the outcomes survey, we estimate that it will take approximately one half hour to complete the survey and 15 more minutes for the State to record that information.



The Department consulted with the stakeholders required by law before developing the NPRM. To meet the consultation requirement, we conducted discussion groups with key stakeholders across the country. Some data elements and data collection procedures conceptualized after consultation were pilot tested in August 2001. HHS developed the data elements and data collection procedures in the proposed rule after completing consultation and pilot testing. The final rule refines the requirements based on public notice and comment.


9. Explanation of Any Payment or Gift to Respondents


No payments, other than Federal financial participation, will be made to States for the maintenance and development of an information system.


10. Assurance of Confidentiality Provided to Respondents



The final rule requires States to use an encrypted personal identification number so the identity of the individual youth remains confidential to anyone other than the State.

11. Justification for Sensitive Questions



Congress specifically required in law that States collect outcome information on issues of a sensitive nature, including rates of non-marital childbirth, incarceration, and high-risk behaviors. Therefore, we pose questions in the youth survey that ask about these areas of a youth's life. The survey is voluntary, and we expect States to obtain youth or parental consent, as appropriate.


12. Estimates of Annualized Burden Hours and Costs



During negotiations with OMB regarding approval of this information collection request, ACF was given approval for eighteen months from the effective date of the final rule. That is, the information collection is approved for April 2008 through October 2009. However, the final rule requires States to begin collecting data as of October 1, 2010 and report information to ACF as of May 15, 2011. Therefore, we do not anticipate any information collection burden on States prior to the conclusion of the 18-month approved period.



13. Estimates of Other Total Annual Cost Burden to Respondents and Record keepers



As stated above, during negotiations with OMB regarding approval of this information collection request, ACF was given approval for eighteen months from the effective date of the final rule. That is, the information collection is approved for April 2008 through October 2009. However, the final rule requires a State to implement and comply with the rule no later than October 1, 2010 and report information to ACF as of May 15, 2011. Therefore, we do not anticipate any information collection burden on States prior to the conclusion of the 18-month period, nor will there be an annual cost burden during the initial approved period of this information collection.


14. Annualized Cost to the Federal Government



We estimate the total Federal cost per year will be approximately $1 million for a contractor to design and implement a Federal computer system to receive and analyze State data, provide technical assistance (TA) to the States via telephone and on site, and run three nationwide TA meetings. The Federal share of the State annual cost burden for NYTD is approximately $6.5 million.


15. Explanation of Program Changes or Adjustments


This is a new information collection.


16. Plans for Tabulation and Publication and Project Time Schedule



The final rule requires States to begin collecting data as of October 1, 2010 and report information to ACF as of May 15, 2011. We plan to make some data reports available on the Children's Bureau website. We cannot finalize a schedule for analyzing and publishing the data until after we receive State data.


17. Reason(s) Display of OMB Expiration Date is Inappropriate



The Department seeks OMB approval to display the expiration date in the regulation only as we are not regulating a specific form for the States to use in conducting the youth outcomes survey or to transmit data to us. In addition, as noted earlier, OMB has approved this information collection only through October 2009. We do not anticipate receiving data from States prior to May 15, 2011.


18. Exceptions to Certification for Paperwork Reduction Act Submission



There are no exceptions to the certification statement.


B. Collections of Information Employing Statistical Methods


The agency should be prepared to justify its decision not to use statistical methods in any case where such methods might reduce burden or improve accuracy of results. When item 16 is checked "Yes," the following documentation should be included in the supporting statement to the extent that it applies to the methods proposed:


1. Describe (including numerical estimate) the potential respondent universe and any sampling or other respondent selection method to be used. Data on the number of entities (e.g., establishments, State and local government units, households or persons) in the universe covered by the collection and in the corresponding sample are to be provided in tabular form for the universe as a whole and for each of the strata in the proposed sample. Indicate expected response rates for the collection as a whole. If the collection had been conducted previously, include the actual response rate achieved during the last collection.


The Chafee Foster Care Independence Act of 1999, as provided in 42 U.S.C. 677(f), requires the Secretary to develop a data collection system that can track educational attainment, employment, avoidance of dependency, incarceration, and other outcomes of foster youth and former foster youth, and that can be used to assess State performance in operating independent living programs. The law requires that States comply with the data collection requirement or be subject to financial penalties with respect to Federal Chafee Program funds.


As one component of this data collection effort known as the National Youth in Transition Database, ACF proposed a rule (71 FR 40346) that would require States to collect certain outcomes information from youth directly on a three-year wave basis and report this information to ACF. States will administer the survey to 17-year old youth and then again at ages 19 and 21. The NYTD requirements will be codified at 45 CFR 1356.81 – 1356.86 and appendices.


The potential respondent universe consists of 17-year-olds who are in State foster care systems during a Federal fiscal year, beginning in FY 2011, with a new cohort selected every three years thereafter. States will obtain this information from data submitted to the Adoption and Foster Care Analysis and Reporting System (AFCARS), a mandatory data collection system described at 45 CFR 1355.40. The respondent universe includes foster care youth who receive independent living services and those who do not, and is defined in 45 CFR 1356.81(b) and 1356.82(a)(2). Youth incarcerated or institutionalized in a psychiatric facility or hospital would not be a part of the baseline population because they are not in foster care according to the definition in 45 CFR 1355.20. According to data from AFCARS, approximately 40,000 youth in foster care at the end of fiscal year 2005 met the criteria for inclusion in the respondent universe.


States must attempt to administer an outcomes survey to all youth in the baseline population of 17-year-olds in foster care, unless the youth is mentally or physically incapacitated, incarcerated, or deceased (excluded groups). States unable to obtain information on a youth in the baseline reporting population must indicate in their data file the reason why the information is not available, such as the youth refused participation. Youth who have run away from their foster care setting for the 45-day time span following their 17th birthday would be a part of the baseline population, but a State could report the youth as having run away in the outcomes reporting status element to explain why that youth's information was not collected (see section 1356.83(g)(34)). The youth who participate in the outcomes survey as part of the baseline become the follow-up population as defined in 45 CFR 1356.81(c) and 1356.82(a)(3).


Depending on the number of actual baseline respondents in a State, the State may choose to sample respondents for the follow-up population. The sampling formula is regulated in 45 CFR 1356.84. The sampling universe consists of youth in the State's baseline population who participated in the State agency's data collection at age 17. A State that chooses to sample must use simple random sampling procedures based on random numbers generated by a computer program, unless another accepted methodology is approved by ACF. States that choose to sample will do so following the baseline-year data collection. States report to ACF in their data file the identifiers of the youth in the sample.
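As a sketch of this sampling step, a State might draw its follow-up population with a computer-generated simple random sample along the following lines; the identifiers, sample size, and seed below are hypothetical, and the regulation, not this sketch, governs what a State must actually do.

```python
import random

def draw_follow_up_sample(baseline_ids, sample_size, seed=None):
    """Simple random sample, without replacement, of youth who
    participated in the baseline data collection at age 17."""
    rng = random.Random(seed)  # computer-generated random numbers
    return sorted(rng.sample(baseline_ids, sample_size))

# Hypothetical encrypted identifiers for a 500-youth baseline population.
baseline = ["Y%04d" % i for i in range(1, 501)]
sample = draw_follow_up_sample(baseline, sample_size=150, seed=2011)
```

The sorted list of sampled identifiers is what the State would then report to ACF in its data file.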


Based on our understanding of the population and similar efforts to obtain outcomes data on this population (see Table 1, Multi-State Surveys on Foster Youth Outcomes, 2001-2007), we estimate that response rates will be 90% for 17-year-olds, 80% for 19-year-olds, and 60% for 21-year-olds. To comply with the NYTD, States must achieve response rates of 60% for the 19- or 21-year-olds who have aged out of foster care.


2. Describe the procedures for the collection of information including:


  • Statistical methodology for stratification and sample selection,

  • Estimation procedure,

  • Degree of accuracy needed for the purpose described in the justification,

  • Unusual problems requiring specialized sampling procedures, and

  • Any use of periodic (less frequent than annual) data collection cycles to reduce burden.



TABLE 1: Multi-State Surveys on Foster Youth Outcomes, 2001-2007

Survey: 2007. Multi-site Evaluation of Foster Youth Programs: Evaluation of the Life Skills Training Program, Los Angeles County, CA: Final Report to U.S. DHHS ACF. (The Urban Institute, Chapin Hall Center for Children, and the National Opinion Research Center)

Methods:
  • Data: Los Angeles County, CA Department of Children and Family Services; 17-year-olds in foster care receiving IL/Life Skills Training.
  • Letters to remind youth about follow-up.
  • Mostly in-person interviews, with telephone interviews as necessary; multiple survey instruments.

Response rates:
  • Baseline: N = 467; 97% response rate
  • 1st follow-up at 18 years old: 91% response rate
  • 2nd follow-up at 19 years old: 88% response rate

Survey: 2007. Midwest Evaluation of the Adult Functioning of Former Foster Youth: Outcomes at Age 21. Courtney, M., et al. (Chapin Hall Center for Children); and 2005. Midwest Evaluation of the Adult Functioning of Former Foster Youth: Outcomes at Age 19. Courtney, M., et al. (a Chapin Hall Working Paper)

Methods:
  • Data: Youth identified from public child welfare agency records in IA, WI, and IL.
  • In-person interviews conducted at baseline and at ages 19 and 21.
  • Audio Computer-Assisted Self-Interviewing (ACASI) used for sensitive portions of the interview.

Response rates:
  • Baseline: N = 736; 96% response rate
  • 1st follow-up interviews at age 19 in 2004 (N = 603): 82% response rate
  • 2nd follow-up interviews at age 21 in 2006 (N = 591): 81% response rate

Survey: 2005. Improving Family Foster Care: Findings from the Northwest Alumni Study. Pecora, P., et al. (Casey Family Programs)

Methods:
  • Data: Youth who had been in foster care between 1988 and 1998, identified from several programs' administrative data.
  • Combination of in-person and phone interviews conducted by the University of Michigan Survey Research Center.
  • Incentives paid.

Response rates:
  • Total sample: N = 659
  • N = 479 youth interviewed in 2000 and 2001 (average age 24 at time of interview): 73% response rate

Survey: 2001. Foster Youth Transitions to Adulthood: A Longitudinal View of Youth Leaving Care. Courtney, M., et al., in Child Welfare, 80(6), 685-717.

Methods:
  • Data: WI Human Services Reporting System.
  • Youth who had been in care at least 18 months and were 17-18 years old.
  • Voluntary participation, with a cash incentive.
  • Interviews in person or by telephone; multiple survey instruments.

Response rates:
  • Baseline (before leaving care): N = 149 selected, 141 interviewed; 95% response rate
  • 1st follow-up period (12-18 months after leaving care): 113 interviewed; 80% response rate



As stated in the response to B.1, States will conduct the outcomes survey on a three-year wave basis, starting with a new universe of 17-year-olds every three years. States that choose to sample will employ simple random sampling, or they may request that ACF approve another accepted sampling methodology. ACF will not accept proposals for non-probability sampling methodologies, but will consider stratified random sampling and other probability samples that generate reliable State estimates. The sampling universe will consist of the total number of youth in the baseline population who participated in the data collection at age 17.


States will administer to youth the survey located in appendix B to the regulation. States have the discretion to conduct the survey via in-person interviews, computer-aided devices, phone interviews, or other methods as suit their particular needs and populations. Through technical assistance, ACF will also encourage States to use methods that are likely to achieve high response rates. We anticipate that most States will use in-person interviews or a combination of in-person interviews and computer-aided devices, given the recent experiences of the Multi-Site Evaluation of Foster Youth Programs, the Midwest Evaluation of the Adult Functioning of Former Foster Youth: Outcomes at Age 21, and the Educational and Employment Outcomes of Adults Formerly Placed in Foster Care: Results from the Northwest Foster Care Alumni Study, which indicated high response rates with these approaches.


We determined through consultation and the public notice and comment process that this degree of variation in modalities of survey administration was acceptable for our needs. There are no dedicated resources under 42 USC 677 for States to devote to this data collection effort and funds will likely come from a combination of funds that would otherwise be used for youth independent living services and other existing resources. Given these limited State resources, and our need for data primarily as an administrative database and oversight tool, we declined to prescribe a particular survey method as is used in research practices. Although the youth outcomes information is obtained through a survey instrument, we consider the information to be State administrative data, similar to our data collection through AFCARS. In AFCARS, we do not prescribe how a State must obtain child-specific information. The State’s Chafee data will have the same utility for our program needs regardless of the State’s collection methods as long as it meets the regulatory requirements.


We will provide technical assistance to States to encourage best practices in survey methodology and in tracking youth to minimize respondent attrition over time. See the attachment, Practical Strategies for Tracking and Locating Youth, as an example. Technical assistance on sampling will be provided primarily by in-house statisticians; assistance on tracking and survey methods will be provided through our National Resource Center partners. Supplemental material providing guidance to ensure that States introduce youth to the survey, administer the survey, and follow up with youth consistently, regardless of the method chosen, will be issued by ACF and/or its technical assistance partners prior to the implementation date of NYTD.


3. Describe methods to maximize response rates and to deal with issues of nonresponse. The accuracy and reliability of information collected must be shown to be adequate for intended uses. For collections based on sampling, a special justification must be provided for any collection that will not yield "reliable" data that can be generalized to the universe studied.


Our anticipated response rates are modeled on the RR2 response rate (American Association for Public Opinion Research, 2006) and reflect our analysis of information from data collection efforts on former foster youth sponsored by ACF and States. As we indicate above, we anticipate a 90% response rate from 17-year-old youth in foster care, which is our baseline population. This population is easy to locate since these youth are in the placement and care responsibility of a State agency. However, we anticipate that some youth will not be able to be located because they have run away from foster care and, in addition, a small percentage of youth may decline to participate.


For 17-year-olds (the baseline population), the response rate numerator will be the number of 17-year-old foster youth who complete or partially complete the outcomes survey. The response rate denominator is the total number of 17-year-old youth in foster care minus youth who are not eligible to be interviewed because they are mentally or physically incapacitated, deceased or incarcerated. This response rate will not be used to determine compliance.


We anticipate that States will achieve response rates of 80% for youth at age 19 and 60% for youth at age 21. As seen in Table 1, these rates have been demonstrated to be appropriate expectations for gathering data from this population. The regulatory participation rates that States must meet, or else enter into corrective action, are 60% for 19- and 21-year-old youth who are no longer in foster care. This population is more mobile than youth in foster care at age 17. However, because States now have Federal funds through 42 USC 677 to provide independent living services to older youth who age out of foster care, we anticipate that States will stay in contact with youth through these services and other outreach activities. Further, we anticipate that incorporating contact and tracking methods into standard child welfare practice will enable a State to reduce attrition for survey purposes. Note: The regulations also require 80% for 19- and 21-year-olds who are currently in foster care; however, we are not aware of any State with those age groups in its foster care program.


For 19- and 21-year-olds (the follow-up population), if a State does not sample, the response rate numerator is the number of 19- or 21-year-olds who complete or partially complete the outcomes survey. The denominator is the number of youth now 19 and 21 years old who completed or partially completed the outcomes survey as part of the baseline population at age 17, minus youth who are, by the time of the follow-up survey, incapacitated, deceased, or incarcerated. For States that choose to sample, the numerator is the number of youth in the sample who complete or partially complete the outcomes survey. The sampling denominator is the number of youth identified as in the sample, minus youth who are incapacitated, deceased, or incarcerated. These resultant response rates are also the participation rates, which factor into State compliance with the NYTD as discussed in 45 CFR 1356.85(b)(3).
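The numerator/denominator logic described above reduces to a single calculation that applies to both the baseline and follow-up populations; the cohort figures below are hypothetical, not drawn from any State's data.

```python
def response_rate(completed_surveys, population, excluded):
    """Completed or partially completed surveys divided by the eligible
    population (total minus youth who are mentally or physically
    incapacitated, deceased, or incarcerated)."""
    eligible = population - excluded
    return completed_surveys / eligible

# Hypothetical baseline cohort: 410 17-year-olds, 10 excluded, 360 surveyed.
baseline_rate = response_rate(360, 410, 10)    # 0.90

# Hypothetical follow-up at age 19: of the baseline participants, 300
# remain eligible and 240 complete or partially complete the survey.
follow_up_rate = response_rate(240, 300, 0)    # 0.80
```

Under the final rule, the follow-up figure is also the participation rate compared against the regulatory thresholds.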


We believe that these anticipated response rates are suitable for our purpose, which is to have outcomes information that meets the statutory mandates at 42 USC 677, provides a perspective on how youth are faring after they leave foster care, and supports assessment of State performance in operating independent living programs. We intend to conduct nonresponse analysis using data available from NYTD on demographic characteristics and independent living service history, and AFCARS data regarding disabling conditions, length of time in foster care, foster care placement setting history, and other factors. States are required in the final rule to use the same person identifiers in NYTD as those they use for AFCARS, which allows us to conduct analysis using the available characteristics from both databases.


For the National Youth in Transition surveys, the Children's Bureau (CB) will manage the potential for non-response bias in several ways. These are standard, established approaches in the field of survey work, and will maximize the accuracy and usefulness of the data derived from the NYTD work.


The approach to collecting the NYTD data is expected to vary by State regarding the collection of data in person, by telephone, using computer-aided devices, and similar possible approaches that meet the needs of the State and the particular characteristics of the State's population. Clearly, any of the aforementioned approaches to data collection has the potential for bias, both non-response and response, and/or measurement error. However, there also are standards of practice for collecting data to address potential bias, and the Children's Bureau is well-prepared to address this issue.


The Children's Bureau will provide States with a set of standard guidelines to assist them in 1) maximizing response rates with as little non-response bias as possible; 2) determining how the non-responders differ from the responders; and 3) making statistical adjustments for non-response bias. Critical to the success in thoroughly disseminating this information to States will be the technical assistance (TA) that CB plans to provide to States through National Resource Center staff, contractors, and Federal staff. All staff providing TA will be experienced in collecting survey data, and they will receive additional specific training prior to their work with States.


Non-response bias occurs when groups with certain characteristics tend to fail to respond at a rate greater than the others in the group being surveyed. While it is possible that one could have a uniform response rate, it is more common to have groups with some characteristics who have more of a tendency not to respond. An example would be that groups with lower educational levels may be less likely to respond. If youth with a lower education are disproportionately represented in non-responders, this can lead to data that will produce false conclusions about our youth in general. Knowing the magnitude and kind of non-response bias present in the NYTD work will be important for achieving an accurate picture of the youth we are studying.


The first line of defense with regard to non-response bias is to prevent it to the degree possible. TA will be the key to prevention. Many of the principles behind good survey design and practice are long-standing and well-documented, but are not always followed (Turner and Martin, 1984). The Children's Bureau will ensure that TA staff are knowledgeable professionals in the field of surveys, and they will have guidelines for training and assisting State staff. Training for States will cover the two critical tasks of accessing the targeted youth and, just as important, encouraging their participation. Calling youth either to complete the interview by phone or to set up a time to meet in person will be discussed in detail.


Fowler (1984, pp. 52-53) provides commonly used suggestions such as calling numerous times (often six), and making the calls at times convenient to respondents (avoid limiting the calls to the 9am to 5pm time period because it will eliminate those who are at work, for example). Fowler also provides a summary of common suggestions for improving response rates, such as sending a letter in advance of the call, explaining why the study is important, and assuring the respondent that he or she will not be harmed in any way by responding. These are only examples of ways that TA would be used to prevent non-response bias to the degree possible. A detailed plan for training States will be developed by the TA staff and approved by the Children's Bureau.


A usual way to determine non-response bias is to make stronger additional efforts to track down a random sample of intended respondents who did not respond. The goal is to determine if and how these initial non-responders differ from those who responded after only the normal contact methods were used. It might be discovered, for example, that the initial non-responders (the "difficult to track down" ones) reported being employed full-time at a rate of 1 percent, whereas the rate for typical responders was 50 percent. Or the difference could be based on the demographic characteristics of the target group; for example, we might find that Native American youth are responding at a much lower rate, in proportion to their numbers in the overall target group, than are other groups. We would know the demographics of the targeted population from the AFCARS data files. The States will be given guidance on options for dealing with non-responders.
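The demographic comparison described above can be sketched in a few lines of code. This is a hypothetical illustration only; the group names and counts are invented for the example (they are not NYTD or AFCARS figures), and the 80-percent-of-overall flagging threshold is an assumption chosen for the sketch, not a Children's Bureau standard.

```python
# Hypothetical sketch: compare each demographic group's response rate
# with the overall rate to flag groups that may signal non-response bias.
# Counts are illustrative; in practice, targeted counts would come from
# AFCARS demographic data and responses from the State's survey records.
targeted = {"group_a": 500, "group_b": 200, "group_c": 80}
responded = {"group_a": 400, "group_b": 150, "group_c": 24}

overall_rate = sum(responded.values()) / sum(targeted.values())

# Flag any group responding at less than 80% of the overall rate
# (an assumed threshold for this example).
flagged = []
for group in targeted:
    rate = responded[group] / targeted[group]
    if rate < 0.8 * overall_rate:
        flagged.append(group)
    print(f"{group}: {rate:.0%} response rate (overall {overall_rate:.0%})")

print("follow up with:", flagged)
```

In this invented example, only the group with the sharply lower response rate would be flagged for the stronger follow-up efforts described above.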


Survey researchers can and do use information on differential response rates to create weights that correct for bias in the data due to non-responders (Holt and Elliot, 1991; Berg, 2002). After the Children's Bureau receives the data from the States, these corrections can be done at the Federal level using standard methods.
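One common form of such a correction is a weighting-class adjustment, in which respondents in each demographic class receive a weight equal to the inverse of that class's response rate. The sketch below is a minimal illustration of that idea with invented figures; it is not the specific method the Children's Bureau or the cited authors prescribe, and all class names and counts are assumptions for the example.

```python
# Hypothetical weighting-class adjustment for unit non-response.
# Each respondent in a class is weighted by (targeted / responded) for
# that class, so under-responding classes count proportionally more.
# All figures are invented for illustration (not actual NYTD data).
targeted = {"class_a": 400, "class_b": 100}   # e.g., from AFCARS counts
responded = {"class_a": 320, "class_b": 40}
employed = {"class_a": 160, "class_b": 10}    # full-time among respondents

# Inverse-response-rate weight for each class.
weights = {c: targeted[c] / responded[c] for c in targeted}

# Unweighted estimate treats every respondent equally.
unweighted = sum(employed.values()) / sum(responded.values())

# Weighted estimate re-inflates each class to its share of the target group.
weighted = (
    sum(employed[c] * weights[c] for c in targeted)
    / sum(responded[c] * weights[c] for c in targeted)
)

print(f"unweighted employment estimate: {unweighted:.3f}")
print(f"weighted employment estimate:   {weighted:.3f}")
```

Because the low-response class also has a lower employment rate in this invented example, the weighted estimate comes out lower than the unweighted one, showing how the adjustment counters the over-representation of the high-response class.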


While the primary focus of this discussion is on non-response bias, it also needs to be pointed out that the Children's Bureau is well aware of the potential for measurement error in administering the items on the survey. This will be addressed by ensuring that the TA providers are knowledgeable about the potential causes of measurement error so they can provide advice and training to State staff.


Because it is expected that much of the survey work will be completed during personal interviews, an important training area will be that of interviewing techniques. These techniques will need to be tailored to account for the individual characteristics and needs of the respondents. For example, some respondents may have disabilities, such as limited vision or hearing, and the TA providers will need to give States advice and information on how to address such interviewing challenges so that the respondents feel comfortable enough to proceed with the survey and provide accurate responses.


Measurement error also can occur due to the respondent's inability to understand certain questions. Because of the likely wide range of educational levels in the target population, the TA staff will provide States with advice on dealing with this issue to ensure the most accurate collection of information from the target youth. They will have standardized alternative wording that provides a simplified version of a question where warranted (such as substituting the term "full-time job" for "full-time employment"). In this way, the interviewer can be true to the intent of the survey questions while ensuring that they are understood clearly.

Specific efforts by the Children’s Bureau directed toward preparing and assisting States in the collection of high quality, accurate data include the following:  

  • Upon the release of the final rule, one of the first steps CB will take is to hold an all-State conference call or webinar to present a PowerPoint presentation on the final rule. At this time and through future calls and webinars, we will begin assessing State plans for the NYTD data collection, including how they will implement the survey (e.g., their preferred mode of delivering the survey to youth). Throughout this process, we will be available to respond to State questions and concerns as States proceed through the development of their data collection systems. This approach will enable us to tailor technical assistance materials to State-specific concerns, as well as to be proactive in offering assistance on non-response bias and other related issues.


  • The first opportunity we will have to offer technical assistance in some of these key areas will be May 2008 at the Pathways to Adulthood Conference, sponsored by the Children’s Bureau’s National Child Welfare Resource Center for Youth Development. Both program and data staff from State child welfare agencies will be in attendance. We are finalizing plans now for Federal/contract staff working on the Evaluation of Chafee Independent Living programs to present on specific tracking techniques and strategies for engaging current and former foster youth in participating in survey research on youth outcomes. The presenters will be able to draw on lessons learned from the ACF Chafee Evaluation. By beginning to address these issues very early in the development of NYTD, we will lay the groundwork for ensuring that States achieve the desired NYTD response rates, thereby mitigating the potential issue of non-response bias. The information provided during our presentation at this conference also will be made available through the CB website.


  • After publication of the final rule, CB technical assistance providers will begin work on developing a guide to implement the NYTD outcome survey in paper-based, web-based, and telephone interview modalities to establish consistency and to control for measurement error and response biases.  We anticipate that this information will be disseminated via the CB website by the end of FY 2008.


  • During FY 2009, there will be at least two opportunities for CB and our TA providers to offer in-person training and technical assistance on the outcome survey component of NYTD to State child welfare agency system and program staff. We are planning our annual Pathways conference for the spring of 2009 and the annual Data and Technology Conference for the summer of 2009. We are in the planning stages now for those conferences and will offer specific workshops that will allow State staff to have hands-on sessions with Federal staff as well as our TA contract providers to address the issues discussed above on response bias and measurement error, as well as other related topics.

 

A final word: others doing similar work have successfully used procedures such as those described above (Canada's 2006 National Longitudinal Survey of Children and Youth and Australia's 2007 Growing Up in Australia: The Longitudinal Study of Australian Children). We will draw on all these resources as we work with States to develop this new data collection process.


References


1) Australian Institute of Family Studies (2007). Growing Up in Australia: The Longitudinal Study of Australian Children. LSAC Discussion Paper No. 5, ISSN 1447-1566 (online).

2) Berg, N. (2002). Non-Response Bias. WWW: UTD/_nberg/Berg_ARTICLES/BergNon-ResponseBiasMay2002.pdf.

3) Fowler, F. J. (1984). Survey Research Methods. Beverly Hills: Sage Publications.

4) Statistics Canada (2006). National Longitudinal Survey of Children and Youth, Cycle 6 – User Guide. Ottawa, Canada.

5) Holt, D., & Elliot, D. (1991). Methods of Weighting for Unit Non-Response. The Statistician, 40, pp. 333-342.

6) Turner, C. F., & Martin, E. (Eds.). (1984). Surveying Subjective Phenomena. New York: Basic Books.


4. Describe any tests of procedures or methods to be undertaken.


Testing is encouraged as an effective means of refining collections of information to minimize burden and improve utility. Tests must be approved if they call for collection of identical information from 10 or more respondents. A proposed test or set of tests may be submitted for approval separately or in combination with the main collection of information.


The survey was developed in consultation with practitioners, youth, and researchers in the field and was part of the proposed rule issued at 71 FR 40346.  As we stated in the preamble to the proposed rule issued at 71 FR 40347, a pilot test was conducted in August 2001 which served as a field test of the draft data elements, definitions, and procedures. It provided valuable information for assessment of the data collection burden on the States. In each of the seven pilot States, caseworkers collected data about several older youth, identified any unclear definitions, and described any difficulties encountered while collecting data. Each pilot State also was asked to report the amount of effort required to collect the information. We used these responses to assess the burden for workers, and to learn if the capacity to report data varied significantly across agencies or States.


Based on this input, we proposed a survey in the NPRM that we believed was useful to the States and balanced the burden placed on the youth with the statutory mandates for data collection. In response to the NPRM, we received very few comments from State child welfare agencies that would indicate concern that the survey will be difficult for the youth to complete. Furthermore, related studies of youth aging out of foster care, including the Multi-Site Evaluation of Foster Youth Programs, Midwest Evaluation of Adult Functioning and the Northwest Foster Care Alumni study, conducted much more extensive surveys and typically used more personal and sensitive questions while maintaining high response rates.  On the basis of these other efforts’ experiences and the public’s input on our NPRM, we expect that the survey as presented in the final rule will be easily understood, and its content and level of burden will not discourage participation.


5. Provide the name and telephone number of individuals consulted on statistical aspects of the design and the name of the agency unit, contractor(s), grantee(s), or other person(s) who will actually collect and/or analyze the information for the agency.


John Gaudiosi, Mathematical Statistician

HHS, Administration for Children and Families,

Children’s Bureau

202-205-8625


Data will be collected by States and submitted to ACF in a data file as a component of the larger NYTD. ACF will conduct primary analysis of the data. We intend to make the data files available for public use with appropriate measures taken to ensure confidentiality.




