Supporting Statement - Part B


Promoting Readiness of Minors in SSI (PROMISE) Evaluation - Interviews with Program Staff, and Focus Group Discussions

OMB: 0960-0799



B. Collection of Information Employing Statistical Methods


This section provides information regarding the staff activity logs, which will not use any statistical methods. It also includes information about the previously cleared data collection instruments: the focus group and staff interview instruments, which do not use statistical methods, and the 18-month survey instruments, which do.


B1. Respondent Universe and Sampling Methods


PROMISE programs deliver services to youth with disabilities in their designated service areas. The respondent universe for this evaluation comprises youth Supplemental Security Income (SSI) recipients aged 14-16 at enrollment who reside in one of the programs’ service areas and who consent to participate in the research. Below we describe: (1) the selection of programs for the demonstration, (2) the selection of youth in the programs’ service areas to participate in the 18-month survey, and (3) the selection of program staff to complete the staff activity logs.

1. Selection of Programs

On September 30, 2013, the Department of Education (ED) announced the award of $211 million over five years to five individual states and one consortium of six states to design and implement PROMISE demonstration programs. These awards are in the form of cooperative agreements that entail an ongoing working relationship between the funding agency and the awardees to achieve the program objectives. The awardees are all individual state agencies that formed partnerships with other agencies for the purpose of implementing PROMISE. They were selected through a competitive process that included publication of a request for applications in the May 21, 2013, Federal Register (78 FR 29733), preparation and submission of applications by state agencies, and external peer review of the applications by a panel that ED convened. ED used the following criteria to evaluate the applications and select the agencies to which it awarded cooperative agreements:

  • The quality of the program design

  • The quality of the youth recruitment plan

  • The quality of the program management plan and program personnel

  • The significance of the program, including its potential to bring about systems change and the likely magnitude of anticipated outcomes

  • The capacity of the program for continuous feedback and improvement


Table B1 lists the lead PROMISE agencies, participating states, program names, and award amounts:

Table B1. The PROMISE Programs

Lead Agency | States | Program Name | Award Amount
Arkansas Department of Education | Arkansas | Arkansas PROMISE | $32,427,441
Utah State Office of Rehabilitation | Consortium of states: Utah, South Dakota, North Dakota, Montana, Colorado, and Arizona | Achieving Success by Promoting Readiness for Education and Employment (ASPIRE) | $32,500,000
California Department of Rehabilitation | California | California PROMISE (CaPROMISE) | $50,000,000
Maryland Department of Disabilities | Maryland | Maryland PROMISE | $31,190,076
New York Office of Mental Health | New York | New York State PROMISE (NYS PROMISE) | $32,500,000
Wisconsin Department of Workforce Development | Wisconsin | Wisconsin PROMISE | $32,497,181

Source: ED’s press release on PROMISE awards [http://www.ed.gov/news/press-releases/department-awards-211-million-promoting-readiness-minors-supplemental-security-i].

A minimum of 2,000 youth participate in the evaluation within each PROMISE program. We randomly assign half of the youth participants to a treatment group and the other half to a control group. We will sample youth for follow-up data collection in the one program (in California) that expects to enroll more than 2,000 youth in the evaluation.

Because we anticipate achieving at least an 80 percent response rate on the 18-month survey, we do not expect to submit a nonresponse bias analysis to OMB. We will, however, use SSA lists and administrative data to assess the extent of differences between evaluation enrollees and non-enrollees at baseline and between survey respondents and non‑respondents at follow-up.
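To illustrate the kind of comparison described above, the following sketch shows one way to tabulate and test baseline differences between survey respondents and nonrespondents. It is an illustration only, not the evaluation's analysis code, and the variable names (for example, "responded", "age_at_enrollment", "monthly_ssi_payment") are assumptions rather than actual SSA data elements.

```python
# Hypothetical sketch: compare baseline characteristics of survey respondents
# and nonrespondents. Column names are illustrative assumptions.
import pandas as pd
from scipy import stats

def baseline_comparison(df: pd.DataFrame, characteristics: list,
                        flag: str = "responded") -> pd.DataFrame:
    """Welch t-tests of baseline characteristics by follow-up response status."""
    rows = []
    for var in characteristics:
        resp = df.loc[df[flag] == 1, var].dropna()
        nonresp = df.loc[df[flag] == 0, var].dropna()
        t_stat, p_value = stats.ttest_ind(resp, nonresp, equal_var=False)
        rows.append({"characteristic": var,
                     "respondent_mean": resp.mean(),
                     "nonrespondent_mean": nonresp.mean(),
                     "p_value": p_value})
    return pd.DataFrame(rows)

# Example call with hypothetical administrative variables:
# baseline_comparison(sample, ["age_at_enrollment", "monthly_ssi_payment"])
```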

2. Selection of Youth

The enrolled youth and their parents or legal guardians will be the respondent universe for the 18-month surveys. Five of the programs anticipate enrolling 2,000 youth and families in the PROMISE evaluation; thus, we will not need to sample in those programs. However, the California program plans to enroll 3,172 youth in the evaluation. Therefore, for the California program we will need to randomly select youth enrollees for follow-up data collection, because SSA’s budget for the evaluation presumes we will attempt the follow-up data collection with only 2,000 youth per site.

The California program will recruit and enroll youth over a 15-month period. Because the recruitment period will be shorter than 18 months, the program will complete enrollment before it is time to conduct the 18-month follow-up with the first California youth who entered the evaluation. This simplifies the design for drawing the sample because we will know the final population of enrollees at the time we select the survey sample.

In California, we will conduct stratified random sampling, with the strata defined by the key dimensions, as follows:

  • Local education agency (LEA): 19 strata

  • Treatment/control status: 2 strata


These strata define 38 cells, which will be the basis for the random selection of cases for the survey sample.

The probability that we will select an enrolled youth into the survey sample will be equal to the budgeted sample size (2,000) divided by the achieved number of enrolled youth whom we will randomly assign, which we assume for now will be the California program’s proposed total number of enrollees, less the approximately 3 percent of enrollees who will be non-research cases: 3,172 – 95 = 3,077. Under this assumption, the selection probability will be 2,000/3,077 = 0.650. We will array the research enrollees across the 38 cells and then randomly select cases from each cell with a probability of 0.650. This method will ensure that each cell is represented in the sample in the same proportion that it represents in the population of enrollees. All research enrollees in the California program will have the same probability of selection into the survey sample, so the sample will be self-weighting, and we will not need to calculate sampling weights.1 However, we will need to calculate weights to correct for nonresponse to the follow-up surveys.
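The sketch below illustrates this selection approach. It is not the evaluation's production code; the data frame and column names ("lea", "treatment") are assumptions for the example, and the actual selection will use the final enrollment file.

```python
# Hypothetical sketch of equal-probability selection within the 38
# LEA-by-treatment cells; column names are illustrative assumptions.
import pandas as pd

def select_survey_sample(enrollees: pd.DataFrame, target_n: int = 2000,
                         seed: int = 20140930) -> pd.DataFrame:
    """Select the same fraction of cases from every LEA-by-treatment cell
    so that the resulting survey sample is self-weighting."""
    p = target_n / len(enrollees)          # e.g., 2,000 / 3,077 = 0.650
    sampled = (enrollees
               .groupby(["lea", "treatment"], group_keys=False)
               .apply(lambda cell: cell.sample(frac=p, random_state=seed)))
    sampled["base_weight"] = 1 / p         # identical for all cases (about 1.538)
    return sampled
```

The common base weight would later be adjusted for nonresponse to the follow-up surveys, as noted above.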

3. Selection of Program Staff

Management and line staff substantively involved with each PROMISE program represent the population of interest for the staff activity logs. Each program uses a different staffing approach to deliver services to youth and families. We will conduct telephone meetings with each program manager to describe the staff activity log data collection effort, discuss the different staff positions involved in program service provision, and identify which staff positions to include in the data collection effort. We will collect activity logs from members of all staff positions (such as administrators, case managers, employment specialists, and benefits counselors) in which PROMISE activities represent a primary part of the job duties. Where a program has many individuals (4 or more) working in a staff category, we will consult with the program manager on whether to select a sample of staff (a total of 2 to 10 individuals, depending on the number of staff in that category) to complete the log. Where a program has 3 or fewer individuals working in a position, we will ask all individuals in that category to complete the log. We will work with the program manager to identify the individuals from each category to include in the data collection effort. We will also ask the program manager to provide additional instructions and activity descriptions to include in the log instructions.

B2. Procedures for the Collection of Information


1. Recruiting Study Participants

a. Staff Interviews: As our first step in the data collection, we sent a letter to each PROMISE site project director explaining the evaluation and seeking their cooperation with it. The letter came from Jeffrey Hemmeter, the SSA Project Officer, to lend credibility to the study and further encourage cooperation. The evaluator followed up with each PROMISE project director in a telephone call to further describe the information we will gather from PROMISE stakeholders during in-person interviews and a brief self-administered social network survey that we ask them to complete at the end of each interview. We asked project directors to identify individuals who can provide the required information and to describe their general schedule constraints. Working collaboratively with the PROMISE project and partner contacts, we developed an interview schedule that meets participants’ needs.


Approximately two weeks before the interviews take place, the evaluator mails an information packet to the PROMISE project director containing the final interview schedule. The packet contains contact information for the evaluation team member who will conduct the interviews so the respondents can reach them in the event of a schedule change or other issues that may arise before the interviews. Providing the sites with adequate information ahead of time in a professional manner helps build rapport and ensures that the interviews go smoothly and that interviewees are available and responsive. The evaluator uses an interview guide, based on the interview topic list provided in Attachment B, to conduct the staff interviews. The interviewer is responsible for taking notes during each interview. Upon completion of all interviews conducted for a particular PROMISE project, the evaluator develops a summary of the information collected during the site visit and phone interviews.


During the staff interviews, the evaluator asks interviewees to complete a brief social network survey. We administer separate versions of the survey to program managers or directors and project staff, tailored to their specific perspectives (Attachment C). The survey is self-administered with pen and paper. The evaluator enters the survey data into an Excel spreadsheet for analysis after completion of the site visit.


b. Focus Groups: After state PROMISE project staff confirm youth and their families as eligible for services, they obtain informed consent and enroll them into the study. The consent process addresses the program benefits; the random assignment process; the expectation to complete follow-up surveys; and the voluntary nature of participation in all study activities. The project staff also disclose any potential risks of participation and the use of personal information. Once the state PROMISE project staff receive consent, they enter the youth into the PROMISE random assignment database, enroll them in the demonstration, and assign them to either the treatment or control group.


We conduct the focus groups with a convenience sample of youth and their parents or guardians who are PROMISE treatment group members. We conduct the first set of focus groups during the site visits from fall 2014 through summer 2015, and will conduct a second set during the 2016 site visits. Separate but concurrent discussions occur with 10 youth and 10 parents or guardians in each group. The evaluator works closely with the PROMISE project staff to arrange the focus groups. Where possible, we convene the groups in a facility of a PROMISE service provider familiar to participants.


The evaluator uses a recruitment script to introduce the evaluation, describe the purpose of the focus group, and confirm each individual’s willingness to participate. One week before the focus groups, the evaluator sends a reminder letter to each individual who agreed to participate, along with directions to, and a map of, the focus group location. The evaluator also sends a reminder or confirmation mailing before the session and places a telephone reminder the day before the session.


A professional researcher on the staff of the evaluation contractor conducts the focus groups using a semi-structured protocol to facilitate an informal group discussion. To ensure we capture all information, the facilitator records the discussion for later transcription. We inform participants about the recording and tell them they may ask the facilitator to suspend the recording at any time. We ask for no identifying information during the focus group, and the facilitator calls group participants only by their first names; thus, the recording includes no identifying information. Each focus group participant receives a $30 incentive in the form of a gift card following completion of the session.


SSA recognizes that the small number of participants, the small number of groups, and the sampling approach mean that we cannot use the collected focus group data to extrapolate to the larger population of youth and parents or guardians enrolled in PROMISE, or to the broader population of families receiving SSI payments. However, the focus groups capture critical qualitative information about the experiences of PROMISE participants, their families, and project staff. The information we collect during these discussions complements the information we gather through administrative data and follow-up surveys, providing a more in-depth and qualitative understanding of the PROMISE projects. These focus groups are a critical piece of the process for PROMISE and essential for the evaluation team to assess whether and how the projects did or did not meet expectations.


c. 18-Month Survey Interviews: The 18-month survey data collection efforts will span 28 months, with a rolling release of sample that will mirror the 24 months of study enrollment plus another 4 months to complete interviewing in each sample release. We will aggregate the sample cases into cohorts and release them by month to ensure we interview each of the enrollees as close to their 18-month anniversary as practical. Assuming a roughly even pace of enrollment, each monthly release will contain approximately 500 cases (roughly 12,000 survey cases across the six programs divided across 24 monthly releases).2


We will use Mathematica’s sample management system (SMS) to: (1) release eligible cases and ensure we work them thoroughly; (2) mail invitation and reminder letters and incentive payments; and (3) track and store sample cases’ updated contact information. Over the full 28-month survey period, data collection managers will use a range of production reports to monitor the data collection and ensure it aligns with production, cost, and quality goals. We will carefully monitor response rates for each program and for treatment and control groups.

The survey process for an individual sample case will begin with an advance notification letter from Mathematica, inviting the youth and a parent to call Mathematica for an interview. The letter will offer the respondents a $30 gift card for completing their respective interviews (a 35-minute interview for parents/guardians and a 25-minute interview for youth). The letter will offer an additional incentive of $10 to those who call Mathematica to complete the interview within 10 days of receiving the letter ($40 total, in a single gift card). By deploying a differential incentive, we can target resources toward sample members who are likely to require intensive efforts to locate, contact, or engage for interviews. We anticipate that 20 percent of the cases completing the 18-month interview will do so by calling Mathematica in response to the advance notification letter and follow-up postcard (Attachment J).

We will send subsequent mailings during the remaining weeks of the survey period to all outstanding sample cases to: (1) notify them that an interviewer will contact them by telephone or in person, (2) encourage them to participate in the survey, (3) respond to concerns they may have about the study, and (4) notify them that the survey will end soon and that their unique experiences and input are critical to the success of the study. All contacts and outreach will emphasize the voluntary nature of participation and that sample members’ benefits will not be affected, regardless of whether they decide to participate.

Mirroring the approach we used on the YTD evaluation, we will target the parent or guardian who is “most knowledgeable about the services received by the enrolled youth,” as the respondent for the parent survey. This parent or guardian is likely to be the same one who helped the youth enroll in PROMISE and signed the PROMISE program’s enrollment consent form. It is also likely to be the parent or guardian who is most engaged in the youth’s receipt of PROMISE services (if the youth is in the treatment group). Because the individual satisfying this description may change over time, we do not plan to target a specific named individual for the parent survey. However, interviewing staff will have access to data identifying which parent or guardian signed the enrollment consent form, should we need it.

We will design the instruments to accommodate a wide range of disabilities. We will train interviewers to offer breaks, where needed, to accommodate youths with disabilities that cause stamina limitations. We will word questions as simply as possible to make them accessible to those with mild cognitive disabilities. Although we cannot design instruments that will address every disability we may encounter, these basic design characteristics will enable us to interview most youth in the study without the use of proxies.3 We will, however, design proxy wording for circumstances in which a youth cannot complete an interview independently. We expect to complete the youth and parent interviews in the same call for approximately 40 percent of the sample cases. In addition to ensuring a high overall response rate, we will carefully monitor progress with the youth and parent subsamples to ensure completion of both the youth and parent interviews for as many cases as possible.

We anticipate completing most interviews (80 percent) by telephone. Some sample cases will be extremely difficult to locate or contact, or will require an in-person interview due to a disabling condition. Field staff will use computer-assisted personal interviewing (CAPI) to complete interviews with such cases. We anticipate completing approximately 20 percent of all interviews via CAPI. Once we send a case to the field, it will be retired from outbound calling. Field staff will conduct interviews using tablet computers, either in the sample member’s home or at an alternate location agreed to in advance. We will conduct interviews primarily in English and Spanish, with a Spanish version of the instrument available in the CATI or CAPI system. We anticipate completing approximately 10 percent of the 18-month interviews in Spanish. All of our Spanish-speaking interviewers will have completed professional certification to ensure they are qualified to interview in Spanish.

Our review of the winning applications for PROMISE cooperative agreements identified some unique features of the ASPIRE program that may necessitate special survey strategies for sub-populations of enrollees. These features and the proposed strategies are as follows:

  • Enrollees in the ASPIRE program will include Native Americans, who may reside on reservations. Native Americans are considered a hard-to-survey population for several reasons, including: (1) mistrust of outside researchers, who may be perceived as judgmental; (2) concerns about how the survey data will be used; (3) high concentrations of poverty and other household complexities; and (4) reduced access to telephone service due to limited household resources or cultural norms (Basto, Warson, and Barbour 2012; Brugge and Missaghian 2006; Getrich et al. 2013; Gilder et al. 2013; Hodge et al. 2010; Israel et al. 2008; Jones 2008; Ver Ploeg, Moffitt, and Citro 2002). To address these challenges, we will collaborate with the program implementation team and build upon the positive outreach they conducted with tribal leaders. We will seek to obtain endorsements for the survey by the tribal leaders and, having done so, will work with them to determine how best to conduct outreach to reservation-based sample cases.

  • The ASPIRE program will serve not only rural, but also “frontier” populations (geographic areas with extremely low population density), for which exceptionally long distances may exist between households. For these sample cases, we will attempt to complete the 18-month interview by telephone, using whatever accommodations may be necessary. When necessary and feasible, we will use alternative means of communication, such as Skype, to connect with sample cases using a video-based computer exchange. If cases are unreachable by telephone and have no computer access, we will determine whether there is a sufficient concentration of them to make efficient use of field interviewers. We will work with ASPIRE program staff to proactively address this challenge by aligning our hiring of field interviewers with areas where concentrations of sample cases are anticipated.

d. Staff Activity Logs: As a first step, we will schedule telephone interviews with the program manager of each program to discuss the staff activity logs. During this meeting, we will review the staff categories and the number of staff in each category, consider which staff categories to include in the data collection activity, and identify individuals in each staff category to complete the logs. In addition, we will ask the program manager to identify two one-week periods that would be best for staff to track their time. These should be periods that represent typical weeks in program service delivery and do not include atypical activities, such as staff conferences or trainings. The periods will vary for each program, depending on the timing of the site visit.

For each period and program, we will follow a similar schedule to administer the staff activity logs:

  • About one to two weeks before the data collection effort, we will ask the program manager to inform program staff that Mathematica will be contacting them about the staff activity logs.

  • We will send an email to selected staff the Wednesday before the target week of data collection. The email will contain the staff activity log (Attachment K) in Excel and PDF formats, and the body of the email will explain the rationale for the staff activity log, provide instructions on how to complete it and where to go for more information on completing the log, and present options for returning the log to Mathematica. The log is self-administered and the selected staff can complete it either electronically or with pen and paper.

  • On the Monday after the data collection week, we will email staff who have not already returned their logs to ask them to do so.

When we receive the logs, we will enter the data into an Excel spreadsheet for analysis.

2. Statistical Power/Precision Estimates

Even with an experimental design, we need sample sizes large enough to provide sufficient statistical power to detect impacts of a magnitude that policymakers or practitioners would find meaningful. The PROMISE evaluation has samples of 2,000 SSI youth in each of five sites; in California, we anticipate recruitment of 3,172 youth. We randomly assign half of the enrollees in each program to a treatment group and the other half to a control group. Relying on these sample sizes, we present in Table B2 the minimum impacts we expect to detect using administrative or survey data on five key outcomes for youth: (1) employment in paid jobs, (2) annual earnings, (3) enrollment in school, (4) SSI payment receipt, and (5) annual SSI payment amount.

The minimum detectable impacts (MDIs) in Table B2 suggest that the planned study samples will support the detection of meaningful impacts. For example, in five of the six sites, we expect to detect program impacts of five percentage points or larger on employment in paid jobs estimated using administrative data and six percentage points or larger using survey data for the full samples; we expect to detect impacts of four percentage points or larger using administrative data in the California site because of its larger sample size. Evaluations of interventions providing transition services to youth with disabilities have found short-term impacts on employment rates that are larger than these MDIs. For example, in the YTD evaluation, three of the six projects showed estimated impacts on the likelihood of being employed in a paid job during the 12 months following enrollment of between 9 and 19 percentage points (Fraker 2013).

The study samples will also be sufficient to detect policy-relevant impacts for important subgroups. For example, we will be able to detect a program impact of eight percentage points or larger on paid employment using 50 percent samples of the survey respondents, such as female or male evaluation enrollees. We will be able to detect an impact of 11 percentage points or more on the likelihood of youth being employed in paid jobs during the year following enrollment even using 25 percent survey samples, such as youth who had any work experience prior to enrollment in the evaluation. However, we note that for two of the three YTD projects with statistically significant impacts on employment during the year following enrollment, the impacts were 9 percentage points (Fraker 2013). Table B2 indicates that we will not be able to detect impacts of that magnitude by the PROMISE programs at the 95 percent confidence level based on 25 percent survey samples.

Table B2. Minimum Detectable Impacts

Sample Size | Employed in Paid Jobs | Annual Earnings | Enrolled in School | SSI Receipt | Annual SSI Payments
Assumed mean value of outcome for control group members | 23% | $900 | 88% | 99% | $6,500
Follow-Up Data from Administrative Records
California: 3,100 (full sample) | 4% | $287 | n.a. | 1% | $220
California: 1,550 (50% sample) | 6% | $405 | n.a. | 1% | $311
Other sites: 2,000 (full sample) | 5% | $357 | n.a. | 1% | $274
Other sites: 1,000 (50% sample) | 7% | $505 | n.a. | 2% | $387
Follow-Up Data from Surveys
All sites: 1,600 (full sample) | 6% | $399 | 4% | n.a. | n.a.
All sites: 800 (50% sample) | 8% | $564 | 6% | n.a. | n.a.
All sites: 400 (25% sample) | 11% | $798 | 9% | n.a. | n.a.

Notes: MDI calculations assume (1) an equal number of treatment and control members, (2) a 95 percent confidence level with an 80 percent level of power, (3) a two-tailed test, (4) a reduction in variance of 10 percent owing to the use of regression models, (5) standard deviations of annual earnings and annual SSI payments of $3,000 and $2,300, respectively, (6) administrative data obtained on 100 percent of the sample, and (7) survey response rates of 80 percent. Mean values of outcomes for control group members are based on findings from the YTD evaluation’s 12-month impact analysis.

n.a. = not applicable.
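The MDIs in Table B2 can be reproduced from the assumptions listed in the notes. The following sketch is our illustration of that calculation, not the evaluation contractor's code; the function name and the example calls are ours.

```python
# Illustrative sketch of the MDI calculation implied by the notes to Table B2:
# MDI = (z_(1-alpha/2) + z_power) * sd * sqrt(1/n_T + 1/n_C) * sqrt(1 - R^2)
from math import sqrt
from scipy.stats import norm

def mdi(sd, n_total, alpha=0.05, power=0.80, r_squared=0.10):
    """Minimum detectable impact for a two-tailed test with equal-size
    treatment and control groups and a 10 percent regression adjustment."""
    n_t = n_c = n_total / 2
    z = norm.ppf(1 - alpha / 2) + norm.ppf(power)   # about 1.96 + 0.84
    return z * sd * sqrt(1 / n_t + 1 / n_c) * sqrt(1 - r_squared)

# Employment in paid jobs (control mean 23%), other sites, administrative data
print(round(mdi(sqrt(0.23 * 0.77), 2000), 3))   # ~0.050, i.e., 5 percentage points

# Annual earnings (standard deviation $3,000), all sites, survey data (n = 1,600)
print(round(mdi(3000, 1600)))                   # ~$399
```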

B3. Methods to Maximize Response Rates and Deal with Nonresponse


1. Staff Interviews and Participant Focus Groups

In arranging the interviews we conduct with PROMISE staff and the staff of partner organizations, the evaluator works with the PROMISE project leadership to determine the most convenient times and formats (group versus individual; phone versus in-person) to convene the interviews. The evaluator also limits the interviews to approximately one hour to ensure the data collection imposes only a modest burden on respondents. The evaluator uses separate discussion guides for each potential respondent type so respondents are not asked about activities or issues which do not apply to them. In addition, data collectors meet with in-person interview respondents in their own offices or at a location of their choice.


Because the focus group sample is a convenience sample, target response rates to ensure a representative population are not at issue. To address nonresponse and ensure the groups contain approximately 10 parents or guardians and 10 youth each, the evaluator recruits more participants than needed, based on prior experience indicating that a portion of those initially recruited will not attend the group when it meets. Further, the evaluator provides telephone and mail reminders to all recruited participants as the focus group date approaches. Finally, we provide incentive payments to focus group participants (as noted in section B.2) to alleviate some barriers to focus group participation.


2. 18-Month Survey Interviews

We anticipate that out-of-date contact information for sample cases, stemming from the high mobility of the low-income target population, will be one of the biggest challenges to achieving high survey response rates. The physical addresses of sample cases could change between their enrollment in the study and the 18-month survey, and also between the 18-month survey and the five-year survey. Our proactive approach to addressing this challenge includes the following strategies:

  • PROMISE programs collect multiple types of contact information for a participant – land line telephone number, cell phone number, e-mail address, and physical address – at enrollment (through the program consent forms). We share these data with Mathematica on a flow basis for all cases through the random assignment system (RAS) during enrollment. Further, Mathematica updates these data through the programs’ administrative data, collected prior to the start of the 18-month survey. We will update this information again during the 18-month interview. In general, we expect cell phone numbers and e-mail addresses will not change when sample members move from one physical address to another.

  • During the 18-month survey, we will collect contact information for one or more individuals who would be able to assist us in contacting a sample member for the five-year interview.

  • When completing an interview with a parent or guardian, we ask to complete an interview with the youth during the same telephone call or in-person visit. When the youth is not available, we ask the parent or guardian to assist us in contacting the youth.

  • We use interim contacts at nine months after enrollment and one and two years after the 18-month interview to keep in touch with mobile sample members. This strategy includes the use of text messages, letters, post cards, and email reminders.

  • We expect the offer of a $10 supplemental incentive will motivate some sample members to call our Survey Operations Center to complete their interviews within 10 days of receiving their advance notification letters. This strategy proved effective on previous Mathematica surveys in generating call-ins from sample members for whom we could find no working telephone number. We will also use locating resources, such as Accurint and the National Change of Address database, to aid our locating efforts.

  • Four times a year, SSA provides the contractor with updated contact information on sample cases from its automated records. Mathematica copies the updated information into the sample management and interviewing systems.

  • To minimize nonresponse bias due to the inability of sample members with significant hearing impairments to participate in telephone interviews, we use Instant Messenger (IM) with secure applications, which we make available to the sample members. This technology is highly accessible and presents significantly greater opportunities for dynamic engagement than more antiquated technologies such as teletype. Using IM, an interviewer copies and pastes questions from the CATI survey instrument and toggles between IM and CATI to input the responses and move to the next question. Mathematica developed expertise in deploying this modality on the National Longitudinal Transition Study 2012 (NLTS 2012) (Matulewicz et al. 2012b). We also use video relay service, upon request. The video relay operator engages with the respondent by video, converting the interviewer’s speech into sign. The operator then conveys the respondent’s responses to the interviewer over the telephone. We make each of these technologies available to any respondent who requests it. We offer assistive technology, or person-based supports, at the end of the parent interview (in preparation to speak with the youth), as well as at the start of the youth interview.

  • SSA excluded SSI recipients who are living in institutions from the lists of PROMISE-eligible youth we provided to the programs. Notwithstanding this exclusion, we may find enrolled youth who are living in institutions at the time of the follow-up survey. When such a case arises, we will contact the manager of the facility, describe the study, and explain how we received parental consent to contact the youth. We will send the manager a cover letter accompanied by a redacted copy of the signed evaluation consent form4 (which we will request from the PROMISE program on an as-needed basis). We will follow up to ensure these materials were received and to work with the facility staff to contact and interview the youth.

3. Staff Activity Logs

Our discussions with the program managers are intended to identify the right staff to ask to complete the activity logs, to identify the best times for this data collection activity, and to facilitate staff responses to the request. We will send up to three reminder emails to staff after the data collection period asking for their completed logs. The evaluator also limits the effort to no more than five minutes per day (or 35 minutes across all seven days) to ensure the data collection imposes only a modest burden on respondents.

B4. Tests of Procedures or Methods to Be Undertaken


We completed pretest interviews on the 18-month survey instruments in November 2014 to gauge respondent burden, assess the question skip logic, and gather feedback from the respondents regarding their understanding of the questions. We drew the convenience sample from a list of youth who had recently aged out of eligibility for PROMISE, thereby ensuring they closely resembled the target population of SSI youth and their parents or guardians. We conducted the pretest interviews by telephone using paper versions of the questionnaires. After each interview, we encouraged participants to provide feedback on their experience. The pretest used a reference date 18 months in the past to mirror the recall period for parents and youth in PROMISE, who will be asked to report on their experiences and services received from their date of enrollment to the date of the interview (approximately 18 months later).


Mathematica submitted a preliminary and a final memo on the findings from two iterative pretests. The memos provided both individual and summary-level statistics regarding burden for specific groups and for particular sections of the instruments. They included a discussion of difficulties with the data collection process; internal consistency of the responses; and recommendations related to item sequencing, modifications to specific items, and definitions and standardized probes to add. Mathematica based the first pretest memo on five parent and four youth interviews. The final memo included an additional four parent and five youth interviews. The interviewers conducting both pretests used different questions for each interview; thus, they conducted fewer than 10 pretests with each respondent group.


We used pretest respondent feedback to revise the parent and youth survey instruments (Attachments H and I). We previously used most of the questions in both the baseline and follow‑up questionnaires within other studies of youth or persons with disabilities, including the NLTS 2012, the National Beneficiary Survey (NBS), and the Short Form 12 (SF12).


B5. Individuals Consulted on Statistical Aspects of the Design and on Collecting and/or Analyzing Data


As discussed in A.8, SSA convened a technical advisory panel for the PROMISE evaluation. The panel provided input on the evaluation criteria and research design. It consisted of researchers and advocates who reflected expertise in youth transition, disability, and evaluation design. The external experts were:


  • Burt Barnow, PhD, George Washington University

  • Hugh Berry, US Department of Education

  • Mark Donovan, Marriott Foundation for People with Disabilities

  • David Johnson, PhD, University of Minnesota

  • Jamie Kendall, US Department of Health and Human Services

  • Jeffrey Liebman, PhD, Harvard University

  • Pamela Loprest, PhD, The Urban Institute

An interdisciplinary team of economists, disability policy researchers, survey researchers, and information systems professionals on the staff of the evaluation contractor (Mathematica Policy Research and its subcontractor, BCT Partners) contributed to the design of the overall evaluation. These individuals include:


  • Karen CyBulski, Mathematica

  • Thomas Fraker, PhD, Mathematica

  • Jacqueline Kauff, Mathematica

  • Gina Livermore, PhD, Mathematica

  • Holly Matulewicz, Mathematica

  • Tonya Woodland, BCT Partners

References

Basto, E., E. Warson, and S. Barbour. “Exploring American Indian Adolescents’ Needs Through a Community-Driven Study.” The Arts in Psychotherapy, vol. 39, 2012, pp. 134-142.

Blacher, Jan, Bonnie Kraemer, and Erica Howell. “Family Expectations and Transition Experiences for Young Adults with Severe Disabilities: Does Syndrome Matter?” Advances in Mental Health and Learning Disabilities, vol. 4, no.1, 2010, pp. 3–16.

Brugge, D, and M. Missaghian. “Protecting the Navajo People Through Tribal Regulation of Research.” Science and Engineering Ethics, vol. 12, 2006, pp. 491-507.

Cameto, R., P. Levine, and M. Wagner. Transition Planning for Students with Disabilities. A Special Topic Report of Findings from the National Longitudinal Transition Study-2 (NLTS2). Menlo Park, CA: SRI International, November 2004. Available at [http://www.nlts2.org/reports/2004_11/nlts2_report_2004_11_execsum.pdf]. Accessed July 22, 2013.

Carter, E., D. Austin, and A. Trainor. “Predictors of Postschool Employment Outcomes for Young Adults with Severe Disabilities.” Journal of Disability Policy Studies, vol. 23, no. 1, 2012, pp. 1–14.

Chiang, Hsu-Min, Ying Kuen Cheung, Huacheng Li, and Luke Y. Tsai. “Factors Associated with Participation in Employment for High School Leavers with Autism.” Journal of Autism and Developmental Disorders, vol. 42, no. 5, 2012, pp. 685–696.

Emerson, Eric. “Poverty and People with Intellectual Disabilities.” Mental Retardation and Developmental Disabilities Research Reviews, vol. 12, no. 2, 2007, pp. 107–113.

Fraker, Thomas. “The Youth Transition Demonstration: Lifting Employment Barriers for Youth with Disabilities.” Issue brief no. 13–01. Washington, DC: Center for Studying Disability Policy, February 2013.

Fraker, Thomas, Todd Honeycutt, Arif Mamun, Allison Thompkins, and Erin Jacobs Valentine. “Final Report on the Youth Transition Demonstration Evaluation.” Washington, DC: Mathematica Policy Research, October 2014.

Getrich, C., A. Sussman, K. Campbell-Voytal, J. Tsoh, R. Williams, A. Brown, M. Potter, W. Spears, N. Weller, J. Pascoe, K. Schwartz, and A. Neale. “Cultivating a Cycle of Trust with Diverse Communities in Practice-Based Research: A Report from PRIME Net.” Annals of Family Medicine, vol. 11, 2013, pp. 550-558.

Gilder, D., J. Luna, J. Roberts, D. Calac, J. Grube, R. Moore, C. Ehlers. “Usefulness of a Survey on Underage Drinking in a Rural American Indian Community Health Clinic.” American Indian and Alaska Native Mental Health Research, vol. 20(2), 2013, pp. 1-26.

Hasazi, S.B., L.R. Gordon, and C. A. Roe. “Factors Associated with the Employment Status of Handicapped Youth Exiting High School from 1979 to 1983.” Exceptional Children, vol. 51, 1985, pp. 455–469.

Hemmeter, Jeffrey, Jacqueline Kauff, and David Wittenburg. “Changing Circumstances: Experiences of Child SSI Beneficiaries Before and After Their Age-18 Redetermination for Adult Benefits.” Journal of Vocational Rehabilitation, vol. 30, no. 3, 2009, pp. 201–221.

Hemmeter, Jeffrey, and Elaine Gilby. “The Age-18 Redetermination and Postredetermination Participation in SSI.” Social Security Bulletin, vol. 69, no. 4, 2009, pp. 1–25.

Hodge, F., M. Cadogan, T. Itty, B. Cardoza, and S. Maliski. “Learning How to Ask: Reflections on Engaging American Indian Research Participants.” American Indian Culture and Research Journal, 2010, vol. 34, 2010, pp. 77-90.

Israel, B., A. Shulz, E. Parker, A. Becker, A. Allen, and J. Guzman. “Critical Issues in Developing and Following CBPR Principles.” In M. Minkler and N. Wallerston (eds.), Community Based Participatory Research for Health: From Process to Outcomes (second edition). San Francisco: Jossey-Bass, 2008, pp. 47-66.

Lee, Gloria, and Erik Carter. “Preparing Transition-Age Students with High-Functioning Autism Spectrum Disorders for Meaningful Work.” Psychology in the Schools, vol. 49, no. 10, 2012, pp. 988–1000.

Lindstrom, Lauren, Bonnie Doren, and Jennifer Miesch. “Waging a Living: Career Development and Long-Term Employment Outcomes for Young Adults with Disabilities.” Council for Exceptional Children, vol. 77, no. 4, 2011, pp. 423–434.

Lindstrom, Lauren, Bonnie Doren, Jennifer Metheny, Pam Johnson, and Claire Zane. “Transition to Employment: Role of the Family in Career Development.” Council for Exceptional Children, vol. 73, no. 3, 2007, pp. 348–366.

Loprest, Pamela J., and David C. Wittenburg. “Post-Transition Experiences of Former Child SSI Beneficiaries.” Social Service Review, vol. 81, no. 4, 2007, pp. 583-608.

Luecking, R. and N. Certo. “Service Integration at the Point of Transition for Youth with Significant Disabilities: A Model that Works.” American Rehabilitation, vol. 27, 2003, pp. 2–9.

Matulewicz, H., S. Boraas, D. Friend, A. Ciemnecki, and A. DeGraff. “Respondent Permission to Contact or Locate on Facebook: Findings from the National Longitudinal Transition Study 2012.” Paper presented at the American Association for Public Opinion Research Annual Meeting, Orlando, FL, 2012a.

Matulewicz, H., D. J. Friend, A. Ciemnecki, and A. DeGraff. “Technologies Used to Interview Youth Who Are Deaf or Have Hearing Impairments: Results from the National Longitudinal Transition Study 2012.” Paper presented at the American Association for Public Opinion Research Annual Meeting, Orlando, FL, 2012b.

Mook, K., S. Harrington, and A. Skaff. “Capabilities and Considerations for Using Facebook in Survey Research.” Paper presented at the American Association for Public Opinion Research Annual Meeting, Boston, MA, 2013.

Powers, L., T. Garner, B. Valnes, P. Squire, A. Turner, T. Couture, and R. Dertinger. “Building a Successful Adult Life: Findings from Youth-Directed Research.” Exceptionality, vol. 15, no. 1, 2007, pp. 45–56.

Shattuck, Paul, Sarah Carter Narendorf, Benjamin Cooper, Paul Sterzing, Mary Wagner, and Julie Lounds Taylor. “Postsecondary Education and Employment Among Youth with an Autism Spectrum Disorder.” Pediatrics, vol. 129, no. 6, 2012, pp. 1042–1049.

Simonsen, M., and D. Neubert. “Transitioning Youth with Intellectual and Other Developmental Disabilities: Predicting Community Employment Outcomes.” Career Development and Transition for Exceptional Individuals, 2013. doi: 10.1177/216543412469399.

Social Security Administration. Annual Statistical Supplement to the Social Security Bulletin, 2013. Publication No. 13-11700. Table 7.A1. Washington, DC: Social Security Administration, 2014. Available at http://ssa.gov/policy/docs/statcomps/supplement/2013. Accessed March 6, 2014.

Social Security Administration. SSI Annual Statistical Report, 2012. Publication No. 13-11827. Table 4. Washington, DC: Social Security Administration, 2013. Available at http://ssa.gov/policy/docs/statcomps/ssi_asr/2012. Accessed March 6, 2014.

U.S. Government Accountability Office. “Summary of a GAO Conference: Helping California Youths with Disabilities Transition to Work or Postsecondary Education.” GAO-06-759SP. Washington, DC: United States Government Accountability Office, 2006.

Wittenburg, D., T. Golden, and M. Fishman. “Transition Options for Youth with Disabilities: An Overview of the Programs and Policies That Affect the Transition from School.” Journal of Vocational Rehabilitation, vol. 17, 2002, pp. 195–206.

Ver Ploeg, M., R. Moffitt, and C. Citro. Studies of Welfare Populations: Data Collection and Research Issues. Committee on National Statistics, Division of Behavioral and Social Sciences and Education, National Research Council. Washington, DC: National Academy Press, 2002.

Wittenburg, David. “Testimony for Hearing on Supplemental Security Income Benefits for Children.” Presented at the Subcommittee on Human Resources, Committee on Ways and Means, U.S. House of Representatives. Washington, DC: Mathematica Policy Research, 2011.



1 Another way of looking at this method is that every case in the sample will have the same sampling weight, equal to the inverse of the probability of selection: 1/0.650 = 1.538. And because they will have the same sampling weight, the sample will be self-weighting.

2 We realize that enrollment is unlikely to be evenly paced; it probably will be higher at the beginning and end of the enrollment period for each program.

3 Most youth with disabilities can provide more accurate data on their school and work activities than can potential proxy respondents. If necessary, parents could assist youth rather than completing the entire interview for them.

4 Because Social Security Numbers for youth and their parents or guardians are not necessary for documenting informed consent for the managers of institutional homes for youth with disabilities, we will request that the PROMISE programs redact this information from copies of consent forms that they may provide to Mathematica.




