Supporting Statement Part B - 0799


Promoting Readiness of Minors in SSI (PROMISE) Evaluation - Interviews with Program Staff, and Focus Group Discussions

OMB: 0960-0799



Part B. Collection of Information Employing Statistical Methods

1. Respondent universe and sampling methods

Information about the Promoting Readiness of Minors in Supplemental Security Income (PROMISE) evaluation 60-month survey respondent universe and sampling methods is provided below.

a. Selection of programs for the PROMISE initiative

On September 30, 2013, the Department of Education (ED) announced the award of $211 million over five years to five individual states and one consortium of six states to design and implement PROMISE demonstration programs. These awards are in the form of cooperative agreements that entail an ongoing working relationship between the funding agency and the awardees to achieve the program objectives. The awardees are all individual state agencies that formed partnerships with other agencies for the purpose of implementing PROMISE. ED selected awardees through a competitive process that included publication of a request for applications in the May 21, 2013, Federal Register (78 FR 29733), preparation and submission of applications by state agencies, and external peer review of the applications by a panel that ED convened.

ED used the following criteria to evaluate the applications and select the agencies to which they awarded cooperative agreements:

  • The quality of the program design

  • The quality of the youth recruitment plan

  • The quality of the program management plan and program personnel

  • The significance of the program, including its potential to bring about systems change and the likely magnitude of anticipated outcomes

  • The capacity of the program for continuous feedback and improvement

Table B1 lists the lead PROMISE agencies, participating states, program names, and award amounts.

With support from ED, the Department of Labor (DOL), and the Department of Health and Human Services (HHS), the Social Security Administration (SSA) is evaluating the six PROMISE programs. SSA contracted with Mathematica Policy Research to conduct the evaluation.



Table B1. The PROMISE Programs

| Lead Agency | States | Program Name | Initial Award | Supplemental Funding | Total Award |
| --- | --- | --- | --- | --- | --- |
| Arkansas Department of Education | Arkansas | Arkansas PROMISE | $32,427,441 | $3,587,210 | $36,014,651 |
| Utah State Office of Rehabilitation | Consortium of states: Utah, South Dakota, North Dakota, Montana, Colorado, and Arizona | Achieving Success by Promoting Readiness for Education and Employment (ASPIRE) | $32,500,000 | $3,787,500 | $36,287,500 |
| California Department of Rehabilitation | California | California PROMISE (CaPROMISE) | $50,000,000 | $5,077,500 | $55,077,500 |
| Maryland Department of Disabilities | Maryland | Maryland PROMISE | $31,190,076 | $1,900,000 | $33,090,076 |
| New York Office of Mental Health | New York | New York State PROMISE (NYS PROMISE) | $32,500,000 | $950,779 | $33,450,779 |
| Wisconsin Department of Workforce Development | Wisconsin | Wisconsin PROMISE | $32,497,181 | $3,587,500 | $36,084,681 |

Sources: ED’s press release on PROMISE awards [http://www.ed.gov/news/press-releases/department-awards-211-million-promoting-readiness-minors-supplemental-security-i] and the PROMISE programs’ applications for supplemental funding.

b. Selection of youth and parents for the PROMISE evaluation 60-month survey

The youth and parents who enrolled in the PROMISE evaluation will be the respondent universe for the evaluation’s 60-month survey. In five programs (Arkansas PROMISE, ASPIRE, Maryland PROMISE, NYS PROMISE, and Wisconsin PROMISE), the evaluation randomly assigned about 2,000 youth and parents per program, a number sufficient to detect policy-relevant impacts (see section B2.h for more information regarding statistical power). The evaluator will attempt to interview all randomly assigned youth and parents in those programs. In CaPROMISE, the evaluator randomly assigned 3,097 youth and parents. Because 2,000 youth and parents are sufficient to detect policy-relevant impacts, the evaluator selected a random sample of 2,000 of them to attempt to interview for the 18-month survey. These youth and parents will also serve as the CaPROMISE sample for the 60-month survey.

2. Procedures for the collection of information

a. Recruiting survey respondents

The total enrollment period for the six PROMISE programs lasted 25 months, beginning in April 2014 and ending in April 2016. Survey cases were grouped into cohorts based on the month of enrollment. Because the first enrollment cohort is small (57 youth and parent cases), we will combine it with the second cohort, which is scheduled for release in May 2019. As a result, the 25 cohorts will be spread across 24 monthly releases from May 2019 through April 2021. Each eligible parent and youth will have up to 24 weeks to respond to the 60-month survey, meaning that a parent and youth will complete the survey by the end of the 65th month following enrollment in the evaluation. The 24 monthly cohort releases and the 24-week field period for each cohort yield a 29-month survey field period (ending in September 2021); the sketch below illustrates this arithmetic.
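The release schedule can be checked with a few lines of code. The following Python sketch is purely illustrative (it is not part of the evaluation’s systems) and reproduces the schedule described above: 24 monthly releases beginning 60 months after the May 2014 cohort, each with a 24-week field window.

```python
from datetime import date, timedelta

def add_months(d: date, months: int) -> date:
    """Return the first day of the month `months` after d's month."""
    total = d.year * 12 + (d.month - 1) + months
    return date(total // 12, total % 12 + 1, 1)

# 25 enrollment cohorts, April 2014 through April 2016 (day 1 used as a proxy).
enrollment = [add_months(date(2014, 4, 1), i) for i in range(25)]

# Cohort 1 is folded into cohort 2, so releases track enrollment months 2-25,
# each released 60 months after enrollment: May 2019 through April 2021.
releases = [add_months(m, 60) for m in enrollment[1:]]
field_ends = [r + timedelta(weeks=24) for r in releases]  # 24-week window

print(len(releases), releases[0], releases[-1])  # 24 2019-05-01 2021-04-01
print(max(field_ends))  # 2021-09-16, a field period ending September 2021
```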

As of May 2018, there were 11,416 youth and 11,324 parents eligible for the 60-month survey (Tables B2 and B3).1 All youth who enrolled in the PROMISE evaluation and were randomly assigned are eligible for the survey unless (1) the youth is deceased or (2) the youth was not selected for the CaPROMISE survey sample. All parents or guardians who enrolled in the PROMISE evaluation and were randomly assigned are eligible for the survey unless (1) the parent or guardian is deceased; (2) the youth is deceased; (3) the parent or guardian represents an agency that has guardianship of the youth, such as a group home or a child welfare agency; or (4) the parent or guardian was not selected for the CaPROMISE survey sample. Data collection will span 29 months, with a rolling release of sample cohorts that mirrors the months of study enrollment. As with the 18-month survey, youth and parent cases are aggregated into cohorts and released by month to simplify sample management. Cohorts range in size from 57 to 1,477 youth and parent cases, with an average of 910 youth and parent cases per cohort.
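The eligibility rules above reduce to a pair of simple predicates. The Python sketch below expresses them for clarity; the field names are illustrative assumptions, not the evaluation’s actual data schema.

```python
from dataclasses import dataclass

@dataclass
class Case:
    program: str               # "AR", "ASPIRE", "CA", "MD", "NY", or "WI"
    youth_deceased: bool
    parent_deceased: bool
    agency_guardian: bool      # guardian is an agency (e.g., a group home)
    in_ca_survey_sample: bool  # True for the 2,000 retained CaPROMISE cases

def youth_eligible(c: Case) -> bool:
    """Youth rule: randomly assigned, alive, and (in CA) in the survey sample."""
    if c.youth_deceased:
        return False
    return c.program != "CA" or c.in_ca_survey_sample

def parent_eligible(c: Case) -> bool:
    """Parent rule adds the deceased-youth and agency-guardian exclusions."""
    if c.parent_deceased or c.youth_deceased or c.agency_guardian:
        return False
    return c.program != "CA" or c.in_ca_survey_sample
```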

Table B2. Youth cases eligible for 60-month survey by cohort and program

| Cohort | Enrollment month | AR | ASPIRE | CA | MD | NY | WI | Total | % of total eligible |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| 1 | April 2014 | 0 | 0 | 0 | 17 | 0 | 12 | 29 | 0.3 |
| 2 | May 2014 | 0 | 0 | 0 | 53 | 0 | 76 | 129 | 1.1 |
| 3 | June 2014 | 0 | 0 | 0 | 62 | 0 | 74 | 136 | 1.2 |
| 4 | July 2014 | 0 | 0 | 0 | 66 | 0 | 66 | 132 | 1.2 |
| 5 | August 2014 | 0 | 0 | 14 | 54 | 0 | 78 | 146 | 1.3 |
| 6 | September 2014 | 35 | 0 | 129 | 63 | 0 | 42 | 269 | 2.4 |
| 7 | October 2014 | 56 | 17 | 122 | 70 | 4 | 45 | 314 | 2.8 |
| 8 | November 2014 | 139 | 27 | 108 | 57 | 15 | 41 | 387 | 3.4 |
| 9 | December 2014 | 241 | 89 | 68 | 73 | 5 | 14 | 490 | 4.3 |
| 10 | January 2015 | 85 | 97 | 85 | 93 | 30 | 51 | 441 | 3.9 |
| 11 | February 2015 | 81 | 98 | 66 | 81 | 27 | 74 | 427 | 3.7 |
| 12 | March 2015 | 122 | 121 | 68 | 80 | 34 | 70 | 495 | 4.3 |
| 13 | April 2015 | 124 | 90 | 92 | 77 | 47 | 64 | 494 | 4.3 |
| 14 | May 2015 | 141 | 116 | 75 | 75 | 56 | 48 | 511 | 4.5 |
| 15 | June 2015 | 114 | 134 | 130 | 98 | 78 | 67 | 621 | 5.4 |
| 16 | July 2015 | 38 | 122 | 148 | 112 | 134 | 103 | 657 | 5.8 |
| 17 | August 2015 | 44 | 103 | 143 | 118 | 145 | 110 | 663 | 5.8 |
| 18 | September 2015 | 54 | 98 | 154 | 105 | 174 | 77 | 662 | 5.8 |
| 19 | October 2015 | 82 | 61 | 121 | 107 | 214 | 88 | 673 | 5.9 |
| 20 | November 2015 | 99 | 76 | 125 | 122 | 207 | 111 | 740 | 6.5 |
| 21 | December 2015 | 86 | 88 | 84 | 99 | 233 | 79 | 669 | 5.9 |
| 22 | January 2016 | 73 | 63 | 82 | 94 | 194 | 59 | 565 | 4.9 |
| 23 | February 2016 | 84 | 118 | 96 | 73 | 205 | 106 | 682 | 6.0 |
| 24 | March 2016 | 87 | 194 | 40 | 0 | 131 | 114 | 566 | 5.0 |
| 25 | April 2016 | 14 | 213 | 42 | 0 | 32 | 217 | 518 | 4.5 |
| Total | | 1,799 | 1,925 | 1,992 | 1,849 | 1,965 | 1,886 | 11,416 | 100.0 |



Table B3. Parent cases eligible for 60-month survey by cohort and program

| Cohort | Enrollment month | AR | ASPIRE | CA | MD | NY | WI | Total | % of total eligible |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| 1 | April 2014 | 0 | 0 | 0 | 17 | 0 | 11 | 28 | 0.2 |
| 2 | May 2014 | 0 | 0 | 0 | 53 | 0 | 76 | 129 | 1.1 |
| 3 | June 2014 | 0 | 0 | 0 | 62 | 0 | 72 | 134 | 1.2 |
| 4 | July 2014 | 0 | 0 | 0 | 66 | 0 | 62 | 128 | 1.1 |
| 5 | August 2014 | 0 | 0 | 14 | 53 | 0 | 77 | 144 | 1.3 |
| 6 | September 2014 | 34 | 0 | 129 | 61 | 0 | 42 | 266 | 2.3 |
| 7 | October 2014 | 56 | 17 | 122 | 69 | 4 | 45 | 313 | 2.8 |
| 8 | November 2014 | 135 | 27 | 108 | 56 | 15 | 41 | 382 | 3.4 |
| 9 | December 2014 | 234 | 89 | 68 | 71 | 4 | 14 | 480 | 4.2 |
| 10 | January 2015 | 85 | 96 | 84 | 93 | 30 | 51 | 439 | 3.9 |
| 11 | February 2015 | 80 | 98 | 66 | 81 | 27 | 74 | 426 | 3.8 |
| 12 | March 2015 | 120 | 119 | 65 | 78 | 34 | 70 | 486 | 4.3 |
| 13 | April 2015 | 123 | 90 | 92 | 77 | 45 | 64 | 491 | 4.3 |
| 14 | May 2015 | 140 | 115 | 74 | 74 | 56 | 48 | 507 | 4.5 |
| 15 | June 2015 | 114 | 133 | 130 | 98 | 77 | 65 | 617 | 5.4 |
| 16 | July 2015 | 38 | 118 | 146 | 110 | 134 | 103 | 649 | 5.7 |
| 17 | August 2015 | 44 | 103 | 142 | 117 | 145 | 108 | 659 | 5.8 |
| 18 | September 2015 | 53 | 98 | 153 | 105 | 172 | 77 | 658 | 5.8 |
| 19 | October 2015 | 79 | 61 | 121 | 107 | 211 | 88 | 667 | 5.9 |
| 20 | November 2015 | 97 | 76 | 124 | 122 | 207 | 111 | 737 | 6.5 |
| 21 | December 2015 | 86 | 87 | 84 | 99 | 233 | 78 | 667 | 5.9 |
| 22 | January 2016 | 71 | 63 | 82 | 94 | 192 | 58 | 560 | 4.9 |
| 23 | February 2016 | 83 | 118 | 96 | 73 | 204 | 106 | 680 | 6.0 |
| 24 | March 2016 | 86 | 192 | 40 | 0 | 130 | 114 | 562 | 5.0 |
| 25 | April 2016 | 14 | 211 | 42 | 0 | 31 | 217 | 515 | 4.5 |
| Total | | 1,772 | 1,911 | 1,982 | 1,836 | 1,951 | 1,872 | 11,324 | 100.0 |


The following sections describe key features of the data collection plan, including: (1) target respondents and expected completes by mode; (2) survey incentives; (3) use of responsive survey design; (4) language of interview administration; (5) survey protocols for special populations; and (6) monitoring production and data quality during the field period.

b. Target respondents

The target respondent for the parent survey will be the parent/guardian who helped the youth enroll in PROMISE and signed the enrollment consent form. It is also likely to be the parent or guardian who is most engaged in the youth’s receipt of PROMISE services (if the youth is in the treatment group). The target respondent for the youth survey is the youth who enrolled in PROMISE and provided assent. Proxy respondents will be permitted for either the parent or the youth interview, as needed. Based on findings from the National Longitudinal Transition Study-2012 survey of transition-aged youth and their parents, as well as the PROMISE 18-month survey, the evaluator expects to complete the youth and parent interviews on the same day for at least 50 percent of sample cases.

The computer-assisted telephone interviewing (CATI)/computer-assisted personal interviewing (CAPI) system for the PROMISE evaluation will be designed to allow either the youth or the parent interview to be completed first. At the end of each interview, the system provides text prompting the interviewer to ask to speak with the person linked to the pending case or to set an appointment to do so, as applicable. At the end of each youth interview, the interviewer will ask to speak to or set an appointment with the parent if the parent has not yet completed an interview. The interviewer will do the same for the youth at the end of each parent interview.

The evaluator will field the 60-month survey across three modes (telephone, field, and mail). Based on the 18-month survey results, across all programs it is anticipated that 75 percent of interviews will be completed by telephone, 23 percent in the field, and 2 percent by mail. Mathematica will use its sample management system to (1) release eligible cases and ensure they are worked as intended; (2) mail invitation and reminder letters and incentive payments; and (3) track and store sample cases’ updated contact information. Field interviewers will use the sample management system to manage their assigned cases and track contact attempts. The interviewing period for each cohort will be 24 weeks. Over the full 29-month survey period, Mathematica’s data collection managers will use a range of production reports to monitor the data collection effort and ensure it aligns with production goals and anticipated costs. They will also monitor the quality of the data collected and the response rates for each program, as well as for different sample groups within each program (such as treatment and control groups, age groups, and alternate-language cases). Because of the eligibility criteria, the parent and youth surveys will be fielded concurrently but managed separately. However, the evaluator will leverage the paired nature of the cases in its locating and other outreach efforts. The parent and youth surveys each have a target response rate of 80 percent.

c. Survey incentives

Each survey respondent will be offered a base incentive of $30 for completing an interview. A bonus incentive will be offered to sample members who call in to complete an interview within 12 days of their survey cohort’s launch. The bonus incentive will be $10 for sample members with a high propensity to respond and $20 for sample members with a low propensity to respond. This differential bonus offsets follow-up costs associated with more difficult-to-reach cases by generating completes from early responders who call in to complete an interview and by providing greater motivation for the hardest-to-reach cases to respond. By deploying a differential bonus, resources can be targeted to sample cases that otherwise are likely to require intensive efforts to locate, contact, or gain cooperation for interviews. Mathematica used a similar bonus incentive for the PROMISE 18-month survey, offering a base incentive of $30 and a $10 bonus to sample members who called in to complete an interview within ten days of their survey cohort’s launch. Based on that experience, Mathematica anticipates that 15 to 20 percent of 60-month survey respondents will receive the bonus incentive. Table B4 describes the groups and incentive structure in greater detail.

Table B4. Proposed incentives for 60-month survey based on 18-month survey response

| Parent survey | Youth survey | Incentive per respondent^a | Survey group |
| --- | --- | --- | --- |
| 60-month eligible; 18-month respondent | 60-month eligible; 18-month respondent | $30 base + $10 early call-in bonus ($40 total possible) | A |
| 60-month ineligible | 60-month eligible; 18-month respondent | $30 base + $10 early call-in bonus ($40 total possible) | A |
| 60-month eligible; 18-month respondent | 60-month ineligible | $30 base + $10 early call-in bonus ($40 total possible) | A |
| 60-month eligible; 18-month respondent | 60-month eligible; 18-month non-respondent | $30 base + $20 early call-in bonus ($50 total possible) | B |
| 60-month eligible; 18-month non-respondent | 60-month eligible; 18-month respondent | $30 base + $20 early call-in bonus ($50 total possible) | B |
| 60-month eligible; 18-month non-respondent | 60-month ineligible | $30 base + $20 early call-in bonus ($50 total possible) | B |
| 60-month ineligible | 60-month eligible; 18-month non-respondent | $30 base + $20 early call-in bonus ($50 total possible) | B |
| 60-month eligible; 18-month non-respondent | 60-month eligible; 18-month non-respondent | $30 base + $20 early call-in bonus ($50 total possible) | B |
| 60-month ineligible | 60-month ineligible | n/a (case not released for survey) | n/a |

^a Incentives shown are based on each respondent in the parent-youth pair. Therefore, survey respondents in group A could each receive up to $40 for completing the survey ($80 total for parent and youth). Respondents in group B could each receive up to $50 ($100 total).


For the vast majority of cases in the 18-month survey (81 percent), both the parent and youth completed their interviews. In the remaining cases, neither completed (14.6 percent), the parent completed but the youth did not (4 percent), or the youth completed but the parent did not (0.4 percent). The evaluator will conduct group assignment at the case dyad level to avoid circumstances in which individuals become disinclined to take part because they feel they should have been offered the same (higher) incentive as the other member of the case. For all cases, the incentive will be provided in a single gift card.
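The dyad-level assignment implied by Table B4 reduces to one rule: a case is in group B if any eligible member of the pair was an 18-month non-respondent, and in group A otherwise. A minimal Python sketch of that rule, with illustrative names:

```python
from typing import Optional

def assign_incentive_group(parent_18m_respondent: Optional[bool],
                           youth_18m_respondent: Optional[bool]) -> Optional[str]:
    """Return 'A' ($10 bonus), 'B' ($20 bonus), or None if the case is not
    released. Pass None for a member ineligible for the 60-month survey."""
    statuses = [s for s in (parent_18m_respondent, youth_18m_respondent)
                if s is not None]
    if not statuses:
        return None                       # neither member eligible: not released
    return "A" if all(statuses) else "B"  # any 18-month non-respondent -> B

BONUS = {"A": 10, "B": 20}                # early call-in bonus on the $30 base
assert assign_incentive_group(True, True) == "A"
assert assign_incentive_group(True, False) == "B"
assert assign_incentive_group(None, False) == "B"
assert assign_incentive_group(None, None) is None
```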

Mathematica will distribute incentives through gift cards, and survey respondents will be offered a choice of a Visa, Target, or Walmart gift card. Gift cards will be mailed to respondents who complete interviews by telephone or on paper, and distributed in person to respondents who complete interviews with a field interviewer.

d. Responsive survey design

To optimize project resources and deploy best practices in survey methodology, Mathematica will use a responsive survey design for the 60-month survey (Groves 2006; Brick et al. 2017; Durivant et al. 2017; Axinn et al. 2011; Couper 2017). This approach breaks follow-up efforts into sequential phases that seek to mitigate both unit and item nonresponse while making best use of project resources. Mathematica will use five phases, as shown in Table B5. Cases that do not complete an interview in one phase will either move on to the next phase or be finalized (as refusals, unlocatable cases, or cases with language barriers other than Spanish). Finalized cases will not receive any further follow-up.

Table B5. Responsive survey design for 60-month survey by phase

| Phase | Activity | Pending cases included | Phase begins |
| --- | --- | --- | --- |
| 1 | Inbound calls with $10 or $20 early-responder bonus | All receiving survey invitation outreach | Launch of cohort field period |
| 2 | Outbound calls | All with working telephone numbers available | Cohort launch date plus 13 days |
| 3 | Supervisor review, in-house locating | All | After all available telephone numbers hit maximum attempts or no telephone numbers are viable for dialing |
| 4 | Field locating, interviewing | All with residential addresses available from prior records or in-house locating efforts, where field staff are based, or where a cluster of cases makes travel viable | After supervisor review and locating are completed, on a flow basis (approximately week 10 for 18-month survey nonrespondents and week 12 for 18-month survey respondents) |
| 5 | Mail (abbreviated questionnaire) | All with a viable mailing address | Week 23 |
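Read as a flow, Table B5 implies a routing rule for each pending case. The Python sketch below is one way to express it; the attribute names are assumptions made for illustration, not fields of the actual sample management system.

```python
from dataclasses import dataclass
from enum import IntEnum
from typing import Optional

class Phase(IntEnum):
    INBOUND = 1   # early call-in window with bonus incentive
    OUTBOUND = 2  # CATI dialing of all known numbers
    REVIEW = 3    # supervisor review and in-house locating
    FIELD = 4    # CAPI locating and interviewing
    MAIL = 5      # abbreviated paper questionnaire

@dataclass
class PendingCase:
    phase: Phase = Phase.INBOUND
    days_since_release: int = 0
    has_working_phone: bool = True
    phones_exhausted: bool = False
    field_viable: bool = False      # address known and near staff or other cases
    has_mailing_address: bool = True

def next_phase(c: PendingCase) -> Optional[Phase]:
    """Advance a pending (non-finalized) case per Table B5, or return None
    to hold it in its current phase pending finalization."""
    if c.phase == Phase.INBOUND and c.days_since_release >= 13:
        return Phase.OUTBOUND if c.has_working_phone else Phase.REVIEW
    if c.phase == Phase.OUTBOUND and c.phones_exhausted:
        return Phase.REVIEW
    if c.phase == Phase.REVIEW:
        return Phase.FIELD if c.field_viable else Phase.MAIL
    if c.phase == Phase.FIELD and c.has_mailing_address:
        return Phase.MAIL           # week 23 mailing to all viable addresses
    return None

print(next_phase(PendingCase(days_since_release=13)))  # Phase.OUTBOUND
```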


The survey process begins with an advance notification letter from Mathematica, inviting the youth and the enrolling parent to contact us to complete the interview. Mathematica will leverage findings from the recent National Beneficiary Survey experiment, which found that a “concrete” approach to the survey invitation yielded the highest percentage of inbound calls (Johnson et al. 2017). In contrast to a standard approach to such invitations, this format directs sample members to call a specific telephone number to exercise one of three options: (1) complete the interview; (2) schedule an appointment for later completion; or (3) decline to participate in the survey. Mathematica will customize the letter with case-specific text pertaining to the PROMISE program site, the end date for the early call-in bonus, and the differential incentive ($10/$20). Based on results from the 18-month survey, combined with the findings from the National Beneficiary Survey, Mathematica anticipates that about 20 percent of enrolling parents and 15 percent of youth will call in during phase 1 and receive the bonus.

Phases 2 through 5 comprise the outbound calls and a series of reminder mailings (see Appendix B for the timing of the mailings). At the start of phase 2, Mathematica will leverage 18-month survey administration data to schedule the first call attempt for the day of the week and time of day when the respondent completed the 18-month interview (for all 18-month survey respondents). From there, Mathematica will continue call attempts through all available telephone numbers linked to each case. Mathematica will send mailings during the remaining weeks of the survey period to all outstanding sample cases to (1) encourage them to participate and let them know that an interviewer will be contacting them soon by telephone or in person; (2) respond to concerns they may have about the study; and (3) notify them the survey will be ending soon and that their unique experiences and input are critical to the success of the study. Mathematica will also reach out to additional contacts provided during the 18-month survey if data collection staff encounter difficulty locating youth or parents. In all contacts with sample members, Mathematica will stress that their participation in the survey is voluntary and their SSA or other program benefits will not be affected regardless of whether they participate. Appendix B shows the survey outreach activities by week of the field period. Additional mailings, sent as needed, will include refusal and locating letters and letters to enrolling parents for cases where the enrolling parent interview is completed and the youth is still pending.

Mathematica anticipates that the majority of interviews (75 percent) will be completed via CATI. Mathematica’s survey operations center is open seven days a week and can accept call-ins any time it is open. Interviewers will make outbound calls from 9 a.m. to 9 p.m. on weekdays, from 10 a.m. to 5 p.m. on Saturdays, and from noon to 9 p.m. on Sundays (sample members’ time). Sample members will not be contacted after 9:00 p.m. local time unless requested by a sample member. Further, if a sample member requests not to be contacted during a specific timeframe, interviewers will record this information in the sample management system so staff can adhere to the request. Mathematica will follow the same protocol if a sample member requests not to be called on a specific telephone number, such as a work telephone number.
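These calling windows are straightforward to encode. A minimal sketch follows, assuming the sample member’s local time has already been resolved; the do-not-call handling is an illustrative simplification.

```python
from datetime import datetime

# Permitted outbound calling windows in the sample member's local time
# (weekday 0 = Monday ... 6 = Sunday), per the protocol above.
CALL_WINDOWS = {
    **{d: (9, 21) for d in range(5)},  # weekdays: 9 a.m. to 9 p.m.
    5: (10, 17),                       # Saturday: 10 a.m. to 5 p.m.
    6: (12, 21),                       # Sunday: noon to 9 p.m.
}

def may_dial(local_now: datetime, blocked_hours=frozenset()) -> bool:
    """True if an outbound attempt is allowed at the member's local time.
    `blocked_hours` holds hours the member asked not to be called."""
    start, end = CALL_WINDOWS[local_now.weekday()]
    return start <= local_now.hour < end and local_now.hour not in blocked_hours

print(may_dial(datetime(2019, 5, 20, 8, 30)))  # Monday 8:30 a.m. -> False
print(may_dial(datetime(2019, 5, 25, 16, 0)))  # Saturday 4 p.m. -> True
```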

Some sample members will be difficult to locate or contact or will require an in-person interview because of a disabling condition. Field staff will use CAPI to complete interviews with such cases. Mathematica anticipates completing about 23 percent of all interviews via CAPI. Field follow-up will occur in phase 4 of the interviewing period for each monthly cohort of sample cases, with up to 13 weeks of field work following 10 to 12 weeks of work in the survey operations center. Once a case is sent to the field, it will be retired from outbound calls. Field staff will conduct interviews using tablet computers either in the sample member’s home or at an agreed-upon alternate location. If, for a given sample case, the parent interview has not yet been completed at the time of the youth interview (or vice versa), the field interviewer will capitalize on the rapport established with the respondent to solicit information and assistance in locating and contacting the other member of the case. Field staff will record all contacts and adhere to the study protocols reviewed in training.

Offering the survey in different modes will increase the likelihood of participation for cases who may not be able to participate in a given mode. For example, those without telephone service or access to telephones, or who are wary of telephone contact with strangers, will likely not respond to outreach in phase 1 or 2. Case review in phase 3 might conclude that no other telephone numbers can be found and perhaps no viable addresses are available because all mailings have been returned as undeliverable. Field follow-up (phase 4) is a useful resource for such cases; Mathematica can send staff to visit last known addresses and make contact with friends, relatives, neighbors, or other informants who can help reach the parent and youth. However, not all cases will be eligible for field follow-up in phase 4, because some are not concentrated in close proximity to other cases, making in-person contact extremely costly and inefficient. Therefore, in phase 5 Mathematica plans to offer all non-responding, non-finalized cases the opportunity to complete the survey by mail, in an abbreviated format, to attempt to reach these individuals. The abbreviated mail format can also address reasons for nonresponse related to survey length and facilitate completion by individuals who may require assistance from a close contact because they speak a language other than English or Spanish.

For the 18-month survey, the self-administered version of the questionnaire was offered only to ASPIRE enrollees who resided in rural and frontier areas that were ineligible for field follow-up because the cases were too few and too geographically dispersed. Those in regions without field follow-up were sent the mailing twice over the field period (once in week 15 and again in week 19 if no response was received). The 60-month mailing protocol includes just one mailing of this questionnaire across all programs. The plan is based on the diminished returns experienced with the second mailing in the 18-month survey. Offering the abbreviated mail survey in all the programs can reduce nonresponse by addressing the particular reasons for nonresponse described above. Those who complete the abbreviated questionnaire will receive the same $30 gift card as those who complete the full interview.

e. Language of interview administration

Interviews will be conducted primarily in English and Spanish, with instruments in both languages available in the CATI system and in the abbreviated questionnaires sent by mail. Based on 18-month survey data, 12 percent of parent cases and 11 percent of enrolled youth are Spanish-speaking. All Spanish-speaking interviewers will have completed professional certification to ensure they are qualified to conduct the interview in Spanish. Cases designated as Spanish-speaking from the 18-month survey or from enrollment will be worked by bilingual interviewers only. If an English-speaking interviewer identifies a new Spanish-language case, the interviewer will transfer the case to an available Spanish-speaking interviewer or make arrangements for the interview to be completed in Spanish at a later time.

The evaluation enrolled some youth and parents or guardians who speak neither English nor Spanish. However, based on results from the 18-month survey, these cases accounted for less than 2 percent of all research cases in the 60-month survey. Of these, only a small number (13 parents, 9 youth) were finalized as non-completes because of language barriers at the end of the 18-month survey. These parents and youth did not have another person who could help them complete the interview, either in a supported format (using translation, where needed) or as a proxy on their behalf. When attempting to interview people who speak neither English nor Spanish for the 60-month survey, Mathematica will seek to complete the interview using bilingual interviewers. To ensure as much standardization as possible in how questions are asked and terms are communicated in the non-translated languages, all the bilingual interviewers will be trained to conduct the 60-month survey in English. When conducting interviews in languages other than English or Spanish, these interviewers will interpret from and code the survey responses directly into the English version of the CATI/CAPI instrument. This approach ensures that all interviews are subject to the same rigorous data quality checks regardless of the language of administration.

Because of the strategies described above (that is, use of bilingual staff, assisted interviews, proxy interviews, and potential translation of the abbreviated questionnaire), SSA will not pursue the use of outside translator services.

f. Survey protocols for special populations

In addition to the data collection strategies described above, unique features of the ASPIRE program necessitate special survey strategies for subpopulations of enrollees. These features, and the proposed strategies, are described below.

ASPIRE enrollees include a nontrivial proportion of Native Americans, some of whom might reside on reservations. ASPIRE collected self-reported data from parents and youth who identified as belonging to a Native American tribe. In data that ASPIRE provided in November 2017, 120 youth and 122 parents among ASPIRE’s 1,934 research cases self‑identified as belonging to a Native American tribe (6.2 and 6.3 percent, respectively). Of these, 101 pairs of parents and youth self-identified as belonging to a Native American tribe (5.2 percent of all ASPIRE survey cases). This population is considered hard to survey for several reasons, including (1) mistrust of outside researchers, who may be perceived as judgmental; (2) concerns about how the survey data will be used; (3) high concentrations of poverty and other household complexities; and (4) reduced access to telephone service as a result of limited household resources or cultural norms (Basto et al. 2012; Brugge and Missaghian 2006; Getrich et al. 2013; Gilder et al. 2013; Hodge et al. 2010; Israel et al. 2008; Jones 2008; Ver Ploeg et al. 2002). To address these challenges, Mathematica collaborated with ASPIRE staff to build on the positive outreach they have conducted with tribal leaders. Further, prior to launching the 18-month survey in ASPIRE, SSA sent a letter to tribal leaders to inform them of the study and to obtain their endorsements for the survey. In response, leaders of the Sisseton Wahpeton Oyate requested that the evaluator not conduct interviews with members of their tribe without first securing approval from their IRB. At SSA’s request, Mathematica did not include the one eligible case from this tribe in the survey outreach efforts. Overall, the survey outreach strategies were successful in reaching this population, demonstrated by completed interviews with 87 of these youth and 92 parents, with 72 completed dyads (response rates of 72.5, 75.4, and 71.3 percent, respectively).2 Mathematica will use similar efforts in working with this population for the 60-month survey and will assist SSA in identifying the appropriate tribal contacts to whom to send a letter similar to that sent for the 18-month survey. Mathematica will work with leaders thereafter to determine how best to conduct outreach to reservation-based sample cases.

The ASPIRE program serves not only rural but also frontier areas (geographic areas with extremely low population density), in which exceptionally long distances may exist between households.3 Mathematica analyzed the ZIP codes linked to best known addresses for ASPIRE research cases and found that 7.2 percent reside in frontier areas. Mathematica will attempt to complete the 60-month interview by telephone with sample members in frontier areas, using whatever accommodations might be necessary. When necessary and feasible, Mathematica will use alternative means of communication, such as WebEx, to connect with sample cases using Voice over Internet Protocol. If cases are unreachable by telephone and have no Internet access, Mathematica will determine whether a sufficient concentration exists to make efficient use of field interviewers. Finally, in phase 5, Mathematica will mail the abbreviated questionnaires to all nonresponding enrolling parents and youth.
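A frontier-share figure like the one above can be derived by linking case ZIP codes to the USDA Frontier and Remote (FAR) area codes cited in footnote 3. A hedged sketch of that linkage follows; the file names, column names, and the use of FAR level 4 as the frontier threshold are assumptions for illustration, not the evaluator’s actual procedure.

```python
import pandas as pd

# Hypothetical inputs: cases.csv holds best-known ZIP codes for ASPIRE cases;
# far_codes.csv is the USDA FAR area file keyed by ZIP code.
cases = pd.read_csv("cases.csv", dtype={"zip": str})
far = pd.read_csv("far_codes.csv", dtype={"zip": str})

merged = cases.merge(far[["zip", "far_level"]], on="zip", how="left")

# Treat the most remote FAR level as "frontier" (an assumption; the document
# defines frontier as 60+ minutes from an urban area of 50,000 or more).
frontier_share = (merged["far_level"].fillna(0) >= 4).mean()
print(f"{frontier_share:.1%} of ASPIRE cases reside in frontier areas")
```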

Finally, cases from any of the programs, especially control group members, might not remember enrolling in the PROMISE evaluation. To address their concerns, Mathematica will provide a copy of their signed consent/assent forms upon request if the forms are available. These forms may also be useful in helping survey staff work with gatekeepers to gain permission to contact youth who are institutionalized or incarcerated. Because the majority of the PROMISE programs will no longer be in operation at the time of the 60‑month survey, Mathematica has asked SSA to request these forms from the programs during the close-out process. The forms will be stored securely at either SSA or Mathematica.

g. Monitoring production and data quality

To ensure the data collected are of high quality, all interviewers will receive regular, ongoing feedback on their work during the survey period. This will include monitoring their performance in engaging sample members and conducting the interviews, as well as providing them with statistics on their productivity relative to the entire team of interviewers (such as attendance, rates of refusal, and hours per completed case). Mathematica’s survey operations center managers, many of whom are highly skilled former interviewers, will provide this feedback to the telephone interviewers. Field interviewers will also meet with their managers regularly to receive ongoing feedback on their production statistics, debrief on challenging cases, and prioritize their workload. In addition, a portion of all field interviews will be validated. The process entails (1) selecting cases for validation (a random subset of 10 percent of each field interviewer’s completed interviews, as well as outliers for length of interview, manually identified by managers); (2) contacting these cases by mail and then by telephone (if no response) to confirm that the interviews took place and were conducted in a professional manner; (3) reviewing the responses of these cases to look for missing data, suspiciously similar responses, or incongruent responses; and (4) reviewing the electronic signatures from respondent payment records to ensure a variety of handwriting is observed, as anticipated. Mathematica may also use GPS data, where needed, to verify the location of an interview. Finally, managerial review of frequency distributions of critical data elements and open-ended responses may identify field interviewers who need retraining.
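The validation-sampling step can be expressed as a short routine. The Python sketch below is illustrative, with assumed data structures; outlier flags are taken as given from managers’ review.

```python
import random

def select_for_validation(completes, rate=0.10, seed=0):
    """Pick each field interviewer's validation cases: a random `rate` share
    of completes plus manager-flagged interview-length outliers.
    `completes` maps interviewer ID -> list of (case_id, minutes, is_outlier)."""
    rng = random.Random(seed)
    selected = {}
    for fi, cases in completes.items():
        k = max(1, round(rate * len(cases)))
        random_pick = {c[0] for c in rng.sample(cases, k)}
        outliers = {c[0] for c in cases if c[2]}
        selected[fi] = sorted(random_pick | outliers)
    return selected

demo = {"FI-01": [("c1", 45, False), ("c2", 12, True), ("c3", 50, False)]}
print(select_for_validation(demo))  # always includes outlier c2 plus a random pick
```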

h. Statistical power/precision estimates

Even with an experimental design, the evaluation requires sample sizes large enough to provide sufficient statistical precision to detect policy-relevant impacts. The PROMISE evaluation has a sample of about 2,000 youth and parents per program in Arkansas PROMISE, ASPIRE, Maryland PROMISE, NYS PROMISE, and Wisconsin PROMISE, and a sample of about 3,100 youth and parents in CaPROMISE. Using these sample sizes, the evaluator calculated the minimum impacts the evaluation might expect to detect using administrative or survey data on five key outcomes for youth: (1) employment in paid jobs, (2) annual earnings, (3) enrollment in school, (4) SSI payment receipt, and (5) annual SSI payment amount.

The minimum detectable impacts (MDIs) in Table B6 suggest that the study samples will support the detection of meaningful impacts. For example, in five of the six programs, the evaluation is expected to detect impacts of five percentage points or larger on employment in paid jobs estimated using administrative data, and six percentage points or larger using survey data for the full samples; the evaluation expects to detect impacts of four percentage points or larger using administrative data in CaPROMISE because of its larger sample size. Evaluations of interventions providing transition services to youth with disabilities found short-term impacts on employment rates that are larger than these MDIs. For example, in SSA’s Youth Transition Demonstration evaluation, three of the six projects showed estimated impacts on the likelihood of being employed in a paid job during the 12 months following enrollment of between 9 and 19 percentage points (Fraker 2013).

The study samples will also be sufficient to detect policy-relevant impacts for important subgroups. For example, the evaluation will be able to detect a program impact of eight percentage points or larger on paid employment using 50 percent samples of the survey respondents, such as female or male evaluation enrollees. It will be able to detect an impact of 11 percentage points or more on the likelihood of youth being employed in paid jobs during the year following enrollment even using 25 percent survey samples, such as youth who had any work experience prior to enrollment in the evaluation. However, for two of the three Youth Transition Demonstration projects with statistically significant employment impacts during the year following enrollment, the impacts were 9 percentage points (Fraker 2013). Table B6 indicates that the evaluation will not be able to detect impacts of that magnitude by the PROMISE programs at the 95 percent confidence level based on 25 percent survey samples.

Table B6. Minimum Detectable Impacts

| Sample size | Employed in Paid Jobs | Annual Earnings | Enrolled in School | SSI Receipt | Annual SSI Payments |
| --- | --- | --- | --- | --- | --- |
| Assumed mean value of outcome for control group members | 23% | $900 | 88% | 99% | $6,500 |
| Follow-Up Data from Administrative Records | | | | | |
| CaPROMISE: 3,100 (full sample) | 4% | $287 | n.a. | 1% | $220 |
| CaPROMISE: 1,550 (50% sample) | 6% | $405 | n.a. | 1% | $311 |
| Other programs: 2,000 (full sample) | 5% | $357 | n.a. | 1% | $274 |
| Other programs: 1,000 (50% sample) | 7% | $505 | n.a. | 2% | $387 |
| Follow-Up Data from Surveys | | | | | |
| All programs: 1,600 (full sample) | 6% | $399 | 4% | n.a. | n.a. |
| All programs: 800 (50% sample) | 8% | $564 | 6% | n.a. | n.a. |
| All programs: 400 (25% sample) | 11% | $798 | 9% | n.a. | n.a. |

Notes: MDI calculations assume (1) an equal number of treatment and control members, (2) a 95 percent confidence level with an 80 percent level of power, (3) a two-tailed test, (4) a reduction in variance of 10 percent owing to the use of regression models, (5) standard deviations of annual earnings and annual SSI payments of $3,000 and $2,300, respectively, (6) administrative data obtained on 100 percent of the sample, and (7) survey response rates of 80 percent. Mean values of outcomes for control group members are based on findings from the YTD evaluation’s 12-month impact analysis.

n.a. = not applicable.
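The MDI values in Table B6 follow from the standard minimum-detectable-effect formula under the assumptions listed in the notes. The Python sketch below reproduces several table entries; it illustrates the calculation and is not the evaluator’s actual code.

```python
from statistics import NormalDist
from typing import Optional

def mdi(n_released: int, p: Optional[float] = None, sd: Optional[float] = None,
        alpha: float = 0.05, power: float = 0.80,
        r2: float = 0.10, response_rate: float = 1.0) -> float:
    """MDI for an even treatment/control split, two-tailed test, and a 10
    percent variance reduction from regression adjustment (see table notes)."""
    z = NormalDist().inv_cdf
    n_arm = n_released * response_rate / 2            # expected completes per arm
    var = sd ** 2 if sd is not None else p * (1 - p)  # outcome variance
    se = ((1 - r2) * var * (2 / n_arm)) ** 0.5        # SE of the impact estimate
    return (z(1 - alpha / 2) + z(power)) * se

# Administrative data, non-California programs (full sample of 2,000):
print(f"{mdi(2000, p=0.23):.0%}")     # ~5% for employment in paid jobs
print(f"${mdi(2000, sd=3000):,.0f}")  # ~$357 for annual earnings
# Survey data (2,000 released, 80 percent response -> 1,600 completes):
print(f"{mdi(2000, p=0.23, response_rate=0.80):.0%}")  # ~6%
```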

3. Methods to Maximize Response Rates and Deal with Nonresponse

One of the biggest challenges to achieving high survey response rates will be out-of-date contact information due to the high mobility of the low-income target population. The physical addresses and telephone numbers of sample cases could change between their enrollment in the study and the 18-month survey and also between the 18-month survey and the 60-month survey. The proactive approach to addressing this challenge includes the following strategies:

  • Mathematica collected multiple types of contact information for enrollees at enrollment through the programs’ consent forms. These data included landline telephone number, cell phone number, email address, and physical address. Mathematica collected updated information during the 18-month survey from survey respondents, SSA, and the programs. In general, cell phone numbers and email addresses will not change when sample members move from one physical address to another.

  • Mathematica collected contact information for one or more individuals who would be able to assist us in contacting an enrollee at a later date. This information was collected during the 18-month interview for use in the 60-month survey. Survey records were updated on a flow basis as data from completed interviews were processed.

  • Interviewers will seek to interview or establish contact with the enrolling parent and the youth on the same day an interview is conducted with either one. When completing an interview with a youth, interviewers will ask to complete an interview with the parent or guardian and vice versa. When that is not possible, the interviewer will ask the responding party to assist in contacting the party who has not yet responded. Mathematica had success with this strategy for more than half the cases in which both the parent and youth completed the 18-month survey.

  • Interim contacts after the 18-month interview will be used to keep in touch with mobile sample members. Mathematica will use text messages, emails, postcards, and letters to conduct outreach to sample cases. This outreach will expand opportunities to obtain updated contact information by varying the modes and connecting with cases in the modes that are most salient for them. Mathematica will send text messages only to 18-month survey respondents who consented to be contacted by text and will explain that standard text messaging rates may apply. Due to security and privacy concerns, Mathematica will not include or solicit personally identifiable information via text messages, emails, and postcards. Instead, sample cases will be asked to call the survey operations center to update their contact information.

  • Mathematica will leverage updated contact information available through electronic searches and SSA’s administrative records. The locating efforts will be informed by efficient deployment of web-based locating resources such as Accurint and the National Change of Address database. Twice a year, SSA will provide Mathematica with updated contact information on sample cases from its records. Mathematica will upload the updated information into the sample management system, which also provides a cumulative locating history for each youth and enrolling parent, along with any changes in the youth’s representative payee over time.

In addition to challenges associated with locating enrolling parents and youth five years after enrollment, there may be challenges with motivating sample members to respond when they face competing demands for their time, might not respond to calls from unknown telephone numbers, or might not remember enrolling in the study. The proactive plan to address these challenges includes:

  • Offering a $10 or $20 differential incentive to motivate sample members to call the survey operations center to complete their interviews within 12 days of receiving their advance notification letters. This strategy has proven effective on previous Mathematica surveys in generating call-ins from sample members for whom no working telephone number could be found even after extensive locating efforts. For the 18-month survey, 14 percent of parents and 11 percent of youth called in within the bonus window. Combining this strategy with clear directives in the advance letter should bolster call-in rates even further (section B2.d).

  • Asking the PROMISE programs to provide copies of enrollees’ signed consent forms so that Mathematica can make them available to sample members, upon request. As discussed earlier, this benefits two key groups of enrollees: (1) members of the control group, for whom enrollment in PROMISE may have lost salience after a five-year period; and (2) youth who reside in group homes or institutions. Although SSA excluded SSI recipients who were living in institutions from the lists of PROMISE-eligible youth provided to the programs for recruitment, youth may have subsequently moved to such settings. Mathematica will work with the enrolling parent to determine optimal contact strategies. This can include establishing contact with the manager of the facility, describing the study, and explaining how parental consent to contact the youth was obtained. Mathematica will send the manager a cover letter accompanied by a redacted copy of the signed evaluation consent form,4 follow up to ensure that the materials were received, and work with the facility staff to contact and interview the youth.

  • Highlighting SSA as the study sponsor in letters and interviewer introductory remarks. Because all youth enrolling in PROMISE must have received SSI benefits to qualify for the evaluation, the agency name will be salient and may alleviate potential concerns about the legitimacy of the survey effort.

  • Using a mixed-mode survey design and having local area codes appear on caller identification displays for calls placed by interviewers. As more sample members use cell phones, the potential for call screening and call blocking by sample members increases, especially if the calling numbers do not have local area codes. Mathematica will employ technology that allows the survey operations center to present a local area code when placing calls. In addition, field staff will have cell phones with local area codes. Interviewers will also place calls to sample members at a variety of possible telephone numbers (cell, landline, work, or other). This helps to address challenges associated with sample members screening or not picking up calls from unknown numbers. If telephone outreach is not successful, nonresponding cases will move on to subsequent phases in which outreach can be optimized through other modes, such as in person or by mail.

  • Ensuring that highly trained, experienced data collectors engage with all potential respondents in a professional, respectful manner. Both telephone and field staff must successfully complete PROMISE 60-month survey training before beginning work, including training on assistive technologies and best practices for interviewing individuals with disabilities. Ongoing monitoring will ensure consistently high-quality work among all data collection staff. Refresher trainings may be used on an as-needed basis to address challenges the team faces across the 29-month field period.

Finally, Mathematica will continue to use the extensive reporting tools that enabled it to successfully monitor a wide range of production statistics across the 18-month survey field period. In addition, Mathematica will leverage information from the 18-month survey to inform optimal calling patterns and to enrich the training materials with detailed examples from (de-identified) PROMISE cases.

4. Tests of Procedures or Methods to Be Undertaken

Mathematica completed pretest interviews for the 60-month survey instruments in May 2018 to gauge respondent burden, assess the question skip logic, and gather feedback from respondents regarding their understanding of the questions. The pretest respondents were a convenience sample of nine youth and nine parents who enrolled in the PROMISE evaluation but are not eligible for the 60-month survey. Pretest interviews were conducted by telephone using paper versions of the questionnaires. After each interview, participants were encouraged to provide feedback on their experience.

Mathematica submitted a memo on the findings from the pretest to SSA. The memo provided both individual and summary-level statistics regarding burden for specific groups and for particular sections of the instruments. It included a discussion of difficulties with the data collection process, internal consistency of the responses, and recommendations related to item sequencing, modifications to specific items, or definitions and standardized probes to be added. The pretest included interviews with nine parents and nine youth; thus, fewer than 10 pretest interviews were conducted with each study group. Pretest respondent feedback was used to revise the parent and youth survey instruments (Appendices E and F). Most questions in the instruments have been used on other surveys of youth or adults with disabilities, including the PROMISE 18-month survey, the National Longitudinal Transition Study-2012 survey, the National Beneficiary Survey, and the Youth Transition Demonstration surveys.

5. Individuals Consulted on Statistical Aspects of the Design and on Collecting and/or Analyzing Data

As discussed in A8, SSA convened a technical advisory panel for the PROMISE evaluation. The panel provided input on the evaluation criteria and research design. It consisted of researchers and advocates who reflected expertise in youth transition, disability, and evaluation design. The external experts were:

  • Burt Barnow, PhD, George Washington University

  • Hugh Berry, US Department of Education

  • Mark Donovan, Marriott Foundation for People with Disabilities

  • David Johnson, PhD, University of Minnesota

  • Jamie Kendall, US Department of Health and Human Services

  • Jeffrey Liebman, PhD, Harvard University

  • Pamela Loprest, PhD, The Urban Institute

An interdisciplinary team of economists, disability policy researchers, survey researchers, and information systems professionals on the staff of the evaluation contractor (Mathematica Policy Research and its subcontractor, BCT Partners) contributed to the design of the overall evaluation. These individuals include:

  • Karen CyBulski, Mathematica

  • Thomas Fraker, PhD, Mathematica

  • Jacqueline Kauff, Mathematica

  • Gina Livermore, PhD, Mathematica

  • Arif Mamun, PhD, Mathematica

  • Holly Matulewicz, Mathematica

  • Tonya Woodland, BCT Partners

REFERENCES

Basto, E., E. Warson, and S. Barbour. “Exploring American Indian Adolescents’ Needs Through a Community-Driven Study.” The Arts in Psychotherapy, vol. 39, 2012, pp. 134-142.

Blacher, Jan, Bonnie Kraemer, and Erica Howell. “Family Expectations and Transition Experiences for Young Adults with Severe Disabilities: Does Syndrome Matter?” Advances in Mental Health and Learning Disabilities, vol. 4, no.1, 2010, pp. 3–16.

Brugge, D, and M. Missaghian. “Protecting the Navajo People Through Tribal Regulation of Research.” Science and Engineering Ethics, vol. 12, 2006, pp. 491-507.

Cameto, R., P. Levine, and M. Wagner. Transition Planning for Students with Disabilities. A Special Topic Report of Findings from the National Longitudinal Transition Study-2 (NLTS2). Menlo Park, CA: SRI International, November 2004. Available at [http://www.nlts2.org/reports/2004_11/nlts2_report_2004_11_execsum.pdf]. Accessed July 22, 2013.

Carter, E., D. Austin, and A. Trainor. “Predictors of Postschool Employment Outcomes for Young Adults with Severe Disabilities.” Journal of Disability Policy Studies, vol. 23, no. 1, 2012, pp. 1–14.

Chiang, Hsu-Min, Ying Kuen Cheung, Huacheng Li, and Luke Y. Tsai. “Factors Associated with Participation in Employment for High School Leavers with Autism.” Journal of Autism and Developmental Disorders, vol. 42, no. 5, 2012, pp. 685–696.

Emerson, Eric. “Poverty and People with Intellectual Disabilities.” Mental Retardation and Developmental Disabilities Research Reviews, vol. 12, no. 2, 2007, pp. 107–113.

Fraker, Thomas. “The Youth Transition Demonstration: Lifting Employment Barriers for Youth with Disabilities.” Issue brief no. 13–01. Washington, DC: Center for Studying Disability Policy, February 2013.

Fraker, Thomas, Erik Carter, Todd Honeycutt, Jacqueline Kauff, Gina Livermore, and Arif Mamun. “Promoting Readiness of Minors in SSI (PROMISE) Evaluation Design Report.” Washington, DC: Mathematica Policy Research, June 2014.

Fraker, Thomas, Todd Honeycutt, Arif Mamun, Allison Thompkins, and Erin Jacobs Valentine. “Final Report on the Youth Transition Demonstration Evaluation.” Washington, DC: Mathematica Policy Research, October 2014.

Getrich, C., A. Sussman, K. Campbell-Voytal, J. Tsoh, R. Williams, A. Brown, M. Potter, W. Spears, N. Weller, J. Pascoe, K. Schwartz, and A. Neale. “Cultivating a Cycle of Trust with Diverse Communities in Practice-Based Research: A Report from PRIME Net.” Annals of Family Medicine, vol. 11, 2013, pp. 550-558.

Gilder, D., J. Luna, J. Roberts, D. Calac, J. Grube, R. Moore, and C. Ehlers. “Usefulness of a Survey on Underage Drinking in a Rural American Indian Community Health Clinic.” American Indian and Alaska Native Mental Health Research, vol. 20, no. 2, 2013, pp. 1–26.

Hasazi, S.B., L.R. Gordon, and C. A. Roe. “Factors Associated with the Employment Status of Handicapped Youth Exiting High School from 1979 to 1983.” Exceptional Children, vol. 51, 1985, pp. 455–469.

Hemmeter, Jeffrey, Jacqueline Kauff, and David Wittenburg. “Changing Circumstances: Experiences of Child SSI Beneficiaries Before and After Their Age-18 Redetermination for Adult Benefits.” Journal of Vocational Rehabilitation, vol. 30, no. 3, 2009, pp. 201–221.

Hodge, F., M. Cadogan, T. Itty, B. Cardoza, and S. Maliski. “Learning How to Ask: Reflections on Engaging American Indian Research Participants.” American Indian Culture and Research Journal, vol. 34, 2010, pp. 77–90.

Israel, B., A. Schulz, E. Parker, A. Becker, A. Allen, and J. Guzman. “Critical Issues in Developing and Following CBPR Principles.” In M. Minkler and N. Wallerstein (eds.), Community-Based Participatory Research for Health: From Process to Outcomes (second edition). San Francisco: Jossey-Bass, 2008, pp. 47–66.

Lee, Gloria, and Erik Carter. “Preparing Transition-Age Students with High-Functioning Autism Spectrum Disorders for Meaningful Work.” Psychology in the Schools, vol. 49, no. 10, 2012, pp. 988–1000.

Lindstrom, Lauren, Bonnie Doren, and Jennifer Miesch. “Waging a Living: Career Development and Long-Term Employment Outcomes for Young Adults with Disabilities.” Council for Exceptional Children, vol. 77, no. 4, 2011, pp. 423–434.

Lindstrom, Lauren, Bonnie Doren, Jennifer Metheny, Pam Johnson, and Claire Zane. “Transition to Employment: Role of the Family in Career Development.” Council for Exceptional Children, vol. 73, no. 3, 2007, pp. 348–366.

Loprest, Pamela J., and David C. Wittenburg. “Post-Transition Experiences of Former Child SSI Beneficiaries.” Social Service Review, vol. 81, no. 4, 2007, pp. 583-608.

Luecking, R. and N. Certo. “Service Integration at the Point of Transition for Youth with Significant Disabilities: A Model that Works.” American Rehabilitation, vol. 27, 2003, pp. 2–9.

Powers, L., T. Garner, B. Valnes, P. Squire, A. Turner, T. Couture, and R. Dertinger. “Building a Successful Adult Life: Findings from Youth-Directed Research.” Exceptionality, vol. 15, no. 1, 2007, pp. 45–56.

Shattuck, Paul, Sarah Carter Narendorf, Benjamin Cooper, Paul Sterzing, Mary Wagner, and Julie Lounds Taylor. “Postsecondary Education and Employment Among Youth with an Autism Spectrum Disorder.” Pediatrics, vol. 129, no. 6, 2012, pp. 1042–1049.

Simonsen, M., and D. Neubert. “Transitioning Youth with Intellectual and Other Developmental Disabilities: Predicting Community Employment Outcomes.” Career Development and Transition for Exceptional Individuals, 2013. doi: 10.1177/2165143412469399.

Social Security Administration. “Annual Statistical Supplement to the Social Security Bulletin, 2013.” Publication No. 13-11700. Table 7.A1. Washington, DC: Social Security Administration, 2014. Available at http://ssa.gov/policy/docs/statcomps/supplement/2013. Accessed March 6, 2014.

Social Security Administration. “SSI Annual Statistical Report, 2012.” Publication no. 13‑11827. Table 4. Washington, DC: Social Security Administration, 2013. Available at http://ssa.gov/policy/docs/statcomps/ssi_asr/2012. Accessed March 6, 2014.

U.S. Government Accountability Office. “Students with Disabilities: Better Federal Coordination Could Lessen Challenges in the Transition from High School.” GAO-12-594. Washington, DC: United States Government Accountability Office, 2012.

Ver Ploeg, M., R. Moffitt, and C. Citro. “Studies of Welfare Populations: Data Collection and Research Issues.” Committee on National Statistics, Division of Behavioral and Social Sciences and Education, National Research Council. Washington, DC: National Academy Press, 2002.

Wittenburg, D., T. Golden, and M. Fishman. “Transition Options for Youth with Disabilities: An Overview of the Programs and Policies That Affect the Transition from School.” Journal of Vocational Rehabilitation, vol. 17, 2002, pp. 195–206.

Wittenburg, David. “Testimony for Hearing on Supplemental Security Income Benefits for Children.” Presented at the Subcommittee on Human Resources Committee on Ways and Means U.S. House of Representatives. Washington, DC: Mathematica Policy Research, 2011.


1 The number of parents eligible for the 60-month survey is a preliminary figure. The evaluator is still processing the data to determine which enrolling parents or guardians represent an agency that has guardianship of the youth and thus are ineligible for the survey.

2 Outcomes for these cases are similar to those of other cases for which field follow-up was not feasible due to geographic dispersion.

3 One commonly used definition of frontier areas is ZIP codes where the majority of residents live 60 minutes or more from urban areas with populations of 50,000 people or more. Available at https://www.ers.usda.gov/data-products/frontier-and-remote-area-codes.aspx. Accessed May 11, 2018.

4 Social Security numbers and other information that the managers of group homes do not need to verify informed consent will be redacted.



