
Longitudinal Study of AmeriCorps

OMB: 3045-0123












Abt Associates Inc.

55 Wheeler Street

Cambridge, MA 02138

OMB Clearance Package – Section B



Phase III of the Longitudinal Evaluation of AmeriCorps












August 28, 2006









Prepared for

Lillian Dote

Corporation for National and Community Service

1201 New York Avenue, NW

Washington, DC 20525




Prepared by

Abt Associates Inc.




B. Collection of Information Employing Statistical Methods

B.1. Respondent Universe

The target population for the Phase III follow-up survey of AmeriCorps members is the same cohort that has participated in the previous survey rounds since 1999; repeated measurements are obtained over time from this original sample. The study covers full-time members who were in their initial year of service with AmeriCorps.


Exhibit 5 shows the number of programs, the expected universe of members, and the sample size in each program stream. A representative sample of 108 programs was selected from the population of 650 State and National programs operating in the 1999–2000 program year. All first-year, full-time members in each selected program were included, yielding a total sample of 1,752 members at baseline (1999–2000), 1,385 members at post-program (2000–2001), and 1,242 members at the post-program supplemental survey (PPSS) follow-up (2003–2004). All first-year members in the three selected NCCC regions were included in the sample, in view of the small number of programs in the population and the need for accurate estimates of member characteristics. This yielded a total sample of 476 NCCC members at baseline (1999–2000), 461 at post-program (2000–2001), and 419 at the PPSS follow-up (2003–2004).


Exhibit 5

Population and Sample Sizes by Program Stream (1999–2000 Program Year)

                          Universe    Survey Sample
Programs
  State and National           650              108
  NCCC                           5                3
  Total                        655              111
Members
  State and National        21,000            1,752
  NCCC                       1,000              476
  Total                     22,000            2,218


B.2. Procedures for the Collection of Information/Limitations of the Study

For each previous round of data collection, we requested current and permanent contact information for each respondent, as well as contact information for up to three close friends or family members. The Phase III data collection will use this information to contact members of both the treatment and comparison groups. As indicated in Exhibit 6, Computer Assisted Telephone Interviews (CATI) will be used to collect information from all target groups. Respondents will also be offered the option of completing the survey on the web. We believe this option will reduce respondent burden, since most sample members are computer-literate and have access to the internet.


Exhibit 6

Data Collection Methodology, Phase III Follow-up

Respondent Group        Data Collection Method
State and National
  Treatments            Computer Assisted Telephone Interviewing or web-based surveys
  Controls              Computer Assisted Telephone Interviewing or web-based surveys
NCCC
  Treatments            Computer Assisted Telephone Interviewing or web-based surveys
  Controls              Computer Assisted Telephone Interviewing or web-based surveys


B.2.1. Statistical Methodology for Stratification and Sample Selection

Sample Selection

Programs were selected from the universe of AmeriCorps*State and National and AmeriCorps*NCCC programs operational in 1999/2000. The sample was stratified by the two program types. Within the State and National stratum, programs were further stratified by Census region, program size (number of members), and focus area. The sample of 108 State and National programs was allocated to Census regions in proportion to the total number of programs in each region, and a systematic sample of programs was then selected after sorting the list by state and size. All first-year, full-time members of the selected AmeriCorps*State and National and AmeriCorps*NCCC programs were included in the sample.
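The systematic selection described above can be sketched as follows (an illustrative simplification assuming equal-probability selection from the sorted frame; the function name and frame representation are our own, not the study's):

```python
import random

def systematic_sample(frame, n):
    """Select a systematic sample of n units from an ordered sampling frame."""
    k = len(frame) / n                  # sampling interval
    start = random.uniform(0, k)        # random start within the first interval
    return [frame[int(start + i * k)] for i in range(n)]

# e.g., 108 programs from a frame of 650, after sorting by state and size
sample = systematic_sample(list(range(650)), 108)
```

Sorting the frame by state and size before selection spreads the sample across states and program sizes, which is the implicit stratification the selection procedure relies on.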


Sample Size

Overall sample sizes for the study are summarized in Exhibit 7. We expect an 80% response rate from the baseline survey sample, with losses due primarily to mobility or unavailability. Because this is a longitudinal study, however, we will continue to track all individuals in the sample, including those who missed a round of data collection. This continued tracking has supported response rates in previous rounds: 80% in the post-program survey (2000–2001) and 75% in the post-program supplemental survey (2003–2004).


Comparison Groups

In addition to the sample of AmeriCorps*State and National and NCCC members, the study includes samples of nonparticipants, drawn as comparison groups, in order to assess the impact of the program. An impact study, by definition, attempts to identify program effects, that is, changes in outcomes caused by the program; this requires comparison groups that are like program participants but did not themselves participate. Separate comparison groups were selected for the State and National and NCCC analyses. The State and National comparison group was selected from the Corporation’s Inquiry Data Base, which includes individuals who inquired about AmeriCorps but did not enroll during the 1999/2000 program year. The NCCC comparison group was drawn from the Corporation’s list of individuals who applied to the NCCC and were determined eligible, but did not actually enroll in the 1999/2000 program year due to lack of interest or insufficient available slots.


Exhibit 7

Sample Sizes

                        Number of     Number of Respondents,
                        Programs      Follow-up (Phase III)a
State and National      108
  Treatments                          1,401
  Comparisons                         1,219
  TOTAL                               2,620
NCCC                    3
  Treatments                            380
  Comparisons                           320
  TOTAL                                 700
Total                   111           3,320

a The follow-up (Phase III) figures are estimates of expected total respondents; our target is an 80% response rate from baseline.


B.2.2. Estimation Procedure

To produce population-based estimates of totals, percentages, and means, each respondent is assigned a sampling weight. This weight combines a base weight, the inverse of the respondent’s probability of selection, with a nonresponse adjustment that accounts for sampled members who do not respond to the survey. Every effort will be made to minimize non-sampling errors in the estimates by maximizing response rates and taking steps to reduce response errors.
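As a concrete illustration, this weighting scheme can be sketched in Python (a minimal sketch assuming a simple cell-based nonresponse adjustment; the record layout and field names are hypothetical, not the study's):

```python
from collections import defaultdict

def final_weights(units):
    """Compute final sampling weights: base weight (inverse selection
    probability) times a cell-level nonresponse adjustment that
    redistributes nonrespondents' weight to respondents in the same cell."""
    base = [1.0 / u["p_select"] for u in units]
    total_w = defaultdict(float)   # total base weight per weighting cell
    resp_w = defaultdict(float)    # respondents' base weight per cell
    for u, w in zip(units, base):
        total_w[u["cell"]] += w
        if u["responded"]:
            resp_w[u["cell"]] += w
    return [
        w * total_w[u["cell"]] / resp_w[u["cell"]] if u["responded"] else 0.0
        for u, w in zip(units, base)
    ]
```

The adjustment preserves each cell's total base weight, so weighted totals continue to estimate the full target population rather than only the respondents.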


B.2.3. Degree of Accuracy Needed for the Purpose Described in the Justification

Power Calculations for AmeriCorps Longitudinal Study

State and National Sample. Exhibit 8 displays, for the State and National sample, statistical power calculations for detecting a difference of at least five percentage points on selected population characteristics between baseline and the post-program and follow-up periods, under the assumption of a one-tailed test at alpha = .05. Our power estimates are affected by several factors: design effects, attrition, and the correlation between responses at the two time periods. First, the calculations account for the clustering of individuals within programs (i.e., the design effect): because observations within programs are not independent, clustering inflates standard errors, or equivalently reduces the effective sample size. We assume two levels of design effect, 2.0 and 2.5, to show the range of power estimates between these scenarios. In addition, we assume an average attrition rate of 20 percent from baseline to the post-program and follow-up observations. Our power is augmented, however, by the fact that observations at the two time periods are correlated, because we are following the same group of individuals over time. For example, a correlation of .5 between observations at the two time periods has the effect of essentially doubling the effective sample size.

We present power levels for a range of population percentage values. As Exhibit 8 shows, power is lowest when the population percentages being compared are near 50, and higher when the percentages being estimated are farther from 50 percent. Exhibit 8 shows that we will have adequate power in most circumstances: even when the correlation between baseline and post-test or follow-up is only .4, under the higher design effect of 2.5, we achieve an adequate level of power for detecting differences between percentages at most levels. Power increases further as the correlation between measurement points increases and as the design effect decreases.

Exhibit 8

Power of Detecting a Difference of Five Percentage Points for the Program Group from Baseline to Follow-up (Power Expressed in Percentages)a

Baseline                 Design Effect = 2.0         Design Effect = 2.5
Population            ρ=0.4   ρ=0.5   ρ=0.6       ρ=0.4   ρ=0.5   ρ=0.6
Percentage
50                      75      81      87          67      73      80
40, 60                  78      83      88          69      75      81
30, 70                  83      87      92          75      80      86
20, 80                  91      94      97          85      89      93
10, 90                  99      99      99          98      99      99

a Assuming a baseline sample size of 1,752 and an attrition rate of 20 percent.
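The kind of power computation behind Exhibit 8 can be approximated with a normal-theory formula that combines the design effect, attrition, and the between-wave correlation (a sketch under the stated assumptions; the study's exact figures may rest on slightly different approximations, so this will not reproduce the exhibit to the percentage point):

```python
from math import sqrt, erf

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

Z_ALPHA = 1.645  # critical value, one-tailed test at alpha = .05

def power_over_time(p1, p2, n_baseline, attrition, deff, rho):
    """Approximate power to detect a change from p1 to p2 in the same
    cohort measured at two waves, allowing for clustering (deff),
    attrition, and correlation rho between the two measurements."""
    n_eff = n_baseline * (1.0 - attrition) / deff   # effective sample size
    v1, v2 = p1 * (1.0 - p1), p2 * (1.0 - p2)
    # variance of the difference between two correlated proportions
    var = (v1 + v2 - 2.0 * rho * sqrt(v1 * v2)) / n_eff
    return norm_cdf(abs(p2 - p1) / sqrt(var) - Z_ALPHA)

# e.g., a 5-point change from 50 percent, design effect 2.0, rho = 0.5
p = power_over_time(0.50, 0.55, 1752, 0.20, 2.0, 0.5)
```

Consistent with Exhibit 8, this approximation yields power that rises as rho increases and falls as the design effect grows.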


We expect the design effects to be smaller for the various AmeriCorps subgroups (see subgroup sample sizes in Exhibit 9). Assuming a design effect of 1.5, the sample of 510 for the African-American subgroup, for example, yields an effective sample size of 340; with an attrition rate of 20 percent, this is reduced to 272. With this sample, and assuming an average correlation of 0.5 between measurements, we can detect differences of eight percentage points between the two periods with 80 percent power.
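The subgroup arithmetic above reduces to a one-line effective-sample-size formula (an illustrative sketch; the function name is our own):

```python
def effective_n(n, deff, attrition=0.0):
    """Effective sample size after adjusting for clustering and attrition."""
    return n * (1.0 - attrition) / deff

# African-American subgroup: 510 members, design effect 1.5, 20% attrition
n_eff = effective_n(510, 1.5, attrition=0.20)   # about 272
```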


Exhibit 9

Expected Sample Size for Various Subgroups

                              Subgroups
Initial          Ethnicity                        Gender
Sample Size      African-American   Hispanic      Men      Women
1,763            510                264           475      1,288


In addition to looking at change over time on outcomes for the AmeriCorps members, we also are interested in conducting comparisons of program vs. comparison group outcomes. Exhibit 10 displays respective sample sizes for the two groups, taking into account the factors of attrition and clustering.


Exhibit 10

Sample Sizes for Comparing Program and Comparison Groups

Group          Baseline       Expected Sample Size    Effective Sample Size
               Sample Size    After Attrition         (Design Effect = 2.0)
Program        1,762          1,410                   705
Comparison     1,524          1,223                   611


With effective sample sizes of 705 and 611 in the two groups, we can detect a difference of seven percentage points with 80 percent power. The use of baseline covariates in the analysis will further improve our ability to detect smaller differences.
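The seven-point figure follows from the standard minimum-detectable-difference formula for two independent proportions (a sketch assuming a one-tailed test at alpha = .05, 80 percent power, and the most conservative proportion of 0.5; the function name is our own):

```python
from math import sqrt

Z_ALPHA = 1.645   # one-tailed test, alpha = .05
Z_BETA = 0.8416   # 80 percent power

def mdd_two_groups(n1_eff, n2_eff, p=0.5):
    """Minimum detectable difference in proportions between two
    independent groups with the given effective sample sizes."""
    se = sqrt(p * (1.0 - p) * (1.0 / n1_eff + 1.0 / n2_eff))
    return (Z_ALPHA + Z_BETA) * se

mdd = mdd_two_groups(705, 611)   # roughly 0.07, i.e., seven percentage points
```

Adjusting for baseline covariates shrinks the residual variance, which lowers the standard error and hence the detectable difference, as noted in the text.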


NCCC. We selected three of the five NCCC program sites, and all members in the selected sites are enumerated. Because of the high sampling fraction, we expect sampling variability to be small; consequently, if the between-program variability is also small, the estimates will be highly precise.


B.2.4. Unusual Problems Requiring Specialized Sampling Procedures

Not applicable.


B.2.5. Use of Periodic (Less Frequent Than Annual) Data Collection Cycles

Not applicable.


B.3. Methods To Maximize Response Rates and Deal With Issues of Nonresponse

Response rates from previous rounds of data collection have been consistently high. An overall response rate of 80 percent was achieved in the post-program survey (2000-2001) and an overall response rate of 75 percent was achieved for the post-program supplemental survey (2003-2004). Phase III will rely upon similar procedures to maximize cooperation and to achieve the desired high response rates on this round of data collection.


For treatment and control group members:


1. An advance letter with a study overview will be mailed prior to each round of telephone interviews. The letter will present an interesting and appealing image and study justification, and alert respondents to the upcoming telephone survey. It will also provide a URL and a password for the web-based survey completion option.


2. An incentive may be provided to respondents completing the interview either by web or by phone.


3. A telephone number for a Corporation staff member knowledgeable about the study will be provided to treatment and comparison group members upon request, to assure them of the validity and purpose of the study.


4. A toll-free number will be available for responses to questions about the study.


5. The advance letter and introduction to the survey will confirm that information collected will be protected as confidential.


6. Respondents will have the option of scheduling the telephone interview at their convenience (within the data collection period) or completing the survey on the web.


B.4. Tests of Procedures or Methods

All survey instruments have been drafted and have undergone three reviews: (1) an internal review by Abt Associates, the contractor conducting the study; (2) a review by members of the Technical Working Group; and (3) a review by Corporation staff. The Phase III follow-up survey has also been pre-tested by three individuals who participated in AmeriCorps and three individuals who considered AmeriCorps but did not enroll, all demographically similar to the treatment and comparison groups used for the study.


B.5. Names and Telephone Numbers of Individuals Consulted

The information for this study is being collected by Abt Associates Inc., a research consulting firm, on behalf of the Corporation for National and Community Service. With Corporation oversight, Abt Associates Inc. is responsible for the study design, instrument development, data collection, analysis, and report preparation.


The instruments for this study, and the plans for statistical analyses, were developed by Abt Associates Inc. The survey staff team is composed of JoAnn Jastrzab, Project Director, Dr. Larry Orr and Dr. Chris Winship (of Harvard University), Co-Principal Investigators, and Dr. Ryoko Yamaguchi, Deputy Project Director. In addition, various members of the Technical Working Group (see Exhibit 1) were consulted on the instruments and statistical analysis design. Contact information for the survey staff team is provided below:


Name                 Number

JoAnn Jastrzab       617-349-2372
Larry Orr            301-634-1724
Chris Winship        617-495-9821
Ryoko Yamaguchi      301-634-1778





