AmeriCorps Member Exit Survey
SUPPORTING STATEMENT FOR PAPERWORK REDUCTION ACT SUBMISSIONS
B. COLLECTIONS OF INFORMATION EMPLOYING STATISTICAL METHODS
B1. Describe (including numerical estimate) the potential respondent universe and any sampling or other respondent selection method to be used. Data on the number of entities (e.g., establishments, State and local government units, households, or persons) in the universe covered by the collection and in the corresponding sample are to be provided in tabular form for the universe as a whole and for each of the strata in the proposed sample. Indicate expected response rates for the collection as a whole. If the collection had been conducted previously, include the actual response rate achieved during the last collection.
The potential respondent universe for the AmeriCorps member exit survey consists of all AmeriCorps (AmeriCorps State and National, VISTA, and NCCC) members who complete a term of service; this survey is a census of all exiting AmeriCorps members. If a member completes multiple terms of service, they will receive an invitation to take the exit survey for each term completed. This approach recognizes that the member experience can differ from term to term and from service placement to placement.
B2. Describe the procedures for the collection of information, including: Statistical methodology for stratification and sample selection; Estimation procedure; Degree of accuracy needed for the purpose described in the justification; Unusual problems requiring specialized sampling procedures; and any use of periodic (less frequent than annual) data collection cycles to reduce burden.
The member exit survey will be administered to all exiting AmeriCorps members (census); sampling will not be employed. The survey administration procedure will consist of a series of pre-exit email notifications to members to invite and encourage them to take the survey; a link to the survey in the member portal; and email reminder notifications to complete the survey. The steps are listed in detail below.
Step 1: Pre-exit notifications triggered in system. Approximately 30-60 days before each member’s service completion date (exact timing depends on each program’s protocol for exiting members), a “pre-exit” notification will be triggered and emailed to the member using the email address registered to the member’s account. The notification will inform the member of the impending end date of their term of service and of the close-out process, a component of which will be taking the exit survey. The notification will have the portal link prominently displayed and will include language on how to access and complete the survey. The pre-exit notification is already part of programs’ service close-out protocol, and the timing and number of additional survey-specific notifications will be determined during requirements gathering and development, based on the technical capabilities of the portal system.
Step 2: Reminder notifications triggered in system during close-out period. Approximately 30 days after each member completes his or her term of service, a reminder notification will be triggered and emailed to the member. The notification will inform the member of the remaining days available to complete the member exit survey. The timing and number of reminder notifications depend on the technical capabilities of the portal system and will be finalized during requirements gathering and development, after additional consultation with program staff.
To incentivize participation and achieve higher response rates, CNCS will offer a Certificate of National Service Completion to members who successfully complete a term of service in any AmeriCorps program and complete the member exit survey. Each Certificate will be populated with the appropriate member information (member name, service completion date, etc.) pulled from CNCS’s eSPAN database. CNCS staff with the appropriate user role will be able to update the signatories and other general information as needed. We anticipate that receipt of the Certificate will be contingent upon a member completing the member exit survey.
As members complete the exit survey, response data will be stored in CNCS databases. Because members exit throughout the year, data will be pulled at a consistent time annually to maintain continuity of reporting periods across years. We anticipate that data will be pulled and analyzed annually by staff in the Office of Research and Evaluation after the end of each fiscal year.
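For example, the annual pull could restrict records to the federal fiscal year just ended (October 1 through September 30). The sketch below illustrates this in Python; the file name and column names are hypothetical, as the actual export schema will depend on the CNCS database:

```python
import pandas as pd

# Hypothetical export of exit survey responses with a completion timestamp.
df = pd.read_csv("exit_survey_responses.csv", parse_dates=["completed_at"])

# Keep only responses from the fiscal year just ended, e.g. FY2016
# (October 1, 2015 through September 30, 2016).
fy2016 = df[(df["completed_at"] >= "2015-10-01") &
            (df["completed_at"] < "2016-10-01")]
print(f"{len(fy2016)} responses in FY2016")
```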
Exhibit 1. Survey Administration Steps
Analysis of the exit survey data will include basic descriptive statistics and correlations of survey items. Analyses will begin with tabulations of responses to items for the overall member population, followed by tabulations of responses by program (State and National, VISTA, NCCC), member subpopulations, and other variables of interest. Particular attention will be paid to descriptive statistics related to survey items representing each of the four pathways in the member theory of change. Correlations between variables such as member focus area or program and items representing the four pathways detailed in the theory of change may also be generated. Where relevant, we will report frequencies for the top two response categories combined (referred to as “top 2 box” in this document; e.g., strongly agree and agree), as we have found this to be more easily interpretable by diverse agency stakeholders. Item variance may be reported depending on the survey item and information needs.
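For illustration, a minimal sketch of these tabulations using pandas; the response file, column names, and response labels below are hypothetical:

```python
import pandas as pd

# Hypothetical export of exit survey responses; column names are assumptions.
df = pd.read_csv("exit_survey_responses.csv")

# Overall tabulation of a Likert item, then the same tabulation by program.
overall = df["q_satisfaction"].value_counts(normalize=True)
by_program = pd.crosstab(df["program"], df["q_satisfaction"],
                         normalize="index")

# "Top 2 box": combined share answering "Agree" or "Strongly agree".
top2 = df["q_satisfaction"].isin(["Agree", "Strongly agree"]).mean()

print(overall, by_program, f"Top 2 box: {top2:.1%}", sep="\n\n")
```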
A nonresponse bias study was completed in 2014 for the current exit survey. To assess the impact of nonresponse bias in this collection, we will conduct statistical analysis as needed to identify any characteristics of respondents that are correlated with response to the exit survey (more information is provided in B3). Most survey questions will be required; therefore, we do not anticipate item-specific nonresponse.
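To illustrate the kind of nonresponse analysis described above, a 0/1 response indicator can be regressed on administrative characteristics that are known for respondents and nonrespondents alike. The roster file, variable names, and covariates in this sketch are hypothetical:

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical member-level roster: one row per exiting member, with a 0/1
# indicator for whether the member completed the exit survey.
members = pd.read_csv("member_roster.csv")

# Characteristics that significantly predict response would flag potential
# nonresponse bias worth examining further.
model = smf.logit(
    "responded ~ C(program) + C(age_group) + C(focus_area)", data=members
).fit()
print(model.summary())
```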
B3. Describe methods to maximize response rates and to deal with issues of non-response. The accuracy and reliability of information collected must be shown to be adequate for intended uses. For collections based on sampling, a special justification must be provided for any collection that will not yield “reliable” data that can be generalized to the universe studied.
We estimate that a response rate of 60%-65% will be achieved in the first year, based on the revisions to the survey content, improvements to the pre-notification and reminder emails, and the inclusion of the Certificate of National Service for those completing the survey. We will aim to increase response rates further in future years. In the past, the member exit survey has achieved response rates between 32% and 53% (see Table 1 below).
Table 1: Historical Response Rates to Member Exit Survey
| Year | Respondents | Exiting Members | Response Rate |
|------|-------------|-----------------|---------------|
| 2011 | 27,838 | 84,558 | 32.9% |
| 2012 | 44,312 | 83,280 | 53.2% |
| 2013 | 39,266 | 74,566 | 52.7% |
| 2014 | 32,815 | 71,406 | 46.0% |
*Note that the current member exit survey was first administered in September 2011 (calendar year). Members who completed a term of service before September 2011 may therefore have taken the survey a significant amount of time after exiting.
To improve response rates, the new member exit survey will amend the administration process to feature improved pre-exit notifications to complete the survey, improved placement of the survey link on the portal site, an enhanced survey “look and feel,” and improved post-service reminders to enter the portal to complete the survey. Additionally, the incorporation of a Certificate of National Service Completion, triggered by completion of the exit survey, will incentivize survey completion by offering concrete proof of service that members can use when seeking post-service employment or education. The portal will also feature an improved display of the survey link, making the survey’s location more prominent and intuitive. Finally, the survey redesign process, with encouragement from the program working group and staff in the Office of Research and Evaluation, has promoted increased buy-in from program stakeholders within CNCS on the importance of high response rates to the survey. We will continue to work with programs and state commissions to encourage members to complete the exit survey.
The member exit survey will utilize a number of strategies to maximize response rates while maintaining cost control. Data collection will be conducted through a web survey administered on CNCS’ online member portal. Since exiting members must access the portal to complete administrative tasks prior to ending their term of service, hosting the survey in the portal will reduce data collection costs and minimize respondent burden by taking advantage of a “one stop shop” for close out activities. Additionally, the multiple email reminders triggered in the portal before and after each member’s term of service make efficient use of existing portal capabilities.
B4. Describe any tests of procedures or methods to be undertaken.
To ensure the integrity of content in the revised survey, staff in the Office of Research and Evaluation (R&E) convened a working group of program stakeholders representing AmeriCorps State and National (ACSN), VISTA, and NCCC. As described in Part A, the working group members refined the theory of change, consulted on the administration process, and contributed feedback and suggested revisions on a number of drafts of the survey instrument. After convening the working group, R&E implemented a robust pilot testing process that sought to gather both qualitative and quantitative feedback about the survey instrument. The questions used in the follow-up interviews are included in the appendices; the instrument used in the pilot testing process is attached.
First, a pilot draft of the survey was programmed into SurveyMonkey. This draft was the best available after final review from the working group and R&E staff. Pilot testers were recruited from each AmeriCorps program to ensure all service experiences were represented in the testing phase. Working group members and points of contact from their respective programs were asked to provide names of potential participants; in the case of ACSN, only 9 testers were sent a pilot survey invitation in order to comply with the Paperwork Reduction Act. Since NCCC and VISTA members are not subject to similar restrictions, larger numbers of VISTA and NCCC members were invited to participate in the pilot. A total of 183 pilot testers completed the survey: 4 were ACSN members, 157 were VISTA members, and 22 were NCCC members.
At the end of the pilot survey, testers were asked whether they would be willing to be contacted by CNCS staff to provide more detailed feedback during a 15-20 minute phone interview. A total of 59 members provided contact information (1 ACSN, 54 VISTA, and 4 NCCC members). The ACSN member and the 4 NCCC members were contacted via email to set up a time to speak with researchers from CNCS. Three VISTA members were randomly selected and contacted to be interviewed (plus two alternates), and all three completed the interview.
Table 2 shows the complete array of analysis activity conducted using data from the pilot survey, as well as modifications made based on this analysis to construct the final survey. A detailed discussion of the analysis activity and the changes made to the survey follows.
Table 2: Detailed Analysis Plan
| Pilot Questionnaire Item (PQ) | Source of Item | Qualitative Analysis Activity | Quantitative Analysis Activity | Change Made to Final Questionnaire? | Final Questionnaire Item (FQ) |
|---|---|---|---|---|---|
| | New | None | Review distribution | - | Eliminated |
| Reasons for joining | Old exit survey, Q.1 | - | Compare pilot results with data from old survey | - | 1 |
| Focus areas | New item | - | Review distribution | - | 2 |
| Training evaluation | Edited from old exit survey Q.4 | Cognitive interview question | Compare pilot results with data from old survey | - | 3a |
| Job training and supervision | Edited from old exit survey Q.5 | Cognitive interview question | Compare pilot results with data from old survey | - | 3c |
| Frequency of different activities | Edited from old exit survey Q.7a, 7b, 7c | Cognitive interview question | Compare pilot results with data from old survey | Select response option wording edited to be more inclusive | 4-6 |
| Cultural competency | Chen & Starosta | - | Since only 4 items from this scale were used, drawn from different subfactors, determine inter-correlations among items; Cronbach alpha | - | 7 |
| Self-Efficacy Scale | Schwarzer & Jerusalem | Cognitive interview question | Factor analysis to see if items are unidimensional, as found by Schwarzer; Cronbach alpha; check item means and inter-correlations against the US sample in Schwarzer’s 2002 paper | - | 8 |
| AmeriCorps experience | Old exit survey, Q.9 | - | Compare pilot results with data from old survey | - | 9 |
| Satisfaction with AmeriCorps service | Old exit survey, Q.11 | - | Compare pilot results with data from old survey | - | 10 |
| Likelihood of participating in specific activities | Old exit survey Q.13 and Q.14 | - | Compare pilot results with data from old survey | - | 11 |
| Discuss politics with friends or family | CPS Civic Supplement, S11 | Cognitive interview question | Compare with 2013 CPS results in relevant age group | Eliminated | Eliminated |
| Use internet to express opinions | CPS Civic Supplement, S3 | - | Compare with 2013 CPS results in relevant age group | Eliminated | Eliminated |
| Boycotted a product or service | CPS Civic Supplement, S2(b) | - | Compare with 2013 CPS results in relevant age group | Eliminated | Eliminated |
| Frequency of discussing political, social, local, or national issues’ effect on the community | Civic Responsibility Survey (Furco, Muller & Ammon), Q.2 | Cognitive interview question | Cronbach alpha, distribution, TBD | - | 12 |
| Encourage others to participate in the community | Civic Responsibility Survey (Furco, Muller & Ammon), Q.16? | Cognitive interview question | Cronbach alpha, item distribution, TBD | Eliminated | Eliminated |
| Neighbors do favors | CPS Civic Supplement, S16 | Cognitive interview question | Compare with 2013 CPS results in relevant age group | Eliminated | Eliminated |
| Trust people in the neighborhood | CPS Civic Supplement, S18 | Cognitive interview question | Compare with 2013 CPS results in relevant age group | - | 13 |
| Trust in institutions | CPS Civic Supplement, S21 | Cognitive interview question | Compare with 2013 CPS results in relevant age group | - | 14 |
| Vote in last presidential election | CPS Voting Supplement, PES1 | Cognitive interview question | Compare with 2012 CPS Voting Supplement results in relevant age group | Response option added to be more inclusive of younger members | 15 |
| Registered to vote in the last presidential election | CPS Voting Supplement, PES2 | Cognitive interview question | Compare with 2012 CPS Voting Supplement results in relevant age group | Response option added to be more inclusive of younger members | 16 |
| Competence for civic action | Competence for Civic Action Scale from Flanagan et al. (2007) | Cognitive interview question | Cronbach alpha | - | 17 |
| Community items | Civic Responsibility Survey (Furco, Muller & Ammon) | Cognitive interview question | Cronbach alpha, item distributions, TBD | - | 18 |
| Value of participation in AmeriCorps | New items | - | Cronbach alpha, item distributions | - | 19 |
| AmeriCorps as a defining personal experience | Item from old exit survey Q.17a | - | Compare pilot results with data from old survey | - | 20 |
| Defining professional experience | Item from old exit survey Q.17a | - | Compare pilot results with data from old survey | - | 22 |
| Recommend AmeriCorps to family/friends | New item | - | | - | 24 |
| Associations of AmeriCorps service | Item from old exit survey, Q.19 | Cognitive interview question | Compare pilot results with data from old survey | Edited response options for clarity | 25 |
| Plan for using Ed Award | Edited item from old exit survey Q.20a | Cognitive interview question | Compare pilot results with data from old survey | Added response options to be more inclusive | 26 |
| Plans after service | New item | Cognitive interview question | None | Instruction language edited | 27 |
| Listing experience on resume | New item | Cognitive interview question | None | Question reframed | 28 |
| Adequacy of training | New item | N/A | None | - | 29 |
Cognitive Interviews [1]
Throughout the survey redesign process, specific items were flagged for follow-up with testers based on question wording, item content, and availability of relevant response options. Given the need to keep the cognitive interviews to 15-20 minutes, we narrowed the list of items to those covered in the attached protocol (see Appendix B). Questions that generated substantial comments, and for which changes were made in the final survey, are described below.
PQ.4 and 5 asked about training, resources, and supervision from AmeriCorps and the grantee program/site. The respondents we interviewed expressed confusion regarding these questions because their particular programs had an intermediary structure, whereby both the intermediary and host site provided supervision and training. Though there are some ACSN programs with a similar structure, we revised the survey in response to this feedback. We decided not to ask separately about training provided by intermediaries, and clarified wording in the question to be specific to a site or project sponsor.
PQ.14 through 17 (PQ.14-16 eliminated; PQ.17 now FQ.12) asked about various modes of political engagement. Interviewees had strong reactions against being asked these questions [2], reporting that the questions were provoking, sensitive, and seemed to have some ulterior motive. We considered altering the order or location of the political questions, or adding an explanation for their inclusion, but ultimately decided to drop them.
PQ.18 (eliminated) asked about frequency of encouraging others to participate in the community. Interviewees interpreted the question differently, with one respondent relating the action to political recruitment or activity more generally. A second respondent linked the question to general civic activity, and a third linked this question directly to their daily work as an AmeriCorps member. Because of potential confusion around political activity, and because there seemed to be little agreement on what the question was asking, this question was eliminated.
PQ.19 and 20 (PQ.19 was eliminated; PQ.20 maps to FQ.13) asked respondents about their neighbors. Since the word “neighbors” implies a place-based interpretation of relationships with others, we needed to verify respondents’ interpretations of the word. All three interviewees reported that they considered their neighbors to be the people living near their home; this contrasts with a comment from an NCCC member captured in the qualitative analysis of the survey’s free-response options (see below). Relatedly, we asked respondents to discuss their interpretation of the word “community,” which was present in several questions related to civic engagement and does not necessarily imply a place-based interpretation. Two interviewees distinguished between “neighbors” in PQ.19 and 20 (where they live) and statements relating to “community” in PQ.11 (FQ.9) (where they work/whom they serve). No changes were made based on this information.
PQ.30 (FQ.25) asked respondents how closely they associated their service with various spheres of the national service community. Respondents noted confusion differentiating between the first two response options, “broader national service community” and “national AmeriCorps program,” with each interviewee interpreting these in different ways. It became clear that these response options, as written in the pilot survey, were too vague to generate reliable data. We amended these response options to “AmeriCorps” and “NCCC, FEMA Corps, VISTA, or AmeriCorps State and National” to distinguish between the overall AmeriCorps program and each individual stream of AmeriCorps service.
Finally, PQ.33 (FQ.28) asked about the likelihood of including AmeriCorps experience on one’s resume. Interviewees were asked about the relevance of this question, and overall they felt it was more relevant to ask how the experience would be recorded on the resume. This change was incorporated into the final version of the survey, as it will generate more useful information for member training and development purposes.
Qualitative Feedback from PQ.32 (FQ.27) and PQ.36 (eliminated) [3]
In addition to cognitive interviews, we analyzed two open response questions (PQ.32 and PQ.36; FQ.27 and eliminated). PQ.32 (FQ.27) asked about members’ plans after the end of their term of service. It became clear that several respondents could have benefited from the ability to select more than one answer, as many listed their most preferred course of action as well as their contingency plans (e.g., “I applied for another year as a team leader if that does not work out I will probably go home and find a job”). A small number of respondents commented that they selected the “Do not know” response option because they wanted to choose multiple options. Since we intended respondents to choose their most preferred course of action, we edited the question wording to make this clearer. Additionally, it was suggested that we add an option for joining the military/armed forces, as well as an option for retiring. The option for military service was added to the final draft, along with an option for “Other” and a text box to provide a description.
PQ.36 (eliminated) solicited general feedback about items on the survey that were confusing or unclear. Some respondents were confused about the response options for PQ.3 (FQ.2), which asked about focus area. These are CNCS terms that are not necessarily used by programs, but given the need for brevity in the survey, we chose not to edit this question. PQ.10 (FQ.8), which came from the General Self-Efficacy Scale and asked about perseverance, struck one tester as oddly worded. Because this item came from a scale that had been successfully validated with a similar population, and because only one tester mentioned this language, we chose not to edit the question.
Similar to feedback provided in the cognitive interviews, several respondents (five) commented that the politics-related questions (PQ.14-17; PQ.14-16 eliminated, PQ.17 now FQ.12) seemed unexpected or did not fit with the rest of the survey. Based on the combined qualitative feedback, we chose to eliminate these questions. For PQ.19 and 20 (eliminated and FQ.13, respectively), some respondents (four) remarked that they were unsure of who qualified as their “neighbors.” These questions are place-based, so for an NCCC member, who is part of a team that frequently moves from one place to another, there was the possibility for confusion. In the final survey, we decided not to alter the question wording in order to preserve the ability to map our results back to the CPS data. Finally, some respondents were confused about PQ.30’s (FQ.25) “broader national service community” and “national AmeriCorps program” response options. Given the variety of interpretations observed here and in the cognitive interviews, we altered the response options to remove “broader national service community” and added an “Other (please specify)” option with a text box to capture members’ interpretations. Additional changes made to the survey based on qualitative feedback included adding an option to PQ.31 (FQ.26) to indicate that a stipend was chosen over an education award (relevant only for VISTA members).
Quantitative Analysis
The purpose of the analysis was to ensure that the pilot survey produced sufficient variation in responses, and that concepts measured by items from validated instruments were still measured after the scales were shortened. Quantitative analysis consisted of four parts (a computational sketch follows the list):
1. Calculating item reliability for all pilot survey items;
2. Generating item distributions;
3. Where survey items had been drawn from validated surveys (e.g., the Intercultural Sensitivity Scale), running factor analysis and item inter-correlations to assess similarity to the original sources;
4. Where items were drawn from the existing exit survey, analyzing the variation in those items to ensure it was sufficient, and confirming that the factor structure of those items had not changed.
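A minimal sketch of the reliability and factor checks described above, assuming pilot responses are exported to a CSV with hypothetical column names (`se_1` through `se_10` standing in for the ten numerically coded self-efficacy items):

```python
import pandas as pd
from sklearn.decomposition import FactorAnalysis

def cronbach_alpha(items: pd.DataFrame) -> float:
    """Cronbach's alpha for a set of numerically coded Likert items."""
    k = items.shape[1]
    item_var_sum = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_var_sum / total_var)

# Hypothetical pilot export; column names are assumptions.
scale = pd.read_csv("pilot_responses.csv")[[f"se_{i}" for i in range(1, 11)]]

print(f"Cronbach's alpha = {cronbach_alpha(scale):.3f}")
print(scale.corr())  # item inter-correlation matrix

# Unidimensionality check: a one-factor solution should show strong
# loadings on every item if the shortened scale is still unidimensional.
fa = FactorAnalysis(n_components=1).fit(scale)
print(pd.Series(fa.components_[0], index=scale.columns))
```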
The analysis focused on different sections of the pilot survey and produced a number of results suggesting that the items in the pilot instrument will yield more insightful data than the items they replaced in the earlier exit survey.
Items on “Reasons for Joining” (PQ.2; FQ.1)
The items in this section of the pilot were the same as those used in the earlier exit survey. The Cronbach alpha for this portion of the survey was quite respectable (.737). A factor analysis of these items yielded a factor structure similar to that observed with the earlier exit survey: the factor structure of motivation to join in the original exit survey consisted of three factors, together accounting for 62% of the variance, and the pilot exit survey also yielded three factors, which account for 61% of the variance.
Satisfaction with Training and Supervision (PQ.4, 5; FQ.3a, 3c)
The pilot survey used two new items that covered content similar to the items used in the earlier exit survey. Comparison of the distributions indicated that the new items yielded less extreme responses, which is a positive result. The old items had a top 2 box of between 79% and 84%; the new items had a top 2 box of between 59% and 65%. This means that responses to the new items are more dispersed and exhibit greater variation than responses to the previous items, which were clustered at either the positive or negative end of the scale. The new items thus appear to provide a less extreme view of member experiences, with a smaller proportion of members reporting very good or very poor training and supervisory experiences.
Table 3: Distribution of Training items
| Item | Top 2 Box | Bottom 2 Box |
|------|-----------|--------------|
| Old survey item Q.3 | 78.5% | 17.8% |
| Old survey item Q.4 | 79.2% | 18.1% |
| PQ.4 (FQ.3a) | 59.1% | 40.3% |
Table 4: Distribution of Supervisory Items
| Item | Top 2 Box | Bottom 2 Box |
|------|-----------|--------------|
| Old item Q.5 | 84.2% | 15.8% |
| PQ.5 (FQ.3c) | 65.2% | 32.6% |
The old items were moderately inter-correlated, with correlations between .47 and .64.
Table 5: Item Inter-correlations for Old Exit Survey Items
| Items | Q.3 | Q.4 | Q.5 |
|-------|-----|-----|-----|
| Old item Q.3 | 1 | .62 | .47 |
| Old item Q.4 | .62 | 1 | .64 |
| Old item Q.5 | .47 | .64 | 1 |
The new items PQ.4 (FQ.3a) and PQ.5 (FQ.3c) are modestly inter-correlated at .31.
Frequency of Various Activities While Serving in AmeriCorps (PQ.6-8; FQ.4-6)
The items used in this measure were based generally on similar items found in the original exit survey, which were then discussed with the programs and modified to reflect the reality that members experience during their service. The data from these 16 items reveal that the incidence of these activities is highly variable. Across the 16 items the incidence ranges from 28% to 92% which suggests that members may or may not encounter particular experiences but that some experiences are quite common. This variation in responses will be helpful for programs and agency staff to better understand (and possibly modify and improve) the member service experience.
In the table below we have provided top 2 box and bottom 2 box statistics describing the distribution of each item. Top 2 Box includes scores of “often” and “very often”; Bottom 2 Box includes scores of “rarely” and “never.”
Table 6: Top and Bottom Box Statistics for PQ.6-8 (FQ.4-6)
| Activities | Top 2 Box | Bottom 2 Box |
|------------|-----------|--------------|
| 6a. Solve unexpected problems or find new and better ways to do things. | 71.2% | 4.6% |
| 6b. Lead or facilitate a meeting or event. | 54.3% | 17.0% |
| 6c. Lead a team. | 41.8% | 28.3% |
| 6d. Help other individuals learn a new skill. | 49.7% | 15.2% |
| 6e. Support a meeting, activity, or event through planning or coordinating. | 75.7% | 5.7% |
| 7a. Gather and analyze information. | 67.8% | 9.0% |
| 7b. Set priorities for multiple tasks. | 85.9% | 2.8% |
| 7c. Meet deadlines effectively. | 92.1% | 2.8% |
| 7d. Work independently. | 89.9% | 2.3% |
| 7e. Work on a team for a common purpose. | 70.6% | 7.4% |
| 8a. Listen to other people's suggestions and concerns. | 89.9% | 4.0% |
| 8b. Negotiate and compromise with others. | 68.9% | 7.9% |
| 8c. Decrease conflict between people. | 27.7% | 37.8% |
| 8d. Work with people different from yourself. | 79.1% | 5.7% |
| 8e. Form organizational partnerships. | 55.9% | 14.7% |
| 8f. Leverage community resources. | 48.0% | 19.2% |
Cultural Competency Items (PQ.9; FQ.7)
As discussed in Part A, these items were selected from Chen & Starosta’s larger Intercultural Sensitivity Scale. Consistent with the work of those authors, a factor analysis of the reduced scale yielded a single factor. The items were highly inter-correlated (between .65 and .80) and had a strong Cronbach alpha value of .899.
Self-Efficacy Items (PQ.10; FQ.8)
These 10 items came from Schwarzer & Jerusalem’s General Self-Efficacy Scale. The authors’ psychometric analyses found that the items formed a single factor when factor analyzed; the same single-factor result was found here. The Cronbach alpha for these items was also quite high, at .896. The items in the scale had significant inter-correlations ranging from .21 to .71.
AmeriCorps Experience Measurement (PQ.11; FQ.9)
The 11 items used in this scale were the same measures used in the earlier exit survey. The factor structures that emerged from factor analysis of the pilot survey items and the earlier exit survey evidenced some similarity in structure. Comparison of the top 2 box responses from both sets of items yielded relatively similar results. In the pilot survey, the results were slightly less positive, which means that the data provided a less extreme picture.
Factor analysis of the original exit survey data yielded a two-factor solution which accounted for 65% of the common variance. One of the factors was “Change in perspective” which accounted for 34% of the variance. The items involved learning about the real world and re-examining attitudes and beliefs. The second factor was described as “contributing to and understanding the community” and accounted for 31% of the variance.
In the pilot study, factor analysis of the data yielded a three-factor solution, as shown below:
Table 7: Factor Analysis of PQ.11 (FQ.9) Data
Rotated Component Matrix for PQ.11 (FQ.9)

| Item | Component 1 | Component 2 | Component 3 |
|------|-------------|-------------|-------------|
| I felt I made a contribution to the community. | 0.769 | 0.163 | 0.153 |
| I re-examined my beliefs and attitudes about myself. | -0.067 | 0.833 | 0.115 |
| I was exposed to new ideas and ways of seeing the world. | 0.328 | 0.793 | 0.080 |
| I felt part of a community. | 0.742 | 0.357 | -0.083 |
| I learned more about the "real" world or "the rest" of the world. | 0.511 | 0.600 | 0.059 |
| I gained an understanding of the community(s) where I served. | 0.837 | 0.116 | 0.035 |
| I gained an understanding of the solutions to the challenges faced by the community(s) where I served. | 0.806 | 0.098 | 0.058 |
| I felt I made a difference in the life of at least one person. | 0.593 | 0.119 | 0.165 |
| I did things I never thought I could do. | 0.302 | 0.635 | 0.237 |
| I figured out what my next steps are in terms of educational goals. | 0.110 | 0.145 | 0.877 |
| I figured out what my next steps are in terms of career/professional goals. | 0.075 | 0.135 | 0.882 |
The first factor appears to reflect “making a contribution and understanding the community where I served” (41% of the variance). The second factor focuses on “thinking differently and doing new things” and accounted for 15% of the variance. The third factor reflects that, upon completion of service, members feel that they have “figured out their next steps in educational and career/professional goals”; it accounts for 11% of the variance. Thus, this factor structure is somewhat similar to that identified in the original exit survey.
As can be seen in the table below, PQ.11 (FQ.9) and old exit survey item Q.9 show results that are relatively similar. In general, results from the old version of the exit survey are more favorable.
Table 8: Top and Bottom Box Statistics for PQ.11 (FQ.9) and Old Exit Survey Item Q.9
| Item | Top 2 Box, Pilot (PQ.11) | Top 2 Box, Old Survey (Q.9) |
|------|--------------------------|------------------------------|
| I felt I made a contribution to the community. | 86.2% | 89.5% |
| I re-examined my beliefs and attitudes about myself. | 68.4% | 76.4% |
| I was exposed to new ideas and ways of seeing the world. | 82.2% | 83.2% |
| I felt part of a community. | 74.2% | 79.9% |
| I learned more about the "real" world or "the rest" of the world. | 64.4% | 76.6% |
| I gained an understanding of the community(s) where I served. | 89.6% | 76.6% |
| I gained an understanding of the solutions to the challenges faced by the community(s) where I served. | 85.0% | 87.3% |
| I felt I made a difference in the life of at least one person. | 87.9% | 82.8% |
| I did things I never thought I could do. | 59.7% | 73.1% |
| I figured out what my next steps are in terms of educational goals. | 44.3% | 70.2% |
| I figured out what my next steps are in terms of career/professional goals. | 59.2% | 73.1% |
| I felt defeated by the scope of the problems I worked on.* | 12.0% | N.A. |
| I re-examined my beliefs and attitudes about other people.* | 54.1% | N.A. |

*New items not used in the old exit survey.
Overall Satisfaction Measures (PQ.12; FQ.10)
PQ.12 (FQ.10) is identical to the item used in the old exit survey (Q.11), except that each of the response options is explicitly described. The distributions of these items are shown below (the old exit survey used a 5-point scale anchored only at the extremes, with one anchor being extremely satisfied and the other extremely dissatisfied). The data indicate that the pilot instrument, which describes the meaning of each response alternative, evidences less extreme positive responses. This suggests that describing each response option, rather than providing only two anchor points, will yield more valid data [4].
Table 9: Distributions for PQ.12 (FQ. 10) and Old Exit Survey Item Q.11
| Item | Extremely Satisfied | Less Extreme Satisfaction Responses | Extremely Dissatisfied |
|------|---------------------|--------------------------------------|------------------------|
| PQ.12 (FQ.10) | 34.5% | 63.8% | 1.7% |
| Old Exit Survey Item Q.11 | 49.7% | 49.1% | 1.2% |
Likelihood of Increased Civic Engagement Activity (PQ.13; FQ.11)
This set of questions focused on whether members expected to participate in a number of different civic engagement activities related to community organizations, voting, and keeping informed about public issues. The stems of these questions were identical in a number of cases on both instruments, but in the pilot survey the responses focused on how much more likely members were to engage in civic activities now that they have completed their service, rather than on how likely they will be to engage in those activities (as in the old exit survey). Thus, the pilot survey measures relative change, while the old exit survey measured the total likelihood of occurrence.
The response alternatives used in the pilot survey (PQ.13) had labeled response options indicating that respondents were much more likely, somewhat more likely, no effect, somewhat less likely and much less likely. In the original exit survey the response alternatives only had labels for the two extreme anchor points.
Comparison of top 2 box responses from both sets of items showed that the pilot survey responses were less extreme but generally in the same direction as those from the original exit survey. Indeed, since responses to the original exit survey were framed in terms of the total likelihood of an activity occurring, these responses should be more positive than responses to the new exit survey, which ask how much more likely a behavior is to take place. Table 10 below shows top 2 box results for items from both the pilot survey and the old exit survey.
Table 10: Top 2 Box Items from Pilot and Old Exit Surveys
| Activities | Top 2 Box, Pilot (PQ.13) | Top 2 Box, Old Exit Survey (Q.14) |
|------------|---------------------------|------------------------------------|
| Participate in community organizations (school, religious, issue-based, recreational) | 68% | 81% |
| Vote in elections | 31% | 71% |
| Keep informed about news and public issues | 61% | 77% |
| Help to keep the community safe and clean (pilot only) | 66% | N/A |
| Help to keep the neighborhood safe (old exit survey only) | N/A | 73% |
| Help to keep the neighborhood clean and beautiful (old exit survey only) | N/A | 75% |
| Volunteer for a cause or issue that I care about (pilot only) | 71% | N/A |
| Donate money or goods to a cause or issue that I care about (pilot only) | 60% | N/A |
These results suggest that a sizeable proportion of members who complete their service expect to be involved in civic activities in the future.
Measures of Civic Engagement (PQ.17; FQ.12)
The pilot survey contains one item from the Civic Responsibility Survey (Furco, Muller, & Ammon, 1998). The item asks how often the respondent discusses and thinks about how political, social, local, and national issues affect the community. This question appears as PQ.17 (FQ.12) in the pilot survey. Furco, Muller & Ammon’s psychometric analyses have not been updated since the survey was first created in 1998; at that time, however, the survey was demonstrated to be reliable, with an internal consistency of .93. The survey was also assessed for face validity at the time and was found to be sufficient.
Exhibit 2: PQ.17 (FQ.12) Distribution
As shown in the exhibit above, discussions concerning the impact of political, social, local, or national issues on the community appear to be quite common: 62% of respondents indicate that they have such discussions a few times a week or more frequently, while 11% do not have such discussions at all.
Trust and Confidence Questions (PQ.20, 21; FQ.13, 14)
The pilot survey includes one trust question and one confidence question that were part of the 2011 CPS Civic Engagement Supplement. These items focus on trust of the people in one’s neighborhood as well as confidence in three societal institutions: corporations, the media, and public schools. Comparisons of the pilot study results with CPS results are shown in the exhibits below.
Exhibit 3: PQ.20 (FQ.13) Pilot Data vs. CPS Data
These results suggest that AmeriCorps members are somewhat more likely to trust people in their neighborhood than were members of the general population sampled in the CPS.
Exhibit 4: PQ.21 (FQ.14) Pilot Data vs. CPS Data: Trust in Corporations
Data from this question suggest that members have somewhat less confidence in corporations than do members of the general public surveyed as part of the CPS: 41% of AmeriCorps members indicated that they had some or a great deal of confidence in corporations, while 65% of the general public (as evidenced by CPS responses) had some or a great deal of confidence in corporations.
Exhibit 5: PQ.21 (FQ.14) Pilot Data vs. CPS Data: Trust in the Media
The chart shown above suggests that AmeriCorps members have less confidence in the media (38% have some or a great deal of confidence in the media) than do members of the general public (55% of general public on the CPS indicated that they had some confidence or a great deal of confidence in the media).
Exhibit 6: PQ.21 (FQ.14) Pilot Data vs. CPS Data: Trust in Public Schools
As indicated in both the CPS and the pilot exit survey, most people have some or a great deal of confidence in the public schools in this country. This finding is true for AmeriCorps members as well as the population at large.
In general, it appears that pilot survey results from trust and confidence items closely match the distributions observed in the CPS results. However, the more youthful nature of the AmeriCorps member population may lead to some predictable differences between pilot study results and results from CPS interviews.
Overall, it would appear that the items on the new exit survey are comparable in many ways to those on the original exit survey or the CPS, while providing a greater degree of variability than was observed with the original exit survey.
Summary of Changes Made to Create Final Questionnaire
Relying on feedback from our working group and on our qualitative and quantitative analyses, we made the following changes to the pilot survey to create the final questionnaire:
Eliminated political questions;
Edited response options for PQ.23, 30-33 (FQ.16; FQ.25-28) to increase clarity and cover more respondents;
Added text boxes to PQ.3 (FQ.2) to provide space for qualitative response;
Edited down response options from old exit survey items (e.g. PQ. 8 [FQ.6]);
Changed anchoring of scales to have all response options explicitly labeled.
B5. Provide the name and telephone number of individuals consulted on statistical aspects of the design, and the name of the agency unit, contractor(s), grantee(s), or other person(s) who will actually collect and/or analyze the information for the agency.
CNCS will oversee data collection activities and analyze the data collected. The individuals at CNCS assigned to this project include:
Diana Epstein, Ph.D., Senior Research Analyst, 202-606-7564
Adrienne DiTommaso, MPA, Research Assistant, 202-606-3611
The Project Officer for CNCS is Diana Epstein, Ph.D., Senior Research Analyst, 202-606-7564.
TIAG has been contracted to build the survey interface in CNCS’s web portal. The COR at CNCS for this contract is:
Melissa Merens, Business Analyst, 202-606-6971
References
Chen, G. M., & Starosta, W. J. (1996). Intercultural communication competence: A synthesis. Communication Yearbook, 19, 353-384.
Flanagan, C., Syvertsen, A., & Stout, M. (2007). Civic measurement models: Tapping adolescents’ civic engagement. CIRCLE Working Paper 55.
Furco, A., Muller, P., & Ammon, M. (1998). Civic Responsibility Survey for K-12 Students Engaged in Service-Learning. Service-Learning Research and Development Center, University of California, Berkeley.
Schwarzer, R., & Jerusalem, M. (1995). Generalized Self-Efficacy Scale. Measures in health psychology: A user’s portfolio. Causal and control beliefs, 1, 35-37.
Appendix B: Cognitive Interview Protocol
AmeriCorps Member Exit Survey: Pilot Test
Cognitive Interviews- Script and Questions
Respondent Name:
AmeriCorps Program:
Interviewer Names:
Date:
Duration of call:
Follow up items:
Hi [respondent name], this is [interviewer name] calling from the Corporation for National and Community Service, how are you?
Is this still a good time for you to talk about your experience with our pilot test version of the member exit survey? [If no, reschedule a time on the phone]. [If yes:] Great!
Let’s cover a few quick points before we get started with some questions. First, we want to thank you for taking the pilot test version of the survey; this is going to help us immensely as we improve the formatting and content of the exit survey for future AmeriCorps members. Second, we want to remind you that your responses both to the survey and to the questions we’ll ask you today are only going to be used for the purpose of refining the final exit survey. Your data is being protected and you won’t be personally identified or linked to your responses. We’ll be scrubbing our interview notes today to remove your name or any other identifying info to protect your privacy. Finally, we want you to know that you can be as honest or blunt as you want; we really want to make sure this survey is in top shape before we reprogram it and send it out, so we welcome constructive criticism and you don’t need to hold back on your answers. Sound ok?
Ok, the way this will work is that we have a short list of questions, both about the ease of use of the survey and about some of the content of the questions. It might help you to have a copy of the survey in front of you, either on paper or on your computer. If you don’t have it handy, that’s ok, we can read you the specific questions that we ask about. We’re going to go through the list but please feel free at the end to add anything you think we missed that you think is important.
Did you feel like the instructions for taking the survey were clear and sufficient? Did you know what you were expected to do and how long it would take to do it?
How did you feel about the number of questions per page?
Did you feel that the time it took to complete the survey was reasonable?
[If the respondent has served multiple terms] When you took the survey, did you find that you referred to your most recent term of service or did you consider all of your service terms when answering the questions?
Is there anything you noted about the process of taking the survey you want to tell us about?
Survey Item Content Questions:
Q4 and Q5: How did you interpret the difference between the trainings we asked about in Q4 and Q5? [Clarify that Q4 relates to HQ-provided training and resources, not the program; clarify that Q5 refers to the placement or site]
Q6-8: Were these particular response options relevant to the work that you are doing/did during your AmeriCorps service? Were there any activities missing that you feel are critical or central to your work?
Q10 r.o. #2: “to get what I want”: what was your understanding of the wording of this response option?
Q11: How did you feel about the number of response options presented in the question? [Follow up with prompt if necessary] Did you feel like there were too many response options presented at once?
Q11: This question frequently refers to a “community”. How did you interpret “community” when answering the question? What community did you think about when you were answering the question?
Q18: How did you interpret “encourage others to participate in the community”?
Q19, 20: How did you interpret “neighbors” and “neighborhood”? [Especially for NCCC members, this may not be a single place or be place-based]
Q23: For this question, we assumed everyone was eligible to vote at the time of the last presidential election. Were you eligible to vote?
Q24: Were there too many response options for this question?
Q25: For this question, did you interpret “community” in the same way or in a different way than you did for Q18?
Q30: How did you interpret r.o. #1? r.o. #2 (what does this mean to you)? r.o. #3? r.o. #4?
Q31: Is there any option you felt was left out?
Q32: How did you interpret the phrase “immediately after”? Is there any option you felt was left out?
[After finishing question list or reaching end of scheduled time] Thank you so much for speaking with us, this has been really insightful and it will help us improve the survey. If you think of anything in the coming days that you want to tell us about, please feel free to email me or [note taker’s name] directly. Thanks so much for your time and your service!
[1] Note that all question numbers listed in this section refer to the pilot version of the survey, noted as PQ. Question numbers for the final version of the survey (FQ) are given in parentheses.
[2] It is important to note that these interviewees were VISTA members; VISTA strictly prohibits its members from engaging in politically motivated activity, and therefore these questions seemed provoking and offensive to these current members.
[3] These questions were related to the pilot process and do not appear in the final questionnaire.
[4] This change was made throughout the revised survey; while the old survey had only the most extreme response options explicitly labeled, the new survey will have each response option labeled.