TO: Savi Swick
Employment and Training Administration (ETA)
U.S. Department of Labor (DOL)
FROM: Green Jobs-Health Care Impact Evaluation Team
Abt Associates Inc.
Mathematica Policy Research, Inc.
SUBJECT: 18-month survey pre-test results
DATE: July 31, 2012
This memo summarizes the pre-test results from the Green Jobs 18-Month Follow-Up Survey conducted by Mathematica Policy Research. The purpose of the pre-test was to estimate survey length, assess respondents’ understanding of the survey questions, and identify improvements in the flow and structure of the instrument.
A. Testing Details and Procedures
Nine pre-test interviews were conducted from June 22 to 28, 2012, by five Mathematica survey staff: Laurie Bach, Stephanie Boraas, Derekh Cornwell, Mindy Hu, and Brian Roff. To estimate timing and approximate the fielding conditions for the computer-assisted telephone interview (CATI) survey, staff conducted five pre-test interviews over the phone. The other four interviews were conducted in person, which allowed the interviewers to observe the participants’ reactions to the questions and to probe for more information after the survey was completed. The survey was administered to all nine participants without interruption. After each survey was completed, the interviewer asked the participant a standardized set of follow-up debriefing questions, which probed issues related to recall of information, understanding of key terms and concepts, and perceptions of flow and question redundancy. To thank the respondents, we paid each a $25 incentive for participating in the hour-long interview and debriefing. Results from the interviews were reviewed, and recommendations based on these findings are outlined in Table 3 below.
B. Recruitment and Respondent Profiles
Respondents were recruited from One-Stop Career Centers and community training organizations in four major metropolitan areas: Washington, DC; Oakland, CA; New Brunswick, NJ; and Chicago, IL. Respondents from the DC area were recruited from the Workforce Development Center (WDC) of United Community Ministries (UCM), a nonprofit organization located in Northern Virginia. WDC assists area residents in finding jobs or improving their employment situation through career counseling, resume preparation, computer training, and other services. Oakland respondents were recruited from the Oakland One-Stop Career Center—Downtown, which offers computer training, on-site employer recruitment, career counseling, job-search assistance, and pre-employment and life-skills training. Respondents from New Brunswick were recruited from the Middlesex County One-Stop Career Center, a comprehensive One-Stop center offering services such as workforce development, job-search assistance, career counseling, and re-employment workshops. Participants in Chicago were recruited from the Jane Addams Resource Corporation, a nonprofit community-development organization that offers adult learning programs and training courses for jobs in the manufacturing sector. Table 1 summarizes the mode of the interviews and demographic characteristics of the pre-test sample members.
Table 1. 18-Month Follow-Up Survey: Pre-Test Respondent Profiles
Characteristic | Category | Number of Respondents
Interview mode | In-person | 4
 | Phone | 5
Gender | Female | 6
 | Male | 3
Age | 23–29 | 1
 | 30–39 | 1
 | 40–49 | 2
 | 50+ | 5
Highest education level attained | No information | 1
 | Less than high school | 2
 | High school graduate/GED | 4
 | College graduate | 2
Currently enrolled in training | Yes | 1
 | No | 8
Race/ethnicity | White | 2
 | Black or African American | 6
 | Other (black/Hispanic) | 1
Total sample size | | 9
C. Key Findings
1. Interview Length
The estimated average time needed to complete the pre-test interview was 36 minutes. Table 2 shows the average timing estimates for each survey section. To produce the expected average for all sections, we assumed that respondents would take 2 minutes to answer the four basic screening questions in Section A (which was not administered to pre-test participants). Individual timing estimates for the survey ranged from 24 to 52 minutes. This variance stems largely from the substantial differences in respondents’ employment and training histories. A few respondents, for example, had experienced long periods of unemployment and had not been involved in career training or adult education within the past 18 months; for these respondents, many survey questions were not applicable. This was particularly true for the two largest sections of the survey, Sections B and D, which address employment history and training experiences, respectively. In contrast, some respondents reported being actively involved in employment and job-training activities within the past 18 months. Overall, seven respondents reported being employed at some point within the past 18 months, and six respondents indicated that they had participated in education or training courses within the same time frame.
As shown in Table 3, the pre-test results highlighted several areas where additional modifications and survey questions may be needed. Although some of these additions will increase survey time, other proposed changes, such as streamlining the skip logic by adding screening questions, will likely reduce survey time. Moreover, once the survey is programmed into the CATI system, the skip logic will execute automatically, so interviews should move through the instrument much more efficiently.
Because the maximum target length of the survey is 40 minutes, we will work to ensure that any future modifications do not cause the average survey length to exceed this limit.
Table 2. Survey Length, in Minutes
Section | Mean
A. Introduction | 2*
B. Employment | 8
C. Barriers to Employment & Opinions About Work | 2
D. Service Receipt & Educational Outcomes | 13
E. Financial Hardship | 3
F. Current Family Status & Demographics | 1
G. Income & Receipt of Public Benefits | 5
H. Contact Information | 2
Total | 36
*Estimate based on assumed time needed to answer four screening questions.
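As a check on the arithmetic, the expected average in Table 2 is simply the assumed Section A time plus the observed means for Sections B through H. The short Python sketch below is purely illustrative and reproduces that calculation; the 2-minute figure for Section A is the stated assumption, not an observed value.

```python
# Sketch of the timing build-up behind Table 2. The Section A figure is an
# assumption (that section was not administered in the pre-test); the
# remaining means were observed across the nine pre-test interviews.

ASSUMED_SECTION_A_MINUTES = 2  # four screening questions, not pre-tested

observed_section_means = {
    "B. Employment": 8,
    "C. Barriers to Employment & Opinions About Work": 2,
    "D. Service Receipt & Educational Outcomes": 13,
    "E. Financial Hardship": 3,
    "F. Current Family Status & Demographics": 1,
    "G. Income & Receipt of Public Benefits": 5,
    "H. Contact Information": 2,
}

expected_total = ASSUMED_SECTION_A_MINUTES + sum(observed_section_means.values())
print(f"Expected average interview length: {expected_total} minutes")  # 36
```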
2. Survey Items
The debriefing protocol focused on the ability of respondents to recall details of their employment and education/training histories within the past 18 months. It also gauged whether certain concepts and terms used in the survey were difficult to understand and assessed general perceptions of survey flow. The results indicate that, overall, respondents were able to recall details of their experiences with little difficulty. When asked about the difficulty of reporting events that occurred within the past 18 months, such as the start and end dates of training courses and jobs, the locations of courses, and the dollar amount of earnings, most respondents reported that these were relatively easy to remember and that they were confident in the accuracy of their answers.

With respect to understanding core concepts and terms used throughout the interview, the pre-test results were more mixed. Several respondents reported difficulty understanding concepts such as “on-the-job training,” “defined career path,” and “job-placement assistance.” The findings also revealed issues related to the flow and redundancy of certain survey items, such as the source-of-income questions in Section G. Several respondents said that many of the items listed in Section G were redundant and that the question battery was too long.
We can draw three general conclusions from the pre-test results:
- The current time estimate for the instrument is below the targeted maximum average of 40 minutes, but the need to add questions and make refinements may increase the time.
- The 18-month recall ability of pre-test respondents appears to be strong, and respondents report a high degree of confidence in the accuracy of their answers.
- Certain concepts and questions will need to be modified to ease the burden on respondents and to improve the quality of the data collected.
Table 3 summarizes the key findings from the pre-test and proposed changes.
Table 3. Key Findings and Proposed Changes
B. Employment

Issue 1 (Section B): There is currently no way of gauging the primary activity respondents have been engaged in around the interview date if they are not working or in training.
Recommendation 1: Add a question asking about a respondent’s primary activity during the previous week, similar to the question used in the Current Population Survey. The response categories should also include an option for military service.

Issue 2 (B2, B3, B5, B7): Questions about current and past employment do not consistently clarify that the respondent should report all types of employment for pay.
Recommendation 2: Standardize the text in B2 and re-use it for subsequent questions; interviewers should read aloud the types of employment that qualify and state that all jobs must be for pay.

Issue 3 (B8, B10): Pre-test respondents appeared to have strong recall ability and were able to remember their start and stop dates of employment with a high degree of confidence. Based on these findings, it may be possible to increase the precision of the self-reported data collection.
Recommendation 3: Change the question wording to ask for the day (in addition to the month and year) that a respondent started and stopped working. If a respondent cannot remember the day, accept the month and year. If a respondent can remember a beginning, middle, or end of the month, the survey will contain fields to code the responses accordingly.

Issue 4 (B13c): Some pre-test respondents were confused by the term “defined career path.” Respondents were uncertain whether this had to be part of the position itself or whether a motivated employee could define a career path for him- or herself.
Recommendation 4: Provide a definition of “defined career path” that emphasizes that opportunities for promotion and growth are specific to a given employer.

Issue 5 (B24): The first response code for the question is redundant with a “yes” answer to B23, which asks whether a respondent has been covered continuously for the 18-month period since the random-assignment date. No respondent in B24 should be coded as 1.
Recommendation 5: Change the skip logic so that a respondent is asked B24 only if his or her answer to B23 is “no.”

Issue 6 (B25): Three issues: (1) The response categories are wordy and can confuse respondents. (2) The wording of the question and the labels on the response categories refer to “types” of insurance, but the categories really refer to “sources” of insurance. (3) There is no response category for respondents who receive coverage through their parents.
Recommendation 6: (1) Shorten the text that interviewers need to read by dropping the “A health insurance plan…” leader on each answer category. (2) Emphasize in the wording that the question is about sources of coverage. (3) Add a response category for coverage through a parent’s or guardian’s plan, which may be more relevant for younger respondents.
C. Barriers to Employment & Opinions About Work

Issue 1 (Section C): The section does not have an introduction.
Recommendation 1: Add introductory language such as “We are interested in learning more about your household and issues that may affect your ability to work…” to ease the transition and refocus the respondent.

Issue 2 (C1): Two issues: (1) Child care needs may not be applicable to respondents without children, but the “no children in household” option is not a response the interviewer is instructed to read aloud. This is likely to lead to data errors or inconsistencies. (2) Some respondents may provide care for elderly family members rather than children.
Recommendation 2: (1) Add two screening questions that ask respondents about the size and composition of their household, and include the number of children younger than 12 in the household as a subquestion. Only ask C1 if a respondent reports having a child younger than 12 in the household. (2) Add a follow-up question with pre-coded responses about whether respondents face any other barriers to employment, and include caring for elderly family members in the household as an answer option.

Issue 3 (C1, C2, C4): Questions about child care, transportation, and health conditions that may affect the ability to work do not include a time reference.
Recommendation 3: Add an explicit time reference for the past month, and then ask about the period between the respondent’s random-assignment date and the start of the past month (for example, between [random-assignment date] and December 1, 2012).

Issue 4 (C3): The placement of the reservation-wage question disrupts the flow of the battery because it is not clearly connected to the surrounding questions.
Recommendation 4: Move the reservation-wage question to the end of the battery, but before Question C5.

Issue 5 (C5–C8): The question battery about criminal history does not include transition language and may be off-putting to respondents.
Recommendation 5: Introduce this subsection with transition language such as “In this section, we are interested in learning about other issues that may affect employment, such as past arrests or criminal convictions. Please be assured that all responses to these questions will be kept private and will never be associated with your name.”
D. Service Receipt & Educational Outcomes

Issue 1 (Section D): Two issues: (1) The section does not contain any questions about respondents’ use of student loans, even though this is of policy interest. (2) The section does not contain questions about enrollment in and experiences with career-preparation courses.
Recommendation 1: (1) Include questions about the dollar amount of student loans that respondents have borrowed when asking about the financing sources for their training courses. The response category can be open ended and recoded later for analysis. (2) After Question D1d, include a question numbered D1e that asks respondents whether they have taken any courses focusing on school, work, or general life skills. Interviewer instructions may be needed to clarify what types of courses would qualify. Respondents who answer “yes” to this question will be asked the full battery of questions about the number, location, and financing of these courses.

Issue 2 (D1d): Two issues: (1) Respondents were not clear about the definition of vocational training, and some had difficulty distinguishing it from adult basic education. (2) The question must clearly state that vocational training courses cannot be for college credit, to avoid double-counting them with courses that provide credit toward a college degree.
Recommendation 2: (1) Provide an explanation of vocational training in the question that clearly differentiates it from adult basic education. The question could be modified to read “By vocational training, we mean courses or programs where you are trained for a specific occupation. This training usually leads to a certificate or license in a specified field.” This definition can be repeated for subsequent questions referencing vocational training, such as D7. (2) Modify the question to instruct respondents to include only training that was not for college credit.

Issue 3 (D4–D7): Reading the answer categories as part of the question adds to survey length and appears to disrupt the flow of administration.
Recommendation 3: Eliminate the answer categories from the question text and instead allow respondents to provide an open-ended answer as to how many courses or training classes they took. The interviewer can then map these responses to precoded categories, which should save time and improve flow.

Issue 4 (D5): Some pre-test respondents were confused by the use of the term “training programs” to refer to preparation for a high school diploma.
Recommendation 4: Replace the term “training” with “courses” or “classes,” and use these terms consistently throughout the survey when referencing activities other than vocational training.

Issue 5 (D9g, D9h, D10g, D10h, D11h, D11i, D12j, D12k): A pre-test respondent indicated that his training was paid for by student loans. The answer categories do not allow for this response, which should be distinguished from paying for courses out of pocket without debt.
Recommendation 5: Modify the existing source-of-funding categories to distinguish between students who pay for their courses out of pocket and students who pay with student loans or borrowed money. Also clarify in the question wording that the focus is on tuition payments, not on books or other materials. Clarify in interviewer training that financial gifts from parents, relatives, and so on would be considered out-of-pocket payment.

Issue 6 (D10f): The question does not ask when the high school diploma was awarded.
Recommendation 6: If the respondent answers “yes” to D10f, ask for the date of award (month, day, and year).

Issue 7 (D11k): The question “Did this payment cover the total cost, a year, a semester, a quarter, or some other portion?” may be confusing and difficult for certain respondents to answer.
Recommendation 7: Reword the question to ask about the portion of the tuition that the respondent or the respondent’s family covered.

Issue 8 (D12): The vocational training series does not currently have a question about the main or primary job for which a respondent was trained.
Recommendation 8: Add an open-ended question about the primary job that the vocational training prepared the respondent for.

Issue 9 (D12f): If a respondent answers “yes” to the question, there is no follow-up asking for the date the credential was received.
Recommendation 9: Add a follow-up question asking for the date (month, day, and year) the credential was awarded if the answer to D12f is “yes.”

Issue 10 (D12h): The question asks whether the respondent got a new job or a promotion as a result of the vocational training program, without differentiating between those two outcomes.
Recommendation 10: Separate new jobs from promotions by asking a separate question about each.

Issue 11 (D13, D15a, D15d, D15e): Several respondents did not have a clear understanding of concepts such as on-the-job training (D13), academic advising (D15a), career counseling (D15d), and job-placement assistance (D15e).
Recommendation 11: Provide definitions and/or simple examples of these concepts in the question text that is read to respondents. For D13, add “or clinical experience or practicum” to the question text.

Issue 12 (D15 series): The series does not ask respondents how many times they received academic advising, career counseling, job-placement assistance, or financial-aid advising.
Recommendation 12: Modify the D15 series to ask about frequency of receipt for all services that a respondent indicates he or she received.

Issue 13 (D16): Two issues: (1) Pre-test respondents indicated that the D16 series was redundant because the D15 series appeared to ask about many of the same types of assistance. (2) Some answer categories provided in D16 are not mutually exclusive; for example, the first item, “Assistance in searching for work,” overlaps with other answer choices.
Recommendation 13: (1) Reduce and consolidate the number of answer categories to clearly differentiate these additional sources of assistance from those already asked about in D15 and to reduce overlap between categories. (2) Replace the first item, “Assistance in searching for work,” with a category labeled “Any other assistance looking for work,” and move it to the end of the list to serve as a general category for any forms of assistance not listed.

Issue 14 (D17): Some respondents were confused by the answer category “Working clothes or tools?”
Recommendation 14: Change the category to offer only “clothes” as an option and eliminate the modifier “working.” The “clothes” response category should also be moved to the top of the list so that it immediately follows the question text and its context is clear to respondents. Similarly, “tools” could become a separate category, because combining clothes and tools may confuse respondents.

Issue 15 (D18): The precoded response categories may not be extensive enough to capture respondents’ open-ended answers to the question.
Recommendation 15: Include additional precoded answer categories such as “offers good job stability” and “it’s a growing field.”
E. Financial Hardship

Issue 1 (E1): Some pre-test respondents were unclear about the definition of household used in the survey, because interviewers are instructed to offer a definition only when probing.
Recommendation 1: Include a definition of household directly in the question text to remind respondents.

Issue 2 (E3 and E4 series): The E3 questions about foreclosures and mortgage defaults will not be applicable for many, if not most, respondents. Although the question includes a “not applicable” category, this may be awkward to implement because it needs to be read by the interviewer. The E4 series on renting, although likely to be more applicable, should also be preceded by a screening question to help with flow.
Recommendation 2: Insert a screening question before the E3 and E4 series asking whether the respondent has owned or rented a residence since the random-assignment date. Answering “yes” to the home-ownership question will prompt the E3 series; answering “yes” to the renting question will prompt the E4 series. A respondent who answers “yes” to both will be routed to both series (see the illustrative skip-logic sketch following Table 3).

Issue 3 (E5e): The question is redundant with Question G1m in Section G, which asks about receipt of financial assistance and the amount of that assistance.
Recommendation 3: Cut Question E5e.
G. Income & Receipt of Public Benefits

Issue 1 (G1a series): Two issues: (1) Respondents indicated that the series provides a long list of programs and income sources that is tedious to listen to and seems redundant; the perceived tedium may lead to inaccurate answers. (2) Some pre-test respondents did not remember to include all household members in their answers about income sources and benefits.
Recommendation 1: (1) Shorten the list of benefit questions by using an alternative grouping of government programs. After asking about specific programs such as TANF, SNAP, UI, SSI, and WIC, group other programs by source of funding. One group of questions might list together sources such as financial support from family and friends (G1m) and child support and alimony payments (G1k). A second group might ask about other forms of government assistance, such as General Assistance (G1d), Trade Adjustment Assistance (G1h), or Disability Insurance (G1j). A third group could list retirement benefits and other income sources together (G1f, G1g, and G1l). (2) Include a definition of “household” in the G1a question so that respondents remember to include all relevant household members in their income calculations.
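To illustrate how screening recommendations of this kind could translate into CATI routing once the instrument is programmed, the sketch below expresses the proposed E3/E4 skip logic (Section E, Recommendation 2) in Python. This is a minimal illustration only: the item names, the ask() helper, and the series functions are hypothetical placeholders, not features of the actual CATI system.

```python
# Minimal sketch of the proposed E3/E4 skip logic (Section E, Recommendation 2).
# All names here are hypothetical placeholders; the real routing would be
# implemented in the CATI system used for fielding.

def ask(question_text: str) -> bool:
    """Placeholder CATI prompt; returns True for a 'yes' response."""
    return input(question_text + " (yes/no): ").strip().lower() == "yes"

def administer_e3_series():
    print("[E3 series: foreclosure and mortgage-default items]")

def administer_e4_series():
    print("[E4 series: renting items]")

def route_section_e():
    # Proposed screening questions inserted before the E3 and E4 series.
    owned = ask("Since [random-assignment date], have you owned your residence?")
    rented = ask("Since [random-assignment date], have you rented a residence?")

    if owned:
        administer_e3_series()  # asked only of respondents who report owning
    if rented:
        administer_e4_series()  # asked only of respondents who report renting
    # Respondents who answer "yes" to both are routed to both series;
    # those who answer "no" to both skip E3 and E4 entirely.
```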