
Youth CareerConnect Evaluation

Pretest Results Memo

OMB: 1290-0016


MEMORANDUM





P.O. Box 2393

Princeton, NJ 08543-2393

Telephone (609) 799-3535

Fax (609) 799-0005

www.mathematica-mpr.com



TO: Jessica Lohmann and Gloria Salas-Kos



FROM: Nan Maxwell, Alicia Harrington, and Veronica Severn

DATE: 9/1/2017


SUBJECT: YCC Student Follow-up Survey Pretest Findings



In preparation for fielding the Youth CareerConnect (YCC) student follow-up survey, Mathematica Policy Research conducted a pretest of the survey instrument. The pretest had four main objectives: (1) assess the overall organization and logic of all items; (2) capture respondents’ understanding of key terms and questions; (3) assess the accuracy and relevance of the questions and the completeness of the information captured; and (4) provide burden estimates.

This memo provides an overview of the pretest design and presents key findings from the debriefings with respondents. The final survey instrument is attached in Appendix A. In addition, prior to the start of the pretest, we received the first round of feedback from DOL reviewers on the survey instrument. Our responses to the comments from DOL reviewers can be found in Table 1.


A. Pretest Methodology

The pretests were conducted in stages. In the first stage, respondents returned a completed parent consent form and student survey instrument. In the second stage, they participated in a telephone debriefing conducted by trained Mathematica staff using a retrospective protocol (Appendix B) to identify any issues with the survey instrument.

We conducted a total of five interviews as part of the pretesting. All pretest respondents received a $25 gift card for completing the survey instrument and the debriefing.

Mode. The pretests were self-administered using a paper version of the instrument. Because the survey will ultimately be self-administered either on paper or on the web, this format best approximates the manner in which it will be fielded. The survey instrument was sent to respondents via email. Respondents were asked to print and complete the instrument and then scan and return it to Mathematica by email or fax. They were instructed to keep a copy of their completed instrument as a reference during the respondent debriefing call, which was conducted upon receipt of the completed hard copy.

Sample. We conducted pretests with students referred by program directors from two existing YCC grantees (not participating in the follow-up survey administration) to ensure the pretest sample was similar to the population that will complete the survey. The sample consisted of students in the YCC program, as well as students who did not participate in YCC, similar to those in the study’s comparison group.

B. Debriefing Findings

Feedback provided during the debriefings for the student follow-up survey instrument produced burden estimates and minimal updates to the instrument. The average administration time was 22 minutes. Only a few small text changes were undertaken as a result of the pretesting.

Student debriefings revealed that most respondents were unable to accurately define the term “capstone course.” To clarify the term, we added the definition “A capstone course is a class usually completed at the end of high school that uses skills and information you have learned throughout your education” to the survey item. Although most students were familiar with the term “dual enrollment,” one student expressed confusion about it. We changed the description from “These are courses that are taken during high school, but also count as college credits” to “These are courses that are taken while you are still enrolled in high school and, if passed, count towards credit for a college degree.”

Some students were confused by the term “community service learning.” We modified the item to include a description of the term, “Community service learning is a hands-on learning activity for students that also includes a community service component.” All respondents indicated that they were unfamiliar with the term “skill badge.” We modified the item to read, “Earn a ‘badge’ for a specific skill, talent or other achievement.”

Many respondents were unfamiliar with the types of degrees or certificates that they might earn through their high school coursework or activities. To provide examples, we used data from the PTS to add the text, “For example, you may have earned a CNA license or a certification for CPR, Microsoft Office programs, Adobe, OSHA, AutoCAD, or other occupational skills.” Respondents also indicated confusion with the term “vocational certificate.” We changed the definition provided from, “A vocational certificate is a certificate from a college or trade school for completion of a program providing job-focused training for specific careers such as physician’s assistants, paralegals, pharmacy technicians, automotive mechanics, or information systems programmers.” to “A vocational certificate is a certificate you might receive from a college or trade school after completing a program that prepares you for a specific career.”

C. Mathematica’s Responses to DOL Reviewer Feedback

DOL reviewers provided feedback on the follow-up student survey instrument draft prior to the pretest effort (Appendix C). Revisions made to the survey instrument in response to DOL feedback were included in the version of the instrument that was pretested. Table 1 outlines the item number (consistent across instrument versions), comments made by reviewers, and how each comment was addressed.

Table 1. Mathematica’s responses to DOL reviewer feedback

Item: Introduction

DOL comment: Make the text “The survey should take around 35 minutes to complete. To thank you for completing the survey, we will send you a gift card worth [FILL $25 OR $40]. The card can be used anywhere that a credit or debit card can be used.” more prominent by highlighting or bolding it.

Survey revision: We bolded the text.

Item: Introduction

DOL comment: Is it necessary to state the sentence, “Your information may be linked with federal or state administrative data, such as your school grades or attendance record, for future study purposes”? If not, I suggest deleting it.

Survey revision: We made no change; this text is required for IRB approval.

Item: Introduction

DOL comment: Add a positive comment, “However, with your input youth programs like [YCC or the name given by the school district] can become stronger, more appropriate, and better able to meet youth needs in the future,” to the text, “Nothing bad will happen to you if you don’t want to participate. You can stop being in the study or completing the survey at any time.”

Survey revision: We added the text, “However, with your input, youth programs can become stronger, more appropriate, and better able to meet youth needs in the future.” We did not include the reference to YCC because many students might not know their program by this name.

Item: I3/I4

DOL comment: Can this not be matched administratively or prepopulated with the name?

Survey revision: We made no change; we need these questions for verification purposes to compare against the baseline and ensure we have the correct student.

Item: I5

DOL comment: Is there a reason students should not list a school email address? What if the school email is the one they use most or their only address?

Survey revision: We changed the instruction to read, “Please do not list a school email address unless it is the only email address you use.” We want to avoid school emails, if possible, because survey administration may extend into the summer, at which time many students may not check this email.

Item: A6b/A7b

DOL comment: Should there be a “do not know” answer for all the grade questions?

Survey revision: We did not add a “don’t know” option to these items because respondents may be more likely to select that option than to provide a grade. Respondents will be able to skip the question without entering a response if they do not know what grade they earned.

Item: A13

DOL comment: Do you want to give some examples of support services?

Survey revision: We modified the question to read, “Since [SCHOOL START DATE], did you receive any support services at school other than the ones previously listed?”

Item: B1

DOL comment: Are there common examples that could be given here or in the next question?

Survey revision: We added the text, “For example, you may have earned a CNA license or a certification for CPR, Microsoft Office programs, Adobe, OSHA, AutoCAD, or other occupational skills.”

Item: D1h

DOL comment: Diligent about what? Say instead, “I am conscientious and care about my work.”

Survey revision: We made no changes. This item is from the grit measure and was used at baseline.

Item: D3

DOL comment: If respondents select no, it may be interesting to ask if they have a child support obligation.

Survey revision: We added D3a to ask, “Do the children rely on you for financial support?”

Item: D4c

DOL comment: Some people take regularly prescribed medications. Maybe include a fourth question to ask if they are under a doctor’s care and take prescription medications, or maybe say “illegal” to make sure they aren’t answering for their own prescriptions.

Survey revision: We made no changes. This item was used at baseline, so we will keep it as is to compare changes.




An Affirmative Action/Equal Opportunity Employer
