Pregnancy Assistance Fund Feasibility and Design Study (Positive Adolescent Futures)

OMB: 0990-0424

ATTACHMENT G
BASELINE PRETEST MEMO

111 East Wacker Drive, Suite 920
Chicago, IL 60601-4303
Telephone (312) 994-1002
Fax (312) 994-1003
www.mathematica-mpr.com

MEMORANDUM

TO:       Amy Farb
FROM:     Laura Kalb, Jennifer Walzer, and Sarah Forrestal
SUBJECT:  Pretest Findings for the Pregnancy Assistance Fund
DATE:     3/14/2014
PAF - 009

This memo describes the Pregnancy Assistance Fund (PAF) pretest that Mathematica Policy
Research conducted on February 13, 2014, to improve the instrument for baseline data
collection. The memo describes 1) the youths selected for the pretest, 2) the pretest
and debriefing process, and 3) overall findings about the clarity and relevance of the survey
content. At the end of this memo, we include the questions from the baseline survey and
comments on issues discovered during the debriefing, along with possible solutions to these
issues (Appendix A). We also include a version of the survey instrument with the suggested
revisions in track changes (Appendix B). We recognize that the possible solutions will require
further discussion before making modifications to the instrument.
A. RECRUITING PROCESS AND FINAL PRETEST SAMPLE
Mathematica worked with a Chicago community-based organization, Options for Youth,
which serves pregnant and parenting young women, to recruit respondents for the pretest.
Mathematica staff met with the director of Options for Youth and explained the study, pretest
process, and parental consent forms. Options for Youth program staff identified a group of young
women in one of their programs who matched the criteria for potential survey respondents.
Program staff explained the study and handed out parental consent forms to interested
participants. They reached out to 11 adolescents in an attempt to ensure we had nine pretest
participants. Mathematica staff collected signed consent forms at the time of the pretest.1 The
pretest and debriefing interviews were conducted at Options for Youth’s program site in
Chicago, Illinois. Pretest respondents arrived at their usual meeting time for the program, aware
that this week they would be working with Mathematica on the pretest and debriefing. Upon
completion of the debriefing, participants were given $50 gift cards for their participation and
time. We also distributed gift cards to two additional respondents who were recruited and came
to the pretest location but were not included in the pretest and debriefing due to OMB
constraints.

1 Three of the pretest respondents had reached the age of majority and therefore signed the consent form for themselves. Consent forms for minors were signed by parents or legal guardians.

In total, nine pregnant or parenting young women participated in the pretest. Table 1
summarizes the characteristics of the pretest population.
Table 1. Pretest Population Characteristics

                   Respondent Type                   Educational Attainment
Age       Parenting,       Pregnant,         Enrolled in      Enrolled in      Total
          Not Pregnant     Not Parenting     High School      College
15–16          3                0                  3                0             3
17–18          2                1                  3                0             3
19–20          3                0                  1                2             3
Total          8                1                  7                2             9
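
As an informal consistency check on Table 1 (not part of the pretest materials or the memo's analysis), the short Python sketch below tallies each age group by respondent type and by educational attainment and confirms that both groupings sum to the nine participants.

```python
# Illustrative check of the Table 1 counts only; counts are copied from the memo.
# Each age group is tallied two ways: by respondent type and by educational
# attainment. Both groupings should cover the same nine young women.
table = {
    "15-16": {"parenting": 3, "pregnant": 0, "high_school": 3, "college": 0},
    "17-18": {"parenting": 2, "pregnant": 1, "high_school": 3, "college": 0},
    "19-20": {"parenting": 3, "pregnant": 0, "high_school": 1, "college": 2},
}

total = 0
for age, counts in table.items():
    by_type = counts["parenting"] + counts["pregnant"]
    by_education = counts["high_school"] + counts["college"]
    assert by_type == by_education, f"row {age} does not balance"
    total += by_type

print("Total pretest respondents:", total)  # expected: 9
```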

Our pretest sample was slightly different from the expected PAF study population. Most of
the pretest sample members were parents, but the PAF study population will primarily be
enrolled when they are pregnant. Very few members of the evaluation sample will have
participated in a subsequent pregnancy prevention program, but among the pretest participants,
all but one of the respondents had been in a program focusing on subsequent pregnancy
prevention for one to three years. Pretest sample participants in the program had already received
training on long-term contraceptive use and were required to use a long-term method of birth
control. Many terms in Section 5 of the survey were therefore familiar to them already. We do
not expect the evaluation sample to have such knowledge at baseline. Keeping this in mind,
throughout the debriefing, we asked the young women more general questions about how “other
young women your age” might understand a term. The pretest respondents were, however,
representative of the literacy levels we might expect to encounter in the actual study. This was
helpful for the more general sections and for identifying wording issues.
B. PRETEST AND DEBRIEFING PROCESS
The administration of the pretest occurred in the room where the young women normally
attend their weekly program meetings. After everyone gathered and took a seat, a researcher
gave the introduction, read through the assent form, and had the young women sign the assent
form before starting the survey. Then the researcher gave verbal instructions about how to
complete the questionnaire and asked the young women to circle any questions or words that
were confusing or unclear so that we could address these during the debriefing. We explained
that upon completion of the survey, everyone would break into small groups (two young women
paired with one researcher) for the debriefing and that when the debriefing was done, they would
receive their incentive gift card. We reminded participants that their actual answers to the survey
questions were less important than the process of formulating and providing the responses and
that they did not have to reveal their answers to the group.

We asked the pretest respondents to record start and end times on the front and back of their
surveys so researchers would know how long each respondent took to complete the self-administered questionnaire. Once the respondent finished the questionnaire, the researcher
assigned to interview her confirmed the end time and collected the questionnaire for a review
before the debriefing. The researcher reviewed any comments respondents wrote or items they
circled in the instrument and made notes on a blank copy. We did this so the young women could
look at their individual surveys during the debriefing and the researcher could have notes from
both respondents in the group in one place. Pretest participants were interviewed either
individually or in a pair, and the debriefing interviews lasted approximately one hour.
Before the pretest, all five researchers involved attended a two-hour training to review
logistics, best practices for talking with youth about sensitive subjects, what to prioritize
during the debriefing, how to address issues or problems should they arise, and the pretest
debriefing interview guide. Although the debriefing
guide included specific probes for many items in the survey, each researcher was given latitude
to rephrase the questions as needed and to choose which items to ask about if time ran short. The
debriefing guide focused on how the respondent came up with her answer, that is, the process
she went through to arrive at the answer she recorded on the form; whether she followed
instructions and completed the survey as expected; and whether there were any questions or
words that were confusing or out of date. In a few cases, researchers gave alternate question
wording or answer categories to respondents during the debriefings and asked them which
versions they preferred.
C. SURVEY ADMINISTRATION OVERALL
We were interested in learning more about several survey administration issues, including
the length of time needed to complete the questionnaire, whether our instructions were clear and
followed, and whether questions with skip logic were understandable. Overall, no major issues
were identified in these survey administration areas. The pretest respondents had little trouble
completing the instruments and following directions.
• Length of the survey instrument. The average time for completion on the pretest was
22 minutes (range 15–30 minutes). For actual survey administration, Mathematica
had planned 20 minutes for filling out the survey and 10 minutes for obtaining
participant contact information for the longitudinal needs of the survey. Although
administration was slightly over the planned 20 minutes, the pretest times may be
somewhat higher than in an actual administration because respondents were
instructed to circle unfamiliar terms and to think about any questions or
concerns for the planned debriefing conversations.
• Instructions on how to complete the questionnaire. During the debriefing, we asked
if the respondents read the instructions, and if they did, whether the instructions were
confusing in any way. All respondents reviewed the instructions and reported that
they were clear and straightforward. However, several respondents mentioned that
they only skimmed the instructions and did not need to turn back to them at any point
in the survey. Our review of the returned questionnaires confirmed that respondents
did not have issues with how to mark items.
• Issues with skip logic and miscellaneous recording errors. None of the respondents
reported confusion about the arrows next to questions with skips. There were very
few recording errors that were apparent from reviewing the completed surveys. For
example, all respondents followed the skip patterns correctly and no one skipped
whole pages inappropriately. One respondent did not want to answer questions about
her father or the person she thought of as a father in Section 2, so she skipped them
all. However, she inadvertently skipped some questions at the end of the section
because she thought they were still about her father, even though they were not. On
the true/false series, some young women wrote in comments such as “don’t know,”
“would like to know,” or in some cases provided what they thought we should have
said. In all of these cases, the young women still selected either “true” or “false” for
the item.
In addition to the general comments above, the pretest provided insights about several
potential issues on specific questions. The suggested revisions based on this feedback are listed
in detail in Appendix A, question by question. For questions that had been administered on
another survey (such as the Pregnancy Prevention Approaches [PPA], the Personal
Responsibility Education Program [PREP], the Teen Pregnancy Prevention Replication Study
[TPP], and the Concordance survey), we were mindful of whether revisions would prevent the
PAF survey data from being compared to estimates from other samples. The instrument draft in
Appendix B contains all of the recommendations in track changes. This version has not been
fully reformatted, so that tracked formatting changes do not detract from the content changes.
We will format the questionnaire once all agreed-upon revisions are incorporated.

cc: Matthew Stagner, Susan Zief, Melissa Thomas

APPENDIX A
RECOMMENDED REVISIONS FOR BASELINE SURVEY ITEMS

APPENDIX B
BASELINE SURVEY INCORPORATING RECOMMENDED REVISIONS

