ATTACHMENT H
Pretest Report and Question by Question Discussion
MEMORANDUM
955 Massachusetts Avenue, Suite 801
Cambridge, MA 02139
Telephone (617) 491-7900
Fax (617) 491-8044
www.mathematica-mpr.com
FROM: Laura Kalb, Kristen Velyvis, Alan Hershey
DATE: 9/22/2009
SUBJECT: Pregnancy Prevention Approaches Pretest Findings
This memo describes the Pregnancy Prevention Approaches pretest conducted August 17-18, 2009, by Mathematica and Child Trends for the purpose of improving the data collection instruments and procedures. The memo describes (1) the recruiting process and youths selected for the pretest, (2) the pretest and the process for debriefing participants, and (3) the overall findings about survey administration and content. At the end of this memo, we have included an appendix listing individual questions and issues discovered during the debriefing. We have also included a copy of the debriefing guide used.1 When issues are raised we have not included all possible solutions, as we recognize that further discussion is needed before we make modifications to the instrument. Indeed, because many of our questions come from well-established national surveys, it is important to weigh the findings of our pretest—and possible alterations of wording designed to make questions as clear as possible—against the aim of comparability with other research and national data.
Mathematica worked with three community-based organizations (CBOs) in Princeton, New Jersey, that serve teens to recruit pretest participants: Home Front, Corner House, and HiTops. Each organization received flyers with information about the pretest, which it displayed or distributed to age-appropriate youths served by the organization. The flyers asked teens to call Mathematica if they were interested in participating. In a few cases, the contact person at the CBO identified likely respondents and approached them on behalf of Mathematica. Mathematica or CBO staff then talked directly with all interested teens to explain the pretest and the need to obtain parental consent prior to their participation. As part of that call, the teens were asked whether they would want to be part of the sexually active group or the non-sexually active group and whether they had ever had sex.
The pretest and debriefings were conducted at the offices of Mathematica and HiTops (one of our community partners), both in Princeton, New Jersey. The teens were asked to participate in one of five pretest administrations, during which small groups (of four or five teens) completed the self-administered questionnaire in a group setting and then went to a one-hour one-on-one debriefing with a Mathematica or Child Trends researcher. Upon completion of the debriefing, the teens were given $50 in cash for their participation. Teens from Home Front were brought to Mathematica by a member of Home Front staff for their appointments; teens from Corner House were brought to Mathematica by a parent or family member; and teens from HiTops met us at the HiTops center.
In total, 17 teens participated in the pretest—8 sexually active youths and 9 non-sexually active youths; 8 boys and 9 girls who ranged in age from 13 to 17. The following table details the distribution.
Age | Male – Sexually Active | Male – Non-sexually Active | Female – Sexually Active | Female – Non-sexually Active | Total
13 | 0 | 1 | 0 | 0 | 1
14 | 0 | 0 | 1 | 2 | 3
15 | 1 | 2 | 1 | 3 | 7
16 | 1 | 1 | 1 | 0 | 3
17 | 2 | 0 | 1 | 0 | 3
Total | 4 | 4 | 4 | 5 | 17
In many ways, the pretest sample represented the two population extremes that we are likely to find in the real study. At one end, three of the pretest teens are peer counselors in the HiTops Teen Council program (similar to Teen PEP), in which they receive training in peer mentoring on how to talk with other teens about sexual health, including issues such as unplanned pregnancy, contraception, HIV/AIDS, other sexually transmitted infections, homophobia reduction, dating violence, date rape, sexual harassment and the impact of alcohol and other drugs on sexual risk taking. The teens are all from an affluent community in which the organization is located. At the other end of the spectrum, 13 youths are currently being served by Home Front, a community organization that helps struggling families find adequate and affordable housing and provides skills and opportunities to these families to ensure adequate incomes.
The administration of the pretest mirrored (as closely as possible) what will happen during the actual study in a classroom environment. That is, we gathered the teens together in one room, had a researcher give an introduction and verbal instructions about how to complete the questionnaire, and had the teens read and sign the assent form before opening the survey packet. Each teen was then told to open his or her survey packet, which contained one Part A questionnaire and two Part B questionnaires (a sexually active Part B and a non-sexually active Part B). The only differences between the instructions we gave to teens during the pretest and the instructions that will be given to teens during implementation of the actual study are that in the pretest we asked the teens to circle any questions or words that gave them trouble so that we could address them during the debriefing, and we explained how the one-on-one debriefing and incentive payment would work.
Researchers timed how long each teen took to complete the questionnaire. When a teen finished the questionnaire, the assigned researcher took the youth to a private office for the one-on-one debriefing. The assigned researcher was matched by gender to the pretest participant. Each debriefing lasted approximately one hour.
Before the first pretest, all six of the researchers involved in the pretest attended a three-hour training to review logistics, best practices for talking with teens about sensitive subjects, what to prioritize during the debriefing, and how to handle upset respondents or address issues or problems that arose during the debriefing. Although the debriefing guide prepared by Mathematica and Child Trends included specific probes for many items in the survey, each researcher was given latitude to rephrase the questions as needed and to choose which items to ask about, given that there were more listed than could be covered in the one-hour debriefing. The debriefing guide focused on how the respondents came up with their answers, that is, the process they went through in their heads to arrive at the answers they recorded on the form; whether they followed instructions and completed the survey as expected; and whether there were any questions or words that were confusing or out of date. In a few cases, alternative question wording or answer categories were given to respondents during the one-on-one interviews and they were asked which version they preferred and why. It should be noted that each researcher had the completed questionnaire and referred to it during the debriefings, especially when answers appeared contradictory to what had been said during the debriefing. These discrepancies are good indications of problems with question wording.
We were interested in learning about a number of different survey administration issues, including the length of time needed to complete the questionnaire, if our instructions were clear and were followed by the teens, and in particular, whether teens understood how to select the correct version of Part B. The debriefings provided the following feedback on survey administration:
Length of the Survey Instrument. The overall length of the questionnaire seems to be good. The time needed for teens to complete the survey (Parts A and B) ranged from 20 to 37 minutes, with an average time of 28 minutes. It should be noted that the participant who took 37 minutes to complete the survey filled out both Part B questionnaires. We expect that the study introduction, distribution, and collection of the survey packets will take about 10 minutes when we get into a classroom setting. (The pretest is not a good test for these administrative functions, given the need to explain the debriefing process and the fact that the pretest groups were much smaller than we will typically encounter in a classroom environment.) The estimated total administration time of around 38 minutes should fit within a typical class period.
Instructions on How to Complete the Questionnaire. During the debriefing we asked if the instructions were clear or confusing in any way. Many teens reviewed the instructions and reported that they were fine, clear, and straightforward. However, through further questioning a number of teens admitted that they had not actually read the instructions at the beginning of Part A when they were completing the questionnaire on their own. This was particularly problematic for completing skips correctly (see "Issues with Skip Logic" below). In addition to problems with skip logic, there was some misunderstanding about what "check all that apply" means. After researchers learned in debriefings of the early pretest groups that teens were not reading the instructions, we spent more time—with the last group—reviewing the instructions on how to complete the instrument before participants began the survey.
We recommend spending more time explaining how to complete the instrument during the introduction phase in the actual data collection. We will also review to see if changing questions from “check all that apply” to “check one answer for each item” makes more sense. Alternatively, we will consider using a different phrase, such as “you can check one answer or more than one.”
Issues with Skip Logic. Some teens told us that they did not really know what the arrows in the questionnaire were for, and consequently they did not skip questions correctly. In a few cases the respondents followed the arrows to the left of a response, but did not answer the question first, so we do not know the response to the original question, nor if they skipped correctly. Skip errors were very prevalent in Section 4 of Part B1 and came up in Sections 1, 2, and 5 as well.
Despite not following the skips correctly, teens were usually able to answer questions they should have skipped and continue with the survey, as most of these questions have a “never” or “none” option, thereby allowing all to answer. Because it is likely that many youths will not take the time to read instructions and might therefore answer certain questions incorrectly, we will review all questions to make sure anyone could answer and/or will consider removing the skips altogether.
Selection of Part B. For the most part, the teens read the instructions and correctly identified which version of Part B they should complete. However, 3 of the 17 teens responded incorrectly to Part B. One respondent filled out the wrong Part B because he did not read the instructions carefully. Two teens (both of whom were young and seemed to have other comprehension issues) completed both versions of Part B. One of them correctly followed the instructions and did Part B2 first, but then thought that because 2 follows 1 she should have done version 1, and proceeded to go back and do that as well.
Miscellaneous Recording Errors. There were few other apparent errors. No one skipped pages and only a few skipped questions inappropriately (beside skip logic errors). Some respondents inappropriately skipped questions because they could not find an appropriate answer. Details on some of these are provided in the appendix. In some cases respondents wrote in “0” or “none” rather than selecting that choice; this often happens on self-administered questionnaires and is easily dealt with in data cleaning.
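To illustrate the kind of cleaning rule this implies, the sketch below (Python with pandas; all column names, such as q_times_response, are hypothetical) recodes written-in "0"/"none" responses to the precoded "None" category. It is only a minimal sketch of the cleaning step described above, not part of the planned processing system.

```python
import pandas as pd

# Hypothetical keyed responses for a count item where some respondents wrote
# in "0" or "none" instead of checking the precoded "None" category.
raw = pd.DataFrame({
    "respondent_id": [101, 102, 103, 104],
    "q_times_response": ["3", "none", "0", "None "],
})

def recode_written_none(value):
    """Map written-in '0'/'none' variants to the precoded 'None' category."""
    cleaned = str(value).strip().lower()
    return "None (precoded)" if cleaned in {"0", "none"} else value

raw["q_times_clean"] = raw["q_times_response"].apply(recode_written_none)
print(raw[["respondent_id", "q_times_response", "q_times_clean"]])
```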
In addition to determining problems or issues with the completion of the instrument, the pretest was designed to review the content and language used in the questionnaire. The following are some general observations and themes that emerged from our discussions with teens.
Terminology and Language Level. Teens were asked if specific words and terms were commonly understood and up to date. Issues with terms are summarized in detail in the appendix, some with suggestions for additions or alternate wording. Examples of terms that were troublesome for at least some respondents include “reproductive health,” “religious services,” “menstrual period,” “abstain” and “abstinence.” In addition, some of the sexually transmitted infections were unknown to youths, but this is to be expected.
Date Versus Age. In a number of places in Section 4 of Part B1 of the questionnaire we asked respondents to report when an activity happened. In the pretest we wanted to better understand if teens could remember specific dates, how they arrived at those dates, and if it was easier for them to recall their age at the time or the month and year an event occurred. Most teens reported that it would be easier for them to report their age when something happened rather than the month and year. Despite saying reporting age would be easier, for two questions requesting dates most respondents were able to supply an answer (four of six for date of first sex and four of five for date of first oral sex).
For those who recorded dates, they said they were able to do so because they linked the activities to some event or time of year (for example, “It first happened in the summer and I know it wasn’t August so I put July,” or “It happened at a party,” so they thought about when the party took place). Some teens also reported that what they remembered was the grade they were in at the time and then translated that into either their age or a date. We could try to promote use of such memory aids with question probes in the text of the questionnaire.
We also noted that at least one person could not think of the month, so they put “DK” for both the month and year. We could consider putting a “DK” option for the month as well as the year, to elicit as much information as possible. We also noted that for at least one respondent, the age she gave was likely older than she was at the time; had she given only the year we would have had a better idea of her age.
Given our preference for specific dates (over ages) and that for the most part teens were able to come up with some date (even if it was off by a few months), we probably should continue to request specific dates. We could consider double questions—asking them to record either a date or an age—which while likely reducing non-response might encourage those who could have estimated a date to provide the easier age answer.
Definitions. When asked directly, most teens told us that they liked having definitions and thought the ones provided were clear and understandable. When asked if they would have liked more definitions, many said they would have liked definitions for oral and anal sex or thought others would, but fewer thought it was necessary to have a definition of condoms. The debriefing question about wanting definitions for oral and anal sex was in relation to the question used for branching to different versions of Part B (Q3.18). Even though teens thought having definitions provided at this question (or earlier) would be helpful, most were still able to answer Q3.18 and all but three completed the correct Part B based on their responses. (As noted previously, two youths completed both versions of Part B, although one teen did the correct Part B first, then went back and did the other version.)
Too Many Words. Some of the teens admitted that they did not bother to read the full question wording (or definitions or instructions), often skipping to the individual items in a list and using the answer categories or headings to understand what was being asked. This is an interesting dilemma, especially in light of the previous comment about wanting more definitions. We might consider simplifying the stems of the questions and adding more information to the individual items to ensure teens understand what they are answering. However, this will actually increase the number of words on the page. Alternatively, we might consider breaking apart some of the list questions into shorter, simpler questions. We will review questions with an eye toward what teens would likely "see" if they just skim over the question stems.
Don't Know and/or Neutral Answer Options. For several opinion questions (Q2.10, Q2.18, Q3.1, and Q3.2), a number of teens suggested they would have liked having a neutral (neither agree nor disagree) or a "don't know/not sure" option. When teens said they had not done something or had never discussed it, they felt uncomfortable answering that they agreed or disagreed with the statement. For example, teens who had never had sex (or never discussed it much) did not want to voice an opinion about whether using a condom was a hassle. Likewise, two teens said that their parents neither approved nor disapproved of them having sex, provided they were being safe about it. About 20 percent of the responses to Q3.2 about views on condom use were left blank or filled with "DK." We will need to discuss the pros and cons of adding a neutral category or "don't know" to one or more of these opinion questions. We also could consider adding a category—"Never thought or talked about this"—that is slightly different from the neutral or "don't know" option. Adding any of these options might decrease the number of teens giving a substantive opinion; on the other hand, it may reduce the incidence of missing data. We also should review the original question sources, as comparability should be considered if we decide to reopen the discussion about neutral responses or mid-points on opinion scales.
Household Makeup and Father/Father Figure Series. There are a number of issues with the household composition question, which also potentially affect the follow-up series about fathers or father figures. Although some teens had no difficulties with the household composition question, others were confused about the "check all that apply" instruction, confused about whether to include people who do not actually live in their households, or took some questions too literally. Examples of the latter include not understanding how to answer based on the way we labeled the potential household members: one teen mentioned not understanding what the parentheses around the "s" at the end of some categories meant, as in sister(s) or cousin(s). Another teen did not want to mark the "Grandparents" category because she lives with her grandmother only, not with her grandfather, and a third teen asked how you were supposed to respond if you had a twin brother or sister or gay parents.
We recommend that we consider breaking this question down into multiple, simpler questions, perhaps having them mark yes or no to each potential household member.
There were many issues with the Father/Father figure series. Four teens who said in the household roster they did not live with their fathers filled out the sections about their fathers/father figures anyway. Two of them filled Q2.12 consistently with the household roster, but did not skip correctly. The third teen said in Q2.12 that she did live with her father, but did not mention him in the roster. The fourth youth did not answer Q2.12, but did not mention a father in the roster. Two other teens marked fathers or father figures in the household roster and did not fill out the father series about them, both saying “no” to Q2.12 and skipping the father questions. Yet another left Q2.12 blank, although, based on her roster, she skipped appropriately from there. It appears at least 30 percent of the respondents had difficulty with the roster and/or father series sections.
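A simple consistency flag computed during data processing would make the scale of this roster/father-series problem easy to track in the full study. The sketch below (Python with pandas) is one illustrative way to do it; the variable names q2_1_father_in_roster and q2_12_lives_with_father and the example values are hypothetical, not the actual study data or processing code.

```python
import pandas as pd

# Hypothetical keyed data: whether a father/father figure was marked in the
# Q2.1 household roster, and the answer to the Q2.12 father-series gate.
df = pd.DataFrame({
    "respondent_id": [1, 2, 3, 4, 5],
    "q2_1_father_in_roster": [True, False, False, True, False],
    "q2_12_lives_with_father": ["yes", "yes", None, "no", "no"],
})

# Flag cases where the gate question conflicts with the household roster.
df["father_series_inconsistent"] = (
    (df["q2_1_father_in_roster"] & (df["q2_12_lives_with_father"] == "no"))
    | (~df["q2_1_father_in_roster"] & (df["q2_12_lives_with_father"] == "yes"))
)

print(df)
print(f"Share inconsistent: {df['father_series_inconsistent'].mean():.0%}")
```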
We should consider what we want the father figure series to represent: the biological father even if he is not living in the teen’s household, or only a father/father figure in the household. Further compounding the issue is whether or not we can actually tell who the respondent was thinking of when answering the father series questions.
cc: Chris Trenholm, Kristin Moore, Mustafa Menai, Kristine Andrews, Kassim Mbwana
Q1.3 – Hispanic or Latino origin. The questionnaire erroneously skips respondents who report Hispanic origin over the race question. We should remove the skip.
Q1.8 – Classes or special programs covering different topics. Seemed to work well. One issue was some confusion over what "special programs" might mean (different teens had different definitions), but teens generally got the concept. No change suggested. About 7% of the answers in this series were DK, which might suggest respondents didn't know what some of the topics were, but these were fairly randomly divided among the 7 sub-questions – 3 for Q1.8a, 3 for Q1.8g, and 1 each for 4 of the other 5 sub-questions. Several of the debriefed respondents mentioned they did not know what some of the topics were, but the majority seemed to get them. Some said if they didn't know a topic, they answered no, assuming they didn't have a class on it. One respondent suggested definitions would help. "Refusal skills" was unclear to one respondent; birth control methods and peer pressure to another; physical development and reproduction to a third (pregnancy and puberty or growing up might have been better). One respondent pointed out that b-g are part of sub-question a.
Q1.9 – Time spent in after-school activities. A couple of minor issues – some teens included weekends and others didn't. At least two miscalculated hours – one because he thought we were asking for days, not hours, and the second just put a number in but, when talking it through during the debrief, came up with a different number (he said he didn't take the time to really think about it when completing the survey). One teen asked whether paying jobs that didn't involve a paycheck (such as babysitting or cutting lawns) counted as paying jobs. Another teen said that his hours changed so much from week to week that he ended up putting don't know (rather than picking some number of hours). Recommendation – review the idea of limiting to weekdays only, how to stress hours (not days), and adding examples of paying jobs. Not sure the other issues are worth addressing.
Q1.10 – Future plans. Teens were asked about their preference for using a "certain" scale or a "likely" scale. Those with a preference were fairly evenly divided, with a slight preference for the "certain" scale. The answers respondents gave were largely the same regardless of the scale used. Of five respondents who answered 5 questions each using each scale, four answers changed (4/25 = 16%). However, we need to look more closely at what these changes mean for the data we would be collecting. Regarding graduating from high school, being not at all certain is very different from it being not at all likely. We need to re-think what data we want, as the debriefings did not show trouble with either scale, and the differences were merely that – differences. Also, if we change to a "certain" scale, we should probably do so for all the current likelihood scales.
At least one respondent "didn't get" what graduating from a 2-year college program meant. We might consider combining 2- and 4-year college programs as "2 or 4 or both."
Q2.1 – Household make-up. There are a number of different issues with this question. Teens have different ideas of what "most of the time" means and tend to say this means someone living in the household most days during the week, almost always, or at least a week out of the month. What is interesting is that two teens said that if a person was in their household when they were asked to complete the survey, they would have included them (such as a mother's boyfriend or a sibling at home for the summer). More problematic is the fact that in two cases, teens included their father as living in the household despite telling us in the debriefing that their father lived elsewhere and either they split their time between their mom and dad or their dad visited regularly.
Respondents also mentioned trouble understanding what the "(s)" means and what "check all that apply" means. Twins and same-sex parents also were identified as problems to enter, and the definition of biological mother was requested by one respondent. (There's also a typo in this question – younger brother(s).)
Recommendation – break this into smaller, simpler questions first about mom and dad, then about other adults, and last about kids in the household. Might want to consider marking yes/no to each person listed to get around the mark all that apply issue (recognizing that some will only mark the ‘yes’ column). Write out possible plurals, such as “younger brother or brothers.”
Q2.2 – Parents you live with being married. While some teens said they had no problem with this question, those not living with both biological parents had more issues. Two teens skipped this question altogether saying they thought it only applied to those living with two parents. Another teen said that she wanted to answer both “2-No, they are not married to each other,” and “3-No, I do not live with two parents.” Recommendation - this question still needs work.
Q2.3 – Family eat meals together. There did not seem to be any problems with this question. One respondent mentioned she thought of dinner when she read "meal."
Q2.4 – Set up question about mother/mother figure. Seemed to work well as a set up. All teens answered the series, although two did not mark yes to 2.4 before following the arrow. Most said the arrow was straightforward, except the two that made the errors. It appears both saw the arrow and followed it, but did not realize they should answer the question first.
One teen said he expected the answer choices for this question to be Yes-live with mother; Yes-live with mother figure; No-don’t live with mother or mother figure, as opposed to a straight yes/no choice.
No recommended changes other than discussions prior to administration about arrows and how to complete and follow skip patterns.
Q2.12 – Set up question about father/father figure. Similar to mother question, the same two teens skipped straight to the next question without marking Q2.12. Based on Q2.1, neither has a father living with them, so both should have skipped to Q2.20. Only one did. In addition, two other teens completed the father section about their dads who do not live in the household (they had no father figure living in their households). 25% of respondents filled this question out in a way that is at odds with Q2.1, and more than 15% of respondents filled this section out in error.
Q2.10/Q2.18 – How mom/dad would feel about having sex. About half of the teens (6) asked about this question said they were only thinking of vaginal sex and did not include oral or anal sex; five teens said they were thinking of any type of sex, including oral and anal. Perhaps we need to be more specific about what we mean.
Three teens mentioned that they knew of parents who would neither approve nor disapprove of their kids having sex (provided they were being safe), but only one knew parents who would neither approve nor disapprove of their kid having a baby. They asked about including a neither approve nor disapprove category. Only one respondent asked for a definition of disapprove, asking, "Does that mean mad?"
Q2.20 – Relationship with parents. Most teens said they were thinking about June when asked about "the last month," but this was likely an artifact of the timing of the pretest, since the first item on the list is about school, which necessitated them thinking back to June. This should not be an issue when actual data collection takes place, as it will occur during the school year. Most teens also preferred the alternative version with the statements using "I" rather than "you." Seven respondents completed the alternative question versions, answering 5 questions each of each version. Ten out of 25 answers changed (40%). The changes seem to go in both directions – I could not detect a pattern to the changes, but all but one moved only one step on the scale in one direction or the other. One alternate question was left blank. Recommendation – change "you" to "I" throughout unless we feel that using the "I" in the phone version is too awkward.
Q2.21 – Talking with parents. No problem with the "one of your parents" concept. Teens were split about whether "how things are going in school" and "discussing grades" were the same thing, with more (6) respondents thinking they were different and 3 or 4 thinking they were the same. Those who thought of these as different still had no problem coming up with an answer, but thought their answers would be different if the two were split.
Q3.2 – Condom use. All said they knew what a condom was, but some still thought adding a definition would be a good idea.
When asked about other names for a condom, the other term most commonly used and recognized by teens is "rubber," but some said condom is still more common. These were the first questions (Q3.1 and Q3.2) where teens mentioned they would have liked a "don't know" response and a "neither agree nor disagree" option. In some cases this was because the teen said it was neither good nor bad; others who had not yet experienced sex felt they really didn't know.
For Q3.1, one respondent said she did not think that sex is ‘good’ for her, but it’s not a bad thing either. She thought better wording might be “having sexual intercourse isn’t a bad thing for me to do at my age.”
Q3.4 - Likelihood of doing things with person you like. For the touching chest item two teens thought this item was odd when thinking about males, but three teens thought it was fine.
Most respondents thought of “them” as either a male or female, not a group of boys and girls.
Of those with a preference, slightly more preferred the "certain" scale over the "likely" scale; some noted a difference without having a preference, and one preferred "likely." Four respondents answered 5 questions each; five of the answers changed with the different scale in the alternate question. We might look more closely at the respondents' interpretations of these two scales before deciding whether to change anything in the original question. See the discussion in Q1.10.
Q3.6, Q3.10, Q3.14 – Confidence in answer. The teens generally thought the follow-up confidence questions were to determine if they were guessing in the previous question. All said that they would use "a little bit" confident if they weren't sure of their previous answer, which is what we wanted. "A little bit," "somewhat," and "very" were all used as responses, but no one used "not at all" confident as a response. I would guess that if they were not at all confident, they would use DK in the previous question. However, those who used DK in the previous questions still answered the confidence questions, so that seems odd to me – what does that mean? I don't think it is necessary to change this question, just recognize that the lowest end of the scale may not be used often.
Q3.15 – Likelihood of having sex at different points in the future. Teens were split on whether or not to include a definition of oral sex, 7 saying they would have liked one or thought others would, and 3 saying it was not needed. One other offered a definition of her own without an opinion on whether one was needed. Those who offered a definition on their own (when asked during the debriefing) had a good concept of what it meant, although one teen (the 13-year-old) suggested that oral sex meant talking about sex and said having a definition would have helped.
The teens were also split on which version of the answer categories (certain or likely) they preferred, 3 preferring likely and 1 certain. See the discussion in Q1.10 about the differences in meaning that come from changing this scale. Having sex by age 18 being somewhat likely is very different from someone being somewhat certain about it happening. The teens seemed to pick up on this difference and discussed it, but did not have a problem with either scale. The question for us is what data we want. At least for this series, I think "likely" is closer to what we are looking for.
Q3.16 - Dating in past 3 months. Several teens alerted us to possible confusion with the terms “going out” and “gone out.” Both of these terms have 3 possible meanings:
a singular date
a relationship
out with friends
Thus, in a few cases the teen thought we were asking how many people they had gone out with rather than number of times they’d gone out. In another two cases teens said that going out could be a group of friends going to a place together. One possible solution is to drop the phrase “or gone out with someone” from our original question or change it to “gone out on a date.” Everyone seemed clear on what a date was.
Q3.17 – Number of people dated in past 3 months. This question highlights the problem with “gone out” again, and highlights two definitions for “dated.”
Someone you have had a date with
Someone you have had a relationship with.
In five cases the teens put the same number in Q3.16 and Q3.17, and it is not clear that they understood the difference between these two questions. One teen, when asked specifically about the number of dates she had, said 10-20, but her recorded response was 1, which is the same answer she gave in Q3.16. Again, we can consider dropping the term "gone out with" from this question and leaving it as, "…about how many different people have you had a date with?" We might, however, lose partners they had relationships with but did not have a date with.
Q3.18 – Ever had sexual intercourse, oral, or anal sex. Most teens suggested that having definitions of oral and anal sex at this question would have helped. Furthermore, they suggested that the reason a person would mark "don't know" to this question would be because they didn't know what oral or anal sex was and therefore couldn't say whether or not they had done it. Only one teen suggested a reason for a "don't know" response might be being drunk or drugged and not remembering whether or not they had sex. Despite claiming definitions would have helped, most were able to answer this question (that is, there was only one "don't know" response) and, based on the debriefing questions, most appeared to have answered this question "correctly" based on their sexual experiences.
All said they felt safe answering this question truthfully, citing our confidentiality assurances. Only one teen suggested they might feel uncomfortable answering this question in a classroom setting. One suggested we strongly highlight confidentiality in schools.
As mentioned earlier, three had problems selecting the correct Part B version (two teens did both versions of Part B and one teen picked the wrong version of Part B). The teen who picked the wrong version said he was confident that he had picked the right one, but admitted that he did not read the front page of the Part B packet and jumped straight into the survey. Nearly all (except the two who did both versions) stated that the directions were clear and that they were confident in their choice of Part B versions. One teen did suggest that respondents may have trouble picking the right packet due to being misinformed or not knowing what oral and/or anal sex is, and reiterated that a definition earlier would help with this problem. There was also one potential problem with a respondent starting with B1 instead of A. We might work harder to distinguish the A packet from the two B versions to avoid that.
In terms of comfort selecting and completing this packet in a classroom setting, most said they don't think this would be an issue, citing that the packets look the same and that everyone would be doing their own packet. Two teens did express some concern, but one stated she would likely just cover her answers.
Please note, only 8 respondents were sexually active, but 10 completed Part B1. One sexually active respondent erroneously did not complete Part B1, but read the questions during the debriefing and responded to the debriefing questions asked. Since not all respondents were asked all debriefing questions, some debriefing questions have responses from fewer than 11 respondents.
Q4.1 – Sexual intercourse. The four respondents asked whether they read the directions to this question had varying responses. One said she read them, two said they read the first few sentences, and the last said he scanned them. This suggests we should simplify and make instructions shorter and more part of the question.
Of the four who were asked what “kept private” means, three knew and one had no idea. She said it meant nothing to her. She was having some cognitive and emotional challenges with the pretest and debriefing, I think.
The majority seemed to think the definition of sexual intercourse was helpful in Q4.1. One respondent said it was not – she still didn’t know what it meant, but others thought it was helpful to distinguish exactly what we meant.
Q4.2 – Date of first intercourse. Of the six respondents who responded to Q4.1 that yes, they had had sexual intercourse, 4 produced a date and two marked DK. Two remembered the date by an event – for one a birthday helped her place the month and year and for the other a party did; two said you just remember your first time – one of them determined the month by the season, but she had to think about the year. One of the two who marked DK was not able to recall the specific month, so she did not put the year in either; during the debriefing she determined her grade and the year (which she was confident of) but could not nail down the month. The second had problems remembering the month and year. Two others had trouble with the month. One entered it with a "?" and the second did not fill out this section, but told the person who debriefed him he had trouble recalling the month.
This suggests we should consider putting a “Don’t know” option for the month, so that respondents can give a DK for month and still answer for year.
Q4.3 – Age at first intercourse. Of the six respondents who responded to Q4.1 that yes, they had had sexual intercourse, 5 produced their age and one left this question blank. Of the six respondents asked about supplying their age of first sex, most thought it was easier than remembering the date. Five said they just knew or remembered their age. One of them thought about the grade she was in, but noted that she gets confused about age because it changes in the summer. A second of them thought of his first sexual experience which was when he lived in a certain town, and he remembered how old he was when he lived there. Two of them thought of the first person they were with and then about their age when with that person for the first time. These processes indicate that actually people don’t “just know.” They tag the event back to something else and use that to determine their age. We should figure out how to build on that if we choose to ask age.
Incidentally, the respondent who could remember the year but not the month determined her age was 16 when she had sex. She turned 16 in November of 2008, so chances are that if she really had sex for the first time in 2008, she was 15, or would have remembered the season… She had first put 2007, which makes me guess again that she was closer to 15. Dates are harder to manipulate than age.
Finally, one respondent (who left age blank) said that she feels like she lost her virginity twice – once as a baby when she was raped and once when she wanted to. She wasn’t sure how to answer the question and said that adding the word consensual would have made the question easier to answer. She did put a date in the previous question, however. It appears thinking of age made her think of the first event, while the question about the date was more something under her control. Just an observation.
Q4.4 – First intercourse voluntary. Of the four asked about what voluntary meant, all got the right idea. Free will was a little foggier for them. Two got it right. Of the other two, one said "whether I was convinced to – emotionally, not physically." The other said, "your own time." I think they got it, but we might want to stick with voluntary and not add "free will."
Q4.6 – Birth control at first intercourse. Six respondents were asked about birth control methods. Two from Home Front thought the list given was complete, and one thought nothing needed to be defined. Three from HiTops thought we should add withdrawal (also called "taking it out"), one noting that some peers use only that method. They thought withdrawal should be defined, and thought our definition was fine (teens are more familiar with "coming" than "ejaculation"). Two of them also thought we should add the NuvaRing, or the ring. One noted she had never heard the term "Depo-Provera" but may have heard of injectables once before. This might be a class distinguisher. Not sure.
Q4.7 – Sex more than once. We asked about two aspects of this question. First, we tried to determine whether respondents understood that the question asked about the number of times they had sex, not the number of people. All five respondents asked about this answered it correctly. Second, we tried to determine if they understood the skip structure. Two of the three respondents with notes regarding the skip indicated they did it correctly. One did not. When queried, she did not understand what the arrows were for.
Q4.9 – Number of times had sex. This question worked fine. Of the five respondents asked about this, all thought about the time period concerned and answered as well as we could expect. Some could count, using anchors like when their partner visited or events in school. Those with only one partner or a few partners in the time period could count the number of times, as expected. Those with many partners in the time period had a harder time counting – and one of them marked DK. That was the only DK. No one found the question confusing or hard per se.
Q4.10 – Number of times used condom. For respondents who did not have sex many times in Q4.9 (that is, they could easily count the number of times), counting the number of times they used condoms was not difficult. One of the HiTops respondents noted she liked the alternative question, which used proportions of the time, better. While this would be easier for her, we would lose data, since she was able to answer with numbers. Another HiTops respondent noted that the alternative did not have "all of the time." He added it and used that as his answer.
One respondent noted that questions Q4.10 and Q4.11 seemed the same to him, since they both asked how many times the respondent used condoms in the last 3 months. Since he only used condoms, it was not clear to him how the questions were different. Had his partner been using another form of birth control and he used a condom, he would still be confused – he wouldn’t know whether to just count the condom use or his partner’s birth control method. He thought we should not ask about condoms twice or be more focused in what we want them to think about in Q4.11. He did not comprehend “any method.”
Q4.12 – Oral sex. Of the four respondents to answer this debriefing question, all agreed with our definition of oral sex – saying it matched their own. One thought this definition was unnecessary because he thought everyone knows what oral sex is. (See Q3.15 as well for commentary on this)
One respondent said her answer would change if she had oral sex that was not voluntary, but did not say how or why.
Q4.13 – Date of first oral sex. Two respondents answered debriefing questions about this question. Both recalled their answers by first identifying who their partner was. The first said the question was easy because it just happened this year. The second had more trouble, guessing the answer based on what grade she was in. Both thought the alternative question was easier, but both did answer with month and year. Four out of five respondents who had oral sex were able to give dates for the month and year. One of the four put a “?” near the month, but answered.
Q4.14 – Number of oral sex partners. Only one person was asked this question. She found it easy to remember and answer this question, but did not have lots of partners.
Q4.16 – Condom or oral dam with oral sex. Of the five respondents who were asked about this, only one knew what a dental dam was. He was a HiTops peer leader, and he thought only those who had ever used one would know what one was. Two others had heard of it before, but did not know what it was. One of them said that a definition would help.
Two noted this question was easy because their answer was zero. Another answered based on condom usage, and did not mention it being difficult to answer.
Q4.19 – Anal sex. No one was asked this question.
Q4.21 – Same-sex sex. The wording of this question was troublesome to only one respondent, who was having cognitive difficulties with many questions; she thought the alternative version was better. Three other respondents did not have trouble with the wording. Two of them did not think the alternative was better, while one liked the alternative better.
One respondent said she would not count kissing as sexual experience. Anything more than kissing would count, she said (i.e., clothes removed, oral sex, or fingering (defined as putting your finger in another’s vagina for sexual gratification)).
Q4.22 – Pregnancy. Two people responded to this and did not have trouble finding the male version of the question. However, one thought that the use of "version" could be confusing because it might suggest to respondents that they were in the wrong Part B. He thought "females answer this question…, males answer this question" might work better.
Q4.23 – Talked with health professional about reproductive health. Five respondents answered this question. Two of them had no idea what reproductive health was; one answered the question based on an understanding of birth control. The other three knew it had something to do with reproduction, but it was a bit fuzzy. One thought it was pregnancy, another thought it was when someone is giving birth, and the third thought it was checking if everything is ok – checking for lumps, STIs, cleanliness, birth control. Two of these three were 16- and 17-year-old HiTops kids. This is likely a problematic term that requires definition, replacement, or removal.
Regarding “other health professionals,” one respondent would count Planned Parenthood, since they are trained professionals. Another would not.
Other health professionals who would count include the following: Three respondents would count nurses (school nurse, mom – who is a nurse’s aide) and doctors. One would count people who work with health. One also added youth program staff, and another added staff from HiTops (both are HiTops kids). We probably need to be specific in our list of who we want them to count.
Q4.25 – Diagnosed with STD. The skip for no and DK for this question should be to Q4.27, not to Q5.1. That is an error.
Q4.26 – STDs. All three respondents to this question indicated that if they had not heard of one of the STDs on the list, they assumed they did not have it, marking it as "no" instead of DK.
Please note, only 9 respondents were non-sexually active, but 10 completed Part B2, as one sexually active respondent completed Part B2 in error. Since not all respondents were asked all debriefing questions, some debriefing questions have responses from fewer than 10 respondents.
Q4.1 – Reasons not to have sexual intercourse. Several of the debriefing responses suggested that for some of the non-sexually active teens, sex was not a salient subject in their lives. Of 7 debriefed on this question, one or two did not know what sexual intercourse was, despite the definition. Two wanted a response option such as "sex is not something I like/want to do" or "don't care to answer," and another thought others might like such a response. Another doesn't have a boyfriend, but answered anyway, and another said she based her answers on the world around her – she had to guess based on TV or movies and felt she had to read the response categories several times to understand the answer choices. Therefore, while we might get answers to these questions, I'm not sure how much they will tell us.
Q4.2 – Reasons to have sexual intercourse. Of the two debriefed on this question, neither seemed to be thinking along the lines we question writers were. One said that teens shouldn’t be thinking about sex until they are older or married and marked all but one item as not at all important. The other noted that she felt like she needed different answer categories.
Also, the lettering of these questions is not consecutive and repeats f.
Q4.3 – Signs of puberty. Most male respondents did not have trouble finding the male version of questions when there were male and female versions, although one did. Another respondent suggested "version" might be a confusing word because it could suggest the respondent had picked the wrong Part B, and that perhaps we should say, "males answer this question and females answer this question." A decent idea….
Regarding the term menstrual period, one out of three females asked about the term noted she didn't know what it meant, and wondered if we meant "regular period." We should perhaps consider taking "menstrual" out.
Q4.5 – Talked with health professional about reproductive health. (See Q4.23 above and in B1.) We asked about three aspects of this question: health professionals, birth control and reproductive health.
For health professionals other than doctors, of the three respondents asked, we heard nurses, parents or friends with knowledge, and an “I don’t know.”
For birth control, one respondent said that meant the pill, but could not think of other methods, and one said, “when a girl has her tubes tied to not have a baby” and one said “something that works around [prevents] pregnancy.”
For reproductive health, none of the respondents knew what it meant. One said maybe the health of your organs or inside your body. Another said it was the mental health of a baby when it is born, and the third did not know. That is clearly the most problematic term.
Q4.6 – Condoms in future. One respondent volunteered that the response categories between 4.6 and 4.7 reverse in order from "not at all" to "very" and then from "very" to "not at all." We should check that throughout.
Time frame – one respondent commented on the timeframe issue and said that “at any point in the future” meant anytime. Not much to go on, but we know from question writing literature that there should be a specific time frame noted here, at the beginning of the question.
Three respondents answered the alternative question and two changed their answers. One went from “somewhat” to “very” and the other went from “a little bit” to “very.” The original is amorphous and doesn’t give the respondent anything concrete to decide on. The interpretations of these two versions should probably be looked into more before deciding whether to change the question.
Q4.7 - Someone might hurt you. (See Q4.27 in B1.) Only two respondents were asked about this. One said she didn’t know because she is not dating anyone. The other circled “hurt you” in the survey and in the debriefing noted “hurt” could mean that sex might hurt or you might get a broken heart, OR it could mean that someone could beat on you or abuse you physically, sexually, mentally or emotionally. She suggested we add the word “physically” to the question. She answered the alternative question and went from “very fearful” to “no” because she added “physically” to the question the second time she answered it. We should be clearer on what we are looking for here. She liked the two question alternative better, as do I, but it requires a skip, which is not great.
Please note, since not all respondents were asked all debriefing questions, some debriefing questions have responses from fewer than 17 respondents.
Q5.3 – Smoked on how many days. One of our debriefing questions asked whether respondents preferred an open response category for the actual number of days, or category ranges to fit in. One respondent preferred the open question, so he could write the actual number of days. One respondent thought the open question gave him the opportunity to pick a midpoint for the range in the original question. One respondent preferred the ranges, but suggested splitting 5-25 into 5-10, 11-16… Based on smoking data, this would not add a lot of variation for us. One respondent, who smoked every day, didn't prefer either over the other. Three respondents filled out the alternative question. The numbers always fit in the ranges, which is good. Everyone was able to give a number, which is also good. For two respondents, this might have been easy because it was 30 out of 30 days. For the third respondent, he chose 15, as the mid-point. The question is, will we get better data this way? The literature suggests yes, and our limited experience suggests that three people could answer the question.
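One point in favor of the open numeric version is that its answers can always be collapsed back into range categories for comparability, while the reverse is not possible. The sketch below (Python) illustrates that collapsing step; only the 5-25 range is taken from this memo, and the other cut points are hypothetical placeholders, not the instrument's actual categories.

```python
# Collapse open "days smoked in the past 30 days" answers into range
# categories. Only the 5-25 range appears in the memo; the other cut
# points here are hypothetical placeholders.
def days_to_range(days: int) -> str:
    if days == 0:
        return "0 days"
    if days <= 4:
        return "1-4 days"
    if days <= 25:
        return "5-25 days"
    return "26-30 days"

open_responses = [30, 30, 15]  # the three alternative-question answers noted above
print([days_to_range(d) for d in open_responses])
```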
One HiTops respondent mentioned that at first he read the question as how many cigarettes as opposed to how many days. Perhaps we should highlight that somehow.
Respondents had different ways of counting days. One only smokes at concerts, so counted based on that. Another smoked every day, so found answering easy. A third counted the number of days he didn’t smoke, and subtracted that from 30.
Q5.6 – Drank on how many days. Seven respondents were asked about this question; none found anything in it confusing.
Four respondents noted that their answers would be the same at different times of the year. One respondent said that her drinking was more than usual because she had just returned from vacation.
Five respondents noted that it being illegal did not affect the way they responded. One said she was reassured because we had said her answers would be kept private.
One respondent said she was reminded by a previous question that we were asking about days.
A HiTops respondent noted that it was good that sips were differentiated from drinks, as many young kids think of sips as drinks.
Q5.8 – Marijuana. Other names for marijuana:
Weed – 8 respondents (two said this was more common than our terms);
Cookies – 1 respondent;
Cookies and milk – 1 respondent (said it was a family thing so the younger kids wouldn’t know what they were talking about);
Bud – 1 respondent (who said it was a commonly used term);
Haze – 1 respondent
One respondent noted that grass is out-dated.
Five respondents noted that it being illegal did not affect the way they responded. One said she was not affected because she trusts answers will be kept private (same R as in 5.6).
One respondent was hesitant to answer this question. She said lots of people drink, but this one is more difficult (HiTops). Two respondents thought that some users might find the question direct or to the point or might need to be reassured that they won't get into trouble by answering this question honestly. One of them said maybe we should say again that this information will be kept private and they won't be judged.
One respondent said she also counted inhaling second-hand smoke.
Q5.10 – Inhalants. Seven respondents defined "altered state": one as being out of your mind, being in a different place; a second as delusional; a third as becoming someone else, acting differently (as a negative behavior); and four defined it as getting high. Four others did not know what it meant (and so one ignored it).
Three did not think it was a weird thing to say (but two thought getting high was sufficient/better). One thought “a different state of mind” would be better. Six respondents thought it was a weird thing to say.
We should consider a different way to say this.
Q5.11 – Illegal drugs. Drug name commentary:
never heard of speed
pot is the most popular word for that drug
add the following – crack, dope, E-pills (Ecstasy), also known as Lucky charms or Skittles
add coke and crack
add heroin
add crack; Dope=crack
Again we heard that the illegality of the activities in this question did not affect responses (4 respondents). One of them mentioned that maybe with other kids it may be helpful to remove the word "illegal" or reassure them they won't get into trouble.
Q5.12 – Prescription drugs. All respondents understood what we were trying to get at with this question. In the process of explaining it to us, some respondents mentioned some prescription drugs that get misused.
Demerol or morphine (1 respondent)
Pain killer drugs
Motrin
One respondent thought that other respondents might be confused into thinking that if their mother gave them her own prescription they could mistakenly say yes to this question, when really they are not abusing the prescription because their mother gave it to them.
Please note, since not all respondents were asked all debriefing questions, some debriefing questions have responses from fewer than 17 respondents.
Q6.2 – Friends think. All 8 respondents thought that answer categories worked for them.
One respondent considered everyone in the school her friend – everyone knows everyone. She decided that if half of the school does something, she would choose half. She used this logic throughout.
Another said she only thought of her 8 close friends. If she broadened it to all friends, her answer would have changed. We could consider being more specific in what we want them to think about…
Another respondent said she did not talk with her friends about these things, so chose DK for every question. Two other respondents noted that they liked the DK option, since they either don’t know what their friends think or don’t like to get into their friends’ business.
All who commented noted that the categories would be hard for those with fewer than 4 friends, but that did not seem to be an issue for these respondents. We should note that the way these respondents were selected likely makes them more socially connected than others, so this could be a problem for other kids.
Q6.3 – Friends do. One respondent counted her 8 closest friends. Two respondents just knew – did not have to count, but one said he knew because he talked with his friends about them and the other said he does not talk with his friends about these things. A third respondent said some of her friends tell her, which is how she knew for them. She said she knew for most. For her answer, she just had a feeling where to place herself on the continuum.
Q6.4 – Pressure from friends. All five respondents thought the answer choices/scales were fine. One said he did not notice the reversal of order and another said it did not affect him.
Two responded regarding how they would answer if some of their friends pressured them and some didn’t. One said he would pick “some pressure.” The other said it would not change her answer: A little pressure.
Q6.5 – Sexual attraction. Five respondents said they did not know anyone who needed a neither category. One of them said it might still be good to have it. One respondent thought we do not need a category for “neither” as those folks could choose “not sure.” Another respondent said he knew 3-4 people who needed the neither category.
One respondent marked both and said that meant sexual attraction.
Q6.6 – Abstinence pledge. Six respondents were able to tell us what “abstain” means more or less correctly. One respondent did not know what it meant, nor what an abstinence pledge was, although she had heard about it on TV. Another wanted to clarify that to be abstinent meant refraining from oral sex as well as sexual intercourse. This illustrates that there are lots of definitions for this concept out there.
1 Not all teens were asked all the debriefing questions because there were more questions than could be covered in the allotted time.