Alternative Supporting Statement Instructions for Information Collections Designed for
Research, Public Health Surveillance, and Program Evaluation Purposes
Youth Empowerment Information, Data Collection, and Exploration on Avoidance of Sex (IDEAS)
OMB Information Collection Request
0970 – New Collection
Supporting Statement
Part B
August 2020
Submitted By:
Office of Planning, Research, and Evaluation
Administration for Children and Families (ACF)
U.S. Department of Health and Human Services
4th Floor, Mary E. Switzer Building
330 C Street, SW
Washington, DC 20201
Project Officer:
Caryn Blitz
Part B
The Youth Empowerment Information, Data Collection, and Exploration on Avoidance of Sex (IDEAS) project is a descriptive study designed to identify strategies and approaches that resonate with youth and empower them to make healthy and informed decisions related to sexual risk avoidance, teen pregnancy prevention, and their own well-being. This is a one-time data collection effort using four survey instruments: a Youth Survey (ages 14 to 18) split into Part 1 and Part 2, a Young Adult Survey (ages 19 to 24), and a Parent Survey of parents of youth ages 14 to 18.
The data collected from the surveys will:
Provide the Administration for Children and Families (ACF) with current information from youth and parents on topics such as communication between youth and parents, attitudes and beliefs about life event sequencing and youth sexual behaviors, and education on topics such as sexual behavior and health and well-being.
Inform ACF’s efforts to develop and refine programs, tailor educational messages to youth, and empower parents and other adults to help youth avoid or cease sexual risk behaviors.
This study is intended to produce internally valid, descriptive estimates that only reflect the knowledge and opinions of the AmeriSpeak panel participants, and not to promote statistical generalization to the national population.
The data collected with the survey instruments will inform current and future Adolescent Pregnancy Prevention and Sexual Risk Avoidance Education programs, both within and outside ACF. We intend to conduct subgroup analyses by demographics such as region of the United States, gender, and age. NORC’s AmeriSpeak panel, which approximates a representative sample of the U.S. population, is therefore a convenient way to reach the target population for this study (see Attachment D: Technical Overview of the AmeriSpeak Panel and Section B2). We intend to use the AmeriSpeak panel to generate exploratory evidence that would not be considered representative of the nation at large, and we will give appropriate caveats because of the exploratory nature of the study and the use of online panel data.
As noted in Supporting Statement A, this information is not intended to be used as the principal basis for public policy decisions and is not expected to meet the threshold of influential or highly influential scientific information.
The target populations for this study are driven by the analytic goals. We plan to conduct analyses with three populations, all recruited through the AmeriSpeak Research Panel. The three populations are:
Parents1 of youth ages 14–18, living in the United States, who speak English or Spanish
Youth ages 14–18, living in the United States, who speak English or Spanish
Young adults ages 19–24, living in the United States, who speak English or Spanish
We also plan to conduct a set of dyad analyses matching parents’ data with their children’s data to compare the responses of the two groups on similar survey questions.
Table B2.1 indicates the target population size in NORC’s AmeriSpeak panel for each population of interest. Because the youth sample is developed directly from the parent sample and may include both youth who are in the panel and youth who are not in the panel, the number in the table is an estimate based on NORC’s experience with this age group. Sampling procedures are discussed further below.
The Youth (both parts), Young Adult, and Parent Surveys will be administered online using NORC’s AmeriSpeak research panel and will use a multi-stage sampling method that varies by target population. Sampling begins with NORC’s creation of the AmeriSpeak Panel, as described below. For the Parent and Youth Surveys, sampling then takes place first at the panel level, by selecting all panel households in the target population. This step is followed by within-household random sampling of one parent (in known two-parent households) and within-household random sampling of one youth, based on a household roster of all youth who live in the household and meet the eligibility criteria for the Youth Survey (within-household sampling procedures are discussed further below). For the Young Adult Survey, sampling takes place only at the panel level, with all eligible young adults in the panel selected for this survey. The stages of sampling are described in detail next.
NORC’s AmeriSpeak Panel Sampling Procedures. The AmeriSpeak panel is constructed from NORC’s National Frame, an area probability sample used for other NORC surveys such as the General Social Survey (GSS). Building of NORC’s National Frame begins with an area probability sample that is constructed using a two-stage probability sample design.2 Made up of almost 3 million households in all 50 states and the District of Columbia, the frame also includes over 80,000 rural households whose information is not available in the U.S. Postal Service Delivery Sequence File (USPS DSF), but who are identified by NORC field staff through direct listing.
The AmeriSpeak panel is created by sampling and recruiting households from NORC’s National Frame of addresses using probability-based sampling in conjunction with mail, telephone, and in-person (face-to-face) contacts. The panel also recruits from an address-based sample drawn from the USPS delivery sequence file in four states where the NORC National Frame has inadequate sample size. Once a household is recruited, NORC works with the primary household-level respondent to recruit other within-household members into the panel. Thus, the AmeriSpeak panel will not necessarily include every member of every household in the panel. However, recruited households can add household members to the panel at any time, which will be useful for the purposes of the Youth Survey and the within-household sampling procedures (described below). Table B2.1 shows the total population for each of the Youth Empowerment IDEAS project survey target groups in NORC’s AmeriSpeak panel.
Table B2.1. Youth Empowerment IDEAS project: Survey target populations in NORC’s AmeriSpeak panel
Respondent | Eligible sample in AmeriSpeak panel
Parents of youth ages 14–18 | 4,586
Youth ages 14–18 | Up to 1,240a
Young adults ages 19–24 | 3,183
a Youth sample will be developed by randomly sampling one eligible youth from a household roster within the Parent Survey. Some of the youth sampled will be part of the AmeriSpeak panel. Others will not be in the panel but will still be sampled and invited to participate in the Youth Survey. The total sample size for the Youth Survey both in and out of the AmeriSpeak Panel is 1,240.
Within-Household Sampling for Parent and Youth Surveys. The Parent Survey will include all 4,586 AmeriSpeak households with at least one panel member who lives in the United States, speaks English or Spanish, and is a parent of a youth age 14–18 living in the same household. More than one parent in a household may be in the AmeriSpeak panel. Ahead of data collection, we will select the parents for the Parent Survey: in two-parent households where both parents are known panel members, we will randomly sample one parent to receive the Parent Survey invitation.
To identify eligible youth for the Youth Survey, the Parent Survey rosters all youth ages 14–18 living at least half time in the parent respondent’s household, requesting each youth’s first name, age, and relationship to the parent respondent. One youth per household will be selected for the Youth Survey. If the household contains more than one youth meeting the requirements for the target population, the web survey will use an algorithm to randomly select one youth for the study, as sketched below. At the end of the Parent Survey, parents will be asked to provide consent to contact the sampled youth by email for the Youth Survey. Through this process, we will create the sample for the Youth Survey. The youth sampling cannot be done before the Parent Survey is completed because (1) parental consent is required to proceed with the Youth Survey, and (2) not all members of a household are in the panel, and those who are may have had a change in living arrangement (for example, moving out) between the time they were recruited into the panel and the time of the survey invitation for this study. As noted, some youth will be members of the AmeriSpeak panel, and others will not. We will attempt to conduct the survey with all sampled youth, even those not currently listed as panel members. We anticipate a total sample of 1,240 youth.
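To illustrate the within-household selection step, the minimal sketch below randomly samples one eligible youth from a household roster. The field names and helper function are hypothetical and shown only for illustration; the production web survey implements NORC’s own selection algorithm.

```python
import random

def select_youth(roster, seed=None):
    """Randomly select one eligible youth (ages 14-18) from a household
    roster. Field names are hypothetical stand-ins for the rostered items:
    first name, age, and relationship to the parent respondent."""
    rng = random.Random(seed)  # fixed seed only for reproducible testing
    eligible = [y for y in roster if 14 <= y["age"] <= 18]
    if not eligible:
        return None  # household contributes no youth to the sample
    return rng.choice(eligible)

# Example: two of three rostered youth are age-eligible; one is selected.
roster = [
    {"first_name": "A", "age": 15, "relationship": "biological child"},
    {"first_name": "B", "age": 17, "relationship": "stepchild"},
    {"first_name": "C", "age": 21, "relationship": "biological child"},
]
print(select_youth(roster))
```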
Sampling Young Adults. The Young Adult Survey will be a census of all adults ages 19–24 in the AmeriSpeak panel who live in the United States and speak English or Spanish, for a total of 3,183 panel members.
NORC’s AmeriSpeak Panel Response Rate. NORC has formally documented the response rate of recruitment efforts for the AmeriSpeak household-level panel (Montgomery et al. 2016). Using the American Association for Public Opinion Research (AAPOR) Response Rate 3 (RR3) calculation, the 2014–2018 panel has a weighted household recruitment response rate of 34.2 percent. NORC collects basic demographic information such as age, gender, and race/ethnicity for all AmeriSpeak panel members at recruitment, which allows for sampling of targeted populations in specific studies, such as this project with its targeted populations of interest.
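For reference, the sketch below shows the general form of the AAPOR RR3 calculation, in which cases of unknown eligibility are discounted by an estimated eligibility rate. This is a minimal unweighted illustration with made-up counts; the published AmeriSpeak rate is weighted and computed across a two-phase recruitment design (Montgomery et al. 2016).

```python
def aapor_rr3(I, P, R, NC, O, UH, UO, e):
    """AAPOR Response Rate 3 (unweighted sketch).

    I: complete interviews        P: partial interviews
    R: refusals and break-offs    NC: non-contacts
    O: other eligible nonresponse
    UH, UO: unknown-eligibility cases (unknown household / unknown other)
    e: estimated share of unknown-eligibility cases that are eligible
    """
    return I / ((I + P) + (R + NC + O) + e * (UH + UO))

# Illustrative counts only (not AmeriSpeak figures):
print(f"{aapor_rr3(I=900, P=50, R=600, NC=700, O=100, UH=300, UO=100, e=0.5):.1%}")
```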
When designing the Parent, Youth, and Young Adult Surveys, the study team reviewed 11 national or regional public health and public opinion surveys and 8 previously used research and evaluation instruments that cover topics relevant to adolescent decision making and risk behavior (see Attachment A: Research Questions, Sources, and Item Crosswalk).
Although the study team drew items in the IDEAS instruments from these existing surveys, the process of developing the item pool identified question constructs that existing items did not cover. After subject matter experts and ACF leaders reviewed the instruments, the study team modified some previously used items and added new items that better fit the goals of the IDEAS project. Attachment A shows how questions across the three survey instruments were mapped to research questions and study objectives.
Because of the number of new or modified items on the instruments, and to decrease the chance of measurement error, the full surveys were pre-tested; because pre-testing involved more than nine people, it required OMB clearance. The pre-tests were conducted under the generic OMB clearance, Pre-testing of Evaluation Data Collection Activities (0970-0355), to obtain feedback on the clarity and completeness of the questions and response options, as well as the overall time needed to complete each instrument. Many of the items included in the surveys have been tested and used in other national-level studies, such as the National Survey of Family Growth and the National Survey of Adolescents and Parents, or in other HHS studies, including STREAMS and the Federal Evaluation of Making Proud Choices! The pre-tests therefore focused primarily on questions developed specifically for this study and probed for input on any questions or response options that participants found confusing or difficult to answer.
Pre-testing followed an iterative process. Across the three instruments, a total of 173 pre-tests were conducted over two rounds of testing in three geographically diverse locations: Bethesda, Maryland; Dallas, Texas; and Kansas City, Missouri. The pre-testing took place in small group sessions, with three sessions for each survey instrument at each location (nine groups per location, per round). After each round of pre-testing, the survey instruments were revised based on the findings. Each session included three or four people and lasted roughly 90 minutes to two hours. Participants independently completed a paper version of the survey during the first part of each session. After completing the survey, participants engaged in a small-group debriefing to discuss particular questions of concern identified in advance by the instrument design team, along with any other questions respondents found unclear or problematic. Across all pre-tests, respondents included males and females, varied in age, and had a mix of racial and ethnic backgrounds.
The results from the first round of pre-testing, along with input from ACF leadership, led to wording changes on several of the new or modified items. The changes clarified the meaning of items to ensure that all respondents interpreted them the same way and could find a response option that accurately reflected their attitudes and beliefs. The revised instruments were tested in the second round. Round 2 feedback resulted in modest wording changes to improve clarity; once those changes were implemented, the instruments were finalized.
Most survey sample members are panel participants who have agreed to be available for multiple survey data collections. As discussed in Section B1, we propose using the AmeriSpeak panel, developed and maintained by NORC at the University of Chicago. ACF has contracted with Mathematica for work on this study, and Mathematica will coordinate data collection activities with NORC. Data collection will run for about four months, starting shortly after OMB approval. All data will be collected through instruments programmed as web-based surveys. Details on the data collection plan for each instrument are provided below.
As discussed in Supporting Statement A, Section A2, participants have already been recruited and consented by NORC to participate in the AmeriSpeak Panel. Therefore, data collection for Instruments 1 and 4 will begin with an email sent to all parent panel members with youth ages 14–18 and all young adult panel members ages 19–24, notifying them of a new survey to complete (see Attachment E: Survey Invitation Emails). Parents and young adults will access their survey via a link in the notification email. The invitation email and all subsequent contacts with panel participants will be handled by NORC in collaboration with Mathematica. Panelists can also view their survey invitation at the online member portal or on the AmeriSpeak app. Before beginning the survey, all participants will be required to provide their consent to participate. Nonrespondents will receive email follow-up reminders throughout the four-month data collection period. Panel members who, at the time of panel recruitment, opted in to receiving text message alerts will receive reminders in that format as well. Reminder calls will be made to encourage panel members to complete their survey via the web; panel members who wish to complete the survey by telephone at that time may do so. Nonrespondents will also receive a postcard encouraging and reminding them to complete the survey. All reminder email, postcard, text, and phone scripts are in Attachment F: Survey Nonresponse Reminders.
The Parent Survey is expected to take 20 minutes to complete. The Young Adult Survey is expected to take 35 minutes to complete.
Due to the sensitive nature of the topics on the Youth Survey (Part 1 and Part 2), parental consent will be obtained in the Parent Survey for the sampled youth, at which point the parent will also be asked to provide an email address and phone number for the Youth Survey invitation. Parental consent will be part of the Parent Survey instrument; see Instrument 1, Item J1 for specific wording.
An invitation to Part 1 of the Youth Survey will be sent by email to all youth ages 14–18 whose parents gave consent (Attachment E: Survey Invitation Emails). The invitation email will notify the youth of a survey to complete and will contain a direct link to it. Survey nonrespondents will receive follow-up emails, texts if they have opted in to receive them, and reminder telephone calls, along with a postcard mailed to their address (Attachment F: Survey Nonresponse Reminders). As with the Parent and Young Adult Surveys, youth who wish to complete the survey by phone at the time of a reminder call may do so. Data collection for the Youth Survey will start shortly after the fielding of the Parent Survey, to allow time for parents to give consent, which triggers the release of the Youth Survey. Part 1 of the Youth Survey is expected to take 20 minutes to complete. At the start of the Youth Survey, respondents will be asked to provide their assent to participate (see Instrument 2, Item I.1 for specific wording). The assent and consent items describe the survey topics, the risks of participation, and respondents’ rights as research subjects.
An email invitation for the second part of the Youth Survey will be sent to all youth ages 14–18 whose parents gave consent, regardless of whether the youth completed Part 1 (Attachment E: Survey Invitation Emails). Data collection for Part 2 will begin about four weeks after we send the email notification for Part 1. All sampled youth will receive an email notification to complete Part 2, which will contain a link to their survey. The data collection procedures noted for Part 1 will be followed for Part 2, including all nonresponse follow-up activities (specifically, follow-up emails, texts if they have opted in, and reminder telephone calls). We expect the Part 2 survey to have a lower response rate than Part 1, and consequently fewer completed cases, due to sample attrition. Part 2 of the Youth Survey is expected to take 20 minutes to complete. The Youth Survey assent items and demographic questions will be asked only of sample members who did not complete Part 1.
Prior to the start of data collection, the programmed web-survey instrument will undergo a thorough, two-stage testing process. First, the instrument design and data collection team at Mathematica will carefully review the instruments to ensure programming has been conducted as specified and that the web-based instruments work exactly as expected. Next, a quality control programming team will conduct instrument stress testing, where hundreds of cases will be randomly generated to follow all skip logic and other assumptions programmed for the instruments. The programming team will write additional code to verify the accuracy of all skip logic using this randomly generated data.
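The stress-testing approach can be sketched as follows. The skip rule is a toy stand-in for the programmed instrument logic, and the verification function is a hypothetical example of the additional code the programming team will write.

```python
import random

def administer(answers):
    """Toy routing logic standing in for the programmed instrument:
    follow-up item B1 is asked only if the answer to A1 is 'never'."""
    path = ["A1"]
    if answers["A1"] == "never":
        path.append("B1")
    path.append("C1")
    return path

def expected_path(answers):
    """Independent restatement of the specification, used to verify the
    programmed routing against randomly generated cases."""
    return ["A1", "B1", "C1"] if answers["A1"] == "never" else ["A1", "C1"]

rng = random.Random(0)
for _ in range(500):  # hundreds of randomly generated cases
    case = {"A1": rng.choice(["never", "sometimes", "often"])}
    assert administer(case) == expected_path(case), case
print("All randomly generated cases followed the specified skip logic.")
```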
As part of the Youth Empowerment IDEAS project data collection procedures, the AmeriSpeak panel will use a “soft launch” during the initial fielding of the surveys. In the first few days of data collection, each survey will be released to 10 percent of the full selected sample. During this period, Mathematica will review all survey responses, conducting a thorough review of the frequencies to ensure no technical issues exist before releasing the surveys to the full sample.
Once participants are recruited into the AmeriSpeak panel, NORC uses rigorous methods to maximize their survey response rates and maintain their cooperation. The follow-up methods planned for the Youth Empowerment IDEAS project surveys are described in Section B4. To further improve response rates, survey respondents receive AmeriPoints for their participation in surveys, which can be redeemed for cash or physical goods (see Supporting Statement Part A, Section A9). NORC maintains strict rules to limit respondent burden and reduce the risk of panel fatigue. AmeriSpeak panel members typically participate in AmeriSpeak web-based or phone-based studies two to three times a month.
Table B5 shows for each data collection instrument the respondent type, the number of eligible AmeriSpeak panel members invited to participate in the survey, the expected survey response rates, the estimated number of completes, and the final expected cooperation rate, which takes into account the panel’s household-level recruitment response rate. As discussed in Section B2, the AAPOR weighted RR3 for the 2014–2018 panel recruitment is 34.2 percent. The number of eligible youth invited to participate assumes 80 percent of Parent Survey participants consent for their teen to participate in the Youth Survey.
Table B5. Eligible population in panel, number of eligible panel members invited to take survey, estimated survey cooperation rate, estimated number of completes, and final expected survey response rate, by data collection instrument.
Data collection instrument | Respondents | Eligible households in the AmeriSpeak panel (number) | Eligible sample members invited to complete the survey (number) | Survey cooperation rate | Estimated number of completes | Final response rate (panel response ratea * survey cooperation rate)
(1) Parent Survey | Parents of youth ages 14–18 in the research panel | 4,586 | 4,586 | 33.8% | 1,550 | (34.2 * 33.8) = 11.56%
(2) Part 1 Youth Surveyb | Youth ages 14–18 in the research panel | 4,586 | 1,240 | 54.4% | 675 | (34.2 * 54.4) = 18.61%
(3) Part 2 Youth Surveyb | Youth ages 14–18 in the research panel | 4,586 | 1,240 | 47.6%c | 590 | (34.2 * 47.6) = 16.28%
(4) Young Adult Survey | Young adults ages 19–24 in the research panel | 3,183 | 3,183 | 24.3% | 775 | (34.2 * 24.3) = 8.31%
a The household-level panel recruitment response rate is 34.2 percent.
b Based on NORC’s experience conducting surveys with the panel participants, we assume that of the 4,586 parents invited to complete the survey, 33.8 percent (1,550) will complete it. Parents must provide consent for all youth ages 14–18. NORC assumes, based on prior studies using their teen panel, that of the 1,550 parent completes, 80 percent (1,240) will provide consent allowing us to contact their child to participate. Youth Survey Parts 1 and 2 will be sent to all 1,240 consented youth.
c We anticipate a lower response to the Youth Survey Part 2 due to attrition.
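The final response rates in Table B5 are simple products of the weighted panel recruitment rate and each survey’s expected cooperation rate. The sketch below reproduces the calculation; because the inputs are rounded, a product can differ from the tabled figure by about 0.01 percentage point.

```python
PANEL_RR3 = 0.342  # weighted AmeriSpeak household recruitment rate (AAPOR RR3)

cooperation_rates = {
    "Parent Survey": 0.338,
    "Youth Survey Part 1": 0.544,
    "Youth Survey Part 2": 0.476,
    "Young Adult Survey": 0.243,
}

for survey, coop in cooperation_rates.items():
    # Final response rate = panel recruitment rate * survey cooperation rate
    print(f"{survey}: {PANEL_RR3 * coop:.2%}")
```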
A number of procedures will be used to reduce, assess, and handle nonresponse at the case (survey) level and the item level, including using best practices in survey design and implementing a responsive design approach to reduce survey-level nonresponse and mitigate nonresponse bias.
Survey design’s use of best practices to reduce survey- and item-level nonresponse. Research indicates that people are less likely to participate in surveys on sensitive topics (Tourangeau et al. 2010; Lind et al. 2013) and that item nonresponse increases for sensitive questions (Tourangeau and Yan 2007). Consequently, the surveys will be web-based, a format that is familiar to the panel participants and, because of the anonymity associated with this mode, is well suited to the sensitive questions asked in the survey (Tourangeau et al. 2000). Web surveys give this population of busy parents, young adults, and youth the flexibility to complete the survey at a convenient time on a computer, tablet, or smartphone. During data collection, we will use best practices for following up with nonrespondents, including multiple reminder emails, a mailed postcard, and telephone reminders (see Attachment F: Survey Nonresponse Reminders).
Survey length can negatively impact response rates, with longer survey instruments increasing nonresponse and, potentially, nonresponse bias (Brick and Tourangeau 2017). During the survey design process, the research team sought to keep the instruments brief. Following pre-testing, the team further reduced the length of the surveys and broke the longest survey, the Youth Survey instrument, into two parts to be taken at separate times, with the explicit goal of mitigating survey nonresponse due to survey length.
Using responsive design to reduce nonresponse bias. Well established in the literature as responsive or adaptive design (Groves and Heeringa 2006; Wagner 2008; Schouten et al. 2011; Couper and Wagner 2011), this approach relies on frequent (daily or weekly) review of paradata and live survey response data during data collection. These reviews drive decisions about interventions or varying approaches, with the goals of (1) ensuring enough completed cases by subgroup for analyses and (2) decreasing overall survey nonresponse bias. Throughout the data collection period, incoming key demographic data on age, gender, educational attainment, and race/ethnicity will be monitored to determine whether subgroups are participating at different rates and whether nonresponse bias may be occurring. These data will be compared with the demographic percentages in the panel to determine whether changes to the data collection approach, or an intervention aimed at increasing response among particular subgroups, are needed to reduce or mitigate the impact of nonresponse bias. If nonresponse bias is detected during data collection, we will immediately adjust the data collection protocol to focus on subgroups with lower response rates, including more intensive and tailored follow-up (such as telephone follow-up) with specific nonrespondent cases that, if completed, would balance response rates and reduce the possibility of nonresponse bias across key demographic subgroups.
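A minimal sketch of the subgroup-monitoring step follows. The data structures, subgroup labels, and 5-percentage-point tolerance are hypothetical; in practice, monitoring will rely on NORC paradata and study-specific thresholds.

```python
def flag_lagging_subgroups(completes, panel_shares, tolerance=0.05):
    """Flag subgroups whose share of completed cases falls more than
    `tolerance` below their share of the panel, signaling a need for
    tailored follow-up (for example, telephone reminders).

    completes: dict mapping subgroup -> count of completed surveys
    panel_shares: dict mapping subgroup -> proportion of the panel
    """
    total = sum(completes.values())
    flagged = []
    for group, benchmark in panel_shares.items():
        observed = completes.get(group, 0) / total if total else 0.0
        if observed < benchmark - tolerance:
            flagged.append((group, observed, benchmark))
    return flagged

# Hypothetical weekly snapshot: male respondents are lagging the benchmark.
print(flag_lagging_subgroups(
    completes={"male": 180, "female": 320},
    panel_shares={"male": 0.49, "female": 0.51},
))
```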
Item nonresponse. As shown in Table A11 of Supporting Statement Part A, the Youth Survey contains most of the sensitive survey questions. Compared with computer-assisted telephone interviewing, the web mode tends to yield higher overall levels of reporting and more accurate reporting of sensitive information (Kreuter et al. 2008). We anticipate that item-level response rates will be high because most items in the surveys are not sensitive. To encourage response to sensitive items, the web-programmed survey instruments will contain soft checks, as described in Section B7.
Although the objectives of the Youth Empowerment IDEAS project do not include deriving national prevalence rates based on the survey data, it will be valuable and important to examine any variation in results across key subgroups of age, gender, and geographic region of the United States. No weighting procedures will be conducted that attempt to represent a national population. The data collected from the survey instruments will be used to provide ACF with information to help refine and guide program development in the area of adolescent pregnancy prevention and will not be used as the principal basis for public policy decisions.
The surveys for the Youth Empowerment IDEAS project will collect data through closed-ended questions about behavior, attitudes, opinions, and beliefs. The web program will increase data quality by implementing soft and hard checks. Soft checks will occur if a respondent leaves a question blank. The respondent will be notified that they have not responded to the question and will be allowed to either provide a response or continue without a response if they so choose. Unlikely responses that would be deemed out of range for a continuous variable will also trigger a soft check so the respondent can confirm the out-of-range response. Hard checks will be implemented as another data quality check. Hard checks will not allow impossible responses. For example, for items that instruct respondents to “Select one response only,” the program will not allow more than one response to be selected. Additionally, the web programming will route respondents only to questions that apply to them based on their earlier responses. For example, parents will only be asked follow-up questions about why they did not have a conversation with their child about a certain topic if they report never speaking to their child about the topic in the earlier item.
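The two check types can be sketched as follows. Item names, the range bounds, and the prompt text are hypothetical; the production checks are implemented in the web survey software.

```python
def hard_check_select_one(selections):
    """Hard check: an impossible response is rejected outright for items
    instructing respondents to 'Select one response only'."""
    if len(selections) > 1:
        raise ValueError("This item accepts only one response.")
    return selections

def soft_check(value, low, high):
    """Soft check: flag a blank or out-of-range answer, but let the
    respondent confirm it and continue if they so choose."""
    if value is None:
        return "You have not answered this question. Continue without answering?"
    if not (low <= value <= high):
        return f"{value} is outside the expected range ({low}-{high}). Please confirm."
    return None  # no prompt needed; response accepted as entered

# Example: a respondent age entered as 142 triggers a confirmation prompt.
print(soft_check(142, low=14, high=18))
```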
On each survey instrument, several items contain an “other, specify.” Respondents may type in their own responses here if none of the closed-ended responses apply. These responses will be reviewed by the data collection team, who will back-code them to any applicable existing response options. During this coding stage, if existing response options do not apply, new response options may be created if there are enough equivalent write-in responses to warrant a new code.
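A minimal sketch of the back-coding step appears below. The mapping and the threshold for proposing a new code are hypothetical; in practice, the data collection team reviews write-in responses by hand.

```python
from collections import Counter

def backcode(write_ins, code_map, new_code_threshold=10):
    """Map reviewed 'other, specify' write-ins to existing response codes,
    and tally unmatched responses so that frequent equivalents can be
    proposed as new codes.

    write_ins: list of free-text responses
    code_map: dict mapping a normalized write-in -> existing response code
    """
    coded, unmatched = [], Counter()
    for text in write_ins:
        key = text.strip().lower()
        if key in code_map:
            coded.append(code_map[key])
        else:
            coded.append("OTHER")
            unmatched[key] += 1
    candidates = [t for t, n in unmatched.items() if n >= new_code_threshold]
    return coded, candidates
```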
Nearly all the survey data will be categorical or dichotomous (for example, whether youth have ever discussed various topics with a parent), and for these data we will calculate percentages. For continuous variables (for example, respondent age), we will calculate means. Estimates will be tabulated separately for subgroups of interest, such as younger and older youth, males and females, and people in different regions. Comparisons among parents, youth, and young adults will be examined for survey questions that are the same across instruments. We will also conduct analyses on parent/youth dyads to better understand whether parents and their youth have similar or different opinions, experiences, and perspectives.
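A minimal sketch of these tabulations is below, assuming a pandas analysis file with one row per respondent and hypothetical column names.

```python
import pandas as pd

# Hypothetical analysis file: one row per respondent.
df = pd.DataFrame({
    "household_id": [1, 1, 2, 2, 3],
    "role": ["parent", "youth", "parent", "youth", "young_adult"],
    "gender": ["F", "M", "M", "F", "F"],
    "discussed_topic": [1, 0, 1, 1, 0],  # dichotomous item (1 = yes)
    "age": [44, 15, 39, 17, 22],         # continuous item
})

# Percentages for a dichotomous item, tabulated by subgroup.
print(df.groupby(["role", "gender"])["discussed_topic"].mean() * 100)

# Means for a continuous item.
print(df.groupby("role")["age"].mean())

# Parent/youth dyads: reshape within household to compare responses.
dyads = df[df["role"].isin(["parent", "youth"])].pivot(
    index="household_id", columns="role", values="discussed_topic")
print((dyads["parent"] == dyads["youth"]).mean())  # share of agreeing dyads
```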
The primary purpose of the estimates produced from the Youth Empowerment IDEAS project survey data will be to inform current and future Adolescent Pregnancy Prevention and Sexual Risk Avoidance Education programs. ACF may also write practitioner-focused short reports or briefs that help share what was learned with program providers. A codebook will be provided to ACF, along with a memo describing and documenting the methodological approaches used for the study.
To develop the Parent, Young Adult, and Youth Surveys, ACF contracted with Mathematica. Mathematica will lead the data collection activities described in this ICR. Table B8 lists the individuals responsible for instrument design, data collection, and the statistical aspects of the surveys, and includes their affiliation and email address.
Table B8. Individuals responsible for survey development and data collection procedures
Name | Affiliation | Email address
Caryn Blitz | Office of Planning, Research, and Evaluation, Administration for Children and Families, U.S. Department of Health and Human Services | 
Kathleen McCoy | Business Strategy Consultants Staff, Office of Planning, Research, and Evaluation, Administration for Children and Families, U.S. Department of Health and Human Services | 
Susan Zief | Mathematica | 
Tiffany Waits | Mathematica | 
Eric Grau | Mathematica | 
Sarah Forrestal | Mathematica | 
Jennifer Walzer | Mathematica | 
Attachments
Attachment A Research Questions, Sources, and Item Crosswalk
Attachment B 60-Day Federal Register Notices
Attachment C AmeriSpeak Privacy Statement
Attachment D Technical Overview of the AmeriSpeak Panel
Attachment E Survey Invitation Emails
Attachment F Survey Nonresponse Reminders
Attachment G Individuals Submitting Public Comments Through Grassroots Messages
Instrument 1 IDEAS Parent Survey
Instrument 2 IDEAS Youth Survey Part 1
Instrument 3 IDEAS Youth Survey Part 2
Instrument 4 IDEAS Young Adult Survey
Brick, J.M., and R. Tourangeau. “Responsive Designs for Reducing Nonresponse Bias.” Journal of Official Statistics, vol. 33, no. 3, 2017, pp. 735–752.
Couper, M.P., and J. Wagner. “Using Paradata and Responsive Design to Manage Survey Nonresponse.” ISI 58th World Statistical Congress, Dublin, Ireland, 2011.
Dahlhamer, J.M., A.M. Galinsky, S.S. Joestl, and B.W. Ward. “Sexual Orientation in the 2013 National Health Interview Survey: A Quality Assessment.” Vital Health Statistics, vol. 2, no. 169, 2014.
Dennis, J.M. “Technical Overview of the AmeriSpeak panel: NORC’s Probability-Based Research Panel.” Available at AmeriSpeak.norc.org. 2017. Accessed April 6, 2020.
Groves, R.M., and S.G. Heeringa. “Responsive Design for Household Surveys: Tools for Actively Controlling Survey Errors and Costs.” Journal of the Royal Statistical Society, Series A (Statistics in Society), vol. 169, no. 3, 2006, pp. 439–457.
Kreuter, F., S. Presser, and R. Tourangeau. “Social Desirability Bias in CATI, IVR, and Web Surveys: The Effects of Mode and Question Sensitivity.” Public Opinion Quarterly, vol. 72, no. 5, 2008, pp. 847–865.
Lind, L., M. Schober, F. Conrad, and H. Reichert. “Why Do Survey Respondents Disclose More When Computers Ask the Questions?” Public Opinion Quarterly, vol. 77, no. 4, 2013, pp. 888–935.
Montgomery, R., J.M. Dennis, and N. Ganesh. “Response Rate Calculation Methodology for Recruitment of a Two-Phase Probability-Based Panel: The Case of AmeriSpeak.” NORC at the University of Chicago: AmeriSpeak Research, 2016. Available at http://d3qi0qp55mx5f5.cloudfront.net/amerispeak/i/research/WhitePaper_ResponseRateCalculation_AmeriSpeak_2016.pdf
Schouten, B., M. Calinescu, and A. Luiten. “Optimizing Quality of Response Through Adaptive Survey Designs.” Statistics Netherlands, vol. 11, 2011, pp. 1–27.
Tourangeau, R., R.M. Groves, and C. Redline. “Sensitive Topics and Reluctant Respondents: Demonstrating a Link Between Nonresponse Bias and Measurement Error.” Public Opinion Quarterly, vol. 74, 2010, pp. 413–432.
Tourangeau, R., L. J. Rips, and K. Rasinski. The Psychology of Survey Response. New York: Cambridge University Press, 2000.
Tourangeau, R., and T. Yan. “Sensitive Questions in Surveys.” Psychological Bulletin, vol. 133, 2007, pp. 859–883.
Wagner, J. Adaptive Survey Design to Reduce Nonresponse Bias. Ann Arbor, MI: University of Michigan Ph.D. thesis, 2008.
1 Parents are defined as parents or guardians with biological children or children they are responsible for, such as stepchildren, adopted children, and foster children.
2 NORC at the University of Chicago: 2010 National Sample Frame. Available at http://www.norc.org/Research/Projects/Pages/2010-national-sample-frame.aspx.