SUPPORTING STATEMENT: PART A
OMB#
Pilot Implementation of the Violence Against Children and Youth Survey (VACS) in the US
October 18, 2021
Point of Contact:
Jeffrey D. Ratto, MPH
Health Scientist
Contact Information:
Centers for Disease Control and Prevention
National Center for Injury Prevention and Control
4770 Buford Highway NE, MS S106-10
Atlanta, GA 30341-3724
phone: 404-498-0370
email: [email protected]
CONTENTS
Section Page
Summary table 3
Justification 4
A.1. Circumstances Making the Collection of Information Necessary 4
A.2. Purpose and Use of Information Collection 6
A.3. Use of Improved Information Technology and Burden Reduction 6
A.4. Efforts to Identify Duplication and Use of Similar Information 7
A.5. Impact on Small Businesses or Other Small Entities 9
A.6. Consequences of Collecting the Information Less Frequently 9
A.7. Special Circumstances Relating to the Guidelines of 5 CFR 1320.5(d)(2) 9
A.8. Comments in Response to the Federal Register Notice and Efforts to Consult Outside the Agency 9
A.9. Explanation of Any Payment or Gift to Respondents 11
A.10. Protection of the Privacy and Confidentiality of Information Provided by Respondents 15
A.11. Institutional Review Board (IRB) and Justification for Sensitive Questions 16
A.12. Estimates of Annualized Burden Hours and Costs 16
A.13. Estimates of Other Total Annual Cost Burden to Respondents or Record Keepers 18
A.14. Annualized Cost to the Government 18
A.15. Explanation for Program Changes or Adjustments 20
A.16. Plans for Tabulation and Publication and Project Time Schedule 20
A.17. Reason(s) Display of OMB Expiration Date is Inappropriate 22
A.18. Exceptions to Certification for Paperwork Reduction Act Submissions 22
Attachments
A Authorizing Legislation: Public Health Service Act
B Published 60-Day Federal Register Notice
C Privacy Impact Assessment (PIA)
D Domestic VACS Head of Household Consent Form
D.1 Cognitive Interviewing Parent/Guardian Consent Form
E Domestic VACS Head of Household Questionnaire
F VACS Respondent Initial Information and Consent Form
F.1 Cognitive Interviewing Respondent Information and Consent Form
G Institutional Review Board (IRB) Approval
H Household Invitation Letter
I Screener Questionnaire
J VACS Core Youth Respondent Questionnaire
K Screenshots
L Response Plan, Mandatory Reporting, and Distressed Respondent Protocol
M Philippines 2012 Cognitive Interview Report
N Malawi 2013 Cognitive Interview Report
O Colombia 2017 Cognitive Interview Report
P VACS Adaptation Expert Consultation Report
Q Cognitive Interview Protocol
R Cognitive Interview Recruitment Flyer
S Screening Script for Cognitive Interviews
SUMMARY TABLE
This request describes the Pilot Implementation of the Violence Against Children and Youth Survey (VACS) in the United States, which CDC has conducted in 24 countries globally and has adapted for a domestic context. More specifically, this request details the adaptations to the survey and methods for piloting the survey in one urban and one rural area of the United States.
Goals of the Pilot Test:
Intended use of the resulting data. The findings from this pilot study will be used primarily to better understand the feasibility and effectiveness of implementing VACS in the U.S., which will ultimately inform CDC’s approach to understanding the magnitude of violence against children and youth and its underlying risk and protective factors. The CDC will be using the results of this survey to inform future efforts to collect high-quality data to assess violence and related factors and use data-driven approaches to prevent violence and refine practices related to the protection of children. Results will be used to make recommendations to relevant organizations in Baltimore (and similar cities in the US) on data-driven procedures to develop, improve and enhance prevention and response strategies to address violence against children and youth.
Methods to be used to collect data. Data will be collected through a combination of in-person and online qualitative interviews (for cognitive interviewing), in-person convenience sampling household surveys (for field testing and rural feasibility pilot), and in-person probability-based household surveys (for full pilot implementation). The field testing, rural feasibility pilot, and survey pilot will be conducted using a combination of interviewer administration and Audio Computer-Assisted Self-Interview (ACASI) Software on tablets.
The subpopulation to be studied. For the cognitive interviews, the subpopulation will include youth ages 13-24 in urban Baltimore, Maryland who speak English. For field testing, rural feasibility pilot, and full pilot survey implementation, the subpopulation will include youth ages 13-24 in urban Baltimore or rural Garrett County, Maryland who speak English.
How data will be analyzed. Qualitative data analysis and summary statistics will be used for cognitive interviewing, field testing, and feasibility pilot. Data from the survey pilot will be analyzed using appropriate statistical software to account for the complex survey design to compute associations between study variables and provide qualitative and quantitative information about the feasibility of implementing the VACS at a state or local level in the U.S.
A. JUSTIFICATION
A.1. Circumstances Making the Collection of Information Necessary
CDC seeks OMB approval for up to 3 years of data collection to conduct a pilot test of the Violence Against Children and Youth Surveys (VACS) in the United States, specifically in Baltimore and Garrett County, Maryland. In recent years, the VACS have been implemented in 24 countries across Africa, Southeast Asia, Eastern Europe, and Central and South America, yielding nationally representative and sub-national data on the burden, contexts, and consequences of violence against children and youth.i In 11 countries, the survey was implemented in English as well as other languages; all prior VACS implementations have been completed in low- and middle-income countries in Africa, Asia, the Caribbean, Eastern Europe and Latin America.
Violence against children is a global human rights violation that spans every country worldwide and affects a billion children each year.ii In the US, a large number of youth are the victims of multiple forms of violence and abuse. An estimated 10 million children in the US have experienced child abuse and neglect (CAN),iii iv while approximately 676,000 victims of CAN were reported to child protective services in 2016.v Each day, about a dozen youth are victims of homicide, and more than 100 times that number (approximately 1,400) are treated in emergency rooms for physical assault injuries.vi Youth are also involved in high levels of peer violence, which is one of the leading causes of death for people ages 10-24.vii Among U.S. high school students, YRBS data on the past-year prevalence of physical fighting (23.6%), school bullying (19.0%), electronic bullying (14.9%), physical dating violence (8.0%), sexual violence (9.7%), and attempted suicide (7.4%), and the lifetime prevalence of forced sexual intercourse (7.4%), demonstrate that violence is common among youth in the US.viii
A body of research has shown that the impact of violence against children goes far beyond the initial incident, and that those who have experienced emotional, physical, and sexual violence can suffer severe short- and long-term health and social consequences.ix CAN has serious negative physical health effects and is associated with a variety of chronic diseases in adulthood.x Victims of CAN also suffer from negative psychological sequelaexi and increased risk for smoking, alcoholism, and drug abuse.xii Neurobiological and behavioral research indicates that early childhood exposure to violence can affect brain development and thereby increase the child’s susceptibility to a range of mental and physical health problems that can span into adulthood, including anxiety or depressive disorders, cardiovascular health problems, and diabetes.xiii,xiv,xv Youth violence is associated with poorer academic performance,xvi,xvii,xviii,xix,xx,xxi suspension, unexcused absences,xxii higher dropout rates,xxiii,xxiv and delinquency for its victims.xxv,xxvi,xxvii,xxviii Similar deleterious outcomes are found for violent victimization by dating partners in adolescence,xxix,xxx,xxxi,xxxii with nearly one in five youth reporting sexual or physical victimization by a partner.xxxiii
Given the serious and lasting impact on children, it is critical to understand the magnitude and nature of violence against children in order to develop effective prevention and response strategies. Currently, data to guide violence prevention and response efforts at the state or local level in the US are quite limited. While some studies have provided information on the risks and impact of violence against children, they are mostly limited in scale and cannot be generalized to the scope of violence against youth across the US or for specific regions.xxxiv,xxxv Existing data on violence against youth in the United States are based on telephone studies, such as the National Survey of Children’s Exposure to Violence, which includes caregiver respondents for young children. Disclosure of violence in this study may therefore be lower, as parents may not know or may not wish to reveal the full extent of their child’s exposure to violence, and youth respondents may not have the privacy needed to feel comfortable disclosing.xxxvi In addition, the existing national studies cannot yield estimates at the state and local levels, making the data less actionable at these levels. These studies also often have lower response rates.
Another well-known data system that measures child abuse is the National Child Abuse and Neglect Data System (NCANDS), a voluntary data collection system that gathers information from all 50 states, the District of Columbia, and Puerto Rico about reports of child abuse and neglect. However, this system is limited to official reports of child abuse and neglect. The NCANDS is also not nationally representative, and analyses cannot be extrapolated to the entire U.S. population of children.
Furthermore, less focus has been given to understanding protective factors for violence against children. Consequently, this study will assess potential risk and protective factors for violence against children and include risk and protective factors that have been associated with violence in other countries. For example, this study will examine a number of factors related to degree of parental involvement and will ask respondents about whether a parent has died, how long a respondent lived with each biological parent, reasons why they may no longer be living with a parent, parent education level, relationship quality with parents, and perceived family and social support. Although some of these factors are not readily modifiable, these associations would have implications for identifying those at highest risk for violence and therefore help to determine how best to identify and allocate available prevention resources. CDC anticipates that a greater understanding of the risk and protective factors influencing violence against children could guide the development of prevention strategies designed to buffer against these risks and bolster facets of protection. The current data collection results will be used to evaluate the questionnaire and methods testing to further the end goal of informing reliable population estimates at state or local levels in future studies.
Without integrative research into the breadth and depth of the problem and investigation into why violence is so highly stigmatized and hidden, current response options may be ineffective, leaving children with limited access to services and protection. Moreover, there are gaps in what data are available at the state and local level, which make it difficult to determine how many children are exposed to violence and to characterize the circumstances and contexts in which such violence occurs. There is a need for data that are representative of the entire spectrum of violence against children and youth and are comprehensive in terms of describing multiple types of violence (physical, sexual, and emotional) as well as the contexts of violence victimization, and its risk and protective factors. The objective of the current study is to assess the use of household survey methods to generate such comprehensive data at state or local levels in future studies.
Current Request
VACS measure the magnitude of physical, sexual, and emotional violence against children as well as associated risk and protective factors. VACS have contributed to research throughout the world, demonstrating the high prevalence of violence against children in a variety of countries and cultures, and have proven to be critical tools that can fill data gaps in ways that are vital to informing strategic planning and evidence-based public health efforts in many countries. The development of a standardized global survey and questionnaire were led by CDC in collaboration with UNICEF and the public-private partnership Together for Girls with extensive external consultation from a wide variety of experts.
Although the VACS provides valuable information to inform public health initiatives to prevent violence, VACS have not been implemented in the U.S., and the existing representative datasets of violence against youth in the U.S. have significant limitations that prevent the data from being actionable for prevention planning by public health departments at the local level. VACS adapted for implementation in the U.S. will help fill this gap with rigorous probability-based estimates of the problem of youth violence at state or local levels, combined with an internationally tested approach to embed the VACS survey into the local strategic planning process of local public health partners. Since the U.S. is a much different context than the countries that have previously conducted VACS, this request is for pilot testing the adapted survey and methodology in two contexts: (1) a representative sample of 13-24-year-old youth in Baltimore and (2) a convenience sample of 13-24-year-old youth in rural Garrett County, Maryland to test the adapted VACS questionnaire and methodology in a rural location. CDC will also conduct two pre-tests prior to the full pilot implementation: Pre-test 1, cognitive interviewing of the adapted U.S. VACS questionnaire, will include interviewing 30 respondents to gather feedback on comprehension. The cognitive interviews are designed to evaluate key VACS questions for correct interpretation by respondents and to assess the ability of respondents to accurately answer the survey questions, yielding data on comprehension and the thought processes respondents use to answer the VACS survey questions. The cognitive interviewing will include a sample that is diverse by age, race, and gender and sexual minority status. Cognitive interviewing will also be used to pilot the Audio Computer-Assisted Self-Interview (ACASI) data collection approach by assessing usability and user comprehension (see attachments D.1, F.1, R, and S for cognitive interview materials).
CDC will submit a change request to OMB following cognitive interviewing in Pre-test 1 if changes need to be made to the questionnaire or survey administration. The findings from Pre-test 1 will be used to inform instrument revisions for Pre-test 2, the field test. Pre-test 2 will test the finalized instrument and survey procedures, identify any problems with the instrument, and allow time to fix those problems prior to the full-scale pilot. Pre-test 2 will include 60 respondents to test the final instrument and all study protocols in the field after training but before the full pilot implementation. Pre-test 2 will inform the survey procedures, including but not limited to: community entry, approaching households, gaining consent, and the referral process for respondents who need and want help for violence experiences. Pre-test 2 will take place directly after interviewer training and will thus give interviewers an opportunity to implement and practice their training in the real world prior to the full pilot. In addition, administering the questionnaire in the Pre-test 2 field test will provide preliminary information on the average length of interviews and information about the questionnaire format. If results from the field test indicate that changes are needed to the survey protocols or questionnaire, an IRB amendment and a change request to the present Information Collection Request will be submitted. After both pre-tests (the cognitive interviewing and field test) are completed, the full pilot implementation will begin, with a target sample of 1,020 completed interviews in Baltimore, as well as the feasibility pilot in Garrett County, with 50 completed interviews.
The use of probability-based sampling in Baltimore is based on the need to pilot methods for developing scientific estimates of violence against children in an urban area where the VACS in-person methodology has proven to be successful in other countries. The pilot survey will thus be conducted at scale in Baltimore to evaluate the adapted methods and also allow for a large enough sample to assess response rates and results to inform the sample design of future VACS implementation in a US context. The results of the pilot implementation of the survey in Baltimore will yield information to further refine and adapt methodology for future implementation of a full survey that could provide scientific estimates of violence against children with representative samples in urban settings. A feasibility pilot in a rural location, Garrett County, Maryland, will provide preliminary information about the field procedures in a rural setting. The objective of the data collection in the rural location is to assess the feasibility of data collection procedures in rural settings, focusing on the implementation of community entry, household entry, and consent procedures and protocols. The feasibility pilot in rural Garrett County will not utilize probability-based sampling; it will rely on a convenience sample and will involve testing implementation of field procedures.
Authority for CDC’s National Center for Injury Prevention and Control to collect these data is granted by Section 301 of the Public Health Service Act (42 U.S.C. 241) (Attachment A). This act gives Federal health agencies, such as CDC, broad authority to collect data and carry out other public health activities, including this type of study.
A.2. Purpose and Use of Information Collection
This research initiative seeks to examine the epidemiologic patterns of violence victimization as well as risk and protective factors for violence among children and youth in Baltimore for the purpose of informing future implementation of VACS questionnaire and methodology adapted for the U.S. This data collection will also be used to inform the development and implementation of effective prevention strategies in Baltimore. The information collection is focused on the following goals:
Adapt the Violence Against Children and Youth Survey (VACS) methodology for implementation in the U.S.
Conduct cognitive interviews to assess the comprehension, clarity of questions, and other aspects of the VACS questionnaire adapted for the U.S.
Pilot the adapted VACS methodology in Baltimore using a representative sample of youth ages 13-24.
Assess the feasibility of field procedures with a convenience sample in rural Garrett County, Maryland.
Test minor variations in the methodology to determine which variations result in better response rates and indicate less sample bias.
Identify risk and protective factors associated with physical, emotional and sexual violence against children and youth to inform stakeholders and guide future implementation of the adapted VACS in the U.S.
Identify the health and social consequences associated with violence against children and youth.
Assess the knowledge and use of medical, psychosocial, legal, and protective services available for children and youth who have experienced sexual, emotional and physical violence.
Identify areas for further research.
Assess the feasibility of VACS adaptations in the U.S., including data collection through ACASI as well as adapted procedures for community entry and household entry.
The findings from this pilot study will be used primarily to better understand the feasibility and effectiveness of implementing VACS in the U.S., which will ultimately inform CDC’s approach to understanding the magnitude of violence against children and youth and underlying risk and protective factors at state and local levels. Results from this pilot study will be used to inform future efforts to adapt and implement VACS in domestic settings at the state or local level in the collection of high-quality data to assess violence and related factors and use data-driven approaches to prevent violence and refine practices related to the protection of children. Results will be used to make recommendations to relevant organizations in Baltimore (and similar cities in the US) on data-driven methods to develop, improve, and enhance prevention and response strategies to prevent violence against children and youth.
In accordance with COVID-19 guidance, social distancing and other public health safety measures will be implemented as necessary when data collection begins. CDC awarded NORC at the University of Chicago a contract to provide scientific services to adapt the VACS methodology for implementation in the United States. NORC’s role includes managing and conducting the interviews in the field. NORC has developed a Planning for In-Person Data Collection Task Force to create a template for projects reintroducing in-person data collection during the COVID-19 pandemic in a way that is safe, is accepted by interviewers and respondents, and ensures the quality of data collection. Protocols that have been developed to date include requirements for maintaining physical distance, requiring both the interviewer and respondent to wear masks (NORC staff will carry extra masks to provide to respondents who do not have one available to wear), using disinfecting wipes after each interaction with materials and equipment, and interviewing outdoors when possible or sitting in an indoor area with adequate ventilation. As the situation evolves, the task force will continue updating protocols to ensure the safety of respondents and interview staff. All interviewers and field managers will be trained on all protocols prior to entering the field. Study procedures will adhere to the most up-to-date CDC guidance for testing, quarantine, and other mitigation efforts. In addition to following guidance developed by the NORC task force, the study team will follow the example of other federal research organizations, such as the U.S. Census Bureau, and train interviewers on adhering to public health guidelines and applicable state and local orders within Maryland.
The in-person data collection methodology is essential for this research for several reasons. First, given the sensitivity of the questionnaire, in-person data collection will help ensure the privacy and safety of the respondents. Interviewers will be trained to secure a private space within the home so that other household members do not see or overhear sensitive questionnaire items (see section A.10.). Interviewers will also be trained to implement a response plan, mandatory reporting procedures, and a distressed respondent protocol (Attachment L). Interviewers will connect respondents with local services, including referrals to an on-call social worker and/or other resources within the community. Additionally, in-person data collection has been shown to yield higher response rates and to minimize response bias compared to telephone or online survey methods.xxxvii CDC has had success with implementing VACS in 24 countries using in-person data collection methodology. CDC, through a contract with NORC, has adapted the VACS questionnaire and methodology and will conduct a pilot implementation of the adapted VACS as an in-person household survey in Baltimore and Garrett County. The purpose of this data collection is to inform future data collection efforts in the United States using the questionnaire and methodology adapted for the U.S. As outlined below in section A.4., the in-person methodology is essential for filling a gap in research on violence against children and youth in the United States. In-person data collection will proceed when COVID-19 conditions in the planned data collection areas are favorable and in compliance with all public health protocols in Maryland.
A.3. Use of Improved Information Technology and Burden Reduction
The pilot implementation of an adapted VACS in the United States intends to use electronic data collection, specifically tablets for interviewer-administered and Audio Computer-Assisted Self-Interview (ACASI) Software survey items. This pilot study will be the first VACS study to use ACASI. This software capitalizes on the use of improved information technology, allowing respondents to read, listen, and respond to questions on the tablet themselves, instead of responding to the interviewer. ACASI with a headset for working privately will be used because it allows for the collection of sensitive information in a way that maximizes respondent comfort with disclosure, increasing the likelihood of collecting reliable and valid data on sensitive items.
A.4. Efforts to Identify Duplication and Use of Similar Information
The information to be collected from respondents is not available from other sources. As described above, while VACS has been implemented in 24 countries globally, it has never been conducted in the United States, and therefore representative data collected in person on the prevalence, context, and risk and protective factors of violence against children do not exist.
While developing the global VACS, CDC engaged international partners, including UNICEF, WHO, USAID, and Together for Girls to ensure that the information was both useful and unique. In the development of the domestic surveys, CDC met with two state public health departments and experts from CDC, the CDC Foundation, academia, and non-profit organization partners to gather insight on important topics for inclusion in the questionnaire to gain a comprehensive understanding of violence against children in the U.S. The core VACS questionnaire draws questions and definitions from a number of well-respected survey tools, which has the benefits of (a) enabling comparison of data on various validated measures, and (b) using measures that have already been field tested, implemented, and validated in other studies. The core VACS questionnaire was further adapted for the U.S. context to draw on questionnaire instruments and measures that have been validated and used in surveys in the U.S. The following violence surveys and measurement tools served as resources to inform the core VACS questionnaire and the U.S. questionnaire adaptation (measurement tools implemented and validated in the U.S. are marked with an asterisk):
Attitudes Toward Women Scale for Adolescents (AWSA)*
Behavioral Risk Factor Surveillance System (BRFSS)*
Child Sexual Assault Survey (CSA)*
CDC’s Dating Matters® Questionnaire*
Communities That Care (CTC) Youth Survey*
Demographic and Health Survey (DHS)
Global Kids Online Survey (GKO)
Global School-based Student Health Survey (GSHS)
Health Behavior in School-Aged Children Survey (HBSC)*
HIV/AIDS/STD Behavioral Surveillance Surveys (BSS)
Hopkins Symptoms Checklist*
Institute of Education Sciences Healthy Kids Survey*
ISPCAN Child Abuse Screening Tool (ICAST)*
Longitudinal Studies of Child Abuse and Neglect (LONGSCAN)*
Multiple Indicator Cluster Survey (MICS)
National Intimate Partner and Sexual Violence Survey (NISVS)*
National Longitudinal Study of Adolescent Health (Add Health)*
National Longitudinal Survey of Youth (NLSY)*
National Survey on Children’s Exposure to Violence (NatSCEV) Juvenile Victimization Questionnaire (JVQ)*
World Health Organization Adverse Childhood Experiences International Questionnaire (ACE-IQ)
World Health Organization (WHO) Multi-country Study on Women's Health and Domestic Violence against Women
Youth Risk Behavior Survey (YRBS)*
The VACS core questionnaire was independently tested using cognitive testing methods by a team of expert scientists from the National Center for Health Statistics (NCHS) in the Philippines in 2012 and in Malawi in 2013, and by Columbia University in Colombia in 2016. The aim of the cognitive interviewing studies was to investigate how well survey questions performed when asked of respondents; that is, whether respondents understood the questions according to their intended design and whether they could provide accurate answers based on that intent. As a qualitative method, the primary benefit of cognitive interviewing is that it provides rich, contextual insight into the ways in which respondents 1) interpret a question, 2) consider and weigh relevant aspects of their lives and, finally, 3) formulate a response based on that consideration. As such, cognitive interviewing provides in-depth understanding of the ways in which a question operates, the kind of phenomena that it captures, and how it ultimately serves (or fails to serve) the scientific goal. The cognitive interviewing study was purposely conducted in three diverse locations to test the universality of the questionnaire in diverse contexts within countries and globally. Findings from each round of the cognitive interviewing project led to restructuring, reordering, and rewording of questions and sections of the questionnaire. The reports from the global cognitive interviewing are included as Attachments M, N, and O. Additional cognitive interviewing will be conducted with a sample of youth during the Pre-test 1 phase of the present study. The Cognitive Interview Protocol is included in Attachment Q.
Furthermore, while adapting the questionnaire for use in the U.S. pilot implementation, CDC collaborated with stakeholders from Baltimore City and Garrett County to elicit input on data gaps and priorities. Based on feedback from stakeholders across settings, an adapted VACS questionnaire was developed that will be used for the Pre-test 1 phase (cognitive interviewing). This phase will involve assessing the comprehension and performance of the questionnaire, with a particular focus on questions and items that have not been tested and implemented previously in a U.S. setting or that have been substantially adapted. Results from the cognitive testing will inform further adaptation of the questionnaire.
The CDC acknowledges that a lack of comprehensive data on violence against children has been one of the challenges to planning, implementing, monitoring, and evaluating appropriate policies and programming on child protection. While there are some sources of information related to violence against youth, such as the Youth Risk Behavior Survey (YRBS, OMB# 0920-0493) and Behavioral Risk Factor Surveillance System (BRFSS, OMB# 0920-1061), these sources are not violence-focused, and thus do not capture local prevalence and the comprehensive context of youth violence victimization and perpetration across the U.S. Furthermore, much adverse childhood experiences (ACEs) data are focused on adults rather than youth populations. Alternatively, there are other data systems that cover violence, such as CDC’s National Intimate Partner and Sexual Violence Survey (NISVS, OMB# 0920-0822), the National Crime Victimization Survey (NCVS, OMB# 1121-0111) conducted by the Bureau of Justice Statistics, and the National Survey of Children's Exposure to Violence (NatSCEV) funded in the past by the Office of Juvenile Justice and Delinquency Prevention. The NISVS only includes adults aged 18 and older (VACS targets youth ages 13-24). The NCVS includes U.S. households with occupants age 12 or older but only covers victimization, with no coverage of perpetration of violence. Furthermore, the NCVS survey questions are couched within a crime context (querying respondents about being a ‘crime’ victim), whereas the VACS will ask behaviorally specific, neutrally worded questions couched in a public health context, which is associated with much higher levels of disclosure. Within a crime context, youth may not disclose violence because they do not equate violence by people they trust with crime. The NatSCEV is not actively collecting data (the last archived NatSCEV dataset is from 2014) and involved three rounds of data collection: NatSCEV I (baseline), NatSCEV II, and NatSCEV III.
The NatSCEV was designed to obtain lifetime and one-year incidence estimates of a comprehensive range of childhood victimizations (perpetration is not covered in NatSCEV). The NatSCEV is collected via phone interviewing and does not use the more rigorous in-person data collection methods used in VACS. Data collection via phone tends to yield lower response rates than in-person collectionxxxviii and cannot ensure respondent privacy. Also, NatSCEV’s nationally representative sample includes only children and youth less than 18 years of age and does not use the more expansive definition of adolescence that includes young people up to age 24.
A.5. Impact on Small Businesses or Other Small Entities
No small businesses will be involved in this data collection.
A.6. Consequences of Collecting the Information Less Frequently
The primary consequence of collecting these data less frequently is that stakeholders would have less timely access to data on violence against children and youth and associated risk and protective factors. The VACS adapted for the U.S. context must be pilot tested to inform future efforts to collect representative data on violence against children and youth. The present planned pilot involves a one-time data collection. Because no comprehensive, ongoing, and representative data on violence against children exist in the U.S., reducing the frequency of data collection would greatly impact the CDC’s ability to conduct future VACS studies, gain a comprehensive understanding of violence against children at the state or local level, and thereby inform prevention efforts.
A.7. Special Circumstances Relating to the Guidelines of 5 CFR 1320.5
The request fully complies with the regulation 5 CFR 1320.5.
A.8. Comments in Response to the Federal Register Notice and Efforts to Consult Outside the Agency
A.8.a) Federal Register Notice
A 60-day Federal Register Notice was published in the Federal Register on July 28, 2020, Volume 85, Number 145, p. 45432 (Attachment B). CDC did not receive public comments.
A.8.b) Efforts to Consult Outside the Agency
CDC has engaged several federal and non-federal partners to learn about ongoing experiments being conducted in Federal surveys to improve response rates, to assess the feasibility of partnering to conduct mutually beneficial experiments, and to learn from methods being implemented by other Federal surveys. Since July 2017, CDC has consulted with or referred to publications and work from other Federal and non-Federal partners, including BJS, CDC–BRFSS, CDC–National Survey of Family Growth (NSFG), CDC–National Health Interview Survey (NHIS), National Highway Traffic Safety Administration (NHTSA), National Science Foundation (NSF), Census Bureau, National Center for Health Statistics (NCHS), American Association for Public Opinion Research (AAPOR), Office of Juvenile Justice and Delinquency Prevention’s redesign of the National Survey of Children’s Exposure to Violence, and Research Triangle Institute (RTI). The purpose of these consultations was to learn more about studies that are currently in the field or pending and that could have implications for adapting VACS for a U.S. context.
During 2017, CDC staff held a series of consultations with state health department grantees for NCIPC’s Rape Prevention and Education (RPE) program, the Domestic Violence Prevention Enhancements and Leadership Through Alliances (DELTA) Impact program, and the Essentials for Childhood program. Consultations occurred through virtual meetings with program grantees as well as site visits to two health departments: the Colorado Department of Public Health and Environment and the North Carolina Department of Health and Human Services. These two health departments were selected based on availability for a site visit. The program consultations and site visits were used to obtain feedback regarding health departments’ use of data to inform program priorities and prevention strategies. Data availability and data gaps were also discussed with health departments. Information from the consultations identified several data gaps and opportunities to leverage a U.S. adaptation of VACS to inform program priorities. One area of interest involved collecting comprehensive data from youth to inform primary prevention efforts. Another theme highlighted related to the need for comprehensive rather than topic-specific violence data. Health departments indicated interest in improving coordination across programs that target different forms of violence. However, no data system currently available collects data across violence types in a comprehensive way, creating barriers to integrating efforts to address shared risk and protective factors and maximizing efficiencies across programs. Finally, health departments indicated that data specific to their jurisdictions was key – jurisdiction-specific data is essential to ensure buy-in and inform programming. Feedback from stakeholders and program grantees indicated that a U.S. adaptation of the global VACS could provide useful information to address program priorities and improve data-driven violence prevention efforts.
In July 2017, Together for Girls and the CDC Foundation convened a group of experts to discuss methodological considerations for adapting the VACS for the U.S. Attendees included representatives from CDC, key partners of the global VACS, and experts in violence against children and youth from academic and community organizations. Attachment P includes a list of individuals who participated in the meeting and the final report and readout of the discussion. The panel meeting discussed methodology modifications that would be necessary for the U.S., key ethical considerations, and survey adaptations to capture relevant and comprehensive data on violence against children in the U.S.
Several consultations were also held with partner organizations in Baltimore and Garrett County to discuss implementation and procedures for piloting the adapted VACS in each jurisdiction. NORC and CDC held several preparatory meetings from September through November 2019 before holding an introductory workshop in Baltimore with representatives from the Baltimore City Health Department, Baltimore’s steering committee, NORC, and the CDC VACS team. Similar virtual meetings were held with the Garrett County Health Department and members of the steering committee. Following the introductory preparatory workshops on the violence against children survey, facilitated by the Baltimore City Health Department (BCHD) and Garrett County Health Department respectively, VACS Steering Committees were established in Baltimore and Garrett County. Baltimore was selected as the pilot site due to high rates of violence in the cityxxxix and the city’s strong interest in designing prevention programs to combat youth violence.xl BCHD serves as the coordinating agency for the project, serving as a model of local agency coordination for VACS in other U.S. locales. BCHD has deep experience in both youth violence data collection and prevention program implementation, as well as a long history of leading stakeholder coalitions to address local public health challenges.
Since September 2019, there has been regular contact between key partners participating in the pilot implementation of the adapted VACS pilot and CDC to discuss survey materials, human subjects protection planning, community entry protocols, survey sample size requirements (see Supporting Statement B, Sample Size Estimates) and logistics, and other critical topics in the preparation for the U.S. pilot implementation in Baltimore City and the rural feasibility pilot in Garrett County. Such ongoing dialogue has succeeded in adapting the questionnaire and survey procedures for the United States.
To inform plans for pre-test 1 cognitive interviewing, CDC consulted with NCHS and the Bureau of Justice Statistics (BJS) in July 2021. Subject matter experts provided insights into planning for cognitive interviewing, including with respect to data collection in the context of COVID-19. They provided recommendations about methodology and procedures for cognitive interviewing and the use of questions to probe comprehension and clarity of question wording.
CDC also consulted with experts in public health ethics, privacy and confidentiality in research, and mandatory reporting to inform plans for data collection. These consultations included the CDC Public Health Ethics office, the CDC Privacy Unit, the Office of General Counsel, the Baltimore City Health Department, the Maryland Department of Human Services, and the State Council on Child Abuse and Neglect in the State of Maryland. These discussions focused on the implications for Maryland mandatory reporting laws, responsibilities and procedures to maintain respondent safety, privacy, and confidentiality, and procedures to ensure access to resources and services for respondents. The discussions and feedback from experts endorsed the planned protocols and procedures regarding respondent safety, privacy, and confidentiality. Experts expressed support for 1) use of ACASI; 2) methodological components including use of split samples for males and females, graduated consent procedures, and only one respondent per household; 3) plans to not collect or document PII; 4) use of verbal consent and assent to eliminate the potential for inadvertent disclosure; and 5) use of a response plan to offer services to respondents. Notably, legal and public health ethics experts indicated that a distinction should be established between information that is provided to an individual interviewer – which would be covered by mandatory reporting laws in the State of Maryland – and information provided confidentially through technology in a manner that does not permit linkages of data to PII. In particular, public health ethics and privacy experts indicated that strategies to enable interviewers to view data provided by the respondent in the tablets through ACASI could undermine respondent privacy and confidentiality and may constitute a risk to respondents. 
Experts’ assessments indicated that the benefits of the project outweigh potential harms and that, given the degree of precautions built into the protocol, the risks that data collection poses to respondents are proportional to the potential public benefits of resources provided to respondents. Furthermore, experts and the Office of General Counsel indicated that any information disclosed directly to the interviewer would be subject to mandatory reporting laws and should be reported consistent with procedures established by the Maryland Department of Human Services.
A.9. Explanation of Any Payment or Gift to Respondents
Incentives are an effective way to decrease non-response bias in survey research. Nonresponse bias exists when respondents to a survey differ from those who did not respond (e.g., in terms of demographic or attitudinal variables); it occurs when the likelihood of participation in the survey is related to the specific survey measure of interest. The degree to which sampled respondents differ from the survey population (i.e., nonresponse bias) is central to evaluating the representativeness of a survey.xli Nonresponse bias is a problem because it can lead to incorrect estimates that are inflated or deflated. Associations between survey administration and the outcomes the survey is designed to measure will be assessed,xlii and a non-response bias analysis will be performed. If a nonresponse variable is correlated with the phenomenon of interest in the research, the results will be biased.xliii The team will use several approaches to address non-response bias:
Refusal aversion. The team’s first line of defense against non-response is refusal aversion. The project will hire experienced interviewers who are highly skilled at gaining cooperation and conducting interviews. During training we will prepare interviewers with general and project-specific techniques to successfully address respondent concerns. The team will also continuously monitor response rates and reasons for non-response to avoid non-response bias. The team has a great deal of experience in implementing the above approach to reduce non-response bias while accommodating respondents as much as possible through flexibility in scheduling appointments.
Statistical adjustments for non-response and missing data in Baltimore City. Despite the efforts listed above, some non-response is to be expected. Non-response can reduce the precision of study results and statistical power, and can bias results. To handle missing whole interviews – referred to as total or unit nonresponse – a weighting adjustment methodology will be used for data from Baltimore City. Analyses will use unweighted and weighted results to determine the impact of alternative approaches. Heckman’s two-step process will model non-participation based on data available for all cases (e.g., address data from the U.S. Postal Service file NORC uses for sampling, interviewer observations on the type of dwelling and block, census measures of the block’s socio-economic characteristics, and any data gleaned from the head-of-household questionnaire, all of which are available for both respondents and non-respondents). Post-stratification weights and adjustments will also be generated. There may be non-sampling error that affects the precision of results; for example, low response rates correlated with primary outcomes or subgroups of interest are problematic. Inverse probability selection weights will be computed, and analyses will adjust for nonresponse to derive the final sampling weights.
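The logic of a non-response weighting adjustment can be illustrated with a minimal sketch. This is not the study's actual weighting program; the cell definitions, data, and function name are hypothetical, and a production adjustment would use the Heckman and post-stratification steps described above. The core idea shown here is only the cell-based inverse-response-rate inflation: respondents in a cell are weighted up to represent non-respondents in the same cell, using frame variables observed for all sampled cases.

```python
# Illustrative sketch (hypothetical data and cell definitions): cell-based
# non-response weighting. Each respondent's base weight is inflated by the
# inverse of the weighted response rate in its adjustment cell, so that
# respondents' adjusted weights sum to the total sampled weight per cell.
from collections import defaultdict

def nonresponse_adjusted_weights(cases):
    """cases: list of dicts with keys 'cell', 'base_weight', 'responded'."""
    sampled = defaultdict(float)    # total base weight sampled, per cell
    responded = defaultdict(float)  # total base weight responding, per cell
    for c in cases:
        sampled[c["cell"]] += c["base_weight"]
        if c["responded"]:
            responded[c["cell"]] += c["base_weight"]
    # Adjustment factor = 1 / weighted response rate within the cell.
    adjust = {cell: sampled[cell] / responded[cell]
              for cell in sampled if responded[cell] > 0}
    return {i: c["base_weight"] * adjust[c["cell"]]
            for i, c in enumerate(cases) if c["responded"]}

cases = [
    {"cell": "single-family", "base_weight": 10.0, "responded": True},
    {"cell": "single-family", "base_weight": 10.0, "responded": False},
    {"cell": "multi-unit",    "base_weight": 10.0, "responded": True},
    {"cell": "multi-unit",    "base_weight": 10.0, "responded": True},
    {"cell": "multi-unit",    "base_weight": 10.0, "responded": False},
]
weights = nonresponse_adjusted_weights(cases)
# Single-family respondent carries 20.0 (response rate 1/2);
# each multi-unit respondent carries 15.0 (response rate 2/3).
print(sorted(round(w, 1) for w in weights.values()))
```

Note that the adjusted respondent weights sum to the total sampled base weight (50.0 here), which is the property that makes the adjustment compensate for unit non-response.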
Item-level missing data. Depending on the amount of missing values on individual variables and patterns of missingness, imputation methods may be used to handle missing data on specific variables. Various imputation-based procedures and methods, including nearest neighbor “hot deck” and multiple imputation, will be considered and evaluated for their usefulness in filling in missing values.
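For item-level missingness, the nearest-neighbor "hot deck" approach mentioned above can be sketched as follows. This is a minimal illustration with hypothetical variables (the function name, auxiliary variables, and distance metric are assumptions, not the study's specification): a record missing an item borrows the observed value from the donor record closest on fully observed auxiliary variables.

```python
# Illustrative sketch (hypothetical data): nearest-neighbor "hot deck"
# imputation. Records with a missing target item borrow the value from
# the donor record minimizing squared distance on auxiliary variables.
def hot_deck_impute(records, target, aux):
    donors = [r for r in records if r[target] is not None]

    def distance(a, b):
        # Squared Euclidean distance on the auxiliary variables.
        return sum((a[k] - b[k]) ** 2 for k in aux)

    for r in records:
        if r[target] is None:
            donor = min(donors, key=lambda d: distance(r, d))
            r[target] = donor[target]
    return records

records = [
    {"age": 15, "hh_size": 4, "score": 3},
    {"age": 22, "hh_size": 2, "score": 7},
    {"age": 16, "hh_size": 4, "score": None},  # missing item
]
hot_deck_impute(records, target="score", aux=["age", "hh_size"])
print(records[2]["score"])  # borrows from the closest donor (age 15, hh_size 4) -> 3
```

Multiple imputation, also under consideration, would instead draw several plausible values per missing item and combine analyses across the completed datasets; the single-donor hot deck above is the simpler of the two candidates.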
A review of the incentive literature found some mixed resultsxliv on the effects of incentives on reducing non-response bias in surveys. While much of the literature suggests that the use of incentives has a positive effect on respondent response rates, little research has examined the impact of their use specifically on nonresponse bias reduction among youth surveyed in-person in a household survey. Among the few youth studies conducted (either not in-person or not on violence) regarding the effect of incentives, results were mixed. One study on youth and nutrition found that incentives were a main reason for participatingxlv while another on youth and health care found that the incentive did not influence the decision to participate.xlvi Interestingly, one study with older university students (not in-person) on sexual assault found higher response rates with a $25 incentive compared to a $10 incentive, but no difference between $25 and $40.xlvii In this same survey, sexual victimization was higher in the $10 incentive group than the $25 incentive group, potentially suggesting that victims were more motivated to participate even when a lower incentive was offered. However, the higher prevalence of victimization among those receiving the $40 compared to the $25 incentive may suggest that victims are susceptible to higher incentive offers.xlviii
Overall, much research across the field of survey research (not specific to youth) suggests that a $10 incentive may improve survey response rates and reduce non-response bias,xlix and that a $20 incentive may increase response rates over a $10 incentive.l Research has also shown that a $1 pre-incentive increases response rates by 6 percentage points and also reduces non-response bias.li
The scientific literature on incentives demonstrates the need for further incentive experimentation with youth respondents in household surveys to inform the adapted VACS methodology. We do not have results from other experiments with youth that can be relied upon to address pilot study questions about the best way to reduce non-response bias, especially in the context of an urban environment experiencing elevated rates of violence.
Incentives for Cognitive Interviewing, Field Test, and Rural Feasibility Pilot
For the cognitive interviews (pre-test 1), respondents will receive compensation of $50, in the form of cash or a gift card provided by the interviewer at the end of the interview. A $50 incentive is needed because of the longer time required for a cognitive interview (1 to 1.5 hours) compared to completing the survey alone, and the heavier cognitive demand associated with participating in a cognitive interview, in which respondents must also report back on the meaning of the questions. We have also found that $50 for cognitive interviews is consistent with other studies.lii,liii
During the field test (pre-test 2) and the rural feasibility pilot in Garrett County, all respondents will receive a $20 incentive.
Pre-Incentives for Baltimore Full-Scale Pilot
CDC is planning to take the following approach to the use of incentives during the survey. First, respondents will receive a small incentive as a token of appreciation for their time. Households will be mailed a $1 bill as a pre-incentive to encourage participation in the screener component of the project. Because cash will be mailed, a lined envelope will be used so that the cash cannot be easily detected by anyone fraudulently handling the mail. For these small amounts, NORC has found cash combined with a lined envelope to be effective in its experience with pre-incentives. Pre-incentives, even in small amounts, have been shown to increase response rates.liv The mailed invitation will direct respondents to the web screener by including both a screener URL and a QR code for easily accessing the screener from a smartphone, tablet, or computer. Households will also have the option to call a toll-free number to complete the screener with an interviewer. Below, an incentive experiment with three conditions varying the dollar amount and timing of the incentive is outlined. All of these incentives will be distributed to the youth respondents in the form of a gift card by the field interviewer at completion of the interview in the field. Only the pre-incentive of $1 will be mailed.
Incentives for Respondent Questionnaire for Pilot 2, Baltimore Full-Scale Pilot and Garrett County Feasibility Pilot
Three incentive structures will be experimentally tested to determine the optimal amounts and structure for incentives for the VACS population (see Table 1): (1) a direct offer of $20 upon survey completion; (2) recruitment offering a $40 early bird incentive if the survey is completed within two weeks after screening in, otherwise a $20 post-incentive upon completion; and (3) an initial offer of $20 upon survey completion, with additional communication to non-responders later in the field period with an escalation offer of $40. All post-participation incentives will be distributed to youth respondents in the form of a gift card by the field interviewer at completion of the interview. While the literature mostly covers promised incentives, smaller pre-paid incentives are usually used only to get someone to consider study participation, as opposed to incentivizing the completion of a survey. The extant literature does not specifically address the potential effectiveness of varying dollar amounts and the timing of incentives with a youth population. Therefore, how differential incentives will work for non-responders is unknown. An incentive experiment is necessary because there are no results from other experiments that can be relied upon to address the VACS pilot study questions about the best way to reduce non-response bias with a youth population. In sum, an incentive experiment is necessary to determine the total dollar values and timing of incentive offers that will yield the greatest reductions in non-response bias, which can ultimately save resources in future VACS studies in the U.S. and produce the highest-quality data. The overall sample size of 1,020 complete surveys has enough power and includes enough cases to analyze the results of this experiment (see SSB section “Sample Size Estimates” for the power analysis).
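Randomized assignment of screened-in respondents to the three incentive conditions can be sketched as below. This is illustrative only: the arm labels, seed, and balanced round-robin allocation are assumptions for the example and are not drawn from the study protocol, which may use a different randomization scheme.

```python
# Illustrative sketch (hypothetical parameters): reproducible random
# assignment of respondents to three incentive experiment arms.
import random

ARMS = ["direct_$20", "early_bird_$40_else_$20", "escalate_$20_to_$40"]

def assign_arms(respondent_ids, seed=20211018):
    rng = random.Random(seed)  # fixed seed -> reproducible assignment
    ids = list(respondent_ids)
    rng.shuffle(ids)
    # Deal shuffled cases round-robin so arm sizes differ by at most one.
    return {rid: ARMS[i % len(ARMS)] for i, rid in enumerate(ids)}

assignment = assign_arms(range(1020))
counts = {arm: sum(1 for a in assignment.values() if a == arm) for arm in ARMS}
print(counts)  # each arm receives 340 of the 1,020 completes
```

With 1,020 completes divisible by three, each condition receives 340 cases, matching the per-group sample sizes planned for the early bird and escalation arms in Table 1.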
During Pre-test 1 (cognitive interviewing) and Pre-test 2 (field test) and the rural feasibility pilot in Garrett County, the incentives will not be tested. The cognitive interview participants will receive a $50 incentive (see above explanation for this higher amount). There will be a $20 incentive for the field test and rural feasibility pilot.
Table 1. Use of Incentives for the Domestic VACS Pilot Implementation Study
Incentive group by sample frame                                              Sample size per group*

Pre-paid incentive with letter to households
  $1 (Baltimore Full-Scale Pilot)                                            7,083

Direct offer of $50 upon completion of the cognitive interview
  $50 (Pre-test 1 Cognitive Interviews)                                      30

Direct offer of $20 upon completion
  $20 (Pre-test 2 Field Test, Baltimore City Full-Scale Pilot,
  Garrett County Rural Feasibility Pilot)                                    450

Early bird incentive
  $40 (Baltimore Full-Scale Pilot)                                           340

Follow-up $40 for non-respondents
  $40 (Baltimore Full-Scale Pilot)                                           340

* Number of eligible sampled persons who are offered the incentive. Estimate contingent on response rates assumed in the sample design.
Incentives for Head of Household Questionnaire
Given the short length of the head of household questionnaire (15 minutes), heads of households will not receive an incentive for participating in this survey, for any of the phases of the study (field test, rural feasibility pilot, full-scale pilot).
A.10. Protection of the Privacy and Confidentiality of Information Provided by Respondents
Privacy and Confidentiality
The CDC Office of the Chief Information Officer has determined that the Privacy Act does not apply. No system of records will be created under the Privacy Act. The Privacy Impact Assessment (PIA) for this evaluation is included as Attachment C. All persons working on the project will be required to protect confidentiality as stipulated in HHS regulations for the protection of human subjects in research (45 CFR 46), the Common Rule (45 CFR 46 Subpart A) and keep the study information private to the fullest extent allowable by law. All persons working on the project will be informed about and asked to maintain strict confidentiality about the nature of the study and will sign a confidentiality agreement prior to fieldwork, and interviewers will be trained on measures for preserving the confidentiality of respondents.
Several procedures will be used to maintain the privacy of the respondents. The interviewers for the cognitive interviews and survey will be instructed to identify a private space in consultation with the respondent and head of household that is safe and private within the home, unless it is determined that a private space outside the home is safer and more appropriate. The same guidance will apply for cognitive interviews conducted with an online platform. The interviewer will reschedule for another time if privacy cannot be ensured. Privacy and confidentiality during interviews will be further secured using ACASI software, which will allow respondents to answer sensitive questions via a tablet and headphones. With ACASI and headphones, the interviewers will not have to read violence-related questions aloud, reducing the risk of others in the household overhearing.
The survey design utilizes a split sample approach, such that the survey for females will be conducted in different segments/enumeration areas than the survey for males. This approach serves to protect the confidentiality of respondents by eliminating the chance that opposite sex perpetrators will be interviewed in the same community as victims. For instance, a male perpetrator of a sexual assault and the female who was the victim of his sexual assault in the same community would not both be interviewed. Similarly, the design reduces the chance that a female perpetrator and a male victim of sexual violence from the same community would both be interviewed.
Interviewers and field managers will be required to complete a multi-day training held prior to data collection. Training will cover sampling procedures, maintaining confidentiality, establishing a private space to conduct the interview, and responding to adverse reactions to the survey.
Respondent personally identifiable information (PII), including addresses and phone numbers, will be collected in order to contact households for survey completion. These data will be collected separately from survey data, so names, addresses, phone numbers, and e-mail addresses will never be associated or directly linked with the survey data. Questionnaire data and PII, including addresses, names, and maps, will always be kept separately and never transmitted to CDC. Any PII collected will be accessible only to authorized NORC data collection staff. A household identifier is assigned by NORC during fieldwork; it is known only by the survey team while in the field, is not connected to any person or address, and therefore cannot be linked to any individual or household. Since there is no personal identifier, respondents cannot be linked to the data once they have completed the interview. NORC will take several steps to ensure all data are protected and remain confidential. Specifically, upon receipt of tablets from CDC, NORC will password protect and encrypt the devices via AirWatch, a mobile management system that allows NORC IT system administrators to control device images and assign specific users. Tablets are further secured via Knox, which registers the device to the assigned interviewer and allows the device to be remotely locked down if lost or stolen. All data, including PII, are encrypted prior to being synced from tablets to NORC servers. Once data reside on NORC servers, highly secure internal network storage protocols are used to prevent data loss, corruption, and unauthorized breach, as well as to administer least-privilege, password-protected access rights.
All NORC system environments meet or exceed FISMA, HIPAA, and NIST 800-53 Revision 4 moderate-level framework compliance standards. Further, all remote access to internal NORC computing resources requires two-factor authentication and encrypted channels. Data access restriction is accomplished using unique case identifiers that allow the database to create a partition between response data and other data that could be used to identify an individual. These security measures ensure that PII is never connected to questionnaire data and will never be delivered to CDC. Once CDC has confirmed receipt of the data and considers the data final, NORC will delete all VACS-related data from all NORC systems.
Informed Consent
Data collection will use a graduated consent procedure. When initially introducing the study to household members, interviewers will describe it as a survey on the health, education, and life experiences of young people. The interviewer will first obtain consent (Attachments D and D.1) from the head of household to participate in a short survey (approximately 15 minutes) about the household (Attachment E). In households where the selected respondent is an adult (18-24 years old) or an emancipated minor under Maryland law, verbal informed consent will be obtained from the selected respondent. Respondents will be read an initial information form (Attachments F and F.1), which introduces the cognitive interview / survey as an opportunity to learn more about young people’s health, educational, and life experiences. This initial information form indicates that participation is completely voluntary and confidential. Once the initial information form is read, verbal consent to provide more information about the study will be obtained from each respondent. Once the interviewer and respondent have privacy, the trained interviewer will read the contents of a more detailed verbal consent form. This use of graduated consent is consistent with World Health Organization (WHO) ethical and safety guidelines for research on domestic violence. More specifically, WHO ethical and safety recommendations have been adapted for informed consent for participation in a survey that contains questions on domestic violence in such a way that safety issues are taken into consideration for both the respondent and the interviewer.lv
For selected eligible respondents under 18 years of age, verbal consent from the parent/primary caregiver of the youth will be obtained (Attachments D and D.1 for the survey and cognitive interviews, respectively) prior to seeking assent from the selected respondent. In households where there is an eligible dependent minor respondent who speaks English but whose parent only speaks Spanish, the parent/caregiver consent will be offered in Spanish. In the parent/primary caregiver consent, the cognitive interview / pilot survey is described as a study about the social welfare of young people in Maryland which includes health, educational, and life experiences, including community violence as part of a list of broad topics. This approach is consistent with HHS regulations for the protection of human subjects in research (available at https://www.hhs.gov/ohrp/regulations-and-policy/regulations/45-cfr-46/index.html), subpart D Additional Protections for Children Involved as Subjects in Research. Parents or primary caregivers are informed about sensitive topics included in the survey but – as indicated in HHS 45 CFR §46.116 of Subpart D – adequate provisions are made to protect potentially vulnerable respondents. These procedures were reviewed and approved by the NORC Institutional Review Board under provisions of HHS 45 CFR §46.116 of Subpart D. During the parent/primary caregiver consent process, the interviewer will inform the parent/primary caregiver that the survey is both voluntary and confidential, and about the incentive that will be given to the survey respondent.
Once permission has been obtained from the parent or caregiver, the interviewer will read an initial information form to the respondent, which introduces the cognitive interview / survey as an opportunity to learn more about young people’s health, educational, and life experiences (Attachments F and F.1 for the survey and cognitive interviews, respectively). This initial information form indicates that participation is completely voluntary and confidential. Respondents will be informed that the information they share is confidential and will not be shared with anyone. Informed verbal consent/assent will be obtained from each respondent at the end of the assent form.
Verbal consent and assent, instead of a signed form, will be obtained for several reasons. First, a signed form would be the only document linking a respondent to the study. Consequently, this study requested and received from the IRB a waiver of documentation of assent and informed consent for child, emancipated minor, and adult respondents to permit the use of verbal consent and assent. Additionally, this interviewer-administered oral consent protocol allows time for respondents to ask questions. After interviewers have documented that the respondent’s questions have been answered, the respondent will be asked to provide their verbal consent and assent if they agree to participate in the study. In consideration of the sensitive nature of the research and the fact that a written consent and assent would be the only documentation of PII, the request for the waiver of signed informed consent and assent aligns with efforts to ensure confidentiality and privacy protections. The waiver does this by avoiding any potential link between names and signatures on consent forms and respondent data.
Certain information reported during the survey administration may indicate a need for a referral to services. Therefore, several procedures will be followed to connect respondents with appropriate services while protecting them. Due to the sensitive nature of this study, a respondent may become upset during or after the administration of the survey. This will be addressed in a number of ways. First, for minor respondents, a statement has been included in the parent/caregiver consent indicating that some respondents might find certain questions to be sensitive; a similar statement has been included in the consent forms for respondents. Second, a detailed distressed respondent protocol (Attachment L) describes how interviewers will be trained to handle respondent distress. Further, the VACS interviewer training will include a module on managing respondent distress, with role-plays and demonstrations to ensure interviewers are fully prepared to address it.
A Response Plan, Mandatory Reporting, and Distress Protocol (Attachment L) has been developed based on guidance from global VACS implementations, input from experts, experience from other surveys in the U.S., and guidance from the Maryland Department of Human Services and the Baltimore City Health Department. All respondents will be offered a list of services in their area at the close of the interview. This list includes crisis hotlines, victim services, and mental health resources embedded in a broader range of services, such as reproductive health, food services, and tutoring and career centers, that can meet a wide range of respondent needs. The list is intentionally broad and wide-ranging both to cover a variety of needs and to maintain the privacy and confidentiality of respondents. All respondents will also be offered a referral to the Baltimore behavioral health service system. The referral will include interdisciplinary services and direct contact with an on-call social worker who has been trained to respond to requests for services and support from study referrals. If child abuse or neglect is suspected, the social worker can obtain the information necessary to meet mandatory reporting requirements, as a complement to the mandatory reporting procedures the interviewers will follow. Interviewers will also be trained to follow mandatory reporting requirements for the state of Maryland if a respondent provides information indicating that he or she has experienced child abuse or neglect, or if a minor is in immediate danger; specific information on mandatory reporting procedures is described in Attachment L. Interviewers will likewise be trained to follow a distress protocol if a respondent shows signs of emotional distress, with details provided in Attachment L.
A.11. Institutional Review Board (IRB) and Justification for Sensitive Questions
IRB Approval
The study protocol has been reviewed and approved by the NORC Institutional Review
Board (FWA00000142). A copy of the approval letter is provided (Attachment G).
Justification for Sensitive Questions
This study is covered by a Certificate of Confidentiality (CoC), which protects the privacy of research subjects by prohibiting disclosure of identifiable, sensitive research information to anyone not connected to the research, except when the subject consents or in a few other specific situations. CoCs are now automatically issued by HHS agencies to HHS-funded projects to protect identifiable research information from forced disclosure.
In order to conduct an effective study to better understand violence against children for prevention purposes, it is important to collect data that will yield useful results. Very few people, including children, report sexual, emotional, or physical violence to the police or other authorities; therefore, survey data provide the best source of information on its prevalence. In the U.S., there have been studies in which children as young as 10 years of age have been interviewed about sexual violence, and the data from these studies have been extremely effective in mobilizing key entities to take action to prevent violence against children.lvi As no comprehensive, representative data on violence against children exist in the U.S., this data collection to develop adapted VACS protocols for U.S. settings is necessary to improve research and surveillance on violence in the U.S. and to advance public health science in this critical area.
A.12. Estimates of Annualized Burden Hours and Costs
Estimates of Annualized Burden Hours
The total estimated respondent burden and costs for this 3-year collection are calculated below and illustrated in Table 2. The burden was derived using 9,363 as the expected number of households contacted. A 90% response rate, yielding 8,424 completed screeners, is estimated because screening involves answering a simple yes-or-no question about whether a person aged 13-24 years resides in the household; it is anticipated that most households where contact is made will be able to answer this question. A 25% eligibility rate yielding 2,106 consented households, a 90% consent completion rate yielding 1,896 completed head of household questionnaires, and 1,131 completed respondent questionnaires are expected over the 3-year project period. The household consent is a short process, and the questionnaires are brief, taking only a few minutes, so it is anticipated that many households will be able to complete them. These figures assume a response rate of 65% for females and 55% for males, based partly on NORC's experience conducting the General Social Survey, which achieved a 59.5% response rate in 2018.lvii The study will also include 60 respondents for Pre-test 2 to field test the field procedures with youth in the target age group. Therefore, the annualized numbers are 3,121 households contacted with an invitation letter, 2,808 completed screeners, 702 households consented, 632 completed head of household questionnaires, and 377 completed respondent questionnaires. The respondent questionnaire total of 1,131 includes 60 field test interviews, 50 rural feasibility pilot interviews, and 1,020 full implementation pilot interviews (1,131 / 3 years = 377 per year).
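The annualized figures above are simple division of the 3-year totals by the project period. A minimal sketch of that arithmetic (all figures taken from the paragraph above):

```python
# Annualize the 3-year burden totals over the project period.
three_year_totals = {
    "households contacted": 9363,
    "completed screeners": 8424,
    "households consented": 2106,
    "head of household questionnaires": 1896,
    "respondent questionnaires": 1131,  # field test + rural pilot + full pilot, as stated in the text
}
PROJECT_YEARS = 3
annualized = {name: round(total / PROJECT_YEARS)
              for name, total in three_year_totals.items()}
# annualized -> 3,121 / 2,808 / 702 / 632 / 377, matching the text
```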
Cognitive interviews will use different instruments and methods; their burden was therefore derived using 300 as the number of individuals expected to respond to the recruitment flyer and receive information about the study. Half (150) of the responses will come from participants ages 18-24, and half (150) from parents of youth ages 13-17, who cannot be screened or provide consent themselves. After receiving information, we anticipate a 50% response rate, for up to 150 individuals screened for participation; again, half (75) of those who complete the screener will be parents of youth ages 13-17 and half (75) will be youth ages 18-24. We estimate a 90% consent and completion rate, so 33 youth will be scheduled for interviews and receive the consent/assent; parents of youth ages 13-17 (half of all participants) will need to provide permission. We estimate a 95% consent and completion rate, for 30 youth completing cognitive interviews.
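The recruitment funnel described above, annualized over the 3-year period, produces the cognitive-interview row counts shown in Table 2. A minimal sketch (counts and rates from the paragraph above; the scheduled and completed counts are taken directly from the text rather than recomputed):

```python
# Cognitive interview recruitment funnel (3-year totals from the text).
flyer_responses = 300                     # half youth ages 18-24, half parents of youth 13-17
screened = round(flyer_responses * 0.50)  # 50% response rate -> 150 screened
scheduled = 33                            # after the estimated 90% consent/completion step
completed = 30                            # after the estimated 95% completion step

# Annualized over the 3-year project period (matches Table 2 rows):
per_year = {name: round(total / 3)
            for name, total in {"screened": screened,
                                "scheduled": scheduled,
                                "completed": completed}.items()}
```

Note that the annualized "screened" count of 50 is split evenly across the two participant groups (25 each) in Table 2.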
Table 2. Estimated Survey Annualized Burden Hours
Type of Respondents | Data Collection | Number of Respondents | Number of Responses per Respondent | Average Burden per Response (in hours) | Total Burden (in hours)
Youth ages 18-24 (Pre-test 1) | Cognitive interview response to flyer/telephone inquiry (Attachment R) | 50 | 1 | 5/60 | 4
| Cognitive interview screening (Attachment S) | 25 | 1 | 10/60 | 4
Parents of youth ages 13-17 (Pre-test 1) | Cognitive interview response to flyer/telephone inquiry (Attachment R) | 50 | 1 | 5/60 | 4
| Cognitive interview screening (Attachment S) | 25 | 1 | 10/60 | 4
| Informed parental consent (Attachment D.1) | 6 | 1 | 5/60 | 1
Youth ages 13-24 (Pre-test 1) | Cognitive interviewing consent/assent (Attachment F.1) | 11 | 1 | 5/60 | 1
| Cognitive Interview Protocol (Attachment Q) | 10 | 1 | 2 | 20
Head of Household (Pre-test 2, rural feasibility pilot in Garrett County, full implementation pilot in Baltimore City) | Invitation letter (Attachment H) | 3,121 | 1 | 2/60 | 104
| Screener Questionnaire (Attachment I) | 2,808 | 1 | 3/60 | 140
| Head of Household Consent (Attachment D) | 702 | 1 | 2/60 | 23
| Head of Household Questionnaire (Attachment E) | 632 | 1 | 15/60 | 158
Youth ages 13-24 in Baltimore City or Garrett County, Maryland (Pre-test 2, rural feasibility pilot, full implementation pilot) | Youth respondent consent/assent (Attachment F) | 632 | 1 | 3/60 | 32
| Youth Respondent Questionnaire (Attachment J) | 377 | 1 | 1 | 377
Total | | | | | 872
Estimates of Annualized Burden Cost
The annual burden cost will be $25,670 (Table 3), and the total cost for the three-year project will be $77,010. The estimated costs to respondents are based on Bureau of Labor Statistics (BLS) data; the mean hourly wage for all occupations in Baltimore is $28.49.
The estimates of individual annualized costs are based on the number of respondents interviewed and the amount of time required from individuals who were reached and completed the cognitive interview inquiry, screening for cognitive interviews, consent for cognitive interviews, the one-time screener, the head of household questionnaire, and the core questionnaire. The screener will take up to 3 minutes to determine whether a household is eligible. For those who agree to participate, the head of household survey will take up to 15 minutes to complete, including screening and verbal informed consent. For those who agree to participate in the core survey, it will take up to 60 minutes, including verbal informed consent.
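Each row's cost in Table 3 is its total burden hours multiplied by the BLS mean hourly wage. A minimal sketch using the burden-hour figures from Table 3 (per-row rounding is assumed, so the computed sum can differ from the published totals by a dollar or two):

```python
# Annualized respondent cost: burden hours per activity (Table 3)
# times the BLS mean hourly wage for all occupations in Baltimore.
WAGE = 28.49
burden_hours = [
    104, 4, 4, 1, 165, 23, 158,  # head of household / parent activities
    4, 4, 4, 1, 32, 20, 377,     # respondent (ages 13-24) activities
]
annual_cost = sum(round(hours * WAGE) for hours in burden_hours)
three_year_cost = annual_cost * 3
# Published figures: $25,670 per year and $77,010 over three years.
```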
Table 3. Estimated Annualized Burden Costs
Type of Respondent | Data Collection | Number of Respondents | Number of Responses per Respondent | Total Burden (in hours) | Average Hourly Wage Rate (in dollars) | Total Respondent Cost
Head of Household / Parent | Invitation Letter | 3,121 | 1 | 104 | $28.49 | $2,963
| Cognitive interview response to flyer and telephone inquiry | 50 | 1 | 4 | $28.49 | $114
| Screening for cognitive interviews | 25 | 1 | 4 | $28.49 | $114
| Consent for Cognitive Interview | 6 | 1 | 1 | $28.49 | $28
| Screener for Questionnaire | 2,808 | 1 | 165 | $28.49 | $4,701
| Consent for Questionnaire | 702 | 1 | 23 | $28.49 | $656
| Questionnaire | 632 | 1 | 158 | $28.49 | $4,501
Respondent ages 13-24 | Screening for cognitive interviews | 25 | 1 | 4 | $28.49 | $114
| Cognitive interview response to flyer and telephone inquiry | 50 | 1 | 4 | $28.49 | $114
| Screening for cognitive interviews | 25 | 1 | 4 | $28.49 | $114
| Consent/Assent for Cognitive Interview | 11 | 1 | 1 | $28.49 | $28
| Consent/Assent for Questionnaire | 632 | 1 | 32 | $28.49 | $912
| Cognitive Interview | 10 | 1 | 20 | $28.49 | $570
| Questionnaire | 377 | 1 | 377 | $28.49 | $10,741
Total Annualized Burden Cost | | | | | | $25,670
A.13. Estimates of Other Total Annual Cost Burden to Respondents or Record Keepers
There will be no direct costs to respondents other than their time to participate in the data collection. CDC does not anticipate providing capital, start-up, or other related costs to private entities.
A.14. Annualized Cost to the Government
The contract to conduct the study was awarded to NORC at the University of Chicago through competitive bid in September 2019. The total cost for the data collection is $5,882,602, including $2,501,446 in contractor costs and $3,381,156 in costs incurred directly by the federal government. These total costs are annualized in Table 4.
Costs for this study include personnel for designing the study, developing, programming, and testing the survey instrument; drawing the sample; training the recruiters/interviewers; collecting and analyzing the data; and reporting the study results. The government costs include personnel costs for federal staff involved in the oversight, study design, and analysis, as presented in detail in Table 4.
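The annualized figures in Table 4 follow from dividing the contractor and federal cost totals stated above by the three-year project period. A minimal sketch of that arithmetic:

```python
# Total data collection cost (from Section A.14), annualized over 3 years.
contractor_total = 2_501_446   # contract labor
federal_total = 3_381_156      # costs incurred directly by the federal government
total = contractor_total + federal_total        # $5,882,602

contract_annual = round(contractor_total / 3)   # Table 4 contract labor row
federal_annual = round(federal_total / 3)       # Table 4 CDC personnel row
annual_total = contract_annual + federal_annual # Table 4 total row
```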
Table 4. Estimated Annualized Cost to the Government
Labor | Cost
CDC personnel for project oversight (15% GS-13 scientist) | $1,127,052
Contract labor for planning and design, development of study protocols, recruitment of respondents, data collection, data preparation, data analysis, report writing, and dissemination of findings | $833,815
Total estimated annualized government costs | $1,960,867
A.15. Explanation for Program Changes or Adjustments
This is a new information collection.
A.16. Plans for Tabulation and Publication, and Project Time Schedule
The schedule for data collection, analysis, and reporting is shown in Table 5 below. Data from each phase of data collection will be stored in password-protected files. Results and findings related to the adaptation and implementation of VACS in a U.S. context will be summarized in a final report, which will include findings from the field procedures along with information and recommendations for future data collection on violence against children and youth. This research study is for development purposes and is therefore descriptive; data tabulations will be used to evaluate the results of questionnaire and methods testing. The information collected in this effort will not be the subject of population estimates or other statistics in CDC reports; results may be published only in the context of methodological research and associations among study variables, not as generalizable estimates of population parameters.
Table 5. Project Time Schedule
Activities | Time Schedule
Pre-test 1 cognitive interviewing | Within 2 months of OMB approval
Enumeration area selection for full implementation pilot | Within 2 months of OMB approval
Survey field pre-test 2 | Within 2 months of OMB approval
Data collection in Baltimore | Starting May 2022
Data collection in Garrett County | Starting May 2022
Clean, edit, and analyze dataset | Within 1-6 months of data collection
Complete report documenting results and methodology recommendations related to implementation of VACS in the U.S. | Within 12 months of data collection
A.17. Reason(s) Display of OMB Expiration Date is Inappropriate
The display of the OMB expiration date is not inappropriate.
A.18. Exceptions to Certification for Paperwork Reduction Act Submissions
There are no exceptions to the certification.
References: Endnotes
i Centers for Disease Control and Prevention. Violence against Children Surveys: Towards a Violence-Free Generation. https://www.cdc.gov/violenceprevention/childabuseandneglect/vacs/ (Retrieved February 29, 2020). 2017.
ii Hillis, S., Mercy, J., Amobi, A., & Kress, H. (2016). Global prevalence of past-year violence against children: a systematic review and minimum estimates. Pediatrics, 137(3), e20154079.
iii Finkelhor, D., Turner, HA., Shattuck, A., Hamby, SL. Prevalence of childhood exposure to violence, crime, and abuse: Results from the national survey of children’s exposure to violence. JAMA pediatrics. 2015;169(8):746-754.
iv U.S. Census Bureau. An Aging Nation: Projected Number of Children and Older Adults. 2018. Retrieved from: https://www.census.gov/library/visualizations/2018/comm/historic-first.html
v Centers for Disease Control and Prevention. Child Abuse and Neglect Prevention. https://www.cdc.gov/violenceprevention/childabuseandneglect/index.html. (Retrieved July 24, 2018). 2016.
vi David-Ferdon C, Vivolo-Kantor AM, Dahlberg LL, Marshall KJ, Rainford N, Hall JE. A comprehensive technical package for the prevention of youth violence and associated risk behaviors. Atlanta, GA: National Center for Injury Prevention and Control, Centers for Disease Control and Prevention. 2016.
vii Centers for Disease Control and Prevention. Web-based Injury Statistics Query and Reporting System (WISQARS). Retrieved July 24, 2018 from http://www.cdc.gov/injury/wisqars/. 2016.
viii Centers for Disease Control and Prevention. Youth Risk Behavior Surveillance-United States, 2017. Morbidity and Mortality Weekly Report. 2018:67(8). https://www.cdc.gov/healthyyouth/data/yrbs/pdf/2017/ss6708.pdf
ix United Nations' Secretary General Study on Violence Against Children. Available from: http://www.unicef.org/violencestudy/presskits/2%20Study%20findings_Press%20kit%20EN.pdf
x Felitti VJ, Anda RF, Nordenberg D, et al. Relationship of childhood abuse and household dysfunction to many of the leading causes of death in adults: The Adverse Childhood Experiences (ACE) Study. American journal of preventive medicine. 1998;14(4):245-258.
Danese A, Moffitt TE, Harrington H, et al. Adverse childhood experiences and adult risk factors for age-related disease: depression, inflammation, and clustering of metabolic risk markers. Archives of pediatrics & adolescent medicine. 2009;163(12):1135-1143.
Gilbert LK, Breiding MJ, Merrick MT, et al. Childhood adversity and adult chronic disease: an update from ten states and the District of Columbia, 2010. American journal of preventive medicine. 2015;48(3):345-349.
xi Silverman AB, Reinherz HZ, Giaconia RM. The long-term sequelae of child and adolescent abuse: A longitudinal community study. Child abuse & neglect. 1996;20(8):709-723.
Perry BD. The neurodevelopmental impact of violence in childhood. In: Schetky D, Benedek E, eds. Textbook of child and adolescent forensic psychiatry. Washington (DC): American Psychiatric Press; 2001. p. 221-238.
xii Spatz Widom C, Marmorstein NR, Raskin White H. Childhood victimization and illicit drug use in middle adulthood. Psychology of Addictive Behaviors. 2006;20(4):394-403.
xiii National Research Council and Institute of Medicine. From neurons to neighborhoods: the science of early childhood development. In: Committee on Integrating the Science of Early Childhood Development, Board on Children, Youth, and Families, Commission on Behavioral and Social Sciences and Education. Washington, DC: National Academy Press; 2000.
xiv Felitti, V., et al., The relationship of adult health status to childhood abuse and household dysfunction. American Journal of Preventative Medicine, 1998. 14: p. 245-58.
xv Kendall-Tackett, K., Treating the lifetime health effects of childhood victimization, 2003, Civic Research Institute, Inc: Kingston.
xvi Hurt H, Malmud E, Brodsky NL, Giannetta J. Exposure to violence: Psychological and academic correlates in child witnesses. Archives of pediatrics & adolescent medicine. 2001;155(12):1351-1356.
xvii Bowen NK, Bowen GL. Effects of Crime and Violence in Neighborhoods and Schools on the School Behavior and Performance of Adolescents. Journal of Adolescent Research. 1999;14(3):319-342.
xviii Peek-Asa C, Maxwell L, Stromquist A, Whitten P, Limbos MA, Merchant J. Does Parental Physical Violence Reduce Children's Standardized Test Score Performance? Annals of Epidemiology. 2007;17(11):847-853.
xix Borowsky IW, Ireland M, Resnick MD. Violence Risk and Protective Factors Among Youth Held Back in School. Ambulatory Pediatrics. 2002;2(6):475-484.
xx Ramirez M, Wu Y, Kataoka S, et al. Youth Violence across Multiple Dimensions: A Study of Violence, Absenteeism, and Suspensions among Middle School Children. The Journal of pediatrics. 2012;161(3):542-546.e542.
xxi Macmillan R, Hagan J. Violence in the Transition to Adulthood: Adolescent Victimization, Education, and Socioeconomic Attainment in Later Life. Journal of Research on Adolescence. 2004;14(2):127-158.
xxii Ramirez M, Wu Y, Kataoka S, et al. Youth Violence across Multiple Dimensions: A Study of Violence, Absenteeism, and Suspensions among Middle School Children. The Journal of pediatrics. 2012;161(3):542-546.e542.
xxiii Peguero AA. Violence, Schools, and Dropping Out: Racial and Ethnic Disparities in the Educational Consequence of Student Victimization. Journal of Interpersonal Violence. 2011;26(18):3753-3772.
xxiv Staff J, Kreager DA. Too Cool for School? Violence, Peer Status and High School Dropout. Social Forces. 2008;87(1):445-471.
xxv Begle AM, Hanson RF, Danielson CK, et al. Longitudinal pathways of victimization, substance use, and delinquency: Findings from the National Survey of Adolescents. Addictive Behaviors. 2011;36(7):682-689.
xxvi Cullen FT, Unnever JD, Hartman JL, Turner MG, Agnew R. Gender, Bullying Victimization, and Juvenile Delinquency: A Test of General Strain Theory. Victims & Offenders. 2008;3(4):346-364.
xxvii Nansel TR, Overpeck MD, Haynie DL, Ruan W, Scheidt PC. Relationships between bullying and violence among us youth. Archives of pediatrics & adolescent medicine. 2003;157(4):348-353.
xxviii Carbone-Lopez K, Esbensen F-A, Brick BT. Correlates and Consequences of Peer Victimization: Gender Differences in Direct and Indirect Forms of Bullying. Youth Violence and Juvenile Justice. 2010;8(4):332-350.
xxix Gomez AM. Testing the Cycle of Violence Hypothesis: Child Abuse and Adolescent Dating Violence as Predictors of Intimate Partner Violence in Young Adulthood. Youth & Society. 2010;43(1):171-192.
xxx Haynie DL, Farhat T, Brooks-Russell A, Wang J, Barbieri B, Iannotti RJ. Dating violence perpetration and victimization among US adolescents: prevalence, patterns, and associations with health complaints and substance use. Journal of Adolescent Health. 2013;53(2):194-201.
xxxi Barter C, Stanley N. Inter-personal violence and abuse in adolescent intimate relationships: mental health impact and implications for practice. International Review of Psychiatry. 2016;28(5):485-503.
xxxii Orpinas P, Nahapetyan L, Truszczynski N. Low and Increasing Trajectories of Perpetration of Physical Dating Violence: 7-Year Associations with Suicidal Ideation, Weapons, and Substance Use. Journal of Youth and Adolescence. 2017;46(5):970-981.
xxxiii Taylor B, Mumford E. A National Descriptive Portrait of Adolescent Relationship Abuse: Results From the National Survey on Teen Relationships and Intimate Violence. Journal of Interpersonal Violence. 2016;31(6):963-988.
xxxiv Centers for Disease Control and Prevention. Risk and Protective Factors. https://www.cdc.gov/violenceprevention/youthviolence/riskprotectivefactors.html. 2017
xxxv Centers for Disease Control and Prevention. Youth Violence Resources: Data Sources. Available from: https://www.cdc.gov/violenceprevention/youthviolence/resources.html
xxxvi Finkelhor, D., Turner, HA., Shattuck, A., Hamby, SL. Prevalence of childhood exposure to violence, crime, and abuse: Results from the national survey of children’s exposure to violence. JAMA pediatrics. 2015;169(8):746-754.
xxxvii Keeter, S., Hatley, N., Kennedy, C., & Lau, A. (2017). What low response rates mean for telephone surveys. Pew Research Center, 15, 1-39.
xxxviii Keeter, S., Hatley, N., Kennedy, C., & Lau, A. (2017). What low response rates mean for telephone surveys. Pew Research Center, 15, 1-39.
xxxix U.S. Department of Justice, Federal Bureau of Investigation. Maryland Offenses known to Law Enforcement, 2017. Available from: https://ucr.fbi.gov/crime-in-the-u.s/2017/crime-in-the-u.s.-2017/tables/table-8/table-8-state-cuts/maryland.xls.
xl Baltimore City Health Department. Office of Youth and Trauma Services. Accessible from: https://health.baltimorecity.gov/programs/violence-prevention.
xli Johnson TP, Wislar JS. Response Rates and Nonresponse Errors in Surveys. JAMA. 2012;307(17):1805–1806. doi:10.1001/jama.2012.3532
xlii Johnson TP, Wislar JS. Response Rates and Nonresponse Errors in Surveys. JAMA. 2012;307(17):1805–1806. doi:10.1001/jama.2012.3532.
xliii Johnson TP, Wislar JS. Response Rates and Nonresponse Errors in Surveys. JAMA. 2012;307(17):1805–1806. doi:10.1001/jama.2012.3532
xliv Mercer A, Caporaso A, Cantor D, and Townsend R. (2015). How much gets you how much? Monetary incentives and response rates in household surveys. Public Opinion Quarterly, 79(1), 105–129.
xlv Kafka, T., Economos, C., Folta, S., and Sacheck, J. (2011). Children as Subjects in Nutrition Research: A Retrospective Look at Their Perceptions. Journal of Nutrition Education and Behavior. 43(2):103-109. https://doi.org/10.1016/j.jneb.2010.03.002
xlvi Smith, K.A., Macias, K., Bui, K., and Betz, C. L. (2015). Brief Report: Adolescents' Reasons for Participating in a Health Care Transition Intervention Study. Journal of Pediatric Nursing. 30(5):165-171. https://doi.org/10.1016/j.pedn.2015.05.007
xlvii Krebs, C.P., Lindquist, C.H., Richards, A., Shook-Sa, B.E., Berzofsky, M., Peterson, K., Planty, M., Langton, L., & Stroop, J. (2016). The Impact of Survey Incentive Amounts on Response Rates and Estimates of Sexual Assault Victimization.
xlviii Krebs, C.P., Lindquist, C.H., Richards, A., Shook-Sa, B.E., Berzofsky, M., Peterson, K., Planty, M., Langton, L., & Stroop, J. (2016). The Impact of Survey Incentive Amounts on Response Rates and Estimates of Sexual Assault Victimization.
xlix Stern, MJ, Bilgen I, Wolter KM. 2014. Do Sequence and Mode of Contact Impact Response Rates for Web Only Surveys? Presented at the 69th Annual Conference of the American Association for Public Opinion Research (AAPOR) in Anaheim, CA.
l Lewis D, Creighton K. 2005. The Use of Monetary Incentives in the Survey of Income and Program Participation. The American Association for Public Opinion Research (AAPOR) 60th Annual Conference.
li Mercer A, Caporaso A, Cantor D, and Townsend R. (2015). How much gets you how much? Monetary incentives and response rates in household surveys. Public Opinion Quarterly, 79(1), 105–129.
lii Centers for Disease Control and Prevention. 2019. The National Intimate Partner and Sexual Violence Survey (NISVS) Cognitive Testing Plan.
liii Levin, K., Willis, G. B., Forsyth, B. H., Norberg, A., Kudela, M. S., Stark, D., & Thompson, F. E. (2009). Using Cognitive Interviews to Evaluate the Spanish-Language Translation of Dietary Questionnaire. Survey Research Methods, 3(1), 13-25.
liv McGonagle KA, Freedman VA. The Effects of a Delayed Incentive on Response Rates, Response Mode, Data Quality, and Sample Bias in a Nationally Representative Mixed Mode Study. Field methods. 2017;29(3):221‐237. doi:10.1177/1525822X16671701
lv World Health Organization, Putting Women First: Ethical and Safety Recommendations for Research on Domestic Violence Against Women, 2001, Department of Gender and Women's Health, World Health Organization: Geneva, Switzerland.
lvi Griffin, M.G., et al., Participation in trauma research: is there evidence of harm? J Trauma Stress, 2003. 16(3): p. 221-7.
lvii Morgan, S. L. 2020. Response Rates and Representativeness: A Benchmark Comparison of the General Social Surveys to the American Community Surveys, 2012-2018. GSS Methodological Report No. 131.