Pilot Implementation of the Violence Against Children and Youth Survey (VACS) in the US

OMB: 0920-1356


SUPPORTING STATEMENT: PART B






OMB# 0920-1356



Pilot Implementation of the Violence Against Children and Youth Survey (VACS) in the United States





December 14, 2020









Point of Contact:

Howard Kress, PhD

Implementation Team Lead

Field Epidemiology and Prevention Branch

Contact Information:

Centers for Disease Control and Prevention

National Center for Injury Prevention and Control

4770 Buford Highway NE

Atlanta, GA 30341-3724

phone: 770.488.1285

email: [email protected]



CONTENTS


B. COLLECTIONS OF INFORMATION EMPLOYING STATISTICAL METHODS

B.1. Respondent Universe and Sampling Methods

B.2. Procedures for the Collection of Information

B.3. Methods to Maximize Response Rates and Deal with Nonresponse

B.4. Tests of Procedures or Methods to be Undertaken

B.5. Individuals Consulted on Statistical Aspects and Individuals Collecting and/or Analyzing Data































B. COLLECTIONS OF INFORMATION EMPLOYING STATISTICAL METHODS


This is a request to pilot test the Violence Against Children and Youth Surveys (VACS) in the United States. The VACS have been implemented in 24 countries globally, but never in the United States. This request is for a pilot test of the VACS methodology and questionnaire after adaptation for the United States context.


A contract to adapt the VACS for the United States context was awarded in late September 2019 to NORC at the University of Chicago, which will assist CDC and the local health departments with the design and collection of VACS data. The survey instruments have been developed and adapted from the international VACS to include a new data collection mode (i.e., Audio Computer-Assisted Self-Interviewing software). Two separate pre-tests will be completed prior to the launch of the main pilot study data collection: Pre-test 1 (n=9, below the OMB threshold for data collection) will take place several months prior, allowing time to detect and fix any problems with the instrumentation, and Pre-test 2 (n=60) will take place after OMB approval to test the finalized instrument. The CDC VACS study team will submit a change request to OMB if the questionnaire needs to be changed after either pre-test. Data collection for the pilot implementation is set to occur October 2021-March 2022, presuming that protective policies and guidance related to COVID-19 allow for in-person data collection.


PILOT TEST


The proposed study will pilot test the adaptation of the VACS for use in a domestic context, using a representative sample of youth in urban Baltimore and a convenience sample of youth in rural Garrett County, Maryland. Data collection will consist of household surveys, collected through a combination of interviewer-administered and Audio Computer-Assisted Self-Interviewing (ACASI) software formats.


The pilot test has the following objectives:

  1. Adapt the VACS methodology for implementation in the US. The pilot VACS implementation will use a representative sample of youth ages 13-24 and conduct in-person data collection using in-person household survey methods.

  2. Pilot the adapted VACS methodology in Baltimore City, yielding results that are representative of the Baltimore City youth population, and collect pilot data in a rural county in Maryland.

  3. Test minor variations in the methodology to determine which yield the best response rates.

  4. Estimate the prevalence of physical, emotional and sexual violence perpetrated against males and females in childhood as well as in the past 12 months.

  5. Identify risk and protective factors for physical, emotional and sexual violence against children to inform stakeholders and guide prevention efforts.

  6. Identify the health and social consequences associated with violence against children.

  7. Assess the knowledge and use of medical, psychosocial, legal, and protective services available for children who have experienced sexual, emotional and physical violence.

  8. Identify areas for further research.

  9. Make recommendations to relevant organizations in Baltimore (and similar cities in the US) on developing, improving and enhancing prevention and response strategies to address violence against children as part of a larger, comprehensive, multi-sectoral approach to child protection.

B.1. Respondent Universe and Sampling Methods


Target Population

The target population of the Pilot Implementation of VACS is English-speaking male and female youth ages 13-24 in Baltimore and Garrett County households. Analysis can yield estimates of childhood violence as well as lifetime and past-12-month experiences for all participants. Exclusions include: 1) youth who do not speak English; 2) youth ages 13-17 whose parent(s)/guardian(s) do not speak English and therefore cannot provide consent; 3) males and females with mental disabilities who do not have the capacity to understand the questions being asked; and 4) those with physical disabilities that would prevent them from participating in the ACASI modules (e.g., vision and fine motor skill impairments).

Those who do not speak English will be excluded because the instrument and survey materials are limited to English. There are too few non-English speakers in Baltimore to adequately represent such individuals in the VACS Baltimore data collection. According to American Community Survey data, only 4.1% of households in Baltimore speak Spanish, and of those households, 90% also speak English.i While we acknowledge that residents of Baltimore speak languages other than English, these data suggest that this study would not be an effective way to test a Spanish version of the survey, given the representative sampling strategy and low numbers of Spanish-speaking households.

We acknowledge that people living with disabilities may be at even greater risk of violence than the general population and that this is an important epidemiologic question. However, since this survey is not designed to produce statistically stable estimates of violence in this sub-population, we believe this issue would be best addressed in a separate study. We would recommend this as a follow-up study with highly trained interviewers, including those who can sign.


Baltimore Sampling Frame

The sampling frame for the representative sample of youth in Baltimore will include Primary Sampling Units (PSUs) comprised of census block groups, originally compiled by the U.S. Census Bureau for the 2010 Census. Census block groups are redefined every 10 years based on the decennial census, plus time for processing of the data; therefore, the 2010 Census data will likely be the most recent available when the VACS sample in Baltimore is drawn. However, updated population variables will be used to map to the census block groups. Each PSU will consist of a single census block group or a combination of a small number of census block groups that meets a minimum criterion, reaching a specific measure-of-size threshold. Our criteria to create PSUs with sufficient measure of size (MOS) are the following:

  • The MOS for sampling will be the number of Housing Units (HUs) with at least one person age 13-24. We will use the 5-year combined American Community Survey to estimate the number of housing units meeting this criterion, most likely from the 2014-2018 5-year data file.

  • We will use a minimum threshold of 1,000 HUs for each PSU. For census block groups that do not meet that threshold, we will combine adjacent block groups until the minimum is reached.

  • In addition to the MOS criteria, we will also review each PSU to ensure our address sampling frame has a sufficient number of addresses to select from (about 500 addresses).

  • The sample size estimate and sampling strategy apply to the main VACS implementation with the representative sample of youth in Baltimore discussed above. The convenience sample of households in Garrett County and the 60 pre-test cases are not included in the sampling calculations, although they were necessary to include in the SSA burden table and corresponding explanation, which is why the sample size looks different in this section of SSB and section A.12 in the SSA.
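The block-group combination rule described above can be sketched as a simple greedy pass. This is an illustrative simplification, not the production frame-building code: adjacency is approximated by list order, and the HU counts are hypothetical.

```python
def build_psus(block_groups, min_mos=1_000):
    """Greedily combine adjacent census block groups into PSUs until each
    reaches the minimum measure of size (HUs with an eligible person).
    `block_groups` is an ordered list of (id, eligible_HU_count) pairs;
    adjacency is approximated by list order (a simplification)."""
    psus, current, mos = [], [], 0
    for bg_id, hu in block_groups:
        current.append(bg_id)
        mos += hu
        if mos >= min_mos:
            psus.append((current, mos))
            current, mos = [], 0
    if current:  # fold any leftover block groups into the last PSU formed
        ids, last_mos = psus[-1]
        psus[-1] = (ids + current, last_mos + mos)
    return psus

# Hypothetical block groups with their eligible-HU counts
print(build_psus([("A", 700), ("B", 400), ("C", 1200), ("D", 300)]))
```

Each resulting PSU meets the 1,000-HU threshold; the final leftover group is absorbed into the last PSU rather than forming an undersized unit.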

Baltimore Sampling Strategy

The locally representative household survey of females and males 13-24 years of age in Baltimore will be stratified by biological sex and crime levels. We anticipate having approximately 510 completed female interviews and 510 completed male interviews based on the required sample size and consideration of response rates described below.

The sampling strategy involves selecting PSUs from the sampling frame using probability proportional to size (PPS) sampling. The PSUs will be selected in two steps:

    • Divide PSUs into low- and high-crime areas (counting UCR Part I crimes):

      1. We will use Baltimore crime statistics and map them to each PSU.

      2. We will assign PSUs to low- or high-crime strata based on the number of crimes per 1,000 people, using FBI Uniform Crime Report (UCR) data.

    • Select the 100 PSUs across the two strata using PPS sampling:

      1. The number of PSUs to be selected within each stratum will depend on the population distribution between low- and high-crime areas. For example, if 70% of the population is in high-crime areas, then we would select 70% of our 100 PSUs from the high-crime stratum.

      2. For PPS, we will randomly select the PSUs within each stratum based on the MOS, i.e., the number of HUs with at least one 13-24 year old.


After the PSUs are selected, a fixed number of 85 households will be selected within each PSU by equal-probability systematic sampling. The selection of 85 households accounts for resolution of the housing unit, screening and listing of households, and interview response rates. Lastly, one eligible participant (female or male) will be randomly selected from the list of all eligible participants. As part of the global methodology, VACS recommends that the master sampling frame generally contain at least 1,000 PSUs, and NORC will work towards meeting this requirement.ii
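The PPS step can be illustrated with systematic PPS sampling over the cumulated measure of size. This is a simplified sketch with hypothetical PSU sizes, not the production sampling code.

```python
import random

def pps_systematic_sample(mos, k, seed=0):
    """Select k PSU indices with probability proportional to size,
    using systematic sampling over the cumulated MOS values."""
    rng = random.Random(seed)
    total = sum(mos)
    interval = total / k
    start = rng.uniform(0, interval)  # random start in the first interval
    picks, cum, idx = [], 0.0, 0
    for target in (start + i * interval for i in range(k)):
        # advance until the target falls inside the current PSU's MOS span
        while cum + mos[idx] <= target:
            cum += mos[idx]
            idx += 1
        picks.append(idx)
    return picks

# Hypothetical stratum of PSUs, each with >= 1,000 eligible HUs
stratum_mos = [1000, 1500, 2500, 1200, 1800, 3000]
print(pps_systematic_sample(stratum_mos, k=3))
```

Larger PSUs span more of the cumulated range and are therefore more likely to contain one of the evenly spaced selection points, which is exactly the probability-proportional-to-size property.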


In sum, VACS uses a multi-stage sample design for youth in Baltimore:


Stage 1–A total of 100 PSUs will be selected using PPS sampling, based on the number of households in the PSU with an eligible person ages 13-24. These PSUs will be spread proportionally across two strata: low-crime and high-crime. Within each stratum, 46% of the PSUs will be assigned as Female and the remaining 54% as Male. Based on the research methodology literature, we will select a higher percentage of male PSUs to allow for the lower participation rates generally found among males compared to females. For the selected PSUs, we will use an address-based frame, with addresses geo-coded to the correct PSUs.


Stage 2–Within each PSU, we will select a sample of 85 households by equal-probability systematic sampling. The selection of 85 households accounts for resolution of the housing unit, screening and listing of households, and the interview response rate.


Stage 3–One eligible participant will be randomly selected from all eligible females (or males, depending on the PSU assignment) in each household. If there is only one eligible youth participant in the household, that young person is selected to complete a survey interview. If there is more than one eligible respondent in the household, then one youth is selected randomly from among all the eligible youth respondents.
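Stages 2 and 3 can be sketched as follows. The address list and household roster are hypothetical; the actual selection is performed by the field data collection system.

```python
import random

def systematic_sample(addresses, n, seed=1):
    """Equal-probability systematic sample of n addresses:
    a random start, then evenly spaced picks through the frame."""
    rng = random.Random(seed)
    interval = len(addresses) / n
    start = rng.uniform(0, interval)
    return [addresses[int(start + i * interval)] for i in range(n)]

def select_respondent(roster, seed=2):
    """Randomly select one eligible (age 13-24) household member."""
    eligible = [m for m in roster if 13 <= m["age"] <= 24]
    return random.Random(seed).choice(eligible) if eligible else None

frame = [f"address_{i}" for i in range(500)]  # ~500 addresses per PSU frame
sampled = systematic_sample(frame, n=85)
print(len(sampled))  # 85 households selected per PSU

roster = [{"name": "A", "age": 15}, {"name": "B", "age": 40}, {"name": "C", "age": 19}]
print(select_respondent(roster))  # one of the two eligible youth
```

The random start plus fixed interval gives every address the same selection probability while spreading the sample evenly across the PSU's address frame.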


Sample Size Estimates

Sample size for the representative survey of youth in Baltimore is determined from the following standard cluster sample formula:


n = (Z² × P × (1 − P) × DEFF) / e²


Where:

Z = critical value of the normal distribution, set to 1.96 for a 95% confidence interval;

P = estimated prevalence of any childhood sexual violence, estimated at 0.30;

e = margin of error, set at 0.04 to balance the reality of survey costs with the precision of estimates;

DEFF = design effect, set at 2.0 for a cluster design;

n = calculated sample size.



Sample Size Calculation:


n = (1.96² × 0.30 × 0.70 × 2.0) / 0.04² ≈ 1,008


Although only 1,008 completed surveys are necessary to achieve a representative sample of youth in Baltimore with sufficient power, the sample size was increased slightly to account for any differential response between males and females (total sample n=1,020).
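As a check on the arithmetic, the cluster sample formula can be evaluated directly with the constants stated above:

```python
# Standard cluster sample size: n = Z^2 * P * (1 - P) * DEFF / e^2
Z = 1.96      # critical value for a 95% confidence interval
P = 0.30      # estimated prevalence of any childhood sexual violence
e = 0.04      # margin of error
DEFF = 2.0    # design effect for a cluster design

n = (Z ** 2) * P * (1 - P) * DEFF / (e ** 2)
print(round(n))  # 1008 completed interviews
```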


Assuming the rates shown in Table 1 below, we will need to draw an initial sample of about 8,500 addresses in Baltimore (100 PSUs × 85 households) to account for the resolution rate, household eligibility rate, and interview completion rate (8,500 × .8 × .25 × .6 = 1,020). In addition to the 1,020 completed interviews in Baltimore, this study will aim to complete 60 interviews during Pre-test 2 to test the instrument and study protocols in the field and 50 interviews from youth in Garrett County to test the methodology in a rural location, for a total of 1,130 completed interviews. A convenience sampling method will be used to select these participants, and therefore the sample size determination above for the standard cluster sample formula will not be necessary. However, we have still accounted for non-response adjustments in our time and resource estimates, and anticipate contacting 494 households for Pre-test 2 (494 × .8 × .25 × .6 ≈ 60 completed interviews) and 413 households in rural Garrett County (413 × .8 × .25 × .6 ≈ 50 completed interviews).
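The non-response inflation follows from dividing the target number of completes by the product of the three rates in Table 1; a minimal sketch:

```python
# Work backward from required completed interviews to addresses to field,
# using the resolution (.8), household screening (.25), and interview
# completion (.6) rates assumed in Table 1.
RESOLUTION, SCREENING, COMPLETION = 0.8, 0.25, 0.6
yield_rate = RESOLUTION * SCREENING * COMPLETION  # overall yield per address

def addresses_needed(completes):
    """Addresses to sample so that expected completes meet the target."""
    return completes / yield_rate

print(round(addresses_needed(1020)))  # 8500 addresses for Baltimore
print(round(494 * yield_rate))        # ~59 expected Pre-test 2 completes
print(round(413 * yield_rate))        # ~50 expected Garrett County completes
```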



Table 1: Non-Response Adjustments

Rate | Definition or description | Assumption
Resolution Rate (RR) | Share of sampled addresses resolved as households, excluding ineligible addresses (e.g., vacant units, seasonal housing, out-of-scope buildings) | Based on other recent surveys, 80% of the sampled addresses are expected to resolve as households
Household Screening Rate (HSR) | Accounts for households with persons aged 13-24 years | Based on the 2014-2018 American Community Survey (ACS), 25% of households have eligible persons 13-24 years
Interview Completion Rate (ICR) | The selected person completes the Participant Questionnaire | 60% of the selected persons will complete the Participant Questionnaire


The calculation of the adjustments is presented in the table below.



Table 2: Number of Households to Select

Sex | Required Completed Individual Interviews (CII) | Individual Eligibility Rate (IER) | Individual Response Rate (IRR) | Adjusted Individual Interviews (AII = CII/(IER×IRR)) | HH Eligibility Rate (HER) | HH Response Rate (HRR) | HH Screen Rate (SR) | Adjusted HH (AdjHH = HER×HRR×SR) | # HH (AII/AdjHH) | Households to Select per PSU | # PSUs
Female | 510 | 100% | 65% | 785 | 90% | 90% | 25% | 20% | 3,875 | 85 | 46
Male | 510 | 100% | 55% | 927 | 90% | 90% | 25% | 20% | 4,579 | 85 | 54


Thus, to obtain the 510 completed female interviews, 3,875 households in 46 PSUs will be contacted; to obtain the 510 completed male interviews, 4,579 households in 54 PSUs will be contacted for the main VACS data collection. To obtain the 60 interviews for Pre-test 2 and 50 interviews in Garrett County, instead of a systematic sample of households, we will use convenience sampling to select households. Interviewers will be instructed to skip a certain number of households, depending on the density of households in the area, in order to help ensure confidentiality and anonymity of study participants. To obtain 60 completed Pre-test 2 interviews, 494 households will be contacted. To obtain 50 completed interviews in Garrett County, 413 households will be contacted. In total, 9,361 households will be contacted for the study.
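A quick tally of the contact targets listed above (3,875 + 4,579 + 494 + 413):

```python
# Households to contact, as given in Table 2 and the surrounding text
baltimore_female = 3_875   # 46 PSUs
baltimore_male = 4_579     # 54 PSUs
pretest_2 = 494
garrett_county = 413

total = baltimore_female + baltimore_male + pretest_2 + garrett_county
print(total)  # 9361
```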


B.2. Procedures for Collecting Information


To achieve the 1,020 completed interviews, CDC will first mail an invitation letter (Attachment H) asking each selected household to complete a web or phone screener (Attachment I) that will obtain a roster of all persons in the household and capture basic demographic information, including date of birth, first name, sex, and contact information. This will help determine which households include an eligible individual. NORC will obtain and manage all screener information, and CDC will not have access to this information. Households will also have the option of calling a toll-free number to complete the screener with an interviewer. Households that do not complete the screener will receive a visit from a field interviewer to complete the screener.


Second, an interviewer will establish contact with the selected households (Attachment I). In cases where the household completed the screener, the interviewer will confirm household eligibility by asking the first name of the head of household and comparing it to the information provided in the screener. If the information does not match, the interviewer will ask to speak with the head of household, or the person representing the head of household, in order to introduce the study and confirm that one or more eligible participants live in the household. In cases where the household did not already complete the screener, the interviewer will ask to speak with the head of household in order to complete the screener (Attachment I).


In households in which there is an eligible participant, the head of household will be asked to participate in a 15-minute survey to assess the socioeconomic conditions of the household (Attachment E). During the pre-screening phase of determining whether there is an eligible youth (13-24 years old), we move to the household questionnaire only if an eligible youth participant is identified in the household. The household questionnaire is administered to collect data on the socioeconomic status of the household and to garner household buy-in for the participant survey. When there is more than one eligible participant, the computer will randomly select one participant. In the case where the head of household is a female or male 18-24 years old, they may also be selected as the participant; in this case, they would complete both the household questionnaire and the participant questionnaire (Attachment J). The household questionnaire (Attachment E) and the first section of the core participant questionnaire (Attachment J) will be administered by the interviewer. Subsequent sections of the core participant questionnaire will be completed by the participant using Audio Computer-Assisted Self-Interview (ACASI) software.


A multi-stage graduated consent (or assent for participants ages 13-17) procedure will be followed. First, consent will be obtained from the head of household to complete the head of household questionnaire (Attachment D). For selected eligible participants under 18 years of age, verbal informed consent will first be obtained from their parent/primary caregiver. As part of the consent process, parents/caregivers will be told that this study is about the health and wellness of youth in Maryland and covers many topics including community violence. Participants will then be given some basic information on the study and asked if they wish to learn more about the study. After privacy is established for those who agree to learn more about the study, a full assent/consent will be obtained from the selected participant (Attachment F).


Interviewers and field managers will be required to complete a multi-day training held prior to data collection. Training will cover sampling procedures, maintaining confidentiality, establishing a private space to conduct the interview, and responding to adverse reactions to the survey.


To obtain the 60 interviews for Pre-test 2 and 50 interviews for Garrett County, we will use convenience sampling to select households, and interviewers will be instructed to skip a certain number of households, based on household density in the area, to help ensure confidentiality of study participants. Instead of randomly selecting participants in each household, participants will be selected to ensure adequate representation in the pilot test based on gender and the various age groups (i.e., 13-15 years, 16-17 years, and 18-24 years). The head of household questionnaire and consent procedure described above will be used for all Pre-test 2 and Garrett County participants. Pre-test 2 will be conducted by interviewers after completing training and will take place after OMB approval to test the finalized instrument.


Safety Procedures: Respondent and interviewer safety is a primary concern for any data collection asking about violence and abuse. To maximize human subject protection, the communication materials have been carefully written to be very general and describe the study in broad terms. The lack of detailed study information in the advance letter is intentional for the protection of the prospective study participant. If the prospective study participant is being abused by a person in the household, a more general introductory letter is less likely to raise suspicion or incite potential perpetrators.


The surveys will use a graduated consent procedure. When initially introducing the study, the interviewer will describe it as a survey on “health.” Respondents will be interviewed in a private location, and they will be informed that they can end the interview at any time.


There is evidence that adults find talking about their experiences of violence beneficial and appreciate having the opportunity to be asked questions related to their experiences.iii,iv More broadly, there is some evidence that adolescents and young adults are willing to talk about their experiences of abuse under a supportive structure.v Nevertheless, participants may recall frightening, humiliating, or painful experiences, which may cause a strong emotional response. The project field interviewers will receive distress protocol training, including how to recognize and document distress. CDC has developed a response plan protocol for this survey to provide help, through direct service referrals, to those participants who need and want help for past or current experiences of violence.

All participants will be provided with a list of services, which will be available in hardcopy and via QR code. In order to ensure that the nature of the survey is not revealed to non-participants, the list will include a variety of services relevant to youth, with services specific to violence victimization embedded in the general list. Furthermore, these services will reflect free programs, services, and amenities currently offered in Baltimore City and Garrett County. The list will include addresses, phone numbers, and website information whenever possible. The goal is that all participants in need of services are able to identify local services that they can access with relative ease and that are capable of and experienced in dealing with youth. The list will continue to be updated and revised until survey implementation in order to provide the most up-to-date and robust information on available services. In addition to services directly related to violence against women and children, such as integrated service centers for women and children, the list will also include medical centers, non-formal education services, and family welfare centers. Interviewers will be instructed to indicate which organizations and agencies provide services for violence, so that participants clearly understand where to obtain the necessary services.


Next, participants who meet any one of the following criteria will be offered a direct referral to a project social worker who will be on call for the duration of the study and will monitor all referrals for the duration of data collection:

  • The participant becomes upset during the interview (for example, tearful, angry, sad, shaking, difficulty breathing, etc.); or

  • The participant shares at any point during their interaction with a field interviewer that he or she does not feel safe in his or her current living situation, including in his or her home or community due to violence; or

  • The participant asks for help for violence, regardless of what they may or may not have disclosed during the interview; or

  • The ACASI signals the interviewer to offer the participant the response plan, based on their responses. While the interviewer will not know the participant’s answers, the ACASI will signal the response plan when the participant reports any of the following in answering ACASI survey questions:

    • The participant has experienced violence in the past 12 months (physical, emotional, or sexual violence);

    • The participant reported feeling they were in immediate danger;

    • The participant reported feeling unsafe in their current living situation; or

    • The participant reported attempting suicide in the past 12 months.
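The signaling logic described above amounts to a simple disjunction over the ACASI responses. A minimal sketch, in which the field names are hypothetical and not the actual ACASI variable names:

```python
def should_offer_response_plan(r):
    """Return True if any response-plan trigger criterion is met.
    `r` is a dict of (hypothetical) boolean ACASI response fields."""
    return (
        r.get("violence_past_12_months", False)      # physical, emotional, or sexual
        or r.get("feels_in_immediate_danger", False)
        or r.get("feels_unsafe_living_situation", False)
        or r.get("attempted_suicide_past_12_months", False)
    )

print(should_offer_response_plan({"violence_past_12_months": True}))  # True
print(should_offer_response_plan({}))                                 # False
```

Because the check runs inside the ACASI software, the interviewer is told only that the response plan should be offered, never which answer triggered it.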


B.3. Methods to Maximize Response Rates and Deal with Nonresponse


In-Person Household Survey Design

The in-person household survey design allows the data collection team to engage in a variety of efforts to maximize response rates and deal with nonresponse. First, visiting households in person makes completing the survey maximally convenient for respondents, as they will be able to complete the survey within the comfort of their own home. Further, interviewers will be trained to explain the purpose of the data collection to heads of household, parents/primary guardians, and participants themselves in order to garner buy-in for survey completion.


Mailer Invitation

A web screener invitation letter will be mailed to each selected survey household to introduce the study and invite them to complete the screener (Attachment H). The screener invitation on official project letterhead will provide legitimacy for the study and include a $1 pre-incentive to encourage participation in the project. Pre-incentives, even in small amounts, have been shown to increase response rates.vi The mailed invitation will direct participants to the web screener by including both a screener URL and QR code for easily accessing the screener from a smartphone, tablet, or computer. Households will also have the option to call a toll-free number to complete the screener with an interviewer.


Repeated Visits for Non-respondents

Households will be visited by an interviewer multiple times at various times of day and on different days of the week in an effort to reach participants at a convenient time in which they are home and available to complete an interview. If the selected participant is not available after six attempts, the case will be reviewed by a Field Manager to determine whether additional outreach should be made. The field manager will base this decision on several factors: (1) whether outreach was attempted at a variety of days and times, (2) whether any contact has been made with the household (i.e., interviewer was told to return at another time) that may have suggested a passive refusal, and (3) the benefit and cost effectiveness of additional attempts to interview this individual. If a determination is made to cease outreach or the participant refuses to participate, the household is skipped regardless of whether another eligible participant exists in the household. The household will not be replaced.


Scheduling Interview Time and Place

Consistent with our protocols for protecting the security and confidentiality of the data, VACS protocols call for administering the survey under private conditions. Interviewers will be instructed to identify, in consultation with the participant and head of household, a space inside the home that is safe and private. In cases where privacy cannot be ensured inside the home, the interviewer and participant will schedule a time and place to meet while the survey team is still in the community. Interviewers will meet the participant at the designated time and place, and the participant will be identified by visual recognition, not by any kind of identifying information. If the interview cannot be rescheduled while the survey team is in the selected community, the interview will be considered incomplete. This commitment to conducting the survey under safe and private conditions will encourage participation among participants concerned about the confidentiality of the data.


Incentives

Incentives are one of the most effective ways to increase response rates. Numerous studies have demonstrated that incentives, and in some cases higher incentive amounts, lead to higher response rates.vii,viii,ix Some studies further support the use of incentives by demonstrating reductions in nonresponse bias and attrition bias resulting from their use.x,xi,xii Unfortunately, little research exists on the effectiveness of incentives in reducing non-response bias (as opposed to increasing response rates), particularly with youth participants.


Thus, the incentive experiment described herein will help us to fill this gap in the literature by allowing us to assess the effectiveness of three different incentive structures with youth participants. We propose testing three different incentive structures as part of the pilot test:

  1. A direct offer of $20 upon survey completion;

  2. Recruitment offering a $40 early-bird incentive if the survey is completed within two weeks after screening in, otherwise a $20 post-incentive upon completion; and

  3. An initial offer of $20 upon survey completion, with additional communication to non-responders later in the field period with an escalation offer of $40.
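Randomly assigning sampled households to the three incentive arms can be sketched as follows. This is a hypothetical illustration; the actual assignment procedure is not specified in this document.

```python
import random

ARMS = [
    "flat $20 on completion",
    "$40 early-bird within two weeks, else $20",
    "$20 initial, escalating to $40 for non-responders",
]

def assign_arms(household_ids, seed=42):
    """Randomly assign each household to one of the three incentive arms,
    keeping the arms balanced in size."""
    rng = random.Random(seed)
    ids = list(household_ids)
    rng.shuffle(ids)
    return {hid: ARMS[i % 3] for i, hid in enumerate(ids)}

assignment = assign_arms(range(9))
print(list(assignment.values()).count(ARMS[0]))  # 3 households per arm
```

Balanced random assignment lets response rates across the three arms be compared without confounding by which households received which offer.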


Incentives will be necessary for minimizing nonresponse bias and ensuring data quality with the target population. Given the sensitive survey topic, the study can only be conducted as an in-person household survey, to ensure the privacy and safety of participants. Therefore, we cannot offer various modes for completing the survey, which often helps decrease nonresponse bias.

Previous studies have suggested that incentives of $20-$40 for youth participants help to increase response rates. Of those studies that have examined the effect of incentives, results were mixed. One study found that incentives were a main reason for participating,xiii while another found that the incentive did not influence the youth’s decision to participate.xiv Interestingly, one study with university students found higher response rates with a $25 incentive compared to a $10 incentive, but no difference between $25 and $40. In this same survey, reported sexual victimization was higher in the $10 incentive group than the $25 incentive group, potentially suggesting that victims were more motivated to participate even when a lower incentive was offered. However, the higher prevalence of victimization among those receiving the $40 compared to the $25 incentive may suggest that victims are also motivated to complete surveys when higher incentives are offered.xv


Although this body of literature on youth incentives does not examine their impact specifically on nonresponse bias reduction among youth in household surveys, it suggests that the proposed $20 to $40 incentive for the U.S. VACS pilot is an appropriate amount for youth in the study sample and has the potential to increase response rates. Further, the lack of incentive experimentation with this age group supports the need for the proposed incentive test to determine which incentive structure, if any, reduces nonresponse bias.


B.4. Tests of Procedures or Methods to be Undertaken

In addition to examining the differences in outcomes across the different incentive structures, the domestic VACS pilot will determine the feasibility of the in-person methodology and questionnaire for use within the United States.


Pre-Testing of Core Instrument

As previously mentioned, VACS has been implemented in 24 countries globally. As part of the global implementation, the core questionnaire has been cognitively tested in several of these countries to ensure its appropriateness and understandability across a variety of cultures. The questionnaire has since been adapted specifically for a U.S. context, and it will be important to test this questionnaire prior to the full pilot implementation. Therefore, CDC will pre-test adaptations to the core survey with youth in the target age group. Two pre-tests will be conducted: Pre-test 1 will involve interviewing nine participants from the target population to test the survey instrument, and Pre-test 2 will include 60 participants to test all study protocols in the field. The purpose of Pre-test 1 is to allow for the detection of any problems with the instrumentation and time to fix those problems; it will assure comprehension, inform the burden estimation, and gather constructive feedback on any misunderstandings, sensitivities, or gaps in the survey language. Pre-test 1 will be conducted with a convenience sample (n=9) of youth from the target population. CDC will submit a change request to OMB following Pre-test 1 if changes need to be made to the survey. The findings from Pre-test 1 will be used to inform instrument revisions for Pre-test 2, which will take place after OMB approval to test the finalized instrument and survey procedures.


Pre-testing will be conducted with a fairly even split of males and females and of youth ages 13-17 and 18-24 (i.e., four males and five females, and five 13-17 year olds and four 18-24 year olds, for Pre-test 1). The lead interviewers (team leaders) will conduct the pre-tests after their in-depth training on the questionnaire, interviewing on this sensitive topic, human subjects protection, safety and confidentiality, and electronic data collection.


Pre-test 2 will consist of two days in the field interviewing participants and implementing all study protocols, and one day for discussion and feedback from the field staff. All consenting and confidentiality procedures established for the main pilot test will be followed. However, we will offer just the standard $20 incentive for the pre-test participants and will not test the other components of the incentive experiment at this pre-test stage of the study. Instead of a random sample of households, we will use convenience sampling to select households within each of the pre-test sites. Interviewers will be instructed to skip a certain number of households, depending on the density of households in the area, to help ensure the confidentiality and anonymity of study participants. We will aim to complete approximately 7 to 8 interviews in each combination of sex, age category, and crime setting, totaling 60 surveys (see Table 3). This will help test the questionnaire among both male and female participants of varied age groups to ensure the appropriateness of the questionnaire and study protocols.






Table 3: Pre-test Sample Composition

                          Crime Setting
Sex       Age        Low Crime   High Crime   Total
Male      13-17          7            8         15
Male      18-24          8            7         15
Female    13-17          8            7         15
Female    18-24          7            8         15
Total                   30           30         60


The pre-tests will help inform the survey procedures, including but not limited to community entry, approaching households, gaining consent, and the referral process for participants who have an adverse reaction to the survey. Through administering the questionnaire in the pre-tests, we will assess willingness to participate, average interview length, and any comprehension issues within the questionnaire. While the questionnaire has been cognitively tested in other countries, each pre-test will help ensure that the questions are understood and obtain the data we are seeking from participants in the United States. The information obtained from Pre-test 1 will be used to further revise the instruments to reduce burden and improve their utility, and Pre-test 2 will then test the finalized instrument.


Pilot Test

This request is for a pilot test of the VACS in the U.S., which will be used to determine the feasibility of implementing the survey in a domestic context and to serve as a foundation for future research on violence against children and youth in the U.S. In addition, the pilot data will be used to: estimate the prevalence of physical, emotional, and sexual violence perpetrated against males and females in childhood and in the past 12 months in an urban U.S. environment; identify risk and protective factors for physical, emotional, and sexual violence against children to inform stakeholders and guide prevention efforts; identify the health and social consequences associated with violence against children; and assess the knowledge and use of medical, psychosocial, legal, and protective services available for children in Baltimore who have experienced sexual, emotional, and physical violence. CDC plans to use the results of this survey to inform its violence surveillance, prevention, and response efforts and to refine practices related to the protection of children.


Analysis of Data

CDC will provide weights for the data to allow CDC researchers to obtain parameters that are representative of the total youth population aged 13-24 years in Baltimore. CDC will apply a three-step weighting procedure: (Step 1) computation of a base weight for each sample participant; (Step 2) adjustment of the base weights for differential nonresponse in the sample; and (Step 3) calibration of the adjusted weights to known population totals. CDC will produce weighted point estimates and the appropriate 95% confidence intervals using SAS 9.3 survey procedures.
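To illustrate the three-step weighting procedure described above, the sketch below shows a minimal Python version for a single weighting cell. All selection probabilities, response patterns, and population totals are hypothetical; the actual weighting will follow the study's specifications and be carried out in the production analysis environment.

```python
# Minimal sketch of the three-step weighting procedure (hypothetical numbers).

def base_weight(selection_prob):
    # Step 1: base weight = inverse of the participant's selection probability.
    return 1.0 / selection_prob

def nonresponse_adjust(weights, responded):
    # Step 2: inflate respondents' weights within a weighting cell so that
    # they carry the full weight of the cell, including nonrespondents.
    total = sum(weights)
    resp_total = sum(w for w, r in zip(weights, responded) if r)
    factor = total / resp_total
    return [w * factor for w, r in zip(weights, responded) if r]

def calibrate(weights, population_total):
    # Step 3: ratio-adjust the weights so they sum to a known population total.
    factor = population_total / sum(weights)
    return [w * factor for w in weights]

# Hypothetical cell: 5 sampled youth, each selected with probability 1/200.
bw = [base_weight(1 / 200) for _ in range(5)]    # base weights of 200 each
responded = [True, True, True, False, True]      # 4 of the 5 responded
adj = nonresponse_adjust(bw, responded)          # respondents' weights rise to 250
final = calibrate(adj, population_total=1200)    # final weights sum to 1,200
```

In practice, Step 2 is performed within cells defined by characteristics related to response propensity, and Step 3 typically calibrates to census-based age-sex totals; the sketch collapses both to a single cell for brevity.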


CDC will lead the analysis, ensuring that the investigation, where appropriate, is consistent with global analyses of VACS. CDC will produce a complete description of the findings, including reporting weighted frequencies and percentages along with 95% confidence intervals on the principal variables of interest. Charts and diagrams will be used to display data. Tables will be created to illustrate distributions of characteristics associated with sexual behavior and practices; physical, emotional, and sexual violence; and utilization of health care services, counseling services, and other services utilized by participants.
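As a rough illustration of the weighted percentages and 95% confidence intervals described above, the sketch below computes a weighted proportion with a normal-approximation interval, using a Kish effective-sample-size adjustment for unequal weights. This is a simplified, design-naive illustration with hypothetical data; the actual analysis will use SAS 9.3 survey procedures, which account for the full complex sample design.

```python
import math

def weighted_proportion_ci(values, weights, z=1.96):
    """Weighted proportion of a 0/1 indicator with an approximate 95% CI.

    Uses the Kish effective sample size to reflect weight variation; it does
    not account for stratification or clustering in the sample design.
    """
    wsum = sum(weights)
    p = sum(v * w for v, w in zip(values, weights)) / wsum
    n_eff = wsum ** 2 / sum(w ** 2 for w in weights)   # Kish approximation
    se = math.sqrt(p * (1 - p) / n_eff)
    return p, max(0.0, p - z * se), min(1.0, p + z * se)

# Hypothetical data: 0/1 indicator of a violence outcome with survey weights.
vals = [1, 0, 1, 0, 0, 1]
wts = [250, 300, 250, 300, 250, 300]
p, lo, hi = weighted_proportion_ci(vals, wts)
```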


Data will be analyzed separately for males and females. Participants will be partitioned into two sub-groups for analysis: a 13 to 17 age group and an 18 to 24 age group. The 13 to 17 age group will yield information on events occurring in the past 12 months (current estimates of violence against children), while lifetime estimates of violence during childhood will be based on responses from participants aged 18 to 24 reporting on their experiences prior to the age of 18.


B.5. Individuals Consulted on Statistical Aspects and Individuals Collecting and/or Analyzing Data


Individuals who have participated in designing the data collection:

CDC Staff:

Leah Gilbert, MD, MSPH

Laura Chiang, MPH

Howard Kress, PhD

Liping Zhu, MPH

Jose Luis Carlosoma, M.S.

NORC Staff:

Bruce Taylor, PhD

Elizabeth Mumford, PhD

Shannon Nelson

The following individuals will participate in data analysis:


CDC Staff:

Leah Gilbert, MD, MSPH

Howard Kress, PhD

Liping Zhu, MPH

Jose Luis Carlosoma, M.S.


NORC Staff:

Bruce Taylor, PhD

Elizabeth Allen, MS

Elizabeth Mumford, PhD

Shannon Nelson, MA





References/Endnotes

i U.S. Census Bureau. (2019). 2018 American Community Survey 5-Year Estimates, Table DP02. Retrieved from https://data.census.gov/cedsci/table?q=Baltimore%20County,%20Maryland%20Families %20and%20Living%20Arrangements&tid=ACSDP1Y2018.DP02&hidePreview=true

ii Nguyen, K.H., Kress, H., Villaveces, A., Massetti, G.M., (2019). Sampling Design and Methodology of the Violence Against Children and Youth Surveys. Injury Prevention, 25(4):321-327. https://doi.org/10.1136/injuryprev-2018-042916

iii Jansen, H.A.F.M. (2006). "Putting Women First": Ethical and Safety Recommendations for Research on Violence Against Women. Training in Research in Reproductive Health/Sexual Health. World Health Organization.

iv Draucker, C.B., The emotional impact of sexual violence research on participants. Arch Psychiatr Nurs, 1999. 13(4): p. 161-9.

v Jensen, T.K., et al., Reporting possible sexual abuse: a qualitative study on children's perspectives and the context for disclosure. Child Abuse Negl, 2005. 29(12): p. 1395-413.

vi McGonagle KA, Freedman VA. The Effects of a Delayed Incentive on Response Rates, Response Mode, Data Quality, and Sample Bias in a Nationally Representative Mixed Mode Study. Field methods. 2017;29(3):221‐237. doi:10.1177/1525822X16671701

vii McGonagle KA, Freedman VA. The Effects of a Delayed Incentive on Response Rates, Response Mode, Data Quality, and Sample Bias in a Nationally Representative Mixed Mode Study. Field methods. 2017;29(3):221‐237. doi:10.1177/1525822X16671701

viii Edwards, P., Roberts, I., Clarke, M., DiGuiseppi, C., Pratap, S., Wentz, R., et al. (2002). Increasing response rates to postal questionnaires: Systematic review. British Medical Journal, 324.

ix Goritz, A. S. (2006). Incentives in web studies: Methodological issues and a review. International Journal of Internet Science, 1, 58–70.

x Singer, E., & Ye, C. (2013). The Use and Effects of Incentives in Surveys. The ANNALS of the American Academy of Political and Social Science, 645(1), 112–141. https://doi.org/10.1177/0002716212458082

xi Williams, D., & Brick, J. M. (2018). Trends in US face-to-face household survey nonresponse and level of effort. Journal of Survey Statistics and Methodology, 6(2), 186-211.

xii Felderer, B., Müller, G., Kreuter, F., & Winter, J. (2018). The Effect of Differential Incentives on Attrition Bias: Evidence from the PASS Wave 3 Incentive Experiment. Field Methods, 30(1), 56–69. https://doi.org/10.1177/1525822X17726206

xiii Kafka, T., Economos, C., Folta, S., and Sacheck, J. (2011). Children as Subjects in Nutrition Research: A Retrospective Look at Their Perceptions. Journal of Nutrition Education and Behavior. 43(2):103-109. https://doi.org/10.1016/j.jneb.2010.03.002

xiv Smith, K.A., Macias, K., Bui, K., and Betz, C. L. (2015). Brief Report: Adolescents' Reasons for Participating in a Health Care Transition Intervention Study. Journal of Pediatric Nursing. 30(5):165-171. https://doi.org/10.1016/j.pedn.2015.05.007

xv Krebs, C.P., Lindquist, C.H., Richards, A., Shook-Sa, B.E., Berzofsky, M., Peterson, K., Planty, M., Langton, L., & Stroop, J. (2016). The Impact of Survey Incentive Amounts on Response Rates and Estimates of Sexual Assault Victimization.


