
Pilot Implementation of the Violence Against Children and Youth Survey (VACS) in the US

OMB: 0920-1356



SUPPORTING STATEMENT: PART B




OMB# 0920-1356


Pilot Implementation of the Violence Against Children and Youth Survey (VACS) in the United States



October 18, 2021





Point of Contact:

Jeffrey D. Ratto, MPH

Health Scientist

Field Epidemiology and Prevention Branch

Contact Information:

Centers for Disease Control and Prevention

National Center for Injury Prevention and Control

4770 Buford Highway NE

Atlanta, GA 30341-3724

phone: 404-498-0370

email: [email protected]


CONTENTS


B. COLLECTIONS OF INFORMATION EMPLOYING STATISTICAL METHODS

B.1. Respondent Universe and Sampling Methods

B.2. Procedures for the Collection of Information

B.3. Methods to Maximize Response Rates and Deal with Nonresponse

B.4. Tests of Procedures or Methods to be Undertaken

B.5. Individuals Consulted on Statistical Aspects and Individuals Collecting and/or Analyzing Data



B. COLLECTIONS OF INFORMATION EMPLOYING STATISTICAL METHODS

This is a request to pilot the Violence Against Children and Youth Survey (VACS) adapted for the United States. The VACS has been implemented in 24 countries globally, but never in the United States. This request is to pilot the VACS methodology and questionnaire as adapted for the United States context.


A contract to adapt the VACS for the United States context was awarded in late September 2019 to NORC at the University of Chicago to assist CDC and the local health departments with the design and collection of VACS data. The survey instruments have been developed and adapted from the international VACS to include a new data collection mode (i.e., Audio Computer-Assisted Self-Interviewing software). Two separate pre-tests will be completed prior to the launch of the main pilot study data collection. Pre-test 1 (n=30) will take place first and will include cognitive testing of the adapted questionnaire; its results will show how the adapted questionnaire performs in the U.S. context and may indicate needed changes to the questionnaire. Pre-test 2 (n=60) will involve a field test of the survey procedures. Change requests will be submitted for the Information Collection Request (ICR) if changes to the adapted questionnaire are needed after either pre-test. Data collection for Pre-test 1 (cognitive testing interviews) will take place shortly after the ICR is approved. Data collection for the full pilot implementation is set to begin in May 2022, presuming that protective policies and guidance related to COVID-19 allow for in-person data collection.


SURVEY PILOT IMPLEMENTATION

The proposed study will pilot the adaptation of the VACS for use in a domestic context, using a representative sample of youth in urban Baltimore and a convenience sample of youth in rural Garrett County, Maryland. While adapting the survey instrument for use in the pilot implementation in Baltimore, Maryland, CDC has been collaborating with the Baltimore City Health Department (BCHD) to elicit their input on data needed to develop well-informed violence prevention programs within their community. CDC will also engage with a rural locale in Maryland, Garrett County, to conduct a feasibility pilot of the questionnaire and survey procedures in a rural area.


Data collection will consist of household surveys, collected through a combination of interviewer-administered and Audio Computer-Assisted Self-Interviewing (ACASI) software formats.


The survey pilot implementation has the following objectives:

  • Adapt the VACS methodology for implementation in the US. The pilot VACS implementation in an urban setting will use a representative sample of youth ages 13-24 and conduct in-person data collection using household survey methods.

  • Pilot the adapted VACS methodology in Baltimore City using a representative sample of youth ages 13-24.

  • Assess the feasibility of field procedures with a convenience sample in rural Garrett County in Maryland.

  • Test minor variations in the methodology to determine which variations result in better response rates and indicate less survey non-response bias.

  • Identify risk and protective factors associated with physical, emotional and sexual violence against children and youth to inform stakeholders and guide future VACS implementation in the US.

  • Identify the health and social consequences associated with violence against children and youth.

  • Assess the knowledge and use of medical, psychosocial, legal, and protective services available for children and youth who have experienced sexual, emotional and physical violence.

  • Identify areas for further research.

  • Assess the feasibility of VACS adaptations in the U.S., including data collection through ACASI as well as adapted field procedures for community entry and household entry.


B.1. Respondent Universe and Sampling Methods


Target Population

The target population of the pilot implementation of the adapted VACS is English-speaking male and female youth ages 13-24 in Baltimore and Garrett County households. Exclusion criteria are: 1) youth who do not speak English; 2) males and females with mental disabilities who do not have the capacity to understand the questions being asked; and 3) males and females with physical disabilities that would prevent them from participating in ACASI data collection (e.g., vision and fine motor skill impairment). Although VACS has been implemented in Spanish-speaking countries, there are too few non-English speakers in Baltimore to adequately represent such individuals in the VACS Baltimore data collection: according to American Community Survey data, only 4.1% of households in Baltimore speak Spanish, and of those households, 90% also speak English.i Given the representative sampling strategy and the low number of Spanish-speaking households, this study would not be an effective way to test a Spanish version of the survey. To further ensure representation, however, the parental consent form will be available in Spanish to accommodate households where the 13-17-year-old dependent minor speaks English but the parent/caregiver does not and would not otherwise be able to provide parental consent. People living with disabilities may be at even greater risk of violence than the general population; however, since this survey is not designed to produce results representative of this sub-population, that issue would be best addressed in a separate study.


Baltimore Sampling Frame

The sampling frame for the representative sample of youth in Baltimore will include Primary Sampling Units (PSUs) comprised of census block groups, originally compiled by the U.S. Census Bureau for the 2010 Census. Census block groups are redefined every 10 years based on the decennial census (plus time for data processing), so the 2010 Census definitions will likely be the most recent available when the Baltimore sample is drawn. However, updated population variables will be mapped to the census block groups. Each PSU will consist of a single census block group or a combination of a small number of census block groups that together meet a minimum measure-of-size threshold. Criteria to create PSUs with sufficient measure of size (MOS) will include the following:

  • The MOS for sampling will be the number of Housing Units (HUs) with at least one person ages 13-24 years. The 5-year combined American Community Survey data will be used to estimate the number of housing units meeting these criteria. This will most likely be from the 2014-2018 5-year data file.

  • A minimum threshold of 400 HUs will be used for each PSU. For census block groups that do not meet that criterion, adjacent block groups will be combined until the minimum threshold is met.

  • In addition to the MOS criteria, the address sampling frame will be reviewed to ensure it has sufficient sample size to select from (about 200 addresses).

  • The sample size estimate and sampling strategy applies to the main adapted VACS implementation with the representative sample of youth in Baltimore discussed above.


Baltimore Sampling Strategy

The locally representative household survey of females and males 13-24 years of age in Baltimore will be stratified by biological sex and crime levels. Approximately 510 completed female interviews and 510 completed male interviews are anticipated based on the required sample size and consideration of response rates described below.


The sampling strategy involves selecting PSUs from the sampling frame using probability proportional to size (PPS) sampling. The PSUs will be selected in two steps:

    • Divide PSUs into low- and high-crime areas (counting UCR Part I crimes) to form two strata:

      • Baltimore crime statistics will be mapped to each PSU.

      • PSUs will be classified as low- or high-crime based on the number of crimes per 1,000 people, using FBI Uniform Crime Report (UCR) data.

    • Select 100 PSUs from the Baltimore sampling frame using PPS sampling:

      • The total number of PSUs selected will be distributed proportional to the population distribution between low- and high-crime areas. For example, if 70% of the population is in high-crime areas, 70 of 100 PSUs would be selected from the high-crime stratum.

      • For PPS, PSUs will be randomly selected based on the MOS, i.e., the number of HUs with at least one eligible 13-24-year-old.


After the PSUs are selected, a fixed number of 85 households per PSU will be selected by equal-probability systematic sampling. The selection of 85 households accounts for resolution of the housing unit, screening and listing of households, and interview response rates. Lastly, one eligible respondent (female or male) will be randomly selected from the list of all eligible respondents in the household.ii


In sum, the adapted VACS uses a multi-stage sample design for youth in Baltimore:

  • Stage 1. A total of 100 PSUs will be selected using PPS sampling, based on the number of households in the PSU with an eligible person aged 13-24 years. These PSUs will be spread proportionally across two strata: low-crime and high-crime. Within each stratum, 46% of the PSUs will be assigned as Female and the remaining 54% as Male. Based on the research methodology literature, a higher percentage of male PSUs will be selected to account for the lower participation rates generally found among males compared to females. For the selected PSUs, an address-based frame will be used, with addresses geo-coded to the correct PSUs.

  • Stage 2. Within each PSU, a sample of 85 houses will be selected by equal-probability systematic sampling. The selection of 85 houses accounts for resolution of the housing unit, screening and listing of households, and interview response rate.

  • Stage 3. One eligible respondent (for each female or male PSU) will be randomly selected from all eligible females (or males depending on the PSU assignment) in each household. If there is only one eligible youth respondent in the household, that young person will be selected to complete the questionnaire. If there is more than one eligible respondent in the household, one youth is selected randomly from among all the eligible youth respondents.
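The three stages above can be sketched in code. The following is a minimal illustration, not the production sampling program; the PSU frame, MOS values, and address list are hypothetical:

```python
import random

def pps_systematic(units, k, rng):
    """Select k units with probability proportional to size (PPS),
    using systematic sampling over the cumulative measure of size (MOS)."""
    total = sum(mos for _, mos in units)
    step = total / k
    start = rng.uniform(0, step)
    targets = [start + i * step for i in range(k)]
    selected, cum, idx = [], 0.0, 0
    for name, mos in units:
        cum += mos
        while idx < k and targets[idx] <= cum:
            selected.append(name)
            idx += 1
    return selected

def systematic_sample(addresses, k, rng):
    """Equal-probability systematic sample of k addresses (Stage 2)."""
    step = len(addresses) / k
    start = rng.uniform(0, step)
    return [addresses[int(start + i * step)] for i in range(k)]

rng = random.Random(42)
# Hypothetical frame: (PSU id, MOS = housing units with a 13-24-year-old).
frame = [(f"PSU-{i:03d}", rng.randint(400, 900)) for i in range(250)]
psus = pps_systematic(frame, 100, rng)                     # Stage 1: 100 PSUs
households = systematic_sample(list(range(200)), 85, rng)  # Stage 2: 85 of ~200 addresses
# Stage 3 (not shown): randomly select one eligible youth per sampled household.
```

Systematic PPS guarantees that each PSU's selection probability is proportional to its MOS, while the 400-HU minimum threshold keeps any single PSU's MOS below the sampling interval, avoiding duplicate selections.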


Sample Size Estimates

Power analyses were conducted. Assuming a design effect of 2.0 within males and within females (accounting for the clustered sampling) and a correlation of .25 between the male and female estimates, we expect to be able to detect a difference of 7.6 percentage points or greater between estimates, given a 95% confidence interval centered around a 50% estimate. The minimum detectable difference (MDD) depends on multiple factors, but especially on the estimated percentage: a 50% estimate is the most conservative and yields the largest MDD, just as a 50% estimate yields the largest margin of error. With a sample size of 1,000 cases we will be able to detect moderately small differences; for this design effect and sample size, a detectable difference of 7.6 percentage points is reasonable. Using a 30% estimate, the MDD is 7.1 percentage points.
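The MDD figures above can be approximated with a short calculation. This is our own sketch, assuming the MDD is the 95% critical value times the standard error of the difference between two correlated, design-effect-adjusted proportions; the official power analysis may include additional adjustments, so this approximation lands near, but not exactly on, the 7.6- and 7.1-point figures:

```python
import math

def mdd(p, n_per_group, deff=2.0, rho=0.25, z=1.96):
    """Approximate minimum detectable difference between the male and
    female estimates: z * SE(difference), with the variance of each
    estimate inflated by the design effect and the variance of the
    difference deflated by the correlation between the two estimates."""
    var = deff * p * (1 - p) / n_per_group   # variance of one group estimate
    var_diff = 2 * var * (1 - rho)           # Var(d) = v1 + v2 - 2*rho*sqrt(v1*v2)
    return z * math.sqrt(var_diff)

print(round(100 * mdd(0.50, 510), 1))  # ~7.5 points at a 50% estimate
print(round(100 * mdd(0.30, 510), 1))  # ~6.9 points at a 30% estimate
```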


Sample size for the representative survey of youth in Baltimore is determined from the following standard cluster sample formula:

n = (Z² × P × (1 − P) × DEFF) / e²

Where:

Z = standard normal critical value, set to 1.96 for a 95% confidence interval;

P = estimated prevalence of any childhood sexual violence, estimated at .30;

e = margin of error, set at 0.04 to balance the reality of survey costs with the precision of estimates;

DEFF = design effect, set at 2.0 for a cluster design;

n = calculated sample size.


Sample Size Calculation:

n = (1.96² × 0.30 × 0.70 × 2.0) / 0.04² ≈ 1,008
Although only 1008 completed questionnaires are necessary to achieve a representative sample of youth in Baltimore with sufficient power, the sample size was increased slightly to account for any differential response between males and females (total sample n=1,020).
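As a check, the cluster sample formula with the parameter values defined above reproduces the required sample size (a sketch of the standard calculation, not project code):

```python
# Standard cluster sample size: n = Z^2 * P * (1 - P) * DEFF / e^2
Z, P, e, DEFF = 1.96, 0.30, 0.04, 2.0

n = (Z**2 * P * (1 - P) * DEFF) / e**2
print(round(n))  # 1008 completed questionnaires
```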


Assuming the rates shown in Table 1 below, an initial sample of about 8,500 addresses in Baltimore (100 PSUs × 85 households) will need to be drawn to account for the resolution rate, household eligibility rate, and interview completion rate (8,500 × .8 × .25 × .6 = 1,020). In addition to the 1,020 completed interviews in Baltimore, this study will aim to complete 30 interviews during Pre-test 1 (cognitive interviewing), 60 interviews during Pre-test 2 (field testing) to test the instrument and study protocols in the field, and 50 interviews with youth in Garrett County to test the methodology in a rural location, for a total of 1,160 completed interviews. A convenience sampling method will be used to select respondents in Pre-tests 1 and 2 and in the Garrett County feasibility pilot, so the cluster sample size formula above does not apply to those samples. However, non-response adjustments are still accounted for in the time and resource estimates: we anticipate contacting 494 households for Pre-test 2 and 413 households in rural Garrett County (each adjusted by the .8 × .25 × .6 rates), and providing information about the study to approximately 300 potential cognitive interview participants for Pre-test 1.


Table 1: Non-Response Adjustments

| Rate | Definition or description | Assumption |
| --- | --- | --- |
| Resolution Rate (RR) | Accounts for ineligible addresses, e.g., vacant units, seasonal housing, out-of-scope buildings. | Based on other recent surveys, it is expected that 80% of the sampled addresses will be resolved as households. |
| Household Screening Rate (HSR) | Accounts for households with persons aged 13-24 years. | Based on the 2014-2018 American Community Survey (ACS), 25% of households have eligible persons aged 13-24 years. |
| Interview Completion Rate (ICR) | The selected person completes the Individual Questionnaire. | 60% of selected persons will complete the Respondent Questionnaire, based on NORC's experience conducting the General Social Survey, which achieved a 59.5% response rate in 2018. |


The calculation of the adjustments is presented in the table below.


Table 2: Number of Households to Select

| | Required Completed Individual Interviews (CII) | Individual Eligibility Rate (IER) | Individual Response Rate (IRR) | Adjusted Individual Interviews (AII = CII / (IER × IRR)) | HH Eligibility Rate (HER) | HH Response Rate (HRR) | HH Screen Rate (SR) | Adjusted HH Rate (AdjHH = HER × HRR × SR) | # HH to Contact (AII / AdjHH) | # Households to Select per PSU | # PSUs |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| Female | 510 | 100% | 65% | 785 | 90% | 90% | 25% | 20% | 3,875 | 85 | 46 |
| Male | 510 | 100% | 55% | 927 | 90% | 90% | 25% | 20% | 4,579 | 85 | 54 |


Thus, to obtain the 510 completed female interviews, 3,875 households in 46 PSUs will be contacted; to obtain 510 completed male interviews, 4,579 households in 54 PSUs will be contacted for the main VACS data collection in Baltimore City.
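The Table 2 figures follow mechanically from the stated rates. A sketch of the arithmetic (rounding is applied only at the end, which matches the published figures):

```python
import math

def households_and_psus(cii, irr, her=0.90, hrr=0.90, sr=0.25,
                        hh_per_psu=85, ier=1.0):
    """Required completed interviews -> households to contact -> PSUs,
    following the column definitions in Table 2."""
    aii = cii / (ier * irr)        # adjusted individual interviews
    adj_hh = her * hrr * sr        # combined household-level rate (20%)
    n_hh = aii / adj_hh            # households to contact
    n_psu = math.ceil(n_hh / hh_per_psu)
    return round(n_hh), n_psu

print(households_and_psus(510, 0.65))  # (3875, 46) for females
print(households_and_psus(510, 0.55))  # (4579, 54) for males
```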


In addition to collecting the representative sample of 1,020 interviews in Baltimore City, a targeted convenience sample will also be used to collect 30 cognitive interviews during pre-test 1 and 60 field test interviews in pre-test 2 in Baltimore City, and 50 interviews in Garrett County. The purpose of these interviews will be to test the adaptations to the questionnaire and data collection protocols with youth in the target age group and a rural area. Therefore, a representative sampling strategy will not be used. Interviewers will be instructed to skip a certain number of households, depending on the density of households in the area, in order to help maintain confidentiality of study respondents. For the cognitive interviewing in pre-test 1, a sample size of 30 interviews has been selected as this reflects a benchmark standard in the field, consistent with methods used by the National Center for Health Statistics. To obtain the 30 interviews for pre-test 1, approximately 300 persons will be provided with information about the study. To obtain 60 completed interviews during field testing, 494 households will be contacted. To obtain 50 completed interviews in Garrett County, 413 households will be contacted. In total, 9,363 households will be contacted for the study and 1,130 interviews will be completed across Pre-test 2, Baltimore City, and Garrett County data collections.


B.2. Procedures for the Collection of Information

This data collection consists of three main steps: 1) a screener to identify eligible households through a letter of invitation, 2) the head-of-household questionnaire, and 3) the youth respondent questionnaire. These steps are described in detail in this section. To achieve the 1,020 completed interviews in the survey pilot implementation, CDC will first mail an invitation letter (Attachment H) to each selected household to complete a web or phone screener (Attachment I) that will obtain a roster of all persons in the household and capture basic demographic information, including date of birth, first name, sex, and contact information. This will determine which households include an eligible individual. NORC will obtain and manage all screener information, and CDC will not have access to this information. Households will also have the option of calling a toll-free number to complete the screener with an interviewer. Households that do not complete the screener will receive a visit from a field interviewer to complete the screener.


Second, an interviewer will establish contact with the selected households (Attachment I). In cases where the household completed the screener, the interviewer will confirm household eligibility by asking the first name of the head of household and comparing this to the information provided in the screener. If the information does not match, the interviewer will ask to speak with the head of household, or the person representing the head of household, in order to introduce the study and confirm that one or more eligible respondents live in the household. In cases where the household did not already complete the screener, the interviewer will ask to speak with the head of household in order to complete the screener (Attachment I).


In households in which there is an eligible respondent, the head of household will be asked to participate in a 15-minute survey to assess the socioeconomic conditions of the household (Attachment E). A household questionnaire will only be completed if an eligible respondent is identified in the household during the screening phase. The household questionnaire is administered to collect data on the socioeconomic status of the household and to garner household buy-in for the respondent questionnaire. When there is more than one eligible respondent, the computer will be used to randomly select one respondent. If the head of household is a female or male aged 18-24 years, they may themselves be selected as the youth respondent; in this case, they would complete both the household questionnaire and the respondent questionnaire (Attachment J). The household questionnaire (Attachment E) and the first section of the respondent questionnaire (Attachment J) will be administered by the interviewer. Subsequent sections of the respondent questionnaire will be completed by the respondent using Audio Computer-Assisted Self-Interview (ACASI) software. The questions that will be administered via ACASI are noted in Attachment J, the respondent questionnaire.


A multi-stage graduated consent (or assent for respondents ages 13-17) procedure will be followed. These consent procedures are described in detail in section A.10, Protection of the Privacy and Confidentiality of Information Provided by Respondents.


To obtain the 30 interviews for Pre-test 1 (cognitive interviewing), 60 interviews for Pre-test 2 (field test), and 50 interviews in Garrett County, convenience sampling will be used to select PSUs and households within PSUs. Interviewers will be instructed to skip a certain number of households, based on household density in the area, to help ensure the confidentiality of study respondents. Instead of randomly selecting respondents in each household, respondents will be selected to ensure adequate representation across respondent sex and age categories (i.e., 13-17 and 18-24 years) in the pre-test phases and the Garrett County rural feasibility pilot. For Pre-test 1, participants will be recruited through community partnership contacts: the recruitment flyer (Attachment R) will be provided to partners for display in common areas, and interested individuals will be screened (Attachment S). See Table 3 below for the target number of interviews within each sex and age group category for Pre-tests 1 and 2. The head-of-household questionnaire and consent procedure described above will be used for all Pre-test 2 and Garrett County respondents. Pre-test 2 will be conducted by interviewers after they complete training.


Safety Procedures: Respondent and interviewer safety is a primary concern for any data collection asking about violence and abuse. To maximize human subject protection, the communication materials have been carefully written to be very general and describe the study in broad terms. The lack of detailed study information in the advance letter is intentional for the protection of the prospective study respondents. If the prospective study respondent is being abused by a person in the household, a more general introductory letter is less likely to raise suspicion or incite potential perpetrators. Respondents will be interviewed in a private location, and they will be informed that they can end the interview at any time.

There is evidence that adults find talking about their experiences of violence beneficial and appreciate having the opportunity to be asked questions related to their experiences.iii,iv More broadly, there is some evidence that adolescents and young adults are willing to talk about their experiences of abuse under a supportive structure.v Nevertheless, participants may recall frightening, humiliating, or painful experiences, which may cause a strong emotional response. The project field interviewers will receive response plan training, including how to recognize distress (see Attachment L in SSA for the Response Plan). CDC has developed a response plan protocol for this survey to provide help, through direct service referrals, to those respondents who need and want help with past or current experiences of violence. All respondents will be provided with a list of services, which will be available in hardcopy and via QR code. To ensure that the nature of the survey is not revealed to non-respondents, the list will include a variety of services relevant to youth, with services specific to violence victimization embedded in the general list. Furthermore, these services will reflect free programs, services, and amenities currently offered in Baltimore City and Garrett County. The list will include addresses, phone numbers, and website information whenever possible. The goal is that all respondents in need of services are able to identify local services that they can access with relative ease and that are experienced in serving youth. The list will continue to be updated and revised until survey implementation in order to provide the most up-to-date and robust information on available services.
In addition to providing a list of services directly related to violence against women and children, such as the integrated service centers for women and children, the list will also include medical centers, non-formal education services, and family welfare centers. Interviewers will be instructed to indicate which organizations and agencies provide services for violence, so that respondents clearly understand where to obtain the necessary services.


B.3. Methods to Maximize Response Rates and Deal with Nonresponse

In-Person Household Survey Design

The in-person household survey design allows a variety of efforts to maximize response rates and deal with nonresponse. First, visiting households in person makes completing the survey maximally convenient for respondents, as they will be able to complete the survey within the comfort of their own home. Further, interviewers will be trained to explain the purpose of the data collection to heads of household, parents/primary guardians, and respondents themselves in order to garner buy-in for survey completion.


Mailer Invitation

A web screener invitation letter will be mailed to each selected survey household to introduce the study and invite the household to complete the screener (Attachment H). The screener invitation, on official project letterhead, will provide legitimacy for the study and include a $1 bill as a pre-incentive to encourage participation in the screener component of the project. Because cash will be mailed, lined envelopes will be used so that the bill cannot be easily detected by anyone fraudulently handling the mail; for small amounts like this, NORC has found cash combined with a lined envelope to be effective in their experience with pre-incentives. Pre-incentives, even in small amounts, have been shown to increase response rates.vi The mailed invitation will direct respondents to the web screener by including both a screener URL and a QR code for easy access from a smartphone, tablet, or computer. Households will also have the option to call a toll-free number to complete the screener with an interviewer.


Repeated Visits for Non-respondents

Households will be visited by an interviewer multiple times at various times of day and on different days of the week in an effort to reach respondents at a convenient time in which they are home and available to complete an interview. If the selected respondent is not available after six attempts, the case will be reviewed by a Field Manager to determine whether additional outreach should be made. The field manager will base this decision on several factors: (1) whether outreach was attempted at a variety of days and times, (2) whether any contact has been made with the household (i.e., interviewer was told to return at another time) that may have suggested a passive refusal, and (3) the benefit and cost effectiveness of additional attempts to interview this individual. If a determination is made to cease outreach or the respondent refuses to participate, the household is skipped regardless of whether another eligible respondent exists in the household. The household will not be replaced.


Scheduling Interview Time and Place

Consistent with our protocols for protecting the security and confidentiality of the data, VACS protocols call for administering the survey under private conditions. Interviewers will be instructed to identify, in consultation with the respondent and head of household, a space inside the home that is safe and private. In cases where privacy cannot be ensured inside the home, the interviewer and respondent will schedule a time and place to meet while the survey team is still in the community. Interviewers will meet the respondent at the designated time and place, and the respondent will be identified by visual recognition, not by any kind of identifying information. If the interview cannot be rescheduled while the survey team is in the selected community, the interview will be considered incomplete. This commitment to conducting the survey under safe and private conditions will encourage participation among respondents concerned about the confidentiality of the data.


Incentives

Incentives are one effective way to increase response rates. Numerous studies have demonstrated that incentives, and in some cases higher incentive amounts, lead to higher response rates.vii,viii,ix Some studies further support the use of incentives by demonstrating reductions in nonresponse bias and attrition bias resulting from their use.x,xi,xii Unfortunately, little research exists on the effectiveness of incentives in reducing non-response bias (as opposed to increasing response rates), particularly with youth respondents.


Thus, the incentive experiment described herein will help us to fill this gap in the literature by allowing us to assess the effectiveness of three different incentive structures with youth respondents. We propose testing three different incentive structures as part of the pilot test:

  1. A direct offer of $20 upon survey completion;

  2. An offer of a $40 "early bird" incentive if the survey is completed within two weeks of screening in, otherwise a $20 incentive upon completion; and

  3. An initial offer of $20 upon survey completion, with additional communication to non-responders later in the field period offering an escalated incentive of $40.


Due to the higher burden of cognitive interviews, participants for pre-test 1 will be offered $50 (see SSA for a justification for the higher $50 incentive). Incentives can help minimize nonresponse bias and ensure data quality with the target population. Given the sensitive survey topic, the study can only be conducted as an in-person household survey, to ensure the privacy and safety of respondents. Therefore, we cannot offer various modes for completing the survey, which often helps decrease nonresponse bias.


A review of the incentive literature reflects mixed results, but much of the literature suggests that incentives have a positive effect on response rates. Little research has examined the impact of incentive use specifically on nonresponse bias reduction among youth surveyed in person in a household survey. Previous studies have suggested that incentives of $20-$40 for youth respondents help to increase response rates. Of the studies that have examined the effect of incentives, results were mixed: one study found that incentives were a main reason for participating,xiii while another found that the incentive did not influence the youth's decision to participate.xiv Interestingly, one study with university students found higher response rates with a $25 incentive compared to a $10 incentive, but no difference between $25 and $40. In that same survey, reported sexual victimization was higher in the $10 incentive group than in the $25 incentive group, potentially suggesting that victims were motivated to participate even when a lower incentive was offered; however, the higher prevalence of victimization among those receiving the $40 incentive compared to the $25 incentive may suggest that victims are also motivated to complete surveys when higher incentives are offered.xv


Although this literature does not examine the impact of incentives specifically on nonresponse bias reduction among youth in household surveys, it suggests that the proposed $20 to $40 incentive for the U.S. VACS pilot is an appropriate amount for youth in the study sample and has the potential to increase response rates. Further, the lack of incentive experimentation with this age group supports the need for the proposed incentive test to determine which incentive structure, if any, reduces nonresponse bias.



B.4. Tests of Procedures or Methods to be Undertaken

The domestic VACS pilot will determine the feasibility of the in-person methodology and questionnaire, adapted from the global VACS, for use within the United States, including use of the ACASI software to collect sensitive information. The extant literature does not test varying dollar amounts and timing of incentives with a youth population, so it is unknown how differential incentives will perform among nonresponders; this is why we are proposing an incentive experiment. The experiment is necessary because no results from other experiments can be relied upon to address our pilot study questions about the best way to reduce nonresponse bias, especially in the context of an urban environment experiencing elevated rates of violence. In addition to examining differences in participation across the incentive structures, the pilot will assess the performance of the questionnaire on the topic of violence.


Pre-Testing: Cognitive Interviewing and Field Test

As previously mentioned, VACS has been implemented in 24 countries globally. As part of the global implementation, the core questionnaire has been cognitively tested in several of these countries to ensure its appropriateness and understandability across a variety of cultures. Cognitive interview reports are included as Attachments M, N, and O. The questionnaire has since been adapted specifically for a U.S. context, and it will be important to conduct cognitive testing for this adapted questionnaire prior to the full pilot implementation. Therefore, the adapted questionnaire will be tested using cognitive interviewing methods with youth in the target age group. Two pre-tests will be conducted: (1) Pre-test 1 will include cognitive interviewing with 30 respondents from the target population to test the survey instrument, and (2) Pre-test 2 will include field testing with 60 respondents to test all study protocols in the field. Protocols tested will include screening and recruitment for the survey via an in-person, door-to-door methodology, obtaining consent from parents and assent from minors, conducting the head of household questionnaire, and administering the core youth questionnaire using a combination of interviewer-administered and ACASI-administered methods.


The purpose of Pre-test 1, cognitive interviewing, is to detect any problems with the instrumentation and allow time to fix them. Pre-test 1 will assess comprehension, inform the burden estimation, and gather constructive feedback on any misunderstandings, sensitivities, or gaps in the survey language. Cognitive interviewing will be conducted with a convenience sample of 30 youth from the target population, recruited through collaboration with community organizations in Baltimore, with a balanced distribution of 15 males and 15 females. CDC will submit a change request to OMB following Pre-test 1 if the cognitive interview results indicate that changes are needed to the questionnaire. The findings from Pre-test 1 will be used to inform instrument revisions.


Pre-test 2 will involve a field test of the instrument and survey procedures, implementing all study protocols with 60 respondents (30 males and 30 females) to inform efforts to improve field procedures prior to full pilot implementation. All consenting and confidentiality procedures established for the full pilot implementation will be followed. However, respondents for the pre-tests and the Garrett County rural feasibility pilot will be offered the standard $20 incentive; these phases will not be included in the incentive experiment. Instead of a random sample of households, convenience sampling will be used to select households within the field test sites. Interviewers will be instructed to skip a certain number of households, depending on the density of households in the area, to ensure confidentiality and anonymity of study respondents. We will aim for approximately 30 completed interviews each for males and females, balanced by age group and crime setting, totaling 60 surveys (see Table 3). This will test the field procedures among males and females of varied age groups to ensure appropriateness of the questionnaire and study protocols. Qualitative information from the cognitive interviews in Pre-test 1 and the field test in Pre-test 2 will be used to refine the questionnaire and field procedures. In addition, descriptive analyses of Pre-test 2 information will be used to assess how the questions are performing and to ensure adequate performance of the data collection technology.
Pre-test 2 will take place directly after interviewer training and will thus give interviewers an opportunity to implement and practice their training in the real world prior to the full pilot. In addition, administering the questionnaire in the field test will provide preliminary information on the average length of interviews and information about the questionnaire format. If results from the field test indicate that changes are needed to the survey protocols or questionnaire, an IRB amendment and change requests to the present Information Collection Request will be submitted.


Table 3: Pre-test Sample Composition

Respondent Characteristics          Crime Setting
Sex       Age        Low Crime    High Crime    Total
Male      13-17          7             8           15
          18-24          8             7           15
Female    13-17          8             7           15
          18-24          7             8           15
Total                   30            30           60


The information obtained from Pre-test 1 (cognitive interviewing) will be used to assess the performance of the questionnaire, reduce burden, and improve the utility of the instruments; Pre-test 2 (field testing) will assess the performance of field study procedures, including but not limited to community entry, approaching households, and gaining consent. Through administering the questionnaire in the field test, we will assess willingness to participate, average interview length, ACASI administration, and technology performance. If changes are made to the instrument during either pre-test, a change request will be submitted for this ICR.


Survey Implementation Pilot

This request is for a pilot test of VACS in the U.S., which will be used to determine the feasibility of implementing the survey in a domestic context and serve as a foundation for future research on violence against children and youth in the U.S. In addition, this pilot data will be used to assess the associations between physical, emotional, and sexual violence and risk and protective factors for violence; examine associations among different forms of violence; identify the health and social consequences associated with violence; and assess the knowledge and use of medical, psychosocial, legal, and protective services available for children who have experienced sexual, emotional and physical violence in Baltimore. CDC plans to use the results of this survey to inform its violence surveillance, prevention and response efforts and refine practices related to the protection of children.


Analysis of Data

Pretests: Cognitive Interviewing and Field Test, and Rural Feasibility Pilot

The analytic approach for the Pre-test 1 cognitive interviewing study will involve qualitative assessment to provide understanding of respondents' interpretation of questions and answer options; assess whether respondents understand the meaning of terms and concepts; and assess questionnaire performance, including the length of the questionnaire, ordering of sections, and length of section introductions. The data analysis will follow the same approach used during previous rounds of cognitive interviewing for VACS questionnaires, conducted by the National Center for Health Statistics in the Philippines (Attachment M) and Malawi (Attachment N) and by Columbia University in Colombia (Attachment O). Analysis of interviews will follow a three-stage process involving data reduction and theory building (i.e., drawing conclusions). First, original interview text from every interview will be summarized into interview notes; summary notes will specify the way each respondent answers every survey question, including each respondent's interpretation of questions and key terms, the activities and experiences considered by respondents, and any response difficulties and errors. Next, systematic analysis across interviews will identify interpretive patterns, including patterns of response errors. Findings from this second level of analysis will explore the phenomena captured by each question and allow for the assessment of construct validity. The final stage of analysis will involve examination of themes by specific subgroups; identifying patterns occurring only within a particular subgroup (e.g., females or males) can signal potential bias in the resulting survey data.


The primary purpose of the Pre-test 2 field test is to practice data collection and questionnaire procedures. Descriptive and qualitative information will be compiled from the field test to assess the performance of the questionnaire, household entry procedures, and consent procedures. Qualitative information will assess willingness to participate, average interview length, question performance, ACASI performance, and other issues that arise during practice administration of the survey. Descriptive and qualitative information will be similarly assessed from the rural feasibility pilot in Garrett County, covering the performance of the questionnaire, household entry procedures, consent procedures, and other aspects of implementing the survey procedures and questionnaire. Descriptive analyses of the data and of associations between questionnaire variables will be used to assess question performance and inform scale development.


Survey Implementation Pilot

CDC will develop survey weights representative of the total youth population aged 13-24 years in Baltimore. CDC will apply a three-step weighting procedure: (Step 1) computation of a base weight for each sample respondent; (Step 2) adjustment of the base weights for differential nonresponse in the sample; and (Step 3) calibration of the adjusted weights to known population totals. CDC will produce a complete description of the findings, including weighted distributions of characteristics associated with sexual behavior and practices; physical, emotional, and sexual violence; and utilization of health care, counseling, and other services used by respondents, with 95% confidence intervals (using SAS 9.4). Charts and diagrams will be used to display the data.
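As a minimal sketch of the three-step weighting logic described above (the selection probability, response rate, and population total below are invented for illustration and are not values from the pilot design):

```python
# Illustrative sketch of the three-step weighting procedure.
# All numbers and cell definitions are hypothetical.

def base_weight(selection_prob: float) -> float:
    """Step 1: the base weight is the inverse of the selection probability."""
    return 1.0 / selection_prob

def nonresponse_adjust(weight: float, response_rate: float) -> float:
    """Step 2: inflate weights within a response class by the inverse of
    that class's response rate."""
    return weight / response_rate

def calibrate(weights, population_total: float):
    """Step 3: scale the adjusted weights so they sum to a known
    population total for the calibration cell."""
    factor = population_total / sum(weights)
    return [w * factor for w in weights]

# Example: three respondents in one cell, each sampled with probability
# 1 in 500, a 60% response rate in their class, and a known cell
# population of 2,400 youth.
ws = [nonresponse_adjust(base_weight(1 / 500), 0.60) for _ in range(3)]
ws = calibrate(ws, 2400)
print(round(sum(ws), 1))  # calibrated weights sum to the population total: 2400.0
```

In practice the base weights would reflect the multi-stage household sample design, and nonresponse classes and calibration cells would be defined from frame and census data; the sketch only shows how the three steps compose.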


This research study is for development purposes and is therefore descriptive. Data tabulations will be used to evaluate the results of questionnaire and methods testing. The information collected in this effort will not be the subject of population estimates or other statistics in CDC reports; results may only be published in the context of methodological research and associations among study variables, and not in the context of providing generalizable estimates of population parameters.


Data will be analyzed separately for males and females. Associations between experiences of violence and risk and protective factors for violence will be assessed, by sex. Results will be examined to empirically assess risk and protective factors at the individual, relationship, and community levels. Findings will be used to understand patterns of risk and protective factors for violence in Baltimore. In addition, associations among different forms of violence will be assessed, to advance understanding of patterns of poly-victimization among male and female youth in Baltimore. Finally, analyses will examine characteristics and risk behaviors associated with violence to identify populations at elevated risk for violence and other negative health outcomes.
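As a minimal, hypothetical illustration of the kind of sex-stratified association measure such analyses might produce (the counts and the use of an odds ratio with a Woolf-type confidence interval are assumptions for illustration, not the study's specified analysis plan):

```python
import math

def odds_ratio(a: int, b: int, c: int, d: int) -> float:
    """OR for a 2x2 table: a = risk factor present & violence reported,
    b = present & no violence, c = absent & violence, d = absent & no violence."""
    return (a * d) / (b * c)

def woolf_ci_95(a: int, b: int, c: int, d: int):
    """Approximate 95% CI for the OR on the log scale (Woolf method)."""
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    log_or = math.log(odds_ratio(a, b, c, d))
    return math.exp(log_or - 1.96 * se), math.exp(log_or + 1.96 * se)

# Illustrative female stratum: 40 of 100 youth with a given risk factor
# report violence vs. 20 of 100 without it.
or_f = odds_ratio(40, 60, 20, 80)      # about 2.67
lo, hi = woolf_ci_95(40, 60, 20, 80)
```

Repeating such a computation separately within male and female strata is one simple way to examine whether an association differs by sex before moving to weighted, model-based estimates.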


B.5. Individuals Consulted on Statistical Aspects and Individuals Collecting and/or Analyzing Data


Individuals who have participated in designing the data collection and plans for data analysis.

CDC Staff:

Leah Gilbert, MD, MSPH

Jeff Ratto, MPH

Laura Chiang, MA

Francis Annor, PhD, MPH

Howard Kress, PhD

Greta Massetti, MA, PhD

Liping Zhu, MPH

NORC Staff:

Bruce Taylor, PhD

Elizabeth Mumford, PhD

Shannon Nelson

Elizabeth Flanagan

Justine Bulgar-Medina, PhD

References/Endnotes

i U.S. Census Bureau. (2019). 2018 American Community Survey 5-Year Estimates, Table DP02. Retrieved from https://data.census.gov/cedsci/table?q=Baltimore%20County,%20Maryland%20Families %20and%20Living%20Arrangements&tid=ACSDP1Y2018.DP02&hidePreview=true

ii Nguyen, K.H., Kress, H., Villaveces, A., Massetti, G.M., (2019). Sampling Design and Methodology of the Violence Against Children and Youth Surveys. Injury Prevention, 25(4):321-327. https://doi.org/10.1136/injuryprev-2018-042916

iii Jansen, H.A.F.M., "Putting Women First" Ethical and Safety recommendations for Research on Violence against Women: Training in Research in Reproductive Health/Sexual Health, 2006, World Health Organization.

iv Draucker, C.B., The emotional impact of sexual violence research on participants. Arch Psychiatr Nurs, 1999. 13(4): p. 161-9.

v Jensen, T.K., et al., Reporting possible sexual abuse: a qualitative study on children's perspectives and the context for disclosure. Child Abuse Negl, 2005. 29(12): p. 1395-413.

vi McGonagle KA, Freedman VA. The Effects of a Delayed Incentive on Response Rates, Response Mode, Data Quality, and Sample Bias in a Nationally Representative Mixed Mode Study. Field methods. 2017;29(3):221‐237. doi:10.1177/1525822X16671701

viiMcGonagle KA, Freedman VA. The Effects of a Delayed Incentive on Response Rates, Response Mode, Data Quality, and Sample Bias in a Nationally Representative Mixed Mode Study. Field methods. 2017;29(3):221‐237. doi:10.1177/1525822X16671701

viii Edwards, P., Roberts, I., Clarke, M., DiGuiseppi, C., Pratap, S., Wentz, R., et al. (2002). Increasing response rates to postal questionnaires: Systematic review. British Medical Journal, 324.

ix Goritz, A. S. (2006). Incentives in web studies: Methodological issues and a review. International Journal of Internet Science, 1, 58–70.

x Singer, E., & Ye, C. (2013). The Use and Effects of Incentives in Surveys. The ANNALS of the American Academy of Political and Social Science, 645(1), 112–141. https://doi.org/10.1177/0002716212458082

xi Williams, D., & Brick, J. M. (2018). Trends in US face-to-face household survey nonresponse and level of effort. Journal of Survey Statistics and Methodology, 6(2), 186-211.

xii Felderer, B., Müller, G., Kreuter, F., & Winter, J. (2018). The Effect of Differential Incentives on Attrition Bias: Evidence from the PASS Wave 3 Incentive Experiment. Field Methods, 30(1), 56–69. https://doi.org/10.1177/1525822X17726206

xiii Kafka, T., Economos, C., Folta, S., and Sacheck, J. (2011). Children as Subjects in Nutrition Research: A Retrospective Look at Their Perceptions. Journal of Nutrition Education and Behavior. 43(2):103-109. https://doi.org/10.1016/j.jneb.2010.03.002

xiv Smith, K.A., Macias, K., Bui, K., and Betz, C. L. (2015). Brief Report: Adolescents' Reasons for Participating in a Health Care Transition Intervention Study. Journal of Pediatric Nursing. 30(5):165-171. https://doi.org/10.1016/j.pedn.2015.05.007

xv Krebs, C.P., Lindquist, C.H., Richards, A., Shook-Sa, B.E., Berzofsky, M., Peterson, K., Planty, M., Langton, L., & Stroop, J. (2016). The Impact of Survey Incentive Amounts on Response Rates and Estimates of Sexual Assault Victimization.


