Evaluation of the Family Advocacy Program’s Domestic Violence Awareness and Child Abuse Prevention Campaigns

OMB: 0704-0679


SUPPORTING STATEMENT – PART B

B.  COLLECTIONS OF INFORMATION EMPLOYING STATISTICAL METHODS

1.  Description of the Activity

Respondents will include military service members and their partners who are 18 years of age or older, are associated with a military installation selected for the project, and are literate in English. These respondents are being selected because they are the target audience of the Domestic Violence Awareness (DVA) and Child Abuse Prevention (CAP) campaigns, and their responses can therefore provide the best insight into the impact of the campaigns. In partnership with OSD-level Family Advocacy Program (FAP) staff, 3-5 installations from each Service will be selected. To conduct the planned analyses with sufficient statistical power and to allow for expected attrition, 150 respondents will be recruited from each of the 16 installations, for a total of 2,400 potential respondents. Respondents will be recruited through various materials, including print materials and social media. The sample will be a convenience sample; previous research has found response rates of about 8-12% (see Air Force Community Action Surveys). Because we expect recruitment to be challenging and we want to decrease attrition across the four waves of data collection, we will provide a monetary token of appreciation.
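As a quick arithmetic check of these recruitment targets, the sketch below (Python; the figures are taken from this section, and the 8-12% response rates are the historical rates cited above, not guarantees) shows the total recruitment goal and the approximate outreach implied by those response rates.

import math

installations = 16
recruited_per_installation = 150
recruitment_goal = installations * recruited_per_installation
print(recruitment_goal)  # 2,400 potential respondents

# Approximate number of people the recruitment materials would need
# to reach, given the 8-12% response rates cited above:
for rate in (0.08, 0.12):
    print(f"At {rate:.0%} response: ~{math.ceil(recruitment_goal / rate):,} reached")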


In addition, FAP staff will be invited to participate in program improvement conversations to provide insight into successes and challenges with campaign implementation.

2.  Procedures for the Collection of Information

a.  Statistical methodologies for stratification and sample selection;

Since prevention efforts are year-long, with new themes launched in April and October, we are proposing two methods of analysis: regression discontinuity and multi-level modeling (i.e., growth curve analysis). Regression discontinuity (RD) methodology in evaluation designs is a pre-test/post-test program-comparison group strategy. We will therefore examine within-installation change beginning two months after the roll-out of each campaign's new theme. The evaluation design is structured to assess whether patterns in the target outcome metrics change after the new themes roll out, that is, whether there is a statistically significant difference in outcome metrics between the pre-test month and each of the three post-tests. This approach will allow the evaluation to assess the impact of improving awareness of reporting options and resources in the month immediately following the launch of the new themes. The design also enables post-intervention outcomes (immediately after the October DVA theme launch and the April CAP theme launch, respectively) to be compared with pre-intervention results.
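As an illustration, the sketch below (Python with statsmodels; all data and variable names are hypothetical) fits the kind of pre/post segmented regression that underlies an RD-on-time design, estimating a level shift and slope change at the theme launch. It is a minimal sketch of the analytic idea, not the project's actual analysis code.

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
# Months relative to the new-theme launch (negative = pre-launch).
month = np.arange(-3, 4)
post = (month >= 0).astype(int)
# Simulated awareness outcome: a level shift at launch plus noise.
awareness = 3.0 + 0.05 * month + 0.6 * post + rng.normal(0, 0.1, month.size)
df = pd.DataFrame({"month": month, "post": post, "awareness": awareness})

# 'post' captures the discontinuity (level shift) at launch;
# 'month:post' captures any change in slope after launch.
fit = smf.ols("awareness ~ month + post + month:post", data=df).fit()
print(fit.params)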


A level of “implementation robustness” will be assigned to each installation based on conversations with FAP staff and on findings from the site visits and online/social media examinations. This will allow change to be measured at the group level and will assist in determining whether implementation robustness is linked to the degree of change seen between the pre- and post-tests.

b.  Estimation procedures;

To determine the number of respondents needed, we conducted a power analysis. A power analysis estimates the smallest sample size needed for an experiment, given a required significance level, statistical power, and effect size. Based on the power analyses and a potential attrition rate of 48%, recruiting 150 participants will allow for the retention of the 78 participants needed to provide 80% power to detect small treatment effects. We used this level of estimation to prepare for the worst-case scenario; however, we anticipate retaining a higher number of respondents across the four waves of data collection.
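The sketch below (Python with statsmodels) reproduces this arithmetic under stated assumptions: the document does not specify the statistical test or effect size, so a paired-samples t-test with a small effect of roughly d = 0.32 is assumed here purely for illustration.

import math
from statsmodels.stats.power import TTestPower

# Smallest final-wave sample for 80% power at alpha = .05, assuming a
# paired-samples t-test and a small effect (d = 0.32 is an assumption).
n_final = TTestPower().solve_power(effect_size=0.32, alpha=0.05, power=0.80)
print(math.ceil(n_final))  # roughly 78, matching the statement

# Inflate the 78 retained participants for the 48% worst-case attrition:
n_retained, attrition = 78, 0.48
print(math.ceil(n_retained / (1 - attrition)))  # 150 recruited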

c.  Degree of accuracy needed for the purpose discussed in the justification;

The evaluation is structured to assess whether the target outcome metrics show change in awareness of the DVA and CAP campaigns, respectively, from baseline (pre-test) through three post-tests, with the baseline collected in August/September for the year-long campaign. The new theme for DVA is launched in October, and outreach efforts using that theme will run year-long. The four data collection points for outcome metrics are shown in Table 1. Collecting four data points related to DVA enables us to measure the impact of the campaign and outreach efforts over time and whether the change in awareness of resources and/or the information is continuous. Moreover, four data points allow for a rigorous set of data analyses employing a multi-level modeling tool known as growth curve analysis. Growth curve analysis summarizes longitudinal data into a smooth curve defined by relatively few parameters for further inquiry. This technique is especially useful in measuring intra-individual change over time, and it can also be used to measure change at the group level (e.g., installations with low versus high implementation robustness or quality). The new theme launch for CAP month will be examined with the regression discontinuity design.
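For concreteness, the sketch below (Python with statsmodels; all data, column names, and effect sizes are simulated placeholders) fits a growth curve model of the kind described: a mixed-effects model with a random intercept and slope per respondent, and a fixed wave-by-robustness interaction testing whether growth differs between low- and high-robustness installations.

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n_resp, n_waves = 100, 4
df = pd.DataFrame({
    "respondent_id": np.repeat(np.arange(n_resp), n_waves),
    "wave": np.tile(np.arange(n_waves), n_resp),
    "robustness": np.repeat(rng.integers(0, 2, n_resp), n_waves),  # 0 = low, 1 = high
})
person_intercept = np.repeat(rng.normal(3.0, 0.5, n_resp), n_waves)
df["awareness"] = (person_intercept + 0.2 * df["wave"]
                   + 0.1 * df["wave"] * df["robustness"]
                   + rng.normal(0, 0.3, len(df)))

# Random intercept and slope for each respondent; the wave:robustness
# fixed effect tests whether growth depends on implementation robustness.
model = smf.mixedlm("awareness ~ wave * robustness", df,
                    groups=df["respondent_id"], re_formula="~wave")
print(model.fit().summary())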


In addition to the four data collection points, Clearinghouse will conduct conversations with FAP staff and will collect implementation metrics throughout the campaign year (Table 1).


Table 1.  Data Collection Type and Timeline

Sept 2023: Pre-test (DVA items)

Oct 2023: New theme launch (DVA)

Nov 2023: Post-test 1 (DVA and CAP items)

Feb 2024: Post-test 2 for DVA and pre-test for CAP (DVA and CAP items)

Apr 2024: New theme launch (CAP)

May 2024: Post-test 3 (DVA and CAP items)

May/June 2024: Program improvement conversations with FAP staff

Sept 2023-May 2024: Implementation metrics collected by Clearinghouse research staff


d.  Unusual problems requiring specialized sampling procedures; and

There are no unusual problems expected.

e.  Use of periodic or cyclical data collections to reduce respondent burden.

Respondents will participate in four waves of data collection. To reduce respondent burden, we have limited the demographic items to the first wave and have limited the number of items asked at each wave of data collection.


3.  Maximization of Response Rates, Non-response, and Reliability

Response rates will be maximized using several strategies:

  1. Pennsylvania State University (PSU) has taken painstaking care to minimize the number of questions asked at each wave to reduce respondent burden. This strategy will improve the likelihood that respondents are retained across the four waves of data collection.

  2. Incentives that increase in amount with each wave will be provided after completion of each survey to increase the likelihood of completion.

  3. In cases of non-response, up to three email or text reminders per respondent will be utilized to promote survey completion.

  4. Recruitment methods will be flexible and will be revised as needed. Feedback will be obtained from OSD partners to ensure creative and effective methods are utilized.

To ensure the accuracy and reliability of the data, survey responses will be entered directly by respondents, eliminating data-entry errors. Parameters will be put in place within the online survey platform, Qualtrics, to ensure that data are entered as intended. For example, response validation and requirements will be added to ensure that an email address is entered correctly. For multi-select questions, custom validation will be added to ensure that respondents select the required number of answer choices.

The sampling procedure put forth allows for the generalization of the findings. Moreover, monitoring campaign implementation will enable an examination of respondents' knowledge gain based on campaign "intensity."

4.  Tests of Procedures

Pre- and post-tests were developed through a rigorous process with input from OSD staff. The survey items underwent an iterative feedback process to ensure that they aligned with campaign content and imposed minimal respondent burden. Items were drawn and adapted from similar past studies that assessed internal consistency and, in some cases, validity.

Questions for the program improvement conversations with FAP staff were also developed with input from OSD staff, and the number of items was limited to reduce participant burden.

5.  Statistical Consultation and Information Analysis

a. Provide names and telephone number of individual(s) consulted on statistical aspects of the design.

Daniel F. Perkins, Ph.D., Pennsylvania State University, [email protected], 814-867-4182

Cameron Richardson, Ph.D., Pennsylvania State University, [email protected], 814-865-7416

b. Provide names and organization of person(s) who will actually collect and analyze the collected information.

Daniel F. Perkins, Ph.D., Pennsylvania State University, [email protected], 814-867-4182

Kristen Lawton, M.S., Pennsylvania State University, [email protected], 814-863-4524

Kellie Forziat Pytel, Ph.D., LPC, NCC, ACS, Pennsylvania State University, [email protected], 814-865-0796


