
Corporation for National and Community Service (dba AmeriCorps)

Evaluation of Public Health AmeriCorps

OMB Control Number 3045-NEW

Collections of Information Employing Statistical Methods – Part B Supporting Statement


  1. Description of Universe and Sample

  1.1. Universe and sample

The universe is the population of Public Health AmeriCorps grantees and members. The first cohort of 82 grantees was funded in fiscal year 2022; the second cohort of 26 grantees was funded in fiscal year 2023. During this period, 6 grantees from the first cohort relinquished their grants, reducing the total population of grantees to 102. From the program's inception in 2022 through January 2024, 812 members completed their term of service and 794 members exited early. Between February 2024 and January 2025, AmeriCorps estimates that 1,270 members will complete their term of service and 684 will exit without completing it (a 35% attrition rate based on existing attrition data from Public Health AmeriCorps and other national service programs). Table 1 shows the respondent universe for the survey instruments.


Table 1. Respondent Universe and Effective Sample Size

| Entity               | Universe (JUL 2022*–JAN 2024) | Projected (FEB 2024–JAN 2025) | Total Universe | Effective sample size (based on projected response rate) |
|----------------------|-------------------------------|-------------------------------|----------------|----------------------------------------------------------|
| Grantees             | 102                           | --                            | 102            | 82                                                       |
| Alumni               | 812                           | 1,270                         | 2,082          | 1,562                                                    |
| Early exit members** | 397                           | 342                           | 739            | 443                                                      |

* July 2022 is the program's inception.

** Some early exit members may reapply to serve with Public Health AmeriCorps or another national service program, making them ineligible as early exit members for the current information collection; we also assume their contact information will be less reliable. A conservative estimate is that about 50% of early exit members will be reachable, and, based on prior studies, the response rate among those reached is anticipated to be lower than that of alumni.


  1.2. Response rate


Based on previous studies with AmeriCorps members, the projected response rate for the proposed information collection with grantees is 80 percent. A recent survey administered to the first cohort of 82 Public Health AmeriCorps grantees under Fast Track approval garnered an 83 percent response rate. Since 2016, the AmeriCorps Office of Research and Evaluation (ORE) has administered the Member Exit Survey (MES) to exiting members, whether they exit upon completing their service or exit early. The MES response rate for each AmeriCorps program is over 80 percent. In recent information collections, the agency garnered a 75 percent response rate among members serving with AmeriCorps NCCC, 60 percent among early exit members from AmeriCorps NCCC, and over 80 percent among AmeriCorps Seniors volunteers.


  2. Statistical Methodology

Describe the procedures for the collection of information including: Statistical methodology for stratification and sample selection, Estimation procedure, Degree of accuracy needed for the purpose described in the justification, Unusual problems requiring specialized sampling procedures, and Any use of periodic (less frequent than annual) data collection cycles to reduce burden.


  2.1. Stratification


JBS will invite the entire population of 102 grantees to participate in the study. The universe for the Member Alumni Survey is all Public Health AmeriCorps alumni, and we will invite all alumni to participate. The universe for the Early Exit Survey is all Public Health AmeriCorps members who ceased their service before the end of their service term and have not re-enrolled in Public Health AmeriCorps or another AmeriCorps program ("Early Exits"). The proposed frequency for the data collection is three time points between June 2024 and March 2026. The rationale for a census rather than a sample is that the sample size must be sufficient to allow for subgroup analysis based on the type of public health intervention, the characteristics of the communities served, and the socioeconomic and demographic characteristics of the alumni.

  2.2. Recruitment


Public Health AmeriCorps program staff will introduce the study to grantees and members through the program's member newsletter and other routine communications. Appendix A includes the sample communication messages about the study. Upon exit (whether the member is completing their service or exiting early), ORE will introduce the study to members and invite them to participate. AmeriCorps will provide JBS with a list of Public Health AmeriCorps alumni and early exit members that includes full name, NSAID, NSPID, email address, phone number, grantee ID where each member served, grantee name and address, host site information (site ID and address) if available, and the month and year service was completed (or the month and year of exit for early exit members).

At the beginning of Phases 1 and 2 of the survey administration, Public Health AmeriCorps will email the alumni and early exit members requesting their participation in the evaluation. JBS will follow up with an email to each participant introducing the study. The email message will summarize the study, request participants' consent, and assure participants that their participation is voluntary and confidential. It will include a link to the informed consent form and survey, whereby the individual will indicate whether they want to participate in the study. If the individual agrees to participate, they will be routed to the online survey. If the individual declines, their response will be recorded and there will be no further contact. JBS will send up to two email reminders and make up to two follow-up phone calls to those who do not complete the informed consent form.

The proposed strategy for maximizing enrollment into the study and reaching the target sample size includes regular follow-ups with each participant; equally important is regular communication with grantees and members before and during the data collection. The channels of communication will include email announcements originating from Public Health AmeriCorps. JBS staff will monitor a toll-free hotline and an email address to respond to participants' queries about the study, providing live, friendly responses to incoming phone calls. The hotline will also accept voicemail, which staff will monitor every weekday, returning calls as needed.

JBS will select a random sample of 15 grantees to participate in one of three focus groups: member recruitment and retention, member training and support, and partnerships. Each focus group will consist of up to five participants. JBS will select a sample of four partners to participate in interviews addressing the evaluation questions on the successes and challenges of partnering with local or state health departments. The grantees whose partners are sampled will be asked to introduce the study to the partner; the JBS research team will then contact the partner by email and phone to schedule the interview.

We will conduct remote interviews by phone or video with five early exit members in each of the first two phases of data collection (10 interviews in total). In selecting the sample, we will consider the length of the project's intervention and the length of the member's service relative to it; for example, some members leave within the first two months while others leave just a few weeks shy of completing the term of service. This selection process will capture a broad range of member experiences and the myriad reasons for the decision to end service. Gaining cooperation from early exit members for a survey and interview may be challenging, so a flexible data collection approach may be needed with this group. JBS will contact sampled early exit members via email and phone to obtain their consent and schedule the interview.


  2.3. Analysis


The alumni and early exit surveys cover a range of topics, including social, economic, and demographic characteristics, motivation to serve, and the service experience. The service experience includes training, mentoring, and support to prepare for a public health career. Additionally, the surveys explore the public health competencies alumni learned during their service, public health and non-public health employment outcomes following service, and post-service education. The demographic variables include age, gender, living arrangement, parental education and employment, prior volunteer experience, education, prior employment history, and area of residence. The primary objective of the outcome evaluation is to determine the extent to which alumni have been able to secure employment in a public health career post-service.

To explore whether the Public Health AmeriCorps program leads members to enter public health careers, the binary outcome is whether an alumnus enters a public health job, and the corresponding metric is the proportion of Public Health AmeriCorps alumni entering public health jobs. From the survey data, JBS will measure the proportion of alumni who continue their education or certification in public health knowledge areas. Lastly, JBS will measure where members served (communities) and where they were placed (i.e., state or local health departments) to calculate the proportion of members who were recruited from and served in the same community. To compare members' job placement areas with nationally available results, JBS will use other datasets or subsets of datasets. We will also use the early exit survey data as a comparison to alumni.


Descriptive Analysis. The descriptive analysis is exploratory: it is intended to describe and summarize the survey and administrative data. This analysis will comprehensively characterize alumni and early exit members, identifying the barriers and facilitators members encounter when entering Public Health AmeriCorps service. It will also describe service activities, how members interact with public health entities, and the types and levels of public health competencies, knowledge, and skills members develop through training and service. JBS will use frequencies, percentages, proportions, and cross-tabulations for these analyses and will create relevant data visualizations to illustrate the exploratory findings.
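As an illustrative sketch of these tabulations, assuming the survey responses sit in a pandas DataFrame (the file name and all column names here, such as entered_ph_job coded 0/1, are hypothetical placeholders for the actual codebook variables):

```python
import pandas as pd

# Hypothetical file and column names; the actual survey variables will differ.
alumni = pd.read_csv("alumni_survey.csv")

# Primary outcome metric: proportion of alumni entering public health jobs.
prop_ph_jobs = alumni["entered_ph_job"].mean()  # binary 0/1 indicator
print(f"Proportion entering public health jobs: {prop_ph_jobs:.1%}")

# Frequencies and percentages for a demographic variable.
print(alumni["age_group"].value_counts(normalize=True).mul(100).round(1))

# Cross-tabulation: public health employment by prior volunteer experience,
# shown as row percentages.
xtab = pd.crosstab(alumni["prior_volunteer"], alumni["entered_ph_job"],
                   normalize="index").mul(100).round(1)
print(xtab)
```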


Inferential Analysis. JBS plans to use regression as the primary statistical model for the outcome evaluation questions. The aim of the analyses is to examine the association between the socio-demographic characteristics of alumni and their public health competencies, the professional support and skill-building provided by grantees, and alumni's career search during and after the term of service.

JBS plans to investigate whether entering a public health career is associated with socio-demographic factors, prior public health experience, motivation to serve, challenges and barriers faced during service, grantee-provided training, CDC TRAIN, satisfaction with the service, and perceived goal achievement. We will use an odds ratio analysis to determine the likelihood of entering public health careers. We anticipate merging the member survey with the grantee survey to explore the association between grantee characteristics, implementation successes and challenges, and member outcomes.
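A minimal sketch of the planned odds ratio analysis, using logistic regression in statsmodels (file and covariate names are hypothetical; the actual predictors come from the survey codebook):

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical file and covariate names.
alumni = pd.read_csv("alumni_survey.csv")
logit_model = smf.logit(
    "entered_ph_job ~ C(age_group) + prior_ph_experience + grantee_training"
    " + cdc_train + service_satisfaction + goal_achievement",
    data=alumni,
).fit()

# Exponentiated coefficients are odds ratios; values above 1 indicate higher
# odds of entering a public health job, holding the other covariates constant.
print(np.exp(logit_model.params))
print(np.exp(logit_model.conf_int()))  # 95% confidence intervals
```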

To assess the core competencies of public health, JBS will perform factor analysis for each of the three constructs used in the survey (cultural competency skills; community dimensions of practice and leadership skills; and policy development and program skills and public health science skills). We will estimate construct validity using principal factor analysis and build multiple regression models using the composite score for each construct to identify the key factors influencing alumni's core public health competencies. Through a logistic regression model, we will determine which competencies affect the likelihood of members entering public health jobs. Additionally, JBS will perform Partial Least Squares Structural Equation Modeling (PLS-SEM), a multivariate technique for analyzing relationships among latent variables, to estimate which factors affect members' public health competencies, knowledge, and skills. The goal of PLS-SEM is to estimate the strength and direction of the relationships between latent variables and to identify the key drivers of the observed outcomes.
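The factor analysis step might look like the following sketch, using the factor_analyzer package with principal factor extraction (file and item names are hypothetical; the PLS-SEM path models would be estimated in dedicated software, which is not shown here):

```python
import pandas as pd
from factor_analyzer import FactorAnalyzer

# Hypothetical item columns for one construct (e.g., cultural competency skills).
alumni = pd.read_csv("alumni_survey.csv")
cc_items = alumni[[f"cc_item{i}" for i in range(1, 6)]].dropna()

# Principal factor extraction for a single-construct measurement model.
fa = FactorAnalyzer(n_factors=1, method="principal", rotation=None)
fa.fit(cc_items)
print(pd.Series(fa.loadings_.ravel(), index=cc_items.columns))  # item loadings

# Composite score for the construct, used later as a regression predictor.
alumni.loc[cc_items.index, "cultural_competency"] = cc_items.mean(axis=1)
```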


Early Exit Data Analysis. In addition to the quantitative analysis, the outcome evaluation will include a qualitative analysis of the early exit semi-structured interviews. By analyzing multiple data sources to answer the core research questions, JBS will be able to triangulate the data across instruments.

Benchmarking Analysis. JBS plans to use benchmarking analysis to compare Public Health AmeriCorps' program impacts with nationally available, comparable datasets and, when necessary, subsets of datasets. Multiple benchmark datasets will be used to compare a single metric. To evaluate the impact of participation in Public Health AmeriCorps on members' likelihood of entering the public health workforce and participating in civic engagement and national service, JBS will construct subgroups from the benchmark datasets that are similar to Public Health AmeriCorps members in socio-economic characteristics, education, and training. JBS will calculate the mean, median, and standard deviation (and, if required, other measures of central tendency and dispersion) for every metric in both the Public Health AmeriCorps data and the benchmark data, then compare the results for each impact metric in tables and visualizations. This analysis will identify whether Public Health AmeriCorps' performance is above or below the benchmark levels and allow JBS to pinpoint the gaps. If necessary, JBS will conduct a root cause analysis to identify the reasons for performance above or below the benchmarks.
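A compact sketch of the side-by-side comparison (the benchmark file and metric names are hypothetical; the actual comparison datasets and matched subsets will be selected during the analysis):

```python
import pandas as pd

# Hypothetical extracts: PHA survey data and a benchmark subset already matched
# on socio-economic characteristics, education, and training.
alumni = pd.read_csv("alumni_survey.csv")
benchmark = pd.read_csv("benchmark_subset.csv")

metrics = ["entered_ph_job", "civic_engagement_score"]  # example impact metrics
summary = pd.concat(
    {
        "PHA": alumni[metrics].agg(["mean", "median", "std"]),
        "Benchmark": benchmark[metrics].agg(["mean", "median", "std"]),
    },
    axis=1,
)
print(summary)  # side-by-side central tendency and dispersion for each metric
```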


  2.4. Survey administration


All three surveys (grantees, alumni, early exit members) will be available online and by phone. Mode effects will be analyzed to ensure that differences in modality do not affect study results. For the follow-up survey, JBS will email each alumnus and early exit member who completed the first survey. JBS will send three reminder emails to all participants for online surveys and call up to three times to administer the survey by phone.
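One simple check for mode effects on a categorical outcome is a chi-square test of independence between administration mode and the response; a sketch (the survey_mode field and outcome variable are hypothetical):

```python
import pandas as pd
from scipy.stats import chi2_contingency

# Hypothetical 'survey_mode' field ("online" or "phone") recorded at administration.
alumni = pd.read_csv("alumni_survey.csv")
contingency = pd.crosstab(alumni["survey_mode"], alumni["entered_ph_job"])
chi2, p_value, dof, expected = chi2_contingency(contingency)
print(f"chi2={chi2:.2f}, p={p_value:.3f}")  # a large p suggests no detectable mode effect
```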


  2.5. Non-response weights

If necessary, JBS staff will calculate a non-response weight adjustment. The evaluator will compute survey weights that account for differential non-response for each round of data collection. Follow-up survey weights will account for the propensity to participate in both the first survey and the follow-up survey. The follow-up survey weights will be used in analyzing the combined baseline and follow-up data. Analyses using the weights will be representative of the participants who completed the first survey. JBS research staff will evaluate whether it is necessary to make weight adjustments for additional missing data arising from the matching process.
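A common way to implement such an adjustment is an inverse-propensity weight estimated from frame-level covariates; the sketch below assumes a DataFrame covering the full invitation list, with hypothetical file and covariate names:

```python
import pandas as pd
import statsmodels.formula.api as smf

# 'responded' is 1 if the member completed the survey; the covariates are
# hypothetical administrative variables available for the full frame.
frame = pd.read_csv("invitation_frame.csv")
propensity = smf.logit(
    "responded ~ C(age_group) + C(gender) + C(region) + C(exit_type)",
    data=frame,
).fit()
frame["p_respond"] = propensity.predict(frame)

# Inverse-propensity non-response weights for respondents, normalized so the
# weights sum to the number of respondents.
resp = frame[frame["responded"] == 1].copy()
resp["nr_weight"] = 1.0 / resp["p_respond"]
resp["nr_weight"] *= len(resp) / resp["nr_weight"].sum()
```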

  2.6. Estimation procedure and degree of accuracy


The estimation procedure consists of descriptive statistics and multivariate regression analysis as the primary statistical models for the outcome evaluation questions, as described in the Outcome Evaluation section under Research Questions (pages 7-8). We will test for independence of observations, since there will be multiple members within grantees. Table 2 shows the minimum detectable effect for a range of sample sizes, assuming a linear multiple regression model with 15 predictors and a two-tailed t-test on a single regression coefficient. The effective sample size of 800 alumni respondents will detect an effect size of about 0.016; a sample size of 200 early exit respondents will detect an effect size of about 0.065.

Table 2. Minimum detectable effect, linear multiple regression model
Power = 0.95, alpha = 0.05, 15 predictors in the model, two-tailed t-test

| Sample size | Effect size |
|-------------|-------------|
| 200         | 0.065       |
| 300         | 0.044       |
| 400         | 0.033       |
| 500         | 0.026       |
| 600         | 0.022       |
| 700         | 0.019       |
| 800         | 0.016       |
| 900         | 0.014       |
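The values in Table 2 can be reproduced, to rounding, by solving for the noncentrality parameter of a noncentral t distribution and converting it to Cohen's f² for a single coefficient; a sketch under that interpretation of the effect size:

```python
from scipy.optimize import brentq
from scipy.stats import nct, t

def mde_f2(n, n_predictors=15, alpha=0.05, power=0.95):
    """Minimum detectable Cohen's f^2 for a two-tailed t-test on a single
    coefficient in a linear regression with n_predictors predictors."""
    df = n - n_predictors - 1
    t_crit = t.ppf(1 - alpha / 2, df)
    # Solve for the noncentrality parameter that yields the target power.
    delta = brentq(lambda d: (1 - nct.cdf(t_crit, df, d)) - power, 0.5, 10.0)
    return delta**2 / n  # f^2 = delta^2 / n

for n in (200, 400, 800):
    print(n, round(mde_f2(n), 3))  # approx. 0.066, 0.033, 0.016 (cf. Table 2)
```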



AmeriCorps may disaggregate the alumni respondents into subgroups, for example, by the characteristics of the communities served. Table 3 shows the minimum detectable effect size (MDES) for the mean difference between two groups. Suppose group 1 includes 300 members whose service focused on communities with food insecurity and group 2 includes 200 members whose service focused on communities with housing insecurity, for a total sample size of 500; the MDES for the mean difference between the two groups is 0.329.

Table 3. Minimum detectable effect size, mean difference between two groups
Power = 0.95, alpha = 0.05, two-tailed t-test; sample size of group 2 relative to group 1 = 0.667

| Total sample size (both groups) | Effect size |
|---------------------------------|-------------|
| 300                             | 0.426       |
| 350                             | 0.394       |
| 400                             | 0.369       |
| 450                             | 0.348       |
| 500                             | 0.329       |
| 550                             | 0.314       |
| 600                             | 0.301       |
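The two-group values can be checked the same way, converting the noncentrality parameter to Cohen's d under the unequal allocation; a sketch:

```python
from scipy.optimize import brentq
from scipy.stats import nct, t

def mdes_two_groups(n_total, ratio=0.667, alpha=0.05, power=0.95):
    """Minimum detectable Cohen's d for a two-sample, two-tailed t-test where
    group 2 is `ratio` times the size of group 1."""
    n1 = n_total / (1 + ratio)
    n2 = n_total - n1
    df = n_total - 2
    t_crit = t.ppf(1 - alpha / 2, df)
    delta = brentq(lambda d: (1 - nct.cdf(t_crit, df, d)) - power, 0.5, 10.0)
    return delta / (n1 * n2 / (n1 + n2)) ** 0.5  # d = delta / sqrt(n1*n2/n)

print(round(mdes_two_groups(500), 3))  # approx. 0.33, matching Table 3 to rounding
```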



  2.7. Unusual problems requiring specialized sampling procedures

There are no unusual problems requiring specialized sampling procedures, since AmeriCorps will contact the entire universe of grantees, alumni, and early exit members.

  2.8. Any use of periodic (less frequent than annual) data collection cycles to reduce burden


The data collection with grantees will occur once. The data collection with alumni and early exit members will occur twice. The pre/post data collection with members will strengthen the evidence on employment outcomes and public health careers post-service. Securing public health employment may take time, as some alumni may continue with higher education; it may also take longer because of external barriers, such as general labor market conditions, governmental public health hiring cycles, and inelastic opportunities in target communities across the public and private health sectors. The longitudinal design allows a longer time frame for measuring alumni's employment outcomes.


  3. Statistical Reliability

Describe methods to maximize response rates and to deal with issues of non-response. The accuracy and reliability of information collected must be shown to be adequate for intended uses. For collections based on sampling, a special justification must be provided for any collection that will not yield "reliable" data that can be generalized to the universe studied.


  3.1. Maximize response rate

The proposed strategy for maximizing enrollment into the study and reaching the target sample size includes regular follow-ups with each participant; an equally important component is regular communication with grantees and members before and during the data collection. The channels of communication will include email announcements originating from Public Health AmeriCorps. JBS staff will monitor a toll-free hotline and an email address to respond to participants' queries about the study, providing live, friendly responses to incoming phone calls. The hotline will also accept voicemail, which staff will monitor every weekday, returning calls as needed.



Specific strategies used to maximize the response rate include:

  1. Targeted communications – Member newsletters, routine email communication, grantee meetings. These communications are designed to encourage participation and reduce nonresponse.

  2. Using multiple reminders and multiple modes to contact and follow up with respondents – JBS will use multiple communication approaches to encourage participation and maximize responses:

      1. An online survey is the primary administration mode. JBS will use Alchemer (an online survey tool) to send up to three email reminders to non-respondents. JBS research staff will contact the most difficult-to-reach non-respondents by phone; participants contacted by phone will have the option to complete the survey on the phone.

      2. Participants will be sent up to three reminder emails (for online surveys) or called up to three times (for phone surveys) at each time point.

  3. Incentive offers – The use of incentives positively affects response rates with no adverse effects on reliability. Participation in the study is voluntary; however, respondents will likely perceive a time cost and burden associated with their participation, so incentives are particularly important for increasing the response rate.

  3.2. Addressing issues of non-response


Even with the most aggressive and comprehensive enrollment efforts, there is a possibility of nonresponse. Possible reasons for nonresponse are that an eligible respondent does not complete the survey, or that incorrect or outdated contact information leaves the research team unable to contact the respondent. If necessary, JBS staff will calculate a non-response weight adjustment that accounts for differential non-response.

We anticipate that item nonresponse will be minimal and will result primarily from early termination of the survey. Such cases will be classified as incomplete but included, or as insufficient responses. The evaluator will assess whether partial survey completion and answers to survey items differ significantly by survey modality (online or phone).

For phone or online surveys, respondents may decline to answer some questions, and some data will be missing in the final analysis dataset. If any item response rate is less than 70%, the evaluator will conduct an item nonresponse analysis to determine if the data are missing at random at the item level.
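Flagging items below that threshold is straightforward once the analysis dataset is assembled; a sketch (the file name is hypothetical, with one column per survey item):

```python
import pandas as pd

# Hypothetical final analysis dataset of item responses (one column per item).
analysis_df = pd.read_csv("analysis_dataset.csv")

item_response_rate = analysis_df.notna().mean()   # share answered, per item
low_items = item_response_rate[item_response_rate < 0.70]
print(low_items.sort_values())                    # flagged for nonresponse analysis
```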

  4. Tests of Procedures

Describe any tests of procedures or methods to be undertaken. Testing is encouraged as an effective means of refining collections of information to minimize burden and improve utility. Tests must be approved if they call for answers to identical questions from 10 or more respondents. A proposed test or set of tests may be submitted for approval separately or in combination with the main collection of information.


JBS piloted the Member Alumni Survey with current and former Public Health AmeriCorps members from September 12 to November 22, 2023. JBS conducted two rounds of pilot testing and revised the survey after each round. Six respondents completed the survey in the first round, and six separate respondents completed a revised version of the survey in the second round. After the pilot participants completed the survey online, each participated in a cognitive interview with two JBS researchers. JBS staff interviewed five of the six respondents from the first round, and five of the six respondents from the second round.

The interviews included a content review focusing on the completeness of survey instructions and questions; clarity of instructions and questions; sequence of questions; relevance of questions for the target audience; use of language that the target audience understands; use of appropriate and relevant terms; and completion time.


Respondents' feedback showed that the majority of survey instructions, navigation, questions, and response options were clear and interpreted as intended, and that the survey questions were relevant to the Public Health AmeriCorps experience. However, a few respondents stated that certain response items in the section "Skills/Knowledge Gained" could be rephrased for clarity. JBS researchers also rephrased certain questions on "Current Employment" to make them clearer and to ensure respondents understood them as intended, including one question that asked respondents about paid employment. When we asked for respondents' job titles, we found that some of the examples we provided leaned heavily toward the public health field or nonprofit sector; based on respondents' feedback, we revised the question so all respondents would feel comfortable providing their true job title even if it does not seem related to public health or the nonprofit sector.



  5. Statistical Consultation

Provide the name and telephone number of individuals consulted on statistical aspects of the design and the name of the agency unit, contractor(s), grantee(s), or other person(s) who will actually collect and/or analyze the information for the agency.


The following individuals were consulted regarding the statistical methodology: Annie Georges, Ph.D., JBS International, Inc. (Phone: 650-373-4938); Belayeth Hussain, Ph.D., JBS International, Inc. (Phone: 650-373-4977); Neal Kar, MPH, JBS International, Inc. (Phone: 240-645-4614).


Technical Working Group (TWG) members (alphabetical by last name):

1. Kellan Baker, PhD, MPH, MA, Whitman Walker Institute, [email protected]

2. Sherry Glied, PhD, MA, Robert F. Wagner Graduate School of Public Service at New York University, [email protected]

3. Carrie Henning-Smith, PhD, MPH, MSW, Division of Health Policy & Management at University of Minnesota’s School of Public Health, [email protected]

4. Heather Krasna, PhD, Mailman School of Public Health at Columbia University Irving Medical Center, [email protected]

5. Jonathan (JP) Leider, PhD, Division of Health Policy & Management at University of Minnesota’s School of Public Health, [email protected]

6. Emily Zimmerman, PhD, MS, MPH, Department of Family Medicine & Population Health at Virginia Commonwealth University’s School of Medicine, [email protected]


Field Working Group (FWG) members (alphabetical by last name):

1. Desiree (Des) Culpitt, South Central Region Portfolio Manager at ORO

2. Sarah Foster, Program Impact Specialist at ASN

3. Veta Hurst, Program Manager at the Office of DEIA

4. Caroline Ledlie, Southeast Region Senior Portfolio Manager at ORO

5. Deborah (Debbie) Richardson, North Central Region Senior Portfolio Manager at ORO

6. Elizabeth Rose, Midwest Region Portfolio Manager at ORO


The data will be collected under contract to JBS International. Analysis of the data will be conducted by AmeriCorps Office of Research and Evaluation and JBS International.


References


Beckler, D., & Ott, K. (2006). Indirect monetary incentives with a complex agricultural establishment survey. ASA Section on Survey Research Methods, 2741-2748. Retrieved from https://www.amstat.org/sections/srms/proceedings/y2006/Files/JSM2006-000059.pdf

Brick, J. M., Montaquila, J., Hagedorn, M. C., Roth, S. B., & Chapman, C. (2005). Implications for RDD design from an incentive experiment. Journal of Official Statistics, 21(4), 571-589.

Church, A. H. (1993). Estimating the effect of incentives on mail survey response rates: A meta-analysis. Public Opinion Quarterly, 57(1), 62-79.

Dillman, D. A. (2000). Mail and Internet surveys: The tailored design method (2nd ed.). New York, NY: John Wiley and Sons.

Edwards, P., Roberts, I., Clarke, M., DiGuiseppi, C., Pratap, S., Wentz, R., & Kwan, I. (2002). Increasing response rates to postal questionnaires: Systematic review. British Medical Journal, 324(7347), 1183-1191.

Jäckle, A., & Lynn, P. (2008). Respondent incentives in a multi-mode panel survey: Cumulative effects on nonresponse and bias. Survey Methodology, 34(1), 105-117.

James, J. M., & Bolstein, R. (1990). The effect of monetary incentives and follow-up mailings on the response rate and response quality in mail surveys. Public Opinion Quarterly, 54(3), 346-361.

Shettle, C., & Mooney, G. (1999). Monetary incentives in U.S. government surveys. Journal of Official Statistics, 15(2), 231-250.

Singer, E., & Ye, C. (2013). The use and effects of incentives in surveys. The ANNALS of the American Academy of Political and Social Science, 645, 112-141. DOI: 10.1177/0002716212458082. Retrieved from http://ann.sagepub.com/content/645/1/112

Singer, E., Van Hoewyk, J., & Maher, M. P. (2000). Experiments with incentives in telephone surveys. Public Opinion Quarterly, 64(2), 171-188.

Zhang, F. (2010). Incentive experiments: NSF experiences (Working Paper SRS 11-200). Arlington, VA: National Science Foundation, Division of Science Resources Statistics.
