
SUPPORTING STATEMENT FOR

Citizenship Integration Grant Program (CIGP) Program Evaluation

OMB Control No.: 1615-NEW

COLLECTION INSTRUMENT(S): G-1608

  • Implementation Evaluation Grant Recipient Staff Web Survey,

  • Implementation Evaluation Participant Web Survey,

  • Implementation Evaluation Grant Recipient Staff Virtual Interview,

  • Implementation Evaluation Participant Virtual Interview,

  • Outcome Evaluation Grant Recipient Staff Web Survey,

  • Outcome Evaluation Participant Web Survey,

  • Outcome Evaluation Grant Recipient Staff Virtual Interview, and

  • Outcome Evaluation Participant Virtual Interview


B. Collections of Information Employing Statistical Methods


The agency should be prepared to justify its decision not to use statistical methods in any case where such methods might reduce the burden or improve accuracy of results. When Item 17 on the Form OMB 83-I is checked "Yes", the following documentation should be included in the Supporting Statement to the extent that it applies to the methods proposed:


  1. Describe (including a numerical estimate) the potential respondent universe and any sampling or other respondent selection methods to be used. Data on the number of entities (e.g., establishments, State and local government units, households, or persons) in the universe covered by the collection and in the corresponding sample are to be provided in tabular form for the universe as a whole and for each of the strata in the proposed sample. Indicate expected response rates for the collection as a whole. If the collection had been conducted previously, include the actual response rate achieved during the last collection.


Surveys with Program Participants: Given the expected small universe of CARING participants and the anticipated low survey response rate, no sampling will be implemented for either the implementation or the outcome survey of CARING participants.


For the survey of CINAS participants, we will randomly sample from the administrative data universe of participants. Based on power analysis calculations, we will need 460 total completed surveys for each of the two participant surveys.1 Depending on the distribution of key participant characteristics, we may need to stratify by such variables as region (four groups), time in the United States (three groups), region of birth (four groups), age (three groups), and income (three groups). Given an expected 10% response rate and anticipated inaccuracies in contact information, 5,000 CINAS participants will need to be sampled to attain the 460 completed responses for each survey (implementation or outcome).
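The 460-complete target from the footnoted power analysis (f² = .03, α = .05, power = .80, 7 predictors) can be sanity-checked with a short script. The sketch below is illustrative only; the exact minimum varies with the software and the noncentrality convention used, so it lands near, not exactly on, the cited figure.

```python
from scipy.stats import f as f_dist, ncf

def min_n_for_regression(f2=0.03, predictors=7, alpha=0.05, target_power=0.80):
    """Smallest total N that achieves the target power for the overall
    F test in a linear regression, given a Cohen's f-squared effect size.
    Uses the convention noncentrality = f2 * N (Cohen, 1988)."""
    n = predictors + 2  # need at least one denominator degree of freedom
    while True:
        df1 = predictors
        df2 = n - predictors - 1
        nc = f2 * n  # noncentrality parameter
        crit = f_dist.ppf(1 - alpha, df1, df2)  # critical F value
        power = 1 - ncf.cdf(crit, df1, df2, nc)
        if power >= target_power:
            return n
        n += 1

# Lands in the neighborhood of the 460 completes cited in the footnote.
print(min_n_for_regression())
```

Because the required N scales roughly as the inverse of f², even a small change in the assumed effect size moves the target appreciably, which is why the oversampled frame of 5,000 provides a useful cushion.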


Surveys with Grant Staff: For the web surveys with grant recipient staff, we will invite two staff members from each organization (the program manager and a purposively selected key staff member named in the grant application), based on the list obtained from the desk reviews and grantee program contacts. This will provide diverse staff perspectives on each organization's program.


Interviews with Grant Staff: For virtual interviews with CINAS grant recipient staff, a purposive sample will be drawn based on region, a measure of organization size, time in the program, and the lawful permanent resident (LPR) populations served. Since there are only six CARING organizations, no sampling will be conducted for CARING staff.


Interviews with Program Participants: The interviews will cover received and preferred modes for outreach, perceived cultural competency of outreach activities, experiences with program application and enrollment, and suggestions for improvements to outreach.


Table 1. Sample size

|                             | Expected Universe             | Completed Interviews | Survey Response Rate | Completed Surveys |
|-----------------------------|-------------------------------|----------------------|----------------------|-------------------|
| IMPLEMENTATION EVALUATION   |                               |                      |                      |                   |
| Virtual Interviews          |                               |                      |                      |                   |
| CINAS Grant Recipient Staff | 104 organizations (208 staff) | 24 staff (may include up to two staff from the same organization) | | |
| CARING Grant Recipient Staff | 6 organizations (12 staff)   | 4 staff              |                      |                   |
| CINAS participants          | 20,800                        | 24                   |                      |                   |
| CARING participants         | 1,200                         | 24                   |                      |                   |
| Total                       |                               | 76                   |                      |                   |
| Web Surveys                 |                               |                      |                      |                   |
| CINAS Grant Recipient Staff | 104 organizations (208 staff) |                      | 50%                  | 104 staff         |
| CARING Grant Recipient Staff | 6 organizations (12 staff)   |                      | 50%                  | 6 staff           |
| CINAS Participants          | 5,000 sampling frame out of 20,800 |                 | 10%                  | 460               |
| CARING Participants         | 1,200                         |                      | 10%                  | 120               |
| Total                       |                               |                      |                      | 690               |
| OUTCOME EVALUATION          |                               |                      |                      |                   |
| Virtual Interviews          |                               |                      |                      |                   |
| CINAS participants          | 20,800                        | 24                   |                      |                   |
| CARING participants         | 1,200                         | 24                   |                      |                   |
| Total                       |                               | 48                   |                      |                   |
| Web Surveys                 |                               |                      |                      |                   |
| CINAS Participants          | 5,000 sampling frame out of 20,800 |                 | 10%                  | 460               |
| CARING Participants         | 1,200                         |                      | 10%                  | 120               |
| Total                       |                               |                      |                      | 580               |
| Virtual Interviews          |                               |                      |                      |                   |
| CINAS Grant Recipient Staff | 104 organizations (208 staff) | 24                   |                      |                   |
| CARING Grant Recipient Staff | 6 organizations (12 staff)   | 4                    |                      |                   |
| Total                       |                               | 28                   |                      |                   |
| Web Surveys                 |                               |                      |                      |                   |
| CINAS Grant Recipient Staff | 104 organizations (208 staff) |                      | 50%                  | 104 staff         |
| CARING Grant Recipient Staff | 6 organizations (12 staff)   |                      | 50%                  | 6 staff           |
| Total                       |                               |                      |                      | 110               |
| GRAND TOTAL                 |                               | 152                  |                      | 1,380             |

Note 1: Based on the grantees’ list for FY18–FY21, there are 104 CINAS grantees and six CARING grantees. Assuming each of the 104 CINAS grantees serves at least 200 individuals over the 2-year grant period, we expect about 20,800 CINAS participants. Each of the six CARING grantees is likewise assumed to serve about 200 participants over the 2-year grant period; thus we expect about 1,200 CARING participants.

Note 2: Program participants who completed the implementation evaluation surveys or interviews will be removed from the outcome evaluation data collection.
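The completed-survey figures in Table 1 follow from multiplying each contact frame by its expected response rate; a minimal arithmetic check, using the figures from the table, is:

```python
# Expected web-survey completes = contact frame size x expected response rate.
# Figures are taken from Table 1. Note that the CINAS participant target (460)
# sits below the raw expectation (5,000 x 10% = 500), leaving a cushion for
# bad contact information in the sampling frame.
frames = {
    "CINAS staff":         (208, 0.50),
    "CARING staff":        (12, 0.50),
    "CINAS participants":  (5_000, 0.10),
    "CARING participants": (1_200, 0.10),
}
expected = {group: round(n * rate) for group, (n, rate) in frames.items()}
print(expected)
# CINAS staff -> 104, CARING staff -> 6, CARING participants -> 120
```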


2. Describe the procedures for the collection of information including:


  • Statistical methodology for stratification and sample selection,

  • Estimation procedure,

  • Degree of accuracy needed for the purpose described in the justification,

  • Unusual problems requiring specialized sampling procedures, and

  • Any use of periodic (less frequently than annual) data collection cycles to reduce burden.


The statistical methodology for stratification and sample selection was described above. A nonresponse bias analysis will be conducted based on available administrative data on participant characteristics. Analytical weights will be developed to adjust for nonresponse and provide results that are generalizable to the population of program participants.
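The nonresponse adjustment described above can be sketched as a standard weighting-class adjustment; the cells, data layout, and column names below are hypothetical illustrations, not the contractor's actual specification.

```python
import pandas as pd

# Toy frame: one row per sampled participant, with an adjustment cell
# (e.g., region x age group) and a respondent flag. Cell labels are
# hypothetical; the actual cells would come from the administrative
# data on participant characteristics described above.
sample = pd.DataFrame({
    "cell":      ["A", "A", "A", "A", "B", "B", "B", "B"],
    "responded": [1,   1,   0,   0,   1,   0,   0,   0],
})

# Base weight for an equal-probability sample drawn from a frame of 5,000.
sample["base_wt"] = 5_000 / len(sample)

# Nonresponse adjustment: inflate respondents' weights by the inverse of
# the response rate within their cell, so weighted respondents represent
# the full sampled cell; nonrespondents receive zero weight.
resp_rate = sample.groupby("cell")["responded"].transform("mean")
sample["final_wt"] = sample["base_wt"] / resp_rate
sample.loc[sample["responded"] == 0, "final_wt"] = 0.0

# Respondents' final weights sum back to the frame total of 5,000.
print(sample.loc[sample["responded"] == 1, "final_wt"].sum())
```

In practice the cells would be chosen so that response propensity is roughly constant within each cell, which is what makes the adjusted estimates generalizable to the participant population.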


Program participants who completed the implementation evaluation surveys or interviews will be removed from the outcome evaluation data collection.


The contractor plans to collect the following data on grant recipients and program participants.


Implementation Evaluation

  • Virtual interviews with grant recipient staff: Staff will be interviewed on gaps in activities, fidelity and changes to implementation, reasons for changes to implementation, expected consequences of deviations, additional changes that are needed, reasons for dropout, activities to prevent dropout, cultural competency of outreach, and suggestions for improvements.

  • Web surveys with grant recipient staff: The survey will cover staff members’ cultural competency in implementing activities, gaps in activities, reasons for dropout, activities to prevent dropout, perceived influence of the grant on organizational capacity (e.g., changes in staff, budget, technology, partnerships, other factors), plans for changes in the scope and intensity of program implementation, and suggestions for improvements.

  • Web surveys with program participants: The surveys will gauge participants’ program awareness, received and preferred modes for outreach, reasons for participating, satisfaction with outreach and program, and suggestions for improvements to outreach.

  • Virtual interviews with program participants: The interview will cover received and preferred modes for outreach, perceived cultural competency of outreach activities, experiences with program application and enrollment, and suggestions for improvements to outreach.


Outcome Evaluation

  • Web surveys with program participants: Survey questions will include the number and types of activities attended, satisfaction with the program, perceived program effectiveness (for English proficiency, civics knowledge, employment, motivation to naturalize, naturalization), Net Promoter Score, acculturation (identity, language, behavioral participation, and social engagement in U.S. and native cultures), motivation to naturalize, integration, and suggestions for improvements.

  • Virtual interviews with program participants: The interviews will gauge perceived program effectiveness (for English proficiency, civics knowledge, employment, motivation to naturalize, naturalization), best practices, issues, gaps in services, and suggestions for improvements.

  • Web surveys with grant recipient staff: The survey questions will cover the perceived level of program success and the factors influencing it, activities, recruitment efforts, cultural competency, the impact of funding, challenges, recommendations to improve the program, and plans to continue with the program in the future.

  • Virtual interviews with grant recipient staff: The interviews will gather information on grantees’ monitoring procedures, challenges, resources, plans for future program participation, support, perceived success, and suggestions for the future.


Outreach/recruitment process for web surveys and interviews with program participants and grant recipient staff


The participant web survey for both implementation and outcome evaluations will take about 20 minutes to complete. The web survey data collection process will use the following materials:

  1. The official USCIS introduction e-mail to participants.

  2. The official introduction e-mail from grantees to their participants.

  3. The invitation e-mail from the contractor with the individualized link to the web survey.

  4. Up to eight reminder e-mails (two per week) from the contractor to non-respondents with the individualized link to the web survey.

  5. The reminder e-mail from grantees to their clients during the final week of data collection.

  6. The completion of the web survey.


The grantee web survey for both implementation and outcome evaluations will take about 20 minutes to complete. The web survey data collection process will use the following materials:

  1. The official USCIS introduction e-mail to participants.

  2. The invitation e-mail from the contractor with the individualized link to the web survey.

  3. Up to six reminder e-mails (two per week) from the contractor to non-respondents with the individualized link to the web survey.

  4. The reminder e-mail from USCIS to the grantees during the final week of data collection.

  5. The completion of the web survey.


Virtual interviews with grantees for the implementation and outcome evaluation will be semi-structured, which will allow for follow-up questions during the interviews. The grantee interviews will last about 45 minutes and involve the following recruitment approach:

  1. The official introduction e-mail from USCIS.

  2. E-mail from the contractor with the schedule and the consent form.

  3. Up to five reminder e-mails to schedule an interview.

  4. Up to five reminder phone calls.

  5. The completion of the web-based consent form prior to the interview.

  6. The completion of the phone or webinar interview.


The participant interviews for both implementation and outcome evaluations will last about 30 minutes and involve the following recruitment approach:

  1. The official introduction e-mail from USCIS.

  2. The official introduction e-mail from grantees to their clients.

  3. E-mail from the contractor with the schedule.

  4. Up to five reminder e-mails to schedule an interview.

  5. The completion of the web-based consent form prior to the interview.

  6. The completion of the phone or webinar interview.


3. Describe methods to maximize response rates and to deal with issues of non-response. The accuracy and reliability of information collected must be shown to be adequate for intended uses. For collections based on sampling, a special justification must be provided for any collection that will not yield "reliable" data that can be generalized to the universe studied.


The following data collection techniques will be used to maximize the survey response rate and minimize respondent burden.

  • Using an official USCIS introduction email to establish the legitimacy and the importance of the survey. The introduction will describe the study’s purpose and value, the privacy and confidentiality of responses, and their use in the analysis.

  • Ensuring that web surveys are easy to use, reduce burden, and can be accessed from a variety of devices.

  • Establishing a toll-free telephone number and email alias, in case of questions about the data collection process.

  • Using the tracking system to monitor in real time the status of survey completions, the disposition for each respondent contact, and unreachable cases with potentially outdated contact information.

  • Conducting multiple respondent reminder contacts at various times to recruit participants, as well as to obtain updated contact information or find an alternative point of contact.

  • Outlining the potential benefits that the survey offers to the program planning and future participants. The survey introduction, invitation, and reminder emails describe the importance of the data collected to improve the program.

  • Collaborating with USCIS to send email reminders to nonrespondents, if needed.


Program participants who completed the implementation evaluation surveys or interviews will be removed from the outcome evaluation data collection efforts.


The contractor will implement numerous contact attempts to improve the response rate and minimize response bias; otherwise, the sample would be systematically biased toward those who are easily contacted and recruited. The contractor will conduct up to eight contact attempts with non-respondents via email and targeted phone follow-ups. If no contact is made with a survey non-respondent after eight email and phone call attempts, the contractor will stop contacting the non-respondent. Reminders will also be stopped for any non-respondent who states that he or she chooses not to participate. The number of contact attempts is based on best practices for recruiting participants.


Optimal staff will use the tracking system to monitor the web-survey data collection activities and progress in real time to identify unreachable cases, to track follow-up contacts to obtain updated contact information or an additional point of contact, and to target phone follow-up contacts to particular types of non-respondents (e.g., those with partially completed surveys).


4. Describe any tests of procedures or methods to be undertaken. Testing is encouraged as an effective means of refining collections of information to minimize burden and improve utility. Tests must be approved if they call for answers to identical questions from 10 or more respondents. A proposed test or set of tests may be submitted for approval separately or in combination with the main collection of information.


After receiving feedback on the survey instruments from USCIS, Optimal pretested the online version of the web survey instruments, the virtual interview protocols, and the associated data collection materials in December 2023 with up to nine participants for each instrument to ensure reliability, validity, content coverage, context specificity, and cultural appropriateness. Given the culturally diverse population of respondents, it was critical to pretest the instruments with program participants or individuals with similar backgrounds (time in the U.S., English proficiency, countries of origin, etc.).


Optimal has conducted numerous pretests of survey instruments using a cognitive pretesting approach that integrates think-aloud and retrospective probing methods. This approach ensures that the survey instrument produces high-quality data, reduces burden, and protects respondents from unwarranted and inappropriate questions. Based on the pretesting results, the data collection materials and the web survey content, layout, and functionality were revised to incorporate participants’ feedback. The final versions of the instruments and data collection materials were submitted to USCIS for approval and further revised as needed. Optimal also submitted a report summarizing the pretesting results, the recommended changes and the rationale for each change made, and refinements to reduce burden and protect confidentiality.


5. Provide the name and telephone number of individuals consulted on statistical aspects of the design and the name of the agency unit, contractor(s), grantee(s), or other person(s) who will actually collect and/or analyze the information for the agency.


Oswaldo Urdapilleta, (301) 306-1170, Optimal Solutions Group, LLC

Andrey Vinokurov, (301) 306-1170, Optimal Solutions Group, LLC

Andrea Johnson, (301) 306-1170, Optimal Solutions Group, LLC

Peter Simmons, (301) 306-1170, Optimal Solutions Group, LLC

Wayne Pitts, (919) 541-6000, RTI International

Nicholas Thomas, (919) 541-6000, RTI International

Jacob Klerman, (301) 347-5000, Optimal Solutions Group, LLC




1 Power analysis for a general linear model, with a small effect size of .03 (f2), significance level of .05, power of .80, and 7 predictors.



