
SUPPORTING STATEMENT B


Independent Evaluation of the Systematic Alien Verification for Entitlements (SAVE) Program (OMB Control No. 1615-NEW)



B. Collection of Information Employing Statistical Methods

Introduction


We are proposing to use statistical methods for the evaluation of the Systematic Alien Verification for Entitlements (SAVE) program, as described in Sections B.1-B.3. Statistical methods will be used in two key components of the evaluation:

  1. Conducting the Web-based SAVE user agency survey (Web-based survey)

  2. Developing a sampling design for conducting user agency site visits


  1. Respondent Universe


The SAVE program currently consists of 1,136 user agencies. Of this total, 550 user agencies are considered “active agencies”, defined as agencies that processed at least one application in Calendar Year 2015. Accordingly, the respondent universe for the evaluation of the SAVE program will consist of the 550 active user agencies.


The Web-based survey will be distributed to all 550 active user agencies. No sampling methods will be used in the administration of the Web-based survey, as the survey will be distributed to all 550 agencies that comprise the respondent universe of active agencies. The Statement of Work (SOW) for the SAVE evaluation specifies a target response rate of 80% among the active user agencies (i.e., approximately 440 of the 550 agencies).


In the SOW for the evaluation, USCIS required that the site visit sample consist of a purposive sample of 25 to 50 active SAVE user agencies, balanced among the following volume of use categories: fewer than 100 applications per year; 100 to 1,000 applications per year; and greater than 1,000 applications per year. In addition to volume of use, USCIS requested that the user agencies selected for site visits vary by geographic region, the type of technology used to access SAVE (web access or web services access), and the type of benefit they provide. In accordance with these requirements, the evaluation contractor constructed a stratified, purposive sample of 40 user agencies, a size that aligns with the SOW's recommended range of 25 to 50 agencies while capturing the major variations within the population and allowing for rigorous qualitative research. This non-probability-based purposive sample uses stratification methods to help capture the diversity among user agencies. The 40 SAVE user agencies in the sample will be visited by the evaluation contractor. The one-day site visits focus on obtaining an improved understanding of the implementation and use of SAVE within user agencies. Along with the Web-based survey, these one-day site visits will comprise the majority of data collection.


1.1 Sampling Frame


A sampling frame for the Web survey will not be utilized, as the survey will be distributed to all 550 active user agencies.


The sampling frame for the site visits will consist of all 550 active user agencies. The SAVE program provided a full list of the 550 active SAVE user agencies along with an array of nine variables for each user agency in the list. These variables provided the evaluation contractor with the requisite information needed to construct the sampling strata as well as other pertinent, secondary variables that would be considered in the construction of the sample. Variables included were:

  1. User Agency Name (used to identify agencies by their formal name)

  2. User Agency ID (serves as a unique identifier)

  3. HLQ Group Name (used to identify agencies by type)

  4. User Agency’s Annual Volume of Use (i.e., number of initial verifications) in CY2015 (Requested Categories: Fewer than 100 Cases, 100-1,000 Cases, Greater than 1,000 Cases)

  5. User Agency Type (Requested Categories: Local Government, State Government, DMV, Federal Government)

  6. User Agency’s State Code and Geographic Region (Requested Categories: Midwest, Northeast, South, West)

  7. User Agency’s Benefit Category (Requested Categories: Multiple Benefits, Professional/Commercial Licenses, Health & Social Services, Labor, Education, Driver’s License, Badging Agencies/Background Investigation, Miscellaneous, Voter Registration, Tax Exemption, Housing)

  8. User Agency’s Method of Accessing SAVE (Requested Categories: Web Access, Web Service Access)

  9. Number of Years that the User Agency has Participated in SAVE


1.2 Sample Design and Sample Size


The Web survey will be distributed to the universe of active user agencies and will not require the use of a sample.


The sample for the site visits is a purposive stratified sample of 40 SAVE user agencies designed to be representative of the 550 active SAVE user agencies. The sampling design aligns with USCIS's SOW and the SAVE program's needs for the evaluation, which require that the site visit sample be stratified along five strata: Agency's Annual Volume of Use, Agency Type, Geographic Region, Benefit Category, and Agency's Method for Accessing SAVE.


Exhibit 1 provides an overview of the number and percentage of SAVE user agencies that comprise each stratum in the sample frame and the stratified sample. The second column provides the percentage of all user agencies within each stratum, based on an analysis of the sample frame listing all active SAVE user agencies provided by USCIS. The third column provides the number and percentage of active SAVE user agencies represented within each stratum, based on the final sample of 40 user agencies.


The purposive sample was developed using an iterative, multi-step process that consisted of randomly selecting user agencies to reflect, as closely as possible, the figures provided in Exhibit 1’s ‘Percentage of User Agencies within the Sample Frame’ column. In the first step, a random sample of 40 user agencies stratified by the ‘Benefit Category’ stratum was drawn from the sampling frame of active user agencies.


In the second step, a detailed review of the sample was conducted to ascertain the degree to which the sample reflected the population of active user agencies along each of the five strata. This review allowed the evaluation contractor to ascertain whether various types of agencies were under- or over-represented within the sample. In the final step, smaller stratified random samples were drawn to replace user agencies that were over-sampled among multiple strata. This process was repeated until the evaluation contractor identified a purposive stratified sample that was largely reflective of the five strata. The figures presented in Exhibit 1's 'Number and Percentage of User Agencies within the Purposive Stratified Sample' column depict the final sample counts and percentage of agencies within each stratum.
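
The logic of this iterative process can be sketched programmatically. Below is a minimal, illustrative Python sketch, assuming the sampling frame is loaded as a pandas DataFrame with one row per active user agency; the column names, function names, and stopping rule are hypothetical and are not drawn from the contractor's actual procedures.

```python
# Illustrative sketch only: a simplified version of the iterative stratified
# selection process described above. The DataFrame "frame" and its column
# names are hypothetical stand-ins for the USCIS-provided sampling frame.
import pandas as pd

STRATA = ["volume_of_use", "agency_type", "geographic_region",
          "benefit_category", "access_method"]
SAMPLE_SIZE = 40

def draw_initial_sample(frame, seed=0):
    """Step 1: random sample of ~40 agencies stratified by benefit category,
    with allocations proportional to the sampling frame."""
    shares = frame["benefit_category"].value_counts(normalize=True)
    alloc = (shares * SAMPLE_SIZE).round().clip(lower=1).astype(int)
    parts = [frame[frame["benefit_category"] == cat]
             .sample(n=min(n, (frame["benefit_category"] == cat).sum()),
                     random_state=seed)
             for cat, n in alloc.items()]
    sample = pd.concat(parts)
    # Rounding can allocate slightly more than 40; trim at random if over.
    return sample.sample(n=min(SAMPLE_SIZE, len(sample)), random_state=seed)

def representation_gaps(frame, sample):
    """Step 2: sample-minus-frame proportion for every stratum category,
    to flag under- and over-represented agency types."""
    gaps = {}
    for stratum in STRATA:
        frame_p = frame[stratum].value_counts(normalize=True)
        sample_p = (sample[stratum].value_counts(normalize=True)
                    .reindex(frame_p.index, fill_value=0.0))
        gaps[stratum] = (sample_p - frame_p).sort_values()
    return gaps

def rebalance(frame, sample, max_iter=50, seed=0):
    """Step 3: repeatedly swap an agency out of the most over-represented
    cell for a replacement drawn from the most under-represented cell."""
    for i in range(max_iter):
        gaps = representation_gaps(frame, sample)
        over = max(((s, g.idxmax(), g.max()) for s, g in gaps.items()),
                   key=lambda t: t[2])
        under = min(((s, g.idxmin(), g.min()) for s, g in gaps.items()),
                    key=lambda t: t[2])
        if over[2] <= 1.0 / SAMPLE_SIZE:  # within one agency of the target
            break
        pool = frame[(frame[under[0]] == under[1])
                     & ~frame.index.isin(sample.index)]
        if pool.empty:
            break
        drop = sample[sample[over[0]] == over[1]].sample(1, random_state=seed + i)
        sample = pd.concat([sample.drop(drop.index),
                            pool.sample(1, random_state=seed + i)])
    return sample
```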


The final purposive sample captures the diversity among user agencies across the five sampling strata, though with a few notable differences. Across the agency type stratum, the sample under-represents local government agencies while providing greater representation of DMVs and federal and state government agencies. Within the volume of use stratum, agencies with greater than 1,000 cases are over-sampled relative to agencies with fewer than 100 cases and agencies with 100-1,000 cases. Agencies in the Southeast region are under-sampled, while agencies in the Central, West, and Northeast regions are slightly over-sampled.


Across the benefit category stratum, the sample effectively captures the diversity of the types of benefits provided by user agencies. Agencies providing driver's licenses are slightly over-sampled, while user agencies providing multiple benefits and labor benefits are slightly under-sampled. Finally, the sample over-samples user agencies that utilize web service access methods while under-sampling agencies that use web access methods. Despite these notable differences, the sample aligns with the SOW's recommendation that the evaluation contractor construct a purposive stratified sample of 25 to 50 user agencies that captures major variations within the population and allows for rigorous qualitative research.





Exhibit 1. Overview of the Stratified Purposive Sample of SAVE User Agencies by Stratum

Variables                                     Percentage of User Agencies   Number and Percentage of User Agencies
                                              within the Sample Frame       within the Purposive Stratified Sample

Agency Type Stratum
  Local Government                            39.3%                         9 (22.5%)
  State Government                            47.3%                         21 (52.5%)
  DMV                                         8.6%                          6 (15.0%)
  Federal Government                          4.8%                          4 (10.0%)

Agency's Annual Volume of Use Stratum
  Greater than 1,000 cases                    30.5%                         20 (50.0%)
  100-1,000 cases                             18.5%                         7 (17.5%)
  Fewer than 100 cases                        51.0%                         13 (32.5%)

Geographic Region Stratum
  Southeast                                   51.7%                         8 (20.0%)
  West                                        6.6%                          6 (15.0%)
  Central                                     28.8%                         16 (40.0%)
  Northeast                                   12.8%                         10 (25.0%)

Benefit Category Stratum
  Multiple Benefits                           43.5%                         14 (35.0%)
  Professional/Commercial License             19.6%                         7 (17.5%)
  Health & Social Services                    6.2%                          3 (7.5%)
  Labor                                       9.4%                          2 (5.0%)
  Education                                   4.8%                          3 (7.5%)
  Driver's License                            7.9%                          5 (12.5%)
  Badging Agencies/Background Investigation   5.0%                          2 (5.0%)
  Miscellaneous                               1.3%                          1 (2.5%)
  Voter Registration                          0.6%                          1 (2.5%)
  Tax Exemption                               1.3%                          1 (2.5%)
  Housing                                     0.6%                          1 (2.5%)

Agency's Method for Accessing SAVE Stratum
  Web Access                                  90.5%                         32 (80.0%)
  Web Service Access                          9.5%                          8 (20.0%)
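
The following minimal Python sketch illustrates the arithmetic underlying Exhibit 1, using the volume of use stratum as an example: each stratum's sample counts must sum to the 40 sampled agencies, and each printed percentage equals the count divided by 40. The dictionary transcribes figures from Exhibit 1; everything else is illustrative.

```python
# Consistency check for one stratum of Exhibit 1 (illustrative only).
volume_of_use = {
    "Greater than 1,000 cases": 20,
    "100-1,000 cases": 7,
    "Fewer than 100 cases": 13,
}
assert sum(volume_of_use.values()) == 40  # counts must sum to the 40 sampled agencies
for label, count in volume_of_use.items():
    print(f"{label}: {count} ({count / 40:.1%})")  # e.g., "100-1,000 cases: 7 (17.5%)"
```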


  2. Procedures for the Collection of Information


Web-based Survey: Collection of Information

To collect the requisite data via the Web-based survey (see SAVE Evaluation Web Survey), the evaluation contractor will implement a multi-step process. Utilizing the list of active user agencies provided by USCIS, the evaluation contractor will work with USCIS to acquire current contact information for all active agencies, including names, mailing addresses, email addresses, and telephone numbers. In the next step, the evaluation contractor will distribute a USCIS recruitment letter by mail (see Attachment A-1 for the letter to agencies selected for the Web-based Survey only, and Attachment B-1 for the letter to agencies selected for the Web-based Survey and Site Visit Protocols Questionnaire), and by email with the USCIS letter attached (see Attachment A-2 for the email to agencies selected for the Web-based Survey only, and Attachment B-3 for the email to agencies selected for the Web-based Survey and Site Visit Protocols Questionnaire), to staff identified for each active agency prior to the official release of the Web-based survey. The letters will describe the purpose of the survey and provide instructions on completing the survey on the Web. The letters will be on USCIS's letterhead and will include a message of support and the signature of a USCIS administrative official. In the next step, the evaluation contractor will release email notices announcing the availability of the SAVE user agency survey, with access and log-in instructions unique to the staff identified for each active user agency (see Attachment A-3). In the fourth step, the evaluation contractor will release follow-up email notices to non-respondents. Three follow-up emails will be released at 7 days, 14 days, and 21 days following the official release of the SAVE user agency survey (see Attachment A-4). After the 21-day follow-up period, telephone reminder calls will be made to non-respondents, and staff will be encouraged to complete the Web-based survey (see Attachment A-5).
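
The follow-up timeline described above can be expressed mechanically. The following is a minimal Python sketch, assuming only the 7-, 14-, and 21-day email offsets and the subsequent telephone stage stated in the text; the function and variable names, and the example release date, are hypothetical.

```python
# Illustrative sketch of the non-respondent follow-up timeline described above.
from datetime import date, timedelta

EMAIL_OFFSETS_DAYS = (7, 14, 21)   # follow-up emails after the survey release
PHONE_STAGE_DAYS = 21              # reminder calls begin after this window

def followup_schedule(release_date):
    """Return the due date for each contact attempt after the survey release."""
    schedule = {f"followup_email_{i + 1}": release_date + timedelta(days=offset)
                for i, offset in enumerate(EMAIL_OFFSETS_DAYS)}
    schedule["phone_reminders_begin"] = release_date + timedelta(days=PHONE_STAGE_DAYS + 1)
    return schedule

# Example usage with a hypothetical release date.
print(followup_schedule(date(2017, 3, 1)))
```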


User Agency Site Visit Protocols Questionnaire: Collection of Information

The evaluation contractor team will work with USCIS's staff to recruit the 40 agencies identified in the stratified purposive sample and prepare for the site visits. The recruitment process will include initial outreach via a mailed USCIS recruitment letter (see Attachment B-1) and a study flyer, and via email (see Attachment B-3) with the USCIS letter (see Attachment B-2) and study flyer attached. The recruitment letter will be on USCIS's letterhead, and the study flyer specifies the purpose of the site visit, how the agency was selected, how the data will be used, the benefits of participation, and a point of contact (POC) for acquiring additional information; it also notifies the site that the evaluation contractor will provide additional communication and details.


For non-responders, follow-up efforts will be made via telephone calls, with up to three attempts at 14 days, 21 days, and 28 days following the delivery of the recruitment letter/email (see SAVE Evaluation Site Visit Protocols Questionnaire). If the follow-up telephone calls are unsuccessful, the evaluation contractor will notify USCIS of the non-responsive user agency. Follow-up telephone calls to responders will consist of a site visit recruitment and scheduling call to the agency point of contact (POC) to schedule the site visit, and a site visit preparation call to determine who will be interviewed during the visit (see SAVE Evaluation Site Visit Protocols Questionnaire).


The evaluation contractor will train all researchers prior to the site visits. This will ensure that the researchers share a common understanding of the purpose of the site visits, are well versed in conducting semi-structured interviews and systematically examining SAVE business processes and collecting key documents, and know how to respond to issues or problems that may arise during the visits. The evaluation contractor will also designate a team member as a point person for all site-visit teams to coordinate regular check-ins and develop solutions to unexpected issues that come up during the site visits. Importantly, the evaluation contractor cannot pretest the site visit protocols questionnaire prior to the official data collection period. However, the contractor team will consider the first one or two site visits to be trial runs. They will meet after each visit to discuss and make any needed refinements to the site visit protocols questionnaire.


The site visits will follow a detailed plan that identifies the business processes and interactions to be observed during the visit, the individuals to be interviewed, and a list of requisite documents to be collected during the visit (see SAVE Evaluation Site Visit Protocols Questionnaire). On arrival, the research teams will meet with agency leadership and provide a list of the documents to be collected prior to the completion of the site visit. The first key component of the site visit is documenting the site's SAVE business processes (via system demonstrations and job shadowing). The second key component of the site visit is staff interviews. The teams of researchers will conduct individual, semi-structured interviews with agency employees from each of the three categories of SAVE users. This process will ensure that the diverse perspectives of the super users, supervisors, and users within the SAVE user agency are represented.


Immediately following the completion of a site visit, all researchers will assemble their observation notes, interview transcripts, and supporting documents and prepare the collected data for rigorous qualitative analysis.

  3. Methods to Maximize Response Rates and Deal with Issues of Non-Response


To minimize nonresponse, the evaluation contractor will devote resources to developing and implementing approaches likely to achieve a high level of cooperation from the user agencies.

We expect high levels of cooperation with the evaluation among user agencies as they all have signed a Memorandum of Agreement (MOA) with the Department of Homeland Security (DHS), and have agreed to respond to DHS and Social Security Administration (SSA) designees’ inquiries about the SAVE program. Specifically, the MOA states the user agency’s responsibilities as follows:


(e) Allow SAVE Monitoring and Compliance to conduct desk audits and/or site visits to review User Agency’s compliance with this MOA and all other SAVE-related policy, procedures, guidance and law applicable to conducting verification and safeguarding, maintaining, and disclosing any data provided or received pursuant to this MOA;


(f) Allow SAVE Monitoring and Compliance to perform audits of User Agency’s User Ids use and access, SAVE Training Records, SAVE financial records, SAVE biographical information, system profiles and usage patterns and other relevant data;


(g) Allow SAVE Monitoring and Compliance to interview any and all User Agency SAVE system users and any and all contact persons or other personnel within the User Agency regarding any and all questions or problems which may arise in connection with the User Agency’s participation in SAVE;


The evaluation contractor will provide the draft site visit protocols questionnaire to USCIS for review and will revise the protocols questionnaire based on USCIS’s feedback. The interview protocols questionnaire will be pretested with a sample of five to nine user agencies, and will be revised based on the results of the pretest. Final versions of the protocols questionnaire will be submitted to USCIS.


The evaluation contractor will use extensive follow-up procedures to increase response rates for both the Web survey and site visits. A recruitment letter sent by mail and email (letter attached) will be distributed to staff identified for each active agency prior to the official release of the Web-based survey. Following the email announcement of the SAVE user agency survey, the evaluation contractor will release follow-up email notices to non-respondents. Three follow-up emails will be released at 7 days, 14 days, and 21 days following the official release of the SAVE user agency web survey. After the 21-day follow-up period, telephone reminder calls will be made to non-respondents, and staff will be encouraged to complete the Web-based survey. In addition, the evaluation contractor will provide staff with an option to complete the survey via phone.


The recruitment process for the site visits will include initial outreach from USCIS via an email and a mailed letter. For non-responders, follow-up efforts will be made via telephone calls with three attempts scheduled at 14 days, 21 days, and 28 days following the delivery of the recruitment letter/email.


  4. Tests of Procedures for Refining Data Collections


The evaluation plan has been informed by a series of interviews and meetings with USCIS and the SAVE program's staff. In coordination with the USCIS Research and Evaluation Division (RED) and the SAVE program's staff, the evaluation contractor will develop a draft Web survey that will be pretested with a selected group of nine SAVE users. Findings from this pretest will be used to guide the refinement and finalization of the survey. RED and SAVE staff will be involved throughout the entire development process, including question development, review and approval of pretest processes, development of interview protocols, review of analytic results, and recommendations for survey revisions. Further, as the survey is programmed for Web administration, RED and SAVE staff will provide formal input regarding the Web interface and the survey's look and feel. The final survey will be submitted to USCIS for approval. USCIS will submit the final survey to OMB.


  5. Individuals Consulted on Statistical Aspects and Individuals Collecting and/or Analyzing Data


The following statisticians were consulted on the statistical aspects of the design and analysis of the current study:


Dallas J. Elgin, Ph.D.

Research Associate

IMPAQ International

202-774-1996


The following individuals will collect and/or analyze data for the current study:

Rocco Russo, Ph.D.

Managing Director

IMPAQ International

202-774-1994


Susan Berkowitz, Ph.D.

Director of Qualitative Research, Principal Research Associate

IMPAQ International

202-774-1943


Dallas J. Elgin, Ph.D.

Research Associate

IMPAQ International

202-774-1996


Jessica Smith, MPP

Senior Research Analyst

IMPAQ International

202-774-1988


Anna Yego, MA

Senior Research Analyst

IMPAQ International

443-259-5173


