SUPPORTING STATEMENT FOR
Independent Evaluation of the Systematic Alien Verification for
Entitlements (SAVE) Program
OMB Control No.: 1615-NEW
COLLECTION INSTRUMENT(S): G-1503
A. Justification
1. Explain the circumstances that make the collection of information necessary. Identify any legal or administrative requirements that necessitate the collection. Attach a copy of the appropriate section of each statute and regulation mandating or authorizing the collection of information.
The Department of Homeland Security (DHS) requests clearance from the Office of Management and Budget (OMB) for a new information collection supporting an evaluation of the Systematic Alien Verification for Entitlements (SAVE) program. The United States Citizenship and Immigration Services’ (USCIS) Research and Evaluation Division (RED) commissioned this USCIS SAVE evaluation in response to the FY 2015 OMB Passback guidance to DHS: “Guidance requests that the FY 2015 program funding level for SAVE be set at a level that supports an independent evaluation of the SAVE program.”1
The SAVE program is a voluntary program that assists participating federal, state, and local government agencies with verifying the immigration status of benefit applicants. The program is fee-funded and was established after Congress passed the Immigration Reform and Control Act (IRCA) of 1986 under the legacy Immigration and Naturalization Service (INS)2, which required the creation of a verification system to confirm the immigration status of individuals applying for specific federally funded benefits. The mission of the SAVE program is to provide timely, customer-centric immigration status information to authorized agencies to help them maintain the integrity of their programs.
The SAVE program has expanded into a nationwide program that conducts immigration status verifications. More recently, the REAL ID Act of 2005 and the Affordable Care Act of 2010 have increased the number of SAVE user agencies: these laws require departments of motor vehicles and health insurance exchanges to verify an applicant’s immigration status before issuing licenses/IDs and health insurance. Some states (such as Georgia) also mandate the universal use of SAVE to receive any public benefit.3 SAVE currently has 1,136 user agencies.4 Of this total, 550 are considered “active agencies,” defined as agencies that process at least one application per year. The highest-volume user agencies (as measured by initial verifications) are the Centers for Medicare and Medicaid Services (35.9%), the Social Security Administration (17.9%), the California Department of Health Services (7.4%), the Texas Department of Public Safety (4.3%), and the California Department of Motor Vehicles (4.3%).5
The SAVE program has limited insight into the day-to-day workings of SAVE operations at user agencies. The evaluation will explore how the SAVE program operates in the field and will provide an improved understanding of the factors that influence variation in implementation. As a result, the study will identify what is working well, what is not working well, and recommendations for improving the SAVE program. The key evaluation tasks are: (1) developing a logic model, (2) fielding a Web-based survey of all active user agencies, (3) conducting agency site visits, and (4) conducting USCIS case-record reviews and interviewing USCIS staff.6
2. Indicate how, by whom, and for what purpose the information is to be used. Except for a new collection, indicate the actual use the agency has made of the information received from the current collection.
All information for the evaluation of the SAVE program will be collected by the DHS USCIS contractor IMPAQ International, LLC. USCIS will use the information to assess the effectiveness of the implementation of the SAVE program. The evaluation will cover the 2015 and 2016 SAVE program years.
The primary purpose of the SAVE evaluation is to analyze how the program is operated by user agencies and the experience of individuals attempting to correct their records in response to SAVE-issued information.7 To these ends, the evaluation will focus on answering six primary research questions:
User Agency Implementation
What business processes have agencies established to use SAVE?
How do agencies train staff to use SAVE?
How do SAVE users perceive SAVE’s value to their agencies?
How do Web services interface with SAVE’s processes?
Benefit Applicants’ Experience
How do user agency staff members interact with benefit applicants in the verification process?
When benefit applicants directly or indirectly interact with the SAVE program to correct their records, what is the case-resolution process?
Collectively, these six research questions will provide a comprehensive evaluation of the SAVE program that documents how user agencies implement SAVE and the experience of benefit applicants who interact with the program. By answering these questions, the evaluation will detail which aspects of the program are working well and which are not, along with any recommended enhancements to SAVE.
The evaluation will collect data from several points of the program as listed below to provide a comprehensive study of its operations.
Web Survey
The SAVE user agency survey will be a census of all active SAVE user agencies. It will be administered to approximately 550 active user agencies (defined as user agencies that process a minimum of one application request per year) to capture program information across the first five research questions. The Web survey contains closed-ended questions on topics that include frequency of use and system interface, user roles and responsibilities, ease of use of the system, user training experiences, usefulness of SAVE help resources, interaction with USCIS staff, the benefit applicant experience, and the perceived value of SAVE (see SAVE Evaluation Web Survey).
Site Visits
The evaluation contractor will also conduct 40 one-day site visits with active SAVE user agencies to examine the first five research questions in greater depth; some aspects of these questions cannot be fully addressed through the closed-ended survey questions alone. The site visits will allow for open-ended interview questions, a detailed examination of business processes, the collection of screenshots and other agency SAVE documents, and a better understanding of the day-to-day challenges of SAVE use in the field. Each site visit will include a system demonstration and interviews, tailored to each site, with the agency’s SAVE program stakeholders – for example, super users, SAVE supervisors, SAVE users, and relevant agency leadership. The evaluation contractor will also schedule a preparatory phone call with each agency to gather basic facts about its SAVE operations and to identify individuals to interview on-site (see SAVE Evaluation Site Visit Protocols Questionnaire).
Administrative USCIS Case Records
The SAVE user agency’s role ends once the verification process is complete. Therefore, the user agency has limited insight into the experience of benefit applicants who try to contact the federal government to correct their immigration records. This data collection will capture information for research question six. The evaluation contractor will review USCIS administrative case records and conduct 20 interviews with USCIS federal staff. This task will document the processes by which benefit applicants work to correct their immigration records, and will identify challenges and opportunities to improve these case-resolution operations. These data will enable the evaluation contractor to probe deeper into the benefit applicant’s experience and identify opportunities to improve SAVE program practices or processes.
The data collected through the aforementioned methods will be used to develop a report that summarizes results, and identifies findings and recommendations about the program.
3. Describe whether, and to what extent, the collection of information involves the use of automated, electronic, mechanical, or other technological collection techniques or other forms of information technology, e.g., permitting electronic submission of responses, and the basis for the decision for adopting this means of collection. Also describe any consideration of using information technology to reduce burden.
The survey of user agencies will be Web-based. The survey will be pre-tested through cognitive testing via phone interviews with up to nine respondents prior to the 30-day OMB package. Following approval, the survey will be posted at a link that is currently being developed. The Web survey reduces the burden on respondents because they can fill out the survey at their convenience, and can start and restart the survey as needed before submitting it.
The site visits will be conducted in-person, and will not rely on information technology for data collection. The data collected will be written notes and interview recordings.
Once OMB approval has been obtained, the information collection documents will comply with Section 508 of the Rehabilitation Act of 1973.
4. Describe efforts to identify duplication. Show specifically why any similar information already available cannot be used or modified for use for the purposes described in Item 2 above.
There is no other similar information currently available that can be used to evaluate the SAVE program. This is the first evaluation conducted of the SAVE program. This data collection is critical in that the evaluation will identify program challenges, detail the program’s aspects that are working well, identify areas for improvement, and provide recommendations of enhancements to the SAVE program.
5. If the collection of information impacts small businesses or other small entities (Item 5 of OMB Form 83-I), describe any methods used to minimize burden.
The user agencies do not qualify as small businesses. Therefore, the design of the evaluation study is such that it will not have a significant impact on small businesses.
6. Describe the consequence to Federal program or policy activities if the collection is not conducted or is conducted less frequently, as well as any technical or legal obstacles to reducing burden.
This is a one-time research study. The information will be collected from the user agencies only once during the study period. If this information is not collected, USCIS will not be able to respond to the OMB Budget Passback guidance to DHS: “Guidance requests that the FY 2015 program funding level for SAVE be set at a level that supports an independent evaluation of the SAVE program. This independent evaluation… should analyze how the program is implemented by user agencies and the experience of individuals attempting to correct their records in response to SAVE issued information.”
7. Explain any special circumstances that would cause an information collection to be conducted in a manner:
• Requiring respondents to report information to the agency more often than quarterly;
• Requiring respondents to prepare a written response to a collection of information in fewer than 30 days after receipt of it;
• Requiring respondents to submit more than an original and two copies of any document;
• Requiring respondents to retain records, other than health, medical, government contract, grant-in-aid, or tax records for more than three years;
• In connection with a statistical survey, that is not designed to produce valid and reliable results that can be generalized to the universe of study;
• Requiring the use of a statistical data classification that has not been reviewed and approved by OMB;
• That includes a pledge of confidentiality that is not supported by authority established in statute or regulation, that is not supported by disclosure and data security policies that are consistent with the pledge, or which unnecessarily impedes sharing of data with other agencies for compatible confidential use; or
• Requiring respondents to submit proprietary trade secret, or other confidential information unless the agency can demonstrate that it has instituted procedures to protect the information's confidentiality to the extent permitted by law.
The special circumstances listed in Item 7 of the supporting statement (i.e., reporting more than quarterly; responding in fewer than 30 days; retaining records for more than 3 years; statistical surveys not designed to produce reliable results; statistical data classifications not approved by OMB; pledges of confidentiality not supported by statute or regulation; and requiring respondents to submit proprietary trade secrets) are not applicable to this information collection.
8. If applicable, provide a copy and identify the data and page number of publication in the Federal Register of the agency’s notice, required by 5 CFR 1320.8(d), soliciting comments on the information collection prior to submission to OMB. Summarize public comments received in response to that notice and describe actions taken by the agency in response to these comments. Specifically address comments received on cost and hour burden.
Describe efforts to consult with persons outside the agency to obtain their views on the availability of data, frequency of collection, the clarity of instructions and recordkeeping, disclosure, or reporting format (if any), and on the data elements to be recorded, disclosed, or reported.
Consultation with representatives of those from whom information is to be obtained or those who must compile records should occur at least once every 3 years - even if the collection of information activity is the same as in prior periods. There may be circumstances that may preclude consultation in a specific situation. These circumstances should be explained.
On February 23, 2017, USCIS published a 60-day notice in the Federal Register at 82 FR 11476. USCIS received two comments after publishing that notice. On May 15, 2017, USCIS published a 30-day notice in the Federal Register at 82 FR 22344. USCIS has not received comments to date.
Below is a summary of the comments and USCIS response:
USCIS thanks the Commenters for their comments on this OMB/PRA Information Collection Clearance for the USCIS Evaluation of the USCIS SAVE program.
The First Commenter stated that the SAVE Program should not be expanded by the President and included other issues in his/her comments.
USCIS responds that the SAVE Program was established after Congress passed the Immigration Reform and Control Act (IRCA) of 1986 and has subsequently expanded into a nationwide program that conducts immigration status verifications. The other issues raised by the First Commenter were not germane to the evaluation.
The Second Commenter stated: “I strongly believe in this collection of any and all data…” and “We must know ask and not apologize for knowing all information about immigrants in our country.”
USCIS responds that the purpose of this evaluation is to collect information on the SAVE program, which is a voluntary program that assists participating federal, state, and local government agencies with verifying the immigration status of benefit applicants. The mission of the SAVE program is to provide timely, customer-centric immigration status information to authorized agencies to help them maintain the integrity of their programs. Other issues raised by the Second Commenter were not germane to the evaluation process.
9. Explain any decision to provide any payment or gift to respondents, other than remuneration of contractors or grantees.
No incentives or payments will be made to respondents.
10. Describe any assurance of confidentiality provided to respondents and the basis for the assurance in statute, regulation or agency policy.
In order to recruit for the data collection, the evaluation contractor needs a list of all active SAVE user agencies. USCIS will provide the user agency point of contact (POC) information that includes names, mailing addresses, email addresses, and telephone contact numbers. The POC information will be used internally only by the evaluation contractor for recruitment purposes. No PII is collected for the Web survey or site visits. The evaluation contractor will aggregate the feedback from all sites into a summary report to USCIS SAVE. To protect the privacy of individuals and establishments, every effort will be made to assure anonymity; no names of specific agencies or individuals will be associated with viewpoints or quotes in the report.
The following disclosure statement, signed by the Branch Chief of the Research and Evaluation Division, will be included in a USCIS letter (Attachment B-1 and B-2) that the study contractor will send as an attachment to a recruitment email for the web survey (Attachment A-2), as well as an attachment to the recruitment email for the site visits (Attachment B-3).
The U.S. Citizenship and Immigration Services (USCIS) is pleased that you are participating in the Systematic Alien Verification for Entitlement (SAVE) Program. It is important to USCIS that we periodically assess the effectiveness of SAVE. We are thus sponsoring an evaluation of the SAVE program designed to help us better understand how SAVE fits with your business processes and learn more about what is working well and not so well for you. Finally, the evaluation will also provide the opportunity for you to offer recommendations on how to improve SAVE.
This evaluation is being carried out for USCIS by IMPAQ International LLC (IMPAQ), an independent social science consulting firm. Activities will include making 40 site visits across the country to a diverse set of SAVE user sites as well as conducting an online web survey of all SAVE users. {Name of organization/agency} has been selected to take part in both the site visit and survey components of the evaluation. Your participation will make an important contribution in helping to shape the future direction of SAVE.
I would very much appreciate your full cooperation with IMPAQ staff who are contacting you by email to request your participation in this important evaluation and to begin to make arrangements for your agency’s site visit.
Your participation in the evaluation is voluntary. As a SAVE User Agency member, your information is vital to the success of the evaluation, and your agency’s participation is strongly encouraged. Any data and/or information shared during the evaluation will be handled anonymously. Only the research team conducting the evaluation will have access to the data and information your agency shares. All data and information will be pooled with the data and information collected from other SAVE User Agencies, and will be published in aggregate form only.
The system of records notice for this information collection is Department of Homeland Security U.S. Citizenship and Immigration Services--004 Systematic Alien Verification for Entitlements (SAVE) Program System of Records. It was published in the Federal Register on August 8, 2012. The SAVE Program privacy impact assessment (PIA) was approved June 5, 2008, and updated August 26, 2011.
11. Provide additional justification for any questions of a sensitive nature, such as sexual behavior and attitudes, religious beliefs, and other matters that are commonly considered private. This justification should include the reasons why the agency considers the questions necessary, the specific uses to be made of the information, the explanation to be given to persons from whom the information is requested, and any steps to be taken to obtain their consent.
There are no sensitive questions.
As stated above, in order to recruit for the data collection, the evaluation contractor needs a list of all active SAVE user agencies. USCIS will provide the user agency point of contact (POC) information that includes names, mailing addresses, email addresses, and telephone contact numbers. The POC information will be used internally only by the evaluation contractor for recruitment purposes. No PII is collected for the Web survey or site visits. The evaluation contractor will aggregate the feedback from all sites into a summary report to USCIS SAVE. To protect the privacy of individuals and establishments, every effort will be made to assure anonymity; no names of specific agencies or individuals will be associated with viewpoints or quotes in the report.
12. Provide estimates of the hour burden of the collection of information. The statement should:
• Indicate the number of respondents, frequency of response, annual hour burden, and an explanation of how the burden was estimated. Unless directed to do so, agencies should not conduct special surveys to obtain information on which to base hour burden estimates. Consultation with a sample (fewer than 10) of potential respondents is desirable. If the hour burden on respondents is expected to vary widely because of differences in activity, size, or complexity, show the range of estimated hour burden, and explain the reasons for the variance. Generally, estimates should not include burden hours for customary and usual business practices.
• If this request for approval covers more than one form, provide separate hour burden estimates for each form and aggregate the hour burdens in Item 13 of OMB Form 83-I.
• Provide estimates of annualized cost to respondents for the hour burdens for collections of information, identifying and using appropriate wage rate categories. The cost of contracting out or paying outside parties for information collection activities should not be included here. Instead, this cost should be included in Item 14.
Data Collection Activity | Type of Respondent | Estimated Number of Respondents | No. of Responses per Respondent | Avg. Burden per Response* (in hours) | Total Annual Burden (in hours) | Hourly Wage Rate** | Annual Respondent Cost
Web Survey | User Agency Staff | 550 | 1 | .33 | 182 | $43.54 | $7,924
Site Visit Field Observation and Questionnaire | User Agency Staff | 240 | 1 | 2.25 | 540 | $43.54 | $23,512
Total | | 550 | | | 722 | | $31,436
*There will be two data collection efforts targeting staff at the 550 User Agencies.
First, there will be a Web Survey. One staff person at each of the 550 User Agencies will be requested to complete the Web Survey. The estimated number of respondents for the Web Survey is 550. The average burden hour per response is .33.
Second, there will be a Site Visit Field Observation and Questionnaire. There will be 240 respondents, in total, to the Site Visit Field Observation and Questionnaire. (40 of the 550 User Agencies have been selected for participation in the Site Visit Field Observation and Questionnaire.) It is estimated that 6 User Agency Staff at each of these 40 User Agencies will participate; thus, there will be 240 respondents in total. The average burden per response is 2.25 hours.
**The wage rate for User Agency staff was estimated at $43.54 per hour (http://www.bls.gov/ooh/management/social-and-community-service-managers.htm) with a base rate of $31.10 times the benefit multiplier of 1.4 = $43.54. These estimates are based on the average full-time hourly earnings of managers in social service programs for government agencies.
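The burden arithmetic above can be reproduced directly from the figures in the table. The following short Python sketch is illustrative only (all numbers are taken from the table and footnotes; rounding to whole hours and dollars follows the table):

```python
# Illustrative reproduction of the Item 12 burden estimates.
# All figures are taken from the table and footnotes above.

WAGE_BASE = 31.10          # BLS hourly base wage for social/community service managers
BENEFIT_MULTIPLIER = 1.4   # loaded-wage multiplier used in the supporting statement
wage_rate = round(WAGE_BASE * BENEFIT_MULTIPLIER, 2)   # $43.54

# (respondents, responses per respondent, hours per response)
web_survey = (550, 1, 0.33)
site_visit = (240, 1, 2.25)

def burden_hours(respondents, responses, hours):
    """Total annual burden hours, rounded to the nearest whole hour."""
    return round(respondents * responses * hours)

web_hours = burden_hours(*web_survey)    # 550 * 0.33 = 181.5 -> 182
site_hours = burden_hours(*site_visit)   # 240 * 2.25 = 540
total_hours = web_hours + site_hours     # 722

web_cost = round(web_hours * wage_rate)    # 182 * $43.54 -> $7,924
site_cost = round(site_hours * wage_rate)  # 540 * $43.54 -> $23,512
total_cost = web_cost + site_cost          # $31,436
```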
13. Provide an estimate of the total annual cost burden to respondents or record keepers resulting from the collection of information. (Do not include the cost of any hour burden shown in Items 12 and 14).
• The cost estimate should be split into two components: (a) a total capital and start-up cost component (annualized over its expected useful life); and (b) a total operation and maintenance and purchase of services component. The estimates should take into account costs associated with generating, maintaining, and disclosing or providing the information. Include descriptions of methods used to estimate major cost factors including system and technology acquisition, expected useful life of capital equipment, the discount rate(s), and the time period over which costs will be incurred. Capital and start-up costs include, among other items, preparations for collecting information such as purchasing computers and software; monitoring, sampling, drilling and testing equipment; and record storage facilities.
• If cost estimates are expected to vary widely, agencies should present ranges of cost burdens and explain the reasons for the variance. The cost of purchasing or contracting out information collection services should be a part of this cost burden estimate. In developing cost burden estimates, agencies may consult with a sample of respondents (fewer than 10), utilize the 60-day pre-OMB submission public comment process and use existing economic or regulatory impact analysis associated with the rulemaking containing the information collection, as appropriate.
• Generally, estimates should not include purchases of equipment or services, or portions thereof, made: (1) prior to October 1, 1995; (2) to achieve regulatory compliance with requirements not associated with the information collection; (3) for reasons other than to provide information or keep records for the government; or, (4) as part of customary and usual business or private practices.
There are no capital or start-up costs associated with these collections, and there are no extra costs. Respondents will answer questions based on their existing experience and knowledge; no additional work or preparation will be required, and no additional support or assistance (such as from legal units) will be needed. There is no fee associated with collecting this information.
14. Provide estimates of annualized cost to the Federal government. Also, provide a description of the method used to estimate cost, which should include quantification of hours, operational expenses (such as equipment, overhead, printing, and support staff), and any other expense that would not have been incurred without this collection of information. Agencies also may aggregate cost estimates from Items 12, 13, and 14 in a single table.
The cost of $490,428.23 was determined by IMPAQ International, which holds the contract for this evaluation of the SAVE program. The costs include evaluation design and methodology development, background research and literature review, data collection, analysis, and reporting.
15. Explain the reasons for any program changes or adjustments reported in Items 13 or 14 of the OMB Form 83-I.
This is a new information collection.
16. For collections of information whose results will be published, outline plans for tabulation, and publication. Address any complex analytical techniques that will be used. Provide the time schedule for the entire project, including beginning and ending dates of the collection of information, completion of report, publication dates, and other actions.
Analyses of Data from the SAVE Web Survey, Site Visits and USCIS Case Records
The time schedule for the data collection, tabulation, analysis, and preparation of the report is shown below.
Project Time Schedule
Data Collection, Analysis, and Reporting Activities | Date to Start | Date to Complete
Survey Active SAVE Users | |
Field Survey | 10/02/2017 | 11/20/2017
Submit Interim Report on Active SAVE Users | 11/27/2017 | 03/26/2018
Submit Clean Data Set in SAS-Readable Format | 11/27/2017 | 03/26/2018
Conduct Interviews and Field Observations at Selected Active SAVE User Agencies | |
Site Visits and Interviews | 10/02/2017 | 03/05/2018
Submit Clean Interview Transcripts | 10/02/2017 | 04/16/2018
Submit Clean Site Visit Notes | 10/02/2017 | 04/16/2018
Submit User Agency Documentation | 10/02/2017 | 04/16/2018
Submit Report on Site Visits | 04/02/2018 | 07/02/2018
Conduct Research on Benefit Applicant Interactions with USCIS due to SAVE Requests | |
Summary Spreadsheet of Applicants’ Interactions with USCIS Regarding SAVE IAVs | 10/03/2016 | 01/02/2017
Recordings of Interviews with USCIS Staff | 10/03/2016 | 01/02/2017
Interim Report on Benefit Applicants’ Interactions with USCIS Regarding SAVE IAVs | 01/16/2017 | 05/04/2017
Produce a Comprehensive Final Report | |
Submit Final Report that is Section 508 Compliant | 07/16/2018 | 10/29/2018
Submit PowerPoint Slide Deck of Final Report | 10/15/2018 | 11/19/2018
Conduct In-Person Briefing of the Final Report | 11/19/2018 | 11/19/2018
Submit 30 Bound, Color Copies of the Final Report | 11/29/2018 | 11/29/2018
As stated in section A.2, the evaluation will focus on answering six primary research questions:
What are the key business processes that agencies have set up to use SAVE?
How do agencies train staff to use SAVE?
How do SAVE users perceive SAVE’s value to their agencies?
How do Web services interface with SAVE’s processes?
How do user agency staff members interact with benefit applicants in the verification process?
When benefit applicants directly or indirectly interact with USCIS representatives in order to correct their records, what is the case resolution process?
The evaluation contractor will use a multi-method data analysis approach that incorporates qualitative and quantitative methods to answer the six key research questions. The contractor will conduct a quantitative analysis of survey data collected via the SAVE user agency survey to provide high-level information about how user agencies have implemented SAVE in their business processes. Qualitative analysis of agency interviews, site-observation data, and materials obtained during site visits will provide richer detail on how the 40 user agencies have implemented SAVE within their business processes, the associated challenges they have faced, and opportunities for the program’s improvement. Finally, qualitative analysis of data collected from interviews with USCIS staff, along with USCIS case records, will provide insight into the experiences of benefit applicants who try to correct their immigration records after a SAVE “Initiate Additional Verification (IAV)” request.
Web survey data from the 550 active SAVE user agencies will be analyzed quantitatively. The analysis will use univariate, bivariate, and multivariate statistical tests to answer the associated research questions on the implementation and experience of benefit applicants. Descriptive analysis methods, including cross-tabulations and comparison of means tests (e.g., one-sample tests and two independent samples tests), along with visual depictions (such as histograms), will be used to summarize the data obtained through the SAVE user agency survey. In addition, multivariate analysis methods, such as analysis of variance, logistic regressions, and factor analysis, will be used to examine explanatory relationships within the data, while controlling for other relevant factors. Analysis of Variance (ANOVA) tests will be used to test for differences in the means of outcome variables (e.g., an agency’s number of hanging IAVs) across levels of a predictor variable (e.g., program types). Logistic regressions will be used to examine the relationships between a dichotomous outcome variable (e.g., whether a case resulted in an unresolved IAV) and a collection of predictor variables (e.g., the number of super users within an agency, or the agency’s method for accessing SAVE). Finally, factor analysis methods will be used to identify the factors that underlie the key variables, along with the patterns of interrelationships among key variables.
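To illustrate one of the quantitative methods named above, the following sketch computes a one-way ANOVA F statistic from scratch on synthetic data. All variable names and figures are hypothetical; the actual analysis would run on the survey data in statistical software (e.g., a SAS or Python statistics package), not on hand-rolled code like this:

```python
# Minimal sketch of the one-way ANOVA test described above, on synthetic data.

def one_way_anova_f(groups):
    """Return the F statistic for a one-way ANOVA across the given groups.

    Each group is a list of outcome values (e.g., an agency's number of
    unresolved IAVs); the groups correspond to levels of a predictor
    variable (e.g., program types)."""
    k = len(groups)                          # number of groups
    n = sum(len(g) for g in groups)          # total observations
    grand_mean = sum(sum(g) for g in groups) / n

    # Between-group sum of squares (variation of group means around the grand mean)
    ss_between = sum(len(g) * (sum(g) / len(g) - grand_mean) ** 2 for g in groups)
    # Within-group sum of squares (variation of observations around their group mean)
    ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups)

    ms_between = ss_between / (k - 1)        # mean square between groups
    ms_within = ss_within / (n - k)          # mean square within groups
    return ms_between / ms_within

# Hypothetical unresolved-IAV counts for agencies of three program types
dmv      = [4, 6, 5, 7]
medicaid = [10, 12, 11, 9]
other    = [5, 4, 6, 5]

f_stat = one_way_anova_f([dmv, medicaid, other])  # large F suggests group means differ
```

In practice the F statistic would be compared against the F distribution with (k − 1, n − k) degrees of freedom to obtain a p-value.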
Qualitative methods will be used for the analysis of the interview, observation, and documentation data collected during each of the 40 site visits. Following the completion of each site visit, the evaluation contractor will assemble its observation notes, verbatim interview transcripts, and other supporting documents. Transcript data will be uploaded into the NVivo 10 qualitative data analysis software. Thereafter, the team will use the SAVE logic model and the first five research questions as a guiding framework to help discern systematic themes, patterns, and clusters of ideas related to business processes, agency trainings, agency perceptions of SAVE’s value, the level of satisfaction among user agencies, the process by which Web services interface with SAVE, and how agency staff members interact with benefit applicants.

These major themes, as identified, will be used to create a draft coding structure, which will be applied to a subset of the transcript data. To create the draft coding structure, staff experienced in qualitative research will engage in an iterative process of reading and re-reading selected interview transcripts to identify recurrent themes and clusters of ideas. These staff will test the structure by independently coding the same set of transcripts, and will then discuss and reconcile any differences before providing the coding structure and associated codebook (with definitions) to other coders to code the same transcripts. Once any additional discrepancies are reconciled through further discussion and clarification, the revised coding structure will be used to code the remaining transcripts. Given the iterative nature of qualitative analysis, flexibility will be built into the process so that new codes and sub-codes can be added, or existing ones consolidated, as new themes and patterns emerge in the data.
Coders will communicate with one another regularly to flag any issues that arise, and the group of coders will then meet with the lead analyst to determine any needed changes to the coding scheme. The lead coder will periodically check inter-rater reliability and identify any need for further adjustments to the coding structure. Once the data are coded, NVivo offers a number of tools for systematically exploring relationships in the data pertinent to answering the study questions, including the ability to examine how patterns vary by key attributes or agency characteristics (such as size, frequency of SAVE use, and type of agency). The final report will present and explain the results of these analyses.
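Inter-rater reliability checks of the kind described are commonly quantified with a chance-corrected agreement statistic such as Cohen's kappa. A minimal sketch on invented codes follows; NVivo provides its own coding-comparison query, so this only illustrates the underlying calculation, not the tool the team would necessarily use.

```python
# Conceptual illustration of Cohen's kappa on invented codes from two coders.
from collections import Counter

def cohens_kappa(codes_a, codes_b):
    """Chance-corrected agreement between two coders' labels."""
    assert len(codes_a) == len(codes_b)
    n = len(codes_a)
    # Observed proportion of items on which the coders agree.
    observed = sum(a == b for a, b in zip(codes_a, codes_b)) / n
    # Agreement expected by chance, from each coder's label frequencies.
    freq_a, freq_b = Counter(codes_a), Counter(codes_b)
    expected = sum(freq_a[lbl] * freq_b[lbl]
                   for lbl in set(codes_a) | set(codes_b)) / (n * n)
    return (observed - expected) / (1 - expected)

# Hypothetical codes applied to the same six transcript segments.
coder1 = ["training", "value", "process", "value", "training", "process"]
coder2 = ["training", "value", "process", "process", "training", "process"]
print(cohens_kappa(coder1, coder2))  # → 0.75
```

Values near 1 indicate strong agreement; a low kappa would trigger the discussion-and-reconciliation step described above before coding proceeds.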
The evaluation contractor will collect case-record data and conduct interviews with USCIS staff to better understand benefit applicants’ experience as they interact with the SAVE program to correct their immigration records. The qualitative narrative case-record data collected from agency documents will be compiled into a database of case information and imported into NVivo. Using NVivo, the evaluation contractor will engage in an iterative process of identifying themes and clustering ideas, reading through the collected documents to code cases by the underlying issue, the pathways used to correct applicants’ records, the outcome, and other pertinent case information. This iterative process will be used to compare codes and calculate the level of agreement across coders. If there is substantial disagreement over a code, the coders will compare and discuss their codes and reach agreement on the proper coding. The findings from this analysis will highlight the factors most associated with the successful resolution of an IAV request.
The evaluation contractor will concurrently conduct an analysis of interviews with USCIS staff. Using the same NVivo coding process as in the analysis of the narrative case-record data, the evaluation contractor will conduct a rigorous, systematic analysis of the interview transcripts. That analysis will identify themes and cluster ideas related to how agency staff members interact with benefit applicants and the case-resolution processes used by USCIS representatives. The evaluation contractor will develop a brief summary of its analyses that provides an overview of pertinent themes and ideas, along with a collection of brief case studies detailing the processes associated with correcting benefit applicants’ records.
17. If seeking approval to not display the expiration date for OMB approval of the information collection, explain the reasons that display would be inappropriate.
USCIS will display the expiration date for OMB approval of this information collection.
18. Explain each exception to the certification statement identified in Item 19, “Certification for Paperwork Reduction Act Submission,” of OMB 83-I.
USCIS does not request an exception to the certification of this information collection.
B. Collections of Information Employing Statistical Methods.
See Supporting Statement B.
1 United States Citizenship and Immigration Services: “Independent Evaluation of the Systematic Alien Verification for Entitlements Program, Request for Proposal, Scope of Work.”
2 SAVE Governing Laws: https://www.uscis.gov/save/about-save-program/save-governing-laws
3 Georgia Security and Immigration Compliance Act: E-Verify and SAVE Program Overview: http://www.accg.org/library/GSICA%20E%20Verify%20and%20SAVE%20Overview%20071008.pdf
4 SAVE Overview and Demo slides presented at the kickoff on Feb. 18, 2016.
5 These figures pertain to the USCIS-provided list of SAVE user agencies that had one or more verifications during Calendar Year 2015 (CY2015).
6 SAVE Governing Laws: https://www.uscis.gov/save/about-save-program/save-governing-laws
7 United States Citizenship and Immigration Services: “Independent Evaluation of the Systematic Alien Verification for Entitlements Program, Request for Proposal, Scope of Work.”