SUPPORTING STATEMENT A (Revised May 13, 2009)
(OMB File No. 53)
OMB No. 1615-NEW
E-Verify Non-User Survey and
Employee-Employer Survey in Arizona
A. Justification
The E-Verify Program is a free employment eligibility confirmation system operated jointly by U.S. Citizenship and Immigration Services (USCIS) and the Social Security Administration (SSA). The E-Verify Program allows participating employers to electronically confirm the employment eligibility of newly hired employees to help maintain a stable, legal workforce. Authorization for this program expires on September 30, 2009, and Congress will consider alternatives for its reauthorization this summer. One of the primary options for reauthorization is to make E-Verify a mandatory program requiring over 7 million U.S. employers to verify the employment authorization status of all new hires.
USCIS continually evaluates the E-Verify Program to meet the program goals of:
Reducing unauthorized employment,
Reducing verification-related discrimination,
Protecting employee privacy and confidentiality, and
Minimizing employer burden.
Congress has consistently relied on these evaluations as benchmarks for legislative action, and the USCIS Verification Division depends on the survey results to make necessary program improvements. As part of this effort, USCIS plans to conduct two new studies to provide important information for the deliberations on whether to reauthorize and expand the E-Verify Program.
The following provides a brief description of these two new surveys:
E-Verify Non-User Survey – This survey will identify barriers to participation in the E-Verify Program by surveying employers not participating in the program to learn: (1) why they have not chosen to participate, (2) what problems they foresee with participating, and (3) what changes would make it more attractive for them to participate. This survey is essential because past evaluations have found that employers who are required to participate in the E-Verify Program have a greater tendency to violate provisions designed to protect worker rights and to fail to prevent unauthorized employment.
Employee-Employer Survey in Arizona – This survey will identify strengths and weaknesses of the E-Verify Program in a mandatory setting from both the employer and employee perspectives. This will greatly assist in moving the E-Verify Program from a small percentage of employers to a national mandatory program should Congress take that step in the fall of 2009.
These surveys provide the most efficient means of collecting and processing the required data. USCIS will employ information technology in collecting and processing the information.
USCIS has a central review and approval process for all surveys, which prevents duplication. A review of the USCIS Forms Inventory Report revealed no duplication of effort, and no other similar information is currently available that can be used for these purposes.
The design of the survey will not have a significant impact on small businesses since it will take only a short time to complete. In addition, USCIS is offering an incentive to worker respondents to help offset the time required to complete the surveys. (See Item 9 below.)
Consequences of not collecting the Information
Without these surveys, decisions about the design of any proposed mandatory or widespread voluntary national employment eligibility verification program will be based on outdated information.
The special circumstances contained in item 7 of the supporting statement are not applicable to this information collection.
USCIS is requesting emergency review for this information collection. Any public comments will be reconciled and addressed in the justification package with the second submission.
Consultants knowledgeable about issues related to immigration, employment, discrimination, and privacy were also employed by the contractors to provide advice for the earlier evaluations. They are as follows:
Joseph Drew, Southeastern University, Washington, D.C.
Michael Leeds, Temple University
Alison Konrad, Temple University
Matt Huffman, University of California, Irvine
Janet Spitz, St. Rose College
Barry Chiswick, University of Illinois at Chicago
The literature on the effectiveness of response rates is extensive. We propose to offer workers $25 to increase the likelihood that they will complete the survey. (See Supporting Statement B for a justification of using incentives for workers.) Neither the employers who complete the web survey of nonusers nor the Arizona employers who participate in the interviews will receive a payment or gift.
Per the language in the contract, the Contractor owns the survey data:
“All identifiable hard copy and automated survey data collected and databases containing such information maintained by the Contractor for the sole purpose of organizing and analyzing files/records developed as part of the evaluation will be the property of the Contractor to ensure the confidentiality and anonymity of the respondents. Any information made available to the Contractor by the Government must be used only for the purpose of carrying out the provisions of this task order and must not be divulged or made known in any manner to any person except as may be necessary in the performance of the task order. The Contractor will be required to sign a non-disclosure statement.”
The following safeguards will be taken to ensure respondent confidentiality:
The study contractor will maintain the survey instruments and the microdata files and will not share data with DHS about individually identifiable organizations or individuals, as specified in the contract between DHS and the contractor.
All contractor personnel working on the data collection efforts will sign an Assurance of Confidentiality Statement.
No public use microdata files containing data from this study will be issued.
The instruments in this package include a number of questions about whether employers and employees are engaging in illegal behavior. These questions are necessary because they will provide important information about the effectiveness and costs of the E-Verify Program as well as the implications of the E-Verify Program for discrimination and privacy.
Type of form and type of respondent | Anticipated respondents | Number of responses | Average Burden per Response (in hours) | Total Burden in hours
Web survey of nonusers | 2,250 | 1 | .333 (20 min.) | 749
AZ interview with employers | 100 | 1 | 2.00 (120 min.) | 200
AZ interview with employees | 450 | 1 | 1.00 (60 min.) | 450
Total | 2,800 | | | 1,399
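For illustration only, the burden figures above follow from multiplying anticipated respondents, responses per respondent, and hours per response; a minimal sketch of that arithmetic, using only the figures in the table, is shown below.

```python
# Burden-hour check using the figures in the table above.
collections = [
    # (collection, respondents, responses per respondent, hours per response)
    ("Web survey of nonusers",      2250, 1, 0.333),  # 20 min.
    ("AZ interview with employers",  100, 1, 2.000),  # 120 min.
    ("AZ interview with employees",  450, 1, 1.000),  # 60 min.
]

total = 0
for name, respondents, responses, hours in collections:
    burden = round(respondents * responses * hours)
    total += burden
    print(f"{name}: {burden} hours")

print(f"Total annual burden: {total:,} hours")  # 749 + 200 + 450 = 1,399
```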
There are no capital or start-up costs associated with these collections. Any cost burdens to respondents as a result of this collection are identified in question 14. There is no fee associated with this collection of information.
Printing Cost $ 0
Contract Cost $ 3,800,000
Collecting and Processing $ 100,000
Total Cost to Program $ 3,900,000
Fee Charge $ 0
Total Annual Cost to Government $ 3,900,000
The annual cost to the Government is $3,900,000. USCIS is obligated to pay $3.8 million for contractual services. This includes labor costs and operational expenses such as designing the surveys; determining sample design and selection; recruiting participants; printing materials; programming the web survey and the Arizona employer and employee interviews; training field interviewers; conducting interviews with employees and employers; coding responses; paying for overhead, support staff, travel for case studies, and data processing; compiling secondary data; performing software tests; interviewing federal, state, and local (Arizona) officials; conducting analysis; and preparing reports. In addition, an estimated $100,000 a year is required for federal salaries and related expenses.
The cost to the public (respondents) associated with this information collection is detailed below.
Collection | Hourly wage | Burden hours | Total Cost | Incentive
Nonuser Survey | $48.00 | 749 | $35,952 | $0
AZ employer interview | $37.18 | 200 | $7,436 | $0
Annualized costs to the public for hour burden: E-Verify AZ worker interview
Collection | Hourly wage | Burden hours | Total Cost | Incentive | Number of Respondents | Offset Cost | Net Cost
AZ employee interview | $18.50 | 450 | $8,325 | $25 | 450 | $11,250 | $0¹
¹ The incentive of $25 × 450 respondents = $11,250 offsets the annualized cost of $8,325 for the worker data collection.
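For illustration only, the respondent costs and the incentive offset above follow from multiplying hourly wage by burden hours and incentive by number of respondents; a minimal sketch using only the figures in the two tables is shown below.

```python
# Respondent cost and incentive offset using the figures in the cost tables above.
rows = [
    # (collection, hourly wage, burden hours, incentive per respondent, respondents)
    ("Nonuser Survey",        48.00, 749,  0, 2250),
    ("AZ employer interview", 37.18, 200,  0,  100),
    ("AZ employee interview", 18.50, 450, 25,  450),
]

for name, wage, hours, incentive, n in rows:
    cost = wage * hours          # annualized cost to respondents
    offset = incentive * n       # total incentive payments
    net = max(cost - offset, 0)  # the $11,250 offset exceeds the $8,325 worker cost
    print(f"{name}: cost ${cost:,.0f}, offset ${offset:,.0f}, net ${net:,.0f}")
```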
Since this is a new information collection, there is an increase of 1,399 burden hours to the OMB inventory.
The evaluation of E-Verify will consist of two main components: (1) a web data collection from nonusers of E-Verify as of May 1, 2009, and (2) case studies of employers and a sample of their employees. The time schedule for the conduct of the data collection, tabulation, analysis, and preparation of reports on the E-Verify evaluation is shown below:
Activity | Date to start | Date to complete
Data Collection Activities | |
Collect data for web survey of nonusers | 6/2/09 | 7/17/09
Conduct nonresponse follow-up | 7/20/09 | 8/21/09
Close data collection for web survey of nonusers | 9/18/09 | 9/18/09
Recruit interviewers for Arizona case studies | 5/1/09 | 6/12/09
Revise & review training materials for field interviewers | 4/17/09 | 5/5/09
Recruit employers for case studies | 5/28/09 | 7/31/09
Train field interviewers to conduct case studies | 7/11/09 | 7/17/09
Conduct case studies in Arizona | 7/21/09 | 10/2/09
Report Writing (Web Nonuser Survey) | |
Clean and analyze preliminary data | 9/21/09 | 10/16/09
Weight web survey data | 10/6/09 | 10/20/09
Analyze weighted nonuser survey data | 10/21/09 | 11/25/09
Write first draft (web survey) for USCIS review | 11/30/09 | 12/18/09
Prepare third & final draft & edit web survey report | 2/15/10 | 3/9/10
Informal briefing for USCIS | 3/15/10 | 3/26/10
Report Writing (Case Studies) | |
Clean, organize, and enter qualitative data into software | 10/19/09 | 10/23/09
Analyze data | 10/26/09 | 12/11/09
Write first draft for USCIS review | 12/14/09 | 1/22/10
Prepare third & final draft & edit case studies | 3/18/10 | 4/1/10
Informal briefing for USCIS | 4/14/10 | 4/21/10
The key research topics addressed by the data collection efforts outlined above and the types of analyses required to address them are restated here for completeness:
Has E-Verify in Arizona been properly implemented? This requires descriptive and normative analyses (i.e., a description of the verification process and a comparison to the verification process intended by DHS). This question will be addressed through the case studies.
What are the financial costs and other burdens imposed by E-Verify or by alternatives to E-Verify? This requires both descriptive and causal analyses.
What features are important in employers’ decisions not to use E-Verify? This requires both descriptive and comparative analyses.
What are employers’ perceptions of the value of potential changes in E-Verify? This requires both descriptive and comparative analyses.
How does the program affect levels of discrimination in the workplace? This requires both descriptive and causal analyses.
How does the program affect the privacy and security of information on employees and employers? This requires both descriptive and causal analyses.
The analyses proposed to address these topics are described below.
The descriptive phase of the analysis will consist of descriptive statistics (e.g., percentages, means, medians, and standard deviations, as appropriate), cross-tabulations, and graphical summaries to describe the employee verification process, the characteristics and employment verification experiences of employers in the target population, and the results of the verifications from the DHS and SSA transaction databases. In addition, the descriptive analysis will provide a starting point for subsequent analyses. While these analyses will not establish causality, they will provide preliminary insight on the hypothesized relationships.
Analyses of major data elements of the program implementation will result in an overall picture of how employers that do not participate in E-Verify conduct their work authorizations, their perceptions of E-Verify, and their opinions concerning different features of E-Verify that are being implemented or may be implemented. For example, the survey will help to quantify the percentages of employers that do not use E-Verify because they lack adequate staff skills and fast Internet connections. As a rule, the data to be collected are categorical; however, means and medians may still be used based on scales that combine multiple responses (e.g., the number of tests used as part of the hiring process).
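A minimal sketch of how such descriptive statistics, percentages, and cross-tabulations could be produced is shown below; the file name and column names are hypothetical and shown only for illustration.

```python
# Sketch of the descriptive analyses; file and column names are hypothetical.
import pandas as pd

df = pd.read_csv("nonuser_survey_clean.csv")  # hypothetical cleaned survey file

# Descriptive statistics (mean, median, standard deviation) for a quantitative item
print(df["num_new_hires"].describe())

# Percentages for a categorical item
print(df["reason_not_using_everify"].value_counts(normalize=True) * 100)

# Cross-tabulation of two categorical items, shown as row percentages
print(pd.crosstab(df["firm_size"], df["lacks_fast_internet"], normalize="index") * 100)
```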
Some types of employers may have different employment practices and perceptions than other employers. For example, smaller businesses may do little hiring and have little expertise or resources to apply to checking on work authorization, and farms employing large numbers of temporary migrant workers may face logistical difficulties in using E-Verify. Comparisons of employers based on such differences will help to identify whether special accommodations would be beneficial for certain types of employers, and whether different types of media/communications should be targeted to particular categories of employers. Depending on the types of statistics being compared, tests of significance may be conducted using statistics such as chi-squared, t-tests, or logistic or multiple regression.
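A minimal sketch of two of the significance tests mentioned above (a chi-squared test of independence and a two-sample t-test) is shown below; the file name, variables, and groupings are hypothetical.

```python
# Sketch of the comparative tests noted above; variable names are hypothetical.
import pandas as pd
from scipy import stats

df = pd.read_csv("nonuser_survey_clean.csv")  # hypothetical cleaned survey file

# Chi-squared test: is a hiring practice independent of firm-size category?
table = pd.crosstab(df["firm_size"], df["uses_everify_alternative"])
chi2, p, dof, expected = stats.chi2_contingency(table)
print(f"chi-squared = {chi2:.2f}, p = {p:.4f}")

# t-test: do small and large firms differ in mean hours spent on work authorization?
small = df.loc[df["firm_size"] == "small", "authorization_hours"].dropna()
large = df.loc[df["firm_size"] == "large", "authorization_hours"].dropna()
t, p = stats.ttest_ind(small, large, equal_var=False)
print(f"t = {t:.2f}, p = {p:.4f}")
```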
Modeling consists of statistical analysis involving a dependent or outcome variable and two or more independent or explanatory variables. In modeling, statistical control for confounding factors may be achieved by incorporating into the models one or more concomitant variables, in addition to the explanatory variables of interest. Partitioning out the variability in the dependent variable accounted for by the concomitant variables allows a more accurate assessment of the influence of the independent variables of interest.
The general approach to developing multivariate models will involve a series of steps. Preliminary determination of which variables would be of most theoretical interest and practical relevance for modeling will be based on a review of the findings from descriptive and comparative analyses. In addition, pairwise relationships between the independent variables and the dependent variables will be investigated using, as appropriate, chi-square analysis for categorical variables and correlation analysis for ratio and interval-level variables. Each variable of interest in the databases will be reviewed to determine its quality in terms of missing data. As appropriate, we will create composites of several items from the surveys by developing composite scales or combining items into new categorical variables. Scales can be created as weighted or unweighted sums of item scores, or factor analysis can be used to cluster items and develop weights. Examples of items that are suitable for scaling are employers’ perceptions of the program and experiences with the verification process.
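A minimal sketch of building an unweighted composite scale and screening pairwise relationships among its items is shown below; the item names are hypothetical.

```python
# Sketch of constructing an unweighted composite scale; item names are hypothetical.
import pandas as pd

df = pd.read_csv("nonuser_survey_clean.csv")  # hypothetical cleaned survey file

items = ["perception_q1", "perception_q2", "perception_q3", "perception_q4"]

# Pairwise screening: correlations among the candidate scale items
print(df[items].corr())

# Unweighted sum of item scores; rows missing any item are left missing
df["perception_scale"] = df[items].sum(axis=1, skipna=False)
print(df["perception_scale"].describe())
```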
Statistical modeling techniques include logistic regression for categorical dependent variables and linear regression for quantitative dependent variables. For example, we expect to use linear regression to investigate the factors related to employers’ perceptions of E-Verify.
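A minimal sketch of the linear and logistic regression models described above is shown below; the outcome and explanatory variables are hypothetical, with concomitant variables entered as controls.

```python
# Sketch of the regression models described above; variable names are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("nonuser_survey_clean.csv")  # hypothetical cleaned survey file

# Linear regression: perception of E-Verify modeled on employer characteristics,
# with industry and firm size entered as concomitant (control) variables
ols_fit = smf.ols("perception_scale ~ num_new_hires + C(industry) + C(firm_size)", data=df).fit()
print(ols_fit.summary())

# Logistic regression for a binary outcome, e.g., whether the employer would enroll
logit_fit = smf.logit("would_enroll ~ perception_scale + C(firm_size)", data=df).fit()
print(logit_fit.summary())
```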
Information collected from the case studies is not designed to provide statistically valid results, but rather to provide a more in-depth understanding of how the E-Verify program affects employees in Arizona. This information will, therefore, be summarized and presented as illustrative of the types of situations that employers and employees might encounter during the verification process. This information is designed to supplement the information obtained in prior evaluations.
We also anticipate using content analysis to analyze responses to open-ended questions on the employer and employee interview protocols. Content analysis is a general term covering a variety of techniques for making inferences from different textual sources. Done correctly, content analysis produces a series of themes and patterns that can yield an in-depth understanding of complex patterns of interaction and behavior.
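A purely illustrative sketch of a keyword-based first pass at content analysis is shown below; actual coding would rely on analyst-developed themes and qualitative analysis software, and the themes and responses shown are hypothetical.

```python
# Illustrative keyword tally for open-ended responses; themes and text are hypothetical.
from collections import Counter

responses = [
    "We worried about the cost and the staff time required.",
    "Our internet connection is too slow for another online system.",
    "Concerned about the privacy of employee information.",
]

themes = {
    "cost/burden": ["cost", "time", "burden", "staff"],
    "technology":  ["internet", "computer", "online", "system"],
    "privacy":     ["privacy", "confidential", "personal"],
}

counts = Counter()
for text in responses:
    lowered = text.lower()
    for theme, keywords in themes.items():
        if any(word in lowered for word in keywords):
            counts[theme] += 1

print(counts)  # each theme appears once in this toy example
```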
USCIS will display the OMB Expiration date for this information collection.
USCIS does not request an exception to the certification of this information collection.
B. Collection of Information Employing Statistical Methods.
The agency should be prepared to justify its decision not to use statistical methods in any case where such methods might reduce burden or improve accuracy of results.
See Supporting Statement B
C. Certification and Signatures
PAPERWORK CERTIFICATIONS
In submitting this request for OMB approval, I certify that the requirements of the Privacy Act and OMB directives have been complied with including paperwork regulations, statistical standards or directives, and any other information policy directives promulgated under 5 CFR 1320.
______________________ ___________________
Stephen Tarragon Date
Deputy Chief,
Regulatory Products Division,
U.S. Citizenship and Immigration Services.