
SUPPORTING STATEMENT A

E-Verify Program Data Collections

A. Justification


  1. Circumstances Making the Collection of Information Necessary


The Department of Homeland Security (DHS) requests clearance from the Office of Management and Budget (OMB) to conduct the most recent in a series of evaluations of employment verification programs, referred to as the Study of Employment Eligibility (SEE). The original evaluations of pilot employment verification programs were mandated in Title IV of the Illegal Immigration Reform and Immigrant Responsibility Act of 1996 (IIRIRA), which required the then Immigration and Naturalization Service (INS) to establish three pilot employment verification programs. Subsequent evaluations of the current E-Verify Program have built upon these original evaluations, assessing changes to the program and continuing to determine the extent to which program goals are met.


There is currently interest on the part of Congress in expanding the current program and possibly instituting mandatory employment verification for all or a substantial percentage of the nation’s employers. Currently, approximately 12 states (Arizona, Colorado, Georgia, Idaho, Minnesota, Mississippi, Missouri, Nebraska, North Carolina, Oklahoma, Rhode Island, and Utah) mandate use of the program for some or all of their employers, and the Federal government mandates its use by most Federal contractors. Effective January 1, 2008, the Legal Arizona Workers Act mandated the use of E-Verify for all employers in the state.1 Additionally, the State of Mississippi began phasing in its E-Verify mandate based on employer size, beginning with larger employers: effective July 1, 2008, Mississippi employers with 250 or more employees were required to use E-Verify, and all employers in Mississippi must use E-Verify by July 1, 2011. Because of the constant flux in program participation requirements as well as in the nature of the program itself, it is important to continue to evaluate the E-Verify Program and identify the likely impact of a mandatory national program.


The instruments to be cleared request information from general employers for the E-Verify Program Survey of Users and from two specific types of employers—Designated Agents (DAs) and Users of Designated Agents (UDAs)—through telephone interviews. A DA is a third-party provider that acts on behalf of a client company to handle the E-Verify Program process (i.e., the process of verifying the employment eligibility of the client company’s newly hired employees). The number of active DAs has grown to an estimated 1,000 in the ten years that the E-Verify Program has provided access to DAs. It is anticipated that in a mandatory environment more employers will choose to use DAs, yet little is known about them: how they work with clients to implement the E-Verify process or what unique challenges DAs and UDAs face. The use of DAs is of special interest due in part to concern over the potential burden on small employers if use of E-Verify becomes mandatory; DAs are seen as a potential tool for meeting the needs of this sizeable group of employers.


The attached user survey (Attachment B) is similar in content to the instruments used in evaluating the IIRIRA pilot programs and in the prior (2006 and 2008) user surveys; however, it has been modified to address the specific requirements of the current evaluation. DAs and UDAs are not included in this general employer survey, to avoid undue burden on them, since they will be asked focused questions in the proposed telephone interviews. The attached interview protocols for the DAs (Attachment D) and UDAs (Attachment E) are new.


The goal of this E-Verify Program evaluation of employers is to obtain quantitative and qualitative information about how the Program is working nationally and among a specific group of employers, to determine whether employers are using the program as intended, and to evaluate positive and negative impacts of the program in a mandatory environment. The Survey of Users is designed to better understand how well the Program is working and how it might be improved: how satisfied employers are with various program features and resources, their reasons for using the program, how well they understand and comply with E-Verify Program requirements, what impact the program currently has on companies in voluntary as compared to mandatory environments, and companies’ opinions concerning a mandatory program. The survey includes a number of questions also contained in the 2008 survey in order to track changes in employer satisfaction and compliance over time. The expectation is that this information will help inform future legislation and policy making, improve E-Verify Program administration, and lead to overall E-Verify Program enhancements.


The purpose of the linked study of DAs and UDAs is to understand how these companies work together to implement the E-Verify Program process; the burdens and advantages of using DAs; how DAs advertise their services, how employers find DAs, and what criteria they use to hire them; the extent to which DAs and UDAs comply with E-Verify requirements; the challenges faced by DAs and their clients and how they address them; and their opinions about the desirability of a certification requirement for DAs. We are also interested in hearing any suggestions for improving program procedures, registration, and communication between DAs and UDAs and with USCIS. These data will be used to identify challenges and effective practices and to enhance the E-Verify Program for these employers, which jointly constitute an increasingly large part of the E-Verify employer population. Additionally, these baseline data can be used to design and conduct a nationally representative survey of these employers in the future.


Since the potential requirements of a national automated employment verification program for employers, employees, and federal agencies are substantial, DHS believes that a timely evaluation of E-Verify would be beneficial to ongoing immigration reform.


  2. Purpose and Use of the Information


The primary purpose of the data collection efforts submitted for OMB clearance is to obtain objective data from E-Verify users in anticipation of the enactment of mandatory state and/or national employment eligibility verification programs for all or a substantial number of employers. For example, on September 8, 2009, Federal contractors and subcontractors were required to begin using the E-Verify Program to verify their employees’ eligibility to work legally in the United States. In a final rule, the Civilian Agency Acquisition Council and the Defense Acquisition Regulations Council amended the Federal Acquisition Regulation (FAR) to reflect this change. The rule implements Executive Order 12989, as amended by President George W. Bush on June 6, 2008, and directs federal agencies to require that most federal contractors and their subcontractors agree to electronically verify the employment eligibility of all new employees hired during the contract term, as well as of their current employees who perform contract services for the federal government.2


This evaluation will examine the proper implementation of the E-Verify program and the advantages and disadvantages of such a program from the perspectives of different types of users as discussed above. To meet these goals the evaluation will:


  • Describe how well general employers and DAs and UDAs implement the program;

  • Identify how well E-Verify is doing in meeting the goals set by IIRIRA (i.e., reducing unauthorized employment, reducing or not increasing discrimination, protecting employees’ right to privacy, preventing undue burden on employers);

  • Describe how satisfied employers are with current E-Verify features and resources, and communication with USCIS in a mandated and voluntary environment;

  • Describe how well employers understand the program requirements and are complying with the program;

  • Identify the financial and nonfinancial implications of E-Verify;

  • Describe effective practices used and challenges experienced by DAs and UDAs; and

  • Describe the impacts of recent major changes in the program.


To address these issues, the proposed evaluation design requires original data collection from general and specific types of employers that are using the E-Verify Program. Information about the effectiveness and costs of E-Verify, discrimination, privacy, how employers learned about E-Verify, reasons for using the program, employer understanding of and compliance with E-Verify requirements in a mandated and voluntary environment, and opinions about various features of E-Verify will be obtained from companies. Information will also be obtained about how well E‑Verify works for Designated Agents and their users. Information collected using the surveys described in this package will be supplemented by additional evaluation activities including information from the DHS and Social Security Administration transaction databases on employment authorization queries, and informal, unstructured interviews with federal, state, and local officials.


The past evaluations of electronic employment verification programs have been used extensively by the Administration to improve the E-Verify program and by Congress in considering legislation designed to expand or modify the program. External researchers, think tanks, and members of the general public interested in immigration have also widely used information from the evaluations when discussing employment verification programs, immigration-related policies and related immigration issues. Similar uses are expected for the proposed data collection efforts.

  3. Use of Information Technology


The survey of users will be Web based; interviews with DAs and UDAs will be individually conducted as semi-structured telephone interviews, which will be audio-recorded. Trained Westat project staff will conduct the telephone interviews. Given the semi-structured nature of the telephone interviews of DAs and UDAs and the fairly small number of respondents, we do not believe that computer-assisted telephone interviewing (CATI) methodology would be cost-effective.


  4. Efforts to Identify Duplication and Use of Similar Information


There is no other similar information currently available that can be used to evaluate the wide-ranging features and use of the E-Verify program, particularly as it becomes mandated for increasing numbers of employers. The prior evaluations were designed to evaluate the voluntary E-Verify program and to explore mandatory participation through a case study in Arizona, the first state in the nation to mandate that all employers use E-Verify. This data collection is critical in that it examines the voluntary and mandatory impacts of the program on a broader group of employers. It also differs in that, for the first time, we are collecting qualitative data from DAs and UDAs, two groups of employers that we anticipate will grow rapidly in a mandatory environment and about whose knowledge, procedures, and working relationships related to E-Verify we have little data. Additionally, results of these evaluation activities will be compared with the most recent national data (i.e., the 2008 E-Verify Survey of Users) to monitor trends in compliance, satisfaction, and the impact of program improvements. In the study of DAs and UDAs, the data will also be used to understand compliance challenges, identify effective practices, and suggest program improvements. These data will also assist in conducting a nationally representative study of these employers and their practices in the future. Results from previous studies are not adequate to answer these important policy and program questions, especially in a mixed voluntary and mandatory environment.


  5. Impact on Small Businesses or Other Small Entities


The user survey and the DA/UDA interviews are designed so that they will not have a significant impact on small businesses. The user survey will take only about 30 minutes to complete; the interviews with DAs/UDAs should each take about 60 minutes. Although some general users and some UDAs are small businesses, the vast majority are medium to large businesses. An important goal of the study of DAs and UDAs is to learn more about how the use of DAs might assist small businesses in participating in the E-Verify Program in a mandatory environment.


  6. Consequences of Not Collecting the Information


E-Verify and the characteristics of its users are rapidly changing. The various features of the program have continually changed to incorporate enhancements recommended by previous evaluations and a series of ongoing general program improvements. Additionally, the types of employers that are mandated to use E-Verify are constantly changing based on legislative actions by states as well as Federal regulation. Moreover, as more employers participate in the program (typically making E-Verify users and the workers they hire look more like the national population of employers and workers), opinions about the program, how it is used, and the extent of compliance change. Therefore, regular evaluation on a biennial schedule is a prudent and reasonable approach for gauging progress and detecting new challenges to guide policy and further program improvements. Without the benefit of ongoing evaluation, policy, program, and legislative decisions would be made using out-of-date information, potentially producing suboptimal results.

  7. Special Circumstances That Would Cause the Information Collection to Be Conducted in a Special Manner


The special circumstances listed in item 7 of the supporting statement instructions (i.e., collections conducted more than quarterly; requiring a response in fewer than 30 days; requiring records to be retained for more than 3 years; statistical surveys not designed to produce reliable results; statistical data classifications not approved by OMB; a pledge of confidentiality not supported by statute or regulation; or requiring respondents to submit proprietary trade secrets) are not applicable to this information collection.

  8. Comments in Response to the Federal Register Notice and Efforts to Consult Outside Agencies

Public comments have not yet been solicited and addressed for this data collection effort. Any public comments will be reconciled and addressed in the justification package with the second submission.


Consultants knowledgeable about issues related to immigration, employment, discrimination, and privacy have been employed at various times by the contractors in order to provide advice for this and the earlier evaluations. They are as follows:


    • Joseph Drew, Southeastern University, Washington, D.C.

    • Michael Leeds, Temple University

    • Alison Konrad, Temple University

    • Matt Huffman, University of California, Irvine

    • Janet Spitz, St. Rose College

    • Barry Chiswick, University of Illinois at Chicago


More recently, information from stakeholders representing the federal government, states, and special interest groups was obtained through stakeholder meetings held on November 27, 2007, and March 9, 2009. The input from these meetings has helped shape the proposed evaluation. (See Attachments F and G for the 2009 and 2007 meeting summaries, respectively.)

The results of the last evaluation are available to the public at http://www.uscis.gov/files/article/WebBasicPilotRprtSept2007.pdf. In developing the evaluation design for the data collection efforts, the U.S. Citizenship and Immigration Services (USCIS) contractor has built into the design and data collection methodology the lessons learned in the data collections for the earlier evaluations.


  9. Explanation of Decision to Provide Payments or Gifts to Respondents

No incentives or payments will be made to respondents.


  10. Assurance of Confidentiality Provided to Respondents

Because some of the information to be collected in this study is sensitive, special care will be taken to protect the confidentiality of both the individuals and the firms participating in the study. At a minimum, the following safeguards will be taken to ensure respondent confidentiality:


  • The study contractor will maintain the survey instruments and the microdata files and will not share data about individually identifiable organizations and individuals with DHS, as specified in the contract between DHS and the contractor.

  • All contractor personnel working on the data collection efforts will sign an Assurance of Confidentiality Statement (see Attachment H).

  • No public use microdata files containing data from this study will be issued.


The following disclosure statement, signed by the Director of Research and Evaluation, will be attached to an email sent by the contractor to the E-Verify users selected for the survey (Attachment A):


The U.S. Citizenship and Immigration Services (USCIS) is pleased that you have volunteered to participate in the E-Verify Program, which is administered jointly by USCIS and the Social Security Administration.


An integral part of this program, as described in the Memorandum of Understanding that you signed when you registered to participate in E-Verify, is an evaluation to assess the effectiveness of the program. The goals of the evaluation are to understand whether the Program is working as intended and to determine whether the Program is protecting against discrimination, safeguarding privacy, and avoiding undue employer burden. Congress is interested in this information to help it determine whether E-Verify should be made mandatory for a larger group of employers and, if so, what modifications to the current Program need to be made. Your participation in this evaluation will, therefore, be an important factor in the future direction of employment verification in this country.


As part of this evaluation, we have authorized Westat, an independent social science research firm, to conduct a survey of 3,000 E-Verify participants. Your individual responses will not be shared with DHS. Westat will only provide us and others who are not part of the evaluation team with summary results. These summaries will not permit identification of individual respondents or corporate names or locations. We plan to publish the final report with the survey results on the Web; this will give you an opportunity to see how the information that you and others provide is being used to improve the E-Verify Program.


I would very much appreciate your full cooperation with Westat’s request that you participate in this important evaluation, entitled the Study of Employment Eligibility (SEE). On behalf of USCIS, I would also like to take this opportunity to thank you for your participation in the E-Verify employment verification program. If you have any concerns regarding the evaluation, please call Natasha McCann, Program Manager, at (202) 272-8122.


A similar letter to Designated Agents and Users of Designated Agents selected for telephone interviews will be attached to an email sent by the contractor (Attachment C):


The U.S. Citizenship and Immigration Services (USCIS) is pleased that you are using the E-Verify Program, which is administered jointly by USCIS and the Social Security Administration.


An integral part of this program, as described in the Memorandum of Understanding that you signed when you registered to participate in E-Verify, is an evaluation to assess the effectiveness of the program. To protect the confidentiality of participants, we do not know which employers or designated agents Westat has selected and are, therefore, not sending individual letters to these selected companies. However, it is our hope that all companies will cooperate with Westat.


The goals of the Westat evaluation are to understand why employers decide to use DAs and how they learn about them, what services are provided by DAs, how DAs and their clients communicate, and to ask for suggestions on how the DA process might be improved. Your participation in this evaluation will be an important factor in the future direction of employment verification in this country.

We have told Westat to treat all information gathered from employers and DAs as highly confidential, to the extent permitted by law. Westat will only provide us and other USCIS staff who are not part of the evaluation team with summary results. These summaries will not permit identification of either individual or corporate responses.


A staff member from Westat will be contacting you by telephone to ask you to participate in this important evaluation, entitled the Study of Employment Eligibility (SEE). I would very much appreciate your full cooperation with his or her request.


On behalf of USCIS, I would also like to take this opportunity to thank you for your participation in E-Verify. If you have any concerns regarding the evaluation, please call Natasha McCann, Program Manager, at (202) 272-8122.


The following OMB notice will be included on the first page of the Web survey of users:


Public reporting burden for this collection of information is estimated to average 30 minutes per response, including the time for reviewing instructions, searching existing data sources, gathering and maintaining the data needed, and completing and reviewing the collection of information. An agency may not conduct or sponsor, and a person is not required to respond to, a collection of information unless it displays a currently valid OMB number. Send comments regarding this burden estimate or any other aspect of this collection of information, including suggestions for reducing this burden, to: Mr. Sunday Aigbe, Chief, Regulatory Management Division, U.S. Department of Homeland Security, 111 Massachusetts Avenue NW., 3rd Floor, Washington, DC 20529. Do not return the completed form to this address.


For the study of DAs and their users, a similar version of the OMB notice will be included on a one-page document emailed to each employer prior to each telephone interview:


Public reporting burden for this collection of information is estimated to average 60 minutes per response, including the time for reviewing instructions, searching existing data sources, gathering and maintaining the data needed, and completing and reviewing the collection of information. An agency may not conduct or sponsor, and a person is not required to respond to, a collection of information unless it displays a currently valid OMB number. Send comments regarding this burden estimate or any other aspect of this collection of information, including suggestions for reducing this burden, to: Mr. Sunday Aigbe, Chief, Regulatory Management Division, U.S. Department of Homeland Security, 111 Massachusetts Avenue NW., 3rd Floor, Washington, DC 20529. Do not return the completed form to this address.


  11. Additional Justification for Sensitive Questions


The instruments in this package include a number of questions about whether employers are engaging in prohibited behavior. For example, in the telephone protocols and the Web survey of users, employers are asked whether they inform workers privately about tentative nonconfirmation findings and whether they limit work assignments or training, or withhold or reduce pay, until they are sure the employee is authorized to work. These sensitive questions are necessary because they will provide important information about the effectiveness and costs of the program as well as its implications for discrimination and privacy. Congress mandated the study of these issues for the earlier pilot programs and has remained interested in changes in these behaviors over time.


To protect the confidentiality of individuals and establishments, DHS will not obtain the micro-data from this study and will not issue any public use files from the evaluation. Quantitative information in reports will be based on aggregate information. Some specific quotations and synopses of open-ended questions in the interviews will be published to illustrate particular types of situations; however, the contractor will review this information carefully to ensure that individual identification of the respondent is not possible.

  12. Estimates of the Hour Burden of the Collection of Information


With respect to the burden imposed on respondents, Exhibit A-1 lists the three data collections that respondents will complete, the number of anticipated respondents, the number of administrations for each type of respondent, and the estimated time to complete each administration. Burden, in hours, is totaled for each form and for all forms together. The burden estimates are based on prior experience with similar E-Verify surveys and interviews. The survey contains many skip patterns, so the time needed to respond varies. Based on the results of our hardcopy pretest conducted after the original OMB submission, the companies taking the most time will need approximately 40 minutes to complete the survey; given the skip patterns and the Web design, we estimate that 30 minutes is roughly correct averaged across all respondents. The estimate of 60 minutes for the telephone interviews is based on past experience; in addition, since the interviews are typically scheduled in advance, the expectations and schedules of the respondents help to bound the length of the interviews.


Exhibit A-1. Estimates of respondent burden

Type of form and type of respondent                      Anticipated respondents   Administrations per respondent   Estimated time to complete   Burden in hours
Web survey of users                                      2,385                     1                                .50 (30 min.)                1,193
Telephone interviews with Designated Agents              20                        1                                1.00 (60 min.)               20
Telephone interviews with Users of Designated Agents     60                        1                                1.00 (60 min.)               60
Total                                                    2,465                                                                                   1,273

NOTE: The number of anticipated respondents to the Web survey is based on a sample size of 3,727, an eligibility rate of 80 percent, and a response rate of 80 percent. The estimated eligibility rate may be low; an eligibility rate of 85 percent would result in 2,534 respondents, a Web survey burden of 1,267 hours, and a total burden of 1,347 hours.


The estimates of annualized cost to the public (respondents) associated with the collection of information are calculated as the total burden hours (see Exhibit A-1 above) times the appropriate hourly wage, divided by the length of the study. The wage rate for employers nationally was estimated at $48 per hour (see http://www.bls.gov/oes/current/oes113049.htm); this estimate is based on the average full-time hourly earnings of managers in human resources departments in the private sector.
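
To make this arithmetic concrete, the following minimal sketch (Python, purely illustrative and not part of the official submission) reproduces the respondent, burden-hour, and cost figures reported in Exhibits A-1 and A-2; the variable names and rounding conventions are our own assumptions:

```python
# Illustrative sketch only: reproduces the respondent, burden-hour, and cost
# arithmetic behind Exhibits A-1 and A-2. Rounding conventions are assumed.
SAMPLE_SIZE = 3727        # Web survey sample (see note to Exhibit A-1)
ELIGIBILITY_RATE = 0.80
RESPONSE_RATE = 0.80
HOURLY_WAGE = 48.00       # BLS-based wage estimate for HR managers, $/hour

# Anticipated respondents = sample size x eligibility rate x response rate
web_respondents = SAMPLE_SIZE * ELIGIBILITY_RATE * RESPONSE_RATE  # 2,385.28; reported as 2,385

# Burden hours = respondents x administrations per respondent x hours per administration
web_burden = 2385 * 1 * 0.5    # 1,192.5; reported as 1,193 in Exhibit A-1
da_burden = 20 * 1 * 1.0       # 20 hours
uda_burden = 60 * 1 * 1.0      # 60 hours
total_burden = 1193 + 20 + 60  # 1,273 hours

# Annualized respondent cost = burden hours x hourly wage
# (Exhibit A-2 works from rounded burden figures of 1,200 and 80 hours.)
user_survey_cost = 1200 * HOURLY_WAGE    # $57,600
da_uda_cost = (20 + 60) * HOURLY_WAGE    # $3,840
print(total_burden, user_survey_cost + da_uda_cost)  # 1273 61440.0
```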


Exhibit A-2 shows the annualized costs to the public (respondents) for the hour-burden for data collection.


Exhibit A-2. Annualized costs to the public for hour-burden E-Verify data collections

Collection                        Hourly wage   Burden hours   Total
User Survey                       $48.00        1,200          $57,600
DA/UDA telephone interviews       $48.00        80             $3,840
Total                             $48.00        1,280          $61,440



  13. Estimate of Other Total Annual Cost Burden to Respondents to Support Recordkeeping Requirements


There are no capital or start-up costs associated with these collections. Any cost burdens to respondents as a result of this collection are identified in question A.12. There is no fee associated with collecting this information.



  14. Estimates of the Annualized Cost to the Federal Government


The option year 1 cost for contract HSSCCG-08-F-00606 with Westat, which also includes costs for analyzing the Transaction Database and conducting other special studies using existing data, is estimated at about $3.4 million for contractual services. This estimate includes labor costs and operational expenses for designing the studies; determining sample design and selection; recruiting participants; printing materials; programming the Web survey and management system; training interviewers; conducting interviews with employers; coding responses; overhead, support staff, and travel associated with pretesting the instruments and interviewing federal and state officials; conducting online focus groups to pretest protocols; data processing; compiling secondary data; performing software tests; conducting analyses; and preparing reports. In addition, an estimated $100,000 is required for federal salaries and related expenses, making the total annualized project cost approximately $3.5 million.


  15. Explanation for Changes in Burden Hours


There has been a decrease of 119 hours in the estimated burden hours previously reported for this information collection. The decrease is primarily attributable to the collection of information for the study of DAs and UDAs (80 hours), which replaces last year’s CAPI interviews with employers and workers in Arizona (286 hours).


  16. Plans for Tabulation and Publication


The evaluation of E-Verify will consist of two main components: (1) a Web data collection from employers that signed the MOU required for participation in E-Verify and (2) telephone interviews with DAs and UDAs. The time schedule for the conduct of the data collection, tabulation, analysis, and preparation of reports on the E-Verify evaluation is shown in Exhibit A-3.


Exhibit A-3. Project schedule for evaluation of E-Verify

Activity                                                            Date to start   Date to complete

Data Collection Activities
Collect data for Web survey of E-Verify employers                   6/14/10         9/14/10
Clean data for Web survey                                           9/14/10         9/21/10
Prepare training materials for case study telephone interviewers    6/21/10         7/12/10
Train telephone interviewers                                        7/13/10         7/14/10
Conduct case studies of DAs and UDAs                                7/15/10         9/11/10

Report Writing (Web Survey)
Weight Web survey data                                              9/21/10         9/28/10
Analyze Web survey data                                             9/28/10         10/20/10
Write first draft (Web survey) for USCIS review                     10/20/10        11/19/10
Complete final draft of Web survey report                           12/3/10         12/20/10
Informal briefing for USCIS                                         12/16/10        12/16/10

Report Writing (Case Studies)
Code data                                                           9/12/10         10/10/10
Analyze data                                                        10/11/10        12/23/10
Write first draft for USCIS review                                  12/23/10        1/22/11
Prepare final draft of case studies                                 2/9/11          2/23/11
Informal briefing for USCIS                                         2/25/11         2/25/11


Examples of the key research topics to be addressed in the Web survey report include:


  • Has E-Verify been properly implemented, and does employer compliance vary by industry or by whether employers operate in a mandatory or voluntary environment?

  • How satisfied are employers with the E-Verify Program? Has this changed since the 2008 Web survey? How does this differ between employers that are mandated to use E-Verify and those that use it voluntarily? How does it differ between employers that have recently received tentative nonconfirmations (TNCs) and those that have not?

  • How is the program associated with the levels of verification-related discrimination appearing in the workplace? Does this differ between employers using the program voluntarily and those mandated to use it?

  • How is program participation associated with the privacy and security of information on workers and employers?

  • What are the financial costs and other burdens associated with E-Verify use? Has this changed since 2008?

  • What factors are important in employers’ decisions to use E-Verify?

  • What are the reasons that some employers sign up to use E-Verify and then either do not use it or stop using it?


The key research topics to be addressed in the DA/UDA report, using descriptive statistics, normative analyses (e.g., a description of the tentative nonconfirmation process followed by employers compared with the process intended by USCIS), and comparative analyses, are:

  • What factors are important in employers’ decisions to hire a DA?

  • What are employers’ perceptions of the quality and frequency of communication with USCIS and between UDAs and DAs?

  • How satisfied are DAs and UDAs with the E-Verify program? How satisfied are UDAs with the quality of services provided by their DAs?

  • What are the financial costs and other burdens associated with E-Verify when using DAs?

  • Do special compliance problems arise with DAs and their users, e.g., because the UDAs may not have a clear understanding of their responsibilities for the program since they are less directly involved in its use?


Content analysis will be used to analyze the interview results, and NVivo will be used to assist in coding and organizing the responses. Additional information on the analytic techniques to be used is included below.


Web Survey Analyses

Many of the Web survey analyses will consist of descriptive statistics (e.g., percentages, means, medians, and standard deviations, as appropriate), cross-tabulations, and graphical summaries that describe the E-Verify verification process and the characteristics and employment verification experiences of employers in the target population. In addition, the descriptive analyses will provide a starting point for subsequent analyses. While these analyses will not establish causality, they will provide preliminary insight into the hypothesized relationships.


Analyses of major data elements of program implementation will produce an overall picture of how employers that participate in E-Verify conduct employment authorizations, their perceptions of E-Verify, and their opinions concerning different features of E-Verify that are being implemented or may be implemented. They will also help to quantify the percentage of employers that signed up for E-Verify but are not using it, whether because they had no cases to verify, found it burdensome, or for other reasons. As a rule, the data to be collected are categorical; however, means and medians may still be used for scales that combine multiple responses (e.g., the number of tests used as part of the hiring process).


Comparative analyses will be used to compare employer responses to the 2010 and 2008 surveys to determine changes between the two surveys. Additional analyses will examine the relationship of employer characteristics such as industry and size to outcome variables such as satisfaction, burden, and compliance. Tests of significance will be conducted using methods such as chi-squared tests, t-tests, and logistic or multiple regression. WesVar will be used in these analyses as appropriate to account for the complex sampling design used in this study.
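
To make the comparative step concrete, the sketch below runs an unweighted chi-squared test of the kind described above on a hypothetical cross-tabulation. The counts and the Python tooling are assumptions for illustration only; the production analysis will use WesVar with the survey’s replicate weights, which a simple unweighted test ignores.

```python
# Illustrative sketch only: an unweighted chi-squared test of independence
# comparing satisfaction between mandated and voluntary E-Verify users.
# All counts are hypothetical; the actual analysis accounts for the complex
# sample design via WesVar.
import numpy as np
from scipy.stats import chi2_contingency

# Rows: mandated vs. voluntary employers; columns: satisfied vs. not satisfied.
observed = np.array([
    [180, 45],   # mandated employers (hypothetical counts)
    [310, 40],   # voluntary employers (hypothetical counts)
])

chi2, p_value, dof, expected = chi2_contingency(observed)
print(f"chi-squared = {chi2:.2f}, df = {dof}, p = {p_value:.4f}")
```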


We anticipate using content analysis to analyze responses to the open-ended questions in the DA and UDA interview protocols.


Case Study of DAs and UDAs

Information collected from the case studies is not designed to provide statistically valid results but rather to provide a more in-depth understanding of the experiences of DAs and UDAs. We expect to obtain a better understanding of how well DAs are serving the needs of different types of employers. The case studies are also designed to illuminate the division of labor between DAs and UDAs in implementing the various E-Verify responsibilities and to determine whether this division of responsibilities has implications for worker rights, since split responsibility may result in some steps “falling through the cracks.” The interviews will be summarized and presented as illustrative of the types of situations that DAs and their clients encounter. UDA interviews will be summarized along with the interviews of the DAs they use. Content analysis will be used to assist in making inferences from the different textual sources. Done correctly, content analysis produces a series of themes and patterns that can yield an in-depth understanding of complex patterns of interaction and behavior.


When the questions asked of DAs and their users are quantitative in nature, counts of employer responses will be presented, to emphasize the exploratory nature of this study. It is anticipated that NVivo will be used to assist in organizing the responses and facilitating analyses of the case study employers.
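
As a simple illustration of how such counts might be produced, the sketch below tallies hypothetical theme codes applied to interview transcripts. The codes, interview identifiers, and tooling are our own assumptions; in practice, coded segments exported from NVivo would be tabulated in a similar way.

```python
# Illustrative sketch only: tallying coded responses from DA/UDA interviews
# to produce the simple counts described above. Theme codes and interview
# identifiers are hypothetical.
from collections import Counter

# Each interview is represented by the list of theme codes applied to it.
coded_interviews = {
    "DA-01":  ["communication_gap", "fee_structure"],
    "UDA-01": ["communication_gap", "unclear_tnc_role"],
    "UDA-02": ["unclear_tnc_role"],
}

theme_counts = Counter(code for codes in coded_interviews.values() for code in codes)
for theme, count in theme_counts.most_common():
    print(f"{theme}: mentioned in {count} interview(s)")
```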



  17. Plans to Display Expiration Date for OMB Approval

All surveys conducted under this clearance process will display the OMB clearance number. The Web survey will include the OMB expiration date on the first page. The case studies will include this information in a document provided to respondents.


  18. Explanation of Any Exceptions to the Certification Statement

DHS does not request an exception to the certification of this information collection.



B. Collection of Information Employing Statistical Methods


See Supplemental Supporting Statement B.


C. Certification and Signatures



PAPERWORK CERTIFICATIONS



In submitting this request for OMB approval, I certify that the requirements of the Privacy Act and OMB directives have been complied with including paperwork regulations, statistical standards or directives, and any other information policy directives promulgated under 5 CFR 1320.



______________________ ___________________

Sunday Aigbe Date

Chief, Regulatory Management Division

U.S. Citizenship and Immigration Services


1The Legal Arizona Workers Act, as amended, prohibits businesses from knowingly or intentionally hiring an “unauthorized alien” after December 31, 2007. Under the statute, an “unauthorized alien” is defined as “an alien who does not have the legal right or authorization under federal law to work in the United States.” The law also requires employers in Arizona to use the E-Verify system (a free web-based service offered by the U.S. Department of Homeland Security) to verify the employment authorization of all new employees hired after December 31, 2007.

2http://www.uscis.gov/portal/site/uscis/menuitem.5af9bb95919f35e66f614176543f6d1a/?vgnextoid=8459535e0869d110VgnVCM1000004718190aRCRD&vgnextchannel=534bbd181e09d110VgnVCM1000004718190aRCRD

