SUPPORTING STATEMENT B
Online Survey of Web Services Employers
OMB Control No.: 1615-NEW
COLLECTION INSTRUMENT: File No. OMB-70
B. Collection of Information Employing Statistical Methods
We are proposing to use statistical methods for our Online Survey of Web Services Employers, as described in Sections B.1-B.3.
The respondent universe will be defined by a list of Web Services employers provided by USCIS. The current list of 568 Web Services employers consists of all employers that are registered with the E-Verify Program as Web Services employers and that process worker verifications through a Web Services interface.
Among this population there are three types of Web Services employers:
Regular Web Services employers – employers that access and use E-Verify for their own employees through commercially available software or software that they develop themselves; they may or may not directly access E-Verify through the browser.
Web Services E-Verify Employer Agents (EEAs) – employers that develop and/or provide customized E-Verify software for their clients; they might also provide additional E-Verify services to their clients, usually for a fee.
Software developers – companies that often enroll as Web Services EEAs but provide only the software (a product), typically offering no E-Verify services to their clients.
The population may also be broken down by additional key characteristics, including company size, number of Tentative Nonconfirmation (TNC) findings received as a result of worker verification, and industry type (see table B-1).
Table B-1. Number and percent of Web Services companies, by selected characteristics

Characteristic                        | Number | Percent
Total                                 | 568    | 100.0
Industry Type                         |        |
   Employment/Temp Agencies           | 3      | 0.5
   High Risk                          | 83     | 14.6
   Other                              | 482    | 84.9
Employer size                         |        |
   1 to 4                             | 51     | 9.0
   5 to 9                             | 40     | 7.0
   10 to 19                           | 38     | 6.7
   20 to 99                           | 159    | 28.0
   100 to 499                         | 129    | 22.7
   500 to 999                         | 40     | 7.0
   1,000 to 2,499                     | 38     | 6.7
   2,500 to 4,999                     | 19     | 3.3
   5,000 to 9,999                     | 10     | 1.8
   10,000 and over                    | 44     | 7.7
Active status in the past 12 months¹  |        |
   Active                             | 470    | 82.7
   Inactive                           | 98     | 17.3
Number of TNCs²                       |        |
   None                               | 201    | 35.4
   1 to 10                            | 187    | 32.9
   11 to 20                           | 45     | 7.9
   21 to 50                           | 45     | 7.9
   51 or more                         | 90     | 15.8

¹ Employers were classified as active if they made transactions in the past 12 months.
² The number and percentage of companies with no TNCs include the 98 inactive companies.
Westat will survey the universe of 568 Web Services employers. Based on other surveys of employers enrolled in E-Verify, we expect at least an 80 percent response rate (roughly 455 of the 568 employers). Initial feedback from Web Services employers indicates a willingness to share information about the Web Services software packages that companies use for employment verification. Thus, we are optimistic that the response rate will be higher than 80 percent. If we do not obtain this response rate, we will conduct a nonresponse bias analysis.
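Because the frame characteristics in table B-1 are known for every employer in the universe, a nonresponse bias analysis can compare respondents with nonrespondents on those characteristics. The sketch below is a minimal illustration of such a comparison; the file name and variable names are hypothetical, not part of the study design.

```python
# Illustrative nonresponse bias check using frame data available for all
# 568 employers (file and variable names are hypothetical, not Westat's).
import pandas as pd
from scipy.stats import chi2_contingency

frame = pd.read_csv("web_services_frame.csv")  # one row per employer

# Compare respondents and nonrespondents on each Table B-1 characteristic.
for var in ["industry_type", "employer_size", "active_status", "tnc_category"]:
    table = pd.crosstab(frame[var], frame["responded"])
    chi2, p, dof, _ = chi2_contingency(table)
    print(f"{var}: chi2 = {chi2:.2f}, df = {dof}, p = {p:.4f}")
```

Characteristics on which the respondent and nonrespondent distributions differ markedly would then be candidates for nonresponse weighting adjustments.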
We view the software review portion of this study as a collateral case study to provide more detailed information about how the Web Services GUIs are similar to or different from the E-Verify browser GUI. Because this part of the study is exploratory, we held preliminary conversations with a few software developers who agreed to discuss how Westat could gain access to their software. We learned that very few Web Services employers that develop their own software, or software developers that develop customized E-Verify software packages for a fee, would be willing to participate in this portion of the study. Based on these conversations, we anticipate that at most 10 to 12 employers will participate in this portion of the study.
Information Collection Procedures
Westat will conduct a population survey. Therefore, we will not be performing sample selection, stratification, or estimation procedures related to sampling.
To minimize nonresponse, Westat will devote resources to developing and implementing approaches likely to achieve good respondent cooperation with the Online Survey of Web Services Employers. We expect high levels of cooperation with the evaluation among employers that have enrolled as Web Services providers. These employers have signed Memoranda of Understanding (MOUs) with the Department of Homeland Security (DHS) and have agreed to respond to inquiries about E-Verify. Specifically, the MOU states the employer’s responsibilities as follows:
“The Employer agrees to cooperate with DHS and SSA in their compliance monitoring and evaluation of E-Verify, including by permitting DHS and SSA, upon reasonable notice, to review Forms I-9 and other employment records and to interview it and its employees regarding the Employer’s use of E-Verify, and to respond in a timely and accurate manner to DHS requests for information relating to their participation in E-Verify.”
The techniques that will be used to ensure high response rates are:
Pretesting. A paper version of the survey was pretested with five Web Services employers. Pretesting helped ensure that the survey is not overly burdensome and that questions are clearly worded, easily answered, and relevant to the experiences of Web Services employers. More information on pretesting is provided in section 4 of this document.
Motivational material. Care will be taken in the final production of survey materials to:
Create a professional image for the study;
Emphasize the importance of participation towards shaping future directions in federal immigration policy;
Emphasize the steps that will be taken to ensure respondent confidentiality; and
Use language appropriate for the target population.
Aggressive follow-up. One of the major factors that increase study response rates is the use of aggressive follow-up procedures to gain cooperation with the study. The Online Survey of Web Services Employers, therefore, includes multiple contacts with selected respondents. More specifically, the data collection procedures consist of the following steps (summarized in the illustrative schedule after this list):
A personalized pre-notice letter will be included in the USCIS materials sent to all individuals identified as primary contact persons. This letter will be from USCIS (Attachment B) and will state that this is part of the evaluation effort the employers authorized when they signed the MOU. The letter will stress both the importance of participation to future employment verification efforts and the fact that USCIS will only use the information for research purposes.
A personalized email will be sent to all contact persons reiterating the importance of the study and providing information on how to log on to the website (Attachment C).
If the email results in a response indicating the email address is no longer valid, an email will be sent to the alternate contact person, if any.
If no email is provided for the primary contact person or if there is no alternate contact person for a non-valid email address, callers will research the company’s contact information online to identify a staff person in Human Resources or a similar office who might be knowledgeable about the appropriate contact person.
If calling the Human Resources Department does not result in identifying the correct contact person, a phone interviewer will call the company’s main number to determine the correct contact person and, if possible, obtain the name and contact information of an alternate person who can take responsibility for the study if the primary contact person is unavailable. The phone interviewers will also collect information on important changes in status among the contacted companies to determine whether they are out of scope for the study (e.g., the employer no longer provides E-Verify Web Services).
A reminder email will be sent to contact persons approximately one week after the initial contact.
Approximately two weeks after the reminder email, a second reminder email will be sent to non-respondents.
Approximately one to two weeks following the second reminder email, phone interviewers will contact non-respondents to ask whether they received the study materials and to encourage them to complete the survey. If necessary, reluctant respondents will be reminded of the MOU in which the employer agreed to participate in the evaluation. Information on how to access the online survey site will be provided, if necessary.
A second phone reminder will be made approximately two weeks after the first phone reminder. At that time, the interviewer will offer to send a hard copy survey if the respondent prefers to answer in this fashion. Again, nonrespondents will be reminded to complete the online survey and will receive specific log-in instructions, if necessary.
If necessary, a final contact will be made approximately two weeks after the second phone reminder. The non-respondents will be sent, via Federal Express, another cover letter, log-in information to access the survey Web site, and a hard copy survey. In addition to the above contacts, a thank you email will be sent to respondents that complete the survey.
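To make the timing of the sequence above explicit, the sketch below lays out the contacts as approximate day offsets derived from the one- to two-week intervals described in the steps. It is an illustrative summary only, not a fixed protocol.

```python
# Approximate contact schedule for the Online Survey of Web Services
# Employers, expressed as (day offset, contact, audience). Offsets are
# illustrative midpoints of the intervals described in the steps above.
CONTACT_SCHEDULE = [
    (0,  "USCIS pre-notice letter (Attachment B)", "all primary contacts"),
    (0,  "invitation email with log-in information (Attachment C)", "all contacts"),
    (7,  "first reminder email", "nonrespondents"),
    (21, "second reminder email", "nonrespondents"),
    (31, "first phone reminder", "nonrespondents"),
    (45, "second phone reminder; hard-copy survey offered", "nonrespondents"),
    (59, "Federal Express packet: cover letter, log-in information, hard copy", "nonrespondents"),
]
```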
While the survey data collection is in process, Westat will maintain a help desk (using a toll-free telephone number) that companies may call to ask questions about both the mechanics of the survey (such as how to access the survey and enter responses) and the survey content (e.g., if employers are uncertain of the meaning of a particular question).
Training. All individuals working on the study who will be in contact with potential respondents by phone or email will be trained in ways to optimize response without placing undue pressure on potential respondents. In addition to general survey procedures, they will be trained to respond to specific questions that are likely to be raised in this study. This training will include help desk personnel as well as telephone interviewers.
Nonresponse conversion. Experienced interviewers who are particularly skilled in nonresponse conversion will re-contact initial refusals. The major exception to this rule is for hard refusals (i.e., companies that have requested not to be called again).
Unit nonresponse adjustments. If the response rate is very high (95 percent or above), we will not adjust for survey nonresponse. If the response rate is between 80 and 94 percent, we will make adjustments for survey nonresponse. However, a full-fledged nonresponse bias analysis will be conducted if the response rate falls below 80 percent.
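As a rough illustration of the adjustment step, the sketch below applies a standard weighting-class adjustment under the assumption that every employer in this census carries a base weight of 1.0; the cell definitions and variable names are hypothetical, not the study's actual specifications.

```python
# Minimal sketch of a weighting-class nonresponse adjustment, assuming a
# base weight of 1.0 for every employer in this census. Each respondent's
# weight is inflated by the cell's total-to-respondent weight ratio.
import pandas as pd

frame = pd.read_csv("web_services_frame.csv")  # hypothetical frame file
frame["base_weight"] = 1.0

def adjust_cell(cell: pd.DataFrame) -> pd.DataFrame:
    cell = cell.copy()
    total = cell["base_weight"].sum()
    respondent_total = cell.loc[cell["responded"] == 1, "base_weight"].sum()
    # In practice, cells with too few respondents would be collapsed first.
    cell["final_weight"] = 0.0
    cell.loc[cell["responded"] == 1, "final_weight"] = (
        cell["base_weight"] * total / respondent_total
    )
    return cell

weighted = frame.groupby(
    ["employer_size", "industry_type"], group_keys=False
).apply(adjust_cell)
```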
Editing and data cleaning. A number of editing features will be built into the online survey. For example, if the respondent attempts to provide multiple answers to a question requiring a single response, the respondent will be asked to select only one response. Additional editing checks will be done after survey completion to check for completeness, inter-item consistency, extraneous remarks, and, for respondents completing a mail survey, proper adherence to skip instructions.
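The sketch below illustrates what such post-collection edit checks might look like; the item names and consistency rules are hypothetical examples, not the survey's actual specifications.

```python
# Illustrative post-collection edit checks; item names are hypothetical.
def edit_checks(record: dict) -> list:
    flags = []
    # Completeness: key analytic items must be answered.
    for item in ("software_source", "number_of_employees"):
        if record.get(item) in (None, ""):
            flags.append("missing: " + item)
    # Inter-item consistency: an employer reporting in-house software
    # development should not also name a software vendor.
    if record.get("software_source") == "developed_in_house" and record.get("vendor_name"):
        flags.append("inconsistent: in-house developer also reports a vendor")
    # Skip adherence (mail surveys): follow-up items should be blank when
    # the gate question was answered 'no'.
    if record.get("contacted_csc") == "no" and record.get("csc_satisfaction"):
        flags.append("skip violation: CSC satisfaction answered without CSC contact")
    return flags
```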
Item nonresponse adjustments. Although our procedures are designed to maximize item response rates, the analysis will need to confront the issue of missing data. Experience with previous surveys indicates that some respondents will omit responses to specific items (e.g., sensitive items) even when they provide most of the required data. By employing good survey data collection practices, we expect to keep the amount of missing data on any single variable to a very low level. However, if item nonresponse is unexpectedly high for any of the key analytic variables, hot deck imputation techniques will be used to impute missing-item values. Westat has a SAS macro, WESDECK, that performs hot deck imputation. Hot deck imputation replaces a missing value of a questionnaire item with a reported value from another employer with similar known characteristics (e.g., company size and type of Web Services employer). The employer with the missing item is referred to as the recipient, and the employer with the reported item as the donor. First, we will form imputation classes from auxiliary items reported by both donors and recipients; these auxiliary variables should be correlated both with the variable being imputed and with the propensity to respond to that variable. Within an imputation class, a donor will be selected randomly from the potential donors for each recipient, and the donor's reported value will be assigned to the recipient's missing value. This procedure will reduce potential nonresponse bias to the extent that the imputation classes are correlated with the item being imputed and with the propensity to respond to that item. In addition, imputing missing items will produce a rectangular, complete data set, at least for the key analytic items, preventing the inconsistencies that arise when analysts apply their own separate adjustments for missing data.
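The sketch below illustrates the within-class random hot deck logic described above. It is a simplified stand-in for the WESDECK macro, written in Python for illustration; the variable names are hypothetical, and the collapsing of sparse cells that a production routine would require is omitted.

```python
# Minimal sketch of random within-class hot deck imputation in the spirit
# of the procedure described above; this is NOT the WESDECK SAS macro.
import numpy as np
import pandas as pd

rng = np.random.default_rng(12345)

def hot_deck(df: pd.DataFrame, item: str, class_vars: list) -> pd.Series:
    imputed = df[item].copy()
    # Imputation classes are formed from auxiliary variables reported for
    # both donors and recipients.
    for _, cell in df.groupby(class_vars):
        donors = cell[item].dropna()
        recipients = cell.index[cell[item].isna()]
        if len(donors) > 0 and len(recipients) > 0:
            # Assign each recipient the reported value of a randomly
            # selected donor from the same imputation class.
            imputed.loc[recipients] = rng.choice(donors.to_numpy(), size=len(recipients))
    return imputed

# Hypothetical usage: impute a key item within size-by-employer-type classes.
# survey["key_item"] = hot_deck(survey, "key_item", ["size_class", "ws_employer_type"])
```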
For analyses involving just one or two variables that have not been subject to imputation, we will handle the problem by omitting the cases with missing data; or, in the case of categorical response variables, we will use an explicit “missing” or “unknown” category. When multivariate techniques involving several variables are used, analytic techniques for missing values will be used (such as using the variable mean or adding a dummy variable to reflect how the nonrespondents differ from the other companies).
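For the mean-substitution-with-dummy approach mentioned above, a minimal sketch (with a hypothetical variable name) might look like this:

```python
# Illustrative handling of residual missingness for a multivariate
# analysis: substitute the variable mean and add a dummy flagging the
# imputed cases, as described above. Variable names are hypothetical.
import pandas as pd

def mean_impute_with_flag(df: pd.DataFrame, var: str) -> pd.DataFrame:
    out = df.copy()
    out[var + "_missing"] = out[var].isna().astype(int)  # nonresponse dummy
    out[var] = out[var].fillna(out[var].mean())
    return out
```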
The Online Survey of Web Services Employers submitted in this request for clearance was pretested with five Web Services employers. Pretest respondents were asked to provide feedback on whether the questions are clearly worded, easily answered, and relevant to the company’s use or development of Web Services software. Respondents were also asked to identify any terminology used in the survey that requires further definition.
In general, pretest participants did not report any problems with the terms used in the survey. However, several participants offered suggestions for modifying questions to make them clearer or more relevant to the experiences of Web Services employers and software developers. The following substantive changes were made to the survey based on key findings from the pretest.
Several questions were dropped from the survey because participants reported too much overlap among questions. In addition, a few questions were added to address some concerns of pretest participants (e.g., their level of satisfaction with contacting USCIS and CSC).
Some skip patterns were modified to cast a wider net. For example, some pretest participants pointed out that questions about whether employers contacted CSC were relevant to employers that purchased their software from another company and not only those who developed the software.
Text boxes were added to several questions in order to allow survey respondents to provide additional information or explanations for their response to those questions.
The question on how employers obtained their software was modified to include an important response category for employers that purchased the basic software from another company but had the software modified for their use.
The question on how the software extracted data from other databases was modified to more accurately reflect the various processes.
Based on feedback from the pretest, Westat modified the survey; the revised instrument is attached (Attachment A).
The following statisticians were consulted on the statistical aspects of the design and analysis of the current study:
Huseyin Goksel
Westat
1600 Research Blvd.
Rockville, MD 20850
301-251-4395
The following individuals will collect and/or analyze data for the current study:
Basmat Parsad
Westat
1600 Research Blvd.
Rockville, MD 20850
301-294-3946
Denise Glover
Westat
1600 Research Blvd.
Rockville, MD 20850
301-251-2269
Atsushi Miyaoka
Westat
1600 Research Blvd.
Rockville, MD 20850
301-610-4948