
Annual Survey of Refugees



OMB Information Collection Request

0970 - 0033





Supporting Statement Part B –

Statistical Methods

June 2020



Submitted By:

Office of Planning, Research, and Evaluation

Office of Refugee Resettlement

Administration for Children and Families

U.S. Department of Health and Human Services


Mary E. Switzer Building

330 C Street, SW

Washington, D.C. 20201



B1. Respondent Universe and Sampling Methods

Respondent Universe and Target Population

The Annual Survey of Refugees (ASR) is an annual cross-sectional study measuring the characteristics and experiences of refugee individuals and households entering the United States in the previous five fiscal years.


The size of the U.S. refugee population varies by federal fiscal year, according to an allowable ceiling set by the President of the United States and required pre-arrival processing by the U.S. Department of State. The 2020 Annual Survey of Refugees, which will introduce a revised survey instrument (Instrument 2), will cover refugee households and their constituent individuals entering the U.S. in the time period of FY2015-FY2019. Table B1 presents the universe of refugee entrants by fiscal year for FY 2015 through FY 2019.


Table B1. Admitted Refugee Individuals by Federal Fiscal Year

Federal Fiscal Year    Refugee Individuals
2015                   69,933
2016                   84,994
2017                   53,716
2018                   22,496
2019                   30,000
Total                  261,139

Source: U.S. Department of State Office of Admissions, Refugee Processing Center, accessed 5/19/2020


Sampling Method

The sampling frame for the ASR is the Office of Refugee Resettlement (ORR) Refugee Arrivals Data System (RADS), which contains demographic information about the universe of refugee entrants to the United States. Each year, a sample of refugee Principal Applicants (PAs) will be drawn from RADS. The PA is the individual whose case formed the basis for the household’s administrative claim to refugee status. This individual is typically also the head of the household. The PA responds to the survey on behalf of all eligible adults in their household. The methodology for drawing the sample is as follows:


  • Contractors, in consultation with ORR, will analyze administrative data on refugee arrivals to determine which language groups to include in the survey administration. The goal is to maximize coverage of the population while minimizing the logistical challenges of serving small language groups. In 2019, the ASR was offered in 21 languages (including English), covering over 75% of eligible refugee entrants from the focal fiscal years. Achieving complete coverage of the remaining 25% of refugees would require adding about 200 languages, each averaging 25/200 = 0.125% representation in the sample. This amounts to fewer than 2 completed interviews per language in a 1,500-interview survey. The cost of recruiting and training both a male and a female interviewer for each language (because of the need to gender-match interviewers to respondents) for only 2 interviews would be prohibitive and would represent an inappropriate allocation of resources from a fit-for-use perspective. Moreover, the expected coverage bias would be small given the overall coverage rate of 75%. For instance, if a survey estimate (say, percent employed) were 40% and the noncovered population differed by 10 percentage points (say, 50%), the noncoverage bias would be only 0.25 × 10 = 2.5 percentage points, which is not meaningful from a policy perspective. Finally, the refugees observed in the ASR (i.e., the covered 75%) do not show drastically different survey estimates by language. This suggests that we would not expect large differences between covered and noncovered populations, and therefore that the noncoverage bias would be small.

  • Contractors will draw a stratified sample from the universe of primary applicants in eligible language groups. The principal stratum is a three-category “arrival cohort” (arrived in prior fiscal year; arrived two or three years ago; arrived four or five years ago). Within arrival cohort, data are further stratified by geographic sending region, language, age group, gender, and household size. These are proportionate strata, ensuring the resultant sample is representative of the eligible refugee population.

  • The survey will secure 500 completed interviews from each principal stratum; new arrivals are over-sampled to ensure statistical power to accurately capture the important first year of transition as well as detect meaningful differences between the three cohorts. We expect the maximum margin of error (MOE) at the 95 percent level of confidence to be +/- 4.8 percentage points for estimated percentages of refugees arriving in FY2019 and +/- 6.8 percentage points for percentage estimates of refugees arriving in a given year between FY2015 and FY2018.

  • A replicated sample design is used for sample management. Within each arrival cohort, the stratified sample is randomly partitioned into 30 smaller “snapshots.” This strategy allows the contractor to monitor the sample release and response production closely, maximizing response rate within arrival cohorts while securing the targeted number of completed surveys per cohort in the 12-week fielding period.
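The margins of error quoted above are consistent with the standard formula for an estimated proportion under a modest design effect. The sketch below is illustrative only: the design effect of 1.2 is an assumption chosen to reproduce the quoted figures, not a parameter published in this document.

```python
import math

def moe(n, deff=1.2, p=0.5, z=1.96):
    """Margin of error for an estimated percentage.

    Uses the worst case p = 0.5. The design effect deff = 1.2 is an
    assumption (not stated in this document) chosen to reproduce the
    quoted figures.
    """
    return z * math.sqrt(deff * p * (1 - p) / n)

# 500 completes in the newest cohort (a single fiscal year):
print(round(100 * moe(500), 1))  # 4.8 percentage points

# Older cohorts pool two fiscal years, so a single-year estimate
# draws on roughly 250 completes:
print(round(100 * moe(250), 1))  # 6.8 percentage points
```

Under these assumptions, 500 completes yield the quoted +/- 4.8 points for FY2019, and roughly 250 completes per single year within a two-year cohort yield the quoted +/- 6.8 points.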

Projected Response Rate for 2020

Survey administrations in 2017 and 2018 provide the best available estimates of expected response rate for future ASR administrations.


In 2017, the overall response rate to the ASR was 25%. Contractors were unable to update contact information for 33% of sampled individuals or to make contact with another 32%. Conditional on contact, 74% of contacted individuals completed a survey in 2017. In 2018, the overall response rate to the ASR was 21%. As in 2017, the response rate was driven by the inability to update contact information for (32%) or speak with (40%) sampled individuals. Conditional on successful contact, 75% of contacted individuals completed the survey in 2018.
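The overall response rates above decompose as the share of the sample successfully contacted times the cooperation rate conditional on contact. A quick arithmetic check:

```python
def overall_response(no_number, no_contact, coop):
    """Overall response rate as (share contacted) x (cooperation rate).

    no_number: share with no usable contact information
    no_contact: share never reached despite an available phone number
    coop: completion rate conditional on successful contact
    """
    contacted = 1.0 - no_number - no_contact
    return contacted * coop

# 2018 ASR: 32% untraceable, 40% unreachable, 75% cooperation
print(round(overall_response(0.32, 0.40, 0.75), 2))  # 0.21

# 2017 ASR: 33% untraceable, 32% unreachable, 74% cooperation
print(round(overall_response(0.33, 0.32, 0.74), 2))  # 0.26
```

The 2018 figures reproduce the reported 21% exactly; the 2017 figures reproduce the reported 25% to within rounding of the published disposition shares.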


Since updating the ASR’s administration protocol in 2016, ACF has actively monitored the ASR’s response rate and sources of non-response during survey administration. The principal driver of the lower response rate in 2018 (administered in early calendar year 2019) was an increase in our inability to reach sample members via phone, despite having an available phone number. Data on changes in sample disposition from the 2017 to the 2018 ASR are presented in Table B2.

Table B2. Sample Disposition by Category, 2016-2018 ASR

Disposition Category                                  2016 ASR   2017 ASR   2018 ASR   2018-2017 Difference
Completed Interview                                   24%        25%        21%        -4%*
Screened Refugee, Not Interviewed                     8%         9%         7%         -2%
Unable to Screen Refugee (available phone number)     32%        32%        40%        +8%*
Unable to Find Refugee (no available phone number)    36%        33%        32%        -1%

* Statistically significant difference at p < 0.05



Successfully securing responses to telephone surveys is a broad challenge in contemporary survey research, and is also consistently a challenge in the administration of the ASR. To respond to this known challenge, ACF has elected to explore the feasibility of additional survey modes.


Beginning in fall 2019, the contractor performed a systematic review of recent surveys of refugee populations in the U.S. and abroad, with the goal of identifying how survey mode is related to response rates and demographic response bias in this highly mobile, educationally and linguistically diverse population. The review will support recommendations for potential future improvements to the ASR that could result from incorporating additional survey modes, such as text message and/or web, particularly for ASR questions of key policy interest. ACF will consider these recommendations based on the strength of evidence for potential improvements in data quality and resource efficiency, and in the context of the overall resources available for this data collection effort. Should the findings of the field scan and available resources support changes to the ASR administration protocol, ACF will submit proposed changes to OMB for approval.


B2. Procedures for the Collection of Information

Design of Conceptual Framework

The redesign of the ASR instrument was the result of a multi-year process, beginning in fall 2016. Planning for the redesign included a literature review of social science research on refugee integration, an expert convening, question design, and field testing of proposed survey items with refugee respondents. The revisions are intended to collect information about factors that social science research suggests affect refugee resettlement outcomes, in order to help ORR better understand the experiences of the population it serves and to increase the practical utility of the data collection.

Domains in the revised survey include household demographics; experiences before arrival; human capital; economic self-sufficiency; social connection; well-being and the receiving community; health; and children and schools.


To enable the contextualization of data from the ASR, particularly for key measures reported in the Annual Report to Congress, such as labor force participation and use of public benefits, survey items were drawn from nationally-representative studies of the U.S. population whenever possible. The survey redesign included a review of items from existing studies of refugee populations in the U.S. and abroad.




Statistical Methodology for Stratification and Sample Selection

To ensure the quality of sample selection and stratification, additional statistical methods and procedures are implemented before and during the fielding process.


Prior to beginning subject tracing, the study team will conduct sample validation exercises to ensure that the stratified random sampling and replicate partitioning procedures performed as intended and that the resulting samples are representative of the intended inferential population.


During the fielding process, the replicate sample release procedure allows for close monitoring, adaptation, and continuous learning. Throughout the field period, the contractor will produce weekly summaries of survey progress by key demographic groups, in order to monitor sample representativeness and redouble efforts to secure participation from underrepresented populations as necessary.
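The replicate partitioning described above can be sketched as follows. This is a simplified illustration under stated assumptions: the replicate count of 30 comes from B1, but the partitioning mechanics, replicate sizes, and seeding scheme are invented for illustration and are not the production procedure.

```python
import random

def make_replicates(sample_ids, n_reps=30, seed=12345):
    """Randomly partition a cohort's stratified sample into n_reps
    'snapshot' replicates for staged release into the field.

    Sketch only: the seed and round-robin dealing are illustrative
    choices, not the contractor's documented procedure.
    """
    ids = list(sample_ids)
    rng = random.Random(seed)
    rng.shuffle(ids)
    # Deal shuffled cases round-robin into n_reps near-equal groups
    return [ids[i::n_reps] for i in range(n_reps)]

reps = make_replicates(range(900))
print(len(reps), len(reps[0]))  # 30 replicates of 30 cases each
```

Because each replicate is a random subsample of the cohort's stratified sample, releasing replicates one at a time preserves representativeness while letting field managers stop releasing sample once the completion target is met.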


Tracing of Respondents

The most recent contact information contained in ORR administrative data is collected by the U.S. State Department 90 days after the principal applicant’s arrival in the United States. Sampled individuals will undergo location tracing in an attempt to update their contact information in preparation for the survey administration.


The contractor will seek updated contact information for sampled respondents using the National Change of Address system and TransUnion Batch Lookup. Based on past experience with this survey’s administration, these are the most comprehensive, relevant sources of updated information, as travel loans from the U.S. State Department for refugees’ arrival in the United States are reported to the TransUnion credit bureau. Given the lack of U.S. credit history for recently-arrived individuals, other attempted batch lookup processes have proven ineffective.


As part of ACF’s effort to redesign the ASR instrument, contractors conducted substantial outreach to refugee-serving organizations and community groups to identify ways to improve respondent tracing and cooperation. This data collection was approved under the ACF generic clearance for pretesting (OMB No. 0970-0355; Pretest of the Annual Survey of Refugees; approved September 13, 2017). We continue to consider a sub-study of hard-to-trace sample members, to determine whether those who are harder to locate vary systematically on key outcomes of interest in the survey (economic outcomes, language ability, benefits use, etc.).


Advance Mail-out

The contractor will prepare and send an Introduction Letter and Postcard (Instrument 1) to each potential respondent, reflecting the updated time estimate for survey completion and the $2 token of appreciation for updating contact information. The letter introduces the survey and provides means to contact the research team with updated telephone information via pre-paid postal mail, email, or telephone. If a potential respondent does not update their information upon receipt of the letter, the contractor relies on the most up-to-date telephone number available from tracing efforts.


Interviewing

The contractor will use culturally sensitive interview methods, including matching interviewer and interviewee by gender and language, and avoiding calls on major religious days and holidays.

Telephone interviews will be conducted in the Principal Applicant’s preferred language, using computer-assisted telephone interviewing (CATI) protocol to ensure accurate and complete data collection. Interviewers will attempt to contact each selected refugee up to 10 times before final disposition is listed as “unable to contact.”


The study team will also:

  • Prepare a questionnaire reference book for use by the interviewers. Prior to interviewing, the contractor will provide training to the interviewers in the conduct of these interviews in order to reduce interviewer error. Interviewers will receive a thorough explanation of each survey question and identify logical and acceptable responses to questions; be briefed on their commitment to privacy; familiarize themselves with the flow of the CATI application; and then be evaluated to ensure an acceptable command of all concepts and technical aspects involved in the interview process. They will also be trained on handling respondent distress should it arise (a very rare occurrence in the ASR).

  • Provide ongoing monitoring of interview quality, including live listening to a sample of calls. Stronger interviewers will be assigned more difficult cases to maximize data quality.

  • Download and review data tables from the CATI system, including frequency tables, to identify any potentially erroneous anomalies.


Estimation Procedures

The ASR data are used to produce national estimates of key household and individual characteristics for official external release in ORR’s Annual Report to Congress. Additional analyses provide descriptive information to inform internal discussions for program management and improvement. All point estimates are accompanied by corresponding measures of error using 95% confidence intervals.


Analytic weights are used to produce statistically valid point estimates and calculate statistical uncertainty that accounts for clustering of individuals within households (for person level statistics). In 2017 and 2018, the nonresponse/post-stratification adjustment was developed by first conducting a Chi-square Automatic Interaction Detector (CHAID) Analysis to identify the factors most associated with survey response. The factors that emerged from this analysis (which are also available in RADS administrative data) were then used for the nonresponse weighting. We do not anticipate changes to the weighting procedure for 2020.
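The weighting-class nonresponse adjustment described above can be illustrated with a minimal sketch. The cell labels and base weights below are invented for illustration; in the ASR, the weighting classes come from the factors the CHAID analysis flags in RADS administrative data.

```python
from collections import defaultdict

# Hypothetical sample: each record has a weighting class ("cell"),
# a base (sampling) weight, and a response indicator.
sample = [
    {"cell": "A", "base_w": 10.0, "responded": True},
    {"cell": "A", "base_w": 10.0, "responded": False},
    {"cell": "B", "base_w": 20.0, "responded": True},
    {"cell": "B", "base_w": 20.0, "responded": True},
]

# Weighted totals and weighted respondent totals per class
tot = defaultdict(float)
resp = defaultdict(float)
for r in sample:
    tot[r["cell"]] += r["base_w"]
    if r["responded"]:
        resp[r["cell"]] += r["base_w"]

# Respondents inherit their class's nonresponse inflation factor,
# so each class's weighted total is preserved.
weights = [r["base_w"] * tot[r["cell"]] / resp[r["cell"]]
           for r in sample if r["responded"]]
print(weights)  # [20.0, 20.0, 20.0]
```

In cell A, the lone respondent absorbs the nonrespondent's weight (10.0 becomes 20.0); in cell B, with full response, weights are unchanged. The total weighted sample size is preserved, which is the defining property of this adjustment.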


The ASR data are not used to produce projections of future values.


Data Handling and Analysis

The ASR is administered via CATI with pre-programmed acceptable value ranges and skip patterns to mitigate data entry errors. As described above, during the fielding period the contractor periodically downloads and reviews data tables from the CATI system, including frequency tables, to identify and investigate any potentially erroneous anomalies.


As part of survey post-processing, the contractor compares respondent-provided household roster data to administrative data from ORR to ensure that only eligible refugees (arriving during the specified time period) are included in tabulations about refugee adults in the household.


The primary statistical product from the ASR is a set of tabulations included in ORR’s Annual Report to Congress. ORR presents the ASR results alongside other quantitative and qualitative information sources to fulfill its Congressionally-mandated reporting requirements, per the Refugee Act of 1980. In ORR’s Annual Report to Congress, ASR data are used in descriptive (bivariate) tabulations.

The analytic code that generates the contractor’s tabulations is returned to ACF as a deliverable of the contract, to facilitate reproducibility.


Data Use

ORR’s Annual Report to Congress includes a description of data quality and limitations in the text of the report and technical footnotes accompanying each table. The report also includes a brief technical appendix that provides additional information important to the interpretation of survey data.


The expansion of the ASR survey instrument provides new opportunities to generate descriptive information about the characteristics, experiences, and outcomes for refugee households. A restricted use analytic file and detailed technical documentation are produced for ACF’s internal use, and will support the expansion of in-house analysis of the survey data.


ACF is interested in making data collected through federal contracts available for secondary research use. To this end, a public use analytic file from the ASR will be generated to properly protect respondent privacy while enabling use by external researchers. These files are archived annually on the website of ICPSR at the University of Michigan. The ICPSR archiving procedure sets standards for the form and content of technical documentation and codebooks necessary to support secondary analysts’ understanding of the data’s production and its limitations. The ASR public use technical documentation also includes sample analytic code and detailed instructions to assist secondary researchers in the correct application of analytic weights to generate representative point estimates and correctly measure statistical uncertainty.






Burden Reduction

In the spring of 2019, a working group of staff social scientists from ORR, OPRE, and HHS’s Office of the Assistant Secretary for Planning and Evaluation (ASPE) conducted an internal review of pre-test survey items to identify candidates for streamlining. The group assessed the contractor’s findings on the scientific validity of measures from the pretest and additionally considered the policy and programmatic relevance of candidate survey items. This analysis resulted in a reduction in the number of survey questions. The currently-approved instrument has 99 unique items.

Efforts to further reduce burden involved streamlining the number of questions asked for each adult member of the household roster and the interviewer instructions for rostering. Demographic data will be collected about all household members, of any age, up to 15 people. Moreover, to increase the quality of proxy-reported data, we propose that PAs be asked economic questions only for eligible refugee family members aged 16+ (rather than all eligible refugees aged 16+, as is currently the case). This will further reduce burden for households where non-family members are in residence.

At the conclusion of this exercise, the number of questions increased for households where only the Principal Applicant is eligible for the survey, from 99 to 128. For a two-adult household, the instrument increased by 9 questions, and for a three-adult household, the instrument decreased by 11 questions. The strategy of reducing household roster items was intended to reduce disproportionate burden on large households and to increase the reliability and validity of the instrument by reducing the portion of questions that require proxy reporting for other adults. Table B3 provides a comparison of burden between the currently approved instrument and the revised instrument:

Table B3. Comparison of Number of Questions: Currently Approved and Revised ASR

Questions Asked of:                  Currently-Approved ASR   Proposed Revised ASR   Difference
Principal Applicant only             36                       85                     +49
All roster individuals aged 16+      63                       43                     -20

Questions by Number of Eligible Roster Individuals:
PA only                              99                       128                    +29
2                                    162                      171                    +9
3                                    225                      214                    -11
4                                    288                      257                    -31
5                                    351                      300                    -51



B3. Methods to Maximize Response Rates and Deal with Non-Response

Maximizing Response Rates

The primary strategy for maximizing response rates while securing the target number of completed surveys will be the replicated sampling strategy outlined in B1 above. This allows the contractor to closely follow the response rate by arrival cohort and release further sample into the field if production within a cohort is lower than expected. In both 2017 and 2018, the oldest cohort (4 or 5 years since arrival) required additional replicate releases to meet production goals, reflecting that non-traceability is higher among refugees who have been in the United States longer. Post-participation tokens of appreciation will also be used to increase participation.


However, we are proposing an additional avenue to increase the response rate. Response rates are decreasing over time (from 25% in the 2017 ASR to 21% in the 2018 ASR), and if this trend continues it could diminish the quality of ASR statistics in the Annual Report to Congress. Historically, the majority of non-response in this survey comes from the inability to successfully contact respondents, including those for whom we have updated telephone numbers from batch look-up. This signals that participants are no longer voluntarily updating their numbers or are no longer incentivized to participate. To combat the decreasing response rate, we propose to offer an experimental $2 token of appreciation; this strategy is supported in the survey literature1. The token of appreciation will accomplish two goals: 1) encourage more voluntary telephone updates; and 2) familiarize sample members with the information collection. Both are important to the overarching goal of the ASR: to inform the U.S. government on the overall well-being of its refugee population. The ASR has demonstrated a high level of respondent participation conditional on successful contact (74-75%) and no substantial non-response bias on observable demographic characteristics, conditional on successful contact, under this token of appreciation structure.


Non-Response

Analysis of survey- and item-level non-response bias is a routine task in the production of ASR analytic files, survey weights, and data documentation. The contractor takes advantage of ORR administrative (RADS) data on the universe of refugees eligible for the survey to assess non-response bias on observable characteristics.

At the completion of data collection, the contractor calculates analytic weights to enable nationally-representative point estimates and the calculation of statistical uncertainty that accounts for clustering of individuals within households. These weights include a base (sampling) weight reflecting the refugee household selection probability. Because sample allocations of each cohort are managed separately, selection probabilities vary by the size of the arrival cohort population and amount of the sample released into the field. Weights also include a post-stratification adjustment to correct for differential non-response across cohort and demographic subgroups, aligning the data to known population distributions taken from ORR administrative data. Both household level (i.e., PA) and person level analytic weights are developed for the ASR.


Identifying Measurement Error

ACF employs a continuous quality improvement framework during ASR survey administration. In the past, this effort has resulted in qualitative efforts to understand and reduce measurement error, such as debriefing field interviewers following the survey administration, and using that information to update item translations and interviewer training materials to improve the comparability and quality of data across the ASR’s many languages and cultural groups. Since the new survey instrument will require fresh translation into the survey’s languages, we anticipate employing similar quality improvement processes over the course of this approved data collection.


Generalizability of Results

This study is intended to produce national estimates of the characteristics and experiences of refugee households and refugee individuals aged 16 and older entering the United States in the previous five fiscal years.

As described in B1 above, the 2018 ASR was offered in 17 languages and the 2019 ASR in 21 languages, covering approximately 75% of refugee entrants each year. Households speaking less-frequent languages (typically less than 1% of the total refugee population per omitted language) are intentionally excluded from the sample, and this limitation to national representativeness is noted in all technical documentation, written products, and digital products associated with the data collection.



B4. Test of Procedures or Methods to be Undertaken

Survey Pre-Testing

Following the development of the survey’s conceptual framework and a list of potential survey items, the contractor conducted preliminary field testing of the proposed revised instrument under ACF’s generic clearance for pre-testing (OMB No. 0970-0355; Pretest of the Annual Survey of Refugees; approved September 13, 2017). One group of pre-testing respondents (58) was administered the full survey followed by a semi-structured interview about their resettlement experience. The goal of the qualitative interviews was to assess whether the survey’s conceptual framework accurately addressed key dimensions of the resettlement experience, as understood by refugees themselves. The other group of pre-test respondents (47) participated in survey administration with cognitive interviewing to assess the survey items’ reliability, validity, and intelligibility across linguistically and culturally diverse populations.

In the spring of 2020, the latest round of pre-testing included a full administration of the proposed redesigned questionnaire as well as nineteen cognitive questions with 8 respondents (not subject to the PRA). The aim of the cognitive questions was to probe comprehension and solicit feedback to improve the instrument. Mock interviews were conducted to get an accurate sense of interview timing and to estimate the length of interview by household size and language. The pretest concluded with an overall average timing of approximately 48 minutes for the refined survey instrument.



B5. Individuals Consulted On Statistical Aspects and Individuals Collecting and/or Analyzing Data

Dr. Xiayun Tan

Social Science Research Analyst

ACF Office of Refugee Resettlement

Mary E. Switzer Building, 5th Floor

330 C Street, SW, Washington, DC 20201

(202) 260-6768

[email protected]


Dr. Nicole Deterding

Senior Social Science Research Analyst

ACF Office of Planning, Research, and Evaluation

Mary E. Switzer Building, 4th Floor

330 C Street, SW, Washington, DC 20201

(202) 205-0742

[email protected]


Robert Santos

Vice President and Chief Methodologist

The Urban Institute

(202) 261-5904

[email protected]


Dr. Hamutal Bernstein

Principal Research Associate

The Urban Institute

(202) 261-5840

[email protected]



Attachments

Instrument 1 - Introduction Letter and Postcard (2020 Proposed)

Instrument 2 - ORR-9 Annual Survey of Refugees (2020 Proposed)

Appendix A - Crosswalk of Current and Revised ASR Instruments

Appendix B - Roundtable Participant List

Appendix C - Current IRB Packet 11.19.18



1 Mercer, A., Caporaso, A., Cantor, D., and Townsend, R. (2015) How Much Gets You How Much? Monetary Incentives and Response Rates in Household Surveys, Public Opinion Quarterly, Vol. 79, Issue 1, Spring 2015, Pages 105–129, https://doi.org/10.1093/poq/nfu059. (See Table 5.)


