B. Collections of Information Employing Statistical Methods
The New Applicant Survey (NAS) will produce a nationally representative sample of working-age adults who have recently applied for disability benefits from the Social Security Administration (SSA). We are designing the sample to yield a total of 10,000 completed surveys. The information collected through this survey will help SSA understand more about sampled individuals' experiences before and during the application process, as well as the pathways they take after applying, in order to identify potential policies or interventions that could improve their experience or support later work efforts.
The NAS will document the characteristics of new applicants and explore a variety of pre‑ and post-application experiences, especially as these pertain to work and health, as well as access to relevant support services including knowledge of, interest in, and access to application support and representation.
The survey aims to answer the following initial research questions:
What are the pre- and post-application employment experiences of awarded and denied Social Security Disability Insurance (SSDI) and Supplemental Security Income (SSI) applicants?
What employment-, vocational-, medical-, or income-related services and supports did applicants use leading up to and since application?
What sources of information about SSDI or SSI did applicants use or have access to?
What were the applicants’ experiences with representation during the application or post-application periods?
In addition to addressing the initial research questions, the survey data will allow researchers to conduct analyses of responses among policy-relevant subgroups. These subgroups will include: (1) individuals at different stages of the application and appeals process (e.g., applicants who completed an initial application, applicants who have filed an appeal, etc.); (2) individuals with different outcomes (e.g., applicants who were allowed, applicants who were denied, and applicants with no decision); (3) applicants with and without legal representation; and (4) applicants with different policy-relevant characteristics (e.g., geographic region, age, impairment, demographic characteristics).
Respondent Universe and Sampling Methods
We will randomly select disability applicants from an administrative data file containing individuals who applied within a one-year period; we will determine the specific one-year period based on the survey timeline. First, we will create a single sampling frame using applicant characteristics available in the administrative data to identify disability applicants eligible to complete the survey. We will clean the data file to remove duplicates based on Social Security Number (SSN), or a deidentified proxy based on SSN, prior to drawing the sample.
We will use stratification in the sampling to ensure representation of policy-relevant subgroups. We are particularly interested in the experiences of disability applicants at different stages in the application and appeals process, as well as applicants with different demographic characteristics and experiences with legal representation. We will select a single sample that captures applicants at every stage of the application process. If applicants have moved to another stage by the time they complete the survey, the instrument will capture that information retrospectively.
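To illustrate the selection step, the sketch below deduplicates the frame on the SSN-based proxy and draws an equal-probability sample within strata defined by application stage. The file and field names here are hypothetical placeholders, not SSA's actual administrative schema.

```python
import pandas as pd

# Hypothetical one-year administrative extract; column names are placeholders.
frame = pd.read_csv("applicant_extract.csv")

# Remove duplicate applicants using the deidentified SSN-based proxy,
# keeping the most recent application record for each person.
frame = (frame.sort_values("application_date")
              .drop_duplicates(subset="ssn_proxy", keep="last"))

# Draw an equal-probability (proportionate) sample within each stratum,
# here stratifying on application stage.
TOTAL_SAMPLE = 40_000
sampling_rate = TOTAL_SAMPLE / len(frame)
sample = (frame.groupby("application_stage", group_keys=False)
               .apply(lambda stratum: stratum.sample(frac=sampling_rate,
                                                     random_state=2024)))

sample.to_csv("nas_selected_sample.csv", index=False)
```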
Exhibit 1 provides a summary of the expected distribution of subgroups of interest, using population distributions for past disability applicants drawn from SSA administrative records. The sample size assumes a 25 percent response rate, resulting in a total sample of 40,000 applicants to achieve an expected 10,000 completed surveys. The 2019 National Beneficiary Survey (NBS) of disability beneficiaries, OMB No. 0960‑0827, yielded a 36 percent unweighted response rate. There are several reasons why the NAS will likely yield a lower response rate than the NBS. First, the NAS is a survey of disability applicants rather than beneficiaries; applicants tend to be more difficult to contact than beneficiaries because they are less likely to have stable addresses or contact information (Taylor et al., 2021). Furthermore, overall trends in survey response indicate that response rates for new survey efforts will likely be substantially lower than for surveys completed in 2010 (Stedman et al., 2019).
Exhibit 1 also shows, in the fourth column, the expected precision of statistics calculated for the different sample sizes. We set the values for the proportion and the design effect to be conservative: we assume that each population estimate of interest is a proportion of 0.5 (50 percent), because this value yields the maximum standard error and therefore the maximum (most conservative) margin of error, defined as the half-width of the 95% confidence interval (CI). The resulting 95% CI half-widths fall mostly within the one to two percentage point range, which we expect to be more than sufficient to detect meaningful differences between these subgroups.
In addition, Exhibit 1 shows the expected half-width of the 95% CI for questions in modules that, because of skip patterns, we expect only 50 percent of sample members to complete (yielding an expected 5,000 responses for analysis). While half-widths for the sample of 5,000 respondents are larger than those for the full sample of 10,000 respondents, we still expect subgroup comparisons to yield CI half-widths of 5 percentage points or less, despite drawing on responses from half of the sample.
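For reference, the half-widths shown in Exhibit 1 follow the standard margin-of-error formula for a proportion, inflated by a design effect. For the full sample of 10,000 completed surveys, using an illustrative design effect of 1.2 (the exhibit does not report the exact value assumed):

$$\text{half-width} = z_{0.975}\sqrt{\frac{\text{deff}\cdot p(1-p)}{n}} = 1.96\sqrt{\frac{1.2 \times 0.5 \times 0.5}{10{,}000}} \approx 0.011,$$

or roughly 1.1 percentage points, consistent with the "Total" row of the exhibit.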
Exhibit 1. Expected distribution of subgroups of interest for NAS, assuming an equal probability sample

| Subgroup | % of population | Sample size^a | Full sample: Target completed surveys | Full sample: Expected half-width of 95% CI (for P=0.5) | Modules: Target completed surveys | Modules: Expected half-width of 95% CI (for P=0.5) |
|---|---|---|---|---|---|---|
| Total | 100 | 40,000 | 10,000 | 1.1 | 5,000 | 1.6 |
| Application stage (most recent application) | | | | | | |
| Initial application | 64.9 | 25,960 | 6,490 | 1.4 | 3,245 | 2.0 |
| Reconsideration appeal | 16.7 | 6,680 | 1,670 | 2.7 | 835 | 3.9 |
| Appeal to Administrative Law Judge (ALJ) | 18.4 | 7,360 | 1,840 | 2.6 | 920 | 3.7 |
| Most recent decision | | | | | | |
| No decision (pending) | 34.0 | 13,600 | 3,400 | 1.9 | 1,700 | 2.7 |
| Allowed | 18.3 | 7,320 | 1,830 | 2.6 | 915 | 3.7 |
| Denied | 47.7 | 19,080 | 4,770 | 1.6 | 2,385 | 2.3 |
| Appointed representative (AR) | | | | | | |
| Assigned an AR | 59.7 | 23,860 | 5,965 | 1.4 | 2,983 | 2.0 |
| Has not assigned an AR | 40.3 | 16,140 | 4,035 | 1.8 | 2,018 | 2.5 |
| Completed consultative exam | | | | | | |
| Yes | 83.1 | 33,248 | 8,312 | 1.2 | 4,156 | 1.7 |
| No | 16.9 | 6,752 | 1,688 | 2.7 | 844 | 3.8 |
| Age categories | | | | | | |
| 18-29 | 14.1 | 5,620 | 1,405 | 3.0 | 703 | 4.2 |
| 30-39 | 15.7 | 6,292 | 1,573 | 2.8 | 787 | 4.0 |
| 40-49 | 21.1 | 8,452 | 2,113 | 2.4 | 1,057 | 3.4 |
| 50-59 | 33.4 | 13,348 | 3,337 | 1.9 | 1,669 | 2.7 |
| 60-64 | 15.7 | 6,288 | 1,572 | 2.8 | 786 | 4.0 |
| Urban/Rural | | | | | | |
| Urban | 75.0 | 30,000 | 7,500 | 1.3 | 3,750 | 1.8 |
| Rural | 25.0 | 10,000 | 2,500 | 2.2 | 1,250 | 3.2 |
SOURCE: We obtained the population distribution via an independent tabulation of disability applications filed from December 1, 2022, through June 30, 2023, drawn from SSA's Structured Data Repository.
NOTES: ^a Assumes a 25% expected response rate.
We also examined the power to detect differences between the “Reconsideration appeal” subgroup and the ALJ subgroup. With the sample sizes given in Exhibit 1, we expect to detect, with 80% power, a difference of 5.4 percentage points between the two subgroups (e.g., proportions of 50 percent for the “Reconsideration appeal” group and 55.4 percent for the ALJ group).
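As a rough cross-check of this power statement, the sketch below computes the minimum detectable difference between two independent subgroup proportions using the standard normal-approximation formula, with the completed-survey counts from Exhibit 1. The design effect of 1.3 is an illustrative assumption (not a value stated in this document) that approximately reproduces the 5.4 percentage point figure.

```python
import math
from statistics import NormalDist

def min_detectable_diff(n1, n2, p1=0.5, deff=1.3, alpha=0.05, power=0.80):
    """Minimum detectable difference between two independent proportions
    (two-sided test, normal approximation), with variances inflated by an
    assumed design effect."""
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)   # ~1.96 for alpha = 0.05
    z_beta = NormalDist().inv_cdf(power)            # ~0.84 for 80% power
    n1_eff, n2_eff = n1 / deff, n2 / deff           # effective sample sizes
    diff = 0.05                                     # starting guess
    for _ in range(100):                            # simple fixed-point iteration
        p2 = p1 + diff
        se = math.sqrt(p1 * (1 - p1) / n1_eff + p2 * (1 - p2) / n2_eff)
        diff = (z_alpha + z_beta) * se
    return diff

# Completed surveys for the "Reconsideration appeal" and ALJ subgroups (Exhibit 1)
print(round(100 * min_detectable_diff(1670, 1840), 1), "percentage points")
```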
Based on our previous experience, we also recommend drawing a reserve sample of 20,000 applicants to accommodate lower-than-expected response rates. If needed, we will release the reserve sample in blocks of 1,000 individuals at a time until we reach the target of 10,000 completed surveys. Drawing a reserve sample and increasing the pool of potential respondents will help mitigate challenges related to locating applicants with incorrect contact information and to applicants' failure to respond to survey requests.
Procedures for Collecting the Information
As described in Part A, our approach is a mixed-mode design offering applicants the ability to complete the survey by web, paper, or telephone. Our core data collection strategy is a sequential push-to-web design, with the mail mode introduced in the fourth mail communication and the telephone mode introduced in the fifth mail communication. In addition, we embedded experiments in this core design, which we discuss in Part A.
The contact protocol for the core survey includes the following steps:
A few weeks prior to starting data collection, SSA will announce the survey on its website. SSA will also disseminate survey information to SSA field offices, teleservice centers, and line staff.
A few weeks prior to the first mailing to applicants, we will send an email to appointed representatives (ARs) through their professional associations. The email will alert ARs that SSA is conducting the survey and that we may select some of their clients for the survey sample. It will ask ARs to encourage their clients to participate in the survey if we select them and will include a link to the survey website.
A few days prior to the first mailing to applicants, we will send an email to ARs who represent applicants selected for the survey sample. The email will alert the AR that we selected their client for the survey sample and will ask the AR to encourage the client's participation in the survey.
We will send the first USPS mailing to sampled applicants. The mailing will include an introductory letter, a survey information sheet, and a visible $2 cash pre‑pay incentive. The letter will highlight that participation in the survey is voluntary, explain that we will keep the applicants' information confidential, and describe the incentive applicants will receive for completing the survey. The letter will also include a QR code and link to the survey website, a personal password to access the web version of the survey instrument, and a toll-free telephone number and email address where applicants can direct questions about the survey. An SSA official will sign the letter, which will also include the SSA logo.
One week after the first mailing, we will send the second USPS mailing to all sampled applicants. The mailing will consist of a fold-over postcard that repeats the key information from the initial letter, including the link and QR code to the survey website, the personal password to access the web version of the survey instrument, the toll-free number, and the promised incentives. In addition, we will send an email reminder to applicants for whom SSA has an email address and a text message reminder to applicants for whom SSA does not have an email address.
One week after the second mailing (postcard), we will send the third USPS mailing to non-respondents. This mailing will include a letter with similar content to the first letter, plus a non-visible $2 cash pre-pay incentive for a subset of sampled participants. We will also send an email reminder to non-respondents for whom SSA has an email address, and a text message reminder for those for whom we have a phone number.
Two weeks after the third mailing, we will send the fourth USPS mailing to non‑respondents. This mailing introduces the paper mode and will include a letter, a paper version of the survey instrument, and a postage-paid return envelope. Applicants will still have the option of responding via the web. About a week prior to this mailing, we will send an email or text message to those for whom we have email addresses and/or phone numbers, alerting them to be on the lookout for the mailing containing the hard-copy questionnaire.
Finally, two weeks after the fourth mailing, we will introduce the telephone mode. We will call non-respondents up to three times. We will also send an email message to non-respondents for whom SSA has an email address to alert them that they can contact the Survey Help Desk to complete the survey via a telephone interview. We will send a text message alert to non‑respondents for whom we have a phone number explaining that they can contact the Survey Help Desk to complete the survey via telephone.
Both our telephone interviewers and respondents who complete the survey via the website will access a web version of the survey instrument, allowing a single database to house all completed surveys. This design will facilitate easy access to both the English and Spanish versions of the instrument for both telephone interviewers and web respondents. The paper and telephone versions of the survey instrument will mirror the web version, with some format adaptations appropriate to the mode and to minimize respondent burden. We will integrate the instrument with the survey management system.
The start of the survey instrument will display informed consent language. In the web and paper versions of the instrument, we will require respondents to click on a radio button or check a box to provide consent before continuing to the survey. For the telephone version of the instrument, the interviewer will read the informed consent language to the respondent, ask the respondent to provide verbal consent, and then click on a radio button indicating that the respondent has done so.
Methods to Maximize Response Rates
The data collection procedures will involve multiple strategies to maximize response rates:
Incentives
We will offer both pre-incentives and post-incentives to maximize response rates:
$2 pre-incentive with first USPS mailing. Research shows that pre-paid incentives of this size significantly increase response to both web surveys (e.g., Messer & Dillman, 2011) and telephone surveys (Cantor et al., 2008). The meta‑analysis by Mercer et al. (2015) found that incentives of this size increased response rates by approximately 11 to 16 percentage points, depending on the mode. A subset of sample members will receive a second $2 pre-incentive with the third USPS mailing as part of the experiments described below.
Two levels of post-incentives for survey completion. Our push-to-web design offers a larger incentive to complete the web survey. Web survey respondents will receive $40 compared to $30 for paper and phone survey respondents. As part of the experiments described below, subsets of sample members will receive either (1) $40 for completing by web before the date of the fourth contact and $30 after that date or (2) $40 for completing by web or paper before the date of the fourth contact and $30 after that date.
Experiments to improve participation through varying modes and incentives
As described in Part A, we will randomly draw a subsample of applicants to test the impact of varying modes and incentive structures on response rates. We will test the impact of:
1. Offering the opportunity to complete the survey by web first and paper second (sequential) versus offering web and paper at the same time (concurrent).
2. Offering a higher “early bird” incentive for completing the survey by the date of the fourth contact.
3. Providing a second $2 incentive in the third mailing.
Mixed-mode data collection
We designed our mixed-mode data collection strategy to maximize response rates. The design uses three modes: web, paper, and telephone. Our core data collection design strategy is a sequential push-to-web design with the mail mode introduced during the fourth mail communication and the telephone mode introduced during the fifth mail communication. As described in Part A, we embedded experiments in the design to test the impact of varying modes and incentive structures on response rates. Our contractor’s proprietary survey delivery system integrates a customized survey software application and management system to accommodate the large volume of respondents and the simultaneous administration of multiple surveys.
Testing and accounting for non-response bias
Although we will make efforts to achieve as high a response rate as practicable with the available resources, nontrivial nonresponse losses can occur. As specified in the Standards and Guidelines for Statistical Surveys published by the Office of Management and Budget, a nonresponse bias analysis (NRBA) is required if the overall unit response rate for a survey is less than 80 percent (Guideline 3.2.9).
In general, we will use weighting adjustments to compensate for nonresponse. The purpose of the NRBA is to assess the impact of nonresponse on the survey estimates and the effectiveness of the weighting adjustments in lessening potential nonresponse biases.
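One standard form of such an adjustment, shown here for illustration (the final specification will depend on the weighting classes identified through the analyses described below), multiplies each respondent's base weight by the inverse of the weighted response rate within its nonresponse adjustment cell:

$$w_i^{\text{adj}} = w_i \times \frac{\sum_{j \in s_c} w_j}{\sum_{j \in r_c} w_j},$$

where $w_i$ is the base (design) weight, $s_c$ is the set of sampled cases in adjustment cell $c$, and $r_c$ is the subset of those cases that responded.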
We will conduct the following types of analyses to evaluate possible nonresponse biases:
• Comparing characteristics of non-respondents (or the total sample) to those of respondents using information available for both non-respondents and respondents. For example, we will use variables available from the SSA Structured Data Repository available for all applicants, such as impairment, age, work history, and education, to compare respondents and non-respondents.
• Modeling response propensity using multivariate analyses. We will apply logistic regression models and chi-square automatic interaction detection (CHAID) analysis to identify significant predictors of nonresponse, using variables available from the sampling frame as independent variables in the models. CHAID is a tree-based segmentation method that recursively splits the sample using chi-square tests to find the predictor categories most strongly associated with response. This analysis will inform the specification of the nonresponse weighting classes that we will use to adjust the sampling weights; we will use CHAID to develop the weighting cells, drawing on information available in the sampling frame. The CHAID algorithm provides an effective and efficient way of identifying significant predictors of applicant nonresponse. A sketch of this step appears at the end of this subsection.
• Evaluating differences between unadjusted (i.e., base-weighted) estimates of selected sampling frame characteristics based on the survey respondents and the corresponding population (frame) parameters. In the absence of nonresponse, the unadjusted weighted estimates are unbiased estimates of the corresponding population parameters. This analysis provides an alternative way of assessing how nonresponse may have affected the distribution of the respondent sample and, thus, potentially the sample-based estimates.
• Comparing weighted survey estimates (e.g., selected error rates by type) using unadjusted (base) weights versus nonresponse-adjusted weights. We will conduct this analysis after developing the final nonresponse-adjusted weights; it will provide a measure of how well the weight adjustments compensated for differential nonresponse.
If the respondents differ in characteristics from the total sample, we will account for this in developing the survey weights.
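The sketch below illustrates, under assumed variable and file names, how the propensity modeling, weighting-cell formation, and weighting adjustment described above might be prototyped. It uses a logistic regression from statsmodels and an ordinary classification tree from scikit-learn as a stand-in for CHAID; production work would use software that implements CHAID's chi-square splitting rules directly.

```python
import pandas as pd
import statsmodels.api as sm
from sklearn.tree import DecisionTreeClassifier

# Hypothetical analysis file: one row per sampled applicant, with frame
# variables, a base (design) weight, and a 0/1 response indicator.
sample = pd.read_csv("nas_sample_with_response_status.csv")
predictors = ["age_group", "impairment_group", "application_stage", "urban_rural"]

X = pd.get_dummies(sample[predictors], drop_first=True).astype(float)
y = sample["responded"]

# 1. Logistic regression to identify significant predictors of response.
logit = sm.Logit(y, sm.add_constant(X)).fit(disp=False)
print(logit.summary())

# 2. Tree-based segmentation to form nonresponse weighting cells
#    (a CART stand-in for CHAID; each terminal node becomes a cell).
tree = DecisionTreeClassifier(max_depth=3, min_samples_leaf=200, random_state=0)
tree.fit(X, y)
sample["nr_cell"] = tree.apply(X)

# 3. Weighting-class adjustment: inflate respondents' base weights by the
#    inverse of the weighted response rate within each cell.
cell_totals = sample.groupby("nr_cell")["base_weight"].sum()
resp_totals = sample[y == 1].groupby("nr_cell")["base_weight"].sum()
adjustment = cell_totals / resp_totals

respondents = sample[y == 1].copy()
respondents["nr_adj_weight"] = (respondents["base_weight"]
                                * respondents["nr_cell"].map(adjustment))
```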
Tests of Procedures
We received generic clearance from OMB on December 5, 2023, to conduct cognitive testing as well as pre-testing of the questions. Under the generic clearance, the contractor conducted a cognitive test of new and revised survey items to assess respondent understanding of the questions, followed by a pretest of the survey to confirm respondent burden and assess question flow. We used the results of the tests to refine the survey items and minimize burden.
Statistical Agency Contact for Statistical Information
For further information, please contact the following staff members:
Jarnee Riley, M.S., Project Director
Telephone: 240-453-2724
Email: [email protected]
Mustafa Karakus, Ph.D., Senior Researcher
Telephone: 240-370-4907
Email: [email protected]
Jill DeMatteis, Ph.D., Lead Statistician
Telephone: 301-517-4046
Email: [email protected]
Jeffrey Taylor, Ph.D., Lead Analyst
Telephone: 301-212-2174
Email: [email protected]
References
Cantor, D., O'Hare, B., and O'Connor, K. (2008). The use of monetary incentives to reduce non-response in random digit dial surveys. In J.M. Lepkowski, C. Tucker, J.M. Brick, E. de Leeuw, L. Japec, P.J. Lavrakas, M.W. Link, and R.L. Sangster (Eds.), Advances in telephone survey methodology (pp. 471-498). New York, NY: Wiley.
Dillman, D.A., Smyth, J.D., and Christian, L.M. (2009). Internet, mail, and mixed-mode surveys: The tailored design method (3rd ed.). Hoboken, NJ: John Wiley & Sons.
Mercer, A., Caporaso, A., Cantor, D., and Townsend, R. (2015). How much gets you how much? Monetary incentives and response rates in household surveys. Public Opinion Quarterly, 79(1), 105-129. doi: 10.1093/poq/nfu059
Messer, B.L., and Dillman, D.A. (2011). Surveying the general public over the Internet using address-based sampling and mail contact procedures. Public Opinion Quarterly, 75(3), 429-457. doi: 10.1093/poq/nfr021
Stedman, R.C., Connelly, N.A., Heberlein, T.A., Decker, D.J., and Allred, S.B. (2019). The end of the (research) world as we know it? Understanding and coping with declining response rates to mail surveys. Society & Natural Resources, 32(10), 1139-1154. doi: 10.1080/08941920.2019.1587127
Taylor, J.A., Salkever, D.S., Frey, W.D., Riley, J., and Marrow, J. (2021). Enrollment in the Supported Employment Demonstration: An employment intervention for denied disability benefits applicants with a mental impairment. Administration and Policy in Mental Health and Mental Health Services Research, 49, 909-926.
Wright, D., Livermore, G., Hoffman, D., Grau, E., and Bardos, M. (2012). 2010 National Beneficiary Survey: Methodology and descriptive statistics. Available at: https://www.ssa.gov/disabilityresearch/documents/NBS%20stats%20methods%20508.pdf