A. Justification
Introduction/Authorizing Laws and Regulations
Since 1980, Congress has required the Social Security Administration (SSA) to conduct demonstration and research projects to test the effectiveness of possible program changes that could encourage individuals to work and decrease their dependence on disability benefits. In fostering work efforts, SSA intends for this research, and the program changes it evaluates, to produce Federal program savings and improve program administration. Section 1110 of the Social Security Act provides the Commissioner of Social Security the authority to conduct broad, cross-programmatic projects for the Social Security Disability Insurance (SSDI) and Supplemental Security Income (SSI) programs. Under Section 1110, SSA funds a wide range of research projects, including projects to maintain and improve basic data about SSA programs and beneficiaries.
SSA provides income assistance to more than 13 million working-age adults and children with disabilities through the SSDI and SSI programs. To evaluate applicants' experiences as they navigate SSA's application process, we are implementing the New Applicant Survey (NAS). The objective of the NAS is to provide SSA's Office of Research, Demonstration, and Employment Support (ORDES) with information about recent applicants' experiences at different stages, or touchpoints, in the disability application process. SSA will use findings from the survey to inform testable policy interventions to improve the application experience for applicants.
The primary goal of the NAS is to help SSA improve our current application process using feedback from the members of the public who use it. The research questions and survey will allow SSA to evaluate current practices and improve upon them. Ultimately, we expect this survey to help SSA implement a better overall application experience for respondents as they use SSA's systems.
To provide information to SSA regarding applicants’ experiences at the different touchpoints in the disability application process, SSA’s evaluation will include the following analysis components:
Comparison of Characteristics: Comparing characteristics of non-respondents (or the total sample) to those of respondents, using information available for both groups. For example, we will use variables available from the SSA Structured Data Repository for all applicants, such as impairment, age, work history, and education, to compare respondents and non-respondents.
Modeling: Modeling response propensity using multivariate analyses. We will apply logistic regression models and Chi-square Automatic Interaction Detector (CHAID) analysis to identify the significant predictors of nonresponse, using variables available from the sampling frame as independent variables in the models. This analysis will inform the specification of the nonresponse weighting classes used to adjust the sampling weights. We will use CHAID to develop the weighting cells because the CHAID algorithm provides an effective and efficient way of partitioning the frame on the significant predictors of applicant nonresponse.
Evaluation of Differences: Evaluating differences between unadjusted (i.e., base-weighted) estimates of selected sampling frame characteristics, based on the survey respondents, and the corresponding population (frame) parameters. In the absence of nonresponse, the unadjusted weighted estimates are unbiased estimates of the corresponding population parameters. This analysis provides an alternative way of assessing how nonresponse may have affected the distribution of the respondent sample and thus potentially biased the sample-based estimates.
Comparison of Estimates: Comparing weighted survey estimates (e.g., selected error rates by type) using unadjusted (base) weights versus nonresponse-adjusted weights. This analysis will be conducted after the final nonresponse-adjusted weights are developed and will provide a measure of how well the weight adjustments have compensated for differential nonresponse. A sketch of this weighting workflow appears below.
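To make the four analysis components concrete, the following is a minimal sketch in Python. All file, column, and variable names (e.g., `nas_sample_frame.csv`, `responded`, `base_weight`, `age_group`) are hypothetical, and because CHAID is not part of the standard Python statistical stack, a shallow classification tree stands in for the CHAID step; this is an illustration of the approach under those assumptions, not the contractor's implementation.

```python
import pandas as pd
import statsmodels.api as sm
from sklearn.tree import DecisionTreeClassifier

# Hypothetical frame extract: one row per sampled case, with administrative
# variables available for respondents and non-respondents alike.
frame = pd.read_csv("nas_sample_frame.csv")
predictors = ["impairment_group", "age_group", "work_history", "education"]

# 1) Comparison of characteristics: respondents vs. non-respondents.
for var in predictors:
    print(pd.crosstab(frame[var], frame["responded"], normalize="columns"))

# 2) Modeling: logistic regression of response propensity on frame variables.
X = sm.add_constant(pd.get_dummies(frame[predictors], drop_first=True)
                      .astype(float))
print(sm.Logit(frame["responded"], X).fit().summary())

# CHAID stand-in: a shallow tree partitions the frame into weighting cells.
tree = DecisionTreeClassifier(max_depth=3, min_samples_leaf=200)
tree.fit(X.drop(columns="const"), frame["responded"])
frame["cell"] = tree.apply(X.drop(columns="const"))

# Nonresponse adjustment: divide respondents' base weights by the weighted
# response rate of their cell; non-respondents receive zero weight.
cell_rr = (frame.groupby("cell")
                .apply(lambda g: (g["base_weight"] * g["responded"]).sum()
                                 / g["base_weight"].sum()))
frame["nr_weight"] = frame["base_weight"] / frame["cell"].map(cell_rr)
frame.loc[frame["responded"] == 0, "nr_weight"] = 0.0

# 3) and 4) Compare base-weighted and adjusted estimates of a frame
# characteristic (here, the share aged 50+) with the frame parameter.
flag = frame["age_group"] == "50+"
pop_share = (frame["base_weight"] * flag).sum() / frame["base_weight"].sum()
resp = frame[frame["responded"] == 1]
for w in ["base_weight", "nr_weight"]:
    est = (resp[w] * (resp["age_group"] == "50+")).sum() / resp[w].sum()
    print(f"{w}: {est:.3f} (frame parameter: {pop_share:.3f})")
```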
Through these methods, we will analyze the data we receive to evaluate our current application process and determine methods to improve it.
Description of Data Collection
The primary purpose of the NAS is to evaluate our current application process to determine what works best for new applicants. The NAS will provide SSA with information regarding our current application touchpoints and how we can improve them, based on customer feedback.
The NAS will help SSA answer the following research questions:
What are the pre- and post-application employment experiences of awarded and denied SSDI and SSI applicants?
What employment-, vocational-, medical-, or income-related services and supports did applicants use leading up to and since application?
What sources of information about SSDI or SSI did the applicant use or have access to?
What were the applicants’ experiences with representation during the application or post-application periods?
The responses to these research questions will inform the types of data collection activities SSA has used and plans to use in the future, and will ultimately help SSA implement a better overall application experience for respondents as they use SSA's systems.
Survey Instrument
The survey asks questions that focus on the applicant’s experience with different aspects of the application process. This information is not available in SSA program records. The survey will collect data from 10,000 new applicants at different touchpoints in the application process to understand applicant experiences at each stage and obtain the information needed to address the research questions.
To accommodate respondent preferences, we will create three modalities of the survey instrument: Internet-based, telephone, and paper. The Internet and telephone versions will have essentially the same design, as these modalities use dynamic pathing, which facilitates the automatic skipping of questions based on the respondents' earlier responses. We will include instructions and formatting on the paper instrument that will also allow respondents to skip questions based on previous responses; however, paper respondents will be able to see all of the questions (which is not the case on the Internet and telephone versions).
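As a toy illustration of the dynamic pathing described above (the question IDs and routing here are hypothetical; the actual instrument's routing is defined in Attachment A-1), a route table can map each question and answer to the next applicable question:

```python
# Hypothetical routing: an applicant who reports no Appointed Representative
# skips the representative-experience items entirely, mirroring the paper
# instrument's "skip to" instructions.
ROUTES = {
    ("Q10_has_representative", "yes"): "Q11_representative_experience",
    ("Q10_has_representative", "no"):  "Q15_decision_received",
}

def next_question(current: str, answer: str, default: str = "END") -> str:
    """Return the next question ID given the current question and answer."""
    return ROUTES.get((current, answer), default)
```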
The key domains for the survey (Attachment A-1) will include:
The touchpoints in the application process completed and applicants’ experience with the most recent touchpoints,
Use of Appointed Representatives,
Actions taken when applicants receive an unfavorable SSA decision,
Use of and experience with SSA services,
Recommendations for improvements to the application process,
Personal financial environment including use of assistance programs,
Personal support system, and
Demographic information.
SSA will use the information we collect from this survey to understand applicants’ experiences at different stages in the application process, as well as the types of SSA services applicants accessed, and to evaluate changes to the application process that could potentially improve applicants’ experience.
Sample Selection, Recruitment and Consent Procedures
Sample Selection
SSA will conduct this survey with 10,000 respondents nationally. SSA will provide a list of recent applicants to the contractor to use for sample selection. The target population is adults who have applied for Social Security disability benefits. To ensure that sampled applicants have recent experiences with the application process, we will restrict the target population to those who have applied, appealed, or received a determination in the six months prior to sampling. The sample will include individuals to whom SSA awarded benefits (beneficiaries), individuals to whom SSA denied benefits, and applicants who remain at different stages of the application process. Prior to sampling, we will exclude the following types of applicants:
Applicants who previously received SSI or SSDI disability benefits as children or adults and had their benefits terminated
Applicants residing in foreign countries
Applicants younger than age 18 or older than 65
Deceased applicants
Applicants flagged as Compassionate Allowances or Quick Disability Determination (QDD)
We will draw a main sample of 40,000 applicants and a reserve sample of 20,000 applicants (to be released if needed). We will stratify the sample using the following characteristics drawn from SSA administrative data:
Application stage (based on most recent application/appeal): (1) initial application, (2) reconsideration appeal, or (3) appeal to hearing level
Allowance (based on most recent determination): (1) allowed or (2) denied or pending decision
Consultative Examination (CE) requested: (1) requested a CE during the determination process or (2) did not request a CE during the determination process
Appointed Representative (AR) appointment: (1) appointed an AR or (2) did not appoint an AR.
We will also draw geographic information about applicants from SSA administrative data (residence in an urban versus rural area and SSA region based on regional office assigned to the applicant’s home state). We will examine the geographic distribution of eligible applicants in the sample frame to determine if we need to stratify on geographic location in addition to the characteristics listed above.
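The following Python sketch illustrates the exclusions and stratified draw described above. The column names are hypothetical, and proportional allocation across strata is assumed purely for illustration; the actual allocation will follow the contractor's sampling plan.

```python
import pandas as pd

frame = pd.read_csv("applicant_frame.csv")  # hypothetical frame extract

# Exclusions applied prior to sampling (flag names are illustrative).
eligible = frame[
    ~frame["prior_disability_benefits_terminated"]
    & ~frame["foreign_address"]
    & frame["age"].between(18, 65)
    & ~frame["deceased"]
    & ~frame["compassionate_allowance_or_qdd"]
]

# Stratifiers drawn from administrative data.
strata = ["app_stage", "allowance", "ce_requested", "ar_appointed"]

def draw(group: pd.DataFrame, total: int, pop_size: int) -> pd.DataFrame:
    # Proportional allocation (an assumption): each stratum's sample size
    # reflects its share of the eligible population.
    n = round(total * len(group) / pop_size)
    return group.sample(n=min(n, len(group)), random_state=2024)

pop_size = len(eligible)
main = (eligible.groupby(strata, group_keys=False)
                .apply(draw, total=40_000, pop_size=pop_size))
remaining = eligible.drop(main.index)
reserve = (remaining.groupby(strata, group_keys=False)
                    .apply(draw, total=20_000, pop_size=len(remaining)))
```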
Recruitment
For this survey, we will use current best practices for recruitment, designed to maximize response rates while ensuring that the government can successfully conduct the survey with available funding. To maximize response rates, the survey will implement a multi-pronged experimental design (detailed below) to determine the contact and incentive protocol that yields the highest participation. We will engage Appointed Representatives before reaching out to the sampled applicants. We will assign sampled applicants in the initial wave to either the concurrent group (which introduces the Internet and paper modalities at the same time) or the sequential group (which initially introduces the Internet modality only). Within each group, we will also assign applicants to one of two experimental incentive groups: early bird incentive and no early bird incentive. We will also test whether providing a second $2 cash incentive in a subsequent mailing improves participation. Table 2 provides an overview of the design for each experiment group. Subsequent sample releases will use the design that yields the best response rate. The eight-step contact strategy for the core sample (sequential design) follows.
SSA Announcement. SSA will announce the survey on its website and will disseminate survey information to SSA regional field offices, teleservice centers, and line staff.
Email to Appointed Representatives from Appointed Representative Associations. Some applicants use Appointed Representatives to help navigate the application process. We will draft an email (Attachment A-2) and will request that Appointed Representative professional associations send it to all their member Appointed Representatives. The email will alert Appointed Representatives that SSA is conducting the survey and that we may select some of their clients for the survey sample. It will ask Appointed Representatives to encourage their clients to participate in the survey if SSA selects them and will include a link to the survey website.
Email to Appointed Representatives (Directly). We expect that a significant number of recent applicants will have Appointed Representatives. We will send an email (Attachment A-3) directly to Appointed Representatives who represent applicants selected for the survey a few days prior to making the first contact with the applicant. The email will alert the Appointed Representative that SSA selected their client for the survey and will ask the Appointed Representative to encourage the client’s participation in the survey. It will also include a link to enable Appointed Representatives to access the survey website that will provide information on the survey and answer key questions.
First USPS Mailing to All Applicants. A few days after the email to Appointed Representatives, we will send a mailing to all applicants selected for the survey sample.
The applicant mailing will include:
A visible cash pre-pay incentive of $2.
A letter introducing the survey (Attachment A-4). The letter will highlight that participation in the survey is voluntary; explain that SSA will keep applicants’ information confidential; and describe the incentive applicants will receive for completing the survey. The letter will also include a QR code to the survey website, a link to the survey website, a personal password to access the Internet version of the survey instrument, and a toll-free telephone number and email address where applicants can direct questions about the survey. An SSA official will sign the letter and we will also include the SSA logo.
An Information Sheet (Attachment A-5) that will summarize key information on the survey, the promised incentives, and how to provide consent and access the survey.
Second USPS Mailing to All Applicants. A week after the first mailing, we will mail a fold-over postcard to all applicants selected for the survey sample (Attachment A-6). The postcard will include all the key pieces of information in the initial letter, such as the link and QR code to the survey website and the personal password to access the Internet version of the survey instrument. In addition, we will send a text message reminder to applicants for whom SSA does not have an email address or an email reminder to applicants for whom SSA has an email address.
Third USPS Mailing to Nonrespondents. We will send a third letter to non-respondents a week after the second mailing (Attachment A-7). The letter content will be similar to the first letter. We will send a text message reminder to applicants for whom SSA does not have an email address or an email reminder to applicants for whom SSA has an email address.
Fourth USPS Mailing to Nonrespondents. We will send a fourth mailing to non-respondents about two weeks after the third mailing. This mailing will include a letter (Attachment A-8) similar to the previous mailings, a paper version of the Internet survey instrument, and a postage-paid return envelope. Our design continues to encourage Internet response, as the electronic version of the instrument is more cost effective and less burdensome than the paper version. About a week prior to this mailing, if we have a phone number, we will send a text message telling non-respondents to be on the lookout for the mailing with the paper questionnaire; we will send a similar email to non-respondents for whom SSA has an email address.
Fifth Contact Introducing the Telephone Modality. Two weeks after the fourth mailing, we will introduce the telephone modality. We will call non-respondents up to three times. We will also send non-respondents for whom SSA has an email address an email message (Attachment A-9) alerting them that they can contact the Survey Help Desk to complete the survey via a telephone interview. For non-respondents for whom we have a phone number, we will send a text message alerting them that they can contact the Survey Help Desk to complete the survey via telephone.
Response Rate Experiments
Most of SSA’s prior major surveys focused on beneficiaries, and those surveys have experienced declining response rates. The NAS is the first nationally representative survey of the applicant population, which may face different and more challenging barriers to participation and survey completion. Anticipating that response rates could be lower than the 25 percent assumed in the burden table, we plan to test several strategies to improve response rates via the experiments described in the paragraphs that follow. Table 1 summarizes the experiments and the causal tests embedded within them.
Table 1. Summary of Causal Experiments
The following provides information on our three response rate experiments: offering the opportunity to complete the survey by paper concurrently with the Internet version; sending a second cash incentive; and providing an “early bird” incentive to complete the survey by a specific date.
Concurrent vs. sequential invitation to complete survey by Internet and paper. Completing the survey by Internet significantly reduces the time burden on respondents. Research indicates that offering sampled cases the opportunity to take a survey only by Internet in initial contact attempts helps increase the proportion of electronic responses (Bucks et al., 2020; Bretschi et al., 2023). In this experiment, we propose to test whether offering a randomly selected half of the sampled applicants in the first sample release wave the opportunity to complete the Internet version of the survey in the initial invitation mailing and the paper version in the fourth mailing (the “Sequential” group in Table 2) yields a higher completion rate than offering both the Internet and paper versions in the initial invitation mailing (the “Concurrent” group).
Second cash incentive. Including cash incentives in mailings to sampled cases is an effective means of gaining potential survey respondents’ attention and results in higher response rates (Debell et al., 2020; Zhang et al., 2023). Research indicates that mailing a second incentive to sampled individuals who do not respond to initial contact attempts increases response, helping reduce non-response bias (Zhang et al., 2023). In this experiment, we propose to offer a second cash incentive in the third mailing to a randomly selected 60 percent of sampled applicants within each of the four groups of the 2×2 experimental design who do not respond to the survey after the initial letter and postcard (see row 3 of Table 2). The other 40 percent of sampled applicants within each group will not receive the second cash incentive in the third mailing.
Early bird incentive. Recent studies suggest that offering higher “early bird” incentives to complete surveys by a particular date can decrease survey non-response (Friedel et al., 2023; McGonagle et al., 2022; Peycheva et al., 2023). For this experiment, we propose to offer half of the sampled applicants within the concurrent sample and half within the sequential sample a higher incentive for completing the survey before the date of the fourth contact. As indicated in Table 2, the initial mailing sent to those in the randomly selected “early bird” incentive samples will include a calendar showing them that they will receive a higher incentive if they complete the survey before the fourth contact attempt (approximately four weeks after the initial mailing). The postcard sent a week after the initial mailing will also remind sampled applicants that they have about three more weeks to receive the higher incentive for completing the survey.
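A minimal sketch of the randomization behind these experiments follows, assuming a 20,000-case first release wave split evenly across the 2×2 cells; the wave size, ID format, and seed values are illustrative, not the contractor's specification.

```python
import numpy as np
import pandas as pd

# Hypothetical first release wave: 20,000 sampled applicants.
sample = pd.DataFrame({"applicant_id": [f"NAS-{i:06d}" for i in range(20_000)]})

# 2x2 design: mode-offer group (sequential vs. concurrent) crossed with
# early bird incentive group; exactly 5,000 cases per cell.
arms = pd.DataFrame(
    [(m, e) for m in ("sequential", "concurrent") for e in (False, True)],
    columns=["mode_offer", "early_bird"])
assignment = (arms.loc[np.repeat(arms.index, len(sample) // len(arms))]
                  .sample(frac=1, random_state=42)   # shuffle cell order
                  .reset_index(drop=True))
sample = pd.concat([sample, assignment], axis=1)

# Within each cell, flag 60% of cases to receive the second $2 cash
# incentive at the third mailing if they have not yet responded.
sample["second_incentive"] = False
for _, grp in sample.groupby(["mode_offer", "early_bird"]):
    chosen = grp.sample(frac=0.60, random_state=7).index
    sample.loc[chosen, "second_incentive"] = True

print(sample.groupby(["mode_offer", "early_bird"]).size())  # 4 cells of 5,000
```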
Table 2 provides additional detail on the proposed design of the three experiments to investigate methods to enhance response rates for the NAS. All mailings to applicants selected for the survey sample will include a URL where they can log on and complete the survey with a personalized PIN, as well as a QR code that, when scanned, directs them to the survey. We will implement these experiments beginning with the first sample release. As soon as we determine the design combination that yields the best results, we will implement that design in subsequent sample releases.
Table 2. Design of the Experiments
| Contact stage | Sequential (50%, n=10,000), Group A: no early bird incentive (25%, n=5,000) | Sequential, Group B: early bird (25%, n=5,000) | Concurrent (50%, n=10,000), Group C: no early bird incentive (25%, n=5,000) | Concurrent, Group D: early bird (25%, n=5,000) |
|---|---|---|---|---|
| First mailing | Insert describes $40 incentive for completing survey by Internet | Insert includes calendar showing $40 incentive for completing survey by Internet before fourth contact and $30 incentive for completing by Internet after fourth contact | Insert describes $40 incentive for completing survey by Internet and $30 for completing and returning included paper version of survey instrument | Insert includes calendar showing $40 incentive for completing survey by Internet or for completing and returning included paper version of survey instrument before fourth contact and $30 for completing by Internet or paper after fourth contact |
| Second mailing (seven days after initial mailing) | Reminder of $40 incentive for completing survey by Internet | Reminder of $40 incentive for completing survey by Internet before fourth contact and $30 for completing after fourth contact | Reminder of $40 incentive for completing the survey by Internet and $30 for completing by paper | Reminder of $40 incentive for completing survey by Internet or by paper before fourth contact and $30 for completing by Internet or paper after fourth contact |
| Third mailing (seven days after second mailing) | Reminder of $40 incentive for completing survey by Internet; 60% (n=2,550) receive second cash incentive of $2 in letter | Reminder includes calendar showing $40 incentive for completing survey by Internet before fourth contact attempt and $30 incentive for completing by Internet after fourth contact; 60% (n=2,550) receive second cash incentive of $2 in letter | Reminder of $40 incentive for completing the survey by Internet and $30 for paper completion; 60% (n=2,550) receive second cash incentive of $2 in letter | Reminder includes calendar showing $40 incentive for completing by Internet or paper before fourth contact and $30 for completing by Internet or paper after fourth contact; 60% (n=2,550) receive second cash incentive of $2 in letter |
| Fourth mailing (fourteen days after third mailing) | Insert describes $40 incentive for completing survey by Internet and $30 incentive for completing and returning mailed paper survey | Insert describes $30 incentive for completing survey by Internet or for completing and returning mailed paper survey | Insert describes $40 incentive for completing the survey by Internet and $30 incentive for completing and returning mailed paper survey | Insert describes $30 incentive for completing survey by Internet or for completing and returning mailed paper survey |
| Fifth contact (fourteen days after fourth mailing; post-experiment) | Reminder of $40 incentive for completing survey by Internet and $30 incentive to complete and return mailed paper survey; offer of $30 for calling to schedule and complete interview by phone | Reminder of $30 incentive for completing survey by Internet or for completing and returning mailed paper survey; offer of $30 for calling to schedule and complete interview by phone | Reminder of $40 incentive for completing survey by Internet and $30 incentive to complete and return mailed paper survey; offer of $30 for calling to schedule and complete interview by phone | Reminder of $30 incentive for completing survey by Internet or for completing and returning mailed paper survey; offer of $30 for calling to schedule and complete interview by phone |
Consent Procedures
The survey website will display the Privacy Act Statement prior to the collection of any personally identifiable information (see Attachment A-10). Additionally, the start of the survey instrument will display informed consent language. On the Internet version of the instrument, respondents must click a radio button to provide consent before continuing to the survey; on the paper version, they must check a box. For the phone version of the instrument, the interviewer will read the informed consent language to the respondent, ask the respondent to provide verbal consent, and then click a radio button indicating that the respondent has done so. We will not proceed with the survey until the respondent provides consent.
We identified the following psychological costs based on the requirements for this information collection:
Requirement for the Program: The survey includes questions about applicants’ experience applying for disability benefits.
Psychological Cost: Some of the questions may cause stress, discomfort, or anxiety for some respondents, which may lead them to skip these questions or decline to participate in the survey. Instructions at the start of each section of the survey tell respondents why we are asking the questions. Respondents may skip any questions they do not want to answer, and participation in the survey is voluntary.
We understand these psychological costs may cause some applicants to delay their completion of the survey or cause them to abandon the survey entirely. We have taken this potential psychological cost into account when calculating burden associated with this survey.
The respondents are current SSA beneficiaries who have undergone the application process; individuals to whom SSA denied benefits; applicants for Social Security services in various stages of the application process; and their representatives (as applicable).
Use of Information Technology to Collect the Information
SSA’s contractor will develop a secure and cost-effective Internet-based Survey Management System (SMS) to support data collection for the NAS. The SMS will serve as a centralized mission control center, allowing survey staff to monitor and manage data collection activities. It will track all contacts with participants, record completion statuses, and track incentives. The SMS will not contain Social Security numbers (SSNs).
Our contractor will program a self-administered Internet instrument, allowing respondents to complete the survey themselves online. The contractor will employ telephone data collectors who will have access to the online survey and will use it to administer the survey over the phone at the applicant’s preferred date and time. We will format the paper version of the instrument to facilitate the scanning of completed surveys directly into the database. Respondents will return the paper survey by mail using a pre-paid return envelope. We will not offer the option to return paper surveys electronically.
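The contractor's actual SMS design is not specified here; the sketch below shows, under our own assumptions, the kind of case record such a system might maintain. It tracks contacts, completion status, and incentives by study ID only, consistent with the requirement that the SMS hold no SSNs.

```python
from dataclasses import dataclass, field
from datetime import date
from enum import Enum

class Status(Enum):
    NOT_STARTED = "not started"
    PARTIAL = "partial"
    COMPLETED_INTERNET = "completed: Internet"
    COMPLETED_PAPER = "completed: paper"
    COMPLETED_TELEPHONE = "completed: telephone"

@dataclass
class CaseRecord:
    # Study ID only: the SMS holds no SSNs; the data steward keeps the
    # ID crosswalk separately under restricted access.
    participant_id: str
    status: Status = Status.NOT_STARTED
    contacts: list[tuple[date, str]] = field(default_factory=list)
    incentives: list[tuple[date, str]] = field(default_factory=list)

    def log_contact(self, when: date, channel: str) -> None:
        self.contacts.append((when, channel))

    def log_incentive(self, when: date, description: str) -> None:
        self.incentives.append((when, description))

# Example: record the first mailing and its $2 pre-pay incentive.
case = CaseRecord("NAS-000123")
case.log_contact(date(2025, 3, 3), "USPS mailing 1")
case.log_incentive(date(2025, 3, 3), "$2 cash pre-pay")
```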
Our contractor will store all data on a secure project network directory with restricted access for designated staff. We will transfer data files to the contractor through the encrypted Government Services Online portal.
Why We Cannot Use Duplicate Information
The nature of the information we collect and the manner in which we collect it precludes duplication. We do not use another collection instrument to obtain similar data.
Minimizing Burden on Small Respondents
This collection does not significantly affect small businesses or other small entities.
Consequence of Not Collecting Information or Collecting it Less Frequently
The information we collect from this survey will provide us with the data needed to understand applicants’ experiences during all stages of the application process and, if needed, make improvements. The information collected is not available from other sources and will provide information that cannot be found in program records or other databases. We will reduce burden by providing applicants with multiple modalities to participate in the survey. Respondents will complete a single interview, so we cannot collect the information less frequently.
Special Circumstances
There are no special circumstances that would cause us to conduct this information collection in a manner inconsistent with 5 CFR 1320.5.
Solicitation of Public Comment and Other Consultations with the Public
We published the 60-day advance Federal Register Notice on August 19, 2024, at 89 FR 67141, and received no public comments. We published the 30-day FRN on October 22, 2024, at 89 FR 84431. If we receive any comments in response to this Notice, we will forward them to OMB. We did not consult with the public in the development of this form.
Payment or Gifts to Respondents
All applicants selected for the survey sample will first receive a $2 cash pre-pay incentive as part of the initial mailing. A subset of applicants will receive a second $2 cash pre-pay incentive in the third mailing. Applicants who complete the survey will receive an incentive, either via an electronic link (if they complete the survey electronically and select the electronic option) or via check. The post‑survey incentive will vary depending on the modality the respondent uses to complete the survey and, for a subset of applicants, how long after the initial mailing they participate. The incentive for completing the survey will be either $30 or $40.
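For illustration, a hypothetical helper consolidating the post-survey amounts in Table 2 (the function and its parameters are our own construction, not SSA's or the contractor's payment logic):

```python
def completion_incentive(modality: str, design: str, early_bird: bool,
                         before_fourth_contact: bool) -> int:
    """Post-survey incentive in dollars, following the Table 2 design.

    modality: "internet", "paper", or "telephone"
    design:   "sequential" or "concurrent"
    """
    # Telephone completions (offered post-experiment) earn $30 in all groups.
    if modality == "telephone":
        return 30
    if early_bird:
        # Early bird groups earn the higher amount only before the fourth
        # contact; in the concurrent design the paper option also qualifies.
        qualifies = modality == "internet" or design == "concurrent"
        return 40 if (qualifies and before_fourth_contact) else 30
    # Non-early-bird groups: $40 for Internet, $30 for paper, at any time.
    return 40 if modality == "internet" else 30
```

For example, `completion_incentive("internet", "sequential", True, False)` returns 30, matching the early bird groups' reduced amount after the fourth contact.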
Assurances of Confidentiality
We will assure respondents of the voluntary nature of the information collection via consent language that they acknowledge prior to starting the survey. Our contractor will maintain a secure environment for data collection, with survey materials stored in a restricted access project directory on the contractor’s network. We will transfer data files to the contractor through the encrypted Government Services Online portal. The contractor will assign a data steward to handle the sample file and create participant identifiers. In addition, the contractor will use special ID keys to link participant data without disclosing SSNs. The contractor will also store contact information and participant IDs in a password‑protected crosswalk spreadsheet on the SMS, accessible only to key project staff. The contractor will store additional documentation with identifying information on a password-protected network drive for select key staff. The contractor will store mailing materials with personal contact information within securely locked storage spaces at the contractor’s office. The contractor requires confidentiality pledges, human subjects protection training, and background screenings for its employees. After data collection, the contractor will provide SSA with restricted access data files and public use files, with appropriate measures to protect personally identifiable information. Upon survey completion, the contractor will delete data from its network and destroy backup tapes. The contractor will delete participant contact information and survey data from its servers at our direction.
SSA protects and holds confidential the information it collects in accordance with 42 U.S.C. 1306, 20 CFR 401 and 402, 5 U.S.C. 552 (Freedom of Information Act), 5 U.S.C. 552a (Privacy Act of 1974), and OMB Circular No. A-130.
Justification for Sensitive Questions
The information collection does not contain any questions of a sensitive nature.
Estimates of Annualized Burden Hours and Costs
| Modality of Completion | Number of Respondents | Frequency of Response | Average Burden per Response (minutes) | Estimated Total Annual Burden (hours) | Average Theoretical Hourly Cost Amount (dollars)* | Total Annual Opportunity Cost (dollars)** |
|---|---|---|---|---|---|---|
| Internet Survey (including informed consent and screener) | 5,000 | 1 | 35 | 2,917 | $7.25* | $21,148** |
| Paper Survey (including informed consent and screener) | 3,500 | 1 | 35 | 2,042 | $7.25* | $14,805** |
| Telephone Survey (including informed consent and screener) | 1,500 | 1 | 40 | 1,000 | $7.25* | $7,250** |
| Totals | 10,000 | | | 5,959 | | $43,203** |
* We base this figure on the Federal minimum wage of $7.25, as survey participants will have recently applied for SSA disability benefits and will typically not have started receiving benefits yet.
**This figure does not represent actual costs that SSA is imposing on applicants to complete this survey; rather, these are theoretical opportunity costs for the additional time respondents will spend to complete the survey. There is no actual charge to respondents to complete the survey.
Note: Our contractor will schedule an appointment to call the recipient at their preferred date and time; therefore, the respondents will not incur an average wait time.
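The table's figures follow directly from the row inputs; the short arithmetic check below reproduces them (half-up rounding to whole hours and dollars matches the table's values):

```python
# Burden-table arithmetic check.
rows = {  # modality: (number of respondents, minutes per response)
    "Internet":  (5_000, 35),
    "Paper":     (3_500, 35),
    "Telephone": (1_500, 40),
}
WAGE = 7.25  # Federal minimum wage, the theoretical hourly cost

def half_up(x: float) -> int:
    return int(x + 0.5)

total_hours = total_cost = 0
for modality, (n, minutes) in rows.items():
    hours = half_up(n * minutes / 60)   # e.g., 5,000 * 35 / 60 -> 2,917
    cost = half_up(hours * WAGE)        # e.g., 2,917 * $7.25 -> $21,148
    total_hours += hours
    total_cost += cost
    print(f"{modality}: {hours:,} hours, ${cost:,}")

print(f"Totals: {total_hours:,} hours, ${total_cost:,}")  # 5,959 / $43,203
```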
Learning Cost
The survey questions do not require the respondents to learn any new information about any SSA program. Instead, the questions ask about what the respondent has already experienced during the application process. Therefore, there are no learning costs associated with participating in this survey.
The total burden for this ICR is 5,959 burden hours, which results in an associated theoretical (not actual) opportunity cost financial burden of $43,203. SSA does not charge respondents to complete our applications.
Annual Cost to the Respondents (Other)
This collection does not impose a known cost burden on the respondents.
Annual Cost to Federal Government
The cost to the Federal government by contract year is shown below.
| Description of Cost Factor | Methodology for Estimating Cost | Contract Year 1 (8/15/22–8/14/23) Cost in Dollars | Contract Year 2 (8/15/23–8/14/24) Cost in Dollars | Contract Year 3 (8/15/24–8/14/25) Cost in Dollars | Contract Year 4 (8/15/25–8/14/26) Cost in Dollars | Contract Year 5 (8/15/26–8/14/27) Cost in Dollars | Total Cost in Dollars |
|---|---|---|---|---|---|---|---|
| Designing and Printing the Form | Design Cost + Printing Cost | $208,660 | $131,966 | $21,060 | $21,060 | $0* | $382,746 |
| Distributing, Shipping, and Material Costs for the Form | Distribution + Shipping + Material Cost | $0* | $0* | $282,880 | $282,880 | $0* | $565,760 |
| SSA Employee (e.g., field office, 800 number, DDS staff) Information Collection and Processing Time | Number of responses collected by SSA employees and processing time for each response | $0* | $0* | $0* | $0* | $0* | $0* |
| Full-Time Equivalent Costs | Out of pocket costs and other expenses | $0* | $0* | $0* | $0* | $0* | $0* |
| Systems Development, Updating, and Maintenance | Costs to develop NAS participant website pages and content | $0* | $0* | $120,010 | $0* | $0* | $120,010 |
| Quantifiable IT Costs | Additional IT costs | $0* | $0* | $0* | $0* | $0* | $0* |
| Other | Project management | $219,105 | $162,640 | $167,535 | $172,576 | $173,475 | $895,331 |
| Other | Sample Design, Testing, and Extraction | $664,922 | $109,139 | $153,423 | $7,275 | $0* | $934,759 |
| Other | OMB Clearance | $43,909 | $24,844 | $0* | $0* | $0* | $68,753 |
| Other | Survey Administration | $6,662 | $63,761 | $1,550,206 | $1,523,986 | $380,908 | $3,525,523 |
| Other | Quality Reviews and Documentation of the DRS Results | $0* | $0* | $0* | $171,984 | $181,791 | $353,775 |
| Other | Final Estimates and Contract Closeout | $22,261 | $21,855 | $21,910 | $63,473 | $165,247 | $294,746 |
| Other | SSA Information Security and General Privacy | $96,588 | $12,928 | $37,523 | $71,024 | $13,786 | $231,849 |
| Total | | $1,262,107 | $527,133 | $2,354,547 | $2,314,258 | $915,207 | $7,373,251 |
* We have inserted a $0 amount for cost factors that do not apply to this collection.
SSA is unable to break down the costs to the Federal government further than we already have. In addition, many of these costs are part of the overall contract with the contractor administering this information collection. Therefore, we have calculated these costs as accurately as possible based on the creation, implementation, and administration of this information collection.
Program Changes or Adjustments to the Information Collection Request
This is a new data collection that increases the public reporting burden. See the Estimates of Annualized Burden Hours and Costs section above for the burden figures.
Plans for Publication of Information Collection Results
SSA may publish reports for some of the data collected and release a data file for public access.
Displaying the OMB Approval Expiration Date
SSA is not requesting an exception to the requirement to display an expiration date. We will display the OMB number and expiration date on all public-facing materials used for the survey.
Exceptions to Certification Statement
SSA is not requesting an exception to the certification requirements at 5 CFR 1320.9 and related provisions at 5 CFR 1320.8(b)(3).
References
Bretschi, D., Schaurer, I., & Dillman, D. A. (2023). An experimental comparison of three strategies for converting mail respondents in a probability-based mixed-mode panel to internet respondents. Journal of Survey Statistics and Methodology, 11(1), 23-46.
Bucks, B., Couper, M. P., & Fulford, S. L. (2020). A mixed-mode and incentive experiment using administrative data. Journal of Survey Statistics and Methodology, 8(2), 352-369.
Debell, M., Maisel, N., Edwards, B., Amsbary, M., & Meldener, V. (2020). Improving survey response rates with visible money. Journal of Survey Statistics and Methodology, 8(5), 821-831.
Friedel, S., Felderer, B., Krieger, U., Cornesse, C., & Blom, A. G. (2023). The early bird catches the worm! Setting a deadline for online panel recruitment incentives. Social Science Computer Review, 41(2), 370-389.
McGonagle, K. A., Sastry, N., & Freedman, V. A. (2022). The effects of a targeted “early bird” incentive strategy on response rates, fieldwork effort, and costs in a national panel study. Journal of Survey Statistics and Methodology, smab042.
Peycheva, D., Calderwood, L., Wong, E., & Silverwood, R. (2023). Effects of a time-limited push-to-web incentive in a mixed-mode longitudinal study of young adults. Survey Research Methods, 17(2), 147-157.
Zhang, S., West, B. T., Wagner, J., Couper, M. P., Gatward, R., & Axinn, W. G. (2023). Visible cash, a second incentive, and priority mail? An experimental evaluation of mailing strategies for a screening questionnaire in a national push-to-web/mail survey. Journal of Survey Statistics and Methodology, smac041.