OF CUSTOMER SATISFACTION SURVEYS OMB No. 0960-0526
TITLE OF INFORMATION COLLECTION: Social Security Statement Evaluation Survey
SSA SUB-NUMBER
DESCRIPTION OF ACTIVITY:
BACKGROUND
Since the mid-1990s, the Social Security Administration (SSA) has commissioned surveys to determine the public’s knowledge of Social Security; the public’s preference for receiving the Social Security Statement (paper vs. online); and satisfaction with the content of the information and how the agency delivers that information. The Social Security Statement Evaluation Survey will allow SSA to obtain constructive feedback from the public on their satisfaction with their online my Social Security accounts, including suggestions on how SSA can improve these accounts. SSA will use data from this survey to increase the effectiveness of its outreach efforts, specifically the use and effectiveness of my Social Security accounts.
SURVEY
Description of Survey
The Social Security Statement Evaluation Survey asks the public about their satisfaction with the information they received from SSA about their retirement benefits, and with the way SSA communicates with them about their future benefits. The survey will allow SSA to determine what the public knows about Social Security; how the public uses the Social Security Statement; how the public prefers to receive the Statement; and how the public feels about proposed future online features.
The Social Security Statement Evaluation Survey questionnaire includes the following questions:
Questions 1 – 7 ask whether respondents are currently receiving Social Security retirement benefits and, if so, whether they are satisfied with the information they received from SSA about those benefits. We also ask respondents how satisfied they are with the information they received from SSA about their future benefits.
Questions 8 – 13 ask about respondents’ knowledge of various aspects of Social Security’s programs and benefits: the number of years of work needed for benefit eligibility; whether Social Security pays benefits to workers who are disabled; whether benefits are adjusted for cost of living; etc.
Questions 14 – 18 ask about respondents’ familiarity with the Social Security Statement; whether they have ever looked at the Statement, either on paper or online; and what information they remember being on it.
Questions 19 – 22 ask about respondents’ comfort accessing information about their Social Security benefits online; awareness of the online Social Security Statement; and creation of an online my Social Security account.
Questions 23 – 28 ask respondents about their satisfaction with their my Social Security account; what they use their account for; and how often they access their online account.
Questions 29 – 32 ask respondents their preference for receiving the Social Security Statement in the future (paper or online); their reasons for viewing the Statement; whether they received an email from SSA that prompted them to view their online Statement; and what actions they took after viewing their Statement (such as filing it with important documents, changing financial plans, planning for retirement, etc.).
Questions 33 – 41 ask respondents for their preferences among features to be included in the Social Security Statement in the future. These features include interactive graphs of estimated benefits under different assumptions about future earnings; estimated spousal benefits; an illustration of the four claiming and working options; and information on non-taxed earnings.
Statistical Information
Sample Selection
For this study, the geodemographic dimensions used for weighting the entire KP will include:
Gender (Male and Female)
Age (25–34, 35–44, 45–54, 55–64, and 65+)
Race/Ethnicity (Hispanic and non-Hispanic White, African American, Asian, and Other)
Education (Less than High School, High School, Some College, Bachelor’s Degree and Beyond)
Census Region (Northeast, Midwest, South, West)
State
Household Income ($0–$10K, $10K–$25K, $25K–$50K, $50K–$75K, $75K–$100K, $100K+)
Home ownership status (Own and Rent/Other)
Metropolitan Area (Yes and No)
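For illustration, weighting a panel to marginal targets along dimensions like those above is commonly done by raking (iterative proportional fitting). The sketch below is a minimal example; the dimensions, categories, and target shares are hypothetical placeholders, not the actual benchmarks used to weight the panel.

```python
# Minimal raking (iterative proportional fitting) sketch.
# All categories and target shares below are illustrative placeholders.

def rake(rows, targets, iters=50):
    """Return one weight per row so weighted category shares match targets.

    rows:    list of dicts, e.g. {"gender": "F", "age": "25-44"}
    targets: {dimension: {category: population share}}
    """
    w = [1.0] * len(rows)
    for _ in range(iters):
        for dim, shares in targets.items():
            # Current weighted total for each category of this dimension.
            totals = {}
            for r, wi in zip(rows, w):
                totals[r[dim]] = totals.get(r[dim], 0.0) + wi
            total_w = sum(w)
            # Scale each row so this dimension's margins hit the targets.
            for i, r in enumerate(rows):
                w[i] *= shares[r[dim]] / (totals[r[dim]] / total_w)
    return w

rows = [
    {"gender": "M", "age": "25-44"}, {"gender": "M", "age": "45+"},
    {"gender": "F", "age": "25-44"}, {"gender": "F", "age": "45+"},
]
targets = {"gender": {"M": 0.48, "F": 0.52}, "age": {"25-44": 0.40, "45+": 0.60}}
weights = rake(rows, targets)
```

Raking adjusts the weights one dimension at a time until the weighted sample matches each set of marginal targets; it matches margins only, not the joint distribution across dimensions.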
Methodology
The contractor will recruit a nationally representative sample of adults 25 years of age and older from an in-house online panel to achieve a minimum of 1,400 completed surveys. The contractor will select a maximum sample of approximately 2,000 panel members to achieve a minimum survey cooperation rate of 70 percent. The resulting sample will be a probability-based, nationally representative set of adults 25 years of age and older, including coverage of non-Internet households. The composition of the survey sample will be representative of the U.S. 25+ population in terms of sex, age, race, ethnicity, income, and educational attainment.
Survey responses are confidential, and we never reveal identifying information without panelists’ approval. All personally identifying records are securely stored in our Azure Government Cloud infrastructure. All electronic survey data records are stored in a secured database that does not contain personally identifying information; only an incremented ID number identifies the survey response data. The personally identifying information is stored in a separate database that is accessible only to persons with a need to know. We retain the survey response data in a secure database after the completion of a project. We retain the data for operational research, such as studies of response rates, and for the security of our customers, who might later request additional analyses, statistical adjustments, or statistical surveys that would require re-surveying research subjects as part of validation or longitudinal studies.
Response Rate
The SSA-approved contractor will take the following steps to maximize the response rate for this survey on their in-house panel:
The contractor sends its in-house panel members a notification email letting them know a new survey is available for them to complete. This notification contains a link that takes them to the survey.
To assist panel members with their survey taking, the contractor provides each panel member with a personalized member portal listing all assigned surveys that have yet to be completed.
After three days, automatic email reminders are sent to all non-responding panel members in the sample.
To improve the cooperation rate for the study, we will send an additional email reminder to all non-responding panel members in the sample on the seventh day of the field period.
The contractor sends Spanish-language surveys to sampled individuals whose panel profile records indicate that Spanish is their preferred language; and,
The contractor provides panel members with a toll-free 800 number and an email address so participants can call or write if they have any questions.
The SSA-approved contractor regularly achieves cooperation rates of 60 percent or higher using their standard study methodology for their in-house panel. They anticipate an equally good response rate in the upcoming survey. The SSA-approved contractor has a known demographic profile of non-responders and may conduct a non-responder analysis to identify any significant differences in the responder and non-responder populations and their potential impact on the survey results.
As a member of the American Association for Public Opinion Research (AAPOR), the contractor follows the AAPOR standards for survey response rate reporting. The in-house panel is a probability-based panel and, by definition, all panel members have a known probability of selection. Therefore, it is mathematically possible to calculate a proper response rate that takes into account all sources of nonresponse. The in-house panel is composed of individuals recruited at different periods, who must complete profile surveys to become panel members, and who commit to answering multiple surveys for a certain period. As a result, we calculate respondent-level cohort recruitment, profile, and retention rates for each study respondent and average them across all study respondents to yield the study-specific response rates.
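The cumulative calculation described above can be sketched as the product of stage-level rates. The four component rates in this example are hypothetical placeholders, not the contractor’s actual figures.

```python
# Study-specific response rate for a probability-based panel, computed as
# the product of stage-level rates (the cumulative AAPOR-style approach
# the text describes). All four rates below are hypothetical examples.

def cumulative_response_rate(recruitment, profile, retention, completion):
    """Multiply per-stage rates: panel recruitment, profile-survey
    completion, panel retention, and study completion."""
    return recruitment * profile * retention * completion

rate = cumulative_response_rate(0.12, 0.60, 0.35, 0.70)
```

Each stage's nonresponse compounds, which is why cumulative panel response rates are much lower than the within-study cooperation rate quoted elsewhere in this document.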
Sampling Variability
The key variables for this survey are: (1) respondents’ satisfaction with the information they receive from SSA about their retirement benefits (Q3 and Q6); (2) their preferences for receiving the Social Security Statement in the future [paper or online] (Q4); (3) their satisfaction with their my Social Security account (Q28); and (4) their preferences among features to be included in the Social Security Statement in the future (Q36-Q41). Other variables of interest are respondents’ knowledge of various aspects of Social Security’s programs and benefits (Q8-Q13), respondents’ familiarity with the Social Security Statement (Q14-Q18), and respondents’ comfort accessing information about their Social Security benefits online (Q19-Q22).
The survey will ask all respondents about their interactions with, and satisfaction with, SSA services. With the current sample, and assuming all respondents answer at least one of the satisfaction questions, the estimated margin of error would be +/- 3 percentage points. Assuming a 70-percent response rate, the proposed sample size is large enough to provide a sampling variability at the 95-percent confidence level of +/- 3 percentage points.
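As a sketch of the sampling-variability claim: for a proportion near 50 percent with 1,400 completes, the simple-random-sampling margin of error at 95-percent confidence is about 2.6 points, and a modest design effect from weighting (assumed here as 1.3; the survey’s actual design effect is not stated) brings it to roughly the quoted 3 points.

```python
import math

# Margin of error for an estimated proportion at 95% confidence.
# deff (design effect) = 1.3 is an assumption for illustration; it
# inflates the variance to account for unequal survey weights.

def margin_of_error(n, p=0.5, z=1.96, deff=1.0):
    return z * math.sqrt(deff * p * (1 - p) / n)

srs_moe = margin_of_error(1400)                 # simple random sampling
weighted_moe = margin_of_error(1400, deff=1.3)  # with assumed design effect
```

Using p = 0.5 gives the widest (most conservative) interval, which is the standard convention for reporting a single margin of error for a whole survey.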
A critical piece of the quality control program is survey pretesting. The pretest will evaluate the survey instrument as well as the data collection and respondent selection procedures. Following OMB approval, the contractor will conduct a pretest of 25 interviews over a three- to seven-day field period using the same survey instrument prepared for the main study. We will use the pretest to verify that the survey is functioning correctly and that respondents understand the question wording and response categories, and to estimate the median survey length. The survey pretest will be conducted online using KP. We will examine quit rates at each question, open-ended responses, and other response patterns to determine whether we should revise the survey.
SSA’s Office of Communications and SSA’s contractor Ipsos are responsible for sampling and data analysis.
IF FOCUS GROUP MEMBERS WILL RECEIVE A PAYMENT, INDICATE AMOUNT
We will not compensate participants for this survey.
USE OF SURVEY RESULTS
SSA seeks assessments and recommendations concerning ways to improve uptake and use of its online my Social Security accounts. One of the goals of this research is to make certain we add the features the public is most interested in seeing to their my Social Security accounts.
BURDEN HOUR COMPUTATION (Number of responses (X) estimated response time (/60) = annual burden hours)
Number of responses: 1,400
Estimated response time: 30 minutes
Annual Burden Hours: 700 hours
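The burden-hour formula stated above works out as follows:

```python
# Burden-hour computation as stated on the form:
# number of responses × (estimated response time in minutes ÷ 60).
responses = 1400
minutes_per_response = 30
annual_burden_hours = responses * (minutes_per_response / 60)
```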
NAME OF CONTACT PERSON: Dellareese Morton-Smith
MAJOR OFFICE, OFFICE, DIVISION, BRANCH: Office of Communication, Office of Strategic and Digital Communications
File Created | 2021-01-16 |