Social Security Statement Survey Supporting Statement B


OMB: 0960-0770


SUPPORTING STATEMENT B FOR

SOCIAL SECURITY STATEMENT SURVEY


OMB No. 0960-NEW


  B. Collections of Information Employing Statistical Methods


  1. Potential respondent universe; sampling/respondent selection methods


Each year, more than 140 million workers age 25 or older receive a four-page document from the Social Security Administration (SSA) that contains, among other information, the current record of the individual’s earnings and an estimated retirement benefit. This document, the Social Security Statement, is mailed two to three months prior to the recipient’s birthday month.


Our methodology calls for a nationwide random sample of 1,200 completed interviews, divided evenly between:


1) Statement recipients who have an earnings history with both covered and non-covered earnings under Social Security and


2) Statement recipients who have only earnings covered under Social Security.


The sample will be drawn from individuals who received their Statement in a single week during the previous 90 days. This methodology assumes that the age distribution of the approximately 2.7 million individuals in each weekly mailing cohort is the same from week to week.


TABULAR PRESENTATION OF METHODOLOGY


Strata | Estimated Population Size of One Week Mailing | Completed Sample Size | Estimated Response Rate*
Recipients who have an earnings history with both covered and non-covered earnings under Social Security | 1,100,000 | 600 | 80%
Recipients who have only earnings covered under Social Security | 1,600,000 | 600 | 80%
Total | 2,700,000 | 1,200 | 80%


(*This estimate is more properly considered a Participation Rate: Completed Interviews / (Numbers Dialed − Bad Numbers − Non-Contacts). The estimated response rate is based on previous studies we have conducted of various groups of SSA beneficiaries, though not with this exact population.)
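As a check on the footnote’s formula, a minimal sketch; the call counts below are hypothetical, chosen only to illustrate the arithmetic:

```python
def participation_rate(completed, dialed, bad_numbers, non_contacts):
    """Completed Interviews / (Numbers Dialed - Bad Numbers - Non-Contacts)."""
    return completed / (dialed - bad_numbers - non_contacts)

# Hypothetical call counts: 600 completes out of 1,000 numbers dialed,
# of which 150 were bad numbers and 100 were never contacted.
rate = participation_rate(600, 1000, 150, 100)  # 600 / 750 = 0.80
```

With these illustrative counts the formula reproduces the 80% rate shown in the table above.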


Interviewing quotas will be established within each stratum in order to over-represent, and obtain more robust estimates for, the age groups most likely to be interested in retirement planning. The age quota for each stratum will be as follows:


  • 100 respondents age 25-30;

  • 100 respondents age 31-45;

  • 200 respondents age 46-55; and

  • 200 respondents over age 55.


Prior to analysis, the completed interviews will be weighted by age group so that each age group’s contribution to the final sample matches its share of the population from which the sample was drawn.
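A minimal sketch of this post-stratification weighting. The age distribution of the mailing cohort below is hypothetical; only the 2.7 million cohort total and the age quotas come from the text:

```python
def age_weights(pop_counts, sample_counts):
    """Weight so that each age group's share of the weighted sample
    equals its share of the population it was drawn from."""
    pop_total = sum(pop_counts.values())
    samp_total = sum(sample_counts.values())
    return {g: (pop_counts[g] / pop_total) / (sample_counts[g] / samp_total)
            for g in sample_counts}

# Hypothetical age distribution of a 2.7 million weekly mailing cohort,
# alongside the fixed interview quotas stated above.
pop = {"25-30": 300_000, "31-45": 900_000, "46-55": 800_000, "56+": 700_000}
quota = {"25-30": 100, "31-45": 100, "46-55": 200, "56+": 200}
weights = age_weights(pop, quota)
```

Applying these weights, each group’s weighted contribution to the 600 completes equals its population share, which is the stated goal of the weighting step.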


  2. Collection of Information


The sample will be drawn randomly, within each stratum, from lists of individuals who received their annual Statement in a single week during the previous 90 days. A sample will be drawn randomly from the cases assigned to each stratum to guarantee at least a 5-to-1 ratio of telephone numbers to completed interviews within each age quota cell. At this time it is estimated that a 20-to-1 draw will be used to create the master sample. This large number of cases is needed because the list does not contain telephone numbers; telephone numbers will be found through an internet service for as many cases as possible. While not all numbers will be needed, it is more efficient to draw all cases at one time than to request a second draw. The age distribution of the entire master sample, regardless of whether a number was found, will be used as the basis of the weighting procedure applied prior to the analysis phase.


A 5-to-1 sample will be drawn randomly from the master sample for each age quota cell, limited to cases where a telephone number was found, and randomly placed into replicates of 50 for the two younger age groups and 100 for the two older age groups. Initially, three times the quota of cases will be fielded and thoroughly worked before additional numbers are released.
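The replicate assignment described above can be sketched as follows; the case identifiers and random seed are invented for illustration:

```python
import random

def make_replicates(case_ids, replicate_size, rng):
    """Randomly order the 5-to-1 sample for one age quota cell and split
    it into fixed-size replicates for phased release to the field."""
    ids = list(case_ids)
    rng.shuffle(ids)
    return [ids[i:i + replicate_size]
            for i in range(0, len(ids), replicate_size)]

rng = random.Random(0)
# Younger cells (quota 100): 5-to-1 sample of 500 numbers in replicates of 50.
young_reps = make_replicates(range(500), 50, rng)
# Older cells (quota 200): 5-to-1 sample of 1,000 numbers in replicates of 100.
old_reps = make_replicates(range(1000), 100, rng)
# Initial fielding is three times the quota, i.e. the first 6 replicates (300 cases).
initial_release = young_reps[:6]
```

Releasing the sample in fixed random replicates lets each cell be worked to exhaustion before more numbers are added, without disturbing the randomness of the draw.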


It is assumed that the completed interviews are a valid sample of the stratum/age quota cells from which they were drawn. At this point it is anticipated that the experiences of the two strata will be reported separately. Therefore, each stratum/age quota cell will be re-weighted so that it represents the proportion of that age quota cell within the stratum in the master sample. Should it become necessary to report information for the combined sample, the cases will be further weighted by stratum to reflect each stratum’s proportion of the total population from which the sample was drawn.


The sample size of each stratum will yield a maximum expected sampling error of approximately ±4 percentage points at the 95% confidence level.
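This figure follows from the standard margin-of-error formula for a simple random sample, evaluated at the most conservative proportion (p = 0.5); a quick check:

```python
import math

def max_sampling_error(n, p=0.5, z=1.96):
    """Half-width of a 95% confidence interval for a proportion,
    at its most conservative value p = 0.5."""
    return z * math.sqrt(p * (1 - p) / n)

# n = 600 completed interviews per stratum
err = max_sampling_error(600)  # ~0.040, i.e. about +/-4 percentage points
```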


No unusual problems requiring specialized sampling procedures are anticipated.


  3. Maximizing Response Rates


To maximize the response rate to these surveys and attempt to reach the response rate goals of OMB, the following procedures will be used:

  • Initial release of 3-to-1 ratio of sample to quota for completes in each stratum/age cell, in order to accommodate expected rates of non-location, non-eligibility and non-participation;

  • Subsequent adjustment of sample release by cell where additional cases may be needed based upon actual experience with location, eligibility and participation;

  • Up to 8 calls to locate an individual selected to be interviewed;

  • Appointments made when the initial contact comes at an inconvenient time to complete the survey; and

  • Refusal conversion attempts for initial “soft” refusals.


Maximizing response rates begins with interviewers who are fully trained in the procedures used in the phone center as well as the specific procedures used on this study. Training is followed by close supervision to guarantee that all procedures are followed. These steps will help ensure quality control over the collection of survey data.


General background training of interviewers, regardless of the specific project, includes:


  • An understanding of telephone sampling procedures and the importance of rigorous adherence to sampling procedures in the field;

  • An understanding of respondent selection procedures and the importance of following these procedures rigorously;

  • The role of the interviewer in the survey process;

  • Recommended methods for contacting potential respondents and procedures for setting appointments;

  • Effective methods for gaining initial agreement to be interviewed;

  • Methods for overcoming initial reluctance to schedule or agree to be interviewed;

  • Interviewer behavior in the interview setting – how to be courteous, neutral and non-intrusive;

  • How to avoid biasing responses by verbal and non-verbal cues;

  • How to ask and record closed-ended questions;

  • How to probe and record open-ended questions;

  • How to control irrelevancies and digressions without offending the respondent;

  • How to reassure respondents about the confidentiality of the information collected and the anonymity of survey respondents; and

  • General recording conventions.


Specific training related to this study will include the following:


  • Purpose of the study and importance to the client;

  • Question-by-question specifications with particular attention paid to interviewer instructions;

  • Review of the study procedures for contact, selection, administering the instrument and recording the responses correctly; and

  • Practice interviews in the presence of the trainer.


Once interviewing begins, maximizing the response rate depends on the interviewers’ ability to develop a rapport with the respondent, and on accurate identification and documentation of refusals and terminations, recording what happened and why. This type of documentation assists in future refusal conversion efforts. For example, many surveys fail to differentiate between refusal by the designated respondent, refusal by a third party, and refusal prior to the specification of who the designated respondent is. The latter two types are a refusal to screen rather than a refusal to interview. This difference may have a significant effect on the likelihood of eventual conversion, as well as on the most appropriate approach to refusal conversion.


Whether the initial refusal is made by the selected respondent, a third party, or an unidentified voice on the other end of the line, the computer assisted telephone interviewing (CATI) system is programmed to move the interviewer to a non-interview report section. The interviewer asks the person refusing for the main reason that they do not want to do the interview. The interviewer enters the answer, if any, and any details related to the refusal into the program to help us understand the strength of, and the reason for, the refusal. This information is entered verbatim and is reviewed daily by the field manager and weekly by the project director.


This level of detail provides a systematic record of the exact point in the interview at which the refusal or termination occurred; the circumstances and reasons, if given, for the refusal or break-off; and the position of the person refusing/terminating, if known. The non-interview record can also provide the qualitative information necessary to identify the source of problems with the survey instrument or procedures, as well as suggesting possible strategies for both reducing future refusals and converting current refusals. This may involve modifying the introduction, since most refusals occur within the first 30 seconds, while the interviewer is introducing the study. It may lead to interviewer scripts for better handling of the most common types of questions or respondent concerns that emerge from a review of the early refusals. It also will guide the refusal and termination conversion scripts that will be used in the study.


The actual process of converting terminations and refusals, once they have occurred, involves several steps. First, there is a diagnostic period, when refusals and terminations are reported and reviewed daily to see if anything unusual is occurring. Second, after enough time has passed to see a large enough sample of refusals and terminations, a refusal conversion script is developed. Third, the refusal conversion effort is fielded, with re-interview attempts scheduled about a week after the initial refusal. (Conversions of interviews that are more than half complete would not be delayed this long.) Fourth, the outcomes of the refusal conversion efforts are reviewed on a daily basis. Revisions of the script or the procedures would be made if indicated by the ongoing results of the conversion effort.


Refusal conversion efforts are usually undertaken by more experienced, senior interviewers. Prior to beginning refusal conversion, they review the reason for refusal or termination with the Operations Manager and discuss general strategy. They then begin re-contacting the refusals and terminations, approximately one week after initial non-participation. The delay permits time for the respondent to distance himself/herself from the original refusal. Also, it allows time for personal situations to change – family situation, work schedules, etc. – in case these contributed to the refusal.


  4. Tests of Procedures/Methods


The procedures used in this study allow for the survey to be tested at several points at the beginning of the study. Once programmed in the CATI system, the questionnaire is subjected to several testing procedures. First, the analytical staff attempts to test all possible response categories for each question in order to identify embedded logic errors, as well as obvious skip problems. Several analysts may test the program simultaneously to identify problems quickly, and to double check the logic of the programming.


Once the research team is satisfied with the CATI program, an “autopilot” sub-routine is used to test skip patterns more comprehensively. The autopilot program conducts a specified number of interviews (e.g., 1,000 or more). In each interview, the CATI program goes through the questionnaire and the autopilot program auto-fills the answer with a random response. The program continues through each questionnaire until the specified number of interviews is completed. Then, the research staff reviews the marginal distribution for the answers to the interviews to check bases against the skip patterns in the programs. This permits us to identify any errors in skip instructions in the program before fielding the survey.
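A toy version of such an autopilot run, assuming a simple two-rule skip pattern; the questions and skip logic are invented for illustration:

```python
import random
from collections import Counter

def run_autopilot(n_interviews, rng):
    """Fill each question with a random response and count how many
    interviews reach each question (the 'base' for its marginals)."""
    bases = Counter()
    for _ in range(n_interviews):
        bases["Q1"] += 1
        q1 = rng.choice(["yes", "no"])
        if q1 == "yes":            # skip rule: Q2 is asked only if Q1 == "yes"
            bases["Q2"] += 1
            rng.choice(["a", "b", "c"])
        bases["Q3"] += 1           # Q3 is asked of everyone
    return bases

bases = run_autopilot(1000, random.Random(0))
# Reviewing the bases catches skip errors: Q2's base must not exceed Q1's,
# and Q3's base must equal the total number of interviews.
```

In a real CATI system the random answers are injected into the programmed questionnaire itself; comparing the resulting bases against the intended skip instructions flags any routing errors before fielding.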


The last step is to pre-test the questionnaire among a representative sample of the target population for this survey. Typically, 20 to 30 pretest interviews are conducted using all procedures that will be used for the survey. The pre-testing will assess the adequacy, clarity, and balance of the data collection instruments, both for the interviewer administering the questionnaire and for the respondents answering the questions. Specifically, the pretests will evaluate the wording of questions and response categories for clarity and potential confusion; assess the flow of the interview, including question sequencing and skip patterns; determine the time required to administer the questionnaire, both the range and the average; help gauge item non‑response to sensitive questions; and review the full range of procedures and problems associated with respondent contact and the interview. If the average interview runs longer than anticipated, questions will have to be eliminated.


Interviewers who participate in the pretests are debriefed by the field manager concerning any problems or observations regarding the instrument. The field manager will consolidate the observations and suggestions of interviewers along with his own recommendations. These will be returned to the project director.


Please note several points about the pretest:


  • The 20-30 individuals participating in the pretest are part of the total number of respondents we reported in question #12 of Supporting Statement A.


  • Pretest respondents will be asked the same questions as all other respondents. The evaluations we want to make about the survey will be based solely on interviewer and Project Director observation.


  • We do not anticipate that pretest observation will cause us to make changes to this survey. However, if we do want to make changes, we will re-submit the revised survey to OMB.


  5. Statistical Staff Contact Information


Contractor contact: Cheryl Lampkin (SRBI Analyst), (301) 608-3883


