FINAL 9th_National_Survey_Older_Americans_Act_ParticipantsOMB_PRA_Section B_July_16_2014


National Survey of OAA Title III Service Recipients

OMB: 0985-0023


Application for New Data Collection:


Supporting Statement for

Ninth National Survey of

Older Americans Act Participants


Section B



March 18, 2014

Updated July 16, 2014







Submitted to:


U.S. Administration for Community Living

Administration on Aging

1 Massachusetts Avenue, NW

Washington, DC 20001


Submitted by:


WESTAT

1600 Research Boulevard

Rockville, Maryland 20850-3195

(301) 251-1500





B Collection of Information Employing Statistical Methods



B.1 Respondent Universe and Sampling Methods

Introduction


This Paperwork Reduction Act (PRA) request is to conduct a longitudinal survey of OAA service recipients. The ninth survey (conducted in 2014) will serve as the baseline for the longitudinal survey, with follow-up data collection at the second and third anniversaries of the baseline data collection. At the second and third data collection points, we will also recruit a sample of new service recipients to ensure that they are represented in the follow-up surveys and that the sample is adequate at each data collection point for both the cross-sectional and longitudinal analyses.


For the baseline survey, we will employ a two-stage sample design for the Ninth National Survey of Older Americans Act Participants (NSOAAP). The following sections discuss the respondent universe and sampling methods.


Baseline Respondent Universe


For the first stage of the sample design, we will select a probability sample of AAAs with probability proportional to size (PPS), where size is measured by total annual budget. When selecting AAAs for the Ninth National Survey, Westat will select a sample large enough to recruit approximately 300 Area Agencies on Aging, about 48 percent of the total number of AAAs (629); the largest AAAs will all be included. The second stage is the selection of a random sample of service recipients, by service, within each sampled AAA, so that all service recipients have a known probability of selection. A fixed number of service recipients will be selected within each service, based on the size of the AAA, for a total of 6,000 recipients. It is important to note that clients are sampled independently by service and no client will be asked to participate for more than one service; if a client happens to be sampled for more than one service, the client will be assigned to a single service at random.
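For illustration, the first-stage selection described above can be sketched as systematic PPS sampling with the annual budget as the size measure. This is a minimal sketch under stated assumptions; the function name and data layout are illustrative, not the actual selection software.

```python
import random

def pps_systematic_sample(budgets, n):
    """Systematic PPS selection: choose n units with probability
    proportional to a size measure (here, AAA annual budgets).
    Returns the indices of the selected units."""
    total = sum(budgets)
    interval = total / n
    start = random.uniform(0, interval)
    points = [start + k * interval for k in range(n)]
    selected = []
    cum, i = 0.0, 0
    for p in points:
        # Advance to the unit whose cumulative-size range contains p.
        while i < len(budgets) - 1 and cum + budgets[i] <= p:
            cum += budgets[i]
            i += 1
        # A unit whose budget exceeds the sampling interval can be hit by
        # more than one selection point, i.e., the largest agencies enter
        # the sample with certainty.
        selected.append(i)
    return selected
```

A consequence of this method is that any AAA whose budget exceeds the sampling interval is selected with certainty, which is consistent with including all of the largest AAAs.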


Exhibit B-1 presents the respondent universe for each module proposed for the Ninth National Survey of Older Americans Act Participants (NSOAAP).



Exhibit B-1. Respondent Universe




Service Recipient Survey

  Congregate Meals Module
  Indicator: Questions on nutrition intake, nutrition risk, food security, and clients’ assessments of the Congregate Meals program.
  Target population: All service recipients receiving Congregate Meals services.

  Home-delivered Meals Module
  Indicator: Questions on nutrition intake, nutrition risk, food security, and clients’ assessments of the Home-delivered Meals program.
  Target population: All service recipients receiving Home-delivered Meals.

  Transportation Module
  Indicator: Questions on clients’ experiences and assessments of transportation services.
  Target population: All users of Transportation Services.

  Case Management Module
  Indicator: Questions on clients’ experiences and assessments of case management services.
  Target population: All service recipients receiving Case Management services.

  Homemaker/Housekeeping Module
  Indicator: Questions on clients’ experiences and assessments of Homemaker/Housekeeping services.
  Target population: All service recipients who receive Homemaker/Housekeeping Services.

  Additional Services List
  Indicator: Questions asking service recipients if they receive other OAA services.
  Target population: All service recipients. Caregivers will be asked about services received by their care recipients.

  Physical Functioning Module
  Indicator: Revised Katz Activities of Daily Living (ADL) Scale and quality-of-life measures from the Behavioral Risk Factor Surveillance System (BRFSS) questionnaire.
  Target population: All service recipients. Caregivers will be asked these questions about their care recipients.

  Emotional Well-Being
  Indicator: Questions on mood and affect from prior surveys of the elderly.
  Target population: All service recipients, except Caregivers.

  Social Functioning
  Indicator: Degree of satisfaction with social activity and of health effects on social activities.
  Target population: All service recipients, except Caregivers.


Caregiver Survey

  National Family Caregiver Support Program Questionnaire
  Indicator: Questions on caregiver support and assessment of the program, based on the Caregiver survey developed for the first, second, and third national surveys.
  Participants to be sampled: Caregivers who participate in the National Family Caregiver Support Program.


Service Recipient and Caregiver Surveys

  Demographic Information Module
  Indicator: Demographic information.
  Participants to be sampled: All service recipients and caregivers.



Response Rates from other National Surveys of Older Americans Act Participants


This is the ninth time this type of survey will be conducted. This OMB-approved survey (0985-0014, 0985-0017, 0985-0020, 0985-0023) was conducted in 2002, 2003, 2005, 2008, 2009, 2011, 2012, and 2013. The research team anticipates an 83 percent response rate for AAAs and an 80 percent cooperation rate for the telephone survey of respondents, based on the success of the first through eighth surveys.


B.2 Procedures for the Collection of Information

B.2.1 Introduction

Several data collection activities will be conducted to support the survey. They are designed to ensure as complete a sample of AAAs (stage one) and service recipients (stage two) as possible, providing a representative sample for the analyses and informing ACL/AoA on the results of performance measures for state and community programs on aging under the Older Americans Act.


B.2.2 Data Collection Procedures

B.2.2.1 Telephone Contact with State and Local Agencies on Aging

Information will be collected in a two-step process. The proposed design will employ a probability sample of all AAAs proportional to size (PPS), with size measured by total annual budget. Once an agency is selected, it will receive a Federal Express package that contains an introductory letter from ACL along with detailed instructions for the AAA (see Appendix F)1. Approximately two days later, a researcher will call the agency to explain the purpose of the participant telephone survey and provide instructions for sampling the service recipients. The researcher will explain the numbered participant lists the agency needs to generate, from which the random sample of service recipients for each of the six services will be selected, and will provide detailed instructions specific to the client tracking software used by the AAA. Previous experience has enabled Westat, the contractor, to streamline the data collection procedures for the AAAs.


At the second and third (follow-up) data collection points, Westat will recruit additional service recipients from the AAAs sampled at baseline to ensure that new clients are adequately represented in the cross-sectional sample. The “refreshed sample” at Time 2 can be drawn from client lists supplied by the AAAs. We will use the AAA-supplied unique ID for each client surveyed at Time 1 and compare it to the lists obtained at the second data collection point, using a customized software program to compare the lists. The program will print the ID numbers of respondents that did not appear on an earlier list. We will then sample the new respondents by service at the same rate as at baseline. We will repeat this process at the third data collection point.


The size of each refreshment sample will be a function of how many new clients there are since the last sample was drawn. The target is to have 6,000 completes with a combination of follow-up calls and new calls. We estimate that 25 percent will be new clients.
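The list-comparison and refreshment step described above can be sketched as follows. The function and ID values are hypothetical, standing in for the customized comparison program.

```python
import random

def refreshment_sample(baseline_ids, current_ids, rate, seed=None):
    """Identify clients new since the baseline wave and sample them at the
    baseline sampling rate for that service.

    baseline_ids / current_ids: AAA-supplied client IDs for one service.
    rate: the sampling fraction used at baseline.
    """
    rng = random.Random(seed)
    # Clients on the current list who did not appear on an earlier list.
    new_ids = sorted(set(current_ids) - set(baseline_ids))
    k = round(len(new_ids) * rate)
    return sorted(rng.sample(new_ids, k))
```

The same comparison is repeated at the third data collection point against the union of the earlier lists.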



B.2.2.2 Telephone Survey of Older Americans Act Participants and Caregivers

Pre-notification Advance Letters


Potential respondents selected for the telephone interview will receive a letter from their respective AAAs on the agency’s letterhead. The letter contains an introduction to the study, explanation about the nature of participation, and a number to call if they do not wish to participate. Those who opt out of the study are not contacted further.


Telephone Interview


Interviewers participate in intensive training sessions prior to data collection and are monitored during data collection to ensure the protocol is properly followed. The training covers general interviewing techniques, topics specific to administering the Ninth National Survey of Older Americans Act Participants, and practice sessions.


The study sample includes people who are elderly and who may be living with disabilities. With that in mind, the training designed and conducted for the data collectors/telephone interviewers includes special guidance for interviewing and accommodating respondents who are elderly and who may have disabling conditions and/or communication barriers (hearing impairments, speech disorders, cognitive impairments, memory disorders, non-native English speakers). In certain instances, an interview with an interpreter or a proxy is arranged. Additionally, data collectors are advised to be alert to respondent fatigue and to suggest calling back and completing the interview during another session. For Spanish-speaking respondents, trained bilingual data collectors conduct the interview in Spanish.

At all three data collection points (baseline survey in 2014 and two follow-ups in 2015 and 2016), interviewers will conduct a 40-minute telephone survey of a representative sample of Older Americans Act service recipients and caregivers. The interview includes modules for each service (e.g., home-delivered meals, congregate meals, case management, caregiver, transportation, and homemaker) as well as modules that are the same for all services on demographics, physical functioning, and quality of life. Interviewers administer the appropriate service module (i.e., the module that focuses on the service from which the participant was sampled).


The service modules include items on the extent to which the respondents use the service, consumer assessment of services, and self-reported outcomes, such as the ability to live independently at home. The demographic module identifies age, living arrangements, race/ethnicity, and income categories. The module on physical functioning identifies the extent to which respondents are able to care for themselves (e.g., bathe, dress, eat, etc.) and are able to handle paying bills, going to the doctor, and grocery shopping, for example.


Reminder Cards

We will maintain contact with participants between waves. Researchers will send a card to participants 6 months after each interview to remind them of their participation and the approximate time frame for the follow-up interview; the card will also ask for any address change and/or new telephone number. A sample of the reminder card is in Appendix I. Two weeks prior to data collection at the second and third data collection points, the AAA will send a letter to the respondents notifying them of the upcoming interview. The letter will contain a toll-free number that they can call to schedule the telephone interview if they prefer to know the day and time of the interview in advance.

Obtaining Outcome Data from Non-locatable Respondents at Waves 2 and 3


We plan to model predictors of nursing home placement and time in the community with the longitudinal data, using the Cox proportional hazards model or a similar approach. Therefore, it is important for the research team to collect information on why respondents drop out of the study after the baseline data collection and when they stopped receiving services, especially the date of any permanent nursing home placement. We will use several methods to determine whether a person is continuing to receive services or has exited, and the reasons for exit. First, we will contact the AAA from which we sampled the respondent to determine the outcome. If the AAA does not collect the information, we will ask it to consult the applicable service provider to ascertain the client’s current status. If that is unsuccessful, we will conduct an Internet search for the respondent using tools such as LexisNexis. In the event that the first two methods do not yield sufficient information, we will contact the next of kin or contact person previously specified by the respondent. The interviewers will ask the contact person whether or not the respondent still receives the service. If the respondent no longer receives the service, the interviewer will ask for the date of the termination of the service and the reason for no longer receiving it (e.g., nursing home placement, mortality, moved to another location, other). A copy of the script is in Appendix J.
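A time-to-event model such as the Cox proportional hazards model ultimately needs, for each respondent, an observation time and an event indicator. A minimal, hypothetical sketch of encoding the exit information collected above (field names and disposition codes are illustrative, not the study's actual coding scheme):

```python
from datetime import date

def survival_record(baseline, last_known, reason):
    """Encode one respondent's follow-up outcome for a time-to-event model.

    reason: disposition code such as 'nursing_home', 'deceased', 'moved',
    or 'still_receiving'. Permanent nursing home placement is the event of
    interest; all other dispositions are treated as censored at the last
    date the respondent's status was known.
    """
    days = (last_known - baseline).days
    return {
        "time_days": days,
        "event": reason == "nursing_home",  # True only for the event of interest
        "reason": reason,
    }
```

Respondents still receiving services at the end of follow-up would simply be censored at the final data collection date.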



Quality Control Procedures

Westat has quality control procedures in place for every phase of the project. Interviewers participate in rigorous training that includes general interviewer training and project-specific training. Trainers observe interviewers conducting practice interviews and monitor interviewers during data collection. During data collection, data are checked to ensure that there are no outliers in the dataset. In addition, when questions arise during an interview, interviewers complete a form explaining an ambiguous or inconsistent response. Researchers review the forms and make any necessary adjustments.



B.2.3 Sampling Plan

B.2.3.1 Sample Design

The sample design for the ninth survey will consist of two stages, with a sample of approximately 300 AAAs in the first stage and a sample of clients, by service type, from each selected AAA in the second stage. This design is similar to that of the third, fourth, fifth, sixth, seventh, and eighth surveys. The client sample sizes by service type, as specified by ACL, are as follows:


  • Caregiver Services 2,000

  • Home Delivered Meals 1,000

  • Congregate Meals 1,000

  • Case Management Services 500

  • Transportation Services 1,000

  • Homemaker Services 500



As in the third through eighth surveys, these sample sizes will permit the production of reliable estimates both at the national level and at the geographic regional or demographic sub-group level. If measures of change are longitudinal (based on repeated interviews with the same respondents), the figures in Table B-2 likely represent upper bounds on the margins of error for estimated differences.


For a two-stage design, Table B-1 presents the half-widths of the 95 percent confidence intervals (CI) for various sample sizes and for cross-sectional estimates of target characteristics of proportions ranging from 10 percent to 50 percent.2 The 50 percent target is a worst-case scenario, where respondents are expected to be fairly evenly split on a particular response item, limiting the reliability of the estimate (e.g., trying to predict the outcome of an election where the sample of voters is about evenly divided between two candidates). Also, the precision of any estimate greater than 50 percent is the same as that of its complement; i.e., the precision of a 70 percent estimate is the same as the precision of a 30 percent estimate. The numbers in the tables are half-widths of 95 percent CIs (i.e., the estimate ± the half-width is the CI, where the half-width is 1.96 times the standard error (SE) of the estimate). For example, Table B-1 shows that for a sample of size 1,000, for a target characteristic of around 30 percent, the CI would be the estimate ± 3.24 percent.
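The half-widths in Table B-1 follow directly from the stated design effect of 1.30. A short sketch of the computation, using the standard formula for a proportion (not taken from the study's actual software):

```python
import math

def half_width(p, n, deff=1.30, z=1.96):
    """Half-width (in percentage points) of the 95 percent CI for an
    estimated proportion of p percent from a sample of size n, with the
    standard error inflated by the design effect of the two-stage design."""
    prop = p / 100.0
    se = math.sqrt(deff * prop * (1 - prop) / n)
    return 100 * z * se
```

For example, half_width(30, 1000) returns approximately 3.24, matching the corresponding cell of Table B-1.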


The table can be used to assess the adequacy of the sample sizes for both the national and the regional or sub-group level estimates. For example, if the sample size is 1,000 at the national level, then the sixth row in Table B-1 provides the precision of the estimates at the national level. From the same table, the precision of an estimate at the regional or sub-group level can be obtained by computing the sample size that is expected for a particular region. For instance, if the region covers 25 percent of the target population, then the sample size for that region is expected to be about 250 (out of 1,000) under a proportional allocation, and the precision of the estimates for that region can be checked from the row where the sample size equals 250 in Table B-1. Similarly, if a sub-group covers 10 percent of the target population, then the expected sample size for that sub-group is 100 out of 1,000, and the precision of the estimates for that sub-group can be checked from the row with sample size equal to 100.


The total size of the target population has a negligible impact on the required sample size. For example, if a sample size of 250 is required to produce an estimate at the national level, then to estimate the same characteristic for a particular region (with the same level of precision), the required sample size from that region alone would be about 250. If there are four regions, then the required sample size at the national level would be about 1,000 (to guarantee adequate representation in each group). Therefore, to meet the objective of the proposed survey (i.e., to produce estimates at the regional or sub-group level with the same level of precision as the national estimates obtained from previous studies), the required sample size for each target region or sub-group will have to be the same as the total sample size of the previous studies.


For instance, a question was asked in the first national survey about the timeliness of the delivery of meals, and an estimated 44 percent of all clients reported that the meals arrived on time, all the time. This estimate was based on a sample of 472 clients and had a CI of ±5.2 percent. Table B-1 shows that to achieve a CI of ±5.2 percent for an estimate with a proportion between 40 percent and 50 percent, a sample of around 480 is required. That means that if this estimate is required at the regional level with the same level of confidence as the national, then the sample size in each region will have to be 480, and hence the sample size at the national level will be 480 × 4 = 1,920. In that case, the CI for this estimate at the national level would be much more precise than for the region (a little over ±2.5 percent). Table B-1 can be used to see the precision of the estimates that would be achieved at various levels using the expected sample sizes at the respective levels.

The table can also be used to check the sample size requirement corresponding to a desired level of precision of an estimate.
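Inverting the same half-width formula gives the sample size required for a desired level of precision; a sketch under the same design-effect assumption:

```python
import math

def required_n(p, half_width_pct, deff=1.30, z=1.96):
    """Smallest sample size whose 95 percent CI half-width (in percentage
    points) does not exceed half_width_pct for a proportion of p percent,
    under the two-stage design with the stated design effect."""
    prop = p / 100.0
    return math.ceil(deff * prop * (1 - prop) * (z / (half_width_pct / 100.0)) ** 2)
```

For instance, required_n(30, 3.24) recovers a sample size of about 1,000, consistent with the corresponding row of Table B-1.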



Table B-1 Half-widths of 95 percent confidence intervals by various sample sizes and estimates of target characteristics (computed for a two-stage design with a design effect of 1.30)



                   Estimates of target characteristics
Sample size    10 percent   20 percent   30 percent   40 percent   50 percent
3,500             1.13         1.51         1.73         1.85         1.89
3,000             1.22         1.63         1.87         2.00         2.04
2,500             1.34         1.79         2.05         2.19         2.23
2,000             1.50         2.00         2.29         2.45         2.50
1,500             1.73         2.31         2.64         2.83         2.89
1,000             2.12         2.83         3.24         3.46         3.53
750               2.45         3.26         3.74         4.00         4.08
500               3.00         4.00         4.58         4.90         5.00
400               3.35         4.47         5.12         5.47         5.59
300               3.87         5.16         5.91         6.32         6.45
250               4.24         5.65         6.48         6.92         7.07
200               4.74         6.32         7.24         7.74         7.90
100               6.70         8.94        10.24        10.95        11.17



It is important to note that if the population sizes in the sub-groups or regions vary widely, then the national sample must be allocated appropriately to produce estimates for all individual sub-groups/regions with an equal level of precision. Otherwise, under a proportionate allocation, larger sub-groups will have more than the required sample size while smaller sub-groups will have less. For example, if estimates are required separately for Whites and African-Americans, then simply increasing the national sample would not ensure a sufficient sample size for African-Americans, because less than 15 percent of recipients are African-American for many services. In this situation, the national sample can be disproportionately allocated by over-sampling smaller sub-groups to ensure that sufficient samples are drawn from all target sub-groups. However, over-sampling an ethnic or demographic group will require that agencies first list all their clients with the characteristic of interest and then select a sample from this list by sub-group (which may exceed the capacity of many AAA information systems).
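The disproportionate allocation described above can be illustrated with a small sketch: if every sub-group needs the same sample size for equal precision, the sampling rate must rise as the sub-group's population share falls. The sub-group shares and population total below are hypothetical.

```python
def oversampling_rates(group_shares, n_per_group, population):
    """Sampling fraction per sub-group when each sub-group requires the
    same sample size n_per_group regardless of its population share.

    group_shares: dict mapping sub-group name -> share of the population.
    Returns a dict mapping sub-group name -> sampling fraction.
    """
    return {g: n_per_group / (share * population)
            for g, share in group_shares.items()}
```

With hypothetical shares of 85 and 15 percent and 500 completes needed per group, the smaller group must be sampled at several times the rate of the larger one.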


B.2.3.2 Sample Size for Estimation of Change

If there is interest in comparing estimates from one year with another year (not longitudinally, though), or comparing estimates of one sub-group with another sub-group, the sample size requirements are different from those that yield individual point estimates at the same level of precision. The standard error (SE) of the difference between two independent estimates (for example, A and B) can be obtained by

SE(A − B) = sqrt( SE(A)^2 + SE(B)^2 ),

and the half-width of the 95 percent CI is

1.96 × sqrt( SE(A)^2 + SE(B)^2 ).

Since the variance of the estimate (of a difference between estimates) is the sum of the variances of the relevant individual estimates, the required sample size for estimating a difference or change is higher than for a single point estimate.3


Table B-2 presents half-widths of 95 percent CIs under a two-stage design for various sample sizes and various averages of the two estimates to be compared. For example, if the average of the two target characteristics to be compared is around 30 percent (for example, A=25 and B=35) and the sample size in each sub-group is 500, to detect a difference between the two sub-groups with statistical significance, the actual difference between the two sub-group characteristics will have to be at least 6.48 percent. This is much higher than the corresponding half-widths presented in Table B-1 for each of the individual estimates. That means a sample size that is sufficient to produce a reliable point estimate for each sub-group, individually, is not necessarily sufficient to detect the difference between the two sub-groups with the same level of precision.
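The Table B-2 values combine the two groups' variances as described above; a sketch of the computation for two groups of equal size (standard formula, not the study's software):

```python
import math

def diff_half_width(p_avg, n_per_group, deff=1.30, z=1.96):
    """Minimum detectable difference (95 percent CI half-width, in
    percentage points) between two independent group estimates whose
    average is p_avg percent, with n_per_group completes in each group.
    The variance of the difference is the sum of the two group variances."""
    prop = p_avg / 100.0
    se_diff = math.sqrt(2 * deff * prop * (1 - prop) / n_per_group)
    return 100 * z * se_diff
```

For example, diff_half_width(30, 500) returns approximately 6.48, matching the example in the text and the corresponding cell of Table B-2.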


Therefore, if the survey is designed for use at a region or sub-group level, then the corresponding national estimates can be compared meaningfully from one year to another, or for one service versus another (e.g., the percent of each service’s clients below a certain income level). For example, if the sample size is 1,000 in each year, and if the average response proportion for the two target characteristics is around 30 percent, then a difference of 4.58 percent or more between the years is detectable. The corresponding comparison with a sub-group sample of size 500, would not allow detecting a difference unless it is 6.48 percent or more.


Table B-2 can be used to see the extent of difference that can be detected under a two-stage design, for various sample sizes, and for various characteristics to be compared either at the national or at the sub-group level.





Table B-2 Half-widths of 95 percent confidence intervals for the difference between two estimates by various sample sizes and for various averages of the two estimates (computed for a two-stage design with a design effect of 1.30)



                   Average of the estimates to be compared
Sample size
in each group  10 percent   20 percent   30 percent   40 percent   50 percent
3,500             1.60         2.14         2.45         2.62         2.67
3,000             1.73         2.31         2.64         2.83         2.89
2,500             1.90         2.53         2.90         3.10         3.16
2,000             2.12         2.83         3.24         3.46         3.53
1,500             2.45         3.26         3.74         4.00         4.08
1,000             3.00         4.00         4.58         4.90         5.00
750               3.46         4.62         5.29         5.65         5.77
500               4.24         5.65         6.48         6.92         7.07
400               4.74         6.32         7.24         7.74         7.90
300               5.47         7.30         8.36         8.94         9.12
250               6.00         8.00         9.16         9.79         9.99
200               6.70         8.94        10.24        10.95        11.17
100               9.48        12.64        14.48        15.48        15.80



Nonresponse adjustment was done as part of the weighting process for the previous surveys and will also be done for the Ninth National Survey. The weights of the respondents were inflated to account for the weights of the nonrespondents separately for each service. The adjustment was applied independently within nonresponse adjustment groups defined by census region and size of the agencies. That means the nonrespondents within a group are represented by the respondents in the same group. The same types of nonresponse adjustment will be done for the ninth survey.
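The weighting-class adjustment described here can be sketched as follows. The data layout is illustrative (the actual weighting system differs); the cells correspond to the nonresponse adjustment groups defined by census region and agency size.

```python
from collections import defaultdict

def adjust_weights(cases):
    """Inflate respondent base weights within each nonresponse adjustment
    group so that respondents carry the weight of nonrespondents in the
    same group; nonrespondents receive a final weight of zero.

    cases: list of dicts with keys 'group', 'weight', 'responded'.
    Returns a dict mapping case index -> adjusted weight.
    """
    total_wt = defaultdict(float)   # sum of base weights, all cases
    resp_wt = defaultdict(float)    # sum of base weights, respondents only
    for c in cases:
        total_wt[c["group"]] += c["weight"]
        if c["responded"]:
            resp_wt[c["group"]] += c["weight"]
    adjusted = {}
    for i, c in enumerate(cases):
        if c["responded"]:
            adjusted[i] = c["weight"] * total_wt[c["group"]] / resp_wt[c["group"]]
        else:
            adjusted[i] = 0.0
    return adjusted
```

The adjustment preserves the total weight within each cell, so nonrespondents in a group are represented by the respondents in the same group.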




B.2.4 Older Americans Act Participant Survey Instruments

The survey consists of telephone interviews with service recipients and caregivers. The interview is structured and will contain specific questions about the mix of services the person has received and his or her assessment of those services. Whenever appropriate, questions will contain predefined categories. Probes will be used to facilitate obtaining complete responses to all the questions. The interviews of caregivers will not include the questions about physical functioning (except health conditions and ADL and IADL limitations of their care recipients) or the Emotional Well-Being and Social Functioning questionnaires. The interviews will last approximately 40 minutes and cover the topics discussed below. This is the same process followed for each of the previous surveys.


  1. Nutrition-Congregate Meals: If a respondent receives Congregate Meals, they will be asked a short questionnaire based on the Congregate Meals survey, used for the first, second, fourth, fifth, sixth, seventh, and eighth national surveys, as well as POMP I through VI. This questionnaire asks how long they have been attending the congregate meals program; how often they eat at the site; when the last time was they ate at the site; to rate the program; and how much of their food intake the meal provides on the days they eat at the site.

  2. Nutrition-Home-delivered Meals: If a respondent receives Home-delivered Meals, they will be administered a short questionnaire based on the Home-delivered Meals survey, used for the first, second, third, fourth, fifth, sixth, seventh, and eighth national surveys, and POMP I through VI. This questionnaire asks how long they have been receiving home-delivered meals; how often they receive home-delivered meals; when the last time was they received a meal; to rate the program; and how much of their food intake the meal provides on the days they receive home-delivered meals.

  3. Transportation: All service recipients who use transportation services will be interviewed using this survey module. The module asks how long they have been using the transportation; how often they use it; when the last time was they used it; trip purpose; to rate the transportation service; and about the number of times the respondent uses the service. This module is based on the instrument used for the first through eighth surveys, and all six of the POMP surveys.

  4. Homemaker/Housekeeping: Questions on the impact of homecare services will be asked of respondents who receive homemaker or housekeeping services. These questions were used in the fourth, fifth, sixth, seventh, and eighth national surveys and are based on the Housekeeping Service Module developed by the POMP VI grantees. Again, the set of questions is similar to those asked of the other services: how long respondents have been receiving homemaker services; how often they receive homemaker services; when the last time was they used the services; to rate the program; and if they can depend on their aides to deliver the allotted services.

  5. Case Management: Service recipients who receive case management services will be asked questions about their experiences with the program. They will be asked: how long they have been receiving the services; how they would rate the various aspects of the case management services (e.g. ease of contact with the case managers; if the case managers understand their needs, etc.); to rate the services overall and if they contribute to the decisions about their care. This module was used in the fourth through eighth national surveys and is based on the case management module developed by the POMP V grantees.

  6. Service List: All service recipients will then be asked about the mix of services they receive and the impact of those services. They will also be asked to rate the services overall. This module is based on the service module used for the third through eighth national surveys, with added questions from POMP VI.

  7. Physical Functioning: This module will be asked of all service recipients (except Congregate Meals clients). This survey module will include questions on: Activity of Daily Living limitations (e.g., difficulty with personal care activities such as bathing and dressing) and Instrumental Activity of Daily Living limitations (e.g., difficulty with such home management activities as meal preparation, shopping, and housekeeping). Questions about the respondents’ health are also being asked, to help with assessing the frailty of the clients served by OAA services. Caregivers will be asked these questions about their care recipients.

  8. Emotional well-being: This six-question module will be asked of all participants in the surveys, except caregivers. The questions ask if the respondent has felt sad or depressed, worried or tense, and if they feel that they did not get enough rest, within the last thirty days. They are also asked to describe their overall emotional well-being by responding to a closed-ended question (i.e., “Would you say…Excellent, Very Good, Good, Fair, or Poor?”).

  9. Social Functioning: All service recipients will be asked four questions from the Social Functioning Survey. These questions ask if the respondent feels his or her social life is adequate and if health concerns have interfered with the ability to participate in social activities.

  10. National Family Caregiver Support Program Assessment: Caregivers who receive caregiver support services through the National Family Caregiver Support Program will be surveyed as part of the Ninth NSOAAP. This module has questions on services offered to caregivers through the National Family Caregiver Support Program, and the impact of those services. There are also questions about services the care recipient receives and satisfaction with and impact of those services; support the caregiver receives, either as part of a formal support group or from other relatives and friends; and what kinds of other information the caregiver would find valuable. The survey asks about the type of help the caregiver provides for the care recipient, the amount of time they provide care, benefits caregiving provides them (companionship, a sense of accomplishment, etc.), drawbacks of caregiving (financial burdens, lack of private time, etc.), and demographic and health information on the care recipient. Three of the questions for this module were adapted from an AARP survey, Caregiving in the U.S.4

  11. Demographic information of the respondent: Demographic information about the respondent will be collected, including type of area of residence (urban, suburban, or rural), Zip Code, education level, race, gender, living arrangements (living alone, with spouse, or with others), and income level. This module will be administered to all participants. The caregiver survey already includes some demographic questions about the care recipient, but the demographic information on the caregiver will be gathered using this demographic module.


Many of the national survey questions come from such commonly used vehicles as the Survey of Income and Program Participation (SIPP) (e.g., the ADL and IADL questions), the Behavioral Risk Factor Surveillance System (BRFSS) surveys conducted within each state using HHS/CDC standard questions, and other existing surveys. These are virtually the same instruments used for the previous eight national surveys.





B.3 Methods to Maximize Response Rates and Deal with Nonresponse

Procedures for Eliciting Cooperation and Maximizing Response Rates among AAAs


Westat will use the same procedure to select respondents for the Ninth National Survey of OAA Participants as it did in the previous four surveys, a procedure that has proved very successful. As part of the recruitment procedures, Westat initially contacts the AAAs by sending an introductory letter from ACL and an information package about the survey via Federal Express. A copy of these materials is also sent to each State Unit on Aging with AAAs sampled for the survey. (See Appendices D and E for the letter sent to the states and the introductory letter and information package sent to the AAAs.) Following up by telephone and e-mail, the Westat research team works closely with each participating AAA to generate numbered lists of clients (using client ID numbers) by service for the client sample frame; these numbered lists are used for the random selection of the respondents to be interviewed. To complete the random sampling process, Westat research team members enter the total number of participants by service into a computer sampling program. The program randomly selects line numbers from the numbered lists of clients; the number of clients to select per service is already entered into the program and is based on the size of the agency. Westat informs the AAAs of the selected line numbers, and the AAAs then provide the participant names and telephone numbers associated with those line numbers to Westat.
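The line-number selection step described above can be sketched as follows. This is a hypothetical illustration only; the function name, seed handling, and counts are illustrative and do not represent Westat's actual sampling program:

```python
import random

def sample_line_numbers(total_clients, n_to_select, seed=None):
    """Randomly select distinct line numbers (1..total_clients) from a
    numbered client list for one service.

    Hypothetical sketch: the AAA reports only the total count of clients
    for the service, and the program returns the randomly chosen line
    numbers to request back from the agency, so Westat never sees the
    full list of client identities.
    """
    n = min(n_to_select, total_clients)  # small agencies: take everyone
    rng = random.Random(seed)
    return sorted(rng.sample(range(1, total_clients + 1), n))

# Example: an AAA reports 250 home-delivered-meal clients; select 20.
selected = sample_line_numbers(250, 20, seed=42)
```

Because only line numbers travel back to the agency, every client on the list has a known, equal probability of selection while personally identifiable information is limited to the sampled cases.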


Westat research assistants serving as recruitment specialists will encourage the participation of all selected agencies by establishing rapport with contacts within each agency, coaching them on how to generate their client lists, and assuring them that the time required to complete the participant selection procedures will be minimal. Agencies that decline to participate will be sent a refusal conversion letter (already developed for the previous five surveys) and called one more time in an attempt to gain their cooperation. Once an agency refuses a second time, Westat will not contact it again. For the Evaluation of Independent Living Programs (an OMB-approved national study for the Rehabilitation Services Administration, Office of Special Education and Rehabilitative Services, U.S. Department of Education), and for the eight previous ACL/AoA national surveys, Westat research assistants called the original agencies, sent e-mails and/or faxes, and resent recruitment packages via FedEx. Westat will use the same techniques to gain cooperation for the ninth survey. Additionally, to promote agency participation, we plan to gain the endorsement and support of the National Association of Area Agencies on Aging (n4a), as well as the National Association of States United for Aging and Disabilities (NASUAD).


To reduce the burden for the AAAs, Westat works with software vendors of commercial client tracking software programs commonly used by AAAs to develop step-by-step instructions for creating numbered lists of client ID numbers by service. By using agency-assigned client ID numbers to generate numbered lists of clients for the participant sample frame, Westat is able to screen the lists for duplicate client entries. Additionally, the use of agency-assigned client ID numbers helps to decrease the amount of personally identifiable client contact information collected by Westat during the survey.
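The duplicate screening described above amounts to a simple de-duplication pass over each service's numbered list of client IDs. A minimal sketch, with a hypothetical function name and return format chosen for illustration:

```python
def screen_duplicates(client_ids):
    """Return (unique_ids, duplicate_ids): the client ID list with repeats
    removed (first occurrence kept, order preserved) and the IDs that
    appeared more than once.

    Hypothetical sketch of the screening step: flagged duplicates would be
    reconciled with the agency before line numbers are sampled, so that no
    client occupies more than one line of the sample frame for a service.
    """
    seen, dupes, unique = set(), set(), []
    for cid in client_ids:
        if cid in seen:
            dupes.add(cid)
        else:
            seen.add(cid)
            unique.append(cid)
    return unique, sorted(dupes)

# Example: a service list in which two client IDs are repeated.
unique, dupes = screen_duplicates(["A1", "B2", "A1", "C3", "B2"])
```

Screening on agency-assigned IDs rather than names keeps the check mechanical and avoids transferring additional personally identifiable information.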


Numbered lists will be developed for the following services: home delivered meals, congregate meals, transportation, case management, and homemaker services, as well as caregivers who are served by the National Family Caregiver Support Program.

To ensure a high participant response rate, each AAA will send participants who are eligible for the telephone survey a letter before they are contacted by an interviewer. Westat also offers the AAAs the option of sending the client notification letters for them. The letters will be on each AAA’s letterhead, as was the pre-contact letter for the first eight surveys. Westat will attempt to contact participants at different times of the day and different days of the week to maximize the possibility of contact. Westat is also experienced in refusal conversion procedures, having achieved a refusal conversion rate of 40 percent for the earlier surveys.


Procedures for Maintaining Cooperation for the Second and Third data Collection Points


As described above, we will send reminder cards to the respondents 6 months after each data collection wave. Whenever reminder cards have been returned, a researcher will contact the next of kin or contact person and follow the procedures discussed in Section B2.2.2.


Tracing


Tracing is an important strategy for achieving good response rates at all three data collection points. At baseline, we will trace potential respondents who are unreachable by first verifying the address with the AAA contact and then searching web-based directories.


At the second and third data collection points, we will use the same methods as described for the baseline. If those methods do not produce information about non-locatable respondents, we will call the next of kin or contact person identified by the respondent. The interviewer will administer four questions to determine the respondent’s outcome, which is necessary to model factors associated with remaining in the community and time to event (e.g., nursing home placement, mortality). Please see Appendix J for the telephone script that contains the questions about outcome.




B.4 Tests of Procedures or Methods to Be Undertaken


As discussed in earlier sections, the individual service modules and the modules on physical functioning, quality of life, and demographics have all been field tested and validated by the POMP grantees. For each module, the grantees drew samples of service recipients, administered the module, and analyzed the data. They then revised the items on the modules based on the results of the field tests and validity studies.


The majority of the items on the survey instrument for the Ninth National Survey are drawn from the previous survey instruments. Over the years, several items have been removed from the instrument because of ambiguous wording or because responses showed no variation across response options.



B.5 Individuals Consulted on Statistical Aspects and Individuals Collecting and/or Analyzing Data


The use of statistical sampling methods is critical to this survey. Under the supervision of AoA, Westat is responsible for selecting the sample, conducting the interviews, weighting the data, and analyzing the data. Below are the names and contact information of the individuals responsible for the statistical aspects of the study and for the collection and/or analysis of the data.



Administration for Community Living/Administration on Aging Personnel Responsible for Deliverables


Elena M. Fazio, Ph.D.

Social Science Analyst

Office of Performance and Evaluation

Administration for Community Living/Administration on Aging

U.S. Department of Health and Human Services

1 Massachusetts Avenue, NW

Washington, DC 20001

Tel: 202-357-3583 

[email protected]



Westat Staff


Dwight Brock, Ph.D. – Role: statistician with involvement in study design, development of the sampling plan, weighting, and data analysis

Senior Statistician

Westat

1600 Research Blvd.

Rockville, Maryland 20850

301-517-4026

[email protected]


Kathryn Engle – Role: Data collection supervisor

Westat

1600 Research Blvd.

Rockville, Maryland 20850

301-610-4911

[email protected]


Robert Ficke – Role: Project Director with involvement in design, sampling, and data analysis

Senior Study Director

Westat

1600 Research Blvd.

Rockville, MD 20850

301-294-2835

[email protected]



Katie Hubbell – Role: systems analyst with involvement in sampling, data weighting, data analysis, and reports

Senior Systems Analyst

Senior Systems Analyst

Westat

1600 Research Blvd.

Rockville, Maryland 20850

301-294-2020

[email protected]


Robin Ritter – Role: Manager of Survey Operations including agency recruitment

Senior Study Director

Westat

1600 Research Blvd.

Rockville, Maryland 20850

240-314-5804

[email protected]


Jacqueline Severynse – Role: design of the sampling plan, weighting, and data analysis.

Senior Statistician

Westat

1600 Research Blvd.

Rockville, Maryland 20850

301-517-8058

[email protected]




Appendix H


Notification Letter to be

Sent to State Units on Aging





DATE:

April __, 2014


TO:

Director, State Unit on Aging


FROM:

Robert Hornyak


SUBJECT:

Ninth National Survey of Older Americans Act Participants


The Administration for Community Living (ACL) is undertaking the Ninth National Survey of Older Americans Act Participants. Approximately half of the AAAs in the 50 states and the District of Columbia have been randomly selected to participate in this study. Please see the attached list of the Area Agencies on Aging in your state that have been selected to participate in this year’s survey.


This survey provides ACL with an effective method for collecting timely data to meet the accountability requirements of Congress and the Administration. As required, ACL requested and received approval from the Office of Management and Budget (OMB) to identify and survey elderly individuals who receive OAA services (OMB approval number 0985-0023). The approval covers the sampling of Area Agencies on Aging and their clients, and the conduct of surveys to assess the following service categories: congregate and home-delivered meals, family caregiver, case management, homemaker, and transportation services.


Westat is the research firm that will be conducting the study for AoA. Each selected AAA will be asked to work with Westat to submit electronic lists of clients via a secure web site, which will be used in randomly selecting the sample of service recipients. Where possible, Westat would like to work directly with State MIS contacts to expedite the collection of client ID number lists and ease the burden of selected AAAs.


In the past, numerous SUAs have worked directly with Westat in providing client tracking data, with the agreement of their AAAs. This is an efficient method for sampling service recipients, as well as reducing the workload for participating AAAs. ACL would like to maximize the number of states that use their client data systems for sampling service recipients. Therefore, we encourage you (SUA) to use your client data system where possible to further data collection associated with this project.  


After advance letters (which Westat can prepare and mail for the AAAs) have been sent to the sampled clients (an average of 80 clients per AAA), Westat will conduct telephone interviews with the sampled service recipients. The survey will assess consumers’ views of the services, as well as self-reported outcomes of those services. All responses will be private, and the analysis will not link responses to agencies or individual respondents.


In addition to the list of AAAs in your state that have been selected to participate, we have also attached a copy of the letter from me that is being sent to the AAAs, as well as a sample recruiting information packet explaining the study and encouraging the AAAs to participate.


If you have any questions about the AAAs’ participation in the survey, please call ACL/AoA at 1-888-204-0271 or Westat at 1-888-204-0046. Thank you for supporting this important effort.

Sincerely,

Robert Hornyak

Director, Office of Performance and Evaluation

Center for Disability and Aging Policy


cc: <<ACL Regional Administrator>>, Regional Administrator

Appendix I


Sample Six-Month Reminder Card






<DATE>


Dear <NAME OF RESPONDENT>:


Thank you very much for participating in the telephone interview of the National Survey of Older Americans Act Participants. As you may remember, Westat will be conducting the survey again in six months. At that time, an interviewer will call and ask you to participate in a similar interview.


We urge you to continue your participation in this important study about the services that you receive from the <NAME OF AGENCY>. We need as many participants to remain in the study as possible. This will ensure that the results of the survey best represent those who receive services.


If you are no longer receiving services, or if your telephone number has changed, please complete the enclosed postcard and return it to Westat.


If you have any questions about the study, please call <NAME OF CONTACT PERSON> at <TOLL-FREE NUMBER>. She will be glad to answer any questions.


Again, thank you very much for your participation in this important study.


Yours truly,





NAME OF PARTICIPANT___________________________________________.



  1. My phone number has changed; please call me at ________________________.

[Please insert new phone number]

OR


  2. I no longer receive services from the AAA.


I stopped receiving services on _________________ for the following reasons:

[Please insert the date]


___________________________________

[Please insert reason]














Appendix J


Script for Locating Respondents after Baseline
















SCRIPT FOR OBTAINING INFORMATION

FROM NEXT OF KIN OR CONTACT PERSON




Hello, my name is __________________________ and I am calling from Westat, a research firm located in Rockville, MD. Your relative [NAME OF RESPONDENT] is a participant in a study we conduct for the U.S. Administration on Aging, and he/she gave us your name and number in case we have difficulty contacting him/her. Recently, we tried to contact [NAME OF RESPONDENT] and were unable to because [RING NO ANSWER, PHONE DISCONNECTED, ETC.]. We were wondering if you would be able to tell us the whereabouts of your [RELATIVE/FRIEND].


1. Do you know if the respondent has a new phone number?


YES 1

NO 2

DON’T KNOW -7 GO TO Q2

REFUSED -8


1a. If “yes,” please insert the new number:


( |___|___|___| ) |___|___|___| - |___|___|___|___|



2. Do you know if the respondent is still receiving services?


YES 1

NO 2

DON’T KNOW -7

REFUSED -8


2a. If the respondent is no longer receiving services, do you know the reason why?


YES 1 GO TO Q2aa

NO 2 Thank you for your

DON’T KNOW -7 time. END

REFUSED -8 INTERVIEW


2aa. Is the reason that he/she...


Moved to another city or state? 1

Moved to an assisted living facility? 2

Moved in with a relative? 3

Deceased? 4

OTHER? 91

(SPECIFY)

DON’T KNOW -7

REFUSED -8



2b. If the respondent is no longer receiving services, do you know the date he/she stopped receiving services?


YES 1 GO TO Q2bb

NO 2 Thank you for your

DON’T KNOW -7 time. END

REFUSED -8 INTERVIEW


2bb. What is the date?


|___|___| / |___|___| / |___|___|












1 State units also receive a letter with a list of AAAs selected in the state (see Appendix H).

2 This percent range refers to the client response patterns that may occur; for example, in a yes/no question, it refers to the expected percent of respondents who will answer yes, versus no.

3 For longitudinal analysis, where the same individuals are interviewed repeatedly, the estimates of precision can be smaller than what is shown in Table B-2 because responses of an individual are likely to be positively correlated, which could reduce the standard error of the difference.

4 National Alliance for Caregiving and AARP (2004, April). Caregiving in the U.S., Appendix C, pp. 16-17. Retrieved from the AARP Web site: http://assets.aarp.org/rgcenter/il/us_caregiving.pdf

