Fourth National Study of OAA Title III Service Recipients

OMB: 0985-0023



Application for New Data Collection:
Supporting Statement for Fourth National Study of OAA Title III Service Recipients



May 24, 2007

Submitted to:


Administration on Aging

1 Massachusetts Avenue, NW

Washington, DC 20001

Submitted by:


WESTAT

1650 Research Boulevard

Rockville, Maryland 20850-3195

(301) 251-1500




PAPERWORK REDUCTION ACT SUBMISSION

Please read the instructions before completing this form. For additional forms or assistance in completing this form, contact your agency's Paperwork Clearance Officer. Send two copies of this form, the collection instrument to be reviewed, the Supporting Statement, and any additional documentation to: Office of Information and Regulatory Affairs, Office of Management and Budget, Docket Library, Room 10102, 725 17th Street, NW, Washington, DC 20503.

1. AGENCY/SUBAGENCY ORIGINATING REQUEST

Department of Health and Human Services

2. OMB CONTROL NUMBER

a. 0985-___________

b. NONE

3. TYPE OF INFORMATION COLLECTION (X one)

(For b. – f., note item A2 of Supporting Statement instructions)


X a. NEW COLLECTION

b. REVISION OF A CURRENTLY APPROVED COLLECTION

c. EXTENSION OF A CURRENTLY APPROVED COLLECTION

d. REINSTATEMENT, WITHOUT CHANGE, OF A PREVIOUSLY APPROVED COLLECTION FOR WHICH APPROVAL HAS EXPIRED

e. REINSTATEMENT, WITH CHANGE, OF A PREVIOUSLY APPROVED COLLECTION FOR WHICH APPROVAL HAS EXPIRED

f. EXISTING COLLECTION IN USE WITHOUT AN OMB CONTROL NUMBER

4. TYPE OF REVIEW REQUESTED (X one)


a. REGULAR SUBMISSION

b. EMERGENCY - APPROVAL REQUESTED BY: ___/___/___

c. DELEGATED

5. SMALL ENTITIES

Will this information collection have a significant economic impact on a substantial number of small entities?

YES NO

6. REQUESTED EXPIRATION DATE

a. THREE YEARS FROM APPROVAL DATE

b. OTHER 18 months from approval

7. TITLE

Fourth National Study of OAA Title III Service Recipients

8. AGENCY FORM NUMBER(S) (if applicable)

Not Applicable

9. KEYWORDS

Older Americans Act, elderly, transportation, disability

10. ABSTRACT

The Administration on Aging (AoA) has undertaken an effort to develop a core set of performance measures for state and community programs on aging operating under the Older Americans Act (OAA). Entitled the Performance Outcomes Measures Project (POMP), this initiative helps State and Area Agencies on Aging address their own planning and reporting requirements, while assisting AoA to meet the accountability provisions of the Program Assessment Rating Tool (PART) and the Government Performance and Results Act (GPRA). AoA has funded a nationwide survey of OAA participants using survey modules developed in the sixth round of the project (POMP VI).

11. AFFECTED PUBLIC (Mark primary with "P" and all others that apply with "X")

P a. INDIVIDUALS OR HOUSEHOLDS

b. BUSINESSES OR OTHER FOR-PROFIT

c. NOT-FOR-PROFIT INSTITUTIONS

d. FARMS

e. FEDERAL GOVERNMENT

X f. STATE, LOCAL OR TRIBAL GOVERNMENT

12. OBLIGATION TO RESPOND (X one)

X a. VOLUNTARY

b. REQUIRED TO OBTAIN OR RETAIN BENEFITS

c. MANDATORY

13. ANNUAL REPORTING AND RECORDKEEPING HOUR BURDEN

a. NUMBER OF RESPONDENTS: 6,250

b. TOTAL ANNUAL RESPONSES: 6,250

(1) Percentage of these responses collected electronically: 2%

c. TOTAL ANNUAL HOURS REQUESTED: 4,000

d. CURRENT OMB INVENTORY: 0

e. DIFFERENCE (+/-): +4,000

f. EXPLANATION OF DIFFERENCE

(1) Program change (+/-): +4,000

(2) Adjustment (+/-): N/A

14. ANNUALIZED COST TO RESPONDENTS (in thousands of dollars)

a. TOTAL ANNUALIZED CAPITAL/STARTUP COSTS: 0.00

b. TOTAL ANNUAL COSTS (O&M): 0.00

c. TOTAL ANNUALIZED COST REQUESTED: 0.00

d. CURRENT OMB INVENTORY: 0

e. DIFFERENCE (+/-): 0

f. EXPLANATION OF DIFFERENCE

(1) Program change (+/-):

(2) Adjustment (+/-):
15. PURPOSE OF INFORMATION COLLECTION (Mark primary with "P" and all others that apply with "X")

a. APPLICATION FOR BENEFITS

X b. PROGRAM EVALUATION

c. GENERAL PURPOSE STATISTICS

d. AUDIT

P e. PROGRAM PLANNING OR MANAGEMENT

f. RESEARCH

g. REGULATORY OR COMPLIANCE

16. FREQUENCY OF RECORDKEEPING OR REPORTING (X all that apply)

a. RECORDKEEPING

b. THIRD PARTY DISCLOSURE

c. REPORTING

(1) On occasion

(2) Weekly

(3) Monthly

(4) Quarterly

(5) Semi-annually

(6) Annually

(7) Biennially

X (8) Other (Describe): one-time

17. STATISTICAL METHODS

18. AGENCY CONTACT (Person who can best answer questions regarding the content of this submission)

Does this information collection employ statistical methods?

X YES     NO

a. NAME (Last, First, Middle Initial)


Cook, Valerie D.

b. TELEPHONE NUMBER (Include area code)


202-357-3583

OMB-83-I


OMB CONTROL NUMBER


TITLE

Fourth National Study of OAA Title III Service Recipients

19. CERTIFICATION FOR PAPERWORK REDUCTION ACT SUBMISSIONS

a. PROGRAM OFFICIAL CERTIFICATION

(1) Signature

(2) Date


On behalf of this federal agency, I certify that the collection of information encompassed by this request complies with 5 CFR 1320.9.


NOTE: The text of 5 CFR 1320.9, and the related provisions of 5 CFR 1320.8 (b)(3), appear at the end of the instructions. The certification is to be made with reference to those regulatory provisions as set forth in the instructions.


The following is a summary of the topics, regarding the proposed collection of information, that the certification covers:


  1. It is necessary for the proper performance of agency functions;

  2. It avoids unnecessary duplication;

  3. It reduces burden on small entities;

  4. It uses plain, coherent, and unambiguous terminology that is understandable to respondents;

  5. Its implementation will be consistent and compatible with current reporting and recordkeeping practices;

  6. It indicates the retention periods for recordkeeping requirements;

  7. It informs respondents of the information called for under 5 CFR 1320.8(b)(3):

     a. Why the information is being collected;

     b. Use of information;

     c. Burden estimate;

     d. Nature of response (voluntary, required for a benefit, or mandatory);

     e. Need to display currently valid OMB control number;

  8. It was developed by an office that has planned and allocated resources for the efficient and effective management and use of information to be collected (see note in Item 19 of the instructions);

  9. If applicable, it uses effective and efficient statistical survey methodology; and

  10. It makes appropriate use of information technology.


If you are unable to certify compliance with any of these provisions, identify the item below and explain the reason in Item 18 of the Supporting Statement.














b. SENIOR OFFICIAL OR DESIGNEE CERTIFICATION


(1) Signature


(2) Date

OMB FORM 83-I (BACK), 10/95

Table of Contents

Chapter Page


A JUSTIFICATION 1


A.1 Need for Information 1

A.2 Use of the Information 3

A.3 Use of Information Technology to Reduce Burden 4

A.4 Efforts to Identify and Avoid Duplication and Ease
Respondent Burden 4

A.5 Minimizing Burden on Small Entities 5

A.6 Consequences of Less Frequent Data Collection 5

A.7 Special Circumstances 5

A.8 Federal Register Notice and Consultation Outside the Agency 5

A.9 Remuneration 5

A.10 Assurances of Confidentiality 6

A.11 Sensitive Information 6

A.12 Hour Burden of Information Collection to Respondents 6

A.13 Provide an Estimate of the Total Annual Cost Burden to
Respondents or Recordkeepers Resulting From the Collection
of Information 7

A.14 Annual Cost to the Federal Government 7

A.15 Program Changes 8

A.16 Plans for Publishing 8

A.17 Display of Expiration Date of OMB Approval 8

A.18 Exception for Item 19, "Certification for Paperwork Reduction
Act Submissions," of OMB Form 83-I 8


B Collection of Information Employing
Statistical Methods 9


B.1 Respondent Universe and Sampling Methods 9

B.2 Procedures for the Collection of Information 12


B.2.1 Introduction 12

B.2.2 Data Collection Procedures 12

B.2.3 Sampling Plan 12

B.2.4 Older Americans Act Participant Survey Instruments 16


B.3 Eliciting Cooperation/Maximizing Response Rates 18

B.4 Tests of Data Collection Instruments and Procedures 21

B.5 Use of Statistical Survey Methodology 21



Table of Contents (continued)

List of Appendixes

Appendix Page


A Federal Register Notice A-1


B Pertinent Legislation B-1


C Constructing Performance Measures C-1


D Westat Assurance of Confidentiality Agreement D-1


E Prenotification Letter and Agency Sampling Instructions E-1


F Question-by-Question Comparison: First, Second, and Third National Surveys of OAA Title III Service Recipients Survey Instruments F-1


G Research Triangle Institute, Evaluation of Title III-B Supportive Services Regression Analysis G-1


H 2006 Fourth National Study of OAA Title III Service Recipients Survey Instruments H-1



List of Tables


Table Page


B-1 Half-widths of 95 percent confidence intervals by various sample sizes and estimates of target characteristics (computed for a two-stage design with a design effect of 1.30) 14


B-2 Half-widths of 95 percent confidence intervals for the difference between two estimates by various sample sizes and for various averages of the two estimates (computed for a two-stage design with a design effect of 1.30) 15



List of Exhibits


Exhibit Page


A-1 Estimated Hour and Annual Cost Response Burden 7


A-2 Overall Cost to the Federal Government 7


A-3 Data Collection Timetable 8


B-1 Respondent universe 11

A. JUSTIFICATION



A.1 Need for Information


The Administration on Aging (AoA) is continuing its strategy of program improvement through enhanced program performance measurement by proposing to conduct further studies of program outcomes. This strategy responds to the Office of Management and Budget's (OMB's) program reviews employing the Program Assessment Rating Tool (PART), the Government Performance and Results Act (GPRA), and Section 202(f) of the Older Americans Act (OAA). Two pilot studies of OAA Title III service recipients have been conducted under OMB control numbers 0985-0014 and 0985-0017, and the Third National Study of Title III Service Recipients was conducted in early 2005 (0985-0020). The studies have been successful, enabling AoA to establish baselines and performance targets for the annual and long-term outcome measures required by PART and GPRA, and to incorporate new performance information in agency budget justifications and performance plans for FY 2005 and FY 2006. Further, the studies demonstrated that services provided under Title III:


  • Are effectively targeted to vulnerable populations;

  • Are provided to individuals who need the services;

  • Are highly rated by recipients (quality); and

  • Provide assistance that is instrumental in enabling recipients to maintain their independence.

In addition, the studies enabled AoA to improve the PART assessment score for the Aging Services Program. AoA is confident that the score will increase further as additional performance information is obtained.


With this submission, we are requesting OMB approval to conduct a fourth national study.


Background


Performance Measurement Requirements


GPRA (P.L.103-62, http://thomas.loc.gov) and, more recently, PART (www.whitehouse.gov/omb/budget/fy2005/part.html) require Federal agencies to develop annual and long-term performance outcome measures and to report on these measures annually. Section 202(f) of the OAA (www.aoa.gov/about/legbudg/oaa/legbudg_oaa.asp) requires AoA to work collaboratively with States and Area agencies to develop performance outcome measures.



Title III of the OAA establishes a home and community-based care program for the elderly and their caregivers, to enable elderly persons to live as independently as possible for as long as possible. AoA has chosen to work toward improved program performance throughout the Aging Services Network by working collaboratively with States and Area Agencies on Aging (AAAs) to develop performance outcome measurement tools. The tools identify elements of service quality so that States and AAAs can improve service systems at the local level. These same tools can also be employed by AoA to measure program performance at the national level.


Performance Outcomes Measures Project (POMP)


For the past seven years, AoA has sponsored the Performance Outcomes Measures Project (POMP) demonstration, under which grants are awarded to States, which then work collaboratively to develop survey instruments that measure elements of service quality and consumer-reported outcomes for various services provided under the OAA. Surveys have been developed for the following topics:


Service Domains:

  • Nutrition (including congregate and home-delivered meals)

  • Transportation

  • Information and Assistance

  • Homemaker/Housekeeper

  • Personal Care

  • Caregiver Support

  • Case Management

  • Senior Centers

  • Service Providers


Client Characteristics:

  • Physical Functioning

  • Demographics

  • Emotional Well-Being

  • Social Functioning


POMP is demonstrating the ability of States and AAAs to apply statistically sound sampling techniques to obtain numeric measures of program performance.


The survey instruments developed under POMP, along with various tools necessary for implementation, can be found at www.gpra.net. These performance measurement surveys have enabled some local agencies to obtain additional financial support and improve program management. Examples of uses of performance measurement at the State and local level follow:


  • The Hawkeye Valley Area Agency on Aging in Waterloo, Iowa compiled information on the level of client support and satisfaction with services and received additional funding from the United Way for exemplary programs.

  • The Area Agency on Aging in Cincinnati, Ohio expanded the use of Home Care Client Satisfaction Measure (HCSM) and incorporated it into an ongoing part of its case management process for all clients to improve service quality.

  • The Florida Department of Elder Affairs developed a computer simulation model that demonstrated the impact of home care programs on reducing nursing home admissions and showed the savings in Medicaid funds.


The POMP VI grantees were tasked with developing a recipient survey that could be asked of all Title III Service Recipients. The recipient survey instruments for the fourth national study are based on the recipient survey developed by POMP VI grantees. The caregiver survey instrument is based on the instrument developed by the POMP V work group, and used for the Third National Survey of OAA Title III Service Recipients.


First, Second and Third National Surveys

The first national survey provided AoA with performance outcome information that was used extensively in the FY 2005 GPRA plan and the PART reassessment conducted during the FY 2005 budget formulation process. The information was incorporated into the AoA budget justification and used by the Assistant Secretary in testimony. Highlights of the findings are at www.gpra.net.


Information from the second national survey was incorporated into the AoA FY 2006 performance budget and GPRA plan. The information is used in testimony by the Assistant Secretary and other agency officials and is included in the annual report to Congress. The data is also used by AoA evaluation contractors as a secondary data source; see Appendix G for information on the regression analysis performed by the Research Triangle Institute in the Evaluation of Title III-B Supportive Services. In addition, the results have been provided to AoA's POMP grantees so that they can benchmark their own performance measurement data. Highlights of the findings are also in Appendix G.


The Fourth National Study of OAA Title III Service Recipients

The Fourth National Study of OAA Title III Service Recipients will be conducted using modified versions of survey instruments developed by POMP VI grantees. OMB requested that a survey be conducted that would allow AoA to give OMB an overall assessment of the services provided through the Older Americans Act. The Fourth National Study of OAA Title III Service Recipients will include modules on:


  • Nutrition programs (both Home-delivered Meals and Congregate Meals);

  • Transportation;

  • Homemaker

  • Case Management; and

  • National Family Caregiver Support Program.


This study is being conducted to give an overview of the outcomes for clients based on the mix of services they receive. Sufficient sample sizes are being drawn by service to allow within-service analysis as well (see Section B.2.3, Sampling Plan). We will employ the same basic sampling strategy that was used successfully in the first two pilot studies and the Third National Study of Title III Service Recipients: a random sample of AAAs, and then a random sample of service recipients within the sampled AAAs. We will conduct telephone surveys of OAA service recipients, as we did in the first three surveys.



A.2 Use of the Information

The results of this information collection will be used as follows:


  • Report on FY 2007 performance targets as required by GPRA and PART.

  • Inform the development of new performance measures related to vulnerable population subgroups.

  • Provide refined national benchmarks for use by States and AAAs.

  • Continue to explore the feasibility of substituting survey reporting for some of the NAPIS reporting requirements.

  • Provide secondary data for analysis in various Title III program evaluations.

  • Provide performance information for key demographic subgroups, geographical sub-regions, and different types of AAAs, which will enable AoA to identify variations in performance and examine the need for additional targeted technical assistance.

  • Provide overall assessment of services per OMB’s request.


In addition, the data will be analyzed in conjunction with data from the Area Agency on Aging Survey (OMB approval number 0985-0021). The Area Agency survey will compile basic descriptive characteristics of the AAAs and information from the service recipient study will be categorized according to key descriptors (e.g. caseload size) and posted on the POMP website. AAAs can then compare their performance to the performance of a group of AAAs with similar characteristics.


The data will be used by the Assistant Secretary on Aging in testimony and presentations; it will be incorporated into the agency's Annual Report; and it will be used by program staff to identify areas that may need attention at the national level. For example, the AoA nutritionist is interested in examining nutritional intake information by key population subgroups to identify potential areas for technical assistance initiatives.


A.3 Use of Information Technology to Reduce Burden

No technology per se will limit respondent burden for the contacts with the agencies, although agencies will have the option to send their lists of selected clients to Westat via email, which many of the AAAs have done for the past three surveys. The proposed materials requesting information from the agencies have also been designed to minimize respondent burden. For instance, we are using computer technology to maintain survey contact information and personalized letters to both agencies and participants that introduce the study to respondents.


The service recipient survey will be administered using Computer-Assisted Telephone Interviewing (CATI). CATI reduces respondent burden by routing the interviewer to the next applicable question, thereby ensuring that the respondent is on the telephone only the minimum time necessary to complete the survey. Westat will also maintain a toll-free telephone line to answer respondents' questions during data collection and agency questions during respondent selection.
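
As an illustration of this routing logic, the minimal sketch below (in Python, with hypothetical module and question identifiers that are not drawn from the actual Westat CATI specification) shows how prior answers determine the next applicable question:

    # Minimal sketch of CATI skip logic. Module and question identifiers
    # are hypothetical; the actual CATI specification is Westat's.
    def next_question(answers):
        """Route to the next applicable question based on prior answers."""
        if answers.get("receives_home_delivered_meals") == "yes":
            return "HDM1"  # enter the Home-delivered Meals module
        if answers.get("receives_congregate_meals") == "yes":
            return "CM1"   # enter the Congregate Meals module
        return "DEMO1"     # otherwise skip ahead to demographics

    # A respondent who receives neither meal service goes straight to
    # demographics, so no time is spent on inapplicable questions.
    print(next_question({"receives_home_delivered_meals": "no",
                         "receives_congregate_meals": "no"}))  # -> DEMO1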



A.4 Efforts to Identify and Avoid Duplication and Ease Respondent Burden

Every effort is being made to avoid duplication and minimize respondent burden. Over the last three years, Westat conducted the First and Second National Surveys (pilot studies) of OAA Participants and the Third National Study of OAA Title III Service Recipients. As a result of the information gathered, modifications have been made to the data collection procedures and to the survey instruments. We feel we have reduced agency and respondent burden to the minimum level possible to achieve the study's objectives.


In addition, AoA is careful not to duplicate data collection efforts between performance measurement and evaluation efforts. The National Survey data has been used as a secondary data source in a current evaluation effort. When future evaluation efforts require independent data collection, that data collection will be coordinated with the performance measurement surveys.


A.5 Minimizing Burden on Small Entities


The data sources affected by the study covered by this request for review will be agencies of state and local government, public purpose, quasi-governmental agencies, and clients who are private citizens. We have designed the sample to minimize the burden on both the agencies and the client respondents.


A.6 Consequences of Less Frequent Data Collection

This proposed fourth national study will be conducted one time. If we are unable to conduct this survey we will be unable to move forward with our program performance measurement strategy as required by the OAA. We will be unable to pursue outcome measurement for the AoA Strategic Action Plan and our PART assessments will be compromised, especially since this survey format was developed to allow an overall assessment of Title III, while also continuing to allow assessment of individual services offered by Title III. We will be unable to report our FY 2007 consumer assessment performance measures and targets for GPRA and PART reporting purposes. The proposed sample sizes will allow us to analyze results by subgroup (e.g. region, age, race) and, therefore, allow better targeting of services.


A.7 Special Circumstances

The data collection effort will be conducted according to the guidelines specified in 5 CFR 1320.6. No special circumstances are known that would cause inconsistency with these guidelines.


A.8 Federal Register Notice and Consultation Outside the Agency

The initial notice requesting comments appeared on pages 43871-43872 of the July 29, 2005 edition, Volume 70, Number 145 of the Federal Register. A copy of the notice is attached in Appendix A. There were no public comments.


The survey instruments for this proposed information collection are based on those developed by AoA grantees at the state and local level during the POMP demonstration. The development of the survey instruments has been an iterative process. There were no areas of disagreement during the latest POMP revisions.


A.9 Remuneration

Not applicable.


A.10 Assurances of Confidentiality

Anonymity is an important part of the study design, and Westat will ensure the anonymity of all individuals who provide data. A pledge of anonymity is a major positive incentive for potential respondents to participate in the study; its absence would be a significant deterrent and could create complications in implementing the study. Westat will take the following precautions to ensure the anonymity of all data collected:


  • All Westat staff, including analysts, coders, editors, and keypunchers, will be instructed in the confidentiality requirements of the study and will sign statements affirming their obligation to maintain confidentiality;

  • Information will be reviewed and data will be cleaned only by Westat staff;

  • Data files that are delivered will contain no personal identifiers for program participants; and

  • Analysis and publication of study findings for the participant survey will be in terms of aggregated statistics only.

Appendix D presents the confidentiality agreement all Westat staff must sign. This agreement requires the signer to keep confidential any and all information about individual respondents to which they may gain access. Any Westat employee who violates this agreement is subject to dismissal and to possible civil and criminal penalties.


A.11 Sensitive Information

The physical functioning module does contain questions on the ability of respondents to perform certain tasks, such as getting around inside and outside the home, getting in and out of a chair or bed, and getting to and using the toilet, as well as questions about their health conditions. These types of questions might be considered sensitive; however, we have never had a respondent object to answering them, especially when they are part of a battery of physical functioning questions. In addition, analysis of Activities of Daily Living (ADL) and Instrumental Activities of Daily Living (IADL) limitations in conjunction with outcomes and the type of services a respondent receives is an important outcome measure. This kind of information, along with responses to questions on health conditions, can tell us about the frailty of the respondents served by the nutrition (home-delivered and congregate meals), home care (homemaker/housekeeping), case management, and transportation services, and about the people who are able to maintain their independence, rather than enter nursing homes, because of those services. Caregivers will be asked about the health conditions and ADL and IADL limitations of their care recipients, again to allow for analysis of the frailty of care recipients whose caregivers are part of the National Family Caregiver Support Program. Respondents can always refuse to answer any question, and the interviewer will move on to the next question on the survey instrument.


A.12 Hour Burden of Information Collection to Respondents

We estimated the respondent burden for the survey instruments based on the First and Second National Surveys of OAA Participants and the Third National Study of OAA Title III Service Recipients. The cost to respondents who participate in the study will be in terms of their time only. The Service Recipient survey instrument takes about half an hour, and the Caregiver survey will also take about 30 minutes. Based on the valuation of a participant's time at $20 per hour, the respondent burden will be $10 per participant for both the Service Recipient and Caregiver surveys, and $80 per agency for the calls to the agencies (estimated at 4 hours of agency personnel time). Exhibit A-1 presents the estimated hour and annual cost response burden by respondent.


Exhibit A-1. Estimated Hour and Annual Cost Response Burden


Data collection activity | Number of respondents | Responses per respondent | Hours per response | Annual burden hours | Annual burden (cost)
Agency respondent selection process | 250 | 1 | 4.0 | 1,000 | $20,000
National survey using questions from the Performance Outcome Measures Project: Service Recipient Survey | 5,000 | 1 | 0.50 | 2,500 | $50,000
National survey using questions from the Performance Outcome Measures Project: National Family Caregiver Support Program Clients | 1,000 | 1 | 0.50 | 500 | $10,000
Total | 6,250 | 1 | 5.00 | 4,000 | $80,000

* It is important to note that not all of the respondents (6,000 for the national survey) will be asked to complete all of the questionnaire modules (see Sampling Plan).
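
The burden totals in Exhibit A-1 can be verified arithmetically. The following sketch (Python) recomputes them from the row figures shown in the exhibit:

    # Recompute the Exhibit A-1 totals from its row figures.
    rows = [
        # (activity, respondents, responses per respondent, hours per response)
        ("Agency respondent selection", 250, 1, 4.0),
        ("Service Recipient Survey", 5000, 1, 0.5),
        ("Caregiver Survey", 1000, 1, 0.5),
    ]
    WAGE = 20.0  # valuation of respondent and agency time, dollars per hour

    total_respondents = sum(n for _, n, _, _ in rows)
    total_hours = sum(n * resp * hrs for _, n, resp, hrs in rows)
    total_cost = total_hours * WAGE

    print(total_respondents, total_hours, total_cost)
    # -> 6250 4000.0 80000.0, matching the 6,250 respondents, 4,000 annual
    #    burden hours, and $80,000 annual cost shown in Exhibit A-1.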

A.13 Provide an Estimate of the Total Annual Cost Burden to Respondents or Recordkeepers Resulting From the Collection of Information


Excluding the value of agency and respondent time shown above, the total annual cost burden is zero (see Exhibit A-1).


A.14 Annual Cost to the Federal Government

The overall cost of this information collection to the Federal Government is presented in Exhibit A-2.


Exhibit A-2. Overall Cost to the Federal Government


Category | Costs
Personnel (plus consultants) | $259,572
Local travel | $200
Telephone (long-distance telephone survey) | $39,980
Other direct | $36,771
Total direct charges (per contract) | $437,889
Total | $774,412


A.15 Program Changes

This is a new collection of information.


A.16 Plans for Data Analysis/Publishing



AoA will post the results of this survey on the web and incorporate the results into numerous published documents such as the AoA Performance Budget and the Annual Report to Congress. Data analysis for performance measurement indicators incorporated into these reports will consist mainly of simple frequency tables and cross tabulations. When the data is used in evaluation work, some regression analysis may be employed.


The timetable for the Fourth National Study of OAA Title III Service Recipients is shown in Exhibit A-3.


Exhibit A-3. Data Collection Timetable


Data collection activity | End date
Telephone contact with agencies to draw sample | 3 months after OMB clearance
Telephone survey of participants | 7 months after OMB clearance
Data editing, coding and key entry, data analysis | 10 months after OMB clearance
Final report | 15 months after OMB clearance


A.17 Display of Expiration Date of OMB Approval

The Administration on Aging is not seeking an exemption from displaying the expiration date of OMB approval.


A.18 Exception for Item 19, "Certification for Paperwork Reduction Act Submissions," of OMB Form 83-I

AoA is not requesting any exceptions from OMB Form 83-I.

B. Collection of Information Employing Statistical Methods

B.1 Respondent Universe and Sampling Methods

The respondent universe will vary depending on the data collection instrument (see Exhibit B-1). For the telephone contact with the State and Area Agencies on Aging, the research team will collect information from a probability sample of agencies selected with probability proportional to size (PPS), where size is measured by total annual budget. The service recipients within each sampled AAA will be selected at random. In this way, all service recipients will have a known probability of selection for the sample.
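
As an illustration of PPS selection, the sketch below (Python) draws agencies systematically with probability proportional to a size measure; the agency names and budgets are hypothetical, and the actual selection software is Westat's:

    import random

    # Hypothetical AAA budgets; the study's measure of size is each
    # agency's total annual budget.
    budgets = {"AAA-1": 4.0e6, "AAA-2": 1.5e6, "AAA-3": 9.0e6, "AAA-4": 2.5e6}

    def pps_systematic(sizes, n):
        """Systematic PPS: each unit's selection probability is
        proportional to its size measure."""
        items = list(sizes.items())
        total = sum(size for _, size in items)
        step = total / n
        start = random.uniform(0, step)
        picks, cum, i = [], 0.0, 0
        for threshold in (start + k * step for k in range(n)):
            while cum + items[i][1] <= threshold:
                cum += items[i][1]
                i += 1
            picks.append(items[i][0])
        return picks

    # AAA-3, with the largest budget, is the most likely to be drawn.
    print(pps_systematic(budgets, 2))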


This is the fourth time this type of study will be conducted. The earlier OMB-approved surveys (0985-0014, 0985-0017, 0985-0020) were conducted in 2002-2003, 2004, and 2005. Before Westat contacted the AAAs, a letter was sent explaining the survey, with the materials needed to develop lists of participants. (A copy of these materials was also sent to each State Unit on Aging that had AAAs in the study.) Once each agency had a participant list for each service, it contacted Westat for the random selection of the respondents to be interviewed. This process was completed using a program Westat developed: the number of participants on the service list was entered into the program, and the program returned the line numbers of the respondents it selected. The number of respondents to select per service was already entered into the program and was based on the size of the agency. The agencies then provided the participant names and telephone numbers associated with those line numbers to Westat. Westat will use the same procedure to select respondents for the Fourth National Study of Title III Service Recipients.
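
The sketch below (Python) illustrates the kind of line-number selection described above; the list length, the number of respondents to select, and the function itself are hypothetical stand-ins for Westat's actual program:

    import random

    def select_lines(list_length, n_to_select, seed=None):
        """Return sorted line numbers identifying the clients to be
        interviewed from a numbered service list."""
        rng = random.Random(seed)
        return sorted(rng.sample(range(1, list_length + 1), n_to_select))

    # Example: an agency reports a 412-client congregate meals list, and
    # 20 respondents are to be selected for an agency of that size (both
    # numbers are illustrative).
    print(select_lines(412, 20))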


The research team knows that most of the AAAs will not have a complete, unduplicated list of service recipients for their agencies. The research team therefore proposes using the same methodology for selecting clients as in the previous three surveys: asking AAAs to develop lists of clients (by service) who have received that service within the past 12 months, and then randomly selecting respondents for the survey from those lists. This will help reduce burden: agencies that have previously participated in the survey would otherwise need to learn a new way of preparing the list for selection of respondents, and all selected AAAs would otherwise need to develop an unduplicated list across services. Lists would be developed for the following services: home-delivered meals, congregate meals, transportation, case management, and homemaker services, as well as for caregivers who are served by the National Family Caregiver Support Program. Appendix E contains the pre-notification packet the agencies will receive.


For the First National Survey, Westat recruited 132 out of 150 agencies, for a response rate of 88 percent. For the Second National Survey, Westat recruited 138 out of 165 agencies, for a response rate of 83.6 percent. For the Third National Survey, Westat recruited 272 out of 310 agencies, for a response rate of 87 percent. When selecting agencies for the fourth survey, Westat will select a sample large enough to recruit at least 250 Area Agencies on Aging and will ensure that at least 80 percent of the sampled AAAs agree to participate. Westat research assistants will encourage the participation of all selected agencies by contacting those that have not called with participant lists, coaching them on how to easily set up their lists, and assuring them that the time involved to complete the participant selection procedures will be minimal, just as Westat did for the first, second, and third surveys. Agencies that refuse to participate will be sent a refusal conversion letter (already developed for the previous three surveys) and called one more time to try to gain their cooperation. Once an agency refuses a second time, Westat will not try to contact it again. For the Evaluation of Independent Living Programs (an OMB-approved national study for the Rehabilitation Services Administration, Office of Special Education and Rehabilitative Services, U.S. Department of Education), and for the first, second, and third AoA National Surveys, Westat research assistants called the original agencies, sent emails and/or faxes, and resent recruitment packages via FedEx. Westat will use the same techniques to gain cooperation for the fourth study.


The research team anticipates an 80 percent response rate for the telephone survey of respondents, based on the success we had with the first, second and third surveys. To ensure this high response rate, each AAA will send participants who are eligible for the telephone survey a letter before they are contacted by an interviewer. The letter will be on their AAA’s letterhead, as was the precontact letter for the first three surveys. Westat will attempt to contact participants at different times of the day and different days of the week to maximize the possibility of contact. Westat is also experienced in refusal conversion procedures, generally achieving a 25 to 30 percent refusal conversion rate. For all three National Surveys of OAA Participants, the refusal conversion rate was about 40 percent.


Exhibit B-1 presents the respondent universe for each module proposed for the Fourth National Study of OAA Title III Service Recipients.


Exhibit B-1. Respondent universe


Performance measures | Indicator | Participants to be sampled

Service Recipient Survey

Congregate Meals Module | Questions on nutrition intake, nutrition risk, food security, and clients' assessments of the Congregate Meals program. | All service recipients receiving Congregate Meals services
Home-delivered Meals Module | Questions on nutrition intake, nutrition risk, food security, and clients' assessments of the Home-delivered Meals program. | All service recipients receiving Home-delivered Meals
Transportation Module | Questions on clients' experiences and assessments of transportation services. | All users of Transportation Services
Case Management Module | Questions on clients' experiences and assessments of case management services. | All service recipients receiving case management services
Homemaker/Housekeeping Module | Questions on clients' experiences and assessments of Homemaker/Housekeeping services. | All service recipients who receive Homemaker/Housekeeping Services
Additional Services List | Questions asking service recipients if they receive other OAA services. | All service recipients. Caregivers will be asked about services received by their care recipients.
Physical Functioning Module | Revised Katz Activities of Daily Living (ADL) Scale and Quality of Life measures from the CDC questionnaire [1], as well as the full SF-12 v. 2 (see Appendix H) [2]. | All service recipients. Caregivers will be asked these questions about their care recipients.

Caregiver Survey

National Family Caregiver Support Program Questionnaire | Questions on caregiver support and assessment of the program, based on the Caregiver survey developed for the first, second, and third national surveys. | Caregivers who participate in the National Family Caregiver Support Program

Service Recipient and Caregiver Surveys

Demographic Information Module | Demographic information | All service recipients and caregivers


B.2 Procedures for the Collection of Information

B.2.1 Introduction

Several data collection activities will be conducted to support the study. They are designed to inform AoA on results of performance measures for state and community programs on aging under the Older Americans Act.


B.2.2 Data Collection Procedures

B.2.2.1 Telephone Contact with State and Local Agencies on Aging

Information will be collected in a two-step process. The proposed design will employ a probability sample of all State and Local Agencies on Aging, selected with probability proportional to size (PPS) as measured by total annual budget. Once an agency is selected, it will receive a FedEx package containing an introductory letter from AoA along with detailed instructions for the AAA. Approximately two days later, a researcher will call the agency to explain the purpose of the participant telephone survey and the agency's role in it. The agencies will be asked to develop lists of clients by service: Family Caregiver Support Program, home-delivered meals, congregate meals, case management, transportation, and homemaker service recipients. The researcher will explain the participant lists the agency needs to develop (if it does not already have them) and will schedule a time to call back and select the respondents from each list. Previous experience has enabled Westat, the contractor, to streamline the data collection procedures for the AAAs.


B.2.2.2 Telephone Survey of Participants and Caregivers

This activity entails conducting a 30-minute telephone survey of a representative sample of Older Americans Act participants and caregivers. The interviews are designed to determine participant and caregiver assessment of program participation and client reported progress outcomes.


B.2.3 Sampling Plan

B.2.3.1 Sample Design

The sample design for the fourth survey will consist of two stages, with a sample of 250 AAAs in the first stage and a sample of clients, by service type, from each selected AAA, in the second stage. This design is similar to that of the third survey. The client sample sizes by service type, as specified by the AoA, are as follows:


  • Caregiver Services 1,000

  • Home Delivered Meals 1,000

  • Congregate Meals 1,000

  • Case Management Services 1,000

  • Transportation Services 1,000

  • Homemaker Services 1,000


As in the third national survey, these sample sizes will permit the production of reliable estimates both at the national level and at the geographic regional or demographic sub-group level.


For a two-stage design, Table B-1 presents the half-widths of the 95 percent confidence intervals (CI) for various sample sizes and for target characteristics of proportions ranging from 10 percent to 50 percent. [3] The 50 percent target is a worst-case scenario, where respondents are expected to be fairly evenly split on a particular response item, limiting the reliability of the estimate (e.g., trying to predict the outcome of an election where the sample of voters is about evenly divided between two candidates). Also, the precision of any estimate greater than 50 percent is the same as that of its complement; i.e., the precision of a 70 percent estimate is the same as the precision of a 30 percent estimate. The numbers in the table are half-widths of 95 percent CIs (i.e., the estimate plus or minus the half-width is the CI, where the half-width is 1.96 times the standard error (SE) of the estimate). For example, Table B-1 shows that for a sample of size 1,000 and a target characteristic of around 30 percent, the CI would be the estimate plus or minus 3.24 percent.


The table can be used to assess the adequacy of the sample sizes for both the national and the regional or sub-group level estimates. For example, if the sample size is 1,000 at the national level, then the sixth row in Table B-1 provides the precision of the estimates at the national level. From the same table, the precision of an estimate at the regional or sub-group level can be obtained by computing the sample size that is expected for a particular region. For instance, if the region covers 25 percent of the target population, then the sample size for that region is expected to be about 250 (out of 1,000) under a proportional allocation, and the precision of the estimates for that region can be checked from the row where the sample size equals 250 in Table B-1. Similarly, if a sub-group covers 10 percent of the target population, then the expected sample size for that sub-group is 100 out of 1,000, and the precision of the estimates for that sub-group can be checked from the row with sample size equal to 100.


The total size of the target population has a negligible impact on the sample size requirement. For example, if a sample size of 250 is required to produce an estimate at the national level, then to estimate the same characteristic for a particular region (with the same level of precision), the required sample size from that region alone would be about 250. If there are four regions, then the required sample size at the national level would be about 1,000 (to guarantee adequate representation in each group). Therefore, to meet the objective of the proposed study (i.e., to produce estimates at the regional or sub-group level with the same level of precision as the national estimates obtained from previous studies), the required sample size for each target region or sub-group will have to be the same as the total sample size of the previous studies.


For instance, a question was asked in the first national survey about the timeliness of the delivery of meals, and an estimated 44 percent of all clients reported that the meals arrived on time, all the time. This estimate was based on a sample of 472 clients and had a CI of plus or minus 5.2 percent. Table B-1 shows that to achieve a CI of plus or minus 5.2 percent for an estimate with a proportion between 40 percent and 50 percent, a sample of size around 480 is required. That means that if this estimate is required at the regional level with the same level of confidence as the national, then the sample size in each region will have to be 480, and hence the sample size at the national level will be 480 x 4 = 1,920. In that case, the CI for this estimate at the national level would be much more precise than for the region (a little over plus or minus 2.5 percent). Table B-1 can be used to see the precision of the estimates that would be achieved at various levels using the expected sample sizes at the respective levels. The table can also be used to check the sample size requirement corresponding to a desired level of precision of an estimate.
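
The half-widths in Table B-1 follow from the usual formula for a proportion, inflated by the design effect: half-width = 1.96 x sqrt(1.30 x p(1 - p)/n). The sketch below (Python) reproduces representative entries; the function name is ours, not from the study documentation:

    import math

    def half_width(p_percent, n, deff=1.30, z=1.96):
        """Half-width (in percentage points) of the 95 percent CI for an
        estimated proportion under a two-stage design with design effect
        deff."""
        p = p_percent / 100.0
        return 100 * z * math.sqrt(deff * p * (1 - p) / n)

    print(round(half_width(30, 1000), 2))  # -> 3.24, as in Table B-1
    print(round(half_width(44, 472), 2))   # -> 5.11, close to the plus or
                                           #    minus 5.2 reported for the
                                           #    472-client meals example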


Table B-1. Half-widths of 95 percent confidence intervals by various sample sizes and estimates of target characteristics (computed for a two-stage design with a design effect of 1.30)

Sample size | 10 percent | 20 percent | 30 percent | 40 percent | 50 percent
3,500 | 1.13 | 1.51 | 1.73 | 1.85 | 1.89
3,000 | 1.22 | 1.63 | 1.87 | 2.00 | 2.04
2,500 | 1.34 | 1.79 | 2.05 | 2.19 | 2.23
2,000 | 1.50 | 2.00 | 2.29 | 2.45 | 2.50
1,500 | 1.73 | 2.31 | 2.64 | 2.83 | 2.89
1,000 | 2.12 | 2.83 | 3.24 | 3.46 | 3.53
750 | 2.45 | 3.26 | 3.74 | 4.00 | 4.08
500 | 3.00 | 4.00 | 4.58 | 4.90 | 5.00
400 | 3.35 | 4.47 | 5.12 | 5.47 | 5.59
300 | 3.87 | 5.16 | 5.91 | 6.32 | 6.45
250 | 4.24 | 5.65 | 6.48 | 6.92 | 7.07
200 | 4.74 | 6.32 | 7.24 | 7.74 | 7.90
100 | 6.70 | 8.94 | 10.24 | 10.95 | 11.17


It is important to note that if the population sizes in the sub-groups or regions vary widely, then the national sample must be allocated appropriately to produce estimates from all individual sub-groups/regions with an equal level of precision. Otherwise, under a proportionate allocation, larger sub-groups will have more than the required sample size while the smaller sub-groups will have less than the sample size required. For example, if the estimates are required separately for Whites and African-Americans, then just increasing the national sample would not ensure sufficient sample size for African-Americans, because less than 15 percent of recipients are African-Americans for many services. In this situation, the national sample can be disproportionately allocated by over-sampling smaller sub-groups to ensure that sufficient samples are drawn from all target sub-groups. However, over-sampling an ethnic or demographic group will require that agencies first list all their clients with the characteristic of interest and then select a sample from this list by sub-group (which may exceed the capacity of many AAA information systems).


B.2.3.2 Sample Size for Estimation of Change

If there is interest in comparing estimates from one year with another year, or comparing estimates of one sub-group with another sub-group, the sample size requirements are different from those that show individual point estimates at the same level of precision. The standard error (SE) of the difference between two independent estimates (for example, A and B) can be obtained by the formula SE(A - B) = sqrt(SE(A)^2 + SE(B)^2), and the half-width of the 95 percent CI is 1.96 x sqrt(SE(A)^2 + SE(B)^2). Since the variance of the difference between two estimates is the sum of the variances of the relevant individual estimates, the required sample size for estimating a difference or change is higher than for a single point estimate.


Table B-2 presents half-widths of 95 percent CIs under a two-stage design for various sample sizes and various averages of the two estimates to be compared. For example, if the average of the two target characteristics to be compared is around 30 percent (for example, A = 25 and B = 35) and the sample size in each sub-group is 500, then to detect a difference between the two sub-groups with statistical significance, the actual difference between the two sub-group characteristics will have to be at least 6.48 percent. This is much higher than the corresponding half-widths presented in Table B-1 for each of the individual estimates. That means a sample size that is sufficient to produce a reliable point estimate for each sub-group individually is not necessarily sufficient to detect the difference between the two sub-groups with the same level of precision.


Therefore, if the survey is designed for use at a region or sub-group level, then the corresponding national estimates can be compared meaningfully from one year to another, or for one service versus another (e.g., the percent of each service’s clients below a certain income level). For example, if the sample size is 1,000 in each year, and if the average response proportion for the two target characteristics is around 30 percent, then a difference of 4.58 percent or more between the years is detectable. The corresponding comparison with a sub-group sample of size 500 would not allow the detection of a difference smaller than 6.48 percent. Table B-2 can be used to see the extent of difference that can be detected under a two-stage design, for various sample sizes, and for various characteristics to be compared, either at the national or at the sub-group level.
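
Because the variance of a difference is the sum of the two variances, the Table B-2 entries are simply the Table B-1 formula with the variance doubled (for equal group sizes). The sketch below (Python) reproduces the figures cited above; the function name is ours, not from the study documentation:

    import math

    def diff_half_width(p_avg_percent, n_per_group, deff=1.30, z=1.96):
        """Half-width (percentage points) of the 95 percent CI for the
        difference between two independent estimates whose average is
        p_avg_percent, each based on n_per_group respondents."""
        p = p_avg_percent / 100.0
        var_each = deff * p * (1 - p) / n_per_group
        return 100 * z * math.sqrt(2 * var_each)

    print(round(diff_half_width(30, 500), 2))   # -> 6.48, as in Table B-2
    print(round(diff_half_width(30, 1000), 2))  # -> 4.58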


Table B-2. Half-widths of 95 percent confidence intervals for the difference between two estimates by various sample sizes and for various averages of the two estimates (computed for a two-stage design with a design effect of 1.30)

Sample size in each group | 10 percent | 20 percent | 30 percent | 40 percent | 50 percent
3,500 | 1.60 | 2.14 | 2.45 | 2.62 | 2.67
3,000 | 1.73 | 2.31 | 2.64 | 2.83 | 2.89
2,500 | 1.90 | 2.53 | 2.90 | 3.10 | 3.16
2,000 | 2.12 | 2.83 | 3.24 | 3.46 | 3.53
1,500 | 2.45 | 3.26 | 3.74 | 4.00 | 4.08
1,000 | 3.00 | 4.00 | 4.58 | 4.90 | 5.00
750 | 3.46 | 4.62 | 5.29 | 5.65 | 5.77
500 | 4.24 | 5.65 | 6.48 | 6.92 | 7.07
400 | 4.74 | 6.32 | 7.24 | 7.74 | 7.90
300 | 5.47 | 7.30 | 8.36 | 8.94 | 9.12
250 | 6.00 | 8.00 | 9.16 | 9.79 | 9.99
200 | 6.70 | 8.94 | 10.24 | 10.95 | 11.17
100 | 9.48 | 12.64 | 14.48 | 15.48 | 15.80



B.2.3.3 Sample Survey Operations

Westat will work with the States and the 250 selected AAAs to draw a random sample of OAA service recipients for the six service areas being studied: National Family Caregiver Support Program, Home-delivered Meals, Congregate Meals, Case Management Services, Transportation Services, and Homemaker Services. This work will be completed after OMB clearance. Westat will contact the selected AAAs, collect client sample contact information, and submit this material to the Telephone Research Center (TRC) for interviewing purposes. Based on the experience of the previous surveys, the AAA recruiting process will take three months to complete; however, client interviewing will be completed within seven months of the start of the sampling process, since sampling and interviewing will run concurrently.


B.2.4 Older Americans Act Participant Survey Instruments

The study consists of telephone interviews with service recipients and caregivers. The interview is structured and will contain specific questions about the mix of services the person has received and his or her assessment of those services. Wherever appropriate, questions will contain predefined response categories. Probes will be used to facilitate obtaining complete responses to all the questions. The interviews of caregivers will not include the physical functioning questions, except for questions about the health conditions and ADL and IADL limitations of their care recipients. The interviews will last approximately 30 minutes and cover the topics discussed below. Since service recipients will be selected from lists by service, respondents will only be asked about the service for which they were selected for an interview. This is the same process followed for each of the previous surveys. The questionnaires can be found in Appendix H.


  1. Nutrition-Congregate Meals--If respondents receive Congregate Meals, they will be administered a questionnaire based on the Congregate Meals survey used for the first and second national surveys, as well as POMP I through VI. This questionnaire asks how long they have been attending the congregate meals program; how often they eat at the site; when they last ate at the site; to rate the program; and about their daily food intake and how much of their food intake the meal provides on the days they eat at the site.

  2. Nutrition-Home-delivered Meals--If respondents receive Home-delivered Meals, they will be administered a questionnaire based on the Home-delivered Meals survey used for the first, second, and third national surveys, and POMP I through VI. This questionnaire asks how long they have been receiving home-delivered meals; how often they receive them; when they last received a meal; to rate the program; and about their daily food intake and how much of their food intake the meal provides on the days they receive home-delivered meals.

  3. Transportation--All service recipients who use transportation services will be interviewed using this survey module. The module asks how long they have been using the transportation service; how often they use it; when they last used it; about trip purposes; to rate the transportation service; and about the number of times the respondent uses the service. This module is based on the instrument used for the first three surveys and all six of the POMP surveys.

  4. Homemaker/Housekeeping--Questions on the impact of homecare services will be asked of respondents who receive homemaker or housekeeping services. These questions are based on the Housekeeping Service Module developed by the POMP VI grantees. Again, the set of questions is similar to those asked about the other services: how long respondents have been receiving homemaker services; how often they receive them; when they last used the services; to rate the program; and whether they can depend on their aides to deliver the allotted services.

  5. Case Management--Service recipients who receive case management services will be asked questions about their experiences with the program. They will be asked how long they have been receiving the services; how they would rate various aspects of the case management services (e.g., ease of contact with the case managers and whether the case managers understand their needs); to rate the services overall; and whether they contribute to the decisions about their care. This module is based on the case management module developed by the POMP V grantees.

  6. Service List--All service recipients will then be asked about the mix of services they receive and the impact of those services. They will also be asked to rate the services overall. This module is based on the service module used for the third national survey, with added questions from POMP VI.

  7. Physical Functioning--This module will be asked of all service recipients. It includes questions on how the respondents' mental and physical health affect their day-to-day lives, Activities of Daily Living limitations (e.g., difficulty with personal care activities such as bathing and dressing), and Instrumental Activities of Daily Living limitations (e.g., difficulty with such home management activities as meal preparation, shopping, and housekeeping). Questions about the respondents' health are also being asked to help with assessing the frailty of the clients served by OAA services. Caregivers will be asked these questions about their care recipients.

  8. National Family Caregiver Support Program Assessment--Caregivers who receive caregiver support services through the National Family Caregiver Support Program will be surveyed as part of the Fourth National Study of Title III Service Recipients. This module has questions on services offered to caregivers through the National Family Caregiver Support Program and the impact of those services. There are also questions about services the care recipient receives and satisfaction with and impact of those services; support the caregiver receives, either as part of a formal support group or from other relatives and friends; and what other kinds of information the caregiver would find valuable. The survey asks about the type of help the caregiver provides for the care recipient, the amount of time they provide care, the benefits caregiving provides them, the drawbacks of caregiving (financial burdens, lack of private time, etc.), and demographic and health information on the care recipient. Three of the questions for this module were adapted from an AARP survey, Caregiving in the U.S. [4]

  9. Demographic Information--Demographic information about the respondent will be collected, including type of area of residence (urban, suburban, or rural), Zip Code, education level, race, gender, living arrangements (living alone, with spouse, or with others), and income level. Response items for the income questions for meals recipients have been set to enable determination of poverty level. This module will be administered to all participants. The caregiver survey already includes some demographic questions about the care recipient, but the demographic information on the caregiver will be gathered using this demographic module.

Many of the national survey questions come from such commonly used vehicles as the Survey of Income and Program Participation (SIPP) (e.g., the ADL and IADL questions), the Behavioral Risk Factor Surveillance System (BRFSS) surveys conducted within each state using HHS/CDC standard questions, CDC's Quality of Life measures, the SF-12 v. 2, and other existing surveys. These are virtually the same instruments used for the previous three national surveys (see Appendix F for a comparison of the questions on the first, second, third, and fourth surveys).


B.3 Eliciting Cooperation/Maximizing Response Rates

AoA does not expect problems in eliciting cooperation from the respondents who will be interviewed during the course of the study. Proven methods will be used to achieve very good completion rates (80 percent both for agencies selected for the participant survey and for service recipients). These methods include a letter from AoA informing agencies about the study and encouraging cooperation, and the use of knowledgeable Westat staff to contact the State and Local Agencies and respondents.


Nonresponse adjustment was done as part of the weighting process for the First and Second National Surveys of OAA Participants and the Third National Study of Title III Service Recipients, and the same type of adjustment will be done for the fourth study. The weights of the respondents were inflated to account for the weights of the non-respondents, separately for each service. The adjustment was applied independently within nonresponse adjustment groups defined by census region and agency size, so that the non-respondents within a group are represented by the respondents in the same group. A simplified sketch of this adjustment appears below.
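To make the adjustment concrete, the following minimal sketch in Python applies a within-group ratio adjustment of this kind. The table, column names, and weight values are invented for illustration; they do not come from the study's actual weighting files or software.

    import pandas as pd

    # Hypothetical sample file: one row per sampled service recipient for a
    # single service; weights and group labels are invented for illustration.
    sample = pd.DataFrame({
        "base_weight":       [10.0, 10.0, 12.0, 12.0, 8.0, 8.0],
        "responded":         [1, 0, 1, 1, 0, 1],   # 1 = completed interview
        "census_region":     ["NE", "NE", "NE", "NE", "S", "S"],
        "agency_size_class": ["large", "large", "large", "large", "small", "small"],
    })

    groups = ["census_region", "agency_size_class"]   # nonresponse adjustment groups

    # Total base weight in each group (respondents plus non-respondents).
    total_w = sample.groupby(groups)["base_weight"].transform("sum")

    # Base weight carried by respondents only, summed within each group.
    resp_w = sample["base_weight"] * sample["responded"]
    resp_total_w = resp_w.groupby([sample[g] for g in groups]).transform("sum")

    # Ratio adjustment: respondents' weights are inflated to carry the
    # non-respondents in the same group; non-respondents end with weight 0.
    sample["adjusted_weight"] = (
        sample["responded"] * sample["base_weight"] * total_w / resp_total_w
    )

Within each group, the adjusted weights of the respondents sum to the group's total base weight, which is exactly the inflation described above.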


As was done for the previous three national surveys, several steps will be taken to ensure high agency response rates (for the First National Survey, Westat recruited 88 percent of the AAAs; for the Second National Survey, the AAA response rate was 83.6 percent; and for the Third National Survey, Westat recruited 87 percent of the AAAs). First, agencies will receive an early communication introducing the study, explaining the purpose of the data collection, and describing the steps that need to be taken to produce numbered participant lists by service (see Appendix E). Westat staff will assist agencies as much as possible by telephone. The agencies will also be given a toll-free number, with their assigned Westat staff person’s extension, so they may call with any questions they have. Westat staff will also contact agencies that have not contacted them, remind them of the letter they received from AoA and the packet of information and instructions sent to them by Westat, and assure them of the ease of participating in the selection of respondents.


Westat will use proven methods to ensure good response rates from older persons. These include special techniques taught during interviewer training, such as communicating simply and clearly, repeating questions when necessary, and assuring respondents of the study’s legitimacy and confidentiality. The AAAs will send the selected respondents a prenotification letter, approved by AoA, on their own AAA letterhead (see Appendix E), and provide a toll-free number respondents can call to verify the study. At all times respondents will be assured of the voluntary nature of the study and the confidentiality of their responses. Westat will also assure them that their decision on whether or not to cooperate with the study will have no effect on their eligibility for services.


Other elements for achieving a high response rate include acquiring an experienced, sensitive interviewing staff; developing a training program that prepares the interviewers for the survey tasks; implementing appropriate interviewing procedures; being sufficiently flexible to accommodate respondents’ requests; and implementing sound management and quality control procedures. Factors that specifically influence reluctant individuals to participate include the following:


Interviewers’ ability to obtain cooperation—Westat will use as many experienced interviewers as possible. New interviewers will be thoroughly trained in general interviewing techniques prior to the project-specific training. All interviewers will be monitored, evaluated, and given immediate feedback on their performance to eliminate interaction patterns or telephone demeanor that might be detrimental to achieving cooperation. For the national surveys of OAA participants, Westat used only experienced interviewers; most of the interviewers for the Third National Study had worked on the prior two studies.


Flexibility in scheduling interviews—Being available to speak with people when it is most convenient for them is sometimes overlooked as a factor that can tip the balance in favor of cooperation for an individual who has doubts about participating. Interviewing activities for the survey will be scheduled to coincide with the hours people are most likely to be at home. Westat normally calls from 9:00AM to 9:00PM, Monday through Friday, respondent local time; however, for the first two surveys Westat called from 9:00AM to 8:30PM, and for the Third National Study, based on conversations interviewers had with respondents, Westat began calling at 8:30AM. Westat normally calls from 10:00AM to 6:00PM on Saturdays and from 2:00PM to 9:00PM on Sundays, respondent local time. For the Fourth National Study, Westat will call between 8:30AM and 8:30PM local time Monday through Friday, 8:30AM to 6:00PM local time on Saturdays, and 2:00PM to 8:30PM local time on Sundays (a sketch of these calling windows appears below). Interviewers can also schedule exact or general appointment times to accommodate respondents’ schedules.
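For illustration only, the Fourth National Study calling windows described above can be encoded as a simple respondent-local-time rule. The function below is a hypothetical sketch, not part of Westat's actual scheduling system.

    from datetime import datetime

    # Fourth National Study calling windows in respondent-local time,
    # keyed by Python's weekday() convention (Monday = 0).
    WINDOWS = {
        0: ((8, 30), (20, 30)),   # Monday    8:30AM - 8:30PM
        1: ((8, 30), (20, 30)),   # Tuesday
        2: ((8, 30), (20, 30)),   # Wednesday
        3: ((8, 30), (20, 30)),   # Thursday
        4: ((8, 30), (20, 30)),   # Friday
        5: ((8, 30), (18, 0)),    # Saturday  8:30AM - 6:00PM
        6: ((14, 0), (20, 30)),   # Sunday    2:00PM - 8:30PM
    }

    def may_call(local_dt: datetime) -> bool:
        """True if a call at this respondent-local time falls in an allowed window."""
        (h1, m1), (h2, m2) = WINDOWS[local_dt.weekday()]
        minutes = local_dt.hour * 60 + local_dt.minute
        return h1 * 60 + m1 <= minutes <= h2 * 60 + m2

    print(may_call(datetime(2007, 6, 4, 9, 0)))    # Monday 9:00AM -> True
    print(may_call(datetime(2007, 6, 3, 13, 0)))   # Sunday 1:00PM -> False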


Procedures to encourage participation—Perhaps the most significant technique for persuading reluctant individuals to participate is the interviewer training segment on gaining respondent cooperation. Nearly as important is a well-planned and concerted effort to convert each refusal to final cooperation.


Refusal conversion--For each case in which the respondent refuses to participate, the interviewer will complete a Non-Interview Response Form (NIRF). The form will capture information about key characteristics of the refusing respondent and the stated reason(s) for refusing to participate.


Special interviewer training sessions led by highly experienced supervisors will be held for a select group of interviewers. The sessions will include analysis of survey-specific and generic reasons for refusal, preparation of answers and statements that are responsive to those objections, effective use of voice and manner on the telephone, and role-playing of different situations. This team of customer cooperation interviewers will re-contact the reluctant respondents. For the 2002, 2004 and 2005 surveys, the refusal conversion rate was about 40 percent.


Use of proxies and interpreters--Although Westat does not anticipate that many of the respondents will be unable to complete the questionnaire by themselves, it may be necessary to rely on interpreters or proxies for some interviews. (In the first three surveys, less than one percent of the interviews were proxy interviews; the surveys have always been offered in Spanish, which reduces the need for interpreters.) The respondent’s own responses are always preferable to those of a proxy. Therefore, Westat first attempts to determine whether someone in the respondent’s household can act as an interpreter; if that is not possible, a proxy can be interviewed. Westat will allow the use of proxies when the sampled persons cannot or will not respond for themselves. Interviewers will be trained to recognize situations where proxies are appropriate; however, the final decision on using a proxy to complete the interview will be made only by supervisory personnel. The AAA may also indicate the need for a proxy for a particular respondent on the selected participant list that is sent to Westat, in which case Westat also requests contact information for the proxy or interpreter.


Quality Control--This survey will be conducted using Computer Assisted Telephone Interviewing (CATI) software. The CATI system is programmed to follow the questionnaire’s skip patterns, and interviewers are trained to probe to obtain complete answers to all survey questions. If a respondent does not know how to answer a question, or refuses to answer a particular question, those responses are recorded as such; however, no question can be skipped. (A simplified sketch of these rules follows.)
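As a simplified illustration of these rules (skip patterns enforced by the software; "don't know" and "refused" accepted as answers; no item skippable), consider the sketch below. The items and codes are invented and are not questions from the study instrument.

    # Illustrative only: every item must receive some answer, but the codes
    # "DK" (don't know) and "RF" (refused) are always accepted.
    DONT_KNOW, REFUSED = "DK", "RF"

    def ask(prompt, valid):
        """Probe until the answer is a valid code, DK, or RF; the item
        cannot be skipped."""
        while True:
            answer = input(prompt + " ").strip().upper()
            if answer in valid or answer in (DONT_KNOW, REFUSED):
                return answer
            print("Please answer one of:", ", ".join(sorted(valid)), "or DK/RF.")

    def interview():
        responses = {}
        responses["receives_meals"] = ask(
            "Do you receive home-delivered meals? (YES/NO)", {"YES", "NO"})
        # Skip pattern: the follow-up is asked only of meal recipients.
        if responses["receives_meals"] == "YES":
            responses["meals_per_week"] = ask(
                "On how many days a week? (0-7)", {str(n) for n in range(8)})
        return responses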


For the survey, Westat will implement procedures to review and edit questionnaire responses. Westat maintains a large in-house data preparation staff experienced in performing these tasks for all types of studies conducted at Westat. During a CATI study, data preparation staff check the CATI responses for consistency and continuously monitor the data. Interviewer comments and problem sheets are reviewed daily, and updates are made as necessary. Frequencies of responses to all data items are reviewed to ensure that the appropriate skip patterns are followed by the CATI system, and each item is checked to make sure that the correct number of responses is represented. When a discrepancy is discovered, the problem cases are identified and reviewed. (A simplified example of such an edit check follows.)
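The following is a simplified example of such an edit check, reusing the hypothetical meals items from the previous sketch; the case data are invented.

    import pandas as pd

    # Invented CATI output: meals_per_week should be answered only when
    # receives_meals == "YES" (the skip pattern above).
    data = pd.DataFrame({
        "case_id":        [1, 2, 3, 4],
        "receives_meals": ["YES", "NO", "YES", "NO"],
        "meals_per_week": [5, None, None, 3],   # cases 3 and 4 are discrepant
    })

    # Frequency review: every item is tabulated, including missing values.
    print(data["receives_meals"].value_counts(dropna=False))

    # Skip-pattern verification: flag cases that violate the routing.
    answered_but_skipped = data[(data["receives_meals"] == "NO")
                                & data["meals_per_week"].notna()]
    routed_but_missing = data[(data["receives_meals"] == "YES")
                              & data["meals_per_week"].isna()]
    print(pd.concat([answered_but_skipped, routed_but_missing]))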


Frequencies are also run for responses to open-ended and other/specify items. These responses are reviewed and either up-coded into existing response categories (for other/specify responses) or used to develop new categories for analysis (for both open-ended and other/specify responses), as illustrated in the sketch below.
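For illustration, up-coding can be as simple as a reviewed lookup from verbatim text to response codes. The verbatims and code names below are invented, not categories from the study.

    # Invented up-coding table: "other/specify" verbatims reviewed by data
    # preparation staff and mapped to existing response codes; anything
    # unmatched is held out for possible new categories.
    UPCODES = {
        "the senior van":       "TRANSPORTATION",
        "rides to the doctor":  "TRANSPORTATION",
        "someone mows my lawn": "CHORE_SERVICES",
    }

    def upcode(verbatim: str) -> str:
        """Return an existing code, or flag the response for category review."""
        return UPCODES.get(verbatim.strip().lower(), "REVIEW_FOR_NEW_CATEGORY")

    print(upcode("The senior van"))       # -> TRANSPORTATION
    print(upcode("meals on holidays"))    # -> REVIEW_FOR_NEW_CATEGORY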


Cognitive issues related to telephone survey administration--The minimal use of proxies and the fact that the questionnaires have been well tested have contributed to few problems with respondents’ understanding of the questions. Westat also conducted interviewer debriefings at the end of data collection for all three prior surveys. Interviewers were asked about problems and concerns they had encountered, based on their experiences interviewing the respondents, and revisions were made to the questionnaire based on information from the debriefings. For instance, interviewers said that caregiver respondents in the first national survey sometimes were not sure they qualified as caregivers, and asked for a better way to define who was a caregiver. For the Third National Study of Title III Service Recipients, only caregivers who were served by the National Family Caregiver Support Program were interviewed; the same will be true for the fourth national survey. Interviewers also indicated the need for additional response categories for certain questions. For example, some respondents told us they do not eat certain types of food (e.g., fruit) every day, so response options have been added to capture that information.


B.4 Tests of Data Collection Instruments and Procedures

The sampling procedures and data collection instruments were used successfully in the previous studies. The fourth national study instruments were developed and pre-tested by the POMP VI AAAs. The selection of respondents from numbered lists was successfully used for the previous three AoA national surveys and for a recent OMB-approved Westat study conducted for the U.S. Department of Education’s Rehabilitation Services Administration. These performance measures were used for the previous three national surveys, and revised versions will be used for the Fourth National Study of Title III Service Recipients modules. The surveys have been revised to 1) allow respondents to assess the services they receive and to evaluate the combination of services received, and 2) make the questionnaires less customer-satisfaction oriented and more assessment/outcome based.


B.5 Use of Statistical Survey Methodology

The use of statistical sampling methods is critical to this study. Westat has developed the sampling plan for this survey, as described in Section B.2.3, using standard statistical methods. Westat and the Administration on Aging are also responsible for selecting the sample and carrying out the analyses. AoA has consulted with Dwight Brock, a Westat statistician, on developing the sampling plan for the selection of the agencies and the selection of the participants, as well as on the survey methodology.


List of Appendices


Appendix


A Federal Register Notice


B Pertinent Legislation


C Data Collection Plan


D Westat Assurance of Confidentiality Agreement


E Prenotification Letters and Agency Sampling Instructions


F Question by Question Comparison: 2002, 2003 and 2004 National Surveys of OAA
Title III Service Recipients Survey instruments


G Research Triangle Institute, Evaluation of Title III-B Supportive Services Regression Analysis


H Survey Instruments


Appendix A


Federal Register Notice
Published by the Administration on Aging
For the Proposed Information Collection

[Federal Register: July 29, 2005 (Volume 70, Number 145)]

[Notices]

[Page 43871-43872]

From the Federal Register Online via GPO Access [wais.access.gpo.gov]

[DOCID:fr29jy05-105]


-----------------------------------------------------------------------


DEPARTMENT OF HEALTH AND HUMAN SERVICES


Administration on Aging


Agency Information Collection Activities; Proposed Collection; Comment Request; Fourth National Survey of Older Americans Act Title III Service Recipients


AGENCY: Administration on Aging, HHS.


ACTION: Notice.


-----------------------------------------------------------------------


SUMMARY: The Administration on Aging (AoA) is announcing an opportunity for public comment on the proposed collection of certain information by the agency. Under the Paperwork Reduction Act of 1995 (the PRA), Federal agencies are required to publish notice in the Federal Register concerning each proposed collection of information, including each proposed extension of an existing collection of information, and to allow 60 days for public comment in response to the notice. This notice solicits comments on the information collection requirements contained in the annual consumer assessment survey which is used by AoA to measure program performance for programs funded under Title III of the Older Americans Act.


DATES: Submit written or electronic comments on the collection of information by September 27, 2005.


ADDRESSES: Submit electronic comments on the collection of information to: Cynthia.Bauer@aoa.gov. Submit written comments on the collection of information to Administration on Aging, Washington, DC 20201.


FOR FURTHER INFORMATION CONTACT: Cynthia Agens Bauer on 202-357-0145.


SUPPLEMENTARY INFORMATION: Under the PRA (44 U.S.C. 3501-3520), Federal agencies must obtain approval from the Office of Management and Budget (OMB) for each collection of information they conduct or sponsor. "Collection of information" is defined in 44 U.S.C. 3502(3) and 5 CFR 1320.3(c) and includes agency requests or requirements that members of the public submit reports, keep records, or provide information to a third party. Section 3506(c)(2)(A) of the PRA (44 U.S.C. 3506(c)(2)(A)) requires Federal agencies to provide a 60-day notice in the Federal Register concerning each proposed collection of information, including each proposed extension of an existing collection of information, before submitting the collection to OMB for approval. To comply with this requirement, AoA is publishing notice of the proposed collection of information set forth in this document. With respect to the following collection of information, AoA invites comments on: (1) Whether the proposed collection of information is necessary for the proper performance of AoA's functions, including whether the information will have practical utility; (2) the accuracy of AoA's estimate of the burden of the proposed collection of information, including the validity of the methodology and assumptions used; (3) ways to enhance the quality, utility, and clarity of the information to be collected; and (4) ways to minimize the burden of the collection of information on respondents, including through the use of automated collection techniques when appropriate, and other forms of information technology.

Fourth National Survey of Older Americans Act Title III Service Recipients--NEW--This information collection, which builds on earlier national pilot studies and performance measurement tools developed by AoA grantees in the Performance Outcomes Measures Project (POMP), is a comprehensive recipient survey which will include consumer assessment modules for the Home-delivered Nutrition Program, Congregate Nutrition Program, Transportation Services, Homemaker Services and Chore Services. Recipients of services from the National Family Caregiver Support Program will also be surveyed. Copies of the POMP instruments can be located at http://www.gpra.net. This information will be used by AoA to track performance outcome measures; support budget requests; comply with Government Performance and Results Act (GPRA) reporting; provide information for OMB's Program Assessment Rating Tool (PART); provide national benchmark information for POMP grantees; and inform program development and management initiatives. AoA estimates the burden of this collection of information as follows:

Respondents: Individuals.
Number of Respondents: 6,000.
Number of Responses per Respondent: One.
Average Burden per Response: 30 minutes.
Total Burden: 3,000 hours.


Dated: July 26, 2005.

Josefina G. Carbonell,
Assistant Secretary for Aging.

[FR Doc. 05-15037 Filed 7-28-05; 8:45 am]
BILLING CODE 4154-01-P


Appendix B


Pertinent Legislation

Pertinent Legislation:


GPRA (P.L.103-62, http://thomas.loc.gov) and, more recently, PART (www.whitehouse.gov/omb/budget/fy2005/part.html) require Federal agencies to develop annual and long-term performance outcome measures and to report on these measures annually. Section 202(f) of the OAA (www.aoa.gov/about/legbudg/oaa/legbudg_oaa.asp) requires AoA to work collaboratively with States and Area agencies to develop performance outcome measures.


1 Centers for Disease Control and Prevention (n.d.). How does CDC measure population health-related quality of life? Retrieved April 17, 2006, from National Center for Chronic Disease Prevention and Health Promotion, Health-Related Quality of Life: Methods and Measures Web site: http://www.cdc.gov/hrqol/methods.htm

2 Ware JE, Kosinski M, and Keller SD. A 12-Item Short-Form Health Survey: Construction of scales and preliminary tests of reliability and validity. Medical Care, 1996;34(3):220-233. Retrieved April 17, 2006, from Web site: http://www.sf-36.org/demos/SF-12v2.html

3 This percent range refers to the client response patterns that may occur; for example, in a yes/no question, it refers to the expected percent of respondents who will answer yes versus no.

4 National Alliance for Caregiving and AARP (2004, April). Caregiving in the U.S., Appendix C, pp. 16-17. Retrieved July 27, 2004, from AARP Web site: http://research.aarp.org/il/us_caregiving.pdf

