
Application for Data Collection



National Longitudinal Survey of

Older Americans Act Participants: Supporting Statement




February 6, 2018







Submitted by:


U.S. Administration for Community Living

Administration on Aging

330 C Street, SW

Washington, DC 20201




Table of Contents


A. Justification

A.1 Circumstances Making the Collection of Information Necessary
A.2 Purpose and Use of the Information Collection
A.2.1 Cognitive Testing
A.2.2 The National Longitudinal Survey of Older Americans Act Participants
A.3 Use of Improved Information Technology & Burden Reduction
A.3.1 Cognitive Testing
A.3.2 The National Longitudinal Survey of Older Americans Act Participants
A.4 Efforts to Identify Duplication & Use of Similar Information
A.5 Impact on Small Businesses and Other Entities
A.6 Consequences of Collecting the Information Less Frequently
A.7 Special Circumstances Relating to the Guidelines of 5 CFR 1320.5
A.8 Comments in Response to the Federal Register Notice & Efforts to Consult Outside the Agency
A.9 Explanation of Any Payment or Gift to Respondents
A.10 Assurance of Privacy Provided to Respondents
A.11 Justification for Sensitive Questions
A.12 Estimates of Annualized Burden Hours and Costs
A.13 Estimates of Other Total Annual Cost Burden to Respondents and Record Keepers
A.14 Annualized Cost to the Federal Government
A.15 Explanation for Program Changes or Adjustments
A.16 Plans for Tabulation & Publication and Project Time Schedule
A.17 Reason(s) Display of OMB Expiration Date is Inappropriate
A.18 Exceptions to Certification for Paperwork Reduction Act Submissions


B. Collection of Information Employing Statistical Methods

B.1 Respondent Universe and Sampling Methods
B.2 Procedures for the Collection of Information
B.2.1 Data Collection Procedures for the Cognitive Testing
B.2.2 Data Collection Procedures for the NLSOAAP
B.2.2.1 Telephone Contact with State and Local Agencies on Aging
B.2.2.2 Telephone Survey of Older Americans Act Participants and Caregivers
B.2.3 Sampling Plan
B.2.3.1 Sample Design
B.2.3.2 Sample Size for Estimation of Change
B.2.4 Older Americans Act Participant Survey Instruments
B.3 Methods to Maximize Response Rates and Deal with Nonresponse (NLSOAAP)
B.4 Tests of Procedures or Methods to Be Undertaken
B.5 Individuals Consulted on Statistical Aspects and Individuals Collecting and/or Analyzing Data


Appendices

A Pertinent Legislation
B Baseline and Follow-Up Surveys
C Invitation Letter for Cognitive Testing Participants
D Instructions for AAA Access to the Survey Website & How to Submit Data
E Federal Register Notice Published by ACL/AoA for the Proposed Information Collection
F Westat Assurance of Confidentiality Agreement
G Agency Information Packet
H Reminder Card
I Telephone Script


Tables

A-1. Cognitive Testing Protocols
A-2. 60-Day Federal Register Comments and ACL Responses
B-1. Half-widths of 95 percent confidence intervals by various sample sizes and estimates of target characteristics (computed for a two-stage design with a design effect of 1.30)
B-2. Half-widths of 95 percent confidence intervals for the difference between two estimates by various sample sizes and for various averages of the two estimates (computed for a two-stage design with a design effect of 1.30)


Exhibits

A-1. Estimated Hour and Annual Cost Response Burden
A-2. Total Annualized Cost to the Federal Government [Based on Year 1]
A-3. Data Collection Timetable
B-1. Respondent Universe


A. Justification


A.1 Circumstances Making the Collection of Information Necessary


Introduction


This OMB package requests clearance to conduct two activities: (1) cognitive testing of an updated National Survey of Older Americans Act Participants (NSOAAP) survey instrument and (2) a three-year longitudinal survey of Older Americans Act (OAA) participants using the updated survey instrument. The baseline survey will be the fourteenth in a series of national surveys of OAA clients. The Year 2 and Year 3 data collections will be the fifteenth and sixteenth NSOAAPs, respectively, and will survey the same clients interviewed at baseline. The longitudinal survey will be called the National Longitudinal Survey of Older Americans Act Participants (NLSOAAP).


The first thirteen surveys provided important cross-sectional data on service recipients (e.g., consumer assessment of the quality of services, self-reported outcomes, physical functioning, health status, quality of life, and demographic information). The baseline longitudinal survey will continue to provide rich cross-sectional data. The results of the three-year longitudinal survey will provide ACL with the opportunity to examine the predictors of nursing home placement, the relationship of the receipt of OAA services to the delay in nursing home placement, and how a host of variables, including health and physical functioning, change over time. The longitudinal data will also allow ACL to examine how changes in physical functioning and measures of quality of life affect the assessment of the quality of OAA services.


This survey instrument has been updated since the last OMB approval on February 5, 2018; however, the sampling methodology and the data collection procedures are identical to those of the previously approved surveys.


ACL/AoA’s Strategy of Program Improvement


The Administration for Community Living’s Administration on Aging (ACL/AoA) maintains an ongoing strategy of program improvement through enhanced program performance measurement. In compliance with the requirements of the Office of Management and Budget’s (OMB) program reviews, the GPRA Modernization Act of 2010 (GPRAMA), and Section 202(f) of the OAA, ACL/AoA proposes to conduct further studies of program outcomes (see Appendix A for the pertinent legislation).


ACL/AoA has previously conducted 12 cross-sectional surveys and plans to conduct the 13th in 2018. The 12 previous surveys and their OMB control numbers are listed below:


  • Two pilot studies of Older Americans Act Title III Service Recipients in 2003 and 2004 (OMB control numbers 0985-0014 and 0985-0017);

  • Third National Survey of OAA Title III Service Recipients conducted in 2005 (OMB control number 0985-0020);

  • Fourth National Survey of OAA Title III Service Recipients conducted in 2008 (OMB control number 0985-0023);

  • Fifth National Survey of OAA Title III Service Recipients conducted in 2009 (OMB control number 0985-0023);

  • Sixth National Survey of OAA Title III Service Recipients conducted in 2011 (OMB control number 0985-0023);

  • Seventh National Survey of Older Americans Act Participants conducted in 2012 (OMB control number 0985-0023);

  • Eighth National Survey of Older Americans Act Participants conducted in 2013 (OMB control number 0985-0023);

  • Ninth National Survey of Older Americans Act Participants conducted in 2014 (OMB control number 0985-0023);

  • Tenth National Survey of Older Americans Act Participants conducted in 2015 (OMB control number 0985-0023);

  • Eleventh National Survey of Older Americans Act Participants conducted in 2016 (OMB control number 0985-0023); and

  • Twelfth National Survey of Older Americans Act Participants conducted in 2017 (OMB control number 0985-0023).


The surveys have enabled ACL/AoA to establish baselines and performance targets for annual and long-term outcome measures required by OMB and incorporate new performance information in agency budget justifications and performance plans through FY 2018. Further, the studies have demonstrated that services provided under Title III:


  • Are effectively targeted to vulnerable populations;

  • Are provided to individuals who need the services;

  • Are highly rated by recipients (quality); and

  • Provide assistance that is instrumental in enabling recipients to maintain their independence.

Performance Measurement Requirements


GPRAMA1 requires federal agencies to develop annual and long-term performance outcome measures and to report on these measures annually. Section 202(f) of the OAA2 requires ACL/AoA to work collaboratively with state and area agencies on aging (AAAs) to develop performance outcome measures.


Since the passage of GPRA in 1993, ACL/AoA has accepted GPRA and GPRAMA as an opportunity to document the yearly results produced through the programs it administers under the authority of OAA. It is the intent and commitment of ACL/AoA, in concert with state and local program partners, to use the performance measurement tools of GPRAMA to continuously improve OAA programs and services for the elderly.


As described on ACL/AoA’s website: “In order to gather information on the performance of its program, the Administration on Aging surveys the participants in its Older Americans Act programs. These national surveys provide a portrait of who receives these services and how they assess the quality of the services received.”3


OAA, Title III – Home and Community-Based Program


Title III of the OAA establishes a home and community-based care program for older persons and their caregivers, to enable them to live as independently as possible for as long as possible. States and local agencies are given much latitude to design services tailored to the needs of their regions and communities. One challenge for ACL/AoA is to devise a means to improve the performance of the program nationally, while preserving and promoting the diversity of program design. ACL/AoA has chosen to work toward improved program performance throughout the Aging Services Network by working collaboratively with states and AAAs to develop performance outcome measurement tools. The tools identify elements of service quality so that states and AAAs can improve service systems at the local level. These same tools can also be employed by ACL/AoA to measure program performance at the national level.


Performance Outcomes Measures Project (POMP)


From 1999 to 2011, ACL/AoA sponsored the Performance Outcomes Measures Project (POMP) demonstration, in which grants were awarded to states to work collaboratively to develop survey instruments that measure elements of service quality and consumer reported outcomes for various services provided under Title III of the OAA. Surveys were developed for the following topics:


Service Domains:

  • Nutrition (including congregate and home-delivered meals)

  • Transportation

  • Information and Assistance

  • Homemaker/Housekeeper

  • Personal Care

  • Caregiver Support

  • Case Management

  • Senior Centers


Client Characteristics:

  • Physical Functioning

  • Demographics

  • Emotional Well-Being

  • Social Functioning


POMP demonstrated the ability of states and AAAs to apply statistically sound sampling techniques to obtain numeric measures of program performance.

The survey instruments developed under POMP – along with various tools necessary for implementation – can be found at https://www.acl.gov/node/465. These performance measurement surveys have enabled some local agencies to obtain additional financial support and improve program management. Examples of uses of performance measurement at the state and local level follow:


  • The Hawkeye Valley Area Agency on Aging in Waterloo, Iowa compiled information on the level of client support and satisfaction with services and received additional funding from the United Way for exemplary programs.

  • The Area Agency on Aging in Cincinnati, Ohio expanded the use of Home Care Client Satisfaction Measure (HCSM) and incorporated it into an ongoing part of its case management process for all clients to improve service quality.

  • The Florida Department of Elder Affairs developed a computer simulation model that demonstrated the impact of home care programs on reducing nursing home admissions and showed the savings in Medicaid funds.


Advanced POMP


A subgroup of POMP grantees participated in the Advanced POMP project, which focused on modeling the extent to which the receipt of OAA services relates to the time delay in nursing home placement. The grantees from North Carolina, Georgia, New York, Iowa, and Rhode Island supplied Westat with administrative datasets of AAA clients. The datasets contained information about the specific services clients received, measures of activities of daily living (ADLs), instrumental activities of daily living (IADLs), and demographics (e.g., age, race/ethnicity, presence/absence of a caregiver, and living arrangements). The datasets also contained the date the client started receiving services and the date the client stopped receiving services (if indeed the client did stop receiving services), and the outcome (e.g., nursing home placement, mortality, continue to receive services, other).


The contractor analyzed the data using a Cox proportional hazards regression model that not only examined the risk factors for nursing home placement but also examined time in the community as a function of the receipt of OAA services. The results across all states showed that the more services clients received (controlling for ADLs), the longer they remained in the community. The contractor repeated the analysis with a subset of respondents in the Health and Retirement Study (HRS) who had characteristics similar to those of the clients in the administrative datasets (e.g., age, race/ethnicity, physical functioning, and receipt of services). The results of the HRS analysis were similar to the results for the OAA service recipients: an increase in the number of services received was associated with remaining in the community longer.
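
The modeling approach described above can be illustrated with a brief sketch. The following is a minimal, hypothetical example using the Python lifelines package, not the contractor's actual analysis code; the file name and column names (months_in_community, nursing_home_placement, num_oaa_services, adl_limitations, age) are illustrative assumptions.

```python
# Minimal illustrative sketch of a Cox proportional hazards model relating
# service receipt to time remaining in the community. Column and file names
# are hypothetical; censoring rules would follow the actual analysis plan.
import pandas as pd
from lifelines import CoxPHFitter

# Hypothetical analysis file: one row per client.
df = pd.read_csv("aaa_client_records.csv")

# duration_col: months observed in the community before the event or censoring.
# event_col: 1 = nursing home placement, 0 = censored (still receiving services,
#            deceased, or lost to follow-up, handled per the analysis plan).
cph = CoxPHFitter()
cph.fit(
    df[["months_in_community", "nursing_home_placement",
        "num_oaa_services", "adl_limitations", "age"]],
    duration_col="months_in_community",
    event_col="nursing_home_placement",
)

# A hazard ratio below 1.0 for num_oaa_services (controlling for ADL limitations
# and age) would indicate longer community tenure with more services received.
cph.print_summary()
```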


The proposed longitudinal survey will provide ACL/AoA with an opportunity to model the relationship of the receipt of services to a delay in nursing home placement on a national level with increased precision. It will also provide an opportunity to assess individual change in responses over time with increasing age, and reduce bias due to differential selection or confounding factors.


Redesign of the National Survey of Older Americans Act Clients – Phases I and II


ACL/AoA supported two contracts to transform the NSOAAP from a cross-sectional to a longitudinal survey. The Phase I contract focused on developing and evaluating alternative designs for the longitudinal survey. The developmental work resulted in a report that addressed the following questions:


  1. How does the current design need revising to function as a longitudinal survey?

  2. How long are the intervals for data collection after the baseline?

  3. What is the total length of the survey period from baseline to the final data collection?

  4. What is the optimal sample size considering cost constraints?

  5. What is the estimated level of effort?

  6. What is the feasibility and cost of incorporating rotating topical modules?

  7. What is the feasibility of obtaining state-level estimates?

  8. What is the feasibility and cost of creating a core set of measures?

  9. What is the feasibility and cost of replacing the homemaker survey with a generic client survey?

  10. To what extent can the NSOAAP incorporate questions from other national surveys to use as comparisons?


Another Redesign Phase I activity was to update the home-delivered and congregate meals survey questions, as well as the caregiver questions, on the NSOAAP survey instrument. This involved convening an expert panel to review candidate items for updating the survey instrument. The expert panel met over the course of ten months to provide feedback on candidate questions and to recommend additional or alternative items. In addition, the expert panel recommended questions that cover topics not previously covered in the NSOAAP: falls, life changes, social integration, and USDA food security questions. The collaboration with the expert panel resulted in three updated sets of questions for the eventual longitudinal survey instrument. The contractor conducted cognitive testing with nine or fewer clients to determine the usability of the nutrition program questionnaires and to test the questions on the new topics. The contractor prepared a report for each nutrition program, with copies of the updated questions. The reports included a discussion of the questions that worked well and those that needed further refinement.


Phase II of the Redesign effort is ongoing. The first year of Phase II focused on finalizing the design for the longitudinal survey, testing the caregiver questions with nine caregivers, updating the caregiver questions, and developing cross-cutting survey modules. This effort resulted in a draft longitudinal (baseline and follow-up) survey instrument (see Appendix B).


The next Redesign Phase II activity is to conduct a cognitive test of all of the service-specific questions with approximately 120 clients. The contractor developed cognitive testing protocols for each service that include the cross-cutting modules. This cognitive testing is included under this PRA request. The longitudinal survey instrument will be updated based on the results of the cognitive testing.



A.2 Purpose and Use of the Information Collection


This PRA request covers cognitive testing of the updated survey instrument for the National Longitudinal Survey of Older Americans Act Participants (NLSOAAP). The request also covers the conduct of the NLSOAAP, a three-year longitudinal survey.


A.2.1 Cognitive Testing


ACL/AoA contracted with a research firm to assist with updating the NSOAAP survey instrument (Redesign Phases I and II). The purpose of updating the survey instrument was to test questions for their relevance to the contemporary measurement needs of the agency. As discussed earlier, ACL collaborated with an expert panel (consisting of government staff, university experts, and the contractor) to update the survey instrument. In Phase I of the redesign contract, the congregate and home-delivered meals questions underwent revisions and testing. In the Phase II Redesign project, the caregiver questions, which had been updated during Phase I, underwent testing with nine individuals. For the proposed work, the contractor will test each set of updated service-related questions (along with the cross-cutting survey modules) with 20 clients, for a total of 120 clients. The service-specific questions and cross-cutting modules will be tested with clients who receive the specific service. Table A-1 presents the service-specific questions and the relevant cross-cutting questions.


Table A-1. Cognitive Testing Protocols


Service-specific questions: Case Management; Congregate Meals; Home Delivered Meals; Homemaker; Transportation.

Cross-cutting modules: Additional Services Module; USDA Food Security; Falls; Life Changes; Social Integration; Physical, Social, and Emotional Wellbeing; Demographics; Caregiver.



A.2.2 The National Longitudinal Survey of Older Americans Act Participants


The NLSOAAP will be a three-year longitudinal survey. Data will be collected at baseline and in two additional waves, with one-year intervals between waves. The results of the baseline will provide ACL/AoA with the following:


  • Performance results for FY 2019 as required by OMB;

  • Performance information for key demographic subgroups, geographical sub-regions, and different types of AAAs, which will enable ACL/AoA to identify variations in performance and examine the need for additional targeted technical assistance;

  • Refined national benchmarks for use by states and AAAs; and

  • Secondary data for analysis of various Title III program evaluations.


The clients interviewed at baseline will be re-interviewed during wave 2 and wave 3 data collections.

The results of the three-year longitudinal survey will provide performance results for FY 2020 and FY 2021, and data to examine the following:


  • Changes in physical functioning over the three-year study period.

  • Changes in health status over the three-year study period.

  • The extent to which patterns of service utilization change over time and the factors associated with the changes.

  • The relationship of quality of life to physical decline over the three-year study period.

  • The relationship of measures of satisfaction with services to quality of life over the three-year study period.


Data from the NSOAAP and the NLSOAAP are primary sources for performance outcome measures in the Congressional budget justification, the HHS Annual Performance Plan and Report, and the Annual Report to Congress. ACL/AoA also uses the data to respond to inquiries from stakeholders, the public, the press, and program and policy decision makers.


Information from the most recent NSOAAP is available on-line on the Aging Integrated Database (AGID) website (https://agid.acl.gov/).  Results are available annually. 



A.3 Use of Improved Information Technology & Burden Reduction


A.3.1 Cognitive Testing


Since a relatively small number of clients will participate in the cognitive testing, there will be very little burden placed on the area agencies on aging (AAAs). The contractor will contact approximately six AAAs to ask for volunteers. The contractor will select 20 respondents for each of the six services, for a total of 120 respondents. Once the AAA director has selected participants for the cognitive testing, Westat will send the selected clients an invitation letter (see Appendix C). The contractor will schedule the cognitive testing during a brief phone call and conduct the interviews over the phone at a time that is convenient for the clients who volunteer. During the session, the cognitive interview will be audio-recorded for analysis. The interviewer will not record the participants’ survey responses but will take notes on how participants react to the survey questions, as well as on any issues participants have with understanding the intent of the questions and the flow of the questions.



A.3.2 The National Longitudinal Survey of Older Americans Act Participants


Use of Client Tracking Software to Generate Client Lists for Sampling (NLSOAAP only)


The proposed procedures and materials requesting information from the agencies, as well as the telephone surveys of respondents, have been designed in a way to minimize respondent burden.


To reduce the burden for the AAAs, the contractor developed procedures for client sampling that enable AAAs to use their own client tracking software. Since the implementation of the fourth national survey in 2008, the contractor has worked cooperatively with vendors of the commercial off-the-shelf client tracking software programs most commonly used by the state and area agencies on aging to develop step-by-step instructions for the AAAs to generate client lists by service to use as a sampling frame. It is estimated that over 95% of the AAAs now have this technological capability and are able to follow the instructions to produce their client lists by service. We will provide similar instructions for the baseline survey. Appendix D contains an example of instructions created for agencies that use a commercial client tracking software system known as “PeerPlace.”


In specific states that have their own proprietary client tracking software, the contractor has worked directly with an information technology specialist at the state-level to generate electronic client lists for all of the AAAs selected for the national survey. This further reduces the burden for AAAs in states that have their own proprietary software.


Because the proposed three-year study is longitudinal, the contractor will need to ask the state or AAA contacts to supply the sampling frame only once. In Years 2 and 3, the contractor will re-interview the clients selected at baseline. This will eliminate the need for AAAs to sample clients again during the period of the longitudinal survey, reducing the AAA burden relative to previous iterations of the NSOAAP.


Use of Survey Web Site (NLSOAAP only)


A National Survey website application (https://aoasurvey.org/default.asp) has been developed to support and assist with data collection. For the 5th through 12th surveys, the contractor designed and used this secure website, which the AAAs used to upload their lists of selected clients. The website will be updated and further refined for the baseline data collection.


The web site is divided into two major sections: the public and the restricted-access sections. The public section is accessible to the general public, without restrictions. It includes background information, frequently asked questions, and links to results of previous AoA National Surveys. The purpose of the public section is to provide state and area agencies on aging, professionals in the field of aging, and service recipients and their families with information about the data collection effort and uses of the data.


The restricted-access section of the web site houses an electronic records receipt system. Area Agencies on Aging have the option of submitting private personally identifiable client data to Westat via electronic files using the project web site. The web site was written in Active Server Pages (ASP), HTML, and JavaScript and uses industry-standard TLS (Transport Layer Security) 1.1/1.2 encryption for secure data submissions. Agencies choosing this option will receive usernames and passwords that enable their staff to sign on to the file upload utility on the web site. This system supports files in a large variety of file formats. Each agency's data file will be processed according to its structure and content.


Westat programming staff will manually map and convert the data items in each agency’s file to create standardized records for further processing. As each file is received, this system will log the source agency, date received, and file type.


Only agencies that have been selected to participate in the survey will have access to this area. Unique user IDs and passwords will be assigned to each AAA at the time they are selected into the sample. The ID and password will be provided with other survey materials to the AAA.


Appendix D contains instructions for AAA restricted access to the survey website and how to submit data.
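
As a rough illustration of the submission path from an agency's perspective, the sketch below shows an authenticated file upload over HTTPS (TLS). The endpoint path, form field name, file name, and credentials are hypothetical placeholders; the actual upload utility is the web-based form described above, and agencies are not expected to script their submissions.

```python
# Illustrative sketch only: an authenticated HTTPS (TLS) file upload, similar in
# spirit to the restricted-access upload utility described above. The URL path,
# form field name, and credentials below are hypothetical placeholders.
import requests

UPLOAD_URL = "https://aoasurvey.org/upload"   # hypothetical endpoint path
USERNAME = "aaa_example_user"                 # assigned when the AAA is selected
PASSWORD = "example-password"                 # generated, high-entropy password

with open("client_list_by_service.xlsx", "rb") as f:
    response = requests.post(
        UPLOAD_URL,
        auth=(USERNAME, PASSWORD),            # credentials sent only over TLS
        files={"client_file": f},             # the system accepts many file formats
        timeout=60,
    )

response.raise_for_status()  # a non-2xx status indicates the submission failed
print("Upload accepted:", response.status_code)
```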


Use of Computer Assisted Telephone Interviewing (CATI) (NLSOAAP only)


Westat (the contractor) will use computer-assisted telephone interviewing (CATI) technology to conduct the surveys of OAA service recipients and record the responses. The CATI capability includes customized software systems for scheduling, interviewing, and data handling and utilizes high-speed data networks and centralized voice and data monitoring. A single database is used to monitor and direct the interviewers. The scheduler, a computerized survey control system, makes interviewer assignments, records the disposition of sample cases, and helps survey managers monitor performance.


The contractor will attempt to contact each person in the sample, making multiple calls at different times and on different days when necessary. To reduce the burden for the respondents, the contractor will schedule appointments for calls at times that are convenient for them. For Spanish-speaking respondents, specially trained bilingual interviewers will conduct the interviews in Spanish. If other special arrangements are necessary (e.g., an interpreter, a proxy, a mail-out, or an interview conducted over several sessions), the respondent can be further accommodated.


The contractor will take the ACL/AoA-approved final version of the survey instruments and program them into its CATI system. This involves:


  • Inserting specifications into the English version of the questionnaire;

  • Preparing the specifications for the CATI programmer;

  • Translating the questionnaire from the specifications into Spanish; and

  • Programming and testing both versions of the questionnaire into CATI.


Details of how skips will work in the questionnaire are included in the design document, as are the needed question variations. For example, some questions may need to be asked differently, depending on the answers to previous questions. In particular, if a respondent reports living with others, the next question asked is, “Do you live with your spouse?” If the respondent reports living alone, the follow-up questions are not asked, and CATI automatically skips to the next applicable question.
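
To make the skip behavior concrete, the short sketch below implements the living-arrangement example in Python. The question IDs are illustrative placeholders, not the programmed CATI specifications.

```python
# Minimal sketch of the skip logic described above (illustrative only; the real
# instrument is programmed in the contractor's CATI system from the design document).

def next_question(lives_alone: bool) -> str:
    """Return the ID of the next question given the living-arrangement answer."""
    if lives_alone:
        # The household-member follow-up is skipped automatically.
        return "Q_NEXT_TOPIC"       # hypothetical ID of the next applicable question
    return "Q_LIVE_WITH_SPOUSE"     # "Do you live with your spouse?"

# A respondent who lives with others is routed to the spouse question;
# a respondent who lives alone skips ahead.
assert next_question(lives_alone=False) == "Q_LIVE_WITH_SPOUSE"
assert next_question(lives_alone=True) == "Q_NEXT_TOPIC"
```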

The use of the CATI system in combination with the highly structured telephone interviewer training and procedures ensures that interviewers conduct the surveys in a professional, controlled, and consistent manner.


A.4 Efforts to Identify Duplication & Use of Similar Information


The cognitive testing will only occur once. The NLSOAAP data collection will be collected at three points in time. Every effort is being made to avoid duplication and minimize respondent burden. Over the last 12 years, Westat conducted the first through 12th National Surveys of Older Americans Act Participants, formerly known as the National Survey of OAA Title III Service Recipients. As a result of the information gathered, modifications have been made to the data collection procedures and to the survey instruments. We believe we have reduced agency and respondent burden to the minimum level possible to achieve the survey's objectives.


The NLSOAAP is not duplicative of other survey efforts because there is no other representative survey of Older Americans Act participants. The HRS (Health and Retirement Study) collects nationally representative data on older adults every two years; however, the HRS is not able to separate out data for OAA participants. The NLSOAAP is a random sample of Older Americans Act (OAA) service recipients only and cannot be used to make assertions about the American population of older adults in general. The purpose of the NSOAAP is to obtain performance outcome information that demonstrates the effects of services and illustrates client-reported quality of service. ACL/AoA uses the results of the NLSOAAP to justify budget requests and for program planning. There is no other vehicle for obtaining this information.


A.5 Impact on Small Businesses and Other Entities


No small businesses will be involved in this study for either the cognitive testing or the NLSOAAP.


A.6 Consequences of Collecting the Information Less Frequently


It is important to follow the respondents over a three-year period to determine the extent to which OAA services help clients remain in the community. In the past, the survey instrument asked respondents about the extent to which the receipt of services helped them live at home longer than if they had not received the services at all. For all of the services, respondents indicated that the services did help them stay in the community longer. The longitudinal design, beginning with the 14th National Survey, will provide quantitative data to determine the extent to which the services enable clients to remain in the community, measured in months and/or years.


Interviewers will ask respondents for permission to conduct the telephone interview once each year for three years. Respondents who agree to participate in the longitudinal component of the study will receive reminder cards at 6-month intervals.


We believe that collecting the data less frequently than at one-year intervals over the three-year period would not provide sufficient information to measure change over time in physical functioning, consumer assessment of services, and self-reported outcomes. Most importantly, it would not provide an opportunity to collect information on those clients who no longer receive services for a variety of reasons, including placement in a nursing home or assisted living facility.


In addition, collecting data less frequently would not allow us to meet GPRAMA and OAA requirements for annual reporting of program performance. There are no legal obstacles to reducing the burden.


A.7 Special Circumstances Relating to the Guidelines of 5 CFR 1320.5


The cognitive testing and NLSOAAP efforts will be conducted according to the guidelines specified in 5 CFR § 1320.5. No special circumstances are known that would cause inconsistency with these guidelines.


A.8 Comments in Response to the Federal Register Notice & Efforts to Consult Outside the Agency


Comments in Response to the Federal Register Notice


In response to public comments and recommendations from an expert panel, a redesigned information collection tool was drafted and a 60-day Federal Register Notice was published in the Federal Register on September 26, 2017. (https://www.federalregister.gov/documents/2017/09/26/2017-20460/agency-information-collection-activities-public-comment-request-redesign-of-existing-data-collection) (see Appendix E).


ACL received comments from sixty-four organizations and fifteen individuals about the Redesigned NSOAAP. ACL reviewed all of the comments. Two of the comments were deemed not relevant. The first referenced other data collections and not the NSOAAP (i.e., Census), and the other was commentary without reference to the NSOAAP. For ease of review, the remaining comments and their responses have been grouped by topic or issue. The ACL responses for each topic/issue are detailed in Table A-2:


Table A-2. 60-Day Federal Register Comments and ACL Responses


Topic/Issue: Questions on gender identity

Comment: Over 80% of the submitted comments were about this issue. Specifically, many of the comments were that “we encourage ACL to adopt a measure of gender identity” or “improve the methodology for collecting information about the participation of transgender older adults.”

One comment also offered the specific recommendation of aligning with a federal standard on response options for recording gender identity: “recommend that the response options correspond with the national standards developed by the Office of the National Coordinator for Health Information Technology adopted as regulations by the U.S. Department of Health and Human Services (45 Code of Federal Regulations 170.207 - Vocabulary standards for representing electronic health information, Section (o)).”

ACL Response: ACL understands the suggested recommendations. The first step in improving the methodology for measuring gender identity in the survey will be to conduct cognitive testing of the redesigned information collection tool. The cognitive testing will include questions about how respondents feel about the gender question. In addition, respondents who respond “don’t know” or refuse to answer the question “What is your gender?” will be asked “What do you mean by ‘don’t know’?” or “What do you mean by ‘refused’?” Based on the cognitive testing of the information collection tool, ACL will work with OMB for final approval of the information collection tool.

ACL appreciates the recommendation of aligning any response options on gender identity in the survey with the response options from the ONC national standards. Because the ONC standards are for electronic health information and not for survey data collection, further deliberation, informed in part by the previously noted cognitive testing, is needed to ensure that we include the most appropriate and universally accepted response options.

Topic/Issue: Longitudinal methodology

Comment: Several submitted comments supported the transition of the survey from cross-sectional to longitudinal. Specifically, it was noted that “the longitudinal survey design proposed for 2019 will enhance ACL’s ability to identify the needs and goals of OAA participants.”

ACL Response: No action/change required to NSOAAP. ACL appreciates and values this feedback.

Topic/Issue: Questions on sexual orientation

Comment: Comments were also received supporting the collection of data on sexual orientation: “commends ACL for its decision to restore a demographic question about sexual orientation.”

ACL Response: No action/change required to NSOAAP. ACL appreciates and values this feedback.

Topic/Issue: Rotating modules

Comment: Three comments supported the inclusion of rotating modules. One organization noted: “We recommend soliciting input for these topical modules before each wave to help identify the most critical need for data collection.”

Another organization stated: “… the opportunity to add a rotating topical module to collect information on emerging issues, such as client experiences with discrimination based on age, sexual orientation, race, or other characteristics. [Organization] applauds ACL for these advances.”

And the third organization commented: “We also support that the addition of the unique topical modules to collect additional information about experiences with discrimination related to sexual orientation… we also propose that the ACL add a discrimination based on gender identity question to the topical modules.”

ACL Response: No action/change required to NSOAAP. ACL appreciates and values this feedback and will take these recommendations forward during the planning for the rotating topical modules.

Topic/Issue: Burden on Area Agencies on Aging

Comment: One organization sent a comment that “We commend ACL for its efforts to identify opportunities to reduce the reporting burden.”

ACL Response: No action/change required to NSOAAP. ACL appreciates and values this feedback.


Efforts to Consult Outside the Agency


ACL/AoA called upon the expertise of an expert panel to review NLSOAAP data collection tools and to make recommendations to ACL on selecting the best language to use for revising questions in the survey instruments. The NLSOAAP expert panel was comprised of experts on aging data and survey methodology.


The survey instruments for this proposed information collection are based on those developed by ACL/AoA POMP grantees representing State Units on Aging and AAAs. POMP grantees who have worked on the survey instruments include state and local level representatives from Arizona, Florida, Georgia, Massachusetts, New York, North Carolina, and Ohio. The development of the survey instruments has been an iterative process. There were no areas of disagreement during the latest POMP revisions.


The POMP grantees tested the instruments with service recipients at the local AAA-level using several methods:

  1. Field-tested the survey instruments with a sample of service recipients and revised the instruments based on their experience.

  2. Conducted cognitive testing to ensure that the items on the survey instruments were interpreted as intended.

  3. Conducted validity testing on the survey instruments.


A.9 Explanation of Any Payment or Gift to Respondents


No payments or gifts will be given to respondents for participating in either the cognitive testing or the NLSOAAP.


A.10 Assurance of Privacy Provided to Respondents


After review, ACL has determined that a Privacy Impact Assessment (PIA) is appropriate for this collection. A PIA was submitted to the PIA Representative at the Administration for Community Living in 2017 and will be reviewed on an annual basis following established guidance.


A pledge of privacy and anonymity is a major positive incentive for potential respondents to participate in the survey. Its absence would be a significant deterrent and could create complications in implementing the survey.


The contractor will take the following precautions to ensure the privacy and anonymity of all data collected:


  • All contractor project staff, including recruitment specialists, telephone interviewers, research analysts, and systems analysts, will be instructed in the privacy requirements of the survey and will be required to sign statements affirming their obligation to maintain privacy;

  • Only contractor staff who are authorized to work on the National Survey have access to client contact information, completed survey instruments, and data files;

  • Data files that are delivered will contain no personal identifiers for program participants; and

  • Analysis and publication of survey findings for the participant survey will be in terms of aggregated statistics only.


Appendix F presents the internal corporate “Assurance of Confidentiality Agreement” all contractor project staff must sign. This agreement requires the signer to keep confidential and private any and all information about individual respondents to which they may gain access. Any contractor employee who violates this agreement is subject to dismissal and to possible civil and criminal penalties.


Westat, the contractor for administering the survey instrument and collecting the data, has extensive experience in protecting and maintaining the privacy of respondent data collected from surveys. To ensure privacy, the contractor has drawn from its experience in designing the data collection procedures incorporated in this program. In addition to the corporate Assurance of Confidentiality Agreement, the contractor has implemented several other procedures to protect privacy of survey participants.


  1. Data is saved on secure network folders only accessible to authorized users. No data is ever stored on laptop computers. At the end of the survey, all private data is permanently deleted.

  2. For the baseline, AAAs will be instructed to submit private personally identifiable client data to Westat via electronic files using the secure survey web site. This web site is written in Active Server Pages (ASP), HTML, and JavaScript and uses industry-standard TLS (Transport Layer Security) 1.1/1.2 encryption for secure data submissions. Agencies will receive usernames and passwords that enable their staff to sign on to the file upload utility on the web site. The passwords are created by a password generator, which creates random passwords that are highly secure due to a combination of lower- and upper-case letters, numbers, and punctuation symbols (a minimal sketch of this approach appears after this list). The database containing the client survey data is not accessible via the Internet; it resides on a server inside the Westat firewall. Only contractor Data Collection Program staff members have access to the master survey database.

  3. For AAAs that may experience problems with the survey website and wish to send client data electronically by email, we instruct the AAAs to password protect the file containing the data. Password protection of client data sent electronically by email is required not only for transmission between the AAA and the contractor, but even internally within the contractor organization. Additionally, we provide the AAAs with an email address to a secure dedicated project email box ([email protected]) which cannot be accessed remotely.

  4. The small number of AAAs that are not able to generate client records by service electronically can submit client information in hard-copy format (fax, FedEx, U.S. Postal Service). Hard copies of client information are stored in locked filing cabinets within a locked room. At the conclusion of the survey, all hard copies of client data are shredded.

  5. A secure fax machine dedicated solely to this survey is used to receive faxes from AAAs that choose to transmit their data by fax. The fax machine is located within a locked project room. AAAs that need to transmit their data by fax are asked to call Westat staff to alert them to watch for and intercept the incoming fax. If the fax machine is busy, it does not roll over to any other fax machine.
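
The password-generation approach noted in item 2 can be sketched as follows. This is an illustrative example of generating a random password drawn from upper- and lowercase letters, digits, and punctuation; it is not the contractor's actual generator.

```python
# Illustrative sketch of a random password generator that mixes upper- and
# lowercase letters, digits, and punctuation (not the contractor's actual tool).
import secrets
import string

def generate_password(length: int = 16) -> str:
    if length < 4:
        raise ValueError("length must allow at least one character per class")
    classes = [string.ascii_lowercase, string.ascii_uppercase,
               string.digits, string.punctuation]
    # Guarantee at least one character from each class, then fill the rest.
    chars = [secrets.choice(c) for c in classes]
    alphabet = "".join(classes)
    chars += [secrets.choice(alphabet) for _ in range(length - len(chars))]
    secrets.SystemRandom().shuffle(chars)
    return "".join(chars)

print(generate_password())  # a different high-entropy password each run
```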


ACL/AoA will use the data provided by respondents for exclusively statistical purposes and will hold this information in confidence to the full extent permitted by law. Respondent data are aggregated and estimates are produced and published at both the national level and at the geographic regional or demographic sub-group level.


A pre-notification letter mailed to potential respondents contains essential survey information that enables the person to make an informed decision regarding his or her voluntary participation in the data collection effort. A sample of the pre-notification informational letter sent to potential survey participants appears in Appendix G as part of the information packet sent to the AAAs.


A.11 Justification for Sensitive Questions


This issue is not applicable to this data collection as there are no sensitive questions asked of respondents.

A.12 Estimates of Annualized Burden Hours and Costs


We estimated the respondent burden for the survey instruments based on our experience with the 1st through 12th National Surveys of OAA Participants. Agency respondents’ time is valued at $20.00 per hour (i.e., the median hourly rate for Community and Social Service Occupations according to the Bureau of Labor Statistics4), plus $20.00 per hour for the value of benefits and overhead (based on 100% of the hourly value), for a total of $40.00 per hour (i.e., $160 for the agency respondent selection process, estimated at 4 hours of agency personnel time).


The cost to respondents who participate in the Service Recipient and Caregiver surveys will be in terms of their time only, because most participants are retired and no benefits or overhead are applicable to this cost calculation. The Service Recipient and Caregiver survey instruments each take about 40 minutes (.67 hour). Based on the valuation of a participant's time at $24.00 per hour as volunteer time5, the respondent burden for each individual participant will be $16.08 for the Service Recipient survey and $16.08 for the Caregiver survey. Exhibit A-1 presents the estimated hour and annual cost response burden.




Exhibit A-1. Estimated Hour and Annual Cost Response Burden


Baseline

  • Area Agency on Aging, respondent selection process: 250 respondents; 1 response per respondent; 4.0 hours per response; 1,000 annual burden hours; $40.00 per hour; $40,000 annual cost.

  • Service Recipients (i.e., case management; congregate nutrition; home delivered nutrition; homemaker; and transportation): 4,400 respondents; 1 response per respondent; .6667 hours per response; 2,933 annual burden hours; $24.00 per hour; $70,392 annual cost.

  • National Family Caregiver Support Program Clients: 2,200 respondents; 1 response per respondent; .6667 hours per response; 1,467 annual burden hours; $24.00 per hour; $35,208 annual cost.

Year 2

  • Area Agency on Aging, respondent selection process: 0 respondents; no burden.

  • Service Recipients (i.e., case management; congregate nutrition; home delivered nutrition; homemaker; and transportation): 4,200 respondents; 1 response per respondent; .6667 hours per response; 2,800 annual burden hours; $24.00 per hour; $67,200 annual cost.

  • National Family Caregiver Support Program Clients: 2,100 respondents; 1 response per respondent; .6667 hours per response; 1,400 annual burden hours; $24.00 per hour; $33,600 annual cost.

Year 3

  • Area Agency on Aging, respondent selection process: 0 respondents; no burden.

  • Service Recipients (i.e., case management; congregate nutrition; home delivered nutrition; homemaker; and transportation): 4,000 respondents; 1 response per respondent; .6667 hours per response; 2,667 annual burden hours; $24.00 per hour; $64,008 annual cost.

  • National Family Caregiver Support Program Clients: 2,000 respondents; 1 response per respondent; .6667 hours per response; 1,333 annual burden hours; $24.00 per hour; $31,992 annual cost.

Total: 19,150 respondents; responses per respondent vary; .710 hours per response (weighted mean); 13,600 burden hours; cost per hour varies; $342,400 burden cost.

* It is important to note that not all of the individual respondents (6,600 for the national survey) will be asked to complete all of the questionnaire modules (see Sampling Plan).
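
As a cross-check on the arithmetic in Exhibit A-1, the short sketch below recomputes each row's burden hours and cost from the row inputs (respondents × responses × hours per response, rounded to whole hours, then multiplied by the hourly rate). This is an illustrative verification only, not part of the information collection.

```python
# Recompute Exhibit A-1 burden hours and costs from the row inputs (illustrative check).
rows = [
    # (activity, respondents, responses_each, hours_per_response, cost_per_hour)
    ("Baseline: AAA respondent selection", 250, 1, 4.0, 40.00),
    ("Baseline: service recipients",      4400, 1, 0.6667, 24.00),
    ("Baseline: caregiver clients",       2200, 1, 0.6667, 24.00),
    ("Year 2: service recipients",        4200, 1, 0.6667, 24.00),
    ("Year 2: caregiver clients",         2100, 1, 0.6667, 24.00),
    ("Year 3: service recipients",        4000, 1, 0.6667, 24.00),
    ("Year 3: caregiver clients",         2000, 1, 0.6667, 24.00),
]

total_hours = 0
total_cost = 0.0
for activity, n, responses, hours_each, rate in rows:
    burden_hours = round(n * responses * hours_each)  # e.g., 4,400 * .6667 -> 2,933
    cost = burden_hours * rate                        # e.g., 2,933 * $24 -> $70,392
    total_hours += burden_hours
    total_cost += cost
    print(f"{activity}: {burden_hours:,} hours, ${cost:,.0f}")

print(f"Total: {total_hours:,} hours, ${total_cost:,.0f}")  # 13,600 hours, $342,400
```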



A.13 Estimates of Other Total Annual Cost Burden to Respondents and Record Keepers


Total annual cost burden excluding wages, benefits, and overhead is zero (see Exhibit A-1).




A.14 Annualized Cost to the Federal Government


The overall contract cost to the Federal Government is $1,491,397. This amount includes cost for personnel, telephone, and other direct and indirect costs (see Exhibit A-2).


Exhibit A-2. Total Annualized Contract Cost to the Federal Government [Based on Year 1]


Personnel (T&M including staff & indirect costs): $1,370,614

Telephone (long-distance telephone survey): $60,573

Other direct: $42,953

Total direct charges: $1,474,139

Indirect charges: $17,268

Total: $1,491,397


The estimated expense for Federal staff related to this data collection is approximately 10% time for one social science analyst. An average salary for a GS-13, step 3, was used for this estimate ($102,126)6, which results in $10,213. Adding 100% for benefits and overhead brings the total cost for Federal staff to $20,425.


A.15 Explanation for Program Changes or Adjustments


This is a proposed revision to a previously approved data collection. The previously approved data collection covered a cross-sectional survey. This PRA package covers a three-year longitudinal survey, as well as cognitive testing of an updated survey instrument. The results of the cognitive testing will be incorporated into the final survey instrument, which will be administered in the longitudinal survey. When respondents are unreachable in either Year 2 or Year 3 of the data collection, interviewers will administer a brief questionnaire to the respondent’s contact person to determine the respondent’s status (e.g., currently receiving services or not currently receiving services) and the reason for the non-contact.


A.16 Plans for Tabulation & Publication and Project Time Schedule


Cognitive Testing


Cognitive testing focuses on how well the questions work in terms of clarity and flow. There will be no tabulation of responses other than respondents’ comments on how well the questions work. The contractor will provide recommended revisions to the NLSOAAP Tool to ACL based on the cognitive testing.


Plans for Tabulation


This section describes the range of analyses that will be conducted on the NLSOAAP performance measurement data. The contractor will clean data, impute, and create variables as needed; prepare all data documentation, including quantitative codebooks; generate frequencies, means, and other descriptive analyses; and conduct any required inferential statistics. The data will: (a) describe the characteristics of clients and the range of services provided by State Units on Aging (SUAs) and Area Agencies on Aging (AAAs); (b) provide a descriptive profile of OAA clients (including number of activities of daily living (ADL) and instrumental activities of daily living (IADL) limitations, income, educational level, living arrangements, age cohort, gender, race and ethnicity, and area of residence by degree of urbanization); and (c) highlight the performance measures of OAA programs.
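
A minimal sketch of the kind of descriptive tabulation described above is shown below, using the Python pandas library. The file name and variable names (age_cohort, service_type, adl_limitations, living_arrangement, urbanicity) are hypothetical assumptions; the actual variables, codebooks, and survey weighting procedures will be defined in the contractor's data documentation.

```python
# Illustrative sketch of descriptive tabulations (frequencies and means) on a
# hypothetical NLSOAAP analysis file; variable names are assumptions, and the
# real analysis would apply survey weights per the contractor's documentation.
import pandas as pd

df = pd.read_csv("nlsoaap_baseline.csv")  # hypothetical analysis file

# Frequency distribution of a client characteristic.
print(df["age_cohort"].value_counts(dropna=False))

# Mean number of ADL limitations by service type (descriptive profile).
print(df.groupby("service_type")["adl_limitations"].mean().round(2))

# Cross-tabulation of living arrangement by degree of urbanization.
print(pd.crosstab(df["living_arrangement"], df["urbanicity"], margins=True))
```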


Publication


The contractor will provide a report to ACL for each year of data collection. Results from the NSOAAP and the NLSOAAP are uploaded to the Aging Integrated Database (AGID), available on-line at https://agid.acl.gov/. Results are available annually.


Project Timeline


The timetable for the baseline data collection and the two follow-up data collections is shown in Exhibit A-3.


Exhibit A-3. Data Collection Timetable


Dates shown are end dates for each data collection activity.

Cognitive testing

  • Cognitive testing of survey instrument: June 30, 2018

Baseline (14th National Survey)

  • Telephone/email contact with agencies to draw the sample: May 31, 2019

  • Telephone survey of participants: September 30, 2019

  • Data editing, coding, and data analysis: November 30, 2019

  • Deliver data to ACL/AoA: December 31, 2019

  • Final report on baseline data collection: February 28, 2020

Year 2 (15th National Survey)

  • Telephone survey of participants: September 30, 2020

  • Data editing, coding, and data analysis: November 30, 2020

  • Deliver data to ACL/AoA: December 31, 2020

  • Final report on Year 2: February 28, 2021

Year 3 (16th National Survey)

  • Telephone survey of participants: September 30, 2021

  • Data editing, coding, and data analysis: November 30, 2021

  • Deliver data to ACL/AoA: December 31, 2021

  • Final report on longitudinal survey: February 28, 2022


A.17 Reason(s) Display of OMB Expiration Date is Inappropriate


ACL/AoA is not seeking an exemption from displaying the expiration date of OMB approval.


A.18 Exceptions to Certification for Paperwork Reduction Act Submissions


ACL/AoA is not requesting any exceptions from OMB Form 83-I.

