Process Evaluation and Special Studies Related to the Long-Term Care Ombudsman Program

OMB: 0985-0055





PROCESS EVALUATION AND SPECIAL STUDIES RELATED TO THE LONG-TERM CARE OMBUDSMAN PROGRAM:

SUPPORTING STATEMENT FOR REQUEST FOR CLEARANCE: SECTION A







Prepared for:


Office of Performance and Evaluation

Center for Policy and Evaluation

Administration for Community Living

U.S. Department of Health and Human Services

Mary E. Switzer Building

330 C Street, SW, Room 1243

Washington, DC 20201




Prepared by:


NORC at the University of Chicago

4350 East-West Hwy

8th Floor

Bethesda, MD 20814



Contract No. HHSP233201500048I




TABLE OF CONTENTS




LIST OF TABLES




ATTACHMENTS



Attachment 1: Federal Staff Interview Protocol

Attachment 2: National Stakeholders Protocol

Attachment 3: State Ombudsmen Protocol

Attachment 4: State Ombudsmen Survey

Attachment 5: Local Directors/Regional Representatives Survey

Attachment 6: Local Representatives Survey

Attachment 7: Volunteers Survey

Attachment 8: Federal Register Notice

Supporting Statement



A. Justification


A.1. Circumstances That Make the Collection of Information Necessary

The Administration for Community Living/Administration on Aging (ACL/AoA) is requesting approval from the Office of Management and Budget (OMB) for data collection associated with the Process Evaluation and Special Studies Related to the Long-Term Care Ombudsman Program (LTCOP) (Contract # HHSP233201500048I). The Older Americans Act (OAA) authorizes ACL to evaluate all named programs; statutory authority is contained in Title II, section 205(a)(2)(A). The goal of the LTCOP (established in section 712 of the OAA) is to protect and promote the health, safety, welfare, and rights of long-term care facility residents. Administered by ACL/AoA, LTCOPs operate in all 50 states, the District of Columbia, Puerto Rico, and Guam. Each of these programs is led by a full-time ombudsman who is responsible for statewide program administration and oversight of representatives of the office and volunteers who assist in carrying out the core activities of the program, including resolving residents' complaints, advocating for systemic change, and providing information and consultation to residents and their families. In 2014, 1,293 paid program staff and 8,155 volunteers served the program across the country and in Puerto Rico.


The purpose of the process evaluation is to obtain a thorough understanding of the LTCOP's structure and operations at the national, state, and local levels; use of resources to carry out legislative mandates; the nature of program partnerships; and processes for sharing information on promising program practices and areas for improvement. Data collection for the process evaluation consists of two rounds. ACL seeks clearance for Round 1 and provisional clearance for Round 2, contingent upon receipt of the final marked-up surveys. The first round focuses on obtaining information from three respondent categories at the national and state levels: Federal staff, national stakeholders, and State ombudsmen. Data collection from these respondents will help inform and refine the second round of data collection, which focuses on obtaining information from respondents at the local level: local directors/regional representatives, local representatives, and volunteers. For example, data collected in Round 1 will inform the subsequent data collection for Task 7 of the project, which investigates how the ombudsman program has been addressing and affecting the changing landscape of long-term services and supports (LTSS). State ombudsmen's responses to questions about reforms in LTSS and home and community-based care will help identify states for further study and inform data collection at the local level. This package addresses these two rounds of the project, with an emphasis on the first round. The package includes the following data collection instruments to be reviewed:


  • Interview Protocols: Depending on their location, in-person or telephone interviews will be conducted with 20 selected Federal staff and national stakeholders. Respondents located in the Washington, D.C. region will participate in an in-person interview. Respondents located outside of the metropolitan area will participate in a telephone interview. Telephone interviews will be conducted with all 53 State ombudsmen.

  • Surveys: In addition to the telephone interview, State ombudsmen will complete a survey focusing on quantitative information about programs. In 27 sampled states, surveys also will be administered to local directors/regional representatives, local representatives, and volunteers. Approximately 300 local directors/regional representatives and local representatives and 400 volunteers are expected to complete the survey.


This information is not currently being collected in any other form, thus necessitating the current data collection request to achieve the goals of the evaluation.


A.2. Purpose and Use of Information Collection


As presented in Section A.1., the purpose of the process evaluation is to obtain a clear and comprehensive understanding of the LTCOP’s structure and operations at the national, state and local levels; use of resources to carry out legislative mandates; the nature of program partnerships; and processes for sharing information on promising program practices and areas for improvement. Throughout the evaluation, a key aim is to obtain stakeholder buy-in, feedback on, and participation in the study. ACL/AoA anticipates that this study will: provide practical and policy-relevant insight into LTCOP services and processes; highlight promising program practices; and ultimately, provide critical information to enable ACL/AoA to better protect greater numbers of vulnerable elders.


A secondary goal of the process evaluation is to lay the foundation for a possible future outcome evaluation. In this regard, ACL/AoA is relying on the process evaluation to provide essential information for designing a subsequent outcome evaluation that assesses the effect of program services for beneficiaries, including consumers, facilities, stakeholders, and/or programs themselves.


Data collection for the process evaluation will be carried out in two rounds involving personal interviews and web-based surveys. In the first round, in-person and telephone interviews will be conducted with 20 selected Federal staff and national stakeholders and all 53 State ombudsmen. The goals of the interviews are to understand: the long-term care context in which the program operates; program operations, monitoring, and interactions; successful programmatic approaches; challenges to program implementation; inter-organizational relationships; and implementation of new ombudsman regulations.


During the first round of data collection, State ombudsmen will also complete a web-based survey that will focus on collecting quantitative data about programs. The goals of the survey are to understand: program structures, operations, and interactions; staffing and management responsibilities; adequacy of resources; the nature of partnerships; data management capacity and practices; use of volunteers; program autonomy; unique features of programs; and demographic characteristics.


The second round of data collection involves the administration of a web-based survey to three respondent groups at the local level: local directors/regional representatives, local representatives, and volunteers. The goal of the surveys is to understand program operations at the local level. The surveys for local directors/regional representatives and local representatives include questions on the respondent’s background before joining the LTCOP; program operations, monitoring and interactions; adequacy of resources; main responsibilities; the nature of partnerships; successful practices and areas for improvement; training and technical assistance; data management capacity and practices; unique features of programs; and demographic characteristics. The surveys for volunteers include questions on the respondent’s background before joining the LTCOP; program operations and interactions; main responsibilities; the nature of partnerships; successful practices and areas for improvement; training and technical assistance; and demographic characteristics.


The contractor will use the data collected to answer the process evaluation’s key research questions provided below in Exhibit 1.

Exhibit 1: Key Evaluation Questions


  1. How is the LTCOP structured and how does it operate at the local, State, and Federal levels? Whom does the program serve, how is it staffed, and what data are collected about its activities and outcomes?

  2. How do LTCOPs use existing resources to resolve problems of individual residents and to bring about changes at the facility and governmental (local, State, and Federal) levels that will improve the quality of services available/provided?

  3. With whom do LTCOPs partner, and how do LTCOPs work with partner programs?

  4. How does the LTCOP provide feedback on successful practices and areas for improvement?


A.3. Use of Improved Information Technology and Burden Reduction

A self-administered, web-based survey will be used to gather data from State ombudsmen, local directors/regional representatives, local representatives, and volunteers. (Please see the sampling plan in Section B for more detail.) Through their experience working with the LTCOP, individual respondents will have access to and be familiar with the technology necessary to complete the survey. The survey will be administered electronically to minimize the burden on respondents. The web-based survey permits respondents to complete the survey at their preferred time. Respondents who begin the survey and are unable to complete it in one attempt will be able to save their responses and resume work on the survey at a later time. The web-based format will incorporate skip patterns that ensure that respondents automatically skip past sections of the survey that are not relevant to their experiences. The study will have a centralized case management system (CMS), linked to the web survey, as well as prompting and receipt control systems, allowing real-time case status reviews. The CMS will assist follow-up efforts with non-respondents, ensuring that no sample member is prompted again for a survey response once they have completed the web survey. All respondent groups will be emailed an invitation letter with instructions on web survey access, including a unique Personal Identification Number (PIN) and password. This initial contact will be followed by additional emails to maximize the response rate. If necessary, follow-up phone calls may be used to encourage participation when email prompts fail; to limit burden, we will not call respondents more than twice.


For respondents who are not responsive to the web-based survey, the option of paper instruments also will be made available. Given that one of the respondent groups is volunteers, the option of paper-based surveys will help accommodate the variety of individuals who volunteer, some of whom are older adults themselves.


A.4. Efforts to Identify Duplication and Use of Similar Information

The information sought as part of this study is unique. The information necessary for this evaluation has not been collected elsewhere in any format that could be adapted to address the research objectives of the evaluation. Based on our thorough review of existing data sources, no survey or other mode of data collection has captured the needed information on the LTCOP’s processes and procedures, adequacy of resources, nature of partnerships, and processes for sharing successes and challenges. No other data are currently being collected to answer these specific research questions and to establish the framework needed for a future outcome evaluation. However, any existing information that might be useful to address the research questions can and will be used whenever possible.


A.5. Impact on Small Businesses or Other Small Entities

No small businesses are involved, as respondents are Federal staff, national stakeholders, State ombudsmen, local directors/regional representatives, local representatives, and volunteers.


A.6. Consequences if Information Collected Less Frequently

Both the interviews and surveys will be administered only once. This information is not currently being collected in any other form, making the current data collection request necessary for achieving the goals of the evaluation.


A.7. Special Circumstances Relating to the Guidelines of 5 CFR 1320.8(d)

This data collection request is fully consistent with the guidelines in 5 CFR 1320.8(d). There are no special circumstances associated with this collection of information.


A.8. Comments in Response to the Federal Register Notice and Efforts to Consult Outside the Agency

In accordance with the Paperwork Reduction Act of 1995, the notice required in 5 CFR 1320.8(d) has been published in the Federal Register announcing ACL/AoA's intention to request an OMB review of data collection activities. This notice was published on June 28, 2016, in Volume 81, Number 124, on page 41975, and provided a 60-day period for public comment. A copy of the Federal Register notice is included as Attachment 8. ACL/AoA did not receive any comments on the 60-day notice for the LTCOP PRA.




The interview protocols and surveys were developed by ACL/AoA and its contractor, NORC at the University of Chicago. Input and feedback on the interview protocols and surveys were sought from ACL/AoA staff, State ombudsmen, and local directors. During the Annual State Long-Term Care Ombudsman (SLTCO) Training Conference held in Lexington, KY, in April 2016, State ombudsmen and local directors in attendance reviewed the protocols and surveys and provided feedback and suggestions to ACL/AoA's contractor. Subsequently, meetings were held between ACL/AoA and its contractor to discuss and revise the interview protocols and surveys. In addition, the survey instruments were pretested with a small group of 5 local representatives and volunteers and 2 former State ombudsmen.


Since September 2015, ACL/AoA has consulted with the following persons regarding this information collection:

  • Dr. Lauren Harris-Kojetin, National Center for Health Statistics

  • Dr. Ellen Kramarow, National Center for Health Statistics

  • Dr. Charlene Harrington, University of California, San Francisco

  • Deborah Merrill, National Association of States United for Aging and Disabilities

  • Dr. Brooke Hollister, University of California, San Francisco

  • Lori Smetanka, National Consumer Voice for Quality Long-Term Care

  • Robyn Grant, National Consumer Voice for Quality Long-Term Care


A.9. Explanation of any Payment or Gift to Respondents

On behalf of ombudsman programs, thousands of volunteers work in hundreds of communities throughout the country, assisting residents and their families and providing a voice for those unable to speak for themselves. Many are older adults themselves. A gift card of $25 will be offered to volunteer respondents for their participation in the data collection. No other respondent group will be offered any payments or gifts for their participation in the study.

A.10. Assurance of Confidentiality Provided to Respondents

Participation in this study is voluntary. Respondents will be told the purposes for which the information is collected and that, in accordance with this statute, any identifiable information about them will not be used or disclosed for any other purpose.


In the first round of data collection, ACL/AoA will contact and notify all Federal staff, national stakeholders, and State ombudsmen who have been identified to participate in the process evaluation. Notifications will take place in January of 2017. The project team will then call or email each individual about scheduling an interview. Verbal consent will be obtained when conducting face-to-face or telephone interviews with Federal staff, national stakeholders, and State ombudsmen. The project team member leading the discussion will be responsible for seeking consent from the subjects prior to asking any questions. Verbal consent will be obtained after the informed consent script is read verbatim to all interviewees. The consent script provides a brief overview of the project and informs the respondent that the interview is completely voluntary, that they may skip any question or terminate the interview at any point, and that the information provided to the contractor will be summarized in a report in which no individual respondent's name will be identified. If the subject consents to participate in the discussion, the interview will proceed; if the subject declines to consent, the interview will not be conducted.


State ombudsmen also will be asked to complete a web-based survey in order to obtain discrete data that are not being collected elsewhere. State ombudsmen will be contacted by email in January of 2017 requesting that they complete the survey. Respondents will click on a survey link in the email and will be required to enter a unique user ID and password. Survey participants will first see a screen that provides a brief overview of the study, informs participants about confidentiality and privacy, requests their voluntary participation, and provides a toll-free telephone number and email address for participants who have questions about the survey. By clicking a button at the bottom of the consent screen, the survey participant provides their voluntary consent to participate in the survey.


The second round of data collection will focus on gathering information from a representative sample of respondents at the local level, including local directors/regional representatives, local representatives, and volunteers. Proposed Round 2 survey instruments are included in this package for context but will be finalized based on the results of the Round 1 data collection. Once they are finalized, ACL will publish another 30-day Federal Register notice to elicit further public comment. The final Round 2 surveys will then be submitted to OMB for approval. These respondents will be contacted by email in Spring/Summer of 2017 requesting that they complete the survey. Respondents will click on a survey link in the email and will be required to enter a unique user ID and password. Survey participants will first see a screen that provides a brief overview of the study, informs participants about confidentiality and privacy, requests their voluntary participation, and provides a frequently asked questions link, a toll-free telephone number, and an email address for participants who have questions about the survey. By clicking a button at the bottom of the consent screen, the survey participant provides their voluntary consent to participate in the survey. For respondents who are not responsive to the web-based survey, particularly older volunteers, the project team will also offer the option of paper instruments. This option will be offered to respondents if requests to complete the web-based survey prove unsuccessful.


Data collection procedures will incorporate numerous safeguards for the data. While collecting data, information that could identify a particular sample member will be stored in a separate file from survey data collected from that person. Each sample member will be assigned a unique identifier, and this identifier will be used to store identifying information (such as name, address, etc.) in a separate database from the survey response data.


The contractor will be collecting names, telephone numbers, and email addresses as part of the survey. This information will be used for contacting respondents for a follow-up study, if necessary. If it is decided that respondents will not be contacted for the outcome evaluation, however, the team will destroy the personally identifiable information. With regard to confidentiality, responses will be de-identified and subsequently tracked by ID number only. The survey data will be tabulated and analyzed statistically with no individual names or responses ever identified. Names, telephone numbers, and email addresses will be retained in a secure location on NORC’s secure server farm, available only to authorized project staff to use as part of the future follow-up survey for the outcome evaluation. Data will be coded such that obvious identifiers will be substituted with a unique identifying number. The contractor will retain a master list linking study codes and direct identifiers. The master list will be saved on the contractor’s secure servers. All systems used to store electronic survey data are secure by design and protected by passwords only available to authorized study staff.


Special steps will be taken to ensure that data collected via the Web questionnaire are secure. First, access to the Web instrument is allowed only with a valid Personal Identification Number (PIN) and password. Second, data will be transmitted using the Secure Sockets Layer (SSL) protocol, which uses powerful encryption during transmission over the Internet. If a respondent keeps a Web survey open without any activity, the Web server will close the survey after a short period of inactivity, preserving the data up to the break-off point and securely closing the connection. Both development and production servers are backed up nightly.


ACL/AoA and its contractor will publish aggregate statistics of the survey responses in a report, along with information obtained during the interviews. Individual respondents will not be identified in any report, publication, or presentation of this study or its results. Upon completion of the project, the names, telephone numbers, and email addresses of the survey respondents will either be retained for contacting respondents in the next round of the study (the outcome evaluation) or destroyed, as described above. All interview notes will be destroyed upon completion of the project.


A.11. Justification for Sensitive Questions

The surveys will ask respondents to self-identify their race and ethnicity using the federally approved questions from the U.S. Census. These questions are necessary in order to conduct subgroup analyses and to understand the characteristics of State ombudsmen, local directors/regional representatives, local representatives, and volunteers.


No other questions of a sensitive nature are asked during the interviews or surveys.


A.12. Estimates of Annualized Hour Burden and Costs

The contractor will interview 20 selected Federal staff and national stakeholders (estimated burden of 60 and 45 minutes per interview, respectively) and 53 State ombudsmen (75 minutes estimated burden). All 53 State ombudsmen also will be asked to complete a survey, which is estimated to take 35 minutes to complete.


ACL/AoA estimates contacting approximately 600 local directors/regional representatives and local representatives to complete the web-based survey. Of this number, we anticipate obtaining responses from 50 percent of the sample (300 respondents). We consider this response rate to be a reasonable estimate because the survey takes a short amount of time to complete (35 minutes, based on a pretest conducted with 3 local staff serving in the program), respondents are familiar with email and the Web, and there is a high level of commitment to the program among local staff.


ACL/AoA estimates contacting approximately 2,000 volunteers to complete the web-based survey. Of this number, we anticipate obtaining responses from 20 percent of the sample (400 respondents). We consider this lower response rate to be a reasonable estimate because, although the survey takes a short amount of time to complete (30 minutes, based on a pretest conducted with 2 volunteers serving in the program), volunteers may be more difficult to reach depending on their level of participation in the program.
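As an arithmetic check on the burden estimates, each total in Exhibit 2 is the product of the expected number of respondents and the average burden per response: for example, 300 local directors/regional representatives and local representatives × 35 minutes (0.58 hour) = 175 hours, and 400 volunteers × 30 minutes (0.5 hour) = 200 hours.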


Exhibit 2 presents estimates of the reporting burden for respondents and Exhibit 3 presents estimates of the burden cost.


Exhibit 2: Estimated Burden Hours

| Forms | Type of Respondent | Number of Respondents | Responses per Respondent | Average Burden Hours per Response | Total Burden Hours |
| Interview Protocols | Federal Staff | 6 | 1 | 1 hour (60 minutes) | 6 hours |
| Interview Protocols | National Stakeholders | 6 | 1 | 0.75 hour (45 minutes) | 4.5 hours |
| Interview Protocols | State Ombudsmen | 53 | 1 | 1.25 hours (75 minutes) | 66.25 hours |
| Surveys | State Ombudsmen | 53 | 1 | 0.58 hour (35 minutes) | 30.92 hours |
| Surveys | Local Directors/Regional Representatives and Local Representatives | 300 | 1 | 0.58 hour (35 minutes) | 175 hours |
| Surveys | Volunteers | 400 | 1 | 0.5 hour (30 minutes) | 200 hours |
| Total | | 818 responses from 765 unique individuals | | | 482.67 hours |



Exhibit 3: Estimated Burden Costs

| Type of Respondent | Number of Respondents | Total Burden Hours | Average Hourly Wage Rate* | Total Cost |
| Federal Staff | 6 | 6 | $67.30 | $403.80 |
| National Stakeholders | 6 | 4.5 | $67.30 | $302.85 |
| State Ombudsmen (interview) | 53 | 66.25 | $67.30 | $4,458.63 |
| State Ombudsmen (survey) | 53 | 30.92 | $67.30 | $2,080.92 |
| Local Directors/Regional Representatives and Local Representatives | 300 | 175 | $44.64 | $7,812.00 |
| Volunteers | 400 | 200 | - | - |

*Note that the average hourly wages for State Ombudsmen and Local Directors and Representatives include overhead and benefits. These estimates were generated by the National Ombudsman Resource Center and were previously approved for NORS OMB # 0985-0005, with an expiration date of 01/31/2019.
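As an arithmetic check on the cost estimates, each total cost in Exhibit 3 is the product of total burden hours and the average hourly wage rate: for example, the State ombudsman interview cost is 66.25 hours × $67.30 = $4,458.63, and the local directors/representatives survey cost is 175 hours × $44.64 = $7,812.00. Volunteers are not assigned an hourly wage rate in Exhibit 3, so no cost is estimated for their time.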


A.13. Estimates of Other Total Annual Cost Burden to Respondents and Record Keepers

There are no annualized capital/startup or ongoing operation and maintenance costs involved in collecting the information. Other than their time to complete the survey or interview, which is estimated in Exhibit 2, there are no direct monetary costs to respondents.


A.14. Estimates of Annualized Costs to the Federal Government

The estimated cost to the Federal Government for the Process Evaluation and Special Studies Related to the Long-Term Care Ombudsman Program (LTCOP) data collection activities is $312,152.03. This is the cost of the contract with our Federal contractor, NORC at the University of Chicago, for the data collection activities associated with this submission.


A.15. Explanation for Program Changes or Adjustments

No change in burden is requested. This submission to OMB is for an initial request for approval.


A.16. Plans for Tabulation and Publication and Project Time Schedule

A study report will be based on the findings from an analysis of the information obtained through the process evaluation, as well as other forms of information provided through program administrative data and a literature review. The final report will be published on the ACL website. The final report will include the following sections:

  • Executive Summary. The executive summary will be written in a manner that makes it useful as a stand-alone document for individuals who do not have time to review the entire report. It will highlight the objectives, key findings, and the implications of these findings for the program.

  • Methodology. This section will describe the methods used for developing, implementing and analyzing the interview protocols and surveys.

  • Key Issues and Findings. This section will discuss findings around each of the key research questions.

  • Conclusions. Conclusions will include recommendations or suggestions for future research and policy initiatives.


Analysis will begin shortly after the final data are collected in May 2017. The project team will analyze the data using basic frequencies and cross-tabulations. Simple statistical tests (t-tests and chi-square tests) also will be used to identify significant relationships between program characteristics and program processes.


Exhibit 4 provides the reporting schedule for the entire study.


Exhibit 4: Timetable for Data Collection and Publication for Other Data Collection Efforts


| Activity | Estimated Start Date | Estimated End Date |
| Develop Instruments for Data Collection | | |
| Develop interview protocols | March 2016 | May 2016 |
| Develop surveys | March 2016 | May 2016 |
| Obtain IRB approval (received August 24, 2016) | | |
| Obtain OMB approval | | |
| Develop Sampling Plan | | |
| Develop sampling frame of states for study selection | January 2017 | March 2017 |
| Draft plan for local data collection | February 2017 | March 2017 |
| Finalize selection of states for process evaluation | February 2017 | March 2017 |
| Implement Data Collection | | |
| Conduct interviews | January 2017 | February 2017 |
| Survey State ombudsmen | January 2017 | February 2017 |
| Survey local staff (directors, representatives and volunteers) | April 2017 | August 2017 |
| Draft Reports | | |
| Topical briefs | July 2017 | August 2017 |
| Final report | July 2017 | September 2017 |


A.17. Reason(s) Display of OMB Expiration Date Is Inappropriate

All data collection materials will display the OMB expiration date.


A.18. Exceptions to Certification for Paperwork Reduction Act Submissions

ACL/AoA certifies that the collection of information encompassed by this request complies with 5 CFR 1320.9 and the related provisions of 5 CFR 1320.8(b)(3).



