OMB: 0985-0055


PROCESS EVALUATION AND SPECIAL STUDIES RELATED TO THE LONG-TERM CARE OMBUDSMAN PROGRAM:

SUPPORTING STATEMENT FOR REQUEST FOR CLEARANCE: SECTION B







Prepared for:


Office of Performance and Evaluation

Center for Policy and Evaluation

Administration for Community Living

U.S. Department of Health and Human Services

Mary E. Switzer Building

330 C Street, SW, Room 1243

Washington DC 20201




Prepared by:


NORC at the University of Chicago

4350 East-West Hwy

8th Floor

Bethesda, MD 20814



Contract No. HHSP233201500048I






TABLE OF CONTENTS

B. Statistical Methods
B.1. Respondent Universe and Sampling Methods
B.2. Procedures for the Collection of Information
B.2.1. Statistical Methodology for Stratification and Sample Selection
B.2.2. Estimation Procedure
B.2.3. Degree of Accuracy Needed for the Purpose Described in the Justification
B.2.4. Unusual Problems Requiring Specialized Sampling Procedures
B.2.5. Use of Periodic (Less Frequent Than Annual) Data Collection Cycles
B.3. Methods to Maximize Response Rates and Deal with Nonresponse
B.4. Test of Procedures or Methods to be Undertaken
B.5. Individuals Consulted on Statistical Aspects and Individuals Collecting and/or Analyzing Data





B. Statistical Methods


B.1. Respondent Universe and Sampling Methods

ACL/AoA’s contractor, NORC at the University of Chicago, will collect information for the Process Evaluation and Special Studies Related to the Long-Term Care Ombudsman Program (LTCOP) (Contract # HHSP233201500048I) on behalf of ACL/AoA. The contractor is responsible for the design and administration of interview protocols and web-based surveys.


As presented in Section A, data collection is being conducted in two rounds. ACL seeks clearance for Round 1 and provisional clearance for Round 2, contingent upon receipt of the final revised surveys. Whereas Round 1 involves data collection from a selected number of Federal staff and national stakeholders as well as a census of State ombudsmen, Round 2 data collection will require a sampling plan for surveying local directors/regional representatives, local representatives, and volunteers.


This package includes the following data collection instruments:


  • Interview Protocols: In-person or telephone interviews will be conducted with 12 selected Federal staff and national stakeholders. Respondents located in the Washington, D.C. region will participate in an in-person interview. Respondents located outside the D.C. region will participate in a telephone interview. Telephone interviews will be conducted with State ombudsmen. The project team expects to speak with all 53 State ombudsmen.

  • Surveys: In addition to the telephone interview, State ombudsmen will complete a survey focusing on quantitative information about programs. In Round 2, surveys also will be administered to local directors/regional representatives, local representatives, and volunteers in 27 sampled states. In 2014, there were 1,293 paid program staff and 8,155 volunteers serving the program across the country and in Puerto Rico. For this study, approximately 600 local directors/regional representatives and local representatives and 2,000 volunteers will be asked to participate in the survey. We anticipate a 50 percent response rate for program staff and a 20 percent response rate for volunteers, which will enable us to meet the targeted sample sizes of 300 program staff and 400 volunteers needed to conduct the analyses.


B.2. Procedures for the Collection of Information

Data collection for the process evaluation will be carried out in two rounds and include interviews and web-based surveys. In the first round of data collection, in-person and telephone interviews will be conducted with a targeted group of 12 Federal staff and national stakeholders and all 53 State ombudsmen. State ombudsmen will also complete a web-based survey that will focus on quantitative program data.


ACL/AoA will first contact and notify all Federal staff, national stakeholders, and State ombudsmen who have been identified to participate in the process evaluation. Notifications will take place in January of 2017. The project team will then contact each individual by phone or email to schedule an interview. Verbal consent will be obtained prior to conducting face-to-face or telephone interviews with Federal staff, national stakeholders, and State ombudsmen. Interviews will last approximately 45 to 75 minutes. Federal staff, national stakeholders, and State ombudsmen will not receive any form of monetary or tangible compensation for participation.


State ombudsmen also will be asked to complete a web-based survey to obtain discrete data that are not being collected elsewhere. State ombudsmen will be contacted by email in January of 2017 with a request to complete the survey. Respondents will click a survey link in the email and enter a unique user ID and password. They will then be presented with a screen that provides a brief overview of the study, informs them about confidentiality and privacy, requests their voluntary participation, and provides a toll-free telephone number and email address for questions about the survey. Participants provide consent by clicking a button at the bottom of the consent screen. It is anticipated that it will take State ombudsmen approximately 35 minutes to complete the survey.


The second round of data collection involves the administration of web-based surveys to three respondent groups at the local level: local directors/regional representatives, local representatives, and volunteers. A sample of 27 states will be randomly selected to participate in these surveys. Once local programs in the 27 sampled states have been identified for inclusion in Round 2 data collection, the project team will work with the 27 State ombudsmen to optimize response rates among local staff and volunteers. The project team will ask ombudsmen (1) for assistance in contacting local directors/regional representatives and local representatives, and (2) to communicate the importance of the project team’s data collection efforts at the program level.


After initial communications from their State ombudsman, the project team will immediately follow up with local programs by email or phone call to solidify the contact, begin relationship-building, explain data collection activities, and provide staff with an approximate date when the project team will reach out again to staff and volunteers for data collection. This initial contact will also provide the project team with an opportunity to ask local program staff how best to communicate with staff and volunteers (including whether holding informational sessions would encourage participation), about any issues that may impact the ability to collect Round 2 data from sampled persons in each program, and about any other considerations that may impact Round 2 response rates. The project team will also use this opportunity to gather email addresses of both paid staff and volunteers (to the extent that this information is available). Following this call, the project team will mail one-page flyers to each local program that has been sampled for Round 2 data collection and ask program staff to distribute the flyers to all paid staff and volunteers to inform them of the upcoming data collection effort.


Respondents at the local level will be contacted by email in March of 2017 with a request to complete the survey. Respondents will click on a survey link in the email and enter a unique user ID and password. Survey participants will be presented with a screen that provides a brief overview of the study and informs them about confidentiality and privacy, requests their voluntary participation, and provides a frequently asked questions link, a toll-free telephone number, and an email address for questions about the survey. Participants provide consent by clicking a button at the bottom of the consent screen. Although the actual time to complete the web-based survey will vary depending on the respondent's role in the program, we estimate the survey will take approximately 35 minutes for local directors/regional representatives and local representatives and 30 minutes for volunteers.


For sampled persons who do not respond to the web-based survey, particularly older volunteers, the project team will also offer the option of paper instruments. This option will be offered if repeated requests to complete the web-based survey prove unsuccessful.


Paid staff (local directors/regional representatives and local representatives) will not receive any form of monetary or tangible compensation for participation. Volunteers will be compensated with a $25 gift card for their participation.


B.2.1. Statistical Methodology for Stratification and Sample Selection


Selecting Sites/Programs. A primary goal of the sampling strategy for the LTCOP process evaluation is to collect data that will generate robust estimates of key parameters in a manner that is generalizable to the LTCOP nationwide. The sampling procedure also needs to ensure that all 10 Administration on Aging (AoA) regions, as well as large and small programs, are adequately represented. These requirements call for a multistage stratified sampling approach.


For the process evaluation, the sampling plan begins with stratifying programs by the 10 AoA regions. To ensure that the diversity of both state and local programs is captured, we will identify a sample of 27 states. We will allocate the state sample to each region proportionally based on the total number of facilities in each region, with at least 2 states allocated to each region. Within each region, the state sample will be selected randomly with probability proportional to size, where size is defined as the number of facilities served by programs per state. We recommend including all programs from the 27 sampled states in the sample. If program-level sampling is necessary within each state, we recommend selecting an equal number of programs from each state with probability proportional to size, where size is defined as the number of facilities within the program catchment area. If desired, the sample can be supplemented with additional programs whose inclusion is requested by ACL. With this strategy, we simultaneously achieve two design objectives: (1) every region is represented in the sample; and (2) regions with more programs, even if they have fewer states, will be more heavily represented.
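To make the two-stage selection concrete, the following is a minimal sketch in Python, assuming placeholder facility counts and hypothetical state groupings (only three of the 10 AoA regions are shown). The largest-remainder apportionment and the systematic probability-proportional-to-size (PPS) draw are illustrative choices, not procedures specified by the plan.

import random

# Hypothetical facility counts by state, grouped by AoA region.
# Real counts would come from program data; three regions shown for brevity.
regions = {
    "Region I":  {"CT": 230, "MA": 420, "ME": 110, "NH": 90, "RI": 85, "VT": 45},
    "Region IV": {"AL": 420, "FL": 690, "GA": 360, "KY": 280, "MS": 200},
    "Region IX": {"AZ": 290, "CA": 1200, "HI": 45, "NV": 60},
}

def allocate_states(regions, n_total, minimum=2):
    # Allocate the state sample to regions proportionally to facility counts,
    # guaranteeing each region at least `minimum` states (largest-remainder method).
    totals = {r: sum(states.values()) for r, states in regions.items()}
    grand_total = sum(totals.values())
    spare = n_total - minimum * len(regions)
    shares = {r: spare * totals[r] / grand_total for r in regions}
    alloc = {r: minimum + int(shares[r]) for r in regions}
    leftovers = n_total - sum(alloc.values())
    for r in sorted(regions, key=lambda r: shares[r] - int(shares[r]), reverse=True)[:leftovers]:
        alloc[r] += 1
    return alloc

def pps_systematic(sizes, n):
    # Systematic PPS draw of n states; states serving more facilities are more
    # likely to be selected. (A state larger than the sampling interval would be
    # a certainty selection in practice; this sketch ignores that edge case.)
    step = sum(sizes.values()) / n
    points = [random.uniform(0, step) + k * step for k in range(n)]
    picks, cum, i = [], 0.0, 0
    for state, size in sizes.items():
        cum += size
        while i < len(points) and points[i] < cum:
            picks.append(state)
            i += 1
    return picks

allocation = allocate_states(regions, n_total=8)  # would be 27 with all 10 regions
sample = {r: pps_systematic(states, allocation[r]) for r, states in regions.items()}

With all 10 regions included, n_total would be set to 27, matching the plan's target of 27 states.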


Local Program Staff and Volunteers Survey. All local directors/regional representatives, local representatives, and volunteers in sampled programs will be asked to participate in the survey. Although definitive numbers will not be available until data on program size are obtained from State ombudsmen during Round 1 and programs are sampled, we estimate that approximately 600 subjects will be invited to participate in the local directors'/regional representatives' and local representatives' surveys and 2,000 subjects will be invited to participate in the volunteers' survey. We base these estimates on the fact that 27 states (approximately half of the state programs) will be included in the study and that there were 1,293 paid staff and 8,155 volunteers serving the program in 2014.


The table below provides an estimated timeline of data collection activities.


Activity                                 Estimated Start Date    Estimated End Date
Send Invitation Emails                   January 2017            May 2017
Send Reminder Emails                     March 2017              May 2017
Phone Prompting 1                        April 2017              May 2017
Phone Prompting 2                        April 2017              May 2017
Survey Program Staff and Volunteers      July 2017               August 2017



B.2.2. Estimation Procedure


In many research settings, a primary hypothesis is defined, that hypothesis is based on quantitative data, and sample size calculations are conducted to meet the specifications of that hypothesis. In the LTCOP evaluation, there is no single primary hypothesis, data collection consists of both qualitative and quantitative data, and in some cases, the goal of the research does not involve formal hypothesis testing. The evaluation is designed to answer multiple research questions using a variety of qualitative and quantitative research methods. To achieve this goal, our protocol calls for data collection from 12 Federal staff and national stakeholders, 53 State ombudsmen, 300 local program staff, and 400 volunteers, for a total of 765 individuals. The rationale for these sample sizes is provided below for each respondent class.


12 Federal Staff and National Stakeholders: Data collection from Federal staff is largely qualitative in nature; it is not designed to generate population-based estimates and will not be used for hypothesis testing or inferential statistics. We selected 12 staff based on input and priorities from ACL/AoA, along with suggestions from other LTCOP stakeholders. The 12 individuals to be interviewed can provide sufficient information concerning the intra- and inter-agency relationships that are of interest for the LTCOP evaluation. These individuals were selected in a purposive manner based on the needs of the project.


53 State Ombudsmen: ACL/AoA seeks information on the full spectrum of programmatic experiences involving state-level leadership of the LTCOP program. This goal is driven by the high degree of variability across states in how the LTCOP is administered, and capturing the programmatic implications of this variability is one of the objectives of the Evaluation. We will therefore interview all State Ombudsmen to ensure that the LTCOP evaluation contains information describing all State programs.


300 Local Directors/Regional Representatives and Local Representatives: Program data from 2014 indicate that there were 1,294 paid staff in the LTCOP. Given that project resources preclude data collection from all of these individuals, we opted to design a protocol in which we would identify a random sample from this universe in a manner that would yield robust estimates of population parameters. To this end, we used simple sample size calculation software to explore various scenarios concerning the sample size needed to generate population proportions from a universe consisting of 1,294 individuals. These calculations require specifying several parameters: a margin of error, a confidence level, a population size, and the likely proportion of the parameter to be estimated. Conventionally, conservative margins of error and confidence levels are specified at 5 percent and 95 percent, respectively, and the population size is defined as 1,294. Given that we will be estimating many population proportions, there is no single parameter that is appropriate for sample size calculations. However, for the purposes of these calculations, we selected 50 percent, the most conservative proportion (the one that yields the largest sample size), because this scenario would provide a sufficient sample size for all other proportions that we will encounter in the evaluation. With these parameters, the sample size needed to estimate a population proportion for program staff was 297. We are therefore targeting a sample of 300 program staff.


400 Volunteers: The same rationale described above for program staff applies to volunteers: a number of parameters will be estimated, and we chose 50 percent as the basis for calculating the sample size needed for robust population proportion estimates. Program data from 2014 indicated that there were 8,155 volunteer ombudsmen. Thus, with a 5 percent margin of error, 95 percent confidence, a population size of 8,155, and a population proportion of 50 percent, we will need a sample of 367 volunteers to generate robust population estimates. Based on this calculation, we are targeting 400 individuals in this class.
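The targets above follow from the standard formula for estimating a proportion with a finite population correction. The source does not name the calculation software used; the minimal Python sketch below, with an illustrative function of our own, reproduces both figures.

import math

def sample_size_for_proportion(N, p=0.50, margin=0.05, z=1.96):
    # Sample size needed to estimate a proportion within +/- margin,
    # with finite population correction; z = 1.96 for 95 percent confidence.
    effect = z**2 * p * (1 - p)
    return math.ceil(N * effect / (margin**2 * (N - 1) + effect))

print(sample_size_for_proportion(1294))  # 297, the paid-staff figure above
print(sample_size_for_proportion(8155))  # 367, the volunteer figure above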


B.2.3. Degree of Accuracy Needed for the Purpose Described in the Justification


The contractor will collect data from 12 Federal staff and national stakeholders, 53 State ombudsmen, 300 local program staff, and 400 volunteers, for a total of 765 individuals.

Sample Size Needs. An important consideration prior to developing our sampling plan for selecting the program study sites was the number of local program staff and volunteers required to generate robust estimates for the research questions outlined in the process and (future) outcome evaluation. Our calculations indicate that a sample size of approximately 300 paid staff and 400 volunteers will allow us to generate accurate estimates of key population-level parameters as well as to conduct bivariate analyses linking predictors and outcomes. The sample will then be used to generate frequencies and means that are representative of the population. We also will perform statistical tests (t-test and chi-square) to identify significant relationships between program characteristics and program process measures.
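As an illustration of the bivariate analyses described above, the sketch below runs a two-sample t-test on hypothetical, unweighted data; the variables and values are placeholders, and the production analysis would account for the survey design and weights.

import numpy as np
from scipy.stats import ttest_ind

# Hypothetical process measure (e.g., share of complaints resolved) for staff
# grouped by a program characteristic (small vs. large programs).
small_programs = np.array([0.62, 0.70, 0.55, 0.66, 0.71, 0.59])
large_programs = np.array([0.75, 0.68, 0.80, 0.77, 0.72, 0.79])

# Welch's t-test, which does not assume equal variances across groups.
t_stat, p_value = ttest_ind(small_programs, large_programs, equal_var=False)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")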


As noted above, we anticipate a 50 percent response rate for program staff and a 20 percent response rate for volunteers for the web-based survey. For program staff, we consider this estimated response rate to be reasonable because the survey takes a short amount of time to complete (35 minutes based on a pretest conducted with 2 local staff serving in the program), respondents are familiar with the use of email and the Web, and there is a high level of commitment among local staff to the program. For volunteers, we consider this estimated response rate to be reasonable because, although the survey takes a short amount of time to complete (30 minutes based on a pretest conducted with 2 volunteers serving in the program), respondents may be more difficult to reach depending on their level of participation in the program. However, we are prepared to handle a lower-than-expected response rate. If a lower response rate occurs in either group, we will conduct non-response bias tests to determine whether any bias may have been introduced into the sample.
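The document does not specify the form of the non-response bias tests; one common approach, sketched below with hypothetical counts, compares respondents and non-respondents on a characteristic known for everyone on the sampling frame.

import numpy as np
from scipy.stats import chi2_contingency

# Hypothetical counts of respondents and non-respondents by program size,
# a frame characteristic known for every sampled person.
#                  small  medium  large
table = np.array([[  40,     55,    25],   # responded
                  [  60,     45,    75]])  # did not respond
chi2, p_value, dof, expected = chi2_contingency(table)

# A small p-value suggests response propensity varies with program size,
# in which case a non-response weighting adjustment would be considered.
print(f"chi-square = {chi2:.2f}, p = {p_value:.3f}")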


B.2.4. Unusual Problems Requiring Specialized Sampling Procedures

There are no unusual problems requiring specialized sampling procedures.


B.2.5. Use of Periodic (Less Frequent Than Annual) Data Collection Cycles

There are no periodic data collection cycles associated with this study. The process evaluation is a one-time data collection.


B.3. Methods to Maximize Response Rates and Deal with Nonresponse

Since the start of the evaluation planning process, a key objective has been to ensure internal and external support for the study to maximize participation in the evaluation. To optimize State ombudsmen response rates for Round 1, two strategies will be used. We will (1) leverage professional conferences to elicit support and (2) use existing key dissemination outlets to share information on the evaluation and engage ombudsmen. Each strategy is discussed in more detail below.


  1. Professional Conferences: To attain a high response rate, ACL/AoA's contractor has identified and presented on the evaluation at relevant conferences. In 2015 and 2016, the project team presented at the Annual State Long-Term Care Ombudsman (SLTCO) Training Conference and the National Consumer Voice for Quality Long-Term Care Conference to obtain advance buy-in from study participants. The project team will continue to engage in these activities throughout the life of the evaluation.


  2. Existing Dissemination Outlets: ACL/AoA's contractor also will use key dissemination outlets, including newsletters and membership meeting calls, to optimize participation from State ombudsmen. The National Ombudsman Resource Center is the main training and technical assistance provider for the LTCOP. The Center produces Ombudsman Outlook, a quarterly e-newsletter for state and local/regional ombudsmen that is designed to share resources, tips, and news on the program. The Center has offered the project team space in the newsletter to share information on the evaluation and to address any questions that ombudsmen may have. In addition, the National Association of State Long-Term Ombudsman Programs (NASOP) is the membership organization for State ombudsmen. NASOP has offered the project team the opportunity to use the organization's regular calls to discuss the evaluation with State ombudsmen.


To optimize response rates in Round 2 among local program staff and volunteers, several strategies will be used, including (1) enlisting the support of State ombudsmen; (2) communicating the importance of the study to participants and ensuring ease of survey administration and completion; (3) offering monetary incentives to volunteers; (4) offering a paper-based option for completing the survey; (5) administering surveys at meetings and conferences attended by program staff and volunteers; and (6) communicating answers to frequently asked questions (FAQs) and other information on the evaluation through existing dissemination outlets. Each of these strategies is discussed in more detail below.


  1. Support of State ombudsmen: Once the local programs in the 27 states have been identified for inclusion in Round 2 of the data collection, the project team will work with the 27 State ombudsmen to optimize response rates among local staff and volunteers. The project team will ask ombudsmen (1) for assistance in contacting local directors/regional representatives and local representatives, and (2) to communicate the importance of the project team’s data collection efforts at the program level.


After initial communications from their State ombudsman, the project team will follow up with local programs by email or phone call to solidify the contact, begin relationship-building, explain the data collection activities, and provide local staff with an approximate date when the project team will reach out again to staff and volunteers for data collection. This initial contact will provide the project team with an opportunity to ask local program staff how best to communicate with staff and volunteers (including whether holding informational sessions would encourage participation), about any issues that may impact the ability to collect Round 2 data from sampled persons in each program, and about any other considerations that may impact Round 2 response rates. The project team will also use this opportunity to gather email addresses of both paid staff and volunteers to the extent that this information is available. Following this call, the project team will mail one-page flyers to each local program that has been sampled for Round 2 data collection and ask program staff to distribute the flyers to all paid staff and volunteers to inform them of the upcoming data collection effort.


  2. Communications and Ease of Administration: ACL/AoA also will craft messages to program staff and volunteers to assist in maximizing our final response rate. In April of 2017, ACL/AoA will send an email to program staff and volunteers that describes the study, its importance, and how the information will be used. The email will invite program staff and volunteers to complete the survey by clicking on a survey link. When they click on the survey link, the first screen will ask them to enter a user ID and password. The next screen provides a brief overview of the study, asks for voluntary participation, informs participants about confidentiality and privacy, and provides a frequently asked questions link, a link to a detailed description of the project, and a toll-free telephone number and email address for any questions. Reminder invitations will be sent to sampled persons who have not responded two and three weeks after the initial email, with information about the importance of their participation, why the results are important, and how the information will be used. The reminder email will also include the information necessary for them to complete the survey. At four weeks, reminder telephone calls will be placed to sampled persons who have not yet responded to the survey. During the final two weeks of data collection, sampled persons who have still not responded to the survey will be called again and encouraged to participate. Highly experienced interviewers will initiate these calls to optimize cooperation among non-respondents.


  3. Monetary Incentive for Volunteers: Program staff will not receive any form of monetary or tangible compensation for their participation in the study. Among volunteers, a monetary incentive ($25 gift card) will be used to encourage participation.


  4. Option of Paper-Based Surveys: For sampled persons who do not respond to the web-based survey, the project team will offer the option of paper instruments to increase response rates. This option may be particularly important to older volunteers serving the program.


  5. Professional Conferences: The project team will identify relevant meetings and conferences directed at program staff and volunteers where the surveys can be administered in groups. These include regional conferences (depending on the sampled states selected for study) for local ombudsmen as well as more nationally oriented conferences such as the National Association of Area Agencies on Aging Conference.


  6. Existing Dissemination Outlets: Similar to the approach undertaken with State ombudsmen, ACL/AoA's contractor will use key dissemination outlets, including newsletters and membership meeting calls, to optimize participation from regional and local ombudsmen. The National Ombudsman Resource Center is the main training and technical assistance provider for the LTCOP. The Center produces Ombudsman Outlook, a quarterly e-newsletter for state and local/regional ombudsmen that is designed to share resources, tips, and news on the program. The Center has offered the project team space in the newsletter to share information on the evaluation and to address any questions that ombudsmen may have. In addition, the National Association of Local Long Term Care Ombudsmen (NALLTCO) is the membership organization for local/regional ombudsmen. NALLTCO has offered the project team the opportunity to use the organization's regular calls to discuss the evaluation with local/regional ombudsmen. NALLTCO also produces a quarterly newsletter for local ombudsmen, which may be used to disseminate information about evaluation activities.


B.4. Test of Procedures or Methods to be Undertaken


The interview protocols and survey instruments have been drafted and have undergone three reviews: (1) an internal review conducted by NORC's Institutional Review Board, (2) a review by State ombudsmen at the Annual SLTCO Training Conference, and (3) a pretest, conducted to assess the reliability of the instruments, with five paid staff and volunteers currently serving in the program who represent diverse roles and program structures. The pretest was also administered to determine the burden placed on respondents. Slight revisions were made to the order and wording of a small number of questions based on comments received from these reviews.

Modifications to the content, structure, and length of the surveys have been made based on feedback received as well as results of survey pretest interviews. Respondents provided generally positive feedback indicating that they could readily answer the questions and that the time to complete the survey was not burdensome (approximately 30-35 minutes for all surveys).


The process evaluation interview protocols and web-survey instruments have been drafted, reviewed, and approved by all project staff and have undergone an internal review conducted by NORC's Institutional Review Board. After the instrument was programmed for the web, the survey underwent beta testing by all project staff prior to the launch of the questionnaire.


B.5. Individuals Consulted on Statistical Aspects and Individuals Collecting and/or Analyzing Data

The information for this study is being collected by NORC, a research organization, on behalf of ACL/AoA. With ACL/AoA's oversight, the contractor is responsible for the study design, instrument development, data collection, analysis, and report preparation.


The instruments for this study and the plans for statistical analyses were developed by ACL/AoA and its contractor. The project team is led by Dr. Kim Nguyen, Project Director, and includes senior-level staff Tim Mulcahy and Dr. Michael Yang and consultant Dr. Helaine Resnick. Contact information for these individuals is provided below.



Name                    Phone Number
Kim Nguyen, PhD         301-634-9495
Tim Mulcahy, MA         301-634-9352
Michael Yang, PhD       301-634-9492
Helaine Resnick, PhD    202-329-8616


