OMB Control Number: 0938-1160
CMS National Balancing Indicators Project (NBIP) Direct Service Workforce Data Collection Effort

OMB Supporting Statement for
Paperwork Reduction Act Submissions

Supporting Statement – Part B

Collections of Information Employing Statistical Methods

B. Statistical Methods

  1. Respondent Universe and Sampling Methods

To ensure an adequate number of responses across worker types, recipient populations, and types of workplace settings, the approach in each of the SPT states is to survey the entire universe of eligible individual workers and agencies. The only exception will be the Michigan survey of individual independent providers (IPs). Michigan surveyed IPs in the state’s waiver program for seniors and people with physical disabilities in 2010, and may resurvey a sample of individual workers in other programs, specifically Medicaid personal care services and waivers for people with intellectual and developmental disabilities. The budget is based on a 5% sample of 55,000 workers (approximately 2,750 surveys).

The Individual Worker Survey will be mailed to individual workers in each state who provide services to older adults, people with physical disabilities, and people with intellectual/developmental disabilities who are enrolled in “participant-directed” services funded by Medicaid. In cases where contact information for the workers is not available, states will send the survey to the person receiving services and ask them to distribute it to their worker(s). This will be a census of individual workers in each state, so the numerical population will differ from state to state.

The Employer Organization Survey will go to organizations in each state that employ individuals in the direct service workforce who provide services to people enrolled in Medicaid-funded home and community-based or personal care programs. Specifically, the surveys will go to organizations that employ direct service workers and serve older adults, people with physical disabilities, and people with intellectual or developmental disabilities in home and community settings. At a minimum, this will include providers serving individuals in Medicaid waiver programs for the aging and people with physical disabilities, waiver programs for people with intellectual and developmental disabilities, and Medicaid state plan personal care services in states that offer the personal care option. This will be a census of provider/employer organizations in each state that fall into any of the categories described above, so the numerical population will differ from state to state.

Each SPT grantee will compile the lists of individual workers and employer organizations in their state that should receive the surveys. The DSW Resource Center team is assisting the states in compiling the lists. The mailing lists should include:

  • Federal EIN# (for employer organization list only)

  • Unique provider ID#

  • Provider/employer organization name

  • Name of individual within the organization who will receive the survey (if available)

  • Provider/employer organization mailing address

  • Waiver program/Medicaid program

  • Service(s) provided

  • Phone number (if available)

  • Email address (if available)

Wherever possible, surveys will be sent to state corporate headquarters for organizations that operate multiple sites around the state. The DSW Resource Center will work with each SPT grantee to develop a system for attaching a unique identifier to each provider in their list, to allow for targeted follow-up to non-responders and prevent respondents from completing more than one survey.
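The supporting statement does not specify how the unique identifiers will be generated. As one illustrative approach only, the Python sketch below assigns a random, non-guessable survey ID to each record in a provider mailing list kept as a CSV file; the file and column names are hypothetical and not part of the project’s materials.

```python
import csv
import uuid

# Hypothetical provider mailing list compiled by an SPT grantee.
INPUT_FILE = "provider_list.csv"
OUTPUT_FILE = "provider_list_with_ids.csv"

with open(INPUT_FILE, newline="") as infile:
    rows = list(csv.DictReader(infile))

# Attach a short random identifier to each record. The same ID is printed on
# the mailed survey and kept in the tracking list, which allows targeted
# follow-up to non-responders and flags duplicate responses.
for row in rows:
    row["survey_id"] = uuid.uuid4().hex[:8].upper()

# Verify there are no accidental collisions before printing surveys.
assert len({row["survey_id"] for row in rows}) == len(rows)

with open(OUTPUT_FILE, "w", newline="") as outfile:
    writer = csv.DictWriter(outfile, fieldnames=list(rows[0].keys()))
    writer.writeheader()
    writer.writerows(rows)

print(f"Assigned survey IDs to {len(rows)} providers.")
```

The output file can then feed the mail merge described under Recruitment Strategy below, so each printed cover letter and survey carries its record’s ID.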

The estimated universe of workers and agencies to be surveyed in each state is summarized in Exhibit 6.

Exhibit 6: Data Collection Target Survey Participants by Type

| State | Individual Worker Survey | Employer Organization Survey: Aging/PD Waiver Providers and Personal Care | Employer Organization Survey: IDD Waiver Providers |
|---|---|---|---|
| Arkansas | Approximately 4,500 workers | Approximately 4,840 employer organizations | Approximately 600 employer organizations |
| Florida | Approximately 2,500 workers | Approximately 8,200 employer organizations | Approximately 6,000 employer organizations |
| Kentucky | Approximately 3,500 workers | Approximately 1,000 employer organizations | Approximately 1,000 employer organizations |
| Massachusetts | N/A | Approximately 5,200 employer organizations | Approximately 1,000 employer organizations |
| Maine | Approximately 800-1,000 workers | Approximately 1,100 (Aging/PD) and 8,000 (Personal Care) | Approximately 520 employer organizations |
| Michigan | Approximately 2,700 workers | Approximately 1,300 employer organizations | Approximately 1,600 employer organizations |
| Minnesota | Approximately 2,500 workers | Approximately 8,400 employer organizations | Approximately 2,800 employer organizations |
| Total | ~16,600 individual workers | ~51,560 employer organizations (both provider categories combined) | |



  2. Recruitment Strategy

The DSW Resource Center Team will produce electronic versions of the postcards, surveys, cover letters, and instructions for each SPT grantee to print and mail. SPT grantees will use a mail merge to individualize the cover letters and attach the unique identifier to surveys before printing. In addition, we will suggest that states also email potential respondents for whom they have email addresses. The SPT grantees will have completed all editing and customizing of the survey questions, answer options, and definitions by that point, and no further edits will be made. Surveys may be mailed to respondents either as stand-alone mailings or included with other correspondence sent directly to potential respondents. For example, an SPT grantee may include the independent provider surveys with the providers’ paychecks when they are mailed.

The survey packet that all potential respondents receive should include:

  • An individually addressed cover letter explaining that the survey is voluntary, describing the purpose of the survey and how the information will be used, and assuring respondents that individual responses will be kept private.

  • The survey tool and the instructions for completing the survey online.

  • A self-addressed, stamped envelope for returning the completed survey to the SPT grantee (or other organization designated by the SPT grantee).

We expect a response rate of 40%, based on our review of the results of similar surveys of direct service workers and employer agencies. Experience from similar studies of direct service workers shows generally low response rates, although rates have been higher when incentives were used.1

  3. Methods to Maximize Response Rates and Deal with Non-Response

To achieve a minimum response rate of 40% for both surveys in all states, the DSW Resource Center Team will work with the SPT grantees on a number of steps to maximize response rates, including but not limited to the following.

In Advance of Surveying:

  • We will encourage SPT grantees to communicate and collaborate with stakeholders such as state agencies, waiver agents, Area Agencies on Aging, provider associations, financial management services vendors, worker associations and consumer groups about the purpose of the survey. Stakeholders such as these can play an important role in encouraging survey recipients to respond.

  • SPT grantees will send a “heads-up” postcard or announcement two weeks in advance of the survey to let respondents know it is coming and to test the mailing list; this will allow mailing lists to be corrected before surveys are mailed. The announcement will be signed by a senior agency official and personally addressed to the respondent whenever possible. It will explain the importance of the survey and how long it will take to complete, provide a contact number for more information, and give assurance of privacy, noting that participation is voluntary and will have no impact on the recipient’s status as a vendor or provider of services.

  • Mailing lists will be updated with address corrections returned from the “heads-up” postcard mailing and with USPS address updates.

Surveying:

  • Although survey recipients will be asked to respond within two weeks, both surveys will be in the field for eight weeks to allow for completion and follow-up to non-responders.

  • Independent providers who complete the survey will be sent a $10 debit card. Employer organizations will not receive monetary incentives, but will be offered copies of the final reports summarizing survey findings.

Following Surveying:

  • The SPT grantees will mail all non-respondents a follow-up postcard two weeks after the initial survey packet is mailed to remind and encourage providers to complete and return the survey by the due date.

  • Two weeks after the reminder postcard is mailed, the DSW Resource Center Team will work with each SPT grantee to review and analyze the group of responders and develop a targeted follow-up strategy if responders are not representative of the census in terms of region and consumer population served.

  • Depending on response rates for each survey in each state, SPT grantees will make up to two follow-up telephone calls to approximately 20% of non-respondents. If the DSW RC’s analysis of non-responders indicates no bias and the initial survey response appears representative of providers and workers in the population, then 20% of non-responders will be randomly selected to receive follow-up. If the initial response is not representative (for example, if a particular geographic area of the state or segment of providers is significantly under-represented), we will select a targeted group of non-responders for follow-up to increase the representativeness of the sample (a sketch of the selection step follows this list).

  • The protocol for follow-up phone calls will be provided to SPT grantees by the DSW Resource Center Team. The calls will entail reminding potential respondents to complete the survey, answering questions about the survey, and providing additional instruction on how to complete the survey as necessary. If a call is not answered, callers should leave a message reminding the recipient to complete the survey and providing a number to call with questions. Reminder messages may also be sent by email where email addresses are available.
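As an illustration of the selection step described in the list above, the sketch below draws a simple random 20% sample of non-responders from a hypothetical tracking list; the file layout and field names (responded, region, phone) are assumptions, and the targeted alternative is shown only as a comment.

```python
import csv
import random

SAMPLE_FRACTION = 0.20  # up to two follow-up calls to ~20% of non-respondents

# Hypothetical tracking list: one row per mailed survey, with a "responded"
# flag ("yes"/"no") maintained as completed surveys are logged.
with open("tracking_list.csv", newline="") as f:
    records = list(csv.DictReader(f))

non_responders = [r for r in records if r["responded"].strip().lower() != "yes"]

# Case 1: responders look representative -> simple random sample.
random.seed(20120210)  # fixed seed so the selection can be reproduced and audited
k = round(len(non_responders) * SAMPLE_FRACTION)
follow_up = random.sample(non_responders, k)

# Case 2: responders are NOT representative -> restrict the pool first, e.g.:
#   pool = [r for r in non_responders if r["region"] == "<under-represented region>"]
#   follow_up = random.sample(pool, min(k, len(pool)))

for r in follow_up:
    print(r["survey_id"], r.get("phone", ""))
```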

The DSW Resource Center team will calculate a nonresponse rate for the overall survey. To evaluate nonresponse bias, we will compare respondents and nonrespondents on information available from the sampling frames compiled by the states with assistance from the DSW Resource Center, as described above, to determine whether response rates vary on those attributes or whether respondents and nonrespondents differ on those characteristics. We will also assess potential nonresponse bias by analyzing differences between respondents and initial refusals who later responded, by levels of effort to obtain the response (e.g., the number of mail reminders sent / telephone calls made).
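One common way to operationalize this comparison, offered here as a minimal sketch rather than the project’s specified analysis plan, is to cross-tabulate response status against attributes available on the sampling frame and test for independence; the column names (program, region, and a 0/1 responded flag) are assumptions.

```python
import pandas as pd
from scipy.stats import chi2_contingency

# Hypothetical analysis file: the sampling frame plus a 0/1 response flag.
frame = pd.read_csv("frame_with_response_flags.csv")

print(f"Overall response rate: {frame['responded'].mean():.1%}")

# Compare respondents and nonrespondents on attributes known for everyone
# on the frame (e.g., Medicaid program and geographic region).
for attribute in ["program", "region"]:
    table = pd.crosstab(frame[attribute], frame["responded"])
    chi2, p_value, dof, _ = chi2_contingency(table)
    print(f"\nResponse rate by {attribute}:")
    print(frame.groupby(attribute)["responded"].mean().round(3))
    print(f"chi-square = {chi2:.2f}, df = {dof}, p = {p_value:.4f}")
```

A small p-value would indicate that response rates vary across levels of that attribute, signaling potential nonresponse bias to address through weighting or targeted follow-up.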

In addition, we will evaluate nonresponse bias at the item level for each survey item.

The data entry system will provide special administrative features for each question that allow data entry staff to designate non-responses and confusing or contradictory responses. Missing data will be assigned a “missing” code, and the number of respondents will be noted for each statistic reported.
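As an illustration of how the missing code and per-statistic respondent counts might be handled at analysis time, the short pandas sketch below flags a hypothetical missing code and reports the number of respondents for each item; the code value and file name are assumptions.

```python
import pandas as pd

MISSING_CODE = -9  # hypothetical code keyed by data entry staff for non-responses

responses = pd.read_csv("survey_responses.csv")
items = [c for c in responses.columns if c != "survey_id"]

# Treat the designated missing code as missing data.
responses[items] = responses[items].mask(responses[items] == MISSING_CODE)

# Item-level nonresponse: respondent count and percent missing per question.
summary = pd.DataFrame({
    "n_respondents": responses[items].notna().sum(),
    "pct_missing": (responses[items].isna().mean() * 100).round(1),
})
print(summary)
```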

  4. Tests of Procedures or Methods to be Undertaken

Feedback from States

In April and May 2011, we asked the grantees to distribute drafts of the surveys to their stakeholders and gather feedback. We also received feedback about the surveys from the states in individual phone calls with the SPT grantees.

The DSW Resource Center Team worked with each SPT grantee to customize the terminology in the Individual Worker and Employer Organization surveys, the cover letters, and the instructions provided with the surveys for each state to reflect the terms used to describe direct service workers and settings in which services are delivered in the state. While terminology will differ across surveys, the states will be collecting consistent information about core workforce indicators. The DSW Resource Center Team consulted with the SPT grant lead in each state to determine the appropriate terminology to use in all written materials.

Reading level testing

We used Microsoft Word’s reading level tool to test the reading level of the survey questions. The tool uses the Flesch-Kincaid Grade Level Test, a validated and published test that rates text on a U.S. school grade level. We modified the question and instruction wording until they were at a 6th to 8th grade reading level, based on previous research indicating that this is the reading level needed to ensure that the surveys are understandable by this workforce.
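For reference, the standard published Flesch-Kincaid Grade Level formula, stated here in LaTeX, is:

```latex
\mathrm{FKGL} = 0.39\left(\frac{\text{total words}}{\text{total sentences}}\right)
             + 11.8\left(\frac{\text{total syllables}}{\text{total words}}\right)
             - 15.59
```

A score between 6.0 and 8.0 therefore corresponds to the 6th-to-8th grade reading level targeted above.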

Pre-testing

Pretesting has been completed in all participating states, and general descriptions of the edits made to the surveys based on pretesting are provided below. Please note that, in addition to the general edits described below, each state’s survey instruments vary to some degree from the main survey templates: some states elected not to ask all of the questions on the original instruments, and states made minor edits to terminology and answer options to be more consistent with the terms used to describe the workforce and the types of services provided in their states. Surveys customized for each state are provided separately in an appendix.

General edits made to the Independent Provider Survey based on Pretesting:

  • Edited instructions to note that respondents’ answers will not affect the services the person they support receives, to highlight the purpose of the survey, to emphasize that responses will be kept private and identifying information will be kept separate from responses, and to note that respondents will receive a gift card. Noted that the survey will most likely take 10-15 minutes to complete, reflecting pretest results. Reiterated some of these items in the instructions for specific questions.

  • Edited wording of several questions for clarity. In the question on length of time worked as an independent provider, clarified that we want respondents to count time at all jobs they have had doing this work. In the question on number of hours worked in the last week, clarified that we want them to answer about the person for whom they worked the most hours in the last week. Defined the term “paid time off”. Defined “high school” as grades 9-12. Defined “household” as including yourself and everyone who lives with you. Changed “consumer” to “person receiving support”.

  • Expanded answer options for some questions to better capture all possible responses. In the question on why they decided to start providing services and supports, added an “other” option. In the question on training, split responses to capture whether they received the training in a class or training program, or from the person they support or that person’s family. In the question on how long they have worked for this person, added a missing option for 2-3 years. In the question on level of education completed, added an option for grades 1-8. In the question on who pays their medical bills, added options for “my spouse or family member’s insurance pays some or all” and “other”.

General edits made to the Employer Survey based on Pretesting:

  • Edited instructions to highlight the purpose of the survey and to emphasize that responses will be kept private and will not affect the organization’s status as a provider, that identifying information will be kept separate from responses and used only for survey tracking and follow-up, and that results will be reported only in the aggregate, with no organization identified in any way.

  • Edited wording of several questions for clarity. Changed “free standing (i.e., the CEO/Director at our facility has ultimate decision-making authority)” to “independent entity (i.e., not part of a chain)”. Changed “If your organization is part of a chain” to “If your organization has multiple sites in this state”. Added “assisted living facilities” to the list of example service locations. Added definitions for ethnic origins. Rephrased the vacancies question to ask “How many direct service workers do you need to hire this week?” Revised the wording of the question on the number of workers enrolled in health insurance. Changed “finding people willing to give up their unemployment benefits” to “finding people who can give up their unemployment benefits.”

  • Edited wording of instructions to clarify that employees of institutional settings should be included if they work with people living in home and community-based settings. Edited instructions to highlight that we want respondents to use their organization’s definition of full-time and part-time and to report average amounts across their organization. Edited instructions to emphasize that if their organization is part of a larger organization, they should send the survey to their organization’s headquarters or contact headquarters for answers to any questions they cannot answer themselves.

  • Expanded answer options for some questions to better capture all possible responses. In the question on the structure of the organization, added an option for “government operated”. In the question on how many DSWs the organization employed on the last day of the past month, added an option for the respondent to use a different day. Separated out responses for “part-time DSWs” vs. “on-call or intermittent DSWs”.

Individual Worker Survey Pretest – we recommended that states conduct the pretest for this survey in person with individual workers using a focus group format, to allow for capturing respondents’ facial expressions and tone of voice and to provide an opportunity for them to talk with each other. SPT grant leads should invite more participants than are actually needed to ensure adequate turnout, and they may want to offer a small cash incentive (e.g., $10-$25) and refreshments to individual workers for their participation. We recommended that the SPT grantees gather respondents together for about 90 minutes, explain the purpose of the survey and the focus group, ask them to complete the survey, and then gather feedback about the process in a group discussion. Depending on the size of the focus group, grantees may also want to make individual phone calls to follow up with respondents who are unable to attend the focus group.

Also, in pretesting with individual workers, workers were asked what type of incentive they would prefer (gift certificate, debit card with PIN, inexpensive gift item that might be of interest to independent providers). Examples of incentives used in previous surveys of direct service workers include $25 cash gifts and gift cards to Wal-Mart.

Employer Organization Survey Pretest – we recommended that the pretest for this survey be conducted “virtually”: respondents are asked to complete the survey from their offices (so they can access their organization’s workforce data) on a set day and then join a conference call to discuss their feedback, allowing at least two hours for participants to complete the survey. The SPT grantees should offer pretest participants the option of providing feedback individually in case they do not feel comfortable discussing their responses or feedback with “competitor” organizations. Employer organizations that cannot participate in a feedback conference call may provide feedback in writing via email to the SPT grant lead. The feedback discussion with each participant should take place within two days of survey completion at the latest.

In developing the survey instruments, every attempt was made to replicate or adapt existing instrumentation.

While the seven states will collect the same basic information, the survey instruments will be customized to reflect terminology used in each state. To the greatest extent possible, survey questions are modeled after or replicate existing, tested surveys and well-established national data collections, to help assure comparability of results.

  • For both surveys, questions about workforce volume, stability, and turnover were based on recommendations in the 2009 DSW Resource Center data collection white paper.2

  • The Independent Worker Survey includes questions modeled after questions in a survey conducted by PHI Michigan on Individual Workers in Self-Direction in Michigan.

  • Other sources of questions for the Independent Worker Survey were a survey of long-term care workers in Minnesota conducted by The Lewin Group for the Minnesota Department of Human Services3 and a study on health insurance and the recruitment and retention of direct service workers in Virginia.4

  • Many of the questions in the employer survey were based on a survey conducted by PHI for the state of Michigan. PHI tested the items with a focus group of providers. Although several of the questions were specific to Michigan, we made the questions for this data collection effort as consistent as possible with questions from the Michigan survey.

  • In addition, the questions on training on the employer survey included items suggested at a recent summit on training for paid and unpaid caregivers sponsored by CMS.5

  • Other sources of survey questions included surveys used by North Carolina and Maine in the CMS direct service workforce demonstration project, and the CMS caregiving leadership summit. Maine’s evaluation was based on a survey of 25 home care agencies and over 800 employees at those agencies. Employees were interviewed in 2005 with a follow-up interview in 2007.6 North Carolina’s survey was part of a federal grant focused on the caregiving profession aimed at “enhancing the job satisfaction and career opportunities for Direct Service Workers.” “Direct Service Worker” was defined to include “Certified Nursing Assistants, In-Home Aides, Personal Care Assistants, Care Givers, etc.”7

  • Results from the organizational cultural competence questions on the employer survey will be incorporated into the larger NBIP data collection effort managed by IMPAQ for CMS. The questions are aligned with indicators of organizational cultural competence developed in a 2002 project conducted for HRSA by The Lewin Group, Indicators of Cultural Competence in Health Care Delivery Organizations: An Organizational Cultural Competence Assessment Profile.8

The experience gained in collecting workforce data in this small group of seven states will then be used to refine the instruments and recommended procedures and to develop a Toolkit of data collection advice and tools that can be used by all states.

  5. Individuals Consulted on Statistical Aspects and Individuals Collecting and/or Analyzing Data

The SPT grantee leads are considered the primary investigators for this study. They are:

CMS has engaged the National Balancing Indicators Project (NBIP) team from IMPAQ International and the National Direct Service Workforce Resource Center team to support the SPT grantees in the design and administration of the surveys and the analysis of survey findings. The NBIP DSW Data Collection TA Team includes:

  • Oswaldo Urdapilleta, Ph.D., IMPAQ International, LLC, 443-539-1394, [email protected], Project Director of NBIP

  • Jennifer Howard, IMPAQ International, LLC, [email protected]

  • Lisa Alecxih, Senior Vice President, The Lewin Group, 703-269-5542, [email protected], Project Director of DSW Resource Center

  • Carrie Blakeway, Senior Consultant, The Lewin Group, 703-269-5711, [email protected], Project Manager of DSW Resource Center

  • Bernadette Wright, PhD, Consultant, The Lewin Group, 703-269-5716, [email protected]

  • Ashley Tomisek, The Lewin Group, 703-269-5632, [email protected]

  • Amy Hewitt, Senior Research Associate, Research and Training Center, Institute on Community Living, University of Minnesota, 612-625-1098, [email protected]

  • Sheryl Larson, PhD, Senior Research Associate, Research and Training Center, Institute on Community Living, University of Minnesota, 612-624-6024, [email protected]

  • Lori Sedlezky, MSW, Project Coordinator, Research and Training Center, Institute on Community Living, University of Minnesota, 612-624-7668, [email protected]

  • Steve Edelstein, National Policy Director, PHI, 718-402-7766, [email protected]

  • Hollis Turnham, Midwest Director, PHI, 517-327-0331, [email protected]

  • Tameshia Bridges, MSW, MI Senior Workforce Advocate, PHI, 517-372-8310, [email protected]

  • Joe Angelelli, PA State Director, PHI, 412-304-1463, [email protected]

  • Dorie Seavey, PhD, Director of Policy Research, PHI, 617-630-1694, [email protected]

The CMS Project Officer for this project is:

  • Jean C. Accius, Ph.D., Division of Community Systems Transformation, Disabled & Elderly Health Programs Group, CMS Center for Medicaid, CHIP and Survey & Certification, 410-786-3270, [email protected]



1 Pavelchek, D., and Mann, C. April 2007. Home Care Quality Authority: Phone Survey of Independent Providers. SESRC – Puget Sound Division. (This phone survey of independent providers had a 42% response rate; in the HCQA Individual Provider Mail Survey, the response rate was 26%.) In the final report of the Virginia study: “Several thousand surveys were mailed or distributed throughout Virginia. The return rate for the survey to consumer-directed service workers was 36%; for agency directors, 72%; and for individuals who received the ECAT training, 62%. (The response rate for surveys that employers distributed to agency-based direct service workers could not be determined.)”

The agency director response rate may have been high because “Employers who agreed to distribute surveys to their employees received a $50 gift card, and an incentive of a $10 gift card was offered to each DSW who completed a survey.” The ECAT training response rate was higher because this survey was sent to people who had already completed the ECAT training, and a $20 gift card incentive was provided.

The consumer-directed service workers survey had the lowest response rate (36%). Only a $10 gift card was mailed to these workers who completed the survey. This survey has a similar methodology to our proposed methodology: “Surveys were mailed to all those CD workers for whom we had current addresses (N = 3,965), and an incentive of a $10 gift card was offered to each person who completed the survey. In order to boost the response rate, a second mailing was sent to a random sample of individuals who had not responded within the initial specified time frame. Through these procedures, 1,429 surveys were returned (a response rate of 36%).”

2 Direct Service Workforce Resource Center. February 2009. The Need for Monitoring the Long-Term Care Direct Service Workforce and Recommendations for Data Collection. Washington, DC: Author. http://www.dswresourcecenter.org/tiki-index.php?page=Data+Collection

3 The Lewin Group. October 2009. Costs and Options for Insuring Minnesota’s Long-Term Care Workforce. Report for Minnesota Department of Human Services. http://www.dhs.state.mn.us/main/idcplg?IdcService=GET_DYNAMIC_CONVERSION&RevisionSelectionMethod=LatestReleased&dDocName=dhs16_147640

4 The Partnership for People with Disabilities at Virginia Commonwealth University, Health Insurance and the Recruitment and Retention of Direct Service Workers in Virginia: Final Report, Study for the Virginia Department of Medical Assistance Services, October 2007. http://hchcw.org/wpcontent/uploads/2008/07/dmas_final_reportoct2007.pdf

5 The Direct Service Workforce Resource Center. (2011). Capacity and Coordinating Support for Family Caregivers and the Direct Service Workforce: Common Goals and Policy Recommendations Emerging from the CMS Leadership Summit on the Direct Service Workforce and Family Caregivers. Washington, DC: Author.

6 University of Southern Maine. (2007). Providing Health Coverage and Other Services to Recruit and Retain Direct Service Community Workers in Maine: The Dirigo Difference. http://www.mainerealchoices.org/workforce_workdemo.htm.

7 North Carolina DSW Survey

8 Karen W. Links, Sharrie McIntosh, Johanna Bell, and Umi Chong. Indicators of Cultural Competence in Health Care Delivery Organizations: An Organizational Cultural Competence Assessment Profile. The Lewin Group. Report for The Health Resources and Services Administration, U.S. Department of Health and Human Services. http://www.hrsa.gov/CulturalCompetence/healthdlvr.pdf


