OMB Control Number: 0990-0465

Evaluation of the Assisted Outpatient Treatment Grant Program for Individuals with Serious Mental Illness







Supporting Statement – Section A







Submitted – May 12, 2018



















Program Officer/Project Officer

Joel Dubenitz, Ph.D. – Social Science Analyst

U.S. Department of Health and Human Services

Office of the Assistant Secretary for Planning and Evaluation

200 Independence Avenue SW, Washington DC 20201


Part A: Justification


  1. Circumstances Making the Collection of Information Necessary


  1a. Overview


The Office of the Assistant Secretary for Planning and Evaluation (ASPE) at the U.S. Department of Health and Human Services (HHS) is requesting Office of Management and Budget (OMB) approval for data collection activities to support the evaluation of the Substance Abuse and Mental Health Services Administration’s (SAMHSA’s) Assisted Outpatient Treatment (AOT) Grant Program for Individuals with Serious Mental Illness.


Section 224 of the Protecting Access to Medicare Act (PAMA) (Public Law 113-93), enacted into law on April 1, 2014, mandated a 4-year pilot program of grants to implement AOT programs nationwide [Attachment A]. Section 224(e) required a program evaluation to examine the impact of AOT on cost savings and public health outcomes, incarceration, homelessness, and patient and family satisfaction with program participation.


AOT grants were awarded to 17 grantees on October 1, 2016. On October 4, 2016, ASPE awarded task order #16-233-SOL-00683 to RTI International, in partnership with Duke University and Policy Research Associates (PRA), to design and conduct implementation and outcome evaluations of six of the 17 AOT program grantees. These in-depth sites, presented below, were selected by the HHS AOT program advisory committee per several criteria, including but not limited to geographic diversity, AOT program type, AOT program size, data availability, and suitability for the outcome evaluation.


Grantee | Implementing Location(s)
AltaPointe Health Systems, Inc. | Baldwin County, AL
Cook County Health & Hospital System | Chicago, IL
Hinds County Mental Health Commission | Jackson, MS
Doña Ana County | Las Cruces, NM
Alcohol, Drug Addiction & Mental Health Services Board of Cuyahoga County (ADAMHSBCC) | Cleveland, OH
Oklahoma Department of Mental Health and Substance Abuse Services (ODMHSAS) | Oklahoma City, Tulsa, and Rogers, Washington, Ottawa, and Delaware Counties, OK


The implementation evaluation of the six AOT grantees listed above was conducted from November 2016 to August 2017, with the purpose of gathering information related to the processes and practices of AOT in seven domains:


  1. AOT programs and court processes

  2. Target populations

  3. Service infrastructure and clinical approaches

  4. Stakeholder involvement

  5. Person-centered approaches and procedural justice

  6. Innovation

  7. Evaluation capacities


A final report of the implementation evaluation was accepted by ASPE on October 18, 2017. An overview of the results was integrated into the 2018 Report to Congress required under PAMA Section 224.


The outcome evaluation will examine which characteristics of AOT programs influence health and social outcomes for people under AOT orders, as well as service use, associated costs, and patient and family satisfaction with the AOT process. The ability of HHS to provide Congress with the information it needs to understand the impact of AOT relies, in part, on a rigorously designed, independent evaluation of the AOT grant program. This evaluation will take place across two phases: Phase 1, running through October 29, 2018, and Phase 2, extending through September 25, 2020. Findings will inform future reports to Congress. Evaluation data collection that requires OMB approval will not begin until ASPE receives final OMB approval.


  1b. Study Design and Evaluation Questions


Informed by the limitations of prior research as well as relevant findings from the implementation evaluation, the study team will implement a mixed-methods approach, designed to address the challenges inherent in multi-site data collection and analysis, to answer eleven evaluation questions in the following domains:


  • Domain 1: AOT program outcomes

EQ1 Does AOT affect treatment, clinical functioning, and social functioning outcomes?

EQ2 Do outcomes differ by AOT duration and/or scope of services (e.g., Assertive Community Treatment [ACT] alone vs. ACT supplemented by other evidence-based practices [EBPs])?

EQ3 What do patients report about AOT?

EQ4 Are families satisfied with AOT? With treatment providers? With judicial/legal personnel?


  • Domain 2: AOT models and interventions

EQ5 Which models of AOT lead to better outcomes?

EQ6 Which specific intervention components, including the use of EBPs, lead to better outcomes?

EQ7 Do outcomes differ based on the intensity of services offered, or intensity of resources available to providers?


  • Domain 3: AOT resources and costs

EQ8 What resources are available to support AOT?

EQ9 What costs are associated with the implementation of AOT?

EQ10 Are cost savings associated with outcomes attributable to AOT?

EQ11 How do sites plan to fund AOT, including any system liaison positions (e.g., with hospitals, jails), once funding ends?


The outcome evaluation design and analysis plan was developed in consultation with the HHS AOT program advisory committee (ASPE, SAMHSA, and NIMH) and an AOT Technical Advisory Group (TAG). Specifically, a meeting to solicit feedback on the preliminary design plan was held on October 5, 2017, in Washington, DC, and included participants from the HHS committee, the TAG, and the evaluation team. Input and recommendations from meeting attendees were assessed for feasibility and cost-benefit in a design options memo, and subsequent analytic decisions from the project COR and HHS committee were integrated into the design and analysis plan.


The evaluation framework is presented below. The intervention process illustrated in the left half of the figure provides examples of program elements associated with selection pathways into AOT and the subsequent treatment and service package. The outcome evaluation will focus on program elements that exhibit sufficient variation within and across sites to generate testable hypotheses about their moderating impacts. The right half of the figure outlines the outcomes of interest, separated into stages of treatment experiences, clinical outcomes, and social functioning. The evaluation will also focus on identifying significant client-level (e.g., patient demographic characteristics) and program-level (e.g., mental health system characteristics) moderators of AOT effectiveness.



  2. Purpose and Use of Information Collection


Section 224(e) of PAMA requires annual reports to Congress that include evaluation of: cost savings and public health outcomes such as mortality, suicide, substance abuse, hospitalization, and use of services; rates of incarceration by patients; rates of homelessness among patients; and patient and family satisfaction with program participation. The data collected under this submission will help HHS address the evaluation questions listed above and inform the required reports to Congress. Each proposed data collection is described below, along with how, by whom, and for what purpose the collected information will be used.


Structured client interview. Beginning in Phase 1 of data collection, non-primary treating clinicians at each of the six in-depth sites will use the structured client interview instrument to conduct baseline and follow-up (e.g., 6- and 12-month while on AOT, 6-month post-AOT) interviews with AOT program participants. (It should be noted that the six in-depth sites that were part of the implementation evaluation may not be the same six sites selected for the outcome evaluation study. Outcome evaluation site selection, and any changes from the implementation sites, will be determined by the HHS Advisory Committee. As with the implementation evaluation, these sites will be selected per criteria including geographic diversity, AOT program type, AOT program size, data availability, and suitability for the outcome evaluation. Please see Supporting Statement B for information on comparison group selection, inverse probability of treatment weighting, and regression modeling to account for remaining differences between sites.) These interviews will gather information across numerous factors, including housing, perceived functioning and well-being, clinical symptoms, treatment history and service use, medication use, substance use, satisfaction with treatment, perceived coercion to adhere to treatment, general pressures to adhere to treatment, AOT experiences, and criminal justice experiences. Data gathered during the structured client interview will inform responses to Evaluation Questions 1, 2, 3, 5, 6, and 7. This interview will be administered every six months and will take approximately 60 minutes to complete. The structured client interview instrument is provided in Attachment B.


Family satisfaction survey. The family satisfaction survey instrument will be administered to a sample of AOT program participants’ family members in order to gather information related to their perceptions of AOT, involvement in and satisfaction with the civil process, satisfaction with treatment services received under the AOT court order, client behaviors, and family well-being. Data gathered using the family satisfaction survey will assist in answering Evaluation Question 4. The survey will be administered roughly six months following the AOT program participant’s entry into AOT and will take approximately 15 minutes to complete. The family satisfaction survey instrument is provided in Attachment C.


Cost questionnaire. Cost data will be collected in two phases. In Phase 1, we will administer a cost questionnaire to representatives of each site’s court system, state psychiatric hospitals, and program directors at the community-based treatment facilities. Once respondents have completed these questionnaires to the best of their ability, we will review the information provided. In Phase 2, after reviewing the provided information, we will set up a short follow-up telephone interview to discuss any outstanding questionnaire items. This will help ensure that the data are of high quality, as some cost questionnaire items will require pro-rating estimates to only those costs associated with AOT. Data collected using the cost questionnaire will allow the evaluation team to answer Evaluation Questions 8, 9, 10, and 11. The cost questionnaire instrument is provided in Attachment D.


Docket case monitoring form. Information on the hearings for each respondent will be documented on a periodic basis, based on the frequency of hearings at each site. The form will collect the date of the court hearing, the presiding judge, and the court location for the day’s hearings. For each respondent with a hearing on a given day, the hearing type, respondent attendance, hearing representatives, hearing length, verbal interaction between judge and respondent, hearing outcome, warnings and/or reminders, response to noncompliance, and next hearing date will be documented. The information will be gathered by the site evaluator through observation or provided by the probate liaison present at each hearing. Each respondent’s hearing will be recorded as a row on the docket case monitoring form, with one row added per hearing held that day. Data collected using the docket case monitoring form will allow the evaluation to answer Evaluation Question 5. The docket case monitoring form instrument is provided in Attachment E.


AOT characteristics form. The AOT characteristics form will be completed by site evaluators once a month in order to provide site-level information regarding target populations, initiation, and post-initiation of AOT. Items are intended to track the civil and legal processes of AOT across sites and over time. Data gathered using this form will assist in answering Evaluation Questions 5, 6, and 7. The AOT characteristics form is provided in Attachment F.


Secondary, administrative data. While the data collection elements above apply to the six in-depth sites that will participate in the outcome evaluation, the collection of secondary, administrative data, as described in detail in Sections 4a and 4b below, will apply to all funded sites.


  3. Use of Improved Information Technology and Burden Reduction


AOT program sites will use SAMHSA’s Performance Accountability and Reporting System (SPARS) to facilitate submission of select administrative data collections, including progress reports and TRAC-NOMS data. RTI will also accept electronic submission of both the primary data and the secondary, administrative data collected by the sites.


  4. Efforts to Identify Duplication and Use of Similar Information


In formulating the evaluation plan, the evaluation team has carefully considered how to minimize burden by supplementing existing administrative and secondary data sources with targeted primary data collection. To this end, the evaluation incorporates the following approaches to reduce duplication and make use of similar existing information.


  4a. Use of Administrative Data


We will use pre-existing administrative data generated by grantees in place of data that would otherwise be obtained through primary data collection. Administrative data collections will include mandated TRAC-NOMS data; grantee applications, reports, and other administrative documents; drug testing data; evidence-based practice fidelity assessments; integrated reporting system extracts; and extracts from site-maintained participant tracking databases.


  4b. Use of Secondary Data


We will use secondary data maintained by non-grantee agencies to measure population outcomes, including Medicaid and non-Medicaid services encounter data and local and state arrest records (see table below for data elements that will be requested).


However, a number of evaluation questions cannot be answered with the administrative and secondary data sources described above, particularly those related to site-level AOT characteristics and clinical services delivered over time, clinical functioning outcomes, patient and family perceptions, and some program costs. As a result, the data instruments included in this submission have been developed to obtain complementary and augmented information.


Data Sources and Data Elements

AOT ID:
  • Site-generated AOT linking ID

Treatment services data:
  • Date(s)
  • Procedure code
  • Diagnosis code
  • Length/time of service (hr./min.), if applicable
  • Drug name, if applicable
  • Drug supply, if applicable

Criminal justice data:
  • Date(s)
  • Charge
  • Jail housing status
  • Jail treatment records, including medications



  5. Impact on Small Businesses or Other Small Entities


The six AOT sites in the outcome evaluation are state mental health authorities, local government entities, or, in one instance, a regional provider organization. Per the SAMHSA FOA for the AOT grants (SM-16-011), no more than twenty percent of the grant award may be applied to performance measurement, including participation in the cross-site evaluation. Administrative and secondary data collections will be used to reduce the burden of primary data collection on the sites.


  6. Consequences of Collecting the Information Less Frequently


Each of the primary data collection instruments described above will obtain information necessary to answer evaluation research questions. The inclusion of all planned data sources is needed to glean information about AOT program outcomes, models and interventions, and resources and costs. Primary data collections have been designed to gather those data which are not obtainable by administrative or secondary means. Efforts also have been made to reduce the burden of primary data collection on participants through revision of the structured client interview based on consultation between ASPE, NIMH, and SAMHSA.


  7. Special Circumstances Relating to the Guidelines of 5 CFR 1320.5


The information collection complies with 5 CFR 1320.5(d)(2).


  8. Comments in Response to the Federal Register Notice and Efforts to Consult Outside the Agency


On January 31, 2018, a 60-day Federal Register Notice was published in the Federal Register, Volume 83, Number 21, Page 4496. No comments were received.


The outcome evaluation design has been informed by the HHS AOT Advisory Committee, which includes members from SAMHSA and NIMH in addition to ASPE. In addition, an AOT Technical Advisory Group of outside subject matter experts was convened in October 2017 to provide guidance on the design and analysis plan for the outcome evaluation.


  9. Explain any decision to provide any payment or gift to respondents, other than remuneration of contractors or grantees.


We will not be providing any remuneration or incentive to respondents. Some sites have elected to use a portion of their own grant funds to compensate respondents. Across all sites, however, we are providing technical assistance and training via webinar to help sites successfully track participants and administer instruments.


  10. Describe any assurance of confidentiality provided to respondents and the basis for the assurance in statute, regulation, or agency policy.


AOT participants at each of the sites will be informed about confidentiality and the exceptions to confidentiality (i.e., cases in which potential risks outweigh a participant’s right to confidentiality). Sites will maintain confidentiality of the research data by eliminating any reference to specific individuals and coding all information by number; personal identifiers will be confined to consent and locator forms. Specifically, locator forms will contain the participant’s name, contact information, and the contact information of other people who might know how to get in touch with the participant for follow-up interviews. Locator forms will be stored separately from all other data, and participants will be assured that researchers will not disclose the reason for calling any provided contacts. Local evaluators at each of the sites will oversee linkage of survey data to secondary, administrative data in order to ensure that all datasets will be stripped of identifiable information. Additionally, we will work with sites and their respective IRBs to develop procedures and protocols to minimize the risk that participant data (i.e., survey responses) would ever be shared with primary treating clinicians, other treatment staff, or judicial representatives of the AOT program.


Furthermore, numerous steps will be taken to ensure the protection of those AOT recipients who decline to participate. First, the consent disclosure will be reviewed with the potential participant alone. No treatment staff will be present, other than the non-primary treating clinician conducting the interview. Further, whether an AOT recipient chooses to participate following the consent disclosure will not be reported to any staff working in the facility. Because it is possible that someone (e.g., Project Director, treating clinicians, case managers) might conceivably infer nonparticipation by the relatively brief amount of time a nonparticipating AOT recipient spends with the interviewer before leaving, the interviewer also will inform all AOT recipients that they can choose to remain in the room for the approximate amount of time that the research protocol takes to complete—without actually participating in the study. This will further insulate nonparticipants from any potential inferences being drawn that they did not cooperate with the research team, if this is a source of concern on their part. Although we are doubtful that many nonparticipants will elect to remain with the interviewer for this time period, sites will give them this option.


  11. Provide additional justification for any questions of a sensitive nature, such as sexual behavior and attitudes, religious beliefs, and other matters that are commonly considered private. This justification should include the reasons why the agency considers the questions necessary, the specific uses to be made of the information, the explanation to be given to persons from whom the information is requested, and any steps to be taken to obtain their consent.


Some of the questions included in the structured client interview and family satisfaction survey are sensitive in nature, such as those related to criminal activity and drug and alcohol use. Participant responses to these questions are important to the cross-site evaluation, in part because they pertain to outcomes that AOT is designed to affect. Moreover, criminal activity and substance use are important outcomes from a public safety and policy standpoint, and findings will have implications for subsequent policy decisions.


AOT recipients will be informed prior to participation in the interview that they will be asked about sensitive subjects, including feelings and emotions, and past behaviors, including criminal activity and drug and alcohol use. The interviewer will explain that information will be used for this research only, and will not be shared with treatment providers unless necessary for the safety of the participant or others. Again, the consent disclosure will emphasize that the participant will not be identified by name in anything written about the study, and that the information he or she provides will be combined with results from many other people and will be reported in such a way that no one’s individual answers can be identified. The participant will be advised that they are free to withdraw from the interview at any time without prejudice and without any consequences to their treatment or court order. Furthermore, the participant may refuse to answer any question or may take a break at any time during the interview.


  12. Estimates of annualized burden hours and costs


The following table provides estimates of the average annual burden for collecting the proposed information.


Estimated Annualized Burden Hours and Costs to Respondents

Forms | Respondents | Number of respondents | Number of responses per respondent (c) | Average burden per response (hours) | Total annual burden (hours) | Average hourly wage rate ($) | Total annual respondent cost ($)
Client Interview Instrument | Program Participants | 520 (a) | 3 | 1.00 | 1,560.00 | $22.66 (d) | $35,349.60
Client Interview Instrument | Comparison Subjects | 520 (a) | 3 | 1.00 | 1,560.00 | $22.66 (d) | $35,349.60
Family Satisfaction Survey | Program Participant’s Family Member | 173 (b) | 1 | 0.25 | 43.25 | $22.66 (d) | $980.05
Cost Questionnaire | Program Administrator | 6 | 1 | 1.25 | 7.50 | $29.42 (e) | $220.65
Cost Questionnaire | Other Site Representatives | 12 | 1 | 1.25 | 15.00 | $22.66 (d) | $339.90
Docket Case Monitoring Form | AOT Local Evaluator | 6 | 390 | 0.10 | 234.00 | $22.66 (d) | $5,302.44
AOT Characteristics Form | AOT Local Evaluator | 6 | 12 | 0.50 | 36.00 | $22.66 (d) | $815.76
Total | — | 1,243 | 411 | 0.76 | 3,455.75 | $23.63 | $78,358.00

a The estimated number of respondents is based on the average targeted intake for each site across their third and fourth grant years, assuming a 100% intake rate and a 100% evaluation participation rate. For comparison subjects, we are assuming that we will collect data from a comparison population on the same number of subjects as within each intervention site.

b The estimated number of respondents is one-third of the number of respondents for the Client Interview Instrument, rounded to the nearest whole number.

c Over the course of the evaluation, the client interview instrument will be administered at baseline and every six months during AOT, as well as six months post-AOT, for an average of three responses per year. The family satisfaction survey will be administered once. The cost questionnaire will be administered twice, once annually. The docket case monitoring form will be completed at each hearing; we are assuming that, across sites, participants will average 4-5 hearings (e.g., petition, termination, and 1-2 status/non-compliance/treatment modification hearings). The AOT characteristics form will be completed once every month.

d Average hourly wage rate is estimated by taking the Annual Average Weekly Wage for the six states (AL, IL, MS, NM, OH, OK) in 2016 according to the Bureau of Labor Statistics (Quarterly Census of Employment and Wages, 2016, Annual Averages, All Establishment Sizes) and converting to an hourly wage assuming 38.7 hours per week worked by labor force participants as reported by the Bureau of Labor Statistics (Labor Force Statistics, Current Population Survey, Household Data, Annual Averages, 19. Persons at Work in Agriculture or Non-Agricultural Industries by Hours of Work [February 8, 2017]).

e Average hourly wage rate drawn from 11-9151 Social and Community Service Managers for the six states (AL, IL, MS, NM, OH, OK) in 2016 according to the Bureau of Labor Statistics (Occupational Employment and Wages, May 2016, 11-9151 Social and Community Service Managers, by State)
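As an illustrative cross-check of the table above (not part of the approved burden estimates), each row’s total annual burden is respondents × responses per respondent × hours per response, each row’s cost is burden × hourly wage, and the Total row’s per-response burden (0.76) and wage rate ($23.63) are simple unweighted averages across the seven rows:

```python
# Each row: (respondents, responses per respondent, hours per response, hourly wage $)
rows = [
    (520,   3, 1.00, 22.66),  # Client Interview -- Program Participants
    (520,   3, 1.00, 22.66),  # Client Interview -- Comparison Subjects
    (173,   1, 0.25, 22.66),  # Family Satisfaction Survey
    (6,     1, 1.25, 29.42),  # Cost Questionnaire -- Program Administrator
    (12,    1, 1.25, 22.66),  # Cost Questionnaire -- Other Site Representatives
    (6,   390, 0.10, 22.66),  # Docket Case Monitoring Form
    (6,    12, 0.50, 22.66),  # AOT Characteristics Form
]

# Row-level annual burden hours and respondent costs.
burden_hours = [n * k * h for n, k, h, _ in rows]          # e.g., 520 * 3 * 1.00 = 1560.0
costs = [b * w for b, (*_, w) in zip(burden_hours, rows)]

total_hours = sum(burden_hours)                            # table total: 3,455.75
total_cost = sum(costs)                                    # table total: $78,358.00 (to the cent)
avg_burden = sum(h for _, _, h, _ in rows) / len(rows)     # unweighted average: 0.76
avg_wage = sum(w for *_, w in rows) / len(rows)            # unweighted average: $23.63
```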


The following table provides estimates of the average annual burden for obtaining, preparing, and deidentifying the secondary, administrative data for each of the 17 AOT sites.


Estimated Annualized Burden Hours to Secondary, Administrative Data Record Keepers

Data Sources | Record Keepers | Number of Record Keepers | Annual Cuts of Data (b) | Average Burden per Cut of Data (hours per record keeper) | Total Annual Burden (hours per site) | Average Hourly Wage Rate ($) | Total Annual Record Keeper Cost ($)
Medicaid Services | AOT Project Director / AOT Local Evaluator / State Medicaid Representative | 51 (a) | 3 | 40.00 | 120.00 | $22.66 (c) | $46,226.40
Criminal Justice | AOT Project Director / AOT Local Evaluator / State DOC Representative | 51 (a) | 3 | 40.00 | 120.00 | $22.66 (c) | $46,226.40
Non-Medicaid Treatment Services | AOT Project Director / AOT Local Evaluator / Local Service System or Managed Care Representative | 51 (a) | 3 | 40.00 | 120.00 | $22.66 (c) | $46,226.40
Total | — | 153 | 9 | 40.00 | 360.00 | $22.66 | $138,679.20

a The estimated number of record keepers is based on each AOT program site likely needing to involve the project director, the local evaluator, and (if necessary) the external keeper of the data, across each of the 17 AOT sites.

b The estimated number of cuts of data represents the final three years of the grant program.

c Average hourly wage rate is estimated by taking the Annual Average Weekly Wage for the six states (AL, IL, MS, NM, OH, OK) in 2016 according to the Bureau of Labor Statistics (Quarterly Census of Employment and Wages, 2016, Annual Averages, All Establishment Sizes) and converting to an hourly wage assuming 38.7 hours per week worked by labor force participants as reported by the Bureau of Labor Statistics (Labor Force Statistics, Current Population Survey, Household Data, Annual Averages, 19. Persons at Work in Agriculture or Non-Agricultural Industries by Hours of Work [February 8, 2017]).


  13. Estimates of other total annual cost burden to respondents or record keepers.


There will be no direct costs to the respondents or recordkeepers other than the time spent engaged in data collection. This information has been captured in the two tables above under item #12.


  14. Annualized cost to the government.


We estimate that one SAMHSA employee will spend roughly one hour per year for each of the six sites gathering relevant administrative documents (e.g., budgets, reports). Annual costs of staff time are estimated to be $317.28. Additionally, there are costs associated with conducting the outcome evaluation. Specifically, $1,550,298, or roughly 75% of the total contract ($2,084,407) awarded for the cross-site evaluation by ASPE, will be used to support the outcome evaluation. The total estimated average cost to the government per year of the outcome evaluation is $387,574.


  15. Explain the reasons for any program changes or adjustments reported in Items 13 or 14 of the OMB Form 83-I.


This is a new data collection.


  16. For collections of information whose results will be published, outline plans for tabulation and publication. Address any complex analytical techniques that will be used. Provide the time schedule for the entire project, including beginning and ending dates of the collection of information, completion of report, publication dates, and other actions.


We will incorporate results from the cross-site outcome evaluation in the following documents and presentations:


  • An outcome evaluation memo to be submitted to ASPE in September 2018, providing a detailed status of the evaluation and preliminary findings to support the FY2018 report to Congress.

  • A briefing for HHS, other government staff, and other interested parties to be held in October 2018.

  • An updated outcome evaluation design and analysis plan to be submitted to ASPE in December 2018 and revised following receipt of comments.

  • An outcome evaluation final report to be submitted to ASPE in September 2020, providing an overview of the outcome evaluation, including the design, evaluation questions, methods and analysis, findings, conclusions, and implications for strengthening AOT programs.

  • A summary evaluation report to be submitted to ASPE in October 2020, describing the methods used and findings from both the implementation and outcome evaluations, as well as cross-cutting themes that emerged from both evaluations. Based on these results, we will identify lessons for providing AOT to adults with SMI and ensuring their access to needed services, as well as policy options available to federal and state programs and policy makers to promote these goals. Finally, the summary report will discuss any needs for additional data to assess the impact of AOT.

  • An additional briefing for HHS, other government staff, and other interested parties to be held in November 2020, presenting an overview of major findings and facilitating a discussion on implications of the findings for public policy.


The following table provides an overview of the evaluation tasks and the years in which the tasks will be conducted. The research team may also incorporate the aggregate results from the cross-site evaluation into journal articles, scholarly presentations, and congressional testimony related to the outcomes of the AOT grant program.


Evaluation timeline | 2017 | 2018 | 2019 | 2020
Development of outcome evaluation design & analysis plan | X | | |
OMB Submission | X | X | |
Data collection and analysis, phase 1 | | X | |
Outcome evaluation memo | | X | |
Briefing for HHS officials | | X | | X
Outcome evaluation design & analysis plan update | | X | X |
Data collection and analysis, phase 2 | | | X | X
Outcome evaluation final report | | | | X
Summary evaluation report | | | | X



  17. If seeking approval to not display the expiration date for OMB approval of the information collection, explain the reasons that display would be inappropriate.


We are not requesting an exemption; the expiration date will be displayed.


  18. Explain each exception to the certification statement identified in Item 19, "Certification for Paperwork Reduction Act Submissions," of OMB Form 83-I.


There are no exceptions to the certification. These activities comply with the requirements in 5 CFR 1320.9.


LIST OF ATTACHMENTS – Section A


Note: Attachments are included as separate files as instructed.


  1. Protecting Access to Medicare Act Section 224 (Attachment A)

  2. Structured Client Interview (Attachment B)

  3. Family Satisfaction Survey (Attachment C)

  4. Cost Questionnaire (Attachment D)

  5. Docket Case Monitoring Form (Attachment E)

  6. AOT Characteristics Form (Attachment F)
