SUPPORTING STATEMENT



Part A









Evaluation of the Children’s Health Insurance Program Reauthorization Act of 2009 (CHIPRA) Quality Demonstration Grant Program

OMB# 0935-0190







Version: February 14, 2014

Revision of Previously Approved Protocol #0935-0190



Agency for Healthcare Research and Quality (AHRQ)

CONTENTS

A. Justification

1. Circumstances that Make the Collection of Information Necessary

2. Purpose and Use of Information

3. Use of Improved Information Technology

4. Efforts to Identify Duplication

5. Involvement of Small Entities

6. Consequences If Information Is Collected Less Frequently

7. Special Circumstances

8. Register Notice and Outside Consultations

9. Payments/Gifts to Respondents

10. Assurance of Confidentiality

11. Questions of a Sensitive Nature

12. Estimates of Annualized Burden Hours and Cost

13. Estimates of Annualized Respondent Capital and Maintenance Costs

14. Estimates of Annualized Cost to the Government

15. Changes in Hour Burden

16. Time Schedule, Publication and Analysis Plans

17. Exemption for Display of Expiration Date

List of Attachments


A. Justification

Overview of Revision

The Agency for Healthcare Research and Quality (AHRQ) requests to extend and revise a currently approved qualitative study to evaluate the Children’s Health Insurance Program Reauthorization Act of 2009 (CHIPRA) Quality Demonstration Grant program. The first phase of qualitative data collection (OMB# 0935-0190) was approved by the Office of Management and Budget with an expiration date of February 28, 2015. Key informant interviews conducted during the first round of qualitative data collection provided information on early demonstration strategies and lessons learned by the demonstration States during the first two years of their grant projects. This revised information collection request seeks approval to: (1) conduct a second round of key informant interviews; and (2) conduct parent and adolescent focus groups in selected demonstration States.


The qualitative data collected through this request will contribute to a study that uses a pre-post mixed methods design. The information gathered from the second round of qualitative data collection will build on the first round to provide a longitudinal understanding of how implementation of grant-funded projects evolved over time and how participants and stakeholders perceive the impact of the demonstration. The information collected under this request will be analyzed in conjunction with progress reports and other documents submitted by grantees, Medicaid and CHIP administrative and claims data, and provider survey data to gain a comprehensive understanding of the impacts of the demonstration and the strategies employed by States for achieving those impacts. The provider survey was approved under a separate information request (OMB# 0935-0215).


1. Circumstances that Make the Collection of Information Necessary

The mission of the Agency for Healthcare Research and Quality (AHRQ), set out in its authorizing legislation, The Healthcare Research and Quality Act of 1999 (see http://www.ahrq.gov/hrqa99.pdf), is to enhance the quality, appropriateness, and effectiveness of health services, and access to such services, through the establishment of a broad base of scientific research and through the promotion of improvements in clinical and health systems practices, including the prevention of diseases and other health conditions. AHRQ shall promote health care quality improvement by conducting and supporting:

  1. research that develops and presents scientific evidence regarding all aspects of health care; and

  2. the synthesis and dissemination of available scientific evidence for use by patients, consumers, practitioners, providers, purchasers, policy makers, and educators; and

  3. initiatives to advance private and public efforts to improve health care quality.

Also, AHRQ shall conduct and support research and evaluations, and support demonstration projects, with respect to (A) the delivery of health care in inner-city areas, and in rural areas (including frontier areas); and (B) health care for priority populations, which shall include (1) low-income groups, (2) minority groups, (3) women, (4) children, (5) the elderly, and (6) individuals with special health care needs, including individuals with disabilities and individuals who need chronic care or end-of-life health care.

Section 401(a) of the Children’s Health Insurance Program Reauthorization Act of 2009 (CHIPRA), Pub. L. 111-3, amended the Social Security Act (the Act) to enact section 1139A (42 U.S.C. 1320b-9a). AHRQ is requesting approval from the Office of Management and Budget (OMB) to conduct focus groups and a second round of in-depth interviews to support a comprehensive, mixed-methods evaluation of the quality demonstration grants authorized under section 1139A(d) of the Act (Attachment A). Evaluating whether, and through what mechanism, projects funded by the CHIPRA demonstration grants improve the quality of care received by children in Medicaid and CHIP aligns with AHRQ’s mission of improving the quality and effectiveness of health care in the United States.

CHIPRA included funding for five-year grants so that States can experiment with and evaluate several promising ideas related to improving the quality of children’s health care in Medicaid and CHIP.1 In February 2010, the U.S. Department of Health and Human Services announced the award of 10 demonstration grants to States that convincingly articulated an achievable vision of what they could accomplish by the end of the five-year grant period, described strategies they would use to achieve the objectives, and explained how the strategies would achieve the objectives. Applicants were encouraged by CMS to address multiple grant categories (described below) and to partner with other States in designing and implementing their projects.

Of the 10 grantee States selected, six are partnering with other States, for a total of 18 demonstration States. The demonstration States are: Colorado (partnering with New Mexico); Florida (with Illinois); Maine (with Vermont); Maryland (with Wyoming and Georgia); Massachusetts; North Carolina; Oregon (with Alaska and West Virginia); Pennsylvania; South Carolina; and Utah (with Idaho).

These demonstration States are implementing 52 distinct projects spanning five grant categories, A through E, with each State working in at least one category. Category A grantees are experimenting with and/or evaluating the use of new pediatric quality measures. Category B grantees are promoting health information technology (health IT) for improved care delivery and patient outcomes. Category C grantees are expanding patient-centered medical homes or other provider-based models of service delivery. Category D grantees are evaluating the impact of a model pediatric electronic health record. Category E grantees are testing other State-designed approaches to quality improvement in Medicaid and CHIP.

AHRQ’s goal in supporting an evaluation of the CHIPRA Quality Demonstration Grant Program is to provide insight into how best to implement quality improvement programs and to provide information on how successful programs can be replicated to improve children’s health care quality in Medicaid and CHIP.2 To meet these goals, the evaluation has the following requirements:

  1. to identify CHIPRA State activities that measurably improve children’s health care, especially as it pertains to those enrolled in Medicaid and CHIP.

  2. to develop a deep, systematic understanding of how CHIPRA demonstration States carried out their grant-funded projects.

  3. to understand why the CHIPRA demonstration States pursued certain strategies.

  4. to understand whether and how the CHIPRA demonstration States’ efforts affected outcomes related to knowledge and behavior change in targeted providers and consumers of health care.

To meet AHRQ’s goals and carry out the requirements, the agency’s evaluation contractor, Mathematica Policy Research, with its subcontractors The Urban Institute and AcademyHealth, has designed a comprehensive evaluation that will make the best possible use of quantitative and qualitative research methods, including the following activities and data collections:

  1. Key Informant Interviews. Under the previously approved information collection, researchers visited each of the 18 demonstration States in 2012 to conduct in-person interviews. Under the revised data collection, researchers will conduct a second round of in-person interviews in each demonstration State in 2014. In 2014, we plan to interview the same individuals who were interviewed in 2012, unless they have had little or no project involvement since 2012, as well as others who may have become involved since 2012. The 2014 protocols follow the same structure and cover the same topics as the protocols approved for 2012. For 2014, we refined the 2012 protocols to focus on changes in program implementation and resulting outcomes since our last round of data collection. This approach will strengthen our ability to conduct longitudinal analyses.

    1. Key Staff Interviews – Key staff members are staff directly involved in the design and oversight of grant-funded activities. The purpose of these interviews is to gain insight into the implementation of demonstration projects, to understand contextual factors, and to identify lessons and implications for the broad application and sustainability of projects. Semi-structured interviews were conducted with up to 4 key staff members per State in 2012 and will be completed with up to 4 key staff members per State in 2014. The key staff interview guide for 2012 is included as Attachment B and the interview request and confirmation emails as Attachment C. The revised key informant interview guide for 2014 is included as Attachment D.

    2. Implementation Staff Interviews – Other implementation staff are staff involved in the day-to-day implementation of grant-funded projects. These staff members include State agency employees, provider trainers or coaches, health IT vendors, and project consultants. The purpose of these interviews is to gain insight into the opportunities and challenges related to key technical aspects of project implementation. Semi-structured interviews were conducted with up to 16 other implementation staff members per State in 2012 and will be completed with up to 16 other implementation staff members per State in 2014. The implementation staff interview guide for 2012 is included as Attachment E and the interview request and confirmation emails as Attachment C. The revised implementation staff interview guide for 2014 is included as Attachment F.

    3. Stakeholder Interviews – External stakeholders have a direct interest in children’s care quality in Medicaid and CHIP. Stakeholders include representatives of managed care organizations, State chapters of the American Academy of Pediatrics, advocacy organizations for children and families, and social service agencies. These stakeholders are likely to be familiar with the CHIPRA projects and may serve on advisory panels or workgroups related to one or more projects. The interviews will gather insight into the opportunities and challenges related to project implementation, stakeholder satisfaction with their project involvement, and contextual factors. Semi-structured interviews were conducted with up to 8 external stakeholders per State in 2012 and will be completed with up to 8 external stakeholders per State in 2014. The stakeholder interview guide for 2012 is included as Attachment G and the interview request and confirmation emails as Attachment C. The revised external stakeholder interview guide for 2014 is included as Attachment H.

    4. Health Care Organization Staff Interviews – Depending on the projects a State is implementing, health care organizations participating in demonstration activities can include private health care provider practices, public health clinics, federally qualified health centers, care management entities, or school-based health centers. Staff members include physicians, nurse practitioners, care managers, physician assistants, practice managers, health care organization administrators, and other staff involved in demonstration activities. Interviews will capture information about project-related activities, staff perceptions of outcomes and impacts, and the organization's involvement in other quality-improvement initiatives. Semi-structured interviews were conducted with up to 12 staff members per State in 2012 and will be completed with up to 12 staff members per State in 2014. The health care organization staff interview guide for 2012 is included as Attachment I and the interview request and confirmation emails as Attachment C. The revised health care organization staff interview guide for 2014 is included as Attachment J.

  2. Focus Groups. Under the revised data collection request, AHRQ is seeking approval for researchers to conduct parent and adolescent focus groups in selected demonstration States. Focus groups were not part of the first round of data collection and therefore were not included in the original information collection request.

    1. Parent Focus Groups – We will hold a total of 16 in-person focus groups with parents, guardians, or other caregivers of children who are enrolled in Medicaid or CHIP and are served by the medical practices involved in the CHIPRA demonstration. Specifically, we will conduct 4 focus groups each in Oregon, Utah, Florida, and South Carolina, 4 of the 12 States that have implemented a patient-centered medical home demonstration project. The number of participants per focus group will range from 8 to 10, resulting in a maximum of 160 adults participating. The parent focus groups will be conducted in English and in Spanish. The parent focus group protocol is included as Attachment K, the parent focus group recruitment materials as Attachment L, the parent telephone screening script as Attachment M, the parent pre-focus group interview script as Attachment N, and the informed consent forms as Attachment O.

    2. Adolescent Focus Groups – We will hold four in-person focus groups with adolescents who are enrolled in Medicaid or CHIP and are served by school-based health centers involved in the CHIPRA demonstration. We will hold the focus groups in New Mexico, one of two States that have implemented a school-based health center project. The number of participants per focus group will range from 8 to 10, resulting in a maximum of 40 adolescents participating. The adolescent focus group protocol is included as Attachment P, the adolescent focus group recruitment materials as Attachment Q, the adolescent telephone screening script as Attachment R, the adolescent pre-focus group interview script as Attachment S, and the informed consent forms as Attachment O.

  3. Project documents produced by the grantees. We are conducting an ongoing review of the following project documents submitted by States as part of their grant requirements: (1) CHIPRA grant applications; (2) grantees’ final operating plans; (3) grantees’ semi-annual progress reports; and (4) reports produced by State-based evaluation teams. These documents give AHRQ an excellent basis for an informed discussion during in-depth interviews about project resources, evolving strategies, and contextual environments. These documents also provide a useful complement to the in-person interviews by providing written documentation of complex contextual circumstances, including States’ prior experience with grant-related initiatives and current health IT initiatives going on in the States. This activity does not impose a burden on the public, does not require OMB clearance, and is not included in the burden estimates in Section 12.

  4. Pediatrician and Family Physician Survey. The pediatrician and family physician survey was approved under a separate information collection request (OMB# 0935-0215). This survey, which will be fielded in 2014, will include a random sample of pediatricians and family physicians in two grantee States and one comparison State. The questionnaire includes questions that support an analysis of (1) physician attitudes towards specific strategies and resources aimed at improving the quality of care provided to pediatric patients, (2) the extent to which physicians’ practices have attempted to implement changes in order to improve the quality of care provided to pediatric patients, (3) physician attitudes towards the utility of receiving performance feedback on quality measures relevant to primary care for children, (4) perceived usefulness of quality-of-care reports received by physician practices, (5) current practices and attitudes towards pay-for-performance financial incentive systems based on quality measure outcomes, (6) physicians’ uses of and attitudes towards electronic health records in quality measurement and improvement, (7) current and expected medical home accreditation processes, and (8) physician and practice demographic information. These data will provide insight on physician perspectives on quality measures and quality reporting and foster understanding of the strategies and resources that seemed to contribute most (or least) to those outcomes.

  5. Medicaid and CHIP Administrative and Claims Data – Select CHIPRA demonstration States are sharing Statewide Medicaid and CHIP administrative and claims data on all publicly covered children and youth, ages 0-21. Claims from outpatient, inpatient, long-term care, and pharmacy services will be used to create outcome measures of access, quality, and Medicaid expenditures. Claims will also be used for claims-based attribution of children to intervention and comparison practices. The administrative files will provide a limited amount of basic information on child-level demographics as well as define the enrollment periods. Our cross-State quantitative analysis will examine the impact of CHIPRA demonstration funding on the adoption or improvement of an advanced model of primary care, the patient-centered medical home, and subsequently on access to care, quality of care, and health care expenditures among publicly insured children. This activity does not impose a burden on the public, does not require OMB clearance, and is not included in the burden estimates in Section 12.

As noted above, this revised information collection request seeks approval to conduct the second round of interviews and the first round of focus groups only. The remainder of this Supporting Statement, as well as the Supporting Statement Part B, pertains only to the interviews and focus groups. The following section provides a description of how AHRQ will use these data to address critical research questions.

All members of the evaluation team who have or will have access to all of the above-noted project documents have signed AHRQ’s standard confidentiality and non-disclosure agreement.

This study is being conducted by AHRQ through its contractor, Mathematica Policy Research, Inc., and its subcontractors, the Urban Institute and AcademyHealth, pursuant to AHRQ’s statutory authority to conduct and support research on health care and on systems for the delivery of such care, including activities with respect to the quality, effectiveness, efficiency, appropriateness and value of health care services and with respect to quality measurement and improvement. 42 U.S.C. 299a(a)(1) and (2).

2. Purpose and Use of Information

This evaluation uses a mixed-methods approach based on a pre-post design. We will analyze the information we have already collected and, pending OMB approval, the information we will continue to collect, in order to address specific questions related to the evaluation goals (see Attachment T). Depending on the specific question, we will assemble data from the several sources listed above. We will analyze these data using appropriate quantitative or qualitative methods, synthesize the results, and present integrated findings through issue briefs, manuscripts, or other types of reports.

For example, we expect to address the following questions related to the State Category A projects that focused on improving quality of care by using the core quality measure set for children developed by the Centers for Medicare and Medicaid Services (CMS):

  • Did the collection and reporting of the core measure set have an impact on other quality measurement activities within the State? If so, what was the impact?

  • Did the use of the core measure set increase evidence-based decision making by providers, the State, or other stakeholders?

  • What impact have quality improvement activities based on the core measure set had on the quality of care for children enrolled in Medicaid or CHIP?

To address these questions, we expect to analyze information from: (1) key informant interviews in the 10 States that implemented Category A projects, (2) the physician survey, and (3) reports generated by the States themselves. Findings from these analyses will be reviewed and synthesized by members of the project team who will write the report on this issue.

Similarly, to address questions related to the impact of the patient-centered medical home projects (Category C), we expect to analyze information from: (1) Medicaid administrative and claims files for selected States, (2) key informant interviews in the 12 States that implemented such projects, (3) focus groups held in 4 States, and (4) reports generated by the States themselves. Thus, depending on the particular evaluation question that we are addressing, we will assemble and analyze different data. For example, we will use Medicaid administrative and claims data and focus group data to assess the impact of the patient-centered medical home projects in demonstration States. We will then use key informant interview data to gain insight into how successful projects were implemented and to understand barriers in projects that do not achieve results. The longitudinal interview data will help us understand how those strategies and lessons learned evolved over the course of the demonstration and to what ends.

Developing a comprehensive understanding of the CHIPRA quality demonstration grants and their impacts requires analysis of a range of data sources. The unique contributions of the data collected under this information collection request are described below.

  • Key informant interviews. Collecting high quality, timely interview data from a wide range of sources and knowledgeable respondents directly serves AHRQ’s goal of understanding project implementation and of identifying activities and resources that contributed to any observed improvement in children’s health care quality.

  • Focus groups. The focus groups with parents and adolescents will provide evidence on the experiences of patients and families with demonstration interventions across grantees and provide insight into the success or failure of projects to improve health care in a manner that is responsive to the unique needs of individual families. Collecting information directly from those intended to benefit from the CHIPRA demonstrations will explicitly serve AHRQ’s goal of understanding whether and how the CHIPRA quality demonstration projects are impacting the way parents, adolescents, and families experience care.

3. Use of Improved Information Technology

Notes from interviews and focus groups will be taken electronically using password-protected, encrypted laptops on site. In addition, Federal Information Processing Standards (FIPS)-compliant digital audio recording of all interviews and focus groups (with respondents’ permission) will be the primary electronic method for ensuring the completeness and quality of interview data. Recording also enhances efficiency and reduces respondent burden by allowing researchers to review and edit their written or typed notes without calling respondents for clarification or to check quotes.

Obtaining high-quality data through semi-structured interviews requires a flexible exchange and a conversational rapport between interviewer and respondent. While information technology can greatly enhance the smooth administration of large-scale surveys with complex skip patterns, in qualitative interviewing it is often best to avoid complex skip patterns in the first place. For this data collection, we will minimize the skip patterns an interviewer must navigate during interviews by customizing the protocols in advance. The protocols accompanying this package all consist of multiple modules, including modules for each of the five grant categories for which States may receive funding. Because we will know before we visit a State which grant categories that State is pursuing, we will pare down protocols so they include only the relevant category-specific modules. In addition, the site visit teams will be led by trained, experienced interviewers. The interviewers will be thoroughly familiar with protocol content so they can readily move back and forth within the protocol without disrupting the conversational flow or asking questions the respondent has already answered.

After information collection, researchers will use qualitative data analysis software, NVivo, which enables systematic coding and retrieval of textual data according to a specified scheme.

4. Efforts to Identify Duplication

The evaluation of the CHIPRA quality demonstration grants will not duplicate any prior evaluation efforts. No other data collection effort currently exists to collect and analyze data across all of the demonstration grant States and across all grant categories.

The Centers for Medicare & Medicaid Services (CMS), however, does allow grantees to engage contractors to conduct independent evaluations of the grant-funded projects in their States. Eight grantees (Colorado, Florida, Maine, Maryland, Massachusetts, Oregon, South Carolina, and Utah) have allocated funds for independent, State-level evaluations. AHRQ’s contractor is working closely with these State-based evaluators to coordinate data collection activities, avoid duplication, and ensure that the combined cross-State and State evaluations are more comprehensive than either would be alone. For States with independent evaluation teams, sharing of data by the State-based evaluators with AHRQ’s evaluation contractor will reduce duplication of efforts to access and prepare data sets. Semi-structured interviews will be used only to collect evaluation information that cannot be obtained from other sources. Where possible, AHRQ will use existing administrative data and secondary data sources, such as States’ written progress reports to CMS and Medicaid and CHIP administrative, claims, and encounter data to address its research questions.

Information collected through the focus groups will be the only data that AHRQ collects directly from Medicaid and CHIP beneficiaries. For States with independent evaluation teams, AHRQ’s evaluation contractor will continue to coordinate with State-based evaluators to prevent duplication of efforts. Based on our current assessment, only two demonstration States are conducting focus groups with parents and adolescents, and the information derived from those focus groups has been used solely for purposes such as planning the intervention (for example, learning what adolescents want from a school-based health center) or learning whether and how the initial set of CHIPRA core quality measures is being used to improve public reporting of performance on those measures and related topics (Category A projects). The timing and content of the focus groups conducted by the State-based evaluators do not overlap with the focus groups conducted by AHRQ’s evaluation contractor.

5. Involvement of Small Entities

Providers in small private practices and school-based health centers may be asked to help recruit focus group participants or participate in interviews. The researchers will make every effort to minimize the recruitment burden on participating small practices and to schedule interviews at the convenience of these respondents. In addition, the interviews with these respondents will be short relative to interviews with other respondent types, in part to accommodate small entities; interview staff will ensure that each interview lasts no more than 45 minutes. Furthermore, to gain a broad picture of participating physicians’ perspectives, the respondents will be distributed across multiple practices. Thus, the burden on any one entity will be small. The information being requested will be held to the minimum required for the intended use.

6. Consequences If Information Is Collected Less Frequently

If the second round of interview data and the only round of focus group data are not collected, AHRQ will not be able to evaluate the barriers and facilitators to implementing the demonstration projects or to understand whether patients and their families observed any changes in how care was delivered as a result of the CHIPRA demonstration. The first round of in-person interviews in the 18 demonstration States was completed in 2012, approximately two years into implementation of the demonstration projects. This time frame was most appropriate for learning about the early implementation experience; it gave AHRQ an opportunity to provide feedback to the States about their implementation processes and to inform other States about early implementation strategies. The second round of in-person interviews in the 18 demonstration States will be completed in the fourth and final year of demonstration operations. At this point, interview respondents will understand how the demonstration changed over time, barriers and facilitators to implementation, and perceived outcomes. The parent and adolescent focus groups proposed under this information request will provide the only source of information on the impacts of the demonstration as perceived by patients and their families. Without this data collection, AHRQ cannot comprehensively evaluate the impact of the program, an evaluation that would inform CMS’s decisions regarding funding for similar initiatives.

7. Special Circumstances

This request fully complies with the general information collection guidelines of 5 CFR 1320.5(d)(2). No special circumstances apply.

8. Register Notice and Outside Consultations

a. Federal Register Notice

As required by 5 CFR 1320.8(d), notice was published in the Federal Register on July 31, 2013 for 60 days (see Attachment U). No comments were received.

b. Outside Consultations

AHRQ’s contractors continually consult individuals outside the agency about the research and data collection activities for this evaluation. These individuals include the CMS personnel who oversee and monitor grant planning and implementation in the demonstration States: Karen Llanos and Elizabeth Hill (CMS/CMCS). AHRQ’s evaluation contractor also consults a 14-member technical expert panel on design, measurement, and analytical challenges. The panel meets annually, but members have agreed to also be available for individual consultation. Attachment V lists the members of the technical expert panel and their professional affiliations. There are no unresolved issues stemming from these consultations.

9. Payments/Gifts to Respondents

No payments or gifts will be provided to interview respondents.

Practices and school-based health centers (SBHCs) will receive a $500 gift card for their assistance with focus group recruitment, identification of a convenient meeting space, and logistical support during the focus groups (see Attachment W for the provider site recruitment letters). We arrived at this incentive level after consultation with the State demonstration leaders about the level of effort required to assist with focus group recruitment. Participating in the demonstration requires a significant amount of work on the part of practice staff, and many of the practices and SBHCs participating in the demonstration have not been compensated for their time and have many competing demands. Based on the demonstration leaders’ input and the contractor’s prior experience, a smaller incentive is expected to decrease the number of practices willing to assist with recruitment for the focus groups. We would expect to expend more resources on recruitment if the incentive level were lowered.

Each adult focus group participant will receive a $50 gift card for participating in the focus group. Adolescent participants will receive a $25 gift card. We will also hold the focus groups at convenient locations and times, increasing individuals’ willingness to attend a group discussion.

10. Assurance of Confidentiality

Individuals and organizations will be assured of the confidentiality of their replies under Section 934(c) of the Public Health Service Act, 42 USC 299c-3(c) with requirements that information collected for research conducted or supported by AHRQ that identifies individuals or organizations be used only for the purpose for which it was supplied.

Interview respondents and focus group participants will be given this assurance during recruitment and again immediately before their participation. They will further receive assurance that the information being gathered is for research purposes only. Respondents will also be asked whether they give permission to have the conversation audio-recorded solely for the purpose of filling in any gaps in the research notes. Participants will be referred to by their first names during focus groups, and basic demographic information will be collected to describe the demographic composition of the focus groups; this information will not be used to identify focus group respondents on transcripts. We will not collect social security numbers, home contact information, or similar information that can directly identify the respondent. Moreover, to avoid the risk of violating participant confidentiality, focus groups will not be held at physician practices, where practice staff might see participants and learn that they took part.

Safeguarding Data. The contractor has established data security plans for the handling of all interview notes, coded interview data, and data processing for the interviews and focus groups that it conducts. Its plans meet the requirements of U.S. federal government agencies and are continually reviewed for compliance with new government requirements and data collection needs. Such security is based on (1) exacting company policy promulgated by the highest corporate officers in consultation with systems staff and outside consultants, (2) a secure systems infrastructure that is continually monitored and evaluated with respect to security risks, and (3) secure work practices of an informed staff that take all necessary precautions when dealing with confidential data.

During site visits, evaluation researchers will at all times keep notebooks and laptop computers on their persons or in secure, locked locations.

All contractor staff members sign a pledge of confidentiality. A copy of this text is in Attachment X. Confidential data are kept in study-specific folders that only a minimum number of staff members may access. All typed or electronically coded qualitative data are periodically backed up and preserved on secure media.

11. Questions of a Sensitive Nature

AHRQ is not collecting information of a sensitive nature from interview respondents. Questions will elicit information and perspectives about how the CHIPRA demonstration grants are being implemented in the respondent’s State.

Information collected in the focus groups is not intended to be of a sensitive nature. Focus group questions are confined to participants’ general experiences, opinions, and perspectives regarding the care received from the practices, clinics, and clinicians participating in the CHIPRA demonstration. Focus group facilitators will tell participants at the beginning of the group that they are not specifically interested in the details of their child’s or their own medical condition. However, some participants may choose to share information about their child’s or their own health or medical condition to illustrate how it shaped their experience with their providers. Participants will be asked not to share any personal information about other participants outside of the room. Some focus group participants might have critical views of State or Federal initiatives or of particular participating organizations (for example, health plans, health systems, community organizations, or the practice and clinician they see). We will handle such insights with sensitivity and will not share or attribute these comments to individuals in an identifiable way in any written or oral communications.

12. Estimates of Annualized Burden Hours and Cost

Exhibit 1 shows the estimated annualized burden hours for respondents’ time to participate in both round 1 and round 2 of qualitative data collection. Key staff interviews were conducted in 2012 and will be conducted in 2014 with up to four persons from each of the 18 CHIPRA demonstration States (72 total in each time period) and will last up to 1.5 hours. Implementation staff interviews were conducted in 2012 and will be conducted in 2014 with up to 16 persons from each of the 18 CHIPRA demonstration States (288 total in each time period) and take an hour to complete. Stakeholder interviews were conducted in 2012 and will be conducted in 2014 with up to 8 persons from each of the 18 CHIPRA demonstration States (144 total in each time period) and also take an hour to complete. Health care organization staff interviews were conducted in 2012 and will be conducted in 2014 with up to 12 persons from each of the 18 CHIPRA demonstration States (216 total in each time period) and will last 45 minutes.

For the parent and other caregiver focus groups, we estimate that 229 parents and other caregivers will need to be screened to recruit a maximum of 160 parents and other caregivers to participate in 16 focus groups across 4 States (a 70% screen-in rate is expected). The screener takes 5 minutes to complete, the pre-focus group interview for eligible participants takes 20 minutes to complete, and the focus group will last one and a half hours. The burden estimate of 2.5 hours includes one hour for travel time to and from the focus group site.

For the adolescent focus groups, we estimate that 57 adolescents will need to be screened to recruit up to 40 adolescents to participate in four focus groups completed in one State (a 70% screen-in rate is expected). The screener takes 5 minutes to complete, the pre-focus group interview for eligible participants takes 20 minutes to complete, and the focus group will last one and a half hours (travel time does not apply because the focus groups will be held on school premises).

The total burden for this evaluation is estimated to be 1,955 hours, including the 2012 and 2014 data collections. Of this total, the burden estimate for the 2014 interviews and focus groups requested in this revised information collection request is 1,253 hours. The burden hours for the 2014 data collection are higher than the 2012 data collection because focus groups will only be completed in 2014.
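
For reference, the burden hours in each row of Exhibit 1 are the product of the number of respondents, the number of responses per respondent, and the hours per response, rounded to the nearest whole hour. For example, the parent focus groups contribute 160 respondents × 1 response × 2.5 hours = 400 burden hours, and the parent focus group screener contributes 229 respondents × 1 response × 5/60 hour ≈ 19 burden hours. The screening counts reflect the expected 70 percent screen-in rate (160 ÷ 0.70 ≈ 229 parents and other caregivers; 40 ÷ 0.70 ≈ 57 adolescents).
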
Exhibit 1. Estimated Annualized Burden Hours

Data Collection | Number of respondents a | Number of responses per respondent | Hours per response | 2012 burden hours | 2014 burden hours | Total burden hours
Key Staff Interviews | 72 | 2 | 1.5 | 108 | 108 | 216
Implementation Staff Interviews | 288 | 2 | 1 | 288 | 288 | 576
Stakeholder Interviews | 144 | 2 | 1 | 144 | 144 | 288
Health Care Provider Interviews | 216 | 2 | 45/60 | 162 | 162 | 324
Parent Focus Group Screener | 229 b | 1 | 5/60 | 0 | 19 | 19
Parent Pre-Focus Group Interview | 160 | 1 | 20/60 | 0 | 53 | 53
Parent Focus Groups | 160 | 1 | 2.5 | 0 | 400 | 400
Adolescent Focus Group Screener | 57 b | 1 | 5/60 | 0 | 5 | 5
Adolescent Pre-Focus Group Interview | 40 | 1 | 20/60 | 0 | 13 | 13
Adolescent Focus Groups | 40 | 1 | 1.5 | 0 | 60 | 60
Total | 1,006 c | na | na | 702 | 1,253 | 1,955

a The number of respondents that will be interviewed in each state will vary depending on the number, scope, complexity, and nature of the projects implemented. This table reflects upper-bound estimates of total burden hours and the number of respondents per type per state.

b Based on an expected 70% screen-in rate.

c Parent and adolescent focus group respondents will complete the screener and pre-focus group interview and participate in a focus group; they are counted only once in the total number of respondents.



Exhibit 2 shows the estimated total cost burden associated with respondents’ time to participate in both round 1 and round 2 of qualitative data collection. The total cost burden is estimated to be $75,203. The annualized cost burden associated with the 2014 interviews and focus groups requested in this revised information request is estimated to be $42,796. The cost burden for the 2014 data collection is higher than the cost burden for the 2012 data collection because focus groups will be completed only in 2014.
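
For reference, the cost burden in each row of Exhibit 2 is the product of the burden hours for each year and the average hourly wage for that respondent type, rounded to the nearest dollar. For example, the 2014 cost burden for the key staff interviews is 108 hours × $55.22 ≈ $5,964, and the 2014 cost burden for the parent focus groups is 400 hours × $22.01 = $8,804.
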
Exhibit 2. Estimated Total Cost Burden

Data Collection | Number of respondents | 2012 burden hours | 2014 burden hours | Average hourly wage a | 2012 cost burden | 2014 cost burden | Total cost burden
Key Staff Interviews | 72 | 108 | 108 | $55.22 b | $5,964 | $5,964 | $11,928
Implementation Staff Interviews | 288 | 288 | 288 | $30.99 c | $8,925 | $8,925 | $17,850
Stakeholder Interviews | 144 | 144 | 144 | $30.99 c | $4,463 | $4,463 | $8,925
Health Care Provider Interviews | 216 | 162 | 162 | $80.59 d | $13,056 | $13,056 | $26,111
Parent Focus Group Screener | 229 g | 0 | 19 | $22.01 e | $0 | $418 | $418
Parent Pre-Focus Group Interview | 160 | 0 | 53 | $22.01 e | $0 | $1,167 | $1,167
Parent Focus Groups | 160 | 0 | 400 | $22.01 e | $0 | $8,804 | $8,804
Adolescent Focus Group Screener | 57 g | 0 | 5 | $0 f | $0 | $0 | $0
Adolescent Pre-Focus Group Interview | 40 | 0 | 13 | $0 f | $0 | $0 | $0
Adolescent Focus Groups | 40 | 0 | 60 | $0 f | $0 | $0 | $0
Total | 1,006 h | 702 | 1,253 | na | $32,407 | $42,796 | $75,203

a National Compensation Survey: Occupational Wages in the United States, May 2012. U.S. Department of Labor, Bureau of Labor Statistics.

b Based on the mean wages for general and operations managers (11-1021).

c Based on the mean wages for social and community service managers (11-9151).

d Based on the mean wages for general pediatricians (29-1065).

e Based on the mean wages for all occupations.

f Wage rates for adolescents are assumed to be zero.

g Based on an expected 70% screen-in rate.

h Parent and adolescent focus group respondents will complete the screener and pre-focus group interview and participate in a focus group; they are counted only once in the total number of respondents.



Throughout the information collection process, we will monitor the length of the interviews, comments received from participants and field interviewers, and the number of individuals who refuse to be interviewed. If this information indicates that the burden on participants is so great as to undermine the collection of high quality data, we will revise our procedures accordingly. For example, we may reduce the length of the semi-structured interviews. If we need to revise our procedures, we will work with OMB to implement specific changes.

13. Estimates of Annualized Respondent Capital and Maintenance Costs

Capital and maintenance costs would include the purchase of equipment, computers, computer software or services, or storage facilities for records as a result of complying with this data collection. Respondents will incur no such additional costs.

14. Estimates of Annualized Cost to the Government

Exhibit 3 shows the total and annualized cost for this evaluation. The total cost to the government of the entire evaluation contract is $8,258,311 (including a base period and four option periods); the annualized cost is $1,651,662 per year (Exhibit 3). These costs will be incurred from 2010 to 2015.

Exhibit 3. Estimated Total and Annual Cost

Cost Component | Total Cost | Annual Cost
Administration | $571,422 | $114,284
Coordination | 38,003 | 7,601
Stakeholder Feedback | 201,637 | 40,327
Technical Expert Panel | 359,276 | 71,855
Evaluation Design & Implementation | 3,981,390 | 796,278
Technical Assistance Plan | 934,440 | 186,888
Data Collection Instruments | 138,997 | 27,799
OMB Clearance | 35,617 | 17,808
Section 508 Compliance | 13,883 | 2,777
Data and Analysis Reports | 735,426 | 147,085
Interim Evaluation Reports | 408,803 | 81,761
Dissemination | 736,149 | 184,037
Final Report | 103,269 | 103,269
Total | $8,258,311 | $1,651,662

15. Changes in Hour Burden

Under the revised information collection request, we are requesting to conduct a second round of interviews with all interview respondents as well as a round of focus groups. Therefore, the number of responses per respondent increased from one to two for interview respondents, and burden hours were added to account for focus group recruitment and participation. In the original information collection request, we requested to conduct follow-up telephone interviews with key informant staff and telephone interviews with staff in non-demonstration States. We did not collect this information, and those hours are no longer included in the burden estimates.

16. Time Schedule, Publication and Analysis Plans

The first round of data collection started in March 2012. Products based on analyses of these data are publicly available on the demonstration evaluation website: http://www.ahrq.gov/policymakers/chipra/demoeval/index.html. AHRQ expects the second round of data collection to begin in March 2014, pending OMB approval of the revised information collection request, and to be completed by August 2014. AHRQ’s contractor will continue to prepare communication materials for a range of audiences (State policymakers, State agency staff, Medicaid and CHIP providers, and academics) until the contract ends on September 8, 2015. The effort to publish will include preparing and submitting manuscripts to peer-reviewed publications. Interview data described in this clearance package will be analyzed to address the research goals described in Section 1.

  • The first round of interview data collected under the previously approved package was used to assess the use, availability, and perceived importance (that is, the importance as perceived by the respondents) of resources in the preceding year; the selection and implementation of early strategies; resulting outputs; and perceptions of short-term outcomes.

  • The second round of interview data collected under this revised information collection package will be used to assess changes in State demonstration strategies and context since the last round of interviews; barriers and facilitators to implementing demonstration strategies; State- and provider-level changes made in response to the demonstration; and the perceived impact of those changes on access, quality, and cost of care.

  • The focus group data collected under this revised information collection package will provide the patient and family perspective on how providers changed the way they deliver care and the impact of those changes on their experience of care.

Notes from all interviews and focus groups will be typed, uploaded to qualitative data analysis software (NVivo), and coded according to a specified scheme. Analysis of the interviews will emphasize fidelity to implementation plans and progress in implementation. The analysis will include identification of themes within and across States by grant category. Throughout the process of gathering, reviewing, and analyzing qualitative data, quotations will be noted that capture a point of view or an experience particularly well. For each project in each of five grant categories, findings from the implementation analysis will be used to interpret findings about outcomes and to help establish a basis for causal inference. In brief, the interview data collected under this clearance package, when combined with evaluation data from other sources, will directly support an analysis of (1) the implementation of specific strategies related to quality measurement and reporting, health IT, provider-based models, and pediatric electronic health records; (2) whether and how implementation seemed to affect children’s care quality in Medicaid and CHIP; (3) the likelihood that quality improvements will be sustainable when grant funding ends; and (4) the potential for other States to replicate the achievements of the demonstration States.

17. Exemption for Display of Expiration Date

AHRQ does not seek this exemption.

List of Attachments:

Attachment A – Children’s Health Insurance Program Reauthorization Act of 2009
Attachment B – 2012 Key Staff Interview Guide

Attachment C – Interview Request and Confirmation Emails
Attachment D – 2014 Key Staff Interview Guide

Attachment E – 2012 Implementation Staff Interview Guide
Attachment F – 2014 Implementation Staff Interview Guide

Attachment G – 2012 Stakeholder Interview Guides
Attachment H – 2014 Stakeholder Interview Guide

Attachment I – 2012 Health Care Organization Staff Interview Guides
Attachment J – 2014 Health Care Organization Staff Interview Guide

Attachment K – 2014 Parent Focus Group Protocol

Attachment L – 2014 Parent Focus Group Recruitment Materials

Attachment M – 2014 Parent Telephone Screening Script

Attachment N – 2014 Parent Pre-Focus Group Interview Script

Attachment O – 2014 Parent and Adolescent Informed Consent Forms

Attachment P – 2014 Adolescent Focus Group Protocol

Attachment Q – 2014 Adolescent Focus Group Recruitment Materials

Attachment R – 2014 Adolescent Telephone Screening Script

Attachment S – 2014 Adolescent Pre-Focus Group Interview Script

Attachment T – CHIPRA Quality Demonstration Evaluation Research Questions

Attachment U – Federal Register Notice
Attachment V – TEP Members

Attachment W – Provider Site Recruitment Letters

Attachment X – Confidentiality Pledge


1 Department of Health and Human Services, Centers for Medicare & Medicaid Services. Medicaid and Children’s Health Insurance Programs: Children’s Health Insurance Program Reauthorization Act of 2009: Section 401(D). Invitation to Apply for FY2010 CHIPRA Quality Demonstration Grants. September 30, 2009, CFDA 93.767.

2 Ibid.
