Summary of Public Comments

Attachment F Summary of Public Comments.xlsx

Maternal and Child Health Bureau Performance Measures for Discretionary Grant Information System (DGIS)


OMB: 0915-0298

Number | Date Received | Division | Commenting Organization | Commenter Location | Performance Measure | Measure Name | Comment Summary | Resolution Decision | Final Resolution
1 11/17/2015 MIECHV/ Home Visiting Healthy Starts Program Coordinator NH Not Related to Measure
Overall theme of seeking approval of referral or approval of visit by PCP/MCO/HV when measures are self-determined, as well as revisiting and defining best-practice recommendations.

Recommend identifying smoking measure in terms of progress.

#10 - need to look at cultural sensitivity and other factors to rework measure.

There is a lot of mention of data compared/needed in ETO report.

Due to the increase of data collection and required interventions/screenings, funds will be required (see list).
Not related to this OMB package. No changes necessary
2 12/10/2015
Townhall Participants
WMH 2 Perinatal/ Postpartum Care Question: For the program-specific measures, is there an expectation about how many would be assigned?
Answer: Haven’t mapped it out that far; want to make sure there is some flexibility and to figure out how it’s most effective. The biggest effort is on making the utility work better. They haven’t set a firm number.

3-5 domain measures; those with program-specific measures will have more.

No changes necessary
3 12/10/2015
Townhall Participants- MCHB Staff
Not Related to Measure
Question: What process will POs use to assign the measures to their grants (during the FOA?)
Answer: When they are developing the FOA, that is when they’ll select the measures they plan to use, similar to what is done now.

Question: Will grantees be able to add options under tier 2? For example, in the perinatal care example, for MIECHV, would they be able to add ‘home visiting services’? Or instead, would they need to try to fit HV into the outreach or other category provided? I.e., who creates the options within each of the tiers, and how?
Answer: MCH would leave the option for the grantees to add something if it’s not already provided.

Question: Which performance measures will grantees starting June-September 2016 use for reporting: the current or the new PMs?
Answer: MCH will probably have those grantees use the new measures (a transition plan will be needed), or at least provide the link to the measure package. If we have OMB approval, then MCH will assign the measures, but they won’t be in DGIS quite yet. Grantees should plan on transitioning off the old measures shortly.

Question: For the population domains, when grantees report across different programs/initiatives, do they have to stick within that domain (e.g., a CSHCN family engagement strategy that may also be available in adolescent health, etc.)? Is there flexibility?
Answer: The answer is yes.

Question: Can grantees make recommendations now through the town halls and formal comment to add additional options in the tier 2 and 3 lists?
Answer: Yes, now is the time to make recommendations!

Resolution provided in comment summary. No changes necessary
4 12/11/2015 EMSC PA Emergency Health Services Council- EMS for Children Project Director PA Emergency Health Services Council EMSC 01 NEMSIS Submission Overall, we are very supportive of the proposed measures and the overarching goals each of them seek to achieve. That stated, we have concerns with some of the details included in each of the three new proposed PMs and have included specific comments below.



EMSC 01

While we fully support the utilization of and reporting to NEMSIS, we are concerned with the specific language in this PM. The Commonwealth of Pennsylvania currently does not have the resources to move our data to NEMSIS 3.x and will be remaining on NEMSIS 2 compliant data for the short-term future. There is currently no plan and no timeline to advance our NEMSIS data collection to make it NEMSIS 3.x compliant. As Pennsylvania has ~1,000 ambulance services reporting data, this transfer will be incredibly time and resource consuming, and Pennsylvania’s capacity at this time is not sufficient to support such a transfer. Therefore, we have grave concern that the current language of this PM which requires submission of “NEMSIS compliant version 3.x data” will not be achievable by the PA EMSC Program.


We suggest rewording this PM to make the PM broader by striking the words “version 3.x” and wording this PM as follows: “The degree to which EMS agencies submit NEMSIS compliant data...”. As a majority of states and territories in the United States currently are not able to submit NEMSIS 3.x compliant data, we believe this change would be beneficial to a significant portion of grantees within the overall EMSC SP grant program.

------
These comments are submitted on behalf of the Pennsylvania Emergency Health Services Council and were written by the EMS for Children Program Manager for Pennsylvania. Any comments, questions, or concerns should be directed to myself by using the contact information in my signature below.

Tom Winkler
EMS for Children Project Director
Pennsylvania Emergency Health Services Council
600 Wilson Lane, Suite 101
Mechanicsburg, PA 17055
Phone: (717) 795-0740|Ext. 118
Fax: (717) 795-0741
[email protected]
EMSC has responded to all comments on Measure EMSC 01 with revisions and the following FAQs:
NEMSIS-SPECIFIC Performance Measure:

Q: Aren’t these new EMS for Children performance measures unfunded mandates?


A: No, these measures were developed so the national EMSC program can obtain baseline data on how the EMS system is operating on a national level and are intended to assist states and territories in showing improvement in these areas over the lifespan of the measures. The EMSC program recognizes that not every state or territory may meet these measures.

Q: Why can’t the measure be “NEMSIS v3 data is being collected on 90% of the call volume in the state” rather than from 90% of agencies?
A: The program did consider using call volume versus percentage of agencies. One of the reasons that the program decided on using the percentage of agencies was that many states and territories may not know the total call volume but could reasonably know the number of agencies in their state. The program is concerned that if this measure captured the percent of call volume rather than the percent of agencies, the program would not understand what was happening at small agencies. For example, by using call volume, the measure could be considered biased toward rural states with a few urban areas that potentially have 90% of the call volume. This data is important for developing program planning to address the needs of all EMS agencies.
Q: My state may never convert to NEMSIS v3, so can the measure be rewritten to eliminate the version 3 part of the measure?
A: Unfortunately, no, the program is committed to assisting our federal partner, the National Highway Traffic Safety Administration (NHTSA), in their efforts to create a national databank of EMS data. In addition, beginning January 1, 2017, the NEMSIS Technical Assistance Center (TAC) will no longer accept NEMSIS v2 data from states and territories. As a result, the program wants to be current with national standards.
Q: My state does not license EMS agencies; does this mean that I don’t have to report data on this measure?
A: Even if your state or territory does not license EMS agencies, you still have to report on this measure. The intent of this measure is to determine how many agencies in the country are submitting NEMSIS v3 data, whether those agencies are licensed at a state, local, or some other level. If the EMS agencies in your state submit NEMSIS v3 data in 2017 (the expected first round of data collection on these new EMS for Children measures) then you should report the number of EMS agencies that submit NEMSIS v3 data to the State EMS office. Your state EMS Data Manager should be able to assist you with these numbers.
Q: The NEMSIS measure does not go far enough to improve the quality of the EMS data that is being submitted. Can the measure be revised to include a list of pediatric data elements, data validation, and scoring tools?
A: This is a great idea and is something that will be considered for development in the future. For the next five years, the EMS for Children program is interested in knowing how many agencies submit v3 data in order to have a baseline of EMS data collection numbers.
Q: This measure is not pediatric-specific and is out of scope of the EMS for Children program. Can it be eliminated as a measure?
A: No. The EMS for Children Program has participated in the development of a national EMS data system since its inception, and believes this is an important area for performance measurement and improvement. Past Funding Opportunity Announcements (FOAs) for the State Partnership Grants have allowed grantees to use grant funds to support the EMS data infrastructure in their states, so this measure is in line with past EMS for Children efforts and within the scope of the program. In addition, since NEMSIS collects data on patients of all ages, and does contain pediatric-specific variables, pediatric patients are included.
Change/addition to wording; definition added
5 12/11/2015 EMSC PA Emergency Health Services Council- EMS for Children Project Director PA Emergency Health Services Council EMSC 02 Pediatric Emergency Care Coordination We support the development of a designated position at an EMS agency to improve pediatric emergency medical care. However, we have multiple concerns with this PM in its current form. We are significantly concerned about the current definition of an “EMS agency”. The Commonwealth of Pennsylvania does not have any designation as to whether or not an EMS agency responds to emergency calls. Additionally, Pennsylvania licenses non-transporting Quick Response Services that provide lower level EMS care designed to get an EMS provider to the scene more quickly than an ambulance could arrive to the scene. Therefore, the current definition, when applied to Pennsylvania, covers approximately 1,600 different EMS agencies, making surveying and logistical considerations for this PM a significant and perhaps unattainable challenge. Current estimates suggest that there are >10% non-emergency transport-only EMS agencies in Pennsylvania, very few of which EVER see a pediatric patient and will be very resistant to implementing such a program. In addition, many of our rural EMS agencies are mostly/totally volunteer services, and have extreme difficulty providing even minimal staffing for their ambulances. Adding additional requirements on these agencies that already have incredibly limited resources would not be received well and could result in political struggles for the program.





We suggest amending this PM to reference specifically to EMS agencies that respond to emergency calls and are transport-capable. We believe these EMS agencies are the ones who a) will benefit the most from an EMS agency PECC and b) will be the most willing to comply with the creation of such a position. In addition, we believe this PM will require significant support from both HRSA and the soon-to-be awarded EIIC to help make this proposed PM become a reality.
Definitions are provided throughout as well as in program-specific information. No changes necessary
6 12/11/2015 EMSC PA Emergency Health Services Council- EMS for Children Project Director PA Emergency Health Services Council EMSC 03 Use of Pediatric-Specific Equipment We fully support the verification that EMS providers are able to use pediatric-specific equipment on pediatric patients. That said, we severely disagree with the metrics used for evaluation suggested in this proposed PM. There is an ongoing shortage of EMS personnel in Pennsylvania, with ever growing demand on the EMS system and limited increases in the number of providers, so to be able to perform evaluations via field encounters would be not possible given the current political and social environment. Therefore, we suggest the full removal of that portion of the PM. We also worry that requiring skills stations and/or case scenarios twice or more per year would be very difficult for EMS agencies to manage, especially our large agencies with significant numbers of personnel. We believe lowering this number to once per year, at a maximum, would be much more attainable. Many of our ALS agencies already require their ALS practitioners to complete annual skills reviews, allowing this PM to be added to those requirements. As with EMSC 02, we worry that the volunteers will feel like an undue burden has been placed on them, and worry about the political implications associated with that issue. EMSC has responded to all comments on Measure EMSC 03 with revisions and the following FAQs:
Q: To achieve the skill-checking measure a state would have to reach an ‘8’ on the scale. This score is too high and unrealistic. Can you consider lowering the score for achievement?

A: Yes. After reviewing the measuring scale and supporting evidence, the program has decided that a state/territory would achieve this measure by scoring a ‘6’ or higher on the scale.

Q: The skill-checking measure is too broad. Can it be revised to include the specific pieces of equipment that the program is interested in?

A: No. At one point in the development process, the measure did include specific pieces of life-saving equipment, but as we field-tested the data collection instrument, we learned that there was variability among agencies as to what pieces of equipment were considered out of scope or if the medical director allowed agencies to use that equipment. As a result, the program determined that knowing whether a process to skill-check is in place, rather than which specific pieces of equipment are tested, would help the program understand how prepared agencies are to care for children. We assume that agencies that invest the resources to skill-check will select the pieces of equipment that are most crucial or very rarely used in the field to care for children.

Q: Our state certifies EMS instructors, and only state-certified EMS instructors can teach EMS providers; how does this figure into how EMS agencies will respond to the questions in the skill-checking measure?

A: EMS agencies in states which require certified EMS instructors will need to take state regulations into consideration when they respond to the survey questions which ask about the process their agency uses to skill-check.

Q: How do national courses such as PEPP and PALS and certifications like NREMT CCP fit into the skill-checking measure?

A: As long as it is an in-person course, an agency would respond in a way that reflects the methods used in the course or certification (i.e. simulation, skill-station) and frequency of the skill-checking activities whether the process occurred via a national course or not.

Q: Does an agency have to use or adopt all three of the methods listed (skill-station, simulation, and field encounter) to meet the measure?

A: No, the measure can be achieved by using only two of the three described methods.

Q: Can HRSA consider using the state’s recertification/re-licensure period as the time period for when skills check should occur?

A: At this time, a couple of states recertify/re-license annually so this would make the performance measure target burdensome. In addition, a couple of states recertify/relicense EMS providers every five years which would include a very long period of time before pediatric skills check would be required. The comment was considered but may not work well for the states with shorter and longer periods of recertification/re-licensure.

Change/addition to wording; definition added
7 12/11/2015 EMSC PA Emergency Health Services Council- EMS for Children Project Director PA Emergency Health Services Council EMSC 04 Pediatric Medical Emergencies We thank HRSA for extending the deadlines related to each of these PMs. We have no further comments on these PMs.

No resolution needed. No changes necessary
8 12/11/2015 EMSC PA Emergency Health Services Council- EMS for Children Project Director PA Emergency Health Services Council EMSC 05 Pediatric traumatic emergencies We thank HRSA for extending the deadlines related to each of these PMs. We have no further comments on these PMs.

No resolution needed. No changes necessary
9 12/11/2015 EMSC PA Emergency Health Services Council- EMS for Children Project Director PA Emergency Health Services Council EMSC 06 Inter-facility transfer guidelines containing all components We thank HRSA for extending the deadlines related to each of these PMs. We have no further comments on these PMs.

No resolution needed. No changes necessary
10 12/11/2015 EMSC PA Emergency Health Services Council- EMS for Children Project Director PA Emergency Health Services Council EMSC 07 Inter-facility transfer guidelines covering pediatric patients We thank HRSA for extending the deadlines related to each of these PMs. We have no further comments on these PMs.

No resolution needed. No changes necessary
11 12/11/2015 EMSC PA Emergency Health Services Council- EMS for Children Project Director PA Emergency Health Services Council EMSC 08 Established Permanence of EMSC We have no comments on this PM. No resolution needed. No changes necessary
12 12/11/2015 EMSC PA Emergency Health Services Council- EMS for Children Project Director PA Emergency Health Services Council EMSC 09 Established permanence of EMSC by integrating EMSC priorities into statutes/ regulations We thank HRSA for extending the deadlines related to each of these PMs. We have no further comments on these PMs.

No resolution needed. No changes necessary
13 12/16/2015
Townhall Participants
All/ Most Domain Measures
Need for direct service as an option for Tiers 2 and 3 of the measures. Direct service added to Tiers 2 and 3, connected to Form 7. Change/addition to wording
14 12/16/2015
Townhall Participants
All/ Most Domain Measures
Debbie Mays – is it possible to provide some additional definition (e.g., Table 1 columns that differentiate local partners from national partners)…
Answer: Submit that for comments, then they will take that into consideration for the official OMB package.

Resolved in individual comments.

Addressed in other similar or identical comment.
15 12/17/2015
Townhall Participants
Not Related to Measure
Question: Is there a reason why the new measures are requiring so much detail? It seems like this level of detail belongs in the narrative.
Answer: The reason there is more detail is that we wanted to create a data system where we could collate the data quickly to be able to tell MCH’s story more easily. We get questions from Congress, and we need to be able to summarize data and responses more quickly.



Question: When will this be implemented?
Answer: Around October 1, 2016.



Question: Will DGIS be a “new” reporting requirement/system for Title V state grantees, in addition to the reporting provided via TVIS?
Answer: TVIS is dedicated to the state block grant. If you as a state have a discretionary grant, you’ll be using DGIS to report on that grant.

No resolution needed. No changes necessary
16 12/28/2015 Workforce Development/ Training UNC Gillings School of Global Public Health Center of Excellence and the National MCH Workforce Development Center Chapel Hill, NC CB 2 Technical Assistance This measure captures reasonable domains. The metric, i.e., # of participants, exhibits the same challenge as described for T8.









No resolution needed. No changes necessary
17 12/28/2015 Workforce Development/ Training UNC Gillings School of Global Public Health Center of Excellence and the National MCH Workforce Development Center Chapel Hill, NC CB 2 Technical Assistance CB2 - This measure captures reasonable domains. The metric, i.e., # of participants, exhibits the same challenge as described for T8.

- Note that Injury Prevention is duplicated in the list on page 41.
Fix: Injury Prevention is duplicated in the list on page 41. "Grammar/spelling/error issue, now fixed"
18 12/28/2015 Workforce Development/ Training UNC Gillings School of Global Public Health Center of Excellence and the National MCH Workforce Development Center Chapel Hill, NC CB 3 Impact Measurement CB3 - This is a useful and valuable measure. Noted. This can be taken into consideration when we assign measures later this year. "Relates to ability to report, and should be taken into consideration when assigning measures further down the road."
19 12/28/2015 Workforce Development/ Training UNC Gillings School of Global Public Health Center of Excellence and the National MCH Workforce Development Center Chapel Hill, NC CB 4 Sustainability CB4 - Sustainability is relevant in certain projects and not in others. This should be made clear. For example, some projects are meant to be demonstrations or tests without any sustainability intent. More importantly, sustainability capacity differs based on the inherent activity. For example, an important dimension of the public health training programs is that they are a “public good,” meaning that beyond students and participants in technical assistance, the contributions cannot be limited to users and the contributions are not depleted if more users take advantage of the education and research produced by the programs. While it is important for training programs to attend to developing resources, the goal to be independent of MCHB or government support at some level does not seem reasonable. Noted. This can be taken into consideration when we assign measures later this year. "Relates to ability to report, and should be taken into consideration when assigning measures further down the road."
20 12/28/2015 Workforce Development/ Training UNC Gillings School of Global Public Health Center of Excellence and the National MCH Workforce Development Center Chapel Hill, NC CB 5 Scientific Publications CB5 - Articles and in press seems much too narrow. The universe of scholarly work is much broader. Scholarly products, including official reports, monographs, etc. Published needs to be added after in press.

Change/addition to wording
21 12/28/2015 Workforce Development/ Training UNC Gillings School of Global Public Health Center of Excellence and the National MCH Workforce Development Center Chapel Hill, NC CB 6 Products CB6 - This measure is closely related to CB5. It would be more illuminating to create a single measure that clearly captures the domains of scholarly work. No. This is intended to separate those things that are adding to the scholarly body of work from the catchall of products. No changes necessary
22 12/28/2015 Workforce Development/ Training UNC Gillings School of Global Public Health Center of Excellence and the National MCH Workforce Development Center Chapel Hill, NC Core 1 Grant Impact Grant impact is clear and appropriate.




No resolution needed. No changes necessary
23 12/28/2015 Workforce Development/ Training UNC Gillings School of Global Public Health Center of Excellence and the National MCH Workforce Development Center Chapel Hill, NC Core 2 Quality Improvement Core 2 - QI is clear and appropriate. No resolution needed. No changes necessary
24 12/28/2015 Workforce Development/ Training UNC Gillings School of Global Public Health Center of Excellence and the National MCH Workforce Development Center Chapel Hill, NC Core 3 Health Equity – MCH Outcomes Core 3 - Health equity is an important measure. The Tier 2 items do not capture the breadth of this domain. For example, factors like first in family to attend school, first generation in the U.S., and other examples of social determinants would enrich the picture provided by this measure. Leave as is, but add 'other'. Change/addition to wording
25 12/28/2015 Workforce Development/ Training UNC Gillings School of Global Public Health Center of Excellence and the National MCH Workforce Development Center Chapel Hill, NC LC 1 Adequate Health Insurance Coverage LC1 - Tier 3 activities are relevant to training grants. It is not clear, however, how to measure the # receiving TA training or the # receiving professional/organizational development training.

- The Data Collection form should be illuminating overall.
Addressed elsewhere. Addressed in other similar or identical comment.
26 12/28/2015 Workforce Development/ Training UNC Gillings School of Global Public Health Center of Excellence and the National MCH Workforce Development Center Chapel Hill, NC LC 2 Tobacco and eCigarette Cessation LC2 - While important, individual training programs may or may not have any individuals directly engaged in tobacco cessation. We assume this activity is not an expectation for all programs.

- The Data Collection form should be illuminating overall.
Noted. This can be taken into consideration when we assign measures later this year. "Relates to ability to report, and should be taken into consideration when assigning measures further down the road."
27 12/28/2015 Workforce Development/ Training UNC Gillings School of Global Public Health Center of Excellence and the National MCH Workforce Development Center Chapel Hill, NC LC 3 Oral Health While important, individual training programs may or may not have any individuals directly engaged in oral health. We assume this activity is not an expectation for all programs.
LC2 is unlikely to be required of training programs. No changes necessary
28 12/28/2015 Workforce Development/ Training UNC Gillings School of Global Public Health Center of Excellence and the National MCH Workforce Development Center Chapel Hill, NC Training 1 MCH Training Program Family Member/Youth/Community Member participation T1 - Useful PM. Each of the 5 items are valuable for programs to think about, but it’s not clear that there will be much variation in the table of metrics, given yes/no responses. This is not to suggest that the effort to further delineate these categories would be worth it, however.














No resolution needed. No changes necessary
29 12/28/2015 Workforce Development/ Training UNC Gillings School of Global Public Health Center of Excellence and the National MCH Workforce Development Center Chapel Hill, NC Training 2 MCH Training Program Cultural Competence T2 - Useful PM. Each of the 6 items are valuable for programs to think about, but it’s not clear that there will be much variation in the table of metrics, given yes/no responses. This is not to suggest that the effort to further delineate these categories would be worth it, however. Noted. This can be taken into consideration when we assign measures later this year. "Relates to ability to report, and should be taken into consideration when assigning measures further down the road."
30 12/28/2015 Workforce Development/ Training UNC Gillings School of Global Public Health Center of Excellence and the National MCH Workforce Development Center Chapel Hill, NC Training 4 MCH Pipeline Program – Work with MCH populations T4 - The Significance is missing a sentence. MCHB places special emphasis on improving service delivery to women, children and youth from communities with limited access to comprehensive care. One goal of pipeline programs is to increase the pool of students who seek to provide services to the MCH population.
Data Form - The data collection form seems restrictive. We would consider it a success if pipeline graduates bring insights about the MCH population to whatever professional setting they are in, even if not strictly defined as an MCH program.
This measure is specific to the MCH Pipeline Training Programs and is aligned with the specific goals and expected outcomes of the program outlined in the FOA.

No changes necessary
31 12/28/2015 Workforce Development/ Training UNC Gillings School of Global Public Health Center of Excellence and the National MCH Workforce Development Center Chapel Hill, NC Training 6 Demonstrate Field Leadership T6 - The relevance of the Benchmarks is not clear.
Data Form - Data Collection Section A: The categories are reasonable. If this PM is meant to get a snapshot it is useful. If, however, programs will be measured, either explicitly or implicitly, 2 years is a very short window for demonstrating meaningful leadership.
Benchmarks revised. Change/addition to wording
32 12/28/2015 Workforce Development/ Training UNC Gillings School of Global Public Health Center of Excellence and the National MCH Workforce Development Center Chapel Hill, NC Training 7 Diversity of Long-Term Trainees T7 - The Significance would be appropriate for T4 as previously noted.
- A broader definition of diversity would be illuminating: first in family in graduate school, gender identity, first generation in U.S. are some examples.
The race and ethnicity categories reflected in this measure align with the data collected as part of the U.S. Census data and adhere to the 1997 Office of Management and Budget (OMB) standards on race and ethnicity which guide the Census Bureau data collection. The race and ethnicity categories will not be revised for this measure. No changes necessary
33 12/28/2015 Workforce Development/ Training UNC Gillings School of Global Public Health Center of Excellence and the National MCH Workforce Development Center Chapel Hill, NC Training 8 Title V Collaboration T8 - The use of these Benchmarks is not clear. The Significance is clear.
Data Form - Data Collection contains a reasonable set of types of activities. The quantification of activities is a problem, however. For example, the process count of the # of activities can be interpreted for a statewide training of all local health departments on a particular topic to be 1 collaborative CE or TA activity – which could be reported similarly if it was a training directed at the State Health Department (1 activity) or at an interdisciplinary group of MCH stakeholders (1 activity). The metrics does not capture the magnitude of potential or actual impact on the practice of MCH or the potential to actually affect population outcomes in MCH. We appreciate that the Bureau is challenged to ‘quantify’ these measures, but we lose much in the translation.
Progress reports can be used to capture detailed information on more intensive technical assistance activities. No changes necessary
34 12/28/2015 Workforce Development/ Training UNC Gillings School of Global Public Health Center of Excellence and the National MCH Workforce Development Center Chapel Hill, NC Training 9 Interdisciplinary Practice T9 - The Significance is unnecessarily narrow, because care implies clinical care. At a minimum, the wording should be changed to “care/services” or “care/practice.”
Data Form - The Data Collection captures important dimensions of interdisciplinary practice. Ideally, the question would be time-limited. For example, “during the past 3 months, how often have trainees sought information from other professions or disciplines.” As the question stands, all the responses are likely to be very high.
- While we appreciate the value of 10 year follow-up, the costs of ascertaining this information are quite high, especially when considering the 5 year duration of the training grants.
Significance language has been modified slightly. Change/addition to wording; definition added
35 12/28/2015 Workforce Development/ Training UNC Gillings School of Global Public Health Center of Excellence and the National MCH Workforce Development Center Chapel Hill, NC Training 12 Work with MCH Populations T12 - Straightforward and valuable measure. No resolution needed. No changes necessary
36 12/28/2015 Workforce Development/ Training UNC Gillings School of Global Public Health Center of Excellence and the National MCH Workforce Development Center Chapel Hill, NC Training 13 Policy Development, Implementation, and Evaluation T13 - Straightforward and valuable measure. No resolution needed. No changes necessary
37 12/28/2015 EMSC University of Colorado Denver School of Medicine and the Colorado Emergency Medical Services Program Denver, CO EMSC 03 Use of Pediatric-Specific Equipment We are generally in agreement with the measures listed, and excited at the progress they will enable within our state moving forward. We are concerned, however with the narrow construction of performance measure EMSC 03 regarding the use of pediatric equipment. Overall, while we find the conceptual justification for this measure to be very reasonable, we are concerned that a significant amount of effort will be expended to acquire incomplete and ineffectual information based on how this measure is currently constructed. As such, we would encourage changes to this performance measure as currently drafted in order to provide a more comprehensive and useful measurement of the systems in place to ensure EMS provider competency in pediatric care for the following reasons:



First and foremost, we anticipate this measure will be assessed through the electronic surveying of EMS services within our state. While this responsibility will not fall directly on our state EMS for Children program, we anticipate, based on past experience, that we will expend significant effort and goodwill to encourage a high response rate amongst our stakeholders. As such, we believe it is critical that we ask for comprehensive and actionable information in exchange.



Furthermore, the Miller framework for the assessment of Clinical Skills / Competence / Performance referenced in the performance measure proposal lists 4 areas related to the development of competence including knowledge, competence, performance and action1. From the provider perspective this means the provider has the requisite knowledge base, knows how to apply it, demonstrates how to apply it, and integrates that knowledge into clinical practice. As currently proposed, this measure will only measure a narrow sliver of applied knowledge regarding isolated equipment use. The measure is further concerning as it describes the measurement of “the correct use of pediatric specific equipment” which currently has no definition regarding what the equipment is, or should be. While the use of appropriately sized medical equipment is clearly an element of providing pediatric care, it is by no means the entirety of safe and effective care.



The performance measure justification further references the work of Lammers et. al and Su et. al. While the actual Lammers et. al work referenced is unclear, his work to date reflects the identification of errors in pediatric care by EMS providers in simulated environments2,3. Lammers suggests a variety of remedies for the errors found including targeted training, the use of quick reference tools, equipment inspection and testing of competency with medication dosing. Equipment issues, when referenced, often relate to generalized care equipment such as oxygen, airway adjuncts and glucometers used in both adult and pediatric patients3. None of these additional factors are considered in this performance measure. The work of Su and colleagues relates to retained knowledge after completion of pediatric resuscitation coursework with no specific reference to equipment4. Furthermore, the use of simulation to maintain and improve competency is referenced as an area of great promise regarding the continued competency of EMS providers by the IOM as well as other researchers5-7. Neither required coursework nor the use of simulation to validate competency at the service level is identified as part of this performance measure.



Another consideration regarding EMS provider competency is the ongoing availability and promotion of pediatric resuscitation and emergency care training such as the Pediatric Advanced Life Support (PALS) program, the Pediatric Education for Prehospital Providers (PEPP) course and the Emergency Pediatric Care (EPC) program which are in widespread use nationwide and often heavily promoted or subsidized by state EMS for Children programs. Despite their ongoing place in EMSC, the overall usage rates of these programs, is unknown and has never been measured by the EMS for Children program despite the fact it is fundamental to Miller’s framework.



Considering all of these factors and the variety of issues surrounding actual EMS provider competence, the proposed performance measure may be insufficient and will not likely afford the MCHB with adequate information to evaluate EMS provider competency assurance within EMS organizations, or the journey towards it. The proposed measurement as currently crafted will create a burden on EMS agencies in its collection but may fail to provide effective guidance to enable improvement. We would therefore suggest an alternative or modified performance measure, designed to more comprehensively evaluate the mechanisms in place to assure provider competency in pediatric care. Examples of more comprehensive measurement could include:



•Percentage of providers with supplemental pediatric education (i.e. PALS, PEPP and EPC),

•Existence of quality improvement metrics based on pediatric care protocols,

•Amount of agency level training specific to pediatric equipment, drug dosing and care protocols,

•Regularity of inspection of pediatric equipment,

•Availability of pediatric reference tools,

•Availability and use of simulation training in pediatric care and,

•Regularity of competency evaluation utilizing pediatric case scenarios.



It should be further noted that the measurement of these additional areas to a high degree of specificity will likely require no more than 10 – 20 survey questions, significantly less than the amount of information solicited from EMS organizations under previous performance measures. In contrast to the proposed measure, information on these expanded elements will provide state partnership grantees and the MCHB with more detailed information on what competency assurance elements are in place on an agency by agency basis, and where improvement efforts can be best targeted. Thank you for the opportunity to comment and please do not hesitate to contact our state partnership program manager, Sean Caffrey at [email protected] or 303-724-2565 if you have any additional questions regarding these comments.



Sincerely.



Sean M. Caffrey, MBA, CEMSO, NRP



Kathleen M. Adelgais, MD, MPH

EMSC has responded to all comments on Measure EMSC 03 with revisions and the following FAQs:
Q: To achieve the skill-checking measure a state would have to reach an ‘8’ on the scale. This score is too high and unrealistic. Can you consider lowering the score for achievement?

A: Yes. After reviewing the measuring scale and supporting evidence, the program has decided that a state/territory would achieve this measure by scoring a ‘6’ or higher on the scale.

Q: The skill-checking measure is too broad. Can it be revised to include the specific pieces of equipment that the program is interested in?

A: No. At one point in the development process, the measure did include specific pieces of life-saving equipment, but as we field-tested the data collection instrument, we learned that there was variability among agencies as to what pieces of equipment were considered out of scope or if the medical director allowed agencies to use that equipment. As a result, the program determined that knowing whether a process to skill-check is in place, rather than which specific pieces of equipment are tested, would help the program understand how prepared agencies are to care for children. We assume that agencies that invest the resources to skill-check will select the pieces of equipment that are most crucial or very rarely used in the field to care for children.

Q: Our state certifies EMS instructors, and only state-certified EMS instructors can teach EMS providers; how does this figure into how EMS agencies will respond to the questions in the skill-checking measure?

A: EMS agencies in states which require certified EMS instructors will need to take state regulations into consideration when they respond to the survey questions which ask about the process their agency uses to skill-check.

Q: How do national courses such as PEPP and PALS and certifications like NREMT CCP fit into the skill-checking measure?

A: As long as it is an in-person course, an agency would respond in a way that reflects the methods used in the course or certification (i.e. simulation, skill-station) and frequency of the skill-checking activities whether the process occurred via a national course or not.

Q: Does an agency have to use or adopt all three of the methods listed (skill-station, simulation, and field encounter) to meet the measure?

A: No, the measure can be achieved by using only two of the three described methods.

Q: Can HRSA consider using the state’s recertification/re-licensure period as the time period for when skills check should occur?

A: At this time, a couple of states recertify/re-license annually so this would make the performance measure target burdensome. In addition, a couple of states recertify/relicense EMS providers every five years which would include a very long period of time before pediatric skills check would be required. The comment was considered but may not work well for the states with shorter and longer periods of recertification/re-licensure.

Change/addition to wording; definition added
38 12/29/2015 Workforce Development/ Training University of Alabama at Birmingham Pediatric Pulmonary Center Birmingham, AL CB 1 State capacity for advancing the health of MCH populations CB1 - The proposed Capacity-Building Measure 1 does not seem applicable to training grants.




Noted. This can be taken into consideration when we assign measures later this year. "Relates to ability to report, and should be taken into consideration when assigning measures further down the road."
39 12/29/2015 Workforce Development/ Training University of Alabama at Birmingham Pediatric Pulmonary Center Birmingham, AL CB 2 Technical Assistance CB2 - The proposed Capacity-Building Measure 2 (TA) is duplicative of information collected in the Administrative Forms. If required of MCHB funded training grants, this would be a nearly 100% duplication of effort. It is unclear why both would be required. Duplication will not be an issue because the data system will autopopulate wherever possible. No changes necessary
40 12/29/2015 Workforce Development/ Training University of Alabama at Birmingham Pediatric Pulmonary Center Birmingham, AL CB 5 Scientific Publications CB5/6 - The proposed Capacity-Building Measure 5 (scientific publications) and proposed Capacity-Building Measure 6 (products) are duplicative of information collected in Administrative Forms (products and publications). If required of MCHB funded training grants, this would be a nearly 100% duplication of effort. It is unclear why both would be required. There will be connection of the information. They serve different purposes, and given that the data is already being collected, it is our opinion that it is not burdensome to use that information in two ways. No changes necessary
41 12/29/2015 Workforce Development/ Training University of Alabama at Birmingham Pediatric Pulmonary Center Birmingham, AL CB 6 Products CB5/6 - The proposed Capacity-Building Measure 5 (scientific publications) and proposed Capacity-Building Measure 6 (products) are duplicative of information collected in Administrative Forms (products and publications). If required of MCHB funded training grants, this would be a nearly 100% duplication of effort. It is unclear why both would be required. There will be connection of the information. They serve different purposes, and given that the data is already being collected, it is our opinion that it is not burdensome to use that information in two ways. No changes necessary
42 12/29/2015 Workforce Development/ Training University of Alabama at Birmingham Pediatric Pulmonary Center Birmingham, AL LC 1 Adequate Health Insurance Coverage - The new reporting package will impose additional reporting burden on MCHB training grantees. No performance measures are deleted. Existing performance measures are modified, and additional measures are proposed.

-Having training grantees report on measures that involve patient/client information does not align with the purpose of MCHB funded training grants. Funding is not allocated for patient care/client activities. Funding is allocated for training activities for graduate students pursuing careers as leaders in MCH, to provide continuing education, and technical assistance for MCH professionals. Adding a reporting requirement on patient care/client activities (for example, number of clients referred for insurance coverage as part of Performance Measure LC1; number of clients assessed/screened for tobacco cessation as part of Performance Measure LC2, etc.) would require significant time of project faculty to develop a system to track this information.
Comments about some measures are erroneous, as those measures would in all likelihood not be assigned to programs to which they are not applicable. No changes necessary
43 12/29/2015 Workforce Development/ Training University of Alabama at Birmingham Pediatric Pulmonary Center Birmingham, AL LC 2 Tobacco and eCigarette Cessation - Having training grantees report on measures that involve patient/client information does not align with the purpose of MCHB funded training grants. Funding is not allocated for patient care/client activities. Funding is allocated for training activities for graduate students pursuing careers as leaders in MCH, to provide continuing education, and technical assistance for MCH professionals. Adding a reporting requirement on patient care/client activities (for example, number of clients referred for insurance coverage as part of Performance Measure LC1; number of clients assessed/screened for tobacco cessation as part of Performance Measure LC2, etc.) would require significant time of project faculty to develop a system to track this information. Noted. This can be taken into consideration when we assign measures later this year. "Relates to ability to report, and should be taken into consideration when assigning measures further down the road."
44 12/29/2015 Workforce Development/ Training University of Alabama at Birmingham Pediatric Pulmonary Center Birmingham, AL General Forms/ ADEs
The “revised form 6” (abstract), Section V, section 2, is titled “Aims and Key Activities.” Our 2015 FOA included 4 “aims” to be addressed in the application, however we were also required to write goals and measurable objectives using SMART format. Clarification on whether this section should be written addressing the “aims” from the FOA or the goals and objectives as written in our application would be helpful. The instructions indicate MCHB will pre-populate this information, but uses the terms “aims” and “goals” interchangeably in the instructions. In assigning and/ or reviewing forms, Project Officers will clarify what aims/ goals are being referred to, as requirements about development of such measures vary across programs.

"Grammar/ spelling/ error issue, now fixed"
45 12/29/2015 Workforce Development/ Training University of Alabama at Birmingham Pediatric Pulmonary Center Birmingham, AL General Forms/ ADEs
We have been notified by MCHB that they intend to replace the Continuing Education reporting administrative form currently in the OMB package with a different form. The new form resembles current reporting requirements and would not impose any additional reporting burden. No resolution needed. No changes necessary
46 12/29/2015 Workforce Development/ Training University of Alabama at Birmingham Pediatric Pulmonary Center Birmingham, AL General Forms/ ADEs
The instructions on page 161 for the project abstract do not match the abstract form.



Abstract Form | Instructions
I. Project Identifier Information | I. Project Identifier Information
II. Budget | II. Budget
III. Types of Service Provided | III. Types of Services
IV. Domain Services are Provided to | (no instructions)
V. Project Description or Experience to Date | IV. Program Description OR Current Status
1. Problem | 1. Brief Description of project/problem
2. Aims/Activities | 2. Up to 5 “aims” (see above)
3. HP2010 Objectives | 3. HP2020 Objectives (2010 or 2020?)
4. (there is no #4) | 4. Describe programs/activities to reach aims
5. Coordination | 5. Coordination
6. Evaluation | 6. Evaluation
7. Quality Improvement Activities | (no instructions for reporting of QI activities)
V. (V is repeated) Key Words | V. Key Words
VI. Annotation | VI. Annotation
Correct instructions in "REVISED INSTRUCTIONS FOR THE COMPLETION OF FORM 6 PROJECT ABSTRACT"


"Grammar/ spelling/ error issue, now fixed"
47 12/29/2015 Workforce Development/ Training University of Alabama at Birmingham Pediatric Pulmonary Center Birmingham, AL General Forms/ ADEs
The proposed administrative form for TA is a significant change from prior reporting requirements. Having very different reporting forms for CE and TA requires development of a new reporting system for grantees and imposes additional reporting burden for grantees. Specifically, the “target audience” pick list for TA (Local, Title V, Within State, Another State, Regional, National, International) does not match the pick list for CE target audience (Within State, With Another State, Regional, National, International). It should be noted that the TA pick list categories are not mutually exclusive. For instance, is TA provided to my state Title V program counted as “Title V” or “Within State”? The topic lists for TA and CE also do not match. The CE topic lists resemble the current reporting format and impose new additional burden. The proposed TA topic list A lacks an “other” category, placing a limit on allowing grantees to tell their story. The proposed TA topic list B is aligned with the National MCHB Block Grant Performance Measures and would require a new reporting system, an additional burden on grantees, and also lacks an “other” category. MCHB funded training grants have different goals than Block Grants and may provide TA in different categories. For example, the primary goal of training grants is to train future leaders. There is no TA topic for “leadership” and this information would therefore not be captured; there is also no TA topic for systems of care for CSHCN, another major focus of training grants. In conclusion, alignment of CE and TA reporting requirements is encouraged to facilitate a single system for grantees; and alignment of CE and TA reporting with training grantee mission is encouraged. DMCHWD has its own CE form with relevant topics; DMCHWD made suggestions for addition to TA form for topics related to Autism CARES legislation. Change/addition to wording
48 1/4/2016 EMSC Kansas Board of Emergency Medical Services Topeka, KS EMSC 01 NEMSIS Submission EMSC 01 – Submission of NEMSIS compliant version 3.x data

The state of Kansas currently provides a cost-free electronic PCR solution, utilizing a NEMSIS v3.x compliant vendor, as well as a statutory mandate of reporting electronic patient care data into this system. Even with both of these items, we still have a significant percentage of ambulance services that do not submit data into this system. We believe that this performance measure falls outside the scope of the EMS for Children grant. However, it could be altered to address a percentage of pediatric calls submitted rather than a percentage of services submitting data.

Overall Comments
In each of the definitions, an EMS agency includes transporting and non-transporting agencies as well as excludes those services that only respond in air or on water. In the state of Kansas and in some other states, non-transporting agencies fall outside the jurisdiction of the state regulatory entity – even though the licensure/certification of their personnel is within that jurisdiction. We also believe that if the desire is to have an all-encompassing view of prehospital care, then those air and water-only EMS services should also be included.
We are very appreciative of HRSA wishing to find ways of automated collection techniques to minimize the information collection burden. We feel that this will prove to be a more efficient method of timely analysis.
EMSC has responded to all comments on Measure EMSC 01 with revisions and the following FAQs:
NEMSIS-SPECIFIC Performance Measure:

Q: Aren’t these new EMS for Children performance measures unfunded mandates?


A: No, these measures were developed so the national EMSC program can obtain baseline data on how the EMS system is operating on a national level and are intended to assist states and territories in showing improvement in these areas over the lifespan of the measures. The EMSC program recognizes that not every state or territory may meet these measures.

Q: Why can’t the measure be “NEMSIS v3 data is being collected on 90% of the call volume in the state” rather than from 90% of agencies?
A: The program did consider using call volume versus percentage of agencies. One of the reasons that the program decided on using the percentage of agencies was that many states and territories may not know the total call volume but could reasonably know the number of agencies in their state. The program is concerned that if this measure captured the percent of call volume rather than the percent of agencies, the program would not understand what was happening at small agencies. For example, by using call volume, the measure could be considered biased toward rural states with a few urban areas that potentially have 90% of the call volume. This data is important for developing program planning to address the needs of all EMS agencies.
Q: My state may never convert to NEMSIS v3, so can the measure be rewritten to eliminate the version 3 part of the measure?
A: Unfortunately, no, the program is committed to assisting our federal partner, the National Highway Traffic Safety Administration (NHTSA), in their efforts to create a national databank of EMS data. In addition, beginning January 1, 2017, the NEMSIS Technical Assistance Center (TAC) will no longer accept NEMSIS v2 data from states and territories. As a result, the program wants to be current with national standards.
Q: My state does not license EMS agencies; does this mean that I don’t have to report data on this measure?
A: Even if your state or territory does not license EMS agencies, you still have to report on this measure. The intent of this measure is to determine how many agencies in the country are submitting NEMSIS v3 data, whether those agencies are licensed at a state, local, or some other level. If the EMS agencies in your state submit NEMSIS v3 data in 2017 (the expected first round of data collection on these new EMS for Children measures) then you should report the number of EMS agencies that submit NEMSIS v3 data to the State EMS office. Your state EMS Data Manager should be able to assist you with these numbers.
Q: The NEMSIS measure does not go far enough to improve the quality of the EMS data that is being submitted. Can the measure be revised to include a list of pediatric data elements, data validation, and scoring tools?
A: This is a great idea and is something that will be considered for development in the future. For the next five years, the EMS for Children program is interested in knowing how many agencies submit v3 data in order to have a baseline of EMS data collection numbers.
Q: This measure is not pediatric-specific and is out of scope of the EMS for Children program. Can it be eliminated as a measure?
A: No. The EMS for Children Program has participated in the development of a national EMS data system since its inception, and believes this is an important area for performance measurement and improvement. Past Funding Opportunity Announcements (FOAs) for the State Partnership Grants have allowed grantees to use grant funds to support the EMS data infrastructure in their states, so this measure is in line with past EMS for Children efforts and within the scope of the program. In addition, since NEMSIS collects data on patients of all ages, and does contain pediatric-specific variables, pediatric patients are included.
Change/addition to wording; definition added
49 1/4/2016 EMSC Kansas Board of Emergency Medical Services Topeka, KS EMSC 02 Pediatric Emergency Care Coordination EMSC 02 – Pediatric Emergency Care Coordination

The ability for all ambulance services to be able to designate a single person that is responsible for the coordination of pediatric emergency care for the service is a great concept when resources are plentiful. However, in ambulance services with a limited number of responders and personnel, having 1 person designated as being responsible for the level of coordination being gauged by this performance measure is unrealistic. We also believe that this contradicts a regionalized approach to care. We believe that this performance measure could be altered to reflect upon regions within a state rather than individual services. A regional approach to pediatric process improvement, pediatric continuing education opportunities, etc. provides for increased access to “experts” within pediatrics rather than an individual tasked with being the expert.

Overall Comments
In each of the definitions, an EMS agency includes transporting and non-transporting agencies but excludes those services that only respond in air or on water. In the state of Kansas and in some other states, non-transporting agencies fall outside the jurisdiction of the state regulatory entity, even though the licensure/certification of their personnel is within that jurisdiction. We also believe that if the desire is to have an all-encompassing view of prehospital care, then those air- and water-only EMS services should also be included.
We are very appreciative of HRSA wishing to find automated collection techniques to minimize the information collection burden. We feel that this will prove to be a more efficient method for timely analysis.
EMSC has responded to all comments received regarding this measure with revisions and the following FAQ:
Q: Does the PECC need to be on staff at the EMS agency?
A: No. Ideally, the Pediatric Emergency Care Coordinator (PECC) should be a member of the EMS agency and be familiar with the specific day-to-day operations and needs of the agency. Some states/territories utilize county or regional models of emergency care; if there is a designated individual who coordinates pediatric activities for a county or region, that individual could serve as the PECC for one or more individual EMS agencies within the county or region.
Q: Will there be a toolkit available for EMS agencies that provides a job description for a PECC?
A: Yes, the EMS for Children Program resource centers will develop toolkits, fact sheets, and webinars to assist State Partnership Grantees in the implementation of the new performance measures.
Q: Can you add the word ‘injury’ to the PECC role so that it reads ‘promote agency participation in pediatric injury prevention program’?
A: No. As written, the specific function does not exclude injury but rather encompasses all types of prevention programs. Injury prevention is just one type of prevention activity that a PECC could engage in; other prevention programs can include asthma or other childhood illnesses, so the EMS for Children program wants to keep the role more broadly defined. In addition, the EMS for Children program wanted to be consistent with what is recommended in the IOM report, “Emergency Care for Children: Growing Pains” (2006).
Change/ addition to wording,Definition added
50 1/4/2016 EMSC Kansas Board of Emergency Medical Services Topeka, KS EMSC 03 Use of Pediatric-Specific Equipment EMSC 03 – Use of pediatric specific equipment

We appreciate the effort to ensure that pediatric-specific equipment is utilized appropriately and that training for each of the pediatric-specific devices is adequate. Our statutes also address being able to provide care with equipment and medications for which the provider has demonstrated his/her competency. Requiring or requesting an ambulance service to maintain a potentially different method of documenting competency is a burden to the ambulance service. We believe that this performance measure would be better addressed by remaining focused on building aspects of regional centers of pediatric excellence. For many of our ambulance services, the expectation that an EMS provider will be able to provide care to a pediatric patient in the “field” is minimal. Pediatric calls account for approximately 7% of the total calls in Kansas (as reported within our State EMS Information System). Ensuring that each provider is able to provide field care on a pediatric patient within a 2-year period is an impractical burden to ambulance services. However, building a “regional center” with the ability to provide simulated patient scenarios in conjunction with pediatric training sessions may better meet the strategic objective of this measure.

Overall Comments
In each of the definitions, an EMS agency includes transporting and non-transporting agencies but excludes those services that only respond in air or on water. In the state of Kansas and in some other states, non-transporting agencies fall outside the jurisdiction of the state regulatory entity, even though the licensure/certification of their personnel is within that jurisdiction. We also believe that if the desire is to have an all-encompassing view of prehospital care, then those air- and water-only EMS services should also be included.
We are very appreciative of HRSA wishing to find automated collection techniques to minimize the information collection burden. We feel that this will prove to be a more efficient method for timely analysis.
EMSC has responded to all comments on Measure EMSC 03 with revisions and the following FAQs:
Q: To achieve the skill-checking measure a state would have to reach an ‘8’ on the scale. This score is too high and unrealistic. Can you consider lowering the score for achievement?

A: Yes. After reviewing the measuring scale and supporting evidence, the program has decided that a state/territory would achieve this measure by scoring a ‘6’ or higher on the scale.

Q: The skill-checking measure is too broad. Can it be revised to include the specific pieces of equipment that the program is interested in?

A: No. At one point in the development process, the measure did include specific pieces of life-saving equipment, but as we field-tested the data collection instrument, we learned that there was variability among agencies as to what pieces of equipment were considered out of scope or whether the medical director allowed agencies to use that equipment. As a result, the program determined that knowing whether a process to skill-check exists, rather than which specific pieces of equipment agencies tested on, would help it understand how prepared agencies are to care for children. We assume that agencies that invest the resources to skill-check will select the pieces of equipment that are most crucial or very rarely used in the field to care for children.

Q: Our state certifies EMS instructors, and only state-certified EMS instructors can teach EMS providers; how does this figure into how EMS agencies will respond to the questions in the skill-checking measure?

A: EMS agencies in states which require certified EMS instructors will need to take state regulations into consideration when they respond to the survey questions which ask about the process their agency uses to skill-check.

Q: How do national courses such as PEPP and PALS and certifications like NREMT CCP fit into the skill-checking measure?

A: As long as it is an in-person course, an agency would respond in a way that reflects the methods used in the course or certification (i.e. simulation, skill-station) and frequency of the skill-checking activities whether the process occurred via a national course or not.

Q: Does an agency have to use or adopt all three of the methods listed (skill-station, simulation, and field encounter) to meet the measure?

A: No, the measure can be achieved by using only two of the three described methods.
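As a minimal illustration of this two-of-three criterion for anyone tallying agency survey responses, the sketch below checks whether an agency's reported methods satisfy it. This is an assumption-laden sketch only: the method labels mirror the measure text, but the function name and data structure are hypothetical and not part of the approved data collection instrument.

    # Illustrative sketch only: the "two of the three methods" criterion
    # described in the answer above. Names here are hypothetical.
    SKILL_CHECK_METHODS = {"skill station", "simulation", "field encounter"}

    def meets_method_criterion(methods_reported):
        """Return True if an agency reports at least two of the three methods."""
        return len(SKILL_CHECK_METHODS & set(methods_reported)) >= 2

    # A low-volume agency using skill stations and simulation, but with no
    # pediatric field encounters, still meets the criterion.
    print(meets_method_criterion({"skill station", "simulation"}))   # True
    print(meets_method_criterion({"field encounter"}))               # False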

Q: Can HRSA consider using the state’s recertification/re-licensure period as the time period for when skills check should occur?

A: At this time, a couple of states recertify/re-license annually, which would make the performance measure target burdensome. In addition, a couple of states recertify/re-license EMS providers every five years, which would mean a very long period of time before a pediatric skills check would be required. The comment was considered, but this approach would not work well for states with shorter or longer recertification/re-licensure periods.

Change/ addition to wording,Definition added
51 1/4/2016 EMSC NYS Department of Health, Bureau of EMS and Trauma Systems New York EMSC 01 NEMSIS Submission NY recommends the goal should be 90% of the call volume, not 90% of EMS agencies. 90% of the call volume is a more realistic and achievable goal for states to meet: many EMS agencies are so small, with very little call volume (and few EMS personnel), and staffed with volunteers, that incurring the expense to purchase and maintain the software and hardware needed to collect data electronically is too much of a burden, both on a personnel level (to maintain agency compliance, file submission, system maintenance, and staff training) and financially. Additionally, in NY, the data of small agencies that are still using paper patient care reports is entered into NY’s NEMSIS electronic data repository through a contracted vendor who key-punches the data. Therefore, in NY, smaller agencies’ data is captured in an electronic NEMSIS format without those small agencies incurring the cost of an electronic data system. NY would not meet this proposed measure, despite having >90% of the data in an electronic format.

This proposed Performance Measure is a missed opportunity to collect quality pediatric data on a national level. Requiring only NEMSIS 3 submission does not ensure quality data, as data submissions can be sent with null variables. Also, almost all states are already on track to move from NEMSIS 2 to NEMSIS 3. What is needed is a goal to ensure states are receiving ‘good’ data through validation and scoring of data transmitted to the state. HRSA/MCHB should identify specific NEMSIS data elements to monitor/evaluate (with the goal of examining outcomes) and then set a Performance Measure to ensure validation and scoring of those identified data elements. By doing this, NEMSIS TAC would receive version 3 data elements (by virtue of identifying specific NEMSIS version 3 elements) while ensuring better quality data is being submitted.

By requiring NEMSIS submission, HRSA is imposing an unfunded mandate and thereby a burden to EMS services. This is an issue for NHTSA to work out with states, not for HRSA/MCHB to require of its grantees.

EMSC has responded to all comments on Measure EMSC 01 with revisions and the following FAQs:
NEMSIS-SPECIFIC Performance Measure:

Q: Aren’t these new EMS for Children performance measures unfunded mandates?


A: No, these measures were developed so the national EMSC program can obtain baseline data on how the EMS system is operating on a national level and are intended to assist states and territories in showing improvement in these areas over the lifespan of the measures. The EMSC program recognizes that not every state or territory may meet these measures.

Q: Why can’t the measure be “NEMSIS v3 data is being collected on 90% of the call volume in the state” rather than from 90% of agencies?
A: The program did consider using call volume versus percentage of agencies. One of the reasons that the program decided on using the percentage of agencies was that many states and territories may not know the total call volume but could reasonably know the number of agencies in their state. The program is concerned that if this measure captured the percent of call volume rather than the percent of agencies, the program would not understand what was happening at small agencies. For example, by using call volume, the measure could be considered biased toward rural states with a few urban areas that potentially have 90% of the call volume. This data is important for developing program planning to address the needs of all EMS agencies.
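A small worked example may help show why the two denominators discussed above can diverge. The agency records below are hypothetical, not drawn from any state's data; the sketch simply contrasts the percent-of-agencies calculation the measure uses with the percent-of-call-volume alternative raised in the comments.

    # Hypothetical state: a few large urban agencies submit NEMSIS v3 data,
    # many small rural agencies do not. Each record is
    # (agency name, annual 9-1-1 call volume, submits NEMSIS v3?).
    agencies = [
        ("Urban A", 40000, True),
        ("Urban B", 35000, True),
        ("Rural 1", 300, False),
        ("Rural 2", 250, False),
        ("Rural 3", 200, False),
        ("Rural 4", 150, False),
    ]

    pct_agencies = 100 * sum(v3 for _, _, v3 in agencies) / len(agencies)
    pct_calls = 100 * sum(c for _, c, v3 in agencies if v3) / sum(c for _, c, _ in agencies)

    print(f"Agencies submitting v3: {pct_agencies:.1f}%")   # 33.3% -- measure as written
    print(f"Call volume covered:    {pct_calls:.1f}%")      # 98.8% -- alternative denominator

Under the percent-of-agencies definition, this hypothetical state sits far below a 90% target even though nearly all of its call volume is captured electronically, which is the divergence both the commenters and the answer above describe.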
Q: My state may never convert to NEMSIS v3, so can the measure be rewritten to eliminate the version 3 part of the measure?
A: Unfortunately, no. The program is committed to assisting our federal partner, the National Highway Traffic Safety Administration (NHTSA), in its efforts to create a national databank of EMS data. In addition, beginning January 1, 2017, the NEMSIS Technical Assistance Center (TAC) will no longer accept NEMSIS v2 data from states and territories. As a result, the program wants to be current with national standards.
Q: My state does not license EMS agencies; does this mean that I don’t have to report data on this measure?
A: Even if your state or territory does not license EMS agencies, you still have to report on this measure. The intent of this measure is to determine how many agencies in the country are submitting NEMSIS v3 data, whether those agencies are licensed at a state, local, or some other level. If the EMS agencies in your state submit NEMSIS v3 data in 2017 (the expected first round of data collection on these new EMS for Children measures) then you should report the number of EMS agencies that submit NEMSIS v3 data to the State EMS office. Your state EMS Data Manager should be able to assist you with these numbers.
Q: The NEMSIS measure does not go far enough to improve the quality of the EMS data that is being submitted. Can the measure be revised to include a list of pediatric data elements, data validation, and scoring tools?
A: This is a great idea and is something that will be considered for development in the future. For the next five years, the EMS for Children program is interested in knowing how many agencies submit v3 data in order to have a baseline of EMS data collection numbers.
Q: This measure is not pediatric-specific and is out of scope of the EMS for Children program. Can it be eliminated as a measure?
A: No. The EMS for Children Program has participated in the development of a national EMS data system since its inception, and believes this is an important area for performance measurement and improvement. Past Funding Opportunity Announcements (FOAs) for the State Partnership Grants have allowed grantees to use grant funds to support the EMS data infrastructure in their states, so this measure is in line with past EMS for Children efforts and within the scope of the program. In addition, since NEMSIS collects data on patients of all ages and contains pediatric-specific variables, pediatric patients are included.
Change/ addition to wording,Definition added
52 1/4/2016 EMSC NYS Department of Health, Bureau of EMS and Trauma Systems New York EMSC 02 Pediatric Emergency Care Coordination The “Recommended Roles” listed for a Pediatric Emergency Care (PEC) Coordinator in an EMS agency are more extensive than the role of a PEC Coordinator in an Emergency Department, and yet this is an unfunded addition/position and a burden, especially to voluntary EMS agencies.

As stated previously, in the very small, voluntary EMS agencies with very low call volume (<50 calls/year) that are finding it difficult to even staff an ambulance, it is unrealistic to assume a PEC Coordinator with all the recommended roles. This, and the previous Performance Measure, assume most EMS agencies are large, robust entities, which most are not. If this Measure allowed for a regional model for a PEC Coordinator, rather than only at the agency level, it would allow for the pooling of resources for resource-poor EMS agencies. Many states, like NY, already use a regional model and could more easily incorporate resource-intensive initiatives like this when resources are pooled. Utilizing a broader, regional model also assists with consistency and quality assurance which, for larger states, is an issue. NY has 1,200+ EMS agencies and 18 EMS regions and strives to maintain quality and consistency; a more consistent, coordinated program can be disseminated to/from 18 regions rather than to 1,200 individual EMS agencies.

Additionally, we know of no state that allows individual EMS agencies to develop their own protocols (first bullet under Recommended Roles). Protocols are developed at the state or regional level. Asking EMS agencies when surveyed if “the [PEC Coordinator] ensures the pediatric perspective is included in the development of EMS protocols” makes the states and HRSA/MCHB look ignorant of the EMS protocol development process.

EMSC has responded to all comments received regarding this measure with revisions and the following FAQ:
Q: Does the PECC need to be on staff at the EMS agency?
A: No. Ideally, the Pediatric Emergency Care Coordinator (PECC) should be a member of the EMS agency and be familiar with the specific day-to-day operations and needs of the agency. Some states/territories utilize county or regional models of emergency care; if there is a designated individual who coordinates pediatric activities for a county or region, that individual could serve as the PECC for one or more individual EMS agencies within the county or region.
Q: Will there be a toolkit available for EMS agencies that provides a job description for a PECC?
A: Yes, the EMS for Children Program resource centers will develop toolkits, fact sheets, and webinars to assist State Partnership Grantees in the implementation of the new performance measures.
Q: Can you add the word ‘injury’ to the PECC role so that it reads ‘promote agency participation in pediatric injury prevention program’?
A: No. As written, the specific function does not exclude injury but rather encompasses all types of prevention programs. Injury prevention is just one type of prevention activity that a PECC could engage in; other prevention programs can include asthma or other childhood illnesses, so the EMS for Children program wants to keep the role more broadly defined. In addition, the EMS for Children program wanted to be consistent with what is recommended in the IOM report, “Emergency Care for Children: Growing Pains” (2006).
Change/ addition to wording,Definition added
53 1/4/2016 EMSC NYS Department of Health, Bureau of EMS and Trauma Systems New York EMSC 03 Use of Pediatric-Specific Equipment Like the previous Performance Measure, the scoring method with the proposed rubric is unrealistic for smaller, voluntary agencies. The evaluative rubric states a provider must demonstrate his/her skill in each of the three methods (skill station, case scenario, and field encounter). A voluntary EMS provider for a small-volume agency may never see a pediatric patient within a year (or two or three); therefore, requiring EMS providers to demonstrate skills via a field encounter is not realistic or achievable. Has the rubric been validated? In the HRSA/MCHB webinar, HRSA/MCHB referenced the Lamer et al. paper that states a paramedic treats a teen on average once every 625 days, a child every 958 days, and an infant every 1087 days. Using this cited reference, how can this Measure expect that an EMS provider will demonstrate the skill even once every two years in a field encounter, or more frequently (annually or biannually) as the rubric requires?

Additionally, there is concern at the state level about the competency or credentialing of the person who is evaluating EMS providers’ use of equipment. In NY, education and training of EMS providers is controlled at the state level, and an educator has to go through state training to become a Certified Instructor, who can then attest that an EMS training meets a core standard. This proposed Performance Measure would allow a non-certified instructor to attest to a provider’s competency without knowing the competency of that evaluator. Please note: training providers in NY with Certified Instructors is tied to funding (EMS trainings are paid for by the state). If the Measure were changed to require “approved” instructors (or a state like NY were to require certified instructors be evaluators), this Measure would then create a financial burden and thereby another unfunded mandate.

EMSC has responded to all comments on Measure EMSC 03 with revisions and the following FAQs:
Q: To achieve the skill-checking measure a state would have to reach an ‘8’ on the scale. This score is too high and unrealistic. Can you consider lowering the score for achievement?

A: Yes. After reviewing the measuring scale and supporting evidence, the program has decided that a state/territory would achieve this measure by scoring a ‘6’ or higher on the scale.

Q: The skill-checking measure is too broad. Can it be revised to include the specific pieces of equipment that the program is interested in?

A: No. At one point in the development process, the measure did include specific pieces of life-saving equipment, but as we field-tested the data collection instrument, we learned that there was variability among agencies as to what pieces of equipment were considered out of scope or whether the medical director allowed agencies to use that equipment. As a result, the program determined that knowing whether a process to skill-check exists, rather than which specific pieces of equipment agencies tested on, would help it understand how prepared agencies are to care for children. We assume that agencies that invest the resources to skill-check will select the pieces of equipment that are most crucial or very rarely used in the field to care for children.

Q: Our state certifies EMS instructors, and only state-certified EMS instructors can teach EMS providers; how does this figure into how EMS agencies will respond to the questions in the skill-checking measure?

A: EMS agencies in states which require certified EMS instructors will need to take state regulations into consideration when they respond to the survey questions which ask about the process their agency uses to skill-check.

Q: How do national courses such as PEPP and PALS and certifications like NREMT CCP fit into the skill-checking measure?

A: As long as it is an in-person course, an agency would respond in a way that reflects the methods used in the course or certification (i.e. simulation, skill-station) and frequency of the skill-checking activities whether the process occurred via a national course or not.

Q: Does an agency have to use or adopt all three of the methods listed (skill-station, simulation, and field encounter) to meet the measure?

A: No, the measure can be achieved by using only two of the three described methods.

Q: Can HRSA consider using the state’s recertification/re-licensure period as the time period for when skills check should occur?

A: At this time, a couple of states recertify/re-license annually, which would make the performance measure target burdensome. In addition, a couple of states recertify/re-license EMS providers every five years, which would mean a very long period of time before a pediatric skills check would be required. The comment was considered, but this approach would not work well for states with shorter or longer recertification/re-licensure periods.

Change/ addition to wording,Definition added
54 1/4/2016 EMSC NYS Department of Health, Bureau of EMS and Trauma Systems New York EMSC 04 Pediatric Medical Emergencies The minimum percent threshold (25%) to meet the goal is arbitrary and not validated. According to the “National Quality Forum’s Evaluating Regionalized Emergency Medical Care Systems Using an Episodes of Care Approach,” which is cited by HRSA/MCHB in its SPROC FOA:

“…the framework provides a conceptual model for emphasizing the evaluation of emergency medical care within a population or geographical region, rather than within an individual facility or single part of the system. Although earlier measurement efforts have focused on discrete parts of a system, new models should focus on evaluating the integration of the discrete service units that make up a system, and how the entire system performs. Thus, a major goal of this framework is to provide the context for evaluating the system as a whole, rather than just its component parts…”

Nowhere in this statement, nor the remaining report, does it state the number or percentage of facilities in the system is relevant. The goal for this Performance Measure should be whether or not the state has a developed system (yes or no) along with the continued use of the ‘scale’ to determine where states are in the process of developing a system.

Since EMTALA requires all EDs to be able, at a minimum, to stabilize and transfer patients, if NY were to create a designation of pediatric hospitals, NY would not include/designate hospitals with EDs that only stabilize and transfer pediatric patients in its regionalized system since it’s a baseline standard; NY would recognize higher level pediatric-capable hospitals. Therefore NY would not meet the 25% threshold, as it would need 48 out of 190 hospitals to meet this Measure, and realistically the system could not support this excessive number of pediatric hospitals.

HRSA appreciates the comments received related to the current performance measures to assure systems are prepared to stabilize and manage pediatric medical and traumatic emergencies. The comments will be used to begin the discussion as we develop the next generation of Hospital-Based performance measures. The comments are indeed very helpful and will be shared with additional subject matter experts as we work to build a consensus on what should be the next generation of EMSC hospital-based performance measures. We will keep EMSC stakeholders informed throughout the development. Change/ addition to wording,"Relates to ability to report, and should be taken into consideration when assigning measures further down the road."
55 1/4/2016 EMSC NYS Department of Health, Bureau of EMS and Trauma Systems New York EMSC 05 Pediatric traumatic emergencies The minimum percent threshold (50%) to meet the goal is, like the previous Performance Measure, arbitrary, not validated, and excessive, especially for larger states. For example, NY has 8 pediatric trauma centers (4%) as verified by the ACS, and with our regionalized system we feel this is sufficient geographic and coordinated coverage in NY. A 50% threshold would require 95 hospitals in NY to be designated (out of 190) to meet this measure. 95 hospitals is not a coordinated, regionalized, sustainable system that can be supported (nor is it necessary). As with the previous performance measure, the number or percentage of hospitals should not be the evaluative measure.
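As a quick arithmetic check of the thresholds the commenter cites (assuming the 190-hospital figure given above), each threshold is simply the percentage applied to the state's hospital count, rounded up to a whole hospital. The short sketch below reproduces the 48- and 95-hospital figures; the variable names are illustrative only.

    import math

    hospitals_in_state = 190  # figure cited in the NY comments above

    # EMSC 04 uses a 25% threshold; EMSC 05 uses a 50% threshold.
    for measure, threshold in (("EMSC 04", 0.25), ("EMSC 05", 0.50)):
        required = math.ceil(hospitals_in_state * threshold)
        print(f"{measure}: {threshold:.0%} of {hospitals_in_state} hospitals = {required}")
    # EMSC 04: 25% of 190 hospitals = 48
    # EMSC 05: 50% of 190 hospitals = 95

    # For comparison, NY's 8 ACS-verified pediatric trauma centers are roughly
    # 4% of the 190 hospitals (8 / 190 is about 4.2%), as noted in the comment.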

HRSA appreciates the comments received related to the current performance measures to assure systems are prepared to stabilize and manage pediatric medical and traumatic emergencies. The comments will be used to begin the discussion as we develop the next generation of Hospital-Based performance measures. The comments are indeed very helpful and will be shared with additional subject matter experts as we work to build a consensus on what should be the next generation of EMSC hospital-based performance measures. We will keep EMSC stakeholders informed throughout the development. Change/ addition to wording,"Relates to ability to report, and should be taken into consideration when assigning measures further down the road."
56 1/4/2016 Workforce Development/ Training Tulane University School of Public Health and Tropical Medicine New Orleans, LA Training 6 Demonstrate Field Leadership - Current phrasing in the data collection forms A and B is still a bit confusing, as the numerator suggests current/cross-sectional, while the four domains suggest current or past.
- Suggest inserting the phrase “Since completing training program,” if that captures the intent of the measure (or “in past 3 years,” if that is the intent of Form B).
Language has been revised/made consistent to indicate that leadership could be current or past. Change/ addition to wording
57 1/4/2016 Workforce Development/ Training Tulane University School of Public Health and Tropical Medicine New Orleans, LA Training 7 Diversity of Long-Term Trainees - Based on the ultimate intent behind this measure, it may be useful to also capture if a person has immigrated from another country (and is now a citizen or permanent resident), or is the child of immigrants. There are a number of trainees who fall into the racial category of “white,” but have a firsthand understanding of and contribution to cultural and linguistic diversity because of their recent immigration background. The race and ethnicity categories reflected in this measure align with the data collected as part of the U.S. Census and adhere to the 1997 Office of Management and Budget (OMB) standards on race and ethnicity which guide the Census Bureau data collection. The race and ethnicity categories will not be revised for this measure. No changes necessary
58 1/4/2016 Workforce Development/ Training Tulane University School of Public Health and Tropical Medicine New Orleans, LA Training 9 Interdisciplinary Practice - Suggest removing part C, as this is the only measure that requires a 10-year follow-up.
- Again, suggest using the phrase “since completing training program,” (consistent with Training 6) as that is the understood intent of the measure (rather than cross-sectional).
10-year follow-up has only been collected by MCHB around interdisciplinary practice historically. It is possible that the grantee is collecting additional 10-year data that is not being reported to MCHB. At this time, MCHB does not plan to expand 10-year follow-up data collection. No changes necessary
59 1/5/2016 CSHCN Genetic Alliance Washington, DC Core 2 Quality Improvement We commend the wording of the Tier 4 measure regarding related outcomes, as it allows for demonstration of success using a combination of data sources and metrics. No resolution needed No changes necessary
60 1/5/2016 CSHCN Genetic Alliance Washington, DC Core 3 Health Equity – MCH Outcomes Comments: Health equity is extremely important, but it looks different in different settings, especially as it relates to genetic services. Disparate access to genetic services may be caused by a lack of existing services and service providers in a given area, not by other socioeconomic or socially determined barriers preventing access. Goals for health equity should account for other factors such as availability of services.

In order to measure outcomes related to health equity objectives, we need sufficient benchmark data on the target population. Currently, there is no population-based data on individuals with or at risk for genetic conditions, and this prevents grantees from understanding access issues, especially as they relate
to health equity and needed services.


Additionally, it’s important that this measure for improving health equity take into consideration the fact that deciding to participate in health services, such as genetic testing, is driven in part by values and cultural considerations. Therefore, measuring uptake of testing would not be a good indicator for success towards health equity; instead, establishing objectives around education and activities to reduce barriers (cost, not knowing how/where to get tested) for testing would be more indicative of success
towards this goal.
Grantee-specific for now. No changes necessary
61 1/5/2016 CSHCN Genetic Alliance Washington, DC Core 2 Quality Improvement Table 1: Activity Data Collection Form for Selected Measures


Comments: We would like to see clearer definitions and more clarity for the segments outlined in the chart. For example, what is included in the bucket of “State or National Agencies”? Does this mean all statewide or national organizations, or is it meant to specify state or federal agencies, such as the Agency for Healthcare Research and Quality? It would be helpful to have more clarity on this and on what is included in “Community Partners.”

Consider including easily accessible definitions for each of these types of activities in the form to improve consistency across grantees. Overall, Table 1 is a good way to get a national snapshot of what is being done but it will not tell the full story of impact or benefit of programs, especially for programs that do not provide direct services.
Definition will be added.

Definition added
62 1/5/2016 CSHCN Genetic Alliance Washington, DC CSHCN 1 Family Engagement The way the Tier 1 measure is currently worded makes it unclear as to whether it is meant
to capture engagement of family members or engagement of children and youth with special health care needs.
Consider defining “meaningful participation” for Tier 4, measure 1.

The percentages calculated in Tier 4 will serve as helpful benchmark measures to determine any changes
in the future. However, without a sense from MCH leadership as to how many family and CSHCN leaders there should be (or what percentage of the CSHCN population), the current numerators and denominators do not indicate achievement or need for improvement, only benchmark measures.

It’s important to point out that the target of many of our programs is individuals with genetic conditions (and their families), who make up only a portion of the estimated population of children with special health care needs (CSHCN). It is difficult to estimate and inaccurate for us to report our success towards any of the CSHCN measures because our programs focus on genetic services and individuals with genetic conditions, not the overall population of CSHCN.
-changed Tier 1 to “Are you promoting and/or facilitating family engagement for children and youth with special health care needs in your program?” -The intention of these measures Change/ addition to wording
63 1/5/2016 CSHCN Genetic Alliance Washington, DC CSHCN 2 Access to and Use of Medical Home Many of our programs focus on population health, and determining individual-level related outcomes as in Tier 4 might not be possible or realistic. Clear expectations for related outcomes should be specified at the outset of the program so that grantees that do not deliver direct services are not asked to quantify outcomes on an individual level. Clear expectations will be provided by the Project Officer to the grantees for related outcomes so that grantees that do not deliver direct services are not required to quantify outcomes on an individual level. No changes necessary
64 1/5/2016 Workforce Development/ Training Leadership Education in Neurodevelopmental and related Disabilities (LEND) Program Cincinnati, OH CB 2 Technical Assistance CB 2: There are 2 rows for Depression Screening/Screening for Major Depressive Disorder. There are no content areas related specifically to developmental disabilities. Clarify-- Major depressive is AH; Depression screening is WMH Definition added
65 1/5/2016 Workforce Development/ Training Leadership Education in Neurodevelopmental and related Disabilities (LEND) Program Cincinnati, OH CB 4 Sustainability CB 4: Tier 3 is confusing. How would this be measured? NA-- There is no Tier 3 Measure for this.
Change wording to note that there is no Tier 3 measure (as apparently NA isn't clear).
Change/ addition to wording
66 1/5/2016 Workforce Development/ Training Leadership Education in Neurodevelopmental and related Disabilities (LEND) Program Cincinnati, OH CB 5,CB 6,General Forms/ ADEs Scientific Publications, Products CB 5/6: This seems to overlap with data that is captured in other areas regarding products/activities. Information that is consistent across measures and forms will auto-populate wherever possible. No changes necessary
67 1/5/2016 Workforce Development/ Training Leadership Education in Neurodevelopmental and related Disabilities (LEND) Program Cincinnati, OH Core 1 Grant Impact CORE 1: Yes/No format will make it challenging for programs to demonstrate when they have only met parts of an objective. Will there be an open-ended dialogue box for comments or discussions regarding changes to objectives throughout the 5-year cycle? No change needed. No changes necessary
68 1/5/2016 Workforce Development/ Training Leadership Education in Neurodevelopmental and related Disabilities (LEND) Program Cincinnati, OH Core 2 Quality Improvement Tier 2 – Can multiple aims be checked? It would be helpful to also have more detail regarding each of these options (such as examples)
Tier 3 – Will programs be sharing what type of training they
received/ who provided the training so information can be shared with other programs interested in QI?
In Tier 2, yes, multiple can be checked. With regard to the second question, while valuable, that will not be done through this system.

No changes necessary
69 1/5/2016 Workforce Development/ Training Leadership Education in Neurodevelopmental and related Disabilities (LEND) Program Cincinnati, OH CSHCN 2,CSHCN 3 Access to and Use of Medical Home, Transition to Adult Health Care CSHCN 2/3: Tier 3: What is the difference between # trained and # educated/receiving information? Also, there is likely to be duplication within those reached (same person could attend multiple trainings and be counted multiple times), thus skewing the data. Counts may include duplicates. Interested in individuals that participate in specific activity; not necessarily the unduplicated number of individuals reached for the entire project
Defer to MCHB project officer guidance for #trained and #educated
No changes necessary
70 1/5/2016 Workforce Development/ Training Leadership Education in Neurodevelopmental and related Disabilities (LEND) Program Cincinnati, OH Training 6 Demonstrate Field Leadership Training 6: The omission of the 10-year follow-up is a concern. While it has historically been difficult to track individuals 10 years after completion of the program, this data seems crucial. 5 years does not seem like enough time for leaders to fully grow and develop using the outcomes that we are measuring them on. 10-year follow-up has only been collected by MCHB around interdisciplinary practice historically. It is possible that the grantee is collecting additional 10-year data that is not being reported to MCHB. At this time, MCHB does not plan to expand 10-year follow-up data collection. No changes necessary
71 1/5/2016 Workforce Development/ Training Leadership Education in Neurodevelopmental and related Disabilities (LEND) Program Cincinnati, OH Training 8 Title V Collaboration Training 8: Thank you for adding the examples of other MCH-funded and related programs – that is very helpful! No resolution needed No changes necessary
72 1/5/2016 Workforce Development/ Training Leadership Education in Neurodevelopmental and related Disabilities (LEND) Program Cincinnati, OH Training 9 Interdisciplinary Practice Training 9: We like the list of interdisciplinary skills. Is there a reason why 10-year follow-up is included in this PM and not PM Training #6? 10-year follow-up has only been collected by MCHB around interdisciplinary practice historically. It is possible that the grantee is collecting additional 10-year data that is not being reported to MCHB. At this time, MCHB does not plan to expand 10-year follow-up data collection. No changes necessary
73 1/5/2016 Workforce Development/ Training Leadership Education in Neurodevelopmental and related Disabilities (LEND) Program Cincinnati, OH Training 12 Work with MCH Populations Training 12: Again, is there a reason why 10-year follow-up is not included here? 10-year follow-up has only been collected by MCHB around interdisciplinary practice historically. It is possible that the grantee is collecting additional 10-year data that is not being reported to MCHB. At this time, MCHB does not plan to expand 10-year follow-up data collection. No changes necessary
74 1/5/2016 CSHCN National Coordinating Center for the Regional Genetic Service Collaboratives Bethesda, MD CB 1 State capacity for advancing the health of MCH populations In Tier 3, the list of State agencies should separate Newborn Screening (NBS) from Genetics as each is an important partner to HRSA/MCHB. Because HRSA has begun to emphasize genetics across the lifespan, it is critical to create distinct categories for NBS and Genetics. Through this ontology, we believe that HRSA will have additional insights about the extent to which genetics is being addressed by its grantees. Yes, these are now split into separate categories.


Change/ addition to wording
75 1/5/2016 CSHCN National Coordinating Center for the Regional Genetic Service Collaboratives Bethesda, MD CB 2 Technical Assistance Genetics is missing from the list of MCH priorities. Genetics should be added to Tier 3 and to the Data Collection Form for CB2.

As currently constructed the Data Collection Form for CB2 has the columns of Community/Local Partners separate from State or National Partners. We recommend that HRSA distinguish between governmental and non-governmental partnerships.

The definition of Technical Assistance is well done. We applaud HRSA for recognizing that this is a collaborative activity that can be done on a regional basis.
Yes.

Change/ addition to wording
76 1/5/2016 CSHCN National Coordinating Center for the Regional Genetic Service Collaboratives Bethesda, MD CB 3 Impact Measurement This is an opportunity to ask grantees about the State and national data sources that they are using to assess their activities and impact. It could give HRSA data on the use of the National Survey of Children’s Health, birth defects registries, etc. This data would help support the importance of taxpayers’ investment in these State and national data resources. While this would certainly be valuable, it will not be added here, as it would add additional burden. No changes necessary
77 1/5/2016 CSHCN National Coordinating Center for the Regional Genetic Service Collaboratives Bethesda, MD CB 6 Products Tier 3 should also include some measure of use of these products. The NCC/RC system uses number of unique visits and home page visits to measure the use of its Internet resources. Impact factors of publications might be another metric to consider. We are assessing how it is disseminated, not how many people are reached by it.

No changes necessary
78 1/5/2016 CSHCN National Coordinating Center for the Regional Genetic Service Collaboratives Bethesda, MD Core 2 Quality Improvement We applaud HRSA for recognizing cross-sectorial collaboration across multiple organizations in Tier 2. We suggest that an additional aim of this type of collaboration might be improved coordination across MCHB-funded programs. This should be captured in the third bullet-- orgs should be all orgs. No changes necessary
79 1/5/2016 CSHCN National Coordinating Center for the Regional Genetic Service Collaboratives Bethesda, MD CSHCN 1 Family Engagement Add Tracking and Monitoring to Tier 2 to emphasize the importance of data collection around family engagement.

Add regional to the geographic units included in Tier 4. This addition would recognize that some activities can be more efficiently achieved on a regional basis.

While desirable to have racial and ethnic data on family CSHCN leaders, how feasible is it to obtain this information? Perhaps collecting data to show that affected individuals and families are engaged as CSHCN leaders would be easier to report.
We will add tracking to Tier 2.
We will add regional to the geographic units included in Tier 4.
We will still collect racial and ethnic data on family CSHCN leaders.
Change/ addition to wording,Definition added
80 1/5/2016 CSHCN National Coordinating Center for the Regional Genetic Service Collaboratives Bethesda, MD CSHCN 2 Access to and Use of Medical Home Table 1 is to be used to report activities. It would be helpful to clarify where local public health activities should be counted. As currently constructed, this table has the columns of Community Partners separate from State and National. We recommend that HRSA distinguish between governmental and non-governmental partnerships.

Tier 4 could be enhanced by including other performance measures, e.g., promoting a framework for medical home, increasing the number of medical homes, or improving care coordination with specialists.
Table 1 is utilized by other domains, and it is not clear how, or for what purpose, governmental and non-governmental partnerships would be distinguished.
Data for promotion of a framework for medical home can be captured in Tier 1 of this measure
Definition added
81 1/5/2016 CSHCN National Coordinating Center for the Regional Genetic Service Collaboratives Bethesda, MD LC 1 Adequate Health Insurance Coverage Add Tracking and Monitoring to Tier 2 to emphasize the importance of data collection around health insurance coverage. Similarly, add Tracking and Monitoring to the LC1 Data Collection form.

In Tier 4, it would be helpful to provide a definition for adequate health insurance coverage.
Add definition "A participant is considered to have insurance if they have any kind of health insurance that covers medical care, including prepaid plans such as HMOs, government plans such as Medicaid, and private coverage including coverage purchased through the Health Care Marketplace." Definition added
82 1/5/2016 CSHCN National Coordinating Center for the Regional Genetic Service Collaboratives Bethesda, MD All/ Most Domain Measures
Add Tracking and Monitoring as a new row. Data collection and analysis is sufficiently distinct from quality improvement to warrant its own row.
>>>> Tracking and surveillance
>>>> Needs a definition-- Check CDC

As indicated in our comments on CSHCN 1, we recommend that HRSA distinguish between governmental and non-governmental partnerships in the column headings.
>>>>> The differentiation between these is not meaningful
>>>>> Need to be cautious not to add burden for information that will not be used meaningfully.
Add Tracking and Monitoring as its own row in Tier 2. Change/ addition to wording
83 1/5/2016 EMSC Emergency Medical Services for Children State Partnership Program - New England Region Burlington, VT EMSC 01 NEMSIS Submission • The proposed performance measure is not pediatric-specific. We acknowledge that current, accurate data is essential to understand patient-care trends and opportunities for improvement, however in general state EMSC programs are unable to influence the data-collecting responsibilities of state EMS offices for 9-1-1 activations.
• Funding or state data-privacy issues will affect a state’s ability to meet or move towards this measure. This measure description cites 9-1-1 EMS activations but uses the numbers of EMS agencies within the state as numerator/denominator. Rural or
volunteer agencies may experience greater resource limitations preventing them from submitting data to state EMS offices. In states with high numbers of rural/volunteer agencies, it might appear that a state is not making an effort towards the established
benchmarks. A more accurate reflection might be gained by measuring the percentage of 9-1-1 response data (numerator) in relationship to the total estimated 9-1-1 statewide volume (denominator).
• The EMS COMPASS Project includes proposed performance measures for agencies, including data collection and use. We recommend COMPASS, funded by NHTSA, as a more appropriate opportunity to measure this information.
EMSC has responded to all comments on Measure EMSC 01 with revisions and the following FAQs:
NEMSIS-SPECIFIC Performance Measure:

Q: Aren’t these new EMS for Children performance measures unfunded mandates?


A: No, these measures were developed so the national EMSC program can obtain baseline data on how the EMS system is operating on a national level and are intended to assist states and territories in showing improvement in these areas over the lifespan of the measures. The EMSC program recognizes that not every state or territory may meet these measures.

Q: Why can’t the measure be “NEMSIS v3 data is being collected on 90% of the call volume in the state” rather than from 90% of agencies?
A: The program did consider using call volume versus percentage of agencies. One of the reasons that the program decided on using the percentage of agencies was that many states and territories may not know the total call volume but could reasonably know the number of agencies in their state. The program is concerned that if this measure captured the percent of call volume rather than the percent of agencies, the program would not understand what was happening at small agencies. For example, by using call volume, the measure could be considered biased toward rural states with a few urban areas that potentially have 90% of the call volume. This data is important for developing program planning to address the needs of all EMS agencies.
Q: My state may never convert to NEMSIS v3, so can the measure be rewritten to eliminate the version 3 part of the measure?
A: Unfortunately, no. The program is committed to assisting our federal partner, the National Highway Traffic Safety Administration (NHTSA), in its efforts to create a national databank of EMS data. In addition, beginning January 1, 2017, the NEMSIS Technical Assistance Center (TAC) will no longer accept NEMSIS v2 data from states and territories. As a result, the program wants to be current with national standards.
Q: My state does not license EMS agencies; does this mean that I don’t have to report data on this measure?
A: Even if your state or territory does not license EMS agencies, you still have to report on this measure. The intent of this measure is to determine how many agencies in the country are submitting NEMSIS v3 data, whether those agencies are licensed at a state, local, or some other level. If the EMS agencies in your state submit NEMSIS v3 data in 2017 (the expected first round of data collection on these new EMS for Children measures) then you should report the number of EMS agencies that submit NEMSIS v3 data to the State EMS office. Your state EMS Data Manager should be able to assist you with these numbers.
Q: The NEMSIS measure does not go far enough to improve the quality of the EMS data that is being submitted. Can the measure be revised to include a list of pediatric data elements, data validation, and scoring tools?
A: This is a great idea and is something that will be considered for development in the future. For the next five years, the EMS for Children program is interested in knowing how many agencies submit v3 data in order to have a baseline of EMS data collection numbers.
Q: This measure is not pediatric-specific and is out of scope of the EMS for Children program. Can it be eliminated as a measure?
A: No. The EMS for Children Program has participated in the development of a national EMS data system since its inception, and believes this is an important area for performance measurement and improvement. Past Funding Opportunity Announcements (FOAs) for the State Partnership Grants have allowed grantees to use grant funds to support the EMS data infrastructure in their states, so this measure is in line with past EMS for Children efforts and within the scope of the program. In addition, since NEMSIS collects data on patients of all ages and contains pediatric-specific variables, pediatric patients are included.
Change/ addition to wording,Definition added
84 1/5/2016 EMSC Emergency Medical Services for Children State Partnership Program - New England Region Burlington, VT EMSC 02 Pediatric Emergency Care Coordination The mission of the EMSC Program is to ensure that all children receive the best and most appropriate care in emergencies. As such we applaud the concept of a ‘pediatric coordinator’ for EMS agencies. We work closely with EMS agencies in our states, and we find that within each state our EMS systems vary greatly based on provider certification levels, scope of practice, local/state medical direction, pre-hospital protocols, population densities, regional geographic and economic conditions, as well as other factors.

• Agencies require flexibility in meeting or working towards this performance measure. Staffing, union requirements, funding, or availability of pediatric expertise are only some of the factors that will enter into an agency’s ability to assure the availability of a pediatric coordinator.
• We recommend that flexible options are provided to assist all agencies in achieving this goal. Such options could include sharing of a pediatric coordinator among several agencies or on a regional basis, especially for rural/volunteer agencies.
• The wording in the survey question description is vague and may be interpreted by the reader/responder to mean that most or all of the possible coordinator activities must exist in order to achieve a ‘yes’ response (i.e. that the agency has a pediatric
coordinator).

o We recommend that the preliminary wording be changed to:
“…by DESIGNATING AN INDIVIDUAL who is responsible for ONE or MORE of the following activities:”
o We recommend that the following be added to the list of provided possible activities:
--- “Promote the adoption of family-centered care policies”
--- “Promote agency participation in pediatric injury prevention programs/ collaborations”
--- “Promote awareness of pediatric-specific clinical guidelines/ protocols”
• The proposed survey question also includes language regarding the role of the pediatric coordinator in the development of EMS protocols. In many states this would never be the role of an agency-level coordinator, as it is handled at a higher level. The ‘pediatric representative’ that is embedded within the state’s EMS medical advisory board (reference existing EMSC performance measure 79) and/or the EMSC advisory
committee holds the responsibility for input on pediatric protocol development. We recommend that the referenced language be removed.
EMSC has responded to all comments received regarding this measure with revisions and the following FAQ:
Q: Does the PECC need to be on staff at the EMS agency?
A: No. Ideally, the Pediatric Emergency Care Coordinator (PECC) should be a member of the EMS agency and be familiar with the specific day-to-day operations and needs of the agency. Some states/territories utilize county or regional models of emergency care; if there is a designated individual who coordinates pediatric activities for a county or region, that individual could serve as the PECC for one or more individual EMS agencies within the county or region.
Q: Will there be a toolkit available for EMS agencies that provides a job description for a PECC?
A: Yes, the EMS for Children Program resource centers will develop toolkits, fact sheets, and webinars to assist State Partnership Grantees in the implementation of the new performance measures.
Q: Can you add the word ‘injury’ to the PECC role so that it reads ‘promote agency participation in pediatric injury prevention program’?
A: No. As written, the specific function does not exclude injury but rather encompasses all types of prevention programs. Injury prevention is just one type of prevention activity that a PECC could engage in; other prevention programs can include asthma or other childhood illnesses, so the EMS for Children program wants to keep the role more broadly defined. In addition, the EMS for Children program wanted to be consistent with what is recommended in the IOM report, “Emergency Care for Children: Growing Pains” (2006).
Change/ addition to wording,Definition added
85 1/5/2016 EMSC Emergency Medical Services for Children State Partnership Program - New England Region Burlington, VT EMSC 03 Use of Pediatric-Specific Equipment Statement: As written, PM 03 lends itself to wide interpretation by the respondent. In order to collect meaningful data, it is important to clarify the performance measure and offer well considered guidance to EMS agencies. We find that there are 3 areas needing clarification: type of equipment, use of standardized courses, and use of the National Registry’s Continued Competency Program for provider recertification.
Comments:

• We recommend that a list of example pediatric-specific equipment/supplies be developed and provided to survey respondents to illustrate the type/scope of pediatric items that may be considered when answering the proposed survey questions. Equipment/supplies vary based on certification levels, protocols, and local and state medical direction, among other factors. A focus on the ABCs of pediatric EMS is recommended.
Examples of equipment/supplies:

BLS: Oro- and naso-pharyngeal airways

Suction: tips, catheters, bulb suction

BVM, selection mask/bag sizes

Supraglottic airway: size/insertion/confirmation of placement

Weight/length-based tape use

AED

Child safety restraints (safety seats, safety harness, etc.)

ALS: IV/IO insertion

ET tubes: size/insertion/confirmation

Needle decompression

Manual defibrillator, including synchronized cardioversion and transcutaneous pacing
• It is important that survey questions include specific reference to Healthcare Provider CPR, PALS, PEPP, APLS, EPC, and NRP. All of these standardized programs require physical demonstration of certain pediatric-specific skills. They all require a recertification process every two years. We strongly recommend that use of these courses be acknowledged and included when responding to PM 03 survey questions.
Additionally, in some areas, local medical directors require specific ‘skills-checkoffs’ annually or bi-annually. We recommend referring to these possible activities as well.

Many states use or are moving to the National Registry of EMT’s Continued Competency Program for provider recertification. The National Registry specifically requires skill verification by the service training officer or designee. We recommend that the use of the CCP recertification program for pediatric skills verification be acknowledged and allowed when responding to survey questions.
• We are concerned that the current scoring method for the proposed rubric may be biased against smaller, voluntary EMS agencies. Based on information referenced in the performance measure about the relative infrequency of pediatric EMS response, we can assume that many rural/volunteer providers may not see any pediatric patients in a given year. This removes the service’s option to respond to the referenced item in the rubric. If they are presently assessing skill review annually and can also include the 2-year reviews that come with the standardized courses previously mentioned, they can achieve a score of ‘6’, and the state data will reflect that these agencies are indeed working to maintain pediatric skill levels. We therefore recommend that the rubric goal of “8” be changed to “6” to prevent bias against the rural/volunteer agencies.
EMSC has responded to all comments on Measure EMSC 03 with revisions and the following FAQs:
Q: To achieve the skill-checking measure a state would have to reach an ‘8’ on the scale. This score is too high and unrealistic. Can you consider lowering the score for achievement?

A: Yes. After reviewing the measuring scale and supporting evidence, the program has decided that a state/territory would achieve this measure by scoring a ‘6’ or higher on the scale.

Q: The skill-checking measure is too broad. Can it be revised to include the specific pieces of equipment that the program is interested in?

A: No. At one point in the development process, the measure did include specific pieces of life-saving equipment, but as we field-tested the data collection instrument, we learned that there was variability among agencies as to which pieces of equipment were considered out of scope or whether the medical director allowed agencies to use that equipment. As a result, the program determined that knowing whether an agency has a process to skill-check, rather than which specific pieces of equipment it tested on, would help the program understand how prepared agencies are to care for children. We assume that agencies that invest the resources to skill-check will select the pieces of equipment that are most crucial or very rarely used in the field to care for children.

Q: Our state certifies EMS instructors and only state certified EMS instructors can teach EMS providers, how does this figure into how EMS agencies will respond to the questions in the skill-checking measure?

A: EMS agencies in states which require certified EMS instructors will need to take state regulations into consideration when they respond to the survey questions which ask about the process their agency uses to skill-check.

Q: How do national courses such as PEPP and PALS and certifications like NREMT CCP fit into the skill-checking measure?

A: As long as it is an in-person course, an agency would respond in a way that reflects the methods used in the course or certification (e.g., simulation, skill station) and the frequency of the skill-checking activities, whether or not the process occurred through a national course.

Q: Does an agency have to use or adopt all three of the methods listed (skill-station, simulation, and field encounter) to meet the measure?

A: No, the measure can be achieved by using only two of the three described methods.

Q: Can HRSA consider using the state’s recertification/re-licensure period as the time period for when skills check should occur?

A: At this time, a couple of states recertify/re-license annually, which would make the performance measure target burdensome. In addition, a couple of states recertify/re-license EMS providers every five years, which would allow a very long period of time before a pediatric skills check would be required. The comment was considered, but this approach may not work well for states with shorter or longer recertification/re-licensure periods.

Change/addition to wording; Definition added
86 1/5/2016 EMSC New Hampshire EMSC Program Lebanon, NH EMSC 01 NEMSIS Submission The degree to which EMS agencies submit NEMSIS compliant version 3.x data to the state EMS office for submission to NEMSIS Technical Assistance Center (TAC).

1. Recommend that this performance measure be eliminated. The proposed performance measure is not a pediatric-specific performance measure and is being addressed by other groups. EMS Compass is working on developing overarching EMS performance measures that will be based on the latest version of the National EMS Information System (NEMSIS) and will allow local and state EMS systems to use their own data meaningfully.

EMSC has responded to all comments on Measure EMSC 01 with revisions and the following FAQs:
NEMSIS-SPECIFIC Performance Measure:

Q: Aren’t these new EMS for Children performance measures unfunded mandates?
A: No, these measures were developed so the national EMSC program can obtain baseline data on how the EMS system is operating on a national level and are intended to assist states and territories in showing improvement in these areas over the lifespan of the measures. The EMSC program recognizes that not every state or territory may meet these measures.

Q: Why can’t the measure be “NEMSIS v3 data is being collected on 90% of the call volume in the state” rather than from 90% of agencies?
A: The program did consider using call volume versus percentage of agencies. One of the reasons that the program decided on using the percentage of agencies was that many states and territories may not know the total call volume but could reasonably know the number of agencies in their state. The program is concerned that if this measure captured the percent of call volume rather than the percent of agencies, the program would not understand what was happening at small agencies. For example, by using call volume, the measure could be considered biased toward rural states with a few urban areas that potentially have 90% of the call volume. This data is important for developing program planning to address the needs of all EMS agencies.
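As a purely hypothetical illustration of the reasoning above, the short sketch below contrasts the two candidate denominators for the NEMSIS measure. Every number in it (200 agencies, 10 urban agencies, 100,000 calls) is invented for the example and is not drawn from the measure or from any state's data.

```python
# Hypothetical illustration of why the program chose "percent of agencies"
# rather than "percent of call volume" as the basis for the NEMSIS measure.
# All numbers below are invented for the example.

total_agencies = 200         # hypothetical number of EMS agencies in a state
reporting_agencies = 10      # hypothetical urban agencies submitting NEMSIS v3 data
total_calls = 100_000        # hypothetical statewide annual call volume
reporting_calls = 90_000     # hypothetical calls handled by those 10 agencies

percent_of_call_volume = 100 * reporting_calls / total_calls
percent_of_agencies = 100 * reporting_agencies / total_agencies
silent_agencies = total_agencies - reporting_agencies

print(f"Call-volume basis: {percent_of_call_volume:.0f}% of calls covered")
print(f"Agency-count basis: {percent_of_agencies:.0f}% of agencies reporting")
print(f"Agencies submitting no data: {silent_agencies}")
```

In this invented case, a 90% call-volume target would appear to be met even though 190 of 200 agencies submit nothing, which is exactly the small-agency blind spot the answer above describes.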
Q: My state may never convert to NEMSIS v3, so can the measure be rewritten to eliminate the version 3 part of the measure?
A: Unfortunately, no. The program is committed to assisting our federal partner, the National Highway Traffic Safety Administration (NHTSA), in its efforts to create a national databank of EMS data. In addition, beginning January 1, 2017, the NEMSIS Technical Assistance Center (TAC) will no longer accept NEMSIS v2 data from states and territories. As a result, the program wants to be current with national standards.
Q: My state does not license EMS agencies; does this mean that I don’t have to report data on this measure?
A: Even if your state or territory does not license EMS agencies, you still have to report on this measure. The intent of this measure is to determine how many agencies in the country are submitting NEMSIS v3 data, whether those agencies are licensed at a state, local, or some other level. If the EMS agencies in your state submit NEMSIS v3 data in 2017 (the expected first round of data collection on these new EMS for Children measures) then you should report the number of EMS agencies that submit NEMSIS v3 data to the State EMS office. Your state EMS Data Manager should be able to assist you with these numbers.
Q: The NEMSIS measure does not go far enough to improve the quality of the EMS data that is being submitted. Can the measure be revised to include a list of pediatric data elements, data validation, and scoring tools?
A: This is a great idea and is something that will be considered for development in the future. For the next five years, the EMS for Children program is interested in knowing how many agencies submit v3 data in order to have a baseline of EMS data collection numbers.
Q: This measure is not pediatric-specific and is out of scope of the EMS for Children program. Can it be eliminated as a measure?
A: No. The EMS for Children Program has participated in the development of a national EMS data system since its inception and believes this is an important area for performance measurement and improvement. Past Funding Opportunity Announcements (FOAs) for the State Partnership Grants have allowed grantees to use grant funds to support the EMS data infrastructure in their states, so this measure is in line with past EMS for Children efforts and within the scope of the program. In addition, since NEMSIS collects data on patients of all ages and does contain pediatric-specific variables, pediatric patients are included.
Change/addition to wording; Definition added
87 1/5/2016 EMSC New Hampshire EMSC Program Lebanon, NH EMSC 02 Pediatric Emergency Care Coordination The percentage of EMS agencies in the state/territory that have a designated individual who coordinates pediatric emergency care.

1. Recommend clarifying the definition of a “designated individual” and allowing innovative ideas for achieving this performance measure. Large EMS services may have the ability to designate a single individual to coordinate pediatric emergency care. This is similar to the model IOM recommendation that an individual in a hospital emergency department should be designated. However, EMS services are not mobile replicas of emergency departments, and it is important to take this reality into account. Rural areas of the country with very small EMS services have already adopted innovative models that utilize regional pediatric emergency care coordinators. Innovative models are needed for achieving this performance measure.

2. Recommend emphasizing that the follow-up questions to the initial questions are for informational purposes only. The follow-up questions ask about the specific roles of the coordinator. The only indication that the list of roles is not a requirement (in part or in total) for an affirmative answer to the first question is the use of the single word “could”. This is a very subtle way of indicating that the list is not a requirement and may be easily overlooked by the provider completing the survey. As a result, an EMS provider may answer “yes” to the general question for this performance measure but decide to change the answer to “no” after reading the exhaustive list of specific responsibilities. It is important to place more emphasis within the sentence on the fact that this list is simply a list of potential roles and responsibilities and that they are not required. This clarification will result in less confusion and better, more accurate data.

3. Specific suggestions for the list of potential roles and activities:
--- Add “Promote the adoption of family-centered care policies.” This was in the original IOM suggestions for the activities.
--- Add the word “injury” so that the activity reads “Promote agency participation in pediatric injury prevention programs.”
--- Add the word “protocols” so that the activity reads: “Ensure that fellow providers follow pediatric clinical practice guidelines/protocols”. Many states have protocols; therefore, adding the word will preclude confusion and ensure more accurate data.
--- The proposed survey includes a question regarding the development of EMS protocols. In many states, this role is not available to providers, as mandated protocols are developed at a state or regional level. We suggest that a third response option be provided to reflect this situation as a way to minimize confusion and gain better, more accurate data.
--- Add additional, explanatory information to the statement “Oversee pediatric process improvement” so that the survey respondent understands what this means and how it differs from “Ensure that fellow providers follow pediatric clinical practice guidelines/protocols”.
EMSC has responded to all comments received regarding this measure with revisions and the following FAQ:
Q: Does the PECC need to be on staff at the EMS agency?
A: No. Ideally, the Pediatric Emergency Care Coordinator (PECC) should be a member of the EMS agency and be familiar with the specific day-to-day operations and needs of the agency. Some states/territories utilize county or regional models of emergency care; if there is a designated individual who coordinates pediatric activities for a county or region, that individual could serve as the PECC for one or more individual EMS agencies within the county or region.
Q: Will there be a toolkit available for EMS agencies that provides a job description for a PECC?
A: Yes, the EMS for Children Program resource centers will develop toolkits, fact sheets, and webinars to assist State Partnership Grantees in the implementation of the new performance measures.
Q: Can you add the word ‘injury’ to the PECC role so that it reads ‘promote agency participation in pediatric injury prevention program’?
A: No. As written, the specific function does not exclude injury but rather encompasses all types of prevention programs. Injury prevention is just one type of prevention activity that a PECC could engage in; other prevention programs can include asthma or other childhood illnesses, so the EMS for Children program wants to keep the role more broadly defined. In addition, the EMS for Children program wanted to be consistent with what is recommended in the IOM report, “Emergency Care for Children: Growing Pains” (2006).
Change/addition to wording; Definition added
88 1/5/2016 EMSC New Hampshire EMSC Program Lebanon, NH EMSC 03 Use of Pediatric-Specific Equipment The percentage of EMS agencies in the state/territory that have a process that requires EMS providers to physically demonstrate the correct use of pediatric specific equipment.

It is strongly recommended that this performance measure be reassessed and significant revisions made in an effort to prevent general confusion and widely variable interpretation. Surveys that cause confusion are generally not completed or submitted, as responsible people do not want to provide inaccurate information. If the EMSC program coordinator does not know how to interpret the question, how can we expect the busy EMS provider to interpret it? As the performance measure is currently written, results will be inaccurate and/or meaningless.

1. Recommend developing a clear definition of the word “process”.

Below is a list of a few of the likely questions that will be voiced:
•Must the process include all pediatric equipment? If yes, what equipment would that include? What exceptions would have to be made for local scope of practice?
•If the EMS unit has a “process” to demonstrate the use of a pediatric length-based tape twice a year, but no other pediatric equipment is included in the “process”, would the EMS unit be able to claim a “process for pediatric-specific equipment” resulting in a score of 4 according to the rubric?
•If each provider is required to demonstrate (via a skill station) the use of a pediatric oral airway once a year and the use of a pediatric IO once a year, does that result in a score of 4 according to the rubric?
•If providers are required to complete a PALS course every other year and required to demonstrate pediatric skills via a skill station during the opposite year, does this constitute a “process” and result in a score of 2 according to the rubric?
•Can maintenance of pediatric CPR certification count as part of the “process”?
•If an EMS provider uses pediatric equipment in the field and the emergency department staff find no fault in its application, why do we need someone on our service ride along to verify pediatric skills? The ED physicians and nurses are more skilled and knowledgeable about pediatric care than our training officer or chief. We see children very infrequently.
•My providers are participating in the National Registry of EMT’s Continued Competency Program for recertification and it includes pediatric equipment skills. Can this be considered part of our “process”?

2. Recommend changing the numerator to read: “The number of licensed EMS agencies in the state/territory that score a 6 or more on a 0-12 scale”.

While this PM is in line with the national trend toward clinical competence with regard to the continuing education of EMS providers, the scoring method with the proposed rubric is biased against rural states with numerous EMS agencies with very small call volumes. The evaluative rubric assumes a provider will demonstrate pediatric skills using 3 methods (skill station, case scenario, and field encounter). The fact is that an EMS provider at a small-volume agency may never see a pediatric patient within a year (or two or three); therefore, the use of field encounters is not realistic or achievable. The Lamer paper states that a paramedic treats a teen on average once every 625 days, a child every 958 days, and an infant every 1087 days (a worked calculation follows at the end of this comment). Using this cited reference, how can this performance measure expect that an EMS provider on a small service will have an opportunity to demonstrate a pediatric equipment skill even once every two years in a field encounter? The proposed metric is biased against these small services. Adjusting the numerator to a score of ‘6’ or more on a 0-12 scale would allow the small services to be included in a realistic goal of strengthening the health workforce.



3. Recommend developing a list of pediatric equipment to illustrate the type and scope of pediatric equipment that may be considered when answering the proposed survey questions. The creation of this list will assist in defining the “process”.



The following list is an example of pediatric specific equipment that might be used in this performance measure. With the wide variety of protocols, and skills among EMS in the nation, the equipment competency should have a strong focus on the ABC’s.



Examples include:

BLS:
1. Oro- and Nasopharyngeal airways
2. Suctioning - tips and catheters and bulb suction
3. BVM - selection mask and bag sizes
4. Supraglottic airway
5. AED
6. IV and IO
7. Weight/Length-based Tape

ALS:
8. Endotracheal Tubes
9. Manual Defibrillator and synchronized cardioversion

4. Recommend clarifying the use of standardized courses (PEPP, PALS, APLS, and EPC) and the use of the National Registry of EMT’s Continued Competency Program (CCP) in the services’ “process”.



Since many services include standardized classes/courses in their continuing education requirements and these courses include the demonstration of pediatric equipment case scenarios, it is recommended that these programs be recognized and included in the definition of a “process”.



In addition, the number of states utilizing the National Registry of EMT’s Continued Competency Program (CCP) for recertification is increasing. As the National Registry specifically requires skill verification by the service training officer/supervisor, it is recommended that the use of the CCP recertification program to verify pediatric-specific equipment skills be acknowledged and allowed in the definition of “process”.



EMS services have already integrated various methods of pediatric skill verification. Do not penalize the services by eliminating standardized courses and the CCP recertification process.
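To make the arithmetic behind the encounter-frequency concern in this comment concrete, the following minimal sketch converts the per-provider intervals cited above from the Lamer paper into expected pediatric field encounters per two-year recertification cycle. The two-year cycle length is an assumption taken from the standardized-course recertification interval mentioned in the comment; the encounter intervals are the commenter's cited figures, not independently verified.

```python
# Hedged illustration only: expected pediatric field encounters per provider
# over an assumed two-year recertification cycle, using the average intervals
# cited from the Lamer paper in the comment above (teen every 625 days, child
# every 958 days, infant every 1087 days). These are averages, not a guarantee
# of any individual provider's experience.

RECERT_CYCLE_DAYS = 2 * 365  # assumed two-year recertification cycle

average_interval_days = {"teen": 625, "child": 958, "infant": 1087}

for age_group, interval in average_interval_days.items():
    expected = RECERT_CYCLE_DAYS / interval
    print(f"{age_group}: ~{expected:.2f} expected encounters per 2-year cycle")

# Approximate output:
#   teen: ~1.17 expected encounters per 2-year cycle
#   child: ~0.76 expected encounters per 2-year cycle
#   infant: ~0.67 expected encounters per 2-year cycle
```

On these averages, a provider at a low-volume agency would expect fewer than one child or infant encounter in a two-year cycle, which is the basis of the request to lower the rubric target and de-emphasize the field-encounter method.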

EMSC has responded to all comments on Measure EMSC 03 with revisions and the following FAQs:
Q: To achieve the skill-checking measure a state would have to reach an ‘8’ on the scale. This score is too high and unrealistic. Can you consider lowering the score for achievement?

A: Yes. After reviewing the measuring scale and supporting evidence, the program has decided that a state/territory would achieve this measure by scoring a ‘6’ or higher on the scale.

Q: The skill-checking measure is too broad. Can it be revised to include the specific pieces of equipment that the program is interested in?

A: No. At one point in the development process, the measure did include specific pieces of life-saving equipment, but as we field-tested the data collection instrument, we learned that there was variability among agencies as to which pieces of equipment were considered out of scope or whether the medical director allowed agencies to use that equipment. As a result, the program determined that knowing whether an agency has a process to skill-check, rather than which specific pieces of equipment it tested on, would help the program understand how prepared agencies are to care for children. We assume that agencies that invest the resources to skill-check will select the pieces of equipment that are most crucial or very rarely used in the field to care for children.

Q: Our state certifies EMS instructors and only state certified EMS instructors can teach EMS providers, how does this figure into how EMS agencies will respond to the questions in the skill-checking measure?

A: EMS agencies in states which require certified EMS instructors will need to take state regulations into consideration when they respond to the survey questions which ask about the process their agency uses to skill-check.

Q: How do national courses such as PEPP and PALS and certifications like NREMT CCP fit into the skill-checking measure?

A: As long as it is an in-person course, an agency would respond in a way that reflects the methods used in the course or certification (e.g., simulation, skill station) and the frequency of the skill-checking activities, whether or not the process occurred through a national course.

Q: Does an agency have to use or adopt all three of the methods listed (skill-station, simulation, and field encounter) to meet the measure?

A: No, the measure can be achieved by using only two of the three described methods.

Q: Can HRSA consider using the state’s recertification/re-licensure period as the time period for when skills check should occur?

A: At this time, a couple of states recertify/re-license annually, which would make the performance measure target burdensome. In addition, a couple of states recertify/re-license EMS providers every five years, which would allow a very long period of time before a pediatric skills check would be required. The comment was considered, but this approach may not work well for states with shorter or longer recertification/re-licensure periods.

Change/addition to wording; Definition added
89 1/5/2016 MIECHV/ Home Visiting Early Childhood Comprehensive Systems Project for New York State Rensselaer, NY CB 1 State capacity for advancing the health of MCH populations This is somewhat of a confusing section.

Comment is that this is confusing; no specific resolution needed. No changes necessary
90 1/5/2016 MIECHV/ Home Visiting Early Childhood Comprehensive Systems Project for New York State Rensselaer, NY CB 2 Technical Assistance Some of this could be covered by the Office of Children and Family Services and CACFP training requirements, but a challenge to collect.

Noted. This can be taken into consideration when we assign measures later this year. "Relates to ability to report, and should be taken into consideration when assigning measures further down the road."
91 1/5/2016 MIECHV/ Home Visiting Early Childhood Comprehensive Systems Project for New York State Rensselaer, NY CB 3 Impact Measurement Possible data sources would include:
•Early Care & Learning Council – Child Care Resource & Referral (CCR&Rs) statewide coordinating agency (CCR&Rs are required to do surveying)
•Infant Toddler Specialists
•Department of Education - Regional Early Childhood Protection Centers
•UPK and early UPK
•Head Start

This would be difficult to collect because there are different systems across the state.

Noted. This can be taken into consideration when we assign measures later this year. "Relates to ability to report, and should be taken into consideration when assigning measures further down the road."
92 1/5/2016 MIECHV/ Home Visiting Early Childhood Comprehensive Systems Project for New York State Rensselaer, NY CB 4 Sustainability New training developed for Child Care Health (expected date: February 2016). Training modules and updating of website, supported by ECCS funds, will be available and ongoing.

Noted. This can be taken into consideration when we assign measures later this year. "Relates to ability to report, and should be taken into consideration when assigning measures further down the road."
93 1/5/2016 MIECHV/ Home Visiting Early Childhood Comprehensive Systems Project for New York State Rensselaer, NY CB 5 Scientific Publications Not much here. This really isn’t a focus of HV and the ECCS grant.

Noted. This can be taken into consideration when we assign measures later this year. "Relates to ability to report, and should be taken into consideration when assigning measures further down the road."
94 1/5/2016 MIECHV/ Home Visiting Early Childhood Comprehensive Systems Project for New York State Rensselaer, NY CB 6 Products New training developed for Child Care Health (expected date: February 2016). Training modules and updating of website, supported by ECCS funds, will be available and ongoing. Blogs have also been written for Child Care Health Consultants. Many state agencies/entities, such as Docs for Tots, the Early Childhood Advisory Council, Early Care & Learning Council, NYS Zero-To-Three all do publications. This would not be difficult to track, but time-consuming.

Noted. This can be taken into consideration when we assign measures later this year. "Relates to ability to report, and should be taken into consideration when assigning measures further down the road."
95 1/5/2016 MIECHV/ Home Visiting Early Childhood Comprehensive Systems Project for New York State Rensselaer, NY CH 1 Well Child Visit HV programs promote the importance of well-child visits and will assist clients to find providers or make appointments if necessary. Tiers 2 and 3 are not currently collected. Both Tier 4 measures can be reported using MIECHV data, but will be based on parental self-report. MIECHV benchmark is different, measuring the children receiving the recommended number of well-child visits based on age.

Noted. This can be taken into consideration when we assign measures later this year. "Relates to ability to report, and should be taken into consideration when assigning measures further down the road."
96 1/5/2016 MIECHV/ Home Visiting Early Childhood Comprehensive Systems Project for New York State Rensselaer, NY CH 2 Quality of Well Child Visit This would be difficult to collect, although important.

Noted. This can be taken into consideration when we assign measures later this year. "Relates to ability to report, and should be taken into consideration when assigning measures further down the road."
97 1/5/2016 MIECHV/ Home Visiting Early Childhood Comprehensive Systems Project for New York State Rensselaer, NY CH 3 Developmental Screening Home visitors administer developmental screenings with clients and their children and will make referrals when necessary. Tiers 2 and 3 not currently collected. Tier 4 could be reported for MIECHV index children, but not up to 71 months because Nurse Family Partnership ends at 24 months and Healthy Families New York ends at 60 months of age. The measure is similar to the MIECHV benchmark, which looks at 1 screening by 9 months, 2 by 18 months and 3 by 30 months.

Additionally, the Child Care Resource & Referral agencies are now surveying providers who do developmental screening. The Early Intervention group might also have this information, as might other individual child care programs. The CCDBG will make this more prominent, and programs will have to do this.

Noted. This can be taken into consideration when we assign measures later this year. "Relates to ability to report, and should be taken into consideration when assigning measures further down the road."
98 1/5/2016 MIECHV/ Home Visiting Early Childhood Comprehensive Systems Project for New York State Rensselaer, NY CH 4 Injury Prevention HV programs provide information about child safety and injury prevention at multiple times during program involvement. Tiers 2 and 3 not collected, but could probably check which child safety domains HV programs address. The Child Development Associate Credential (CDA) also has modules on injury prevention so outreach data could be pulled from here. The first measure under Tier 4 is collected by HV programs but based on parental self-report. As mentioned above, the HV programs do not serve children to age 9. The second measure under Tier 4 is for children outside the age range served by MIECHV HV programs.

Additionally, child care programs do not focus specifically on this. Information could be collected from violations detected by the Office of Children and Family Services and from injury rates. Supervision requirements could also be looked at.

Lastly, a new training is being developed for Child Care Health Consultants that will address injury prevention (expected date: February 2016). Outreach data could be analyzed.

Noted. This can be taken into consideration when we assign measures later this year. "Relates to ability to report, and should be taken into consideration when assigning measures further down the road."
99 1/5/2016 MIECHV/ Home Visiting Early Childhood Comprehensive Systems Project for New York State Rensselaer, NY PIH 1 Safe Sleep Home visiting (HV) programs provide information about safe sleep and connect clients to resources to create a safe sleep environment. Additionally, licensed and regulated providers must receive training on safe sleep, as per requirements of the New York State Office of Children and Family Services (OCFS). To collect Tiers 2 and 3 would require a new data collection method—not currently collected. There is however a group of Infant Toddler Specialists in New York State that could pull some of this technical assistance data on safe sleep from their own data collection system, but it is limited. The first measure in Tier 4 is part of a composite measure that’s been proposed for the new MIECHV benchmarks so New York State would have that information for MIECHV-funded sites. The second measure in Tier 4 could be collected by reviewing past violations of a program.

Noted. This can be taken into consideration when we assign measures later this year. "Relates to ability to report, and should be taken into consideration when assigning measures further down the road."
100 1/5/2016 MIECHV/ Home Visiting Early Childhood Comprehensive Systems Project for New York State Rensselaer, NY PIH 2 Breast Feeding HV programs promote and facilitate breastfeeding through prenatal education and postpartum support. Tiers 2 and 3 are not currently collected from our MIECHV providers. Again, there is a group of Infant Toddler Specialists in New York State that could provide some data on technical assistance (re: breastfeeding) but it is limited. Both Tier 4 measures are collected by Nurse Family Partnership and Heathy Families New York. The second measure is the same as the proposed MIECHV benchmark regarding breastfeeding.

Breastfeeding data is also collected in all CACFP programs, and program violations are also a good indicator. There are also Referral Specialists of the Child Care Resource & Referral agencies in New York State that would have some data on parents asking for breastfeeding programs.

Lastly, a new training is being developed for Child Care Health Consultants that will address the importance of breastfeeding (expected date: February 2016) and therefore outreach data will be collected.

Noted. This can be taken into consideration when we assign measures later this year. "Relates to ability to report, and should be taken into consideration when assigning measures further down the road."
101 1/5/2016 MIECHV/ Home Visiting Early Childhood Comprehensive Systems Project for New York State Rensselaer, NY PIH 3 Newborn Screening Newborn screening program is not explicitly part of the HV curricula, so our MIECHV partners would not have this data, nor would our other available sources. This would be difficult to collect, although important for New York State.

Noted. This can be taken into consideration when we assign measures later this year. "Relates to ability to report, and should be taken into consideration when assigning measures further down the road."
102 1/5/2016 EMSC Rhode Island Center for Emergency Medical Services Providence, RI EMSC 01 NEMSIS Submission The degree to which EMS agencies submit NEMSIS compliant version 3.x data to the state EMS office for submission to NEMSIS Technical Assistance Center (TAC).

1. RI EMSC recommends the elimination of this performance measure. NEMSIS compliance is not within the roles or scope of an EMS for Children Program Manager. According to the EMSC National Resource Center, “The primary role of the SP manager is to coordinate and manage all aspects of the EMSC SP program to ensure that the emergency care needs of children are well integrated throughout the entire continuum of care, from illness and injury prevention to bystander care, dispatch, prehospital EMS, definitive hospital care, rehabilitation, and return to community.” The data collection oversight is limited to helping NEDARC with survey techniques and helping the NRC collect pediatric-specific data, as well as submitting data to HRSA.
2. In addition, EMS Compass is working on developing overarching EMS performance measures that will be based on the latest version of the National EMS Information System (NEMSIS) and will allow local and state EMS systems to use their own data meaningfully.
EMSC has responded to all comments on Measure EMSC 01 with revisions and the following FAQs:
NEMSIS-SPECIFIC Performance Measure:

Q: Aren’t these new EMS for Children performance measures unfunded mandates?
A: No, these measures were developed so the national EMSC program can obtain baseline data on how the EMS system is operating on a national level and are intended to assist states and territories in showing improvement in these areas over the lifespan of the measures. The EMSC program recognizes that not every state or territory may meet these measures.

Q: Why can’t the measure be “NEMSIS v3 data is being collected on 90% of the call volume in the state” rather than from 90% of agencies?
A: The program did consider using call volume versus percentage of agencies. One of the reasons that the program decided on using the percentage of agencies was that many states and territories may not know the total call volume but could reasonably know the number of agencies in their state. The program is concerned that if this measure captured the percent of call volume rather than the percent of agencies, the program would not understand what was happening at small agencies. For example, by using call volume, the measure could be considered biased toward rural states with a few urban areas that potentially have 90% of the call volume. This data is important for developing program planning to address the needs of all EMS agencies.
Q: My state may never convert to NEMSIS v3, so can the measure be rewritten to eliminate the version 3 part of the measure?
A: Unfortunately, no. The program is committed to assisting our federal partner, the National Highway Traffic Safety Administration (NHTSA), in its efforts to create a national databank of EMS data. In addition, beginning January 1, 2017, the NEMSIS Technical Assistance Center (TAC) will no longer accept NEMSIS v2 data from states and territories. As a result, the program wants to be current with national standards.
Q: My state does not license EMS agencies; does this mean that I don’t have to report data on this measure?
A: Even if your state or territory does not license EMS agencies, you still have to report on this measure. The intent of this measure is to determine how many agencies in the country are submitting NEMSIS v3 data, whether those agencies are licensed at a state, local, or some other level. If the EMS agencies in your state submit NEMSIS v3 data in 2017 (the expected first round of data collection on these new EMS for Children measures) then you should report the number of EMS agencies that submit NEMSIS v3 data to the State EMS office. Your state EMS Data Manager should be able to assist you with these numbers.
Q: The NEMSIS measure does not go far enough to improve the quality of the EMS data that is being submitted. Can the measure be revised to include a list of pediatric data elements, data validation, and scoring tools?
A: This is a great idea and is something that will be considered for development in the future. For the next five years, the EMS for Children program is interested in knowing how many agencies submit v3 data in order to have a baseline of EMS data collection numbers.
Q: This measure is not pediatric-specific and is out of scope of the EMS for Children program. Can it be eliminated as a measure?
A: No. The EMS for Children Program has participated in the development of a national EMS data system since its inception and believes this is an important area for performance measurement and improvement. Past Funding Opportunity Announcements (FOAs) for the State Partnership Grants have allowed grantees to use grant funds to support the EMS data infrastructure in their states, so this measure is in line with past EMS for Children efforts and within the scope of the program. In addition, since NEMSIS collects data on patients of all ages and does contain pediatric-specific variables, pediatric patients are included.
Change/addition to wording; Definition added
103 1/5/2016 EMSC Rhode Island Center for Emergency Medical Services Providence, RI EMSC 02 Pediatric Emergency Care Coordination The percentage of EMS agencies in the state/territory that have a designated individual who coordinates pediatric emergency care.
1. RI EMSC recommends clarification on “designated individual”.
2. RI EMSC recommends the development of a guide that explains the details of the pediatric emergency care coordinator that includes: Researched best practices, scope of work, opportunities for funding, certification requirements and education.
3. In RI we have a large volunteer rural EMS; therefore, we recommend innovative models for achieving this performance measure.
4. We recommend emphasizing that the follow-up questions to the initial questions are for informational purposes only, because these may change the answer from yes to no based on the requirement. For example, an EMS provider may answer “yes” to the general question for this performance measure but decide to change the answer to “no” after reading the exhaustive list of specific responsibilities.
5. We recommend revising or eliminating the question regarding the development of EMS protocols. In RI we utilize State protocols, and this question might either confuse the role into creating its own pediatric protocols and overriding the state mandate, or simply change the “yes” answer to question one back to “no”.
6. Add additional, explanatory information to the statement “Oversee pediatric process improvement” so that the survey respondent understands what this means and how it differs from “Ensure that fellow providers follow pediatric clinical practice guidelines/protocols”.
EMSC has responded to all comments received regarding this measure with revisions and the following FAQ:
Q: Does the PECC need to be on staff at the EMS agency?
A: No. Ideally, the Pediatric Emergency Care Coordinator (PECC) should be a member of the EMS agency and be familiar with the specific day-to-day operations and needs of the agency. Some states/territories utilize county or regional models of emergency care; if there is a designated individual who coordinates pediatric activities for a county or region, that individual could serve as the PECC for one or more individual EMS agencies within the county or region.
Q: Will there be a toolkit available for EMS agencies that provides a job description for a PECC?
A: Yes, the EMS for Children Program resource centers will develop toolkits, fact sheets, and webinars to assist State Partnership Grantees in the implementation of the new performance measures.
Q: Can you add the word ‘injury’ to the PECC role so that it reads ‘promote agency participation in pediatric injury prevention program’?
A: No. As written, the specific function does not exclude injury but rather encompasses all types of prevention programs. Injury prevention is just one type of prevention activity that a PECC could engage in; other prevention programs can include asthma or other childhood illnesses, so the EMS for Children program wants to keep the role more broadly defined. In addition, the EMS for Children program wanted to be consistent with what is recommended in the IOM report, “Emergency Care for Children: Growing Pains” (2006).
Change/addition to wording; Definition added
104 1/5/2016 EMSC Rhode Island Center for Emergency Medical Services Providence, RI EMSC 03 Use of Pediatric-Specific Equipment The percentage of EMS agencies in the state/territory that have a process that requires EMS providers to physically demonstrate the correct use of pediatric specific equipment.

This performance measure is perhaps the most concerning for RI EMSC. Therefore we strongly recommend that this performance measure be reassessed and significant revisions made in an effort to prevent general confusion and widely variable interpretation, especially the survey. Regardless of definitions, surveys are tests, and no one wants to fail the test. Therefore, as the performance measure is currently written, results will be inaccurate and/or meaningless.

--- 1. We recommend developing a clear definition of the word “process”.
--- 2. Recommend changing the numerator to read: “The number of licensed EMS agencies in the state/territory that score a 6 or more on a 0-12 scale”.
We concur with other New England states that, while this PM is in line with the national trend toward clinical competence with regard to the continuing education of EMS providers, the scoring method with the proposed rubric is biased against rural states with numerous EMS agencies with very small call volumes. The evaluative rubric assumes an agency will provide 3 methods of pediatric training and that 90% of these agencies will have a “process” for it? Adjusting the numerator to a score of ‘6’ or more on a 0-12 scale would allow the small services to be included in a realistic goal of strengthening the health workforce.
--- 3. Recommend developing a list of pediatric equipment to illustrate the type and scope of pediatric equipment that may be considered when answering the proposed survey questions. The creation of this list will assist in defining the “process”.
--- 4. Recommend clarifying and justifying the limitation of the use of standardized courses (PEPP, PALS, APLS, and ENPC) and the use of the National Registry of EMT’s Continued Competency Program (CCP) in the services’ “process”.
Same as comment 88.

Addressed in other similar or identical comment.
105 1/5/2016 DCAFH School-Based Health Alliance Washington, DC CB 1 State capacity for advancing the health of MCH populations Tier 3
- # of professionals trained on program priority topic
We recommend that MCHB also inquire about the # of professionals who received follow-up training, support, or consultation/coaching, given that one-time trainings have been shown to be necessary but insufficient for implementation success, particularly in health care professions with high turnover rates (such as behavioral health)
- # of state agencies and departments participating on priority area. This includes the following key state agencies.
We recommend that MCHB include school-based health services on this list

Tier 4
The School-Based Health Alliance conducts an annual policy survey with state health departments, which does not currently include these measures; however, it can be adapted to include these measures.

The Center for School Mental Health is currently leading a National School Mental Health Census to identify the states and districts providing comprehensive school mental health services (CSMHS); it could be adapted to include some of these measures in a future version once all CSMHSs are counted in the initial Census.
Related to first comment: This would create unneeded additional work for grantees, so, though this is important, this change is not essential.
Related to this comment: "We recommend that MCHB include school-based health services on this list"-- This would be captured under 'education'.
"Relates to ability to report, and should be taken into consideration when assigning measures further down the road."
106 1/5/2016 DCAFH School-Based Health Alliance Washington, DC CB 2 Technical Assistance The School-Based Health Alliance could adapt our online TA reporting system to measure these specific areas: child
well visit TA, adolescent well visit TA, major depressive order TA, and oral health TA.

The Center for School Mental Health could develop a TA reporting system to systematically capture TA currently provided in the areas of: depression screening TA, family engagement TA, adequate health insurance coverage TA, data research and evaluation TA, and other TA.

Completing the data form for CB2 would require drastic changes in both the School-Based Health Alliance's and the Center for School Mental Health's data collection methodologies. It would be extremely difficult to track the number of individuals receiving TA by type of audience reached.
Noted. This can be taken into consideration when we assign measures later this year. "Relates to ability to report, and should be taken into consideration when assigning measures further down the road."
107 1/5/2016 DCAFH School-Based Health Alliance Washington, DC CB 3 Impact Measurement This category does not really apply to our organizations. We use participant surveys related to the trainings and TA that we provide. We also collect qualitative assessments related to our projects, but this information is not related to grantees, rather to individual teams who work on projects with us. Noted. This can be taken into consideration when we assign measures later this year. "Relates to ability to report, and should be taken into consideration when assigning measures further down the road."
108 1/5/2016 DCAFH School-Based Health Alliance Washington, DC CB 4 Sustainability The School-Based Health Alliance and CSMH provide resources and materials on sustainability on their websites, webinars, and online training and TA to Title V-sponsored SBHCs and school mental health programs if requested. However, the recipients
are not our grantees; therefore, we do not collect follow-up data to the level of detail outlined in the performance measures.

Also, both the School-Based Health Alliance and the CSMH are leading a Collaborative Improvement and Innovation Network as a result of their MCHB-funded cooperative agreement, and are focusing efforts with half of the engaged sites specifically on sustainability at local and state levels. Therefore, both organizations could be actively involved in supporting the mechanisms listed in Tier 2.
Noted. This can be taken into consideration when we assign measures later this year. "Relates to ability to report, and should be taken into consideration when assigning measures further down the road."
109 1/5/2016 DCAFH School-Based Health Alliance Washington, DC CB 5 Scientific Publications Both organizations currently report on this measure, which is very applicable to capturing the performance of our work with MCHB, and we are enthusiastic about continuing to report this measure. Noted. This can be taken into consideration when we assign measures later this year. "Relates to ability to report, and should be taken into consideration when assigning measures further down the road."
110 1/5/2016 DCAFH School-Based Health Alliance Washington, DC CB 6 Products Both organizations currently report on this measure, which is very applicable to capturing the performance of our work with MCHB, and we are enthusiastic about continuing to report this measure.
Noted. This can be taken into consideration when we assign measures later this year. "Relates to ability to report, and should be taken into consideration when assigning measures further down the road."
111 1/5/2016 EMSC Vermont Emergency Medical Services for Children Program Burlington, VT EMSC 01 NEMSIS Submission The degree to which EMS agencies submit NEMSIS compliant version 3.x data to the state EMS office for submission to NEMSIS Technical Assistance Center (TAC).
Statement:

The proposed performance measure related to NEMSIS is not a pediatric-specific performance measure. EMS data is essential in understanding trends/opportunities for improvement in the prehospital setting; however, addressing statewide EMS data systems is the responsibility of the State EMS offices. As future performance measures are developed, please take into consideration the need to ensure clear applicability to pediatric-specific efforts.
EMSC has responded to all comments on Measure EMSC 01 with revisions and the following FAQs:
NEMSIS-SPECIFIC Performance Measure:

Q: Aren’t these new EMS for Children performance measures unfunded mandates?
A: No, these measures were developed so the national EMSC program can obtain baseline data on how the EMS system is operating on a national level and are intended to assist states and territories in showing improvement in these areas over the lifespan of the measures. The EMSC program recognizes that not every state or territory may meet these measures.

Q: Why can’t the measure be “NEMSIS v3 data is being collected on 90% of the call volume in the state” rather than from 90% of agencies?
A: The program did consider using call volume versus percentage of agencies. One of the reasons that the program decided on using the percentage of agencies was that many states and territories may not know the total call volume but could reasonably know the number of agencies in their state. The program is concerned that if this measure captured the percent of call volume rather than the percent of agencies, the program would not understand what was happening at small agencies. For example, by using call volume, the measure could be considered biased toward rural states with a few urban areas that potentially have 90% of the call volume. This data is important for developing program planning to address the needs of all EMS agencies.
Q: My state may never convert to NEMSIS v3, so can the measure be rewritten to eliminate the version 3 part of the measure?
A: Unfortunately, no. The program is committed to assisting our federal partner, the National Highway Traffic Safety Administration (NHTSA), in its efforts to create a national databank of EMS data. In addition, beginning January 1, 2017, the NEMSIS Technical Assistance Center (TAC) will no longer accept NEMSIS v2 data from states and territories. As a result, the program wants to be current with national standards.
Q: My state does not license EMS agencies; does this mean that I don’t have to report data on this measure?
A: Even if your state or territory does not license EMS agencies, you still have to report on this measure. The intent of this measure is to determine how many agencies in the country are submitting NEMSIS v3 data, whether those agencies are licensed at a state, local, or some other level. If the EMS agencies in your state submit NEMSIS v3 data in 2017 (the expected first round of data collection on these new EMS for Children measures) then you should report the number of EMS agencies that submit NEMSIS v3 data to the State EMS office. Your state EMS Data Manager should be able to assist you with these numbers.
Q: The NEMSIS measure does not go far enough to improve the quality of the EMS data that is being submitted. Can the measure be revised to include a list of pediatric data elements, data validation, and scoring tools?
A: This is a great idea and is something that will be considered for development in the future. For the next five years, the EMS for Children program is interested in knowing how many agencies submit v3 data in order to have a baseline of EMS data collection numbers.
Q: This measure is not pediatric-specific and is out of scope of the EMS for Children program. Can it be eliminated as a measure?
A: No. The EMS for Children Program has participated in the development of a national EMS data system since its inception and believes this is an important area for performance measurement and improvement. Past Funding Opportunity Announcements (FOAs) for the State Partnership Grants have allowed grantees to use grant funds to support the EMS data infrastructure in their states, so this measure is in line with past EMS for Children efforts and within the scope of the program. In addition, since NEMSIS collects data on patients of all ages and does contain pediatric-specific variables, pediatric patients are included.
Change/addition to wording; Definition added
112 1/5/2016 EMSC Vermont Emergency Medical Services for Children Program Burlington, VT EMSC 02 Pediatric Emergency Care Coordination The percentage of EMS agencies in the state/territory that have a designated individual who coordinates pediatric emergency care.

Recommendations:
1. Recommend clarifying the definition of a “designated individual” and allowing innovative ideas for achieving this performance measure. Large EMS services may have the ability to designate a single individual to coordinate pediatric emergency care. Rural areas of the country with very small EMS services have already adopted innovative models that utilize regional pediatric emergency care coordinators. Innovative models and flexibility are needed for achieving this performance measure.
2. Recommend emphasizing that the follow-up questions to the initial questions are for informational purposes only and that the list is simply a list of potential roles and responsibilities for the designated individual. The follow-up questions ask about the specific roles of the coordinator; this clarification will result in less confusion about the role and allow EMS agencies to develop the role to meet their needs. In Vermont, we are fortunate to have statewide protocols/guidelines; individual services have little say in the development of new protocols.
3. Recommendations for the list of potential roles and activities:
Add to the list “Promote the adoption of family-centered care policies.” This was in the original IOM suggestions for the activities.
Add the word “injury” so that the activity reads “Promote agency participation in pediatric injury prevention programs.”
Add the word “protocols” so that the activity reads: “Ensure that fellow providers follow pediatric clinical practice guidelines/protocols”. Many states have protocols; therefore, adding the word will preclude confusion and ensure more accurate data.

Add additional information to the statement “Oversee pediatric process improvement” so that the survey respondent understands what this entails and how it differs from “Ensure that fellow providers follow pediatric clinical practice guidelines/protocols”.
EMSC has responded to all comments received regarding this measure with revisions and the following FAQ:
Q: Does the PECC need to be on staff at the EMS agency?
A: No. Ideally, the Pediatric Emergency Care Coordinator (PECC) should be a member of the EMS agency and be familiar with the specific day-to-day operations and needs of the agency. Some states/territories utilize county or regional models of emergency care; if there is a designated individual who coordinates pediatric activities for a county or region, that individual could serve as the PECC for one or more individual EMS agencies within the county or region.
Q: Will there be a toolkit available for EMS agencies that provides a job description for a PECC?
A: Yes, the EMS for Children Program resource centers will develop toolkits, fact sheets, and webinars to assist State Partnership Grantees in the implementation of the new performance measures.
Q: Can you add the word ‘injury’ to the PECC role so that it reads ‘promote agency participation in pediatric injury prevention program’?
A: No. As written, the specific function does not exclude injury but rather encompasses all types of prevention programs. Injury prevention is just one type of prevention activity that a PECC could engage in; other prevention programs can include asthma or other childhood illnesses, so the EMS for Children program wants to keep the role more broadly defined. In addition, the EMS for Children program wanted to be consistent with what is recommended in the IOM report, “Emergency Care for Children: Growing Pains” (2006).
Change/addition to wording; Definition added
113 1/5/2016 EMSC Vermont Emergency Medical Services for Children Program Burlington, VT EMSC 03 Use of Pediatric-Specific Equipment The percentage of EMS agencies in the state/territory that have a process that requires EMS providers to physically demonstrate the correct use of pediatric-specific equipment. It is strongly recommended that this performance measure be reassessed and significant revisions made in an effort to prevent bias against smaller, rural EMS agencies. It is recommended that a clear definition of the "process" related to this performance measure be developed.

1. Recommend changing the numerator to read: "The number of licensed EMS agencies in the state/territory that score a 6 or more on a 0-12 scale" to allow the small volunteer and first-responder communities to be included in a realistic goal of strengthening the health workforce.
Many states have a large portion of first-response/non-transporting services within their EMS system. A volunteer EMS provider at a small-volume agency may not see a pediatric patient within a year (or two or three); therefore, requiring EMS providers to demonstrate skills via a field encounter is not realistic or achievable.
2. Equipment List Recommendation: It is important to clarify the performance measure and provide well-considered guidance to the services. Three areas needing clarification are the type of equipment, the use of standardized courses in the service's "process", and the use of the National Registry of EMTs' Continued Competency Program (CCP) in the service's "process".
It is suggested that a list of example pediatric-specific equipment be developed and provided to the survey respondents to illustrate the type and scope of pediatric equipment that may be considered when answering the proposed survey questions. With the wide variety of protocols and skills among EMS in the nation, the equipment competency should have a strong focus on the ABCs of EMS! The following list is an example of pediatric-specific equipment that might be used in an EMS agency's competency testing.

Examples/Suggested list includes:
BLS:
1. Oro- and Nasopharyngeal airways
2. Suctioning - tips and catheters and bulb suction
3. BVM- selection mask and bag sizes
4. Supraglottic airway - selection of size, insertion technique, and confirmation of placement
5. AED
6. Child safety restraints (safety seats and other kinds of child specific restraints)
7. IV and IO Insertion
8. Weight/ Length-based Tape use

ALS:
9. ET tubes - selection of size, insertion technique, and confirmation of placement
10. Manual Defibrillator and synchronized cardioversion

3. It is strongly recommended that the use of the CCP recertification program to verify pediatric-specific equipment skills be acknowledged and allowed when answering the survey questions for PM 3.

Since many services include standardized classes/courses in their "process" to ensure that providers physically demonstrate the correct use of pediatric-specific equipment, it is important that the survey include information regarding PALS, PEPP, APLS, ENPC, and NRP. All of these courses require physical demonstration of some pediatric-specific equipment skills. It is strongly recommended that use of these courses be acknowledged and allowed when answering the survey questions for PM 3.
EMSC has responded to all comments on Measure EMSC 03 with revisions and the following FAQs:
Q: To achieve the skill-checking measure a state would have to reach an ‘8’ on the scale. This score is too high and unrealistic. Can you consider lowering the score for achievement?

A: Yes. After reviewing the measuring scale and supporting evidence, the program has decided that a state/territory would achieve this measure by scoring a ‘6’ or higher on the scale.

Q: The skill-checking measure is too broad. Can it be revised to include the specific pieces of equipment that the program is interested in?

A: No. At one point in the development process, the measure did include specific pieces of life-saving equipment, but as we field-tested the data collection instrument, we learned that there was variability among agencies as to which pieces of equipment were considered out of scope or whether the medical director allowed agencies to use that equipment. As a result, the program determined that knowing whether an agency has a process to skill-check, rather than which specific pieces of equipment it tests on, would help the program understand how prepared agencies are to care for children. We assume that agencies that invest the resources to skill-check will select the pieces of equipment that are most crucial or very rarely used in the field to care for children.

Q: Our state certifies EMS instructors, and only state-certified EMS instructors can teach EMS providers; how does this figure into how EMS agencies will respond to the questions in the skill-checking measure?

A: EMS agencies in states which require certified EMS instructors will need to take state regulations into consideration when they respond to the survey questions which ask about the process their agency uses to skill-check.

Q: How do national courses such as PEPP and PALS and certifications like NREMT CCP fit into the skill-checking measure?

A: As long as it is an in-person course, an agency would respond in a way that reflects the methods used in the course or certification (e.g., simulation, skill station) and the frequency of the skill-checking activities, whether or not the process occurred via a national course.

Q: Does an agency have to use or adopt all three of the methods listed (skill-station, simulation, and field encounter) to meet the measure?

A: No, the measure can be achieved by using only two of the three described methods.

Q: Can HRSA consider using the state’s recertification/re-licensure period as the time period for when skills check should occur?

A: At this time, a couple of states recertify/re-license annually, so this would make the performance measure target burdensome. In addition, a couple of states recertify/re-license EMS providers every five years, which would allow a very long period of time before a pediatric skills check would be required. The comment was considered but may not work well for states with shorter or longer periods of recertification/re-licensure.

Change/addition to wording; Definition added
114 1/5/2016 EMSC Tennessee Emergency Medical Services for Children Program Nashville, TN EMSC 01 NEMSIS Submission Overview: There are two overarching premises regarding performance measures: 1) the ultimate purpose is to improve the acute care of children, and 2) the expectations of the performance measure must be clearly defined; otherwise, the confusion leads to varying interpretations and data that do not demonstrate a difference in the care of children. As a longstanding EMSC program manager and PI, I would like to provide some historical context. The first iteration of the EMSC performance measures was also not well defined, which resulted in much misunderstanding of expectations, and the baseline data were not very useful. It is my hope that this comment period will encourage HRSA to take a pause and ensure the measures are both flexible and clearly defined. It is also important that the measures are not simple check-the-box measurements for every state but rather ultimately improve the care of children.

EMSC Proposed Performance Measure 01 EMSC Performance Measure 1 (new): The degree to which EMS agencies submit NEMSIS-compliant version 3.x data to the state EMS office for submission to the NEMSIS Technical Assistance Center (TAC).
-TN EMSC applauds the proposed performance measure related to NEMSIS and believes it will improve the overall care of all the citizens in our country, but it is not a pediatric-specific performance measure.
-As future performance measures are developed, please take into consideration the need to ensure clear applicability to pediatric-specific efforts.
EMSC has responded to all comments on Measure EMSC 01 with revisions and the following FAQs:
NEMSIS-SPECIFIC Performance Measure:

Q: Aren't these new EMS for Children performance measures unfunded mandates?

A: No, these measures were developed so the national EMSC program can obtain baseline data on how the EMS system is operating on a national level, and they are intended to assist states and territories in showing improvement in these areas over the lifespan of the measures. The EMSC program recognizes that not every state or territory may meet these measures.

Q: Why can't the measure be "NEMSIS v3 data is being collected on 90% of the call volume in the state" rather than from 90% of agencies?
A: The program did consider using call volume versus percentage of agencies. One of the reasons that the program decided on using the percentage of agencies was that many states and territories may not know the total call volume but could reasonably know the number of agencies in their state. The program is concerned that if this measure captured the percent of call volume rather than the percent of agencies, the program would not understand what was happening at small agencies. For example, by using call volume, the measure could be considered biased toward rural states with a few urban areas that potentially have 90% of the call volume. This data is important for developing program planning to address the needs of all EMS agencies.
Q: My state may never convert to NEMSIS v3, so can the measure be rewritten to eliminate the version 3 part of the measure?
A: Unfortunately, no; the program is committed to assisting our federal partner, the National Highway Traffic Safety Administration (NHTSA), in its efforts to create a national databank of EMS data. In addition, beginning January 1, 2017, the NEMSIS Technical Assistance Center (TAC) will no longer accept NEMSIS v2 data from states and territories. As a result, the program wants to be current with national standards.
Q: My state does not license EMS agencies; does this mean that I don’t have to report data on this measure?
A: Even if your state or territory does not license EMS agencies, you still have to report on this measure. The intent of this measure is to determine how many agencies in the country are submitting NEMSIS v3 data, whether those agencies are licensed at a state, local, or some other level. If the EMS agencies in your state submit NEMSIS v3 data in 2017 (the expected first round of data collection on these new EMS for Children measures) then you should report the number of EMS agencies that submit NEMSIS v3 data to the State EMS office. Your state EMS Data Manager should be able to assist you with these numbers.
Q: The NEMSIS measure does not go far enough to improve the quality of the EMS data that is being submitted. Can the measure be revised to include a list of pediatric data elements, data validation, and scoring tools?
A: This is a great idea and is something that will be considered for development in the future. For the next five years, the EMS for Children program is interested in knowing how many agencies submit v3 data in order to have a baseline of EMS data collection numbers.
Q: This measure is not pediatric-specific and is out of scope of the EMS for Children program. Can it be eliminated as a measure?
A: No. The EMS for Children Program has participated in the development of a national EMS data system since its inception, and believes this is an important area for performance measurement and improvement. Past Funding Opportunity Announcements (FOA) for the State Partnership Grants have allowed grantees to use grant funds to support the EMS data infrastructure in their states, so this measure is in line with past EMS for Children efforts and within the scope of the program. In addition, since NEMSIS collects data on patients of all ages and does contain pediatric-specific variables, pediatric patients are included.
Change/addition to wording; Definition added
115 1/5/2016 EMSC Tennessee Emergency Medical Services for Children Program Nashville, TN EMSC 02 Pediatric Emergency Care Coordination The percentage of EMS agencies in the state/territory that have a designated individual who coordinates pediatric emergency care.
-Recommend clarifying the definition of a "designated individual." In Tennessee, many of the responsibilities lie within the Training Officer role at each EMS agency.
-Recommend that, in the survey, the wording of the follow-up questions regarding the coordination of pediatric emergency care be changed to "for informational purposes" only.
-It is important to place more emphasis within the sentence on the fact that this list is simply a list of potential roles and responsibilities and they are not required. This clarification will result in more accurate data and less confusion.
-Add the word "protocols" so that the activity reads "Ensure that fellow providers follow pediatric clinical practice guidelines/protocols."
-Add "the adoption of family-centered care policies."
-Add "injury" before "prevention programs."
-Add additional, explanatory information to the statement "Oversee pediatric process improvement" so that the survey respondent understands what this means and how it differs from "Ensure that fellow providers follow pediatric clinical practice guidelines/protocols."
-The proposed survey includes a question regarding the development of EMS protocols. In Tennessee, this role is filled by the EMS medical director, not the providers. Most TN EMS agencies adopt the state EMS Medical Director's protocols, but the individual medical director can adopt the state's, modify them, or create their own.
EMSC has responded to all comments received regarding this measure with revisions and the following FAQ:
Q: Does the PECC need to be on staff at the EMS agency?
A: No. Ideally, the Pediatric Emergency Care Coordinator (PECC) should be a member of the EMS agency and be familiar with the specific day-to-day operations and needs of the agency. Some states/territories utilize county or regional models of emergency care; if there is a designated individual who coordinates pediatric activities for a county or region, that individual could serve as the PECC for one or more individual EMS agencies within the county or region.
Q: Will there be a toolkit available for EMS agencies that provides a job description for a PECC?
A: Yes, the EMS for Children Program resource centers will develop toolkits, fact sheets, and webinars to assist State Partnership Grantees in the implementation of the new performance measures.
Q: Can you add the word 'injury' to the PECC role so that it reads 'promote agency participation in pediatric injury prevention programs'?
A: No. As written, the specific function does not exclude injury but rather encompasses all types of prevention programs. Injury prevention is just one type of prevention activity that a PECC could engage in; other prevention programs can include asthma or other childhood illnesses, so the EMS for Children program wants to keep the role more broadly defined. In addition, the EMS for Children program wanted to be consistent with what is recommended in the IOM report, "Emergency Care for Children: Growing Pains" (2006).
Change/addition to wording; Definition added
116 1/5/2016 EMSC Tennessee Emergency Medical Services for Children Program Nashville, TN EMSC 03 Use of Pediatric-Specific Equipment Tennessee applauds the performance measure's focus on clinical competencies; however, it is strongly recommended that this performance measure be reassessed and significant revisions made in an effort to create better clarity.
-An EMS agency could meet this measure by simply having a skill station to demonstrate a bulb syringe and simple oral suction. As the performance measure is currently written, results will be inaccurate and/or meaningless.
-How would all EMS providers at a service that answers calls in small communities meet this matrix when it is most likely statistically impossible for every provider to have a pediatric encounter in the timeframe outlined?
-An additional concern from Tennessee's perspective is the competency or credentialing of the person evaluating the EMS providers. As this performance measure is written, it would allow a non-certified "instructor" to make competency judgments.
EMSC has responded to all comments on Measure EMSC 03 with revisions and the following FAQs:
Q: To achieve the skill-checking measure a state would have to reach an ‘8’ on the scale. This score is too high and unrealistic. Can you consider lowering the score for achievement?

A: Yes. After reviewing the measuring scale and supporting evidence, the program has decided that a state/territory would achieve this measure by scoring a ‘6’ or higher on the scale.

Q: The skill-checking measure is too broad. Can it be revised to include the specific pieces of equipment that the program is interested in?

A: No. At one point in the development process, the measure did include specific pieces of life-saving equipment, but as we field-tested the data collection instrument, we learned that there was variability among agencies as to which pieces of equipment were considered out of scope or whether the medical director allowed agencies to use that equipment. As a result, the program determined that knowing whether an agency has a process to skill-check, rather than which specific pieces of equipment it tests on, would help the program understand how prepared agencies are to care for children. We assume that agencies that invest the resources to skill-check will select the pieces of equipment that are most crucial or very rarely used in the field to care for children.

Q: Our state certifies EMS instructors, and only state-certified EMS instructors can teach EMS providers; how does this figure into how EMS agencies will respond to the questions in the skill-checking measure?

A: EMS agencies in states which require certified EMS instructors will need to take state regulations into consideration when they respond to the survey questions which ask about the process their agency uses to skill-check.

Q: How do national courses such as PEPP and PALS and certifications like NREMT CCP fit into the skill-checking measure?

A: As long as it is an in-person course, an agency would respond in a way that reflects the methods used in the course or certification (e.g., simulation, skill station) and the frequency of the skill-checking activities, whether or not the process occurred via a national course.

Q: Does an agency have to use or adopt all three of the methods listed (skill-station, simulation, and field encounter) to meet the measure?

A: No, the measure can be achieved by using only two of the three described methods.

Q: Can HRSA consider using the state’s recertification/re-licensure period as the time period for when skills check should occur?

A: At this time, a couple of states recertify/re-license annually, so this would make the performance measure target burdensome. In addition, a couple of states recertify/re-license EMS providers every five years, which would allow a very long period of time before a pediatric skills check would be required. The comment was considered but may not work well for states with shorter or longer periods of recertification/re-licensure.

Change/addition to wording; Definition added
117 1/5/2016 EMSC Tennessee Emergency Medical Services for Children Program Nashville, TN EMSC 04, EMSC 05 Pediatric Medical Emergencies, Pediatric Traumatic Emergencies EMSC Performance Measures 04 and 05 (Previously 74 and 75): The percent of hospitals with an Emergency Department (ED) recognized through a statewide, territorial, or regional standardized system that are able to stabilize and/or manage pediatric medical emergencies and trauma emergencies.
-Tennessee applauds HRSA for inclusion of the pediatric patient with a medical or trauma emergency presenting to a hospital facility as a performance measure.
-The important word in this performance measure is "system."
o The goal for this Performance Measure should be whether or not the state has implemented a system (yes or no). It should include a 'scale' to determine where states are in the process of developing their system. By Webster's definition, a system is "a group of devices or artificial objects or an organization forming a network especially for distributing something or serving a common purpose <a telephone system> <a heating system> <a highway system> <a computer system>" (http://www.merriam-webster.com/dictionary/system).
o The important word in this performance measure is "system," and the data collection tool should reflect that by substituting "recognition program" with "recognition system."
o The scale as written states, "At least one facility has been formally recognized through the pediatric medical facility recognition program." One hospital that meets the National Guidelines for the Care of Children in an Emergency Department does not make a system. To address the morbidity and mortality of critically ill children, a state needs a system (more than one hospital) that includes something beyond the minimum basic level, such as a critical care unit. Every hospital in a state should, at a minimum, meet the National Guidelines cited above. However, every hospital in a state doesn't need to be a Comprehensive Regional Pediatric Center. The performance measure should take into consideration that a state's system may include critical care units in another state or region due to borders and the remoteness of frontier areas and territories.
o In addition, Tennessee's system of care for children has demonstrated that it supports all hospitals and EMS agencies in our state in meeting all the Performance Measures.
-As an example, EMS transports are reviewed at the Comprehensive Regional Pediatric Centers. A report on the quality of care is sent to the EMS service and, if the transport was an inter-facility transfer, to the hospitals. This process would address both quality improvement and field-encounter skill demonstration. Prior to the recognition system in Tennessee, there was some trauma outreach. Now every hospital and EMS service is connected with a CRPC for QI, education, and some research.
HRSA appreciates the comments received related to the current performance measures to assure systems are prepared to stabilize and manage pediatric medical and traumatic emergencies. The comments will be used to begin the discussion as we develop the next generation of Hospital-Based performance measures. The comments are indeed very helpful and will be shared with additional subject matter experts as we work to build a consensus on what should be the next generation of EMSC hospital-based performance measures. We will keep EMSC stakeholders informed throughout the development. Change/addition to wording; "Relates to ability to report, and should be taken into consideration when assigning measures further down the road."
118 1/5/2016 CSHCN Got Transition/Center for Health Care Transition Improvement Washington, DC CSHCN 3 Transition to Adult Health Care Our comments on CSHCN Goal 3 Transition pertain to the goal statement, its measurement, definition, significance, and activity data collection form. Overall, we recommend that the reporting requirements consider transition for all youth, as MCHB's performance measure calls for. We also recommend that you consider adding a question before the definition about whether or not the responding agency provides clinical care, because much of the information requested in two of the three outcomes will not work for those not providing direct services. Below are our suggested alternatives for each of these sections.



Goal: To ensure supportive programming for transition to adult health care for youth with and without special needs.
Comment: Although MCHB's transition measure falls under the MCH population domain of children with special health care needs, the actual goal should be consistent with MCHB's performance measure and expanded to include youth with and without special health care needs.

Tier 3. How many (systems, providers, patients, and families) are reached through these transition-related activities?
Comment: We recommend simplifying this table to count only total numbers and not to break out by audience.

Tier 4. What are the related transition outcomes?
Comment: All of these outcomes are new.

% of grantees promoting an evidence-informed framework (e.g., Six Core Elements of Health Care Transition) and clinical recommendations (AAP/AAFP/ACP Supporting the Health Care Transition from Adolescence to Adulthood in the Medical Home) for transition from pediatric to adult health care.
Numerator: Number of Grantees promoting an evidence-informed framework
Denominator: Total Number of grantees reporting transition performance measure

% of grantees involving both pediatric and adult providers/systems in transition efforts
Numerator: Number of pediatric and adult dyads involved in grantees' transition efforts
Denominator: Total number of transition practices sponsored by grantee

% of grantees initiating or encouraging transition planning early in adolescence
Numerator: Number of Grantees promoting transition planning early in adolescence
Denominator: Total number of grantees reporting transition performance measure


% of grantees linking transition efforts with medical home initiatives
Numerator: Number of Grantees promoting transition as part of routine medical home care
Denominator: Total number of grantees reporting transition performance measure

% of grantees linking transition efforts with adolescent preventive care efforts
Numerator: Number of grantees promoting transition as part of routine adolescent preventive care
Denominator: Total number of grantees reporting transition performance measure

Measure has been revised; not all Tiers will be assigned to all grantees. Change/addition to wording; Relates to ability to report, and should be taken into consideration when assigning measures further down the road.
119 1/5/2016 Workforce Development/ Training University of Arizona Pediatric Pulmonary Center Arizona Not Related to Measure
Thank you for giving us this opportunity to review and provide feedback on the proposed performance measures. We would like to support the feedback provided by University of Alabama at Birmingham Pediatric Pulmonary Center. We believe the extra reporting measures and capacity-building measures will not focus on the aims and goals the PPCs have in their training grant and will provide duplication in some areas. No resolution necessary; comment refers to comments made by University of Alabama at Birmingham PPC. No changes necessary
120 1/5/2016 Workforce Development/ Training WI LEND Program - University of Wisconsin - Madison Madison, WI Not Related to Measure, Training Forms
Technical Assistance and Collaboration Form (and also Continuing Education)

While the new way of choosing/categorizing the activities could be helpful in reducing reporting burden, the topics listed in List B do not include many relevant issues for which the LEND programs provide technical assistance and continuing education. Missing from List B are topics related to neurodevelopmental disabilities such as autism and other developmental disabilities; children with special health care needs; developmental screening; early childhood growth, development and education; and life course issues. This list seems too narrow. If other topics cannot be added, please add an "other" category/option.

Also, I'm not sure why Title V is added as a separate primary target audience for technical assistance; this appears redundant with listing Title V as the recipient of TA/collaborator and would make data entry very confusing.
Form has been revised. Suggested additional audiences have largely not been added, as the potential audiences are extensive, so prioritizations have been made. An "Other" category has been added. Change/addition to wording
121 1/5/2016 Workforce Development/ Training WI LEND Program - University of Wisconsin - Madison Madison, WI Training 9 Interdisciplinary Practice Training 09 Interdisciplinary Practice

While the aggregate data on the percent of long-term trainees who work in an interdisciplinary manner would be relevant based on responses for the listed activities, I'm not sure why the individual % for each item is helpful.
Collecting data on specific interdisciplinary skills allows DMCHWD to present a more detailed picture of former trainee outcomes. No changes necessary
122 1/5/2016 Workforce Development/ Training WI LEND Program - University of Wisconsin - Madison Madison, WI Training 13 Policy Development, Implementation, and Evaluation Training 13 Policy Development
Under Category #1 Training, Element 13: is this referring to participants in the LONG-TERM training program? Please specify whether the intended measurement, both the activities and the % of trainees with increased policy knowledge and skills, is for long-term trainees.
Yes - the pre/post knowledge is relevant only to long-term trainees. This has been clarified in the Training 13 measure. Change/addition to wording
123 1/6/2016 CSHCN Family Voices Albuquerque, NM F2F 1 Provide National Leadership for families with children with special health needs 3. Family Voices recognizes the critical importance of data collection to document the outcomes of MCH investments. However, based on our experiences working with the F2F HIC grantees, the burden can vary tremendously, and may be particularly burdensome for grantees with limited staff and resources. For the data to be valuable, time must be dedicated to understanding the measure, developing appropriate data collection elements, and allocating staff time to collect and report the data. Based on our experience in helping F2F HICs to report data for current Performance Measure 70, the estimated time burden of 41 hours per grantee underestimates the amount of time some F2Fs need to collect, analyze and report the data required. See calculations below under #6.

Family Voices recommends that MCHB provide resources for training for grantees, and consider the resources needed by grantees for data collection based on the specific program requirements in future grant awards.
All efforts will be made to minimize reporting burden; the hours reported in the burden statement represent a weighted average burden, rather than the anticipated burden on each specific grantee. "Relates to ability to report, and should be taken into consideration when assigning measures further down the road."
124 1/6/2016 CSHCN Family Voices Albuquerque, NM CSHCN 1 Family Engagement 2. One of the most important aspects of measurement is consistency in data reporting among all grantees. Without definitions and a universal process to collect data, it is difficult to aggregate data and compare results within and across programs. For example, on the performance measure for CSHCN, how are family versus CSHCN leaders defined? Are "CSHCN leaders" meant to be Title V staff, other professionals, or children themselves? The wording of the measure on the percent of programs promoting and/or facilitating family engagement among children and youth with special health care needs is confusing. Is it measurement of the percent of programs promoting and/or facilitating family engagement among children and youth with special health care needs, or does it refer to measurement of the percent of programs promoting and/or facilitating family engagement within the programs that serve CSHCN?
Family Voices recommends that MCHB provide guidance which includes definitions and specific suggestions for tools and processes to collect the data that is intended to inform the measures. Family Voices also recommends that groups of grantees and Project Officers meet periodically to discuss protocols, processes and strategies for reporting these performance measures.
--- Definitions are attached.
--- Tier 1 will be changed to "Are you promoting and/or facilitating family engagement within programs that serve children and youth with special health care needs?"
Change/addition to wording; Definition added
125 1/6/2016 CSHCN Family Voices Albuquerque, NM CSHCN 2, Suggested Addition Access to and Use of Medical Home 4. As acknowledged by MCHB, access to and use of medical home is key to improving outcomes for all MCHB populations, not just children and youth with special health care needs. However, it appears that this performance measure is focused solely on children with special health care needs, and is not required by other programs serving the full range of MCH populations.

Family Voices recommends that a performance measure be added to the Women, Child and Adolescent domains to address promoting and/or facilitating medical home access.
No CSHCN-specific resolution is needed. No changes necessary
126 1/6/2016 CSHCN Family Voices Albuquerque, NM F2F 1 Provide National Leadership for families with children with special health needs 6. The following comments are specific to the proposed F2F 1 Performance Measure: The percent of families with Children with Special Health Care Needs that have been provided information, education and/or training by Family-to-Family Health Information Centers (F2F)


Item #A1a: Our organization provided one-on-one health care information (including referrals)/education/training/peer support to families with CSHCN to assist them in accessing information and services. Total number of families served/trained:___________

Comment: This score represents a ratio calculated from the total number of families that have been provided information, education, and/or training from an F2F divided by the estimated number of families with CSHCN in the State, calculated from the National Survey. This ratio is complicated by the following:

•The national survey provides an estimate of the number of families with CSHCN in the state based on calls made to individual families which is then extrapolated to arrive at the estimate of total families in the state with CSHCN. When the F2F is calculating the # of families served, it is not always possible for an F2F to identify the individual family who is being assisted or trained. For example, many F2Fs provide trainings for which no identifying information is provided about the participant. Similarly, it is not always possible to obtain identifying information, including racial and ethnic data, on individuals served one-on-one at community events. In addition, many F2Fs provide assistance via Facebook and it is not feasible to obtain identifying information in these circumstances. If identifying information is not available then the participant data cannot be merged and de-duplicated with data on individuals that have been served and identified by the F2F, resulting in inaccurate numbers.



Item #A1b: Race

Comment:

•It appears that American Indian/Alaskan Native (AIAN) is missing from the list of Race Categories.

•Some families identify in more than one race category. Is it feasible, then, to have the total # of families served by race be greater than the total # of families served (A1a)?





Item #A1d: Instances of service by type

Comment:

•Should training that is provided online be included here, as it may not be interpreted as a one-on-one service?

•What types or examples of meetings/conferences should be included here? If an F2F staff participates in a meeting but does not contribute, is the F2F providing one-on-one service? Should conferences be broken down into individual workshops (participants may differ from workshop to workshop)?

Item #2a: Total number of professionals/providers served/trained

Comment:

•Are counts of professionals/providers also based on one-on-one service?



Item #3a: Print/media information and resource dissemination

Comment:

•Are all information/resources counted or just those authored by the F2F?

•It is assumed that hardcopy disseminations represent a total # of materials disseminated. For example, if the F2F disseminated 50 brochures and 100 care notebooks at an event, then 150 would be reported here. What is counted for electronic newsletter, listserv, and social media platforms? For example, if the F2F announces a new resource to their listserv of 500 members, is a count of 1 or 500 reported here?

•Should web downloads of materials (PDF, doc, ppt) also be added to this list?


Item #4a: Types of State agencies

Comment:

•Why is this measure a count of types rather than a total of agencies?

•Or is this measure intended to include a count of types as well as a total of all agencies?

Comments re the estimate of time needed for data tracking of F2F 1 Performance Measure

Multiple data processing steps are required to meet this measure. See footnote below about data entry.*

•The number of one-on-one unduplicated cases served by F2Fs reported in FY 2015 ranges from 166 to 73,401. The median is 1,503. Calculated at 2.5 minutes per case, the number of hours of data entry time needed based on the median is 62 hours.

•Data entry time required to enter trainings, calculated at 10 minutes per training (including participants) and based on the average # of trainings recorded by an F2F in FY 2015 (216), is 36 hours.

•Time required to aggregate this data and submit it to the NCFPP is estimated at 2-8 hours per F2F: 8 hours.


•Time required to pull, clean, and aggregate material disseminations by type, partnering agencies by type, partner agreements, and staff counts is estimated at 5 hours per F2F: 5 hours.
•Total Estimated Data Processing Time per F2F based on current PM 70: 111 hours annually (a worked check of this arithmetic appears after the footnote below).
* From an F2F comment submitted with their 2014-2015 data report to the NCFPP: "Even with the customized Salesforce data system, data collection and reporting remains extremely time consuming. There are several steps required to enter all of the information for each encounter with a family. New cases can take up to five minutes to enter all information and details."
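A minimal sketch (an editorial illustration, not part of the original submission) that reproduces the burden arithmetic above using only the commenter's stated inputs: 2.5 minutes per case, the FY 2015 median of 1,503 cases, 216 trainings at 10 minutes each, and the 8-hour and 5-hour aggregation estimates.

import math

median_cases = 1503          # median one-on-one cases per F2F, FY 2015 (from the comment)
minutes_per_case = 2.5       # commenter's per-case data entry estimate
avg_trainings = 216          # average trainings recorded per F2F, FY 2015 (from the comment)
minutes_per_training = 10    # commenter's per-training data entry estimate
aggregation_hours = 8        # upper end of the 2-8 hour NCFPP aggregation estimate
materials_hours = 5          # pull/clean/aggregate disseminations, partners, staff counts

case_entry_hours = math.floor(median_cases * minutes_per_case / 60)   # 3,757.5 minutes, about 62 hours
training_entry_hours = avg_trainings * minutes_per_training // 60     # 2,160 minutes = 36 hours
total_hours = case_entry_hours + training_entry_hours + aggregation_hours + materials_hours

print(case_entry_hours, training_entry_hours, total_hours)            # 62 36 111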
A1a: The language in the definition on page 140 will be revised to state the following denominator: "The targeted number of families with CSHCN in the State." The new denominator to be used will be the number of families that can be reasonably served with provided federal grant funds. The new denominator will be based upon a formula that factors in the national survey data and the relative number of families that can be served by the federal allocation of Title V funds. The total number of families served is based solely on "one-to-one" service conducted by the F2F (A1). Grantees are expected to capture client-level data for families provided targeted, individualized assistance. If identifier information (e.g., racial and ethnic data) is not available, the grantee will capture this total within the "unknown" section. Specific to assistance provided via social media, at minimum, basic identifier information (e.g., Twitter handle) should be tracked.



A1b: Per the recommended comment, the "Native American/American Indian or Alaskan Native category" will be added to the race category within item A1b. In addition, an item reflecting "Multiple races" will be added to capture those families who identify themselves as multi-racial.



A1d: No. Online trainings conducted should be reflected in the "Group training opportunities" section. Trainings to one individual family should be captured in the "individualized assistance" section. The meetings/conferences section will be revised to state "Outreach/information sharing." Conference trainings, workshops, or any venue where the F2F will be teaching a skill with tangible learning objectives will be classified as "group training opportunities." The new outreach/information sharing section should capture instances where the F2F provides general information to build awareness, educate, or communicate a topic and/or organization's services to the public or a specific group of individuals.



A2a: Yes. For consistency, similar categories shown in A1a and A1b will be added for providers.



A3a: After reviewing this item, it may not be feasible or practical for F2Fs to provide numbers of resources disseminated. Therefore, this question will be designed to capture "how" outreach is conducted via information and resources dissemination. Item 3a will be revised to state, "Select the modes of how print/media information and resources are disseminated. (Select all that apply)." The revised outlets will include:

• Electronic newsletters and listservs
• Hardcopy
• Public television/radio
• Social media (Specify platform): __________
• Text messaging
• Website
• Other (Specify): _____________

Grantees will be instructed to develop in-house media metrics to gauge the effectiveness and impact of mode/type of information dissemination.
A4a: Correct. Item A4a will be revised to state, "Number of state agencies/programs…" In addition, item B1a will be revised to state, "Number of community-based organizations."
Change/addition to wording; Definition added
127 1/6/2016 CSHCN Family Voices Albuquerque, NM LC 3 Oral Health The population domains include children, but not specifically children and youth with special health care needs. Access to oral health is a particular challenge for children/youth with special health care needs, and it would be helpful to collect data on efforts to improve access to oral health for this subpopulation of children. It should be noted that the other life course performance measures have CSHCN as a separate population domain.

Family Voices recommends that the life course performance measure on oral health be revised to have a specific domain for children and youth with special health care needs. This will align this performance measure with the other life course performance measures.

Yes; now included in LC measure tables.

Change/addition to wording
128 1/6/2016 CSHCN Family Voices Albuquerque, NM Training 1, CSHCN 1, Suggested Addition MCH Training Program Family Member/Youth/Community Member participation, Family Engagement 1. Family Voices welcomes and strongly supports MCHB's acknowledgement of the critical role of family engagement in policymaking activities and the statement that "in accordance with this philosophy, MCHB is facilitating such partnerships at the local, state and national levels".

However, we believe that family/consumer engagement should be required and measured across all MCHB-funded programs, beyond children and youth with special health care needs. Family Voices believes that all consumers of health care services – women, youth, and families of all children – need to play a critical role in informing policy and driving program activities that are relevant to the services they consume. While it appears that the revised Performance Measures for Discretionary Grants require some programs (e.g., workforce development) to measure this involvement, it is unclear whether all programs will be required to measure their partnerships with consumers and families. Partnerships with family-led organizations, particularly engagement with fellow MCHB-funded F2F HICs, should be measured by all grantees. Furthermore, every grantee, as part of measurement of authentic family engagement, should be required to gather feedback directly from the family members/consumers with whom they are engaged, including information on the diversity of populations they represent, and this feedback should relate to the value/impact of their engagement in the development, implementation and evaluation of the program.

Family Voices recommends that MCHB review all the domain and program-specific measures and detail sheets to assure that there is universal application of measurement of the critical role of family and consumer engagement. We recommend that a performance measure be added to all domains (e.g., Child, Women, Adolescent health) to address promoting and facilitating family/consumer engagement. We recommend that grantees be required to gather feedback from their engaged families/consumers as part of their measurement protocol and that this feedback represent and be gathered from the full diversity of populations served, particularly including those from underserved groups and family-led organizations.

No Training or CSHCN-specific resolution is needed. No changes necessary
129 1/6/2016 EMSC Illinois DPH - Division of EMS and Highway Safety Springfield, IL EMSC 01 NEMSIS Submission Recommend eliminating this measure since it is not pediatric-specific.

Access to data is recognized as an essential EMS System component in order to obtain a better understanding of patient populations and resource utilization/needs, as well as to identify trends, educational needs and opportunities for improvement. However, this proposed measure implies that State EMSC programs have responsibility for their state EMS data. In fact, this activity falls outside the authority of State EMSC programs, and would be better tasked to the entities directly responsible for EMS data at the state and national levels. State EMSC programs should certainly support data initiatives - for example, there is currently an EMS Compass initiative which is working to develop overarching EMS performance measures based on the latest version of the National EMS Information System (NEMSIS) and will allow local and state EMS systems to use their own data meaningfully. In addition, it's very important to understand that the funding provided to state partnership grantees is limited, and therefore should be used primarily to target performance measures with clear applicability to the pediatric community.
EMSC has responded to all comments on Measure EMSC 01 with revisions and the following FAQs:
NEMSIS-SPECIFIC Performance Measure:

Q: Aren't these new EMS for Children performance measures unfunded mandates?

A: No, these measures were developed so the national EMSC program can obtain baseline data on how the EMS system is operating on a national level, and they are intended to assist states and territories in showing improvement in these areas over the lifespan of the measures. The EMSC program recognizes that not every state or territory may meet these measures.

Q: Why can't the measure be "NEMSIS v3 data is being collected on 90% of the call volume in the state" rather than from 90% of agencies?
A: The program did consider using call volume versus percentage of agencies. One of the reasons that the program decided on using the percentage of agencies was that many states and territories may not know the total call volume but could reasonably know the number of agencies in their state. The program is concerned that if this measure captured the percent of call volume rather than the percent of agencies, the program would not understand what was happening at small agencies. For example, by using call volume, the measure could be considered biased toward rural states with a few urban areas that potentially have 90% of the call volume. This data is important for developing program planning to address the needs of all EMS agencies.
Q: My state may never convert to NEMSIS v3, so can the measure be rewritten to eliminate the version 3 part of the measure?
A: Unfortunately, no; the program is committed to assisting our federal partner, the National Highway Traffic Safety Administration (NHTSA), in its efforts to create a national databank of EMS data. In addition, beginning January 1, 2017, the NEMSIS Technical Assistance Center (TAC) will no longer accept NEMSIS v2 data from states and territories. As a result, the program wants to be current with national standards.
Q: My state does not license EMS agencies; does this mean that I don’t have to report data on this measure?
A: Even if your state or territory does not license EMS agencies, you still have to report on this measure. The intent of this measure is to determine how many agencies in the country are submitting NEMSIS v3 data, whether those agencies are licensed at a state, local, or some other level. If the EMS agencies in your state submit NEMSIS v3 data in 2017 (the expected first round of data collection on these new EMS for Children measures) then you should report the number of EMS agencies that submit NEMSIS v3 data to the State EMS office. Your state EMS Data Manager should be able to assist you with these numbers.
Q: The NEMSIS measure does not go far enough to improve the quality of the EMS data that is being submitted. Can the measure be revised to include a list of pediatric data elements, data validation, and scoring tools?
A: This is a great idea and is something that will be considered for development in the future. For the next five years, the EMS for Children program is interested in knowing how many agencies submit v3 data in order to have a baseline of EMS data collection numbers.
Q: This measure is not pediatric-specific and is out of scope of the EMS for Children program. Can it be eliminated as a measure?
A: No. The EMS for Children Program has participated in the development of a national EMS data system since its inception, and believes this is an important area for performance measurement and improvement. Past Funding Opportunity Announcements (FOA) for the State Partnership Grants have allowed grantees to use grant funds to support the EMS data infrastructure in their states, so this measure is in line with past EMS for Children efforts and within the scope of the program. In addition, since NEMSIS collects data on patients of all ages and does contain pediatric-specific variables, pediatric patients are included.
Change/addition to wording; Definition added
130 1/6/2016 EMSC Illinois DPH - Division of EMS and Highway Safety Springfield, IL EMSC 02 Pediatric Emergency Care Coordination Recommend clarifying the definition of a "designated individual" and revising the detail sheet to allow flexibility in the achievement of this performance measure.

1. Healthcare organizations can benefit from access to a pediatric emergency care coordinator who works to assure the inclusion of pediatric considerations in clinical care, protocol development, education/training and quality initiatives. Larger EMS agencies will likely have the ability to meet this measure; however, mandating this requirement at the local EMS agency level will create challenges for small, rural volunteer agencies who at times have difficulty retaining adequate staff to provide 24-hour coverage. This performance measure should allow for pediatric coordinators at the EMS Regional or EMS System level (who have direct contact with their local EMS agencies). This regional approach reflects the type of infrastructure that already exists in many states, in which education/training, protocol development, quality oversight and other activities are coordinated at the EMS Region or EMS System level.

2. It is unclear whether the survey questions on page 108 of the data collection form for this measure are examples only. These questions identify specific responsibilities of the coordinator, and it could be misconstrued that all of these questions must be answered in the affirmative in order to achieve a score of "3" for this measure (Score of 3 = "Our EMS agency HAS a designated INDIVIDUAL who coordinates pediatric emergency care"). The list should emphasize that it contains examples of potential (not required) responsibilities. This clarification will decrease the potential for misinterpretation, resulting in more accurate data.

3. Recommend revisiting the list of coordinator responsibilities on page 108 as follows:
--- Change "pediatric clinical practice guidelines" to "EMS pediatric clinical practice guidelines/protocols", since most EMS agencies utilize the term "EMS protocols". In addition, "protocols" is used in the first question on page 108, so this change will ensure consistency in language.
--- Clarify "pediatric process improvement" by changing it to "pediatric quality improvement" (or similar language) to avoid misinterpretation.
EMSC has responded to all comments received regarding this measure with revisions and the following FAQ:
Q: Does the PECC need to be on staff at the EMS agency?
A: No. Ideally, the Pediatric Emergency Care Coordinator (PECC) should be a member of the EMS agency and be familiar with the specific day-to-day operations and needs of the agency. Some states/territories utilize county or regional models of emergency care; if there is a designated individual who coordinates pediatric activities for a county or region, that individual could serve as the PECC for one or more individual EMS agencies within the county or region.
Q: Will there be a toolkit available for EMS agencies that provides a job description for a PECC?
A: Yes, the EMS for Children Program resource centers will develop toolkits, fact sheets, and webinars to assist State Partnership Grantees in the implementation of the new performance measures.
Q: Can you add the word 'injury' to the PECC role so that it reads 'promote agency participation in pediatric injury prevention programs'?
A: No. As written, the specific function does not exclude injury but rather encompasses all types of prevention programs. Injury prevention is just one type of prevention activity that a PECC could engage in; other prevention programs can include asthma or other childhood illnesses, so the EMS for Children program wants to keep the role more broadly defined. In addition, the EMS for Children program wanted to be consistent with what is recommended in the IOM report, "Emergency Care for Children: Growing Pains" (2006).
Change/addition to wording; Definition added
131 1/6/2016 EMSC Illinois DPH - Division of EMS and Highway Safety Springfield, IL EMSC 03 Use of Pediatric-Specific Equipment Strongly recommend revising this measure since it lends itself to widely variable interpretation, which can result in inconsistent reporting.

This performance measure needs clarification since it can be interpreted in a variety of ways, thus likely resulting in inconsistent and/or unreliable reporting. Recommend the following:
• Develop a clear definition of the word "process".
• Define examples of specific skills/equipment, utilizing the ABCs as a framework.
• Clarify the qualifications/credentialing of the individuals evaluating the skill/equipment use.
• Revisit the rubric to assure a consistent interpretation (provide examples).
• Change the defined achievement score to "6 or higher" on a 0-12 scale (currently a score of 8 or higher is needed to meet achievement). This change takes into account that field encounters are less realistic or achievable for many providers, particularly small volume agencies.
• Allow the skills demonstrations within standardized courses (such as PALS, PEPP, APLS and ENPC) and the use of the National Registry of EMT's Continued Competency Program (CCP) to meet the skill station component in the scoring rubric.
EMSC has responded to all comments on Measure EMSC 03 with revisions and the following FAQs:
Q: To achieve the skill-checking measure a state would have to reach an ‘8’ on the scale. This score is too high and unrealistic. Can you consider lowering the score for achievement?

A: Yes. After reviewing the measuring scale and supporting evidence, the program has decided that a state/territory would achieve this measure by scoring a ‘6’ or higher on the scale.

Q: The skill-checking measure is too broad. Can it be revised to include the specific pieces of equipment that the program is interested in?

A: No. At one point in the development process, the measure did include specific pieces of life-saving equipment, but as we field-tested the data collection instrument, we learned that there was variability among agencies as to which pieces of equipment were considered out of scope or whether the medical director allowed agencies to use that equipment. As a result, the program determined that knowing whether an agency has a process to skill-check, rather than which specific pieces of equipment it tests on, would help the program understand how prepared agencies are to care for children. We assume that agencies that invest the resources to skill-check will select the pieces of equipment that are most crucial or very rarely used in the field to care for children.

Q: Our state certifies EMS instructors, and only state-certified EMS instructors can teach EMS providers; how does this figure into how EMS agencies will respond to the questions in the skill-checking measure?

A: EMS agencies in states which require certified EMS instructors will need to take state regulations into consideration when they respond to the survey questions which ask about the process their agency uses to skill-check.

Q: How do national courses such as PEPP and PALS and certifications like NREMT CCP fit into the skill-checking measure?

A: As long as it is an in-person course, an agency would respond in a way that reflects the methods used in the course or certification (i.e. simulation, skill-station) and frequency of the skill-checking activities whether the process occurred via a national course or not.

Q: Does an agency have to use or adopt all three of the methods listed (skill-station, simulation, and field encounter) to meet the measure?

A: No, the measure can be achieved by using only two of the three described methods.
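The following is a hypothetical sketch only (Python) and is not part of the measure specification: the rubric's per-method point values are not reproduced in this summary, so the assumed structure of three methods worth 0-4 points each (0-12 total) is illustrative. It shows how the revised "6 or higher" threshold could be reached using only two of the three methods, consistent with the answer above.

def meets_measure(skill_station, simulation, field_encounter):
    # Assumption: each method contributes a 0-4 sub-score toward an assumed 0-12 total.
    scores = (skill_station, simulation, field_encounter)
    if any(s < 0 or s > 4 for s in scores):
        raise ValueError("sub-scores assumed to range from 0 to 4")
    # Revised achievement threshold described in the FAQ: total score of 6 or higher.
    return sum(scores) >= 6

print(meets_measure(4, 3, 0))  # True: two methods only (no field encounter), total 7
print(meets_measure(3, 2, 0))  # False: total 5 falls below the revised threshold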

Q: Can HRSA consider using the state’s recertification/re-licensure period as the time period for when skills check should occur?

A: At this time, a couple of states recertify/re-license annually, which would make the performance measure target burdensome, while a couple of states recertify/re-license EMS providers every five years, which would allow a very long period of time before a pediatric skills check is required. The comment was considered, but the suggestion may not work well for states with shorter or longer recertification/re-licensure periods.

Change/ addition to wording,Definition added
132 1/6/2016 EMSC Illinois DPH - Division of EMS and Highway Safety Springfield, IL EMSC 04 Pediatric Medical Emergencies Recommend exploring strategies to assist more states in attaining achievement of this measure.

EMSC Performance Measure #74 remains a challenging measure; however, achievement lends innumerable benefits within a state and enhances the pediatric emergency/critical care infrastructure. A core benefit of a tiered recognition system is the resultant collaborative efforts and cross-institutional work. For example, small community hospitals lacking the resources to truly invest in pediatric quality improvement initiatives can benefit from collaboration with pediatric tertiary care centers through a pediatric facility recognition process. Recommend a steadfast exploration of strategies and commitment of resources to assist more states in attaining this performance measure.
HRSA appreciates the comments received related to the current performance measures to assure systems are prepared to stabilize and manage pediatric medical and traumatic emergencies. The comments will be used to begin the discussion as we develop the next generation of Hospital-Based performance measures. The comments are indeed very helpful and will be shared with additional subject matter experts as we work to build a consensus on what should be the next generation of EMSC hospital-based performance measures. We will keep EMSC stakeholders informed throughout the development. Change/ addition to wording,"Relates to ability to report, and should be taken into consideration when assigning measures further down the road."
133 1/6/2016 CSHCN Region 4 Midwest Genetics Collaborative - Michigan Public Health Institute Okemos, MI CB 1 State capacity for advancing the health of MCH populations In Tier 3, the list of State agencies should separate Newborn Screening (NBS) from Genetics as each is an important partner to HRSA/MCHB. Because HRSA has begun to emphasize genetics across the lifespan, it is critical to create distinct categories for NBS and Genetics. Through this ontology, we believe that HRSA will have additional insights about the extent to which genetics is being addressed by its grantees. Same as above. No changes neccessary
134 1/6/2016 CSHCN Region 4 Midwest Genetics Collaborative - Michigan Public Health Institute Okemos, MI CB 2 Technical Assistance Genetics is missing from the list of MCH priorities. Genetics should be added to Tier 3 and to the Data Collection Form for CB2. Yes.

Change/ addition to wording
135 1/6/2016 CSHCN Region 4 Midwest Genetics Collaborative - Michigan Public Health Institute Okemos, MI CB 3 Impact Measurement This is an opportunity to ask grantees about the State and national data sources that they are using to assess their activities and impact. It could give HRSA data on the use of the National Survey of Children’s Health, birth defects registries, etc. This data would help support the importance of taxpayers’ investment in these State and national data resources.

Impact should be defined in Tier 1, 2, and 4
Need definition for impact measurement; tie to logic model (Jamelle Banks). Definition added
136 1/6/2016 CSHCN Region 4 Midwest Genetics Collaborative - Michigan Public Health Institute Okemos, MI CB 6 Products Tier 3 should also include some measure of use of these products. The NCC/RC system uses number of unique visits and home page visits to measure the use of its Internet resources. Impact factors of publications might be another metric to consider.

Web-based products should be categorized for data collection. Web-based products vary greatly in their reach, and it would be helpful to collect this at a national level, particularly as we move toward the future and most products/outreach takes place through the internet.
Same as comment 166

Addressed in other similar or identical comment.
137 1/6/2016 CSHCN Region 4 Midwest Genetics Collaborative - Michigan Public Health Institute Okemos, MI Core 2 Quality Improvement We applaud HRSA for recognizing cross-sectoral collaboratives across multiple organizations in Tier 2. We suggest that an additional aim of this type of collaboration might be improved coordination across MCHB-funded programs. While valuable, this recommendation will not be implemented at this time. No changes neccessary
138 1/6/2016 CSHCN Region 4 Midwest Genetics Collaborative - Michigan Public Health Institute Okemos, MI Core 2 Quality Improvement Add Tracking and Monitoring as a new row. Data collection and analysis is sufficiently distinct from quality improvement to warrant its own row.


As indicated in our comments on CSHCN 1, we recommend that HRSA distinguish between governmental and nongovernmental partnerships in the column headings.
Addressed elsewhere. Change/ addition to wording
139 1/6/2016 CSHCN Region 4 Midwest Genetics Collaborative - Michigan Public Health Institute Okemos, MI CSHCN 1 Family Engagement Add Tracking and Monitoring to Tier 2 to emphasize the importance of data collection around family engagement. Meaningful participation should be defined.

Add regional to the geographic units included in Tier 4. This addition would recognize that some activities can be more efficiently achieved on a regional basis.
Changed to meaningful roles. Definition is attached Definition added
140 1/6/2016 CSHCN Region 4 Midwest Genetics Collaborative - Michigan Public Health Institute Okemos, MI CSHCN 2 Access to and Use of Medical Home Table 1 is to be used to report activities. It would be helpful to clarify where local public health activities should be counted. We recommend that HRSA distinguish between governmental and nongovernmental partnerships.

Tier 4 could be enhanced by including other performance measures, e.g. improving care coordination with specialists.
Refer to comment 80 Addressed in other similar or identical comment.
141 1/6/2016 CSHCN Region 4 Midwest Genetics Collaborative - Michigan Public Health Institute Okemos, MI LC 1 Adequate Health Insurance Coverage -- Add Tracking and Monitoring to Tier 2 to emphasize the importance of data collection around health insurance coverage. -- Similarly, add Tracking and Monitoring to the LC1 Data Collection form.
-- In Tier 4, it would be helpful to provide a definition for adequate health insurance coverage.
Same as above-- adopt 'tracking and surveillance'. Change/ addition to wording
142 1/6/2016 CSHCN Region 4 Midwest Genetics Collaborative - Michigan Public Health Institute Okemos, MI PIH 3 Newborn Screening We applaud the State’s newborn screening programs and the identification and follow-up testing that save thousands of lives each year. However, there is no current mechanism for tracking nationally what happens to these children once they have been referred to the physician of record. Noted. This can be taken into consideration when we assign measures later this year. "Relates to ability to report, and should be taken into consideration when assigning measures further down the road."
143 1/6/2016 EMSC National Association of State EMS Officials (NASEMSO), Pediatric Emergency Care Council (PECC)
EMSC 01 NEMSIS Submission By requiring NEMSIS submission, you are imposing an unfunded mandate and a burden on EMS services. We recommend a goal of "90% of the call volume," not 90% of EMS agencies. Ninety percent of the call volume is a more realistic and achievable goal for states to meet, as many agencies are so small, with very little call volume (and few EMS personnel), that incurring the expense to purchase and maintain the software and hardware needed to collect data electronically is too much of a burden, both at the personnel level and financially.

This proposed Performance Measure is a missed opportunity to collect quality pediatric data on a national level. Requiring only NEMSIS 3 submission does not ensure quality data, as data submissions can be sent with null variables. Also, almost all states are already on track to move from NEMSIS 2 to NEMSIS 3. What is needed is a goal to ensure states are receiving 'good' data through validation and scoring of data transmission to the state. The federal EMSC Program should identify specific NEMSIS data elements to monitor/evaluate (with the goal of examining outcomes) and then set a Performance Measure to ensure validation and scoring of those identified data elements. By doing this, the NEMSIS TAC would receive version 3 data elements (by virtue of identifying specific NEMSIS 3 elements), as well as ensuring that better-quality data is being submitted.

Comment: Why is EMS data submission a requirement for a grant focused on pediatric care? While I agree that good data is important, this requirement falls outside the scope of most (if not all) EMS for Children programs. The focus of the program should remain on pediatric emergency care, not on areas where we have little or no actual impact or power to make effective change.

Comment: Recognizing that some states are not moving forward with NEMSIS 3, some states with a solid plan to transition to NEMSIS 3 in the near future have no major concerns with this measure.

Comment: While it may help to forward the research agenda and assist in finding gaps in pediatric care, there is a concern regarding EMSC representation at NEMSIS, and it may detract from near-future goals affecting the immediate care of children.

Comment: This measure is not a pediatric-specific performance measure. EMS data is essential in understanding trends/opportunities for improvement in the prehospital setting; however, addressing statewide EMS data systems is the responsibility of the State EMS offices. As future performance measures are developed, please take into consideration the need to ensure clear applicability to pediatric-specific efforts that are more beneficial to the pediatric community.

EMS Compass is already working on developing EMS performance measures, which will be based on the latest version of the National EMS Information System (NEMSIS) and will allow local and state EMS systems to use their own data meaningfully.

Comment: It should be recognized that each system varies greatly in its scope of practice, state and/or local jurisdiction, medical direction, and the geographic, demographic, and economic realities of the regions they serve. It is our belief that applying strict guidelines to all the diverse EMS for Children programs will not be a successful strategy. Similarly, the new proposed performance measures seem to be biased toward urban, well-staffed, and well-funded agencies while creating a heavier burden on EMS providers in rural, volunteer agencies. The mission of all EMSC is to aid their EMS providers to ensure that all children receive the best and most appropriate care in the event of an emergency; thus, performance measures should be more fluid to accommodate every agency's unique needs. For this reason, we believe proposed EMSC Performance Measure 01 should be revised.

Performance Measure 01 relates to the submission of NEMSIS version 3.x compliant data to the state EMS office, and then along to the NEMSIS Technical Assistance Center (TAC). Reliable EMS data is essential for understanding the underlying trends of pre-hospital settings and for identifying areas that could use improvement; however, it is not within the scope of the EMS for Children programs to oversee the data collection or police data submissions for their state. The process of becoming NEMSIS version 3 compliant is a job for State EMS offices, which may have more funding and/or more staff to devote to this goal. In short, while this is a worthy goal for national EMS in general, it is not pediatric specific and thus should not be the sole responsibility of EMS for Children programs.
EMSC has responded to all comments on Measure EMSC 01 with revisions and the following FAQs:
NEMSIS-SPECIFIC Performance Measure:

Q: Aren’t these new EMS for Children performance measures unfunded mandates?


A: No, these measures were developed so the national EMSC program can obtain baseline data on how the EMS system is operating on a national level and are intended to assist states and territories in showing improvement in these areas over the lifespan of the measures. The EMSC program recognizes that not every state or territory may meet these measures.

Q: Why can’t the measure be “NEMSIS v3 data is being collected on 90% of the call volume in the state” rather than from 90% of agencies?
A: The program did consider using call volume versus percentage of agencies. One of the reasons that the program decided on using the percentage of agencies was that many states and territories may not know the total call volume but could reasonably know the number of agencies in their state. The program is concerned that if this measure captured the percent of call volume rather than the percent of agencies, the program would not understand what was happening at small agencies. For example, by using call volume, the measure could be considered biased toward rural states with a few urban areas that potentially have 90% of the call volume. This data is important for developing program planning to address the needs of all EMS agencies.
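The arithmetic behind this trade-off can be illustrated with a small hypothetical sketch (Python); the agency records and call counts below are invented for illustration and are not drawn from the measure or from any state data. It shows how a denominator based on call volume can look nearly complete while most agencies are not reporting, which is the bias described in the answer above.

def nemsis_coverage(agencies):
    # agencies: list of dicts with 'submits_v3' (bool) and 'annual_calls' (int).
    submitting = [a for a in agencies if a["submits_v3"]]
    pct_agencies = 100 * len(submitting) / len(agencies)
    total_calls = sum(a["annual_calls"] for a in agencies)
    pct_call_volume = 100 * sum(a["annual_calls"] for a in submitting) / total_calls
    return pct_agencies, pct_call_volume

# Made-up example: two large urban agencies submit v3 data; eight small rural agencies do not.
example = ([{"submits_v3": True, "annual_calls": 45000}] * 2
           + [{"submits_v3": False, "annual_calls": 400}] * 8)
print(nemsis_coverage(example))  # roughly (20.0, 96.6): call volume looks nearly complete
                                 # even though only 20% of agencies report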
Q: My state may never convert to NEMSIS v3, so can the measure be rewritten to eliminate the version 3 part of the measure?
A: Unfortunately, no; the program is committed to assisting our federal partner, the National Highway Traffic Safety Administration (NHTSA), in its efforts to create a national databank of EMS data. In addition, beginning January 1, 2017, the NEMSIS Technical Assistance Center (TAC) will no longer accept NEMSIS v2 data from states and territories. As a result, the program wants to be current with national standards.
Q: My state does not license EMS agencies; does this mean that I don’t have to report data on this measure?
A: Even if your state or territory does not license EMS agencies, you still have to report on this measure. The intent of this measure is to determine how many agencies in the country are submitting NEMSIS v3 data, whether those agencies are licensed at a state, local, or some other level. If the EMS agencies in your state submit NEMSIS v3 data in 2017 (the expected first round of data collection on these new EMS for Children measures) then you should report the number of EMS agencies that submit NEMSIS v3 data to the State EMS office. Your state EMS Data Manager should be able to assist you with these numbers.
Q: The NEMSIS measure does not go far enough to improve the quality of the EMS data that is being submitted. Can the measure be revised to include a list of pediatric data elements, data validation, and scoring tools?
A: This is a great idea and is something that will be considered for development in the future. For the next five years, the EMS for Children program is interested in knowing how many agencies submit v3 data in order to have a baseline of EMS data collection numbers.
Q: This measure is not pediatric-specific and is out of scope of the EMS for Children program. Can it be eliminated as a measure?
A: No. The EMS for Children Program has participated in the development of a national EMS data system since its inception and believes this is an important area for performance measurement and improvement. Past Funding Opportunity Announcements (FOAs) for the State Partnership Grants have allowed grantees to use grant funds to support the EMS data infrastructure in their states, so this measure is in line with past EMS for Children efforts and within the scope of the program. In addition, since NEMSIS collects data on patients of all ages and does contain pediatric-specific variables, pediatric patients are included.
Change/ addition to wording,Definition added
144 1/6/2016 EMSC National Association of State EMS Officials (NASEMSO), Pediatric Emergency Care Council (PECC)
EMSC 02 Pediatric Emergency Care Coordination The "Recommended Roles" listed for a Pediatric Emergency Care (PEC) Coordinator in an EMS agency are more extensive than those for a PEC Coordinator in an Emergency Department, and yet this is an unfunded addition/position to a voluntary EMS agency. In the very small, voluntary EMS agencies with very low call volume (<50 calls/year) that are finding it difficult to even staff an ambulance, it is unrealistic to assume a PEC Coordinator with all the recommended roles.

This, and EMSC Performance Measure 01, assumes most EMS agencies are large, robust entities, which most are not. If this measure allowed for a regional model for a PEC Coordinator, rather than only at the agency level, it would allow for the pooling of resources for resource-poor EMS agencies. States that already use a regional model could more easily incorporate resource-intensive initiatives like this when resources are pooled. Utilizing a broader, regional model also assists with consistency and quality assurance, which, for larger states, is an issue.

Additionally, the first bullet under Recommended Roles is that the PEC Coordinator "ensure the pediatric perspective is included in the development of EMS protocols". In states with mandated protocols, the PEC Coordinator would potentially have no say in the pediatric content of the mandated protocols.

Comment: The Performance Measure creates a burden for EMS agencies, particularly those with the fewest resources (small, volunteer, rural, etc.). I feel that the result of this would be (1) failure to meet the measure, or (2) agencies will assign the duties to someone, without any expectation of meeting all the criteria. In either case, the intent of the measure (improving pediatric pre-hospital care) will not be met.

In addition, since we are trying to create and implement measures with validated science behind them, I feel it necessary to state that I have yet to see any studies that indicate that this measure impacts care at the AGENCY level. This concept (the Pediatric Emergency Care Coordinator) has been imported from hospital ED studies to the pre-hospital arena, without any proof of its effectiveness there. Bad science does not make good programming.

Comment: There is high turnover in EMS agencies for these types of positions. The state EMS Office is constantly updating the data regarding training officers, agency leads, emergency managers and instructors. Adding another position will be administratively challenging and also not within the realm of the EMSC program. We suggest that the goal be modified to allow for a designated individual that works with EMS agencies within a county or preferably a multi-county region. That would ease administration and facilitate better communication with a smaller group. It would also ease the burden on our rural agencies that have challenges with personnel retention and recruitment.

Comment: We feel this is a great idea, but there is an issue with the financial obligations, especially for smaller agencies, and it requires a high level of commitment to find the right person.

Could this be done by a volunteer subject matter expert? HRSA needs to spin this so that it appears to be a clearly positive step. There should be a toolkit so that agencies will have a job description, QA tool, and orientation process. This position could function under the guidance of the Medical Director at the state/local level. Could this somehow be tied to the new health care coalitions being established?

Comment: The first introductory paragraph to the survey questions lists various types of activities for the designated individual. The only indication that the list is not a requirement (in part or in total) is the use of the word "could." This is a very subtle way of indicating that the list is not a requirement and may be easily overlooked by the provider completing the survey. It is important to place more emphasis within the sentence on the fact that this list is simply a list of potential roles and responsibilities for the designated individual. This clarification will result in less confusion and better, more accurate data. There needs to be flexibility with how states can meet this performance measure.

We think the measure means well, but it is the interpretation of the validation of a state as to having met a measure that we have an issue with. The "ultimate goal" is improved pediatric patient care outcomes, so that should be the theme throughout the process.

Suggestions for the list of potential roles and activities:
-"Promote the adoption of family centered care policies." this was in the original IOM suggestions for the activities.
-Add the word "injury" so that the activity reads "Promote agency participation in pediatric injury prevention programs."
-Add the word "protocols" so that the activity reads: "Ensure that fellow providers follow pediatric clinical practice guidelines/protocols." Many states have protocols therefore adding the word will preclude confusion and ensure more accurate data.

The proposed survey includes a question regarding the development of EMS protocols. In many states, this role is not available to providers as mandated protocols are developed at a state or regional level. A third option for the response should be provided to reflect this situation as a way to minimize confusion and gain better, more accurate data.

Add additional information to the statement "Oversee pediatric process improvement" so that the survey respondent understands what this entails and how it differs from "Ensure that fellow providers follow pediatric clinical practice guidelines/protocols."

Comment: In some states, "coordination of care" at the EMS agency level is held closely by the Operational Medical Directors (OMD), or similar group, for the agency, on whom the responsibility of field care is borne by extension with the OMD's license to practice medicine. We would respectfully suggest a more suitable role for an EMS provider to accomplish the intent of EMSC Performance Measure 02 would be a Pediatric "Advocate" for the EMS agency, with some focused but minimal change in the wording of suggested responsibilities.

Many states have high percentages of volunteers staffing their EMS agencies, and to many, pushing for a "coordinator of pediatric care" would be seen as requiring an additional unfunded role, one that many agencies do NOT have an existing member suitably trained, compensated, or even available to undertake. The description of the role of the "coordinator" uses words like "ensure" and "oversee" which will not be acceptable to the agencies with whom we are nurturing relationships in order to achieve these performance measures. We would be coming off as being in an "ivory academic tower," especially to smaller, numerous rural providers, and we cannot realistically mandate in regulation that these agencies name someone to assume this role with the current wording. They would immediately enlist legislators to come to their assistance to block us, even if their medical directors did not. Please consider replacing "coordinator" with "advocate".
EMSC has responded to all comments received regarding this measure with revisions and the following FAQ:
Q: Does the PECC need to be on staff at the EMS agency?
A: No. Ideally, the Pediatric Emergency Care Coordinator (PECC) should be a member of the EMS agency and be familiar with the specific day-to-day operations and needs of the agency. Some states/territories utilize county or regional models of emergency care; if there is a designated individual who coordinates pediatric activities for a county or region, that individual could serve as the PECC for one or more individual EMS agencies within the county or region.
Q: Will there be a toolkit available for EMS agencies which provides a job description for a PECC?
A: Yes, the EMS for Children Program resource centers will develop toolkits, fact sheets, and webinars to assist State Partnership Grantees in the implementation of the new performance measures.
Q: Can you add the word ‘injury’ to the PECC role so that it reads ‘promote agency participation in pediatric injury prevention program’?
A: No. As written, the specific function does not exclude injury but rather encompasses all types of prevention programs. Injury prevention is just one type of prevention activity that a PECC could engage in; other prevention programs can include asthma or other childhood illnesses, so the EMS for Children program wants to keep the role more broadly defined. In addition, the EMS for Children program wanted to be consistent with what is recommended in the IOM report, “Emergency Care for Children: Growing Pains” (2006).
Change/ addition to wording,Definition added
145 1/6/2016 EMSC National Association of State EMS Officials (NASEMSO), Pediatric Emergency Care Council (PECC)
EMSC 03 Use of Pediatric-Specific Equipment Like EMSC Performance Measure 02, the scoring method with the proposed rubric is unrealistic for smaller, voluntary agencies. The evaluative rubric states a provider must demonstrate his/her skill in each of the three methods (skill station, case scenario, and field encounter). A voluntary EMS provider for a small-volume agency may never see a pediatric patient within a year (or two or three); therefore, requiring EMS providers to demonstrate skills via a field encounter is not realistic or achievable. Has the rubric been validated? In the HRSA webinar, you reference the Lamer, et al. paper that states a paramedic treats a teen on average once every 625 days, a child every 958 days, and an infant every 1087 days. Using this cited reference, how can this Measure expect that an EMS provider will demonstrate the skill even once every two years in a field encounter, or more frequently (annually or biannually) as the rubric requires?

Additionally, there is concern at the state level about the competency or credentialing of the person who is evaluating EMS providers' use of equipment. In some states, education and training of EMS providers is controlled at the state level, and an educator has to go through state training to become a Certified Instructor who can then attest that training meets a core standard. This proposed Performance Measure would allow a non-certified instructor to attest to a provider's competency without knowing the competency of that evaluator.

Comment: This measure creates a significant burden on those agencies most in need of effective pediatric pre-hospital education (small, volunteer, rural, etc.). These agencies tend to struggle with receiving quality pediatric education of ANY type. Tasking agencies with this also seems counter-productive for states that have approved continuing education sites. These smaller agencies are reliant on others to provide continuing education. They are at the mercy of the skill level of educational sites outside of their command structure, and this requirement is difficult to understand in this context.

The low volume of pediatric patients for many agencies renders the "field encounter" measurement meaningless.

Comment: The burden of meeting this requirement is placed on agencies over which we have no control. There is no incentive for EMS agencies to have this structured skill proficiency in place unless it is mandated through rule. They will see it as an unfunded mandate if it becomes a regulation. This measure is an added burden, especially for rural agencies. It considerably increases the amount of work for state EMSC personnel to monitor compliance with this measure. Will NEDARC collect the data through surveys of our EMS agencies? Will this be done annually? How will the self-reported information be verified? We see our agencies struggling most with this performance measure. A field encounter for rural agencies is not feasible. The rural agencies in our state count their blessings to have at least two EMTs available to respond on a call. Both will be providing care to the patient when they have a pediatric patient. This will be challenging for our rural EMS agencies because of the lack of patient volume. We suggest starting small with case reviews and skill proficiency within their certification/licensure period. We also suggest omitting the field encounter.

Comment: We are concerned about the state level of the competency or credentialing of the person who is evaluating EMS providers' use of equipment.

Comment: re: revised 'passing' score of "6":
While in line with the national trend toward clinical competence with regard to continued education of EMS providers, the current scoring method with the proposed rubric is biased against smaller, voluntary EMS agencies. Many states have a large portion of first response/non-transporting services within their EMS system. The currently proposed metric is biased against these critical-access services; adjusting the numerator to a score of '6' or more on a 0-12 scale would allow the small volunteer and first responder communities to be included in a realistic goal of strengthening the health workforce.

Comment: re: Equipment List:
As written, this Performance Measure lends itself to wide interpretation by the respondent. In order to make the resulting data meaningful, it is important to clarify the Performance Measure and provide well-considered guidance to the services. Three areas needing clarification are: type of equipment, use of standardized courses in the service's "process", and use of the National Registry of EMT's Continued Competency Program (CCP) in the service's "process".

It is suggested that a list of example pediatric-specific equipment be developed and provided to the survey respondents to illustrate the type and scope of pediatric equipment that may be considered when answering the proposed survey questions. The following list is an example of pediatric-specific equipment that can be used in an EMS agency's competency testing. It is important to note that this list is not all inclusive, as equipment requirements will vary depending on the certification levels of the providers, local and state medical direction and jurisdiction, population densities, geographic and economic conditions of the regions, as well as other factors. With the wide variety of protocols and skills among EMS in the nation, the equipment competency should have a strong focus on the ABCs of EMS!

Examples/Suggested list includes:
Ventilation and Airway Equipment
BLS:
1. Oro- and Naso-pharyngeal airways
2. Suctioning - tips and catheters and bulb suction
3. BVM - selection mask and bag sizes
4. Supraglottic airways-selection of size, insertion technique, and confirmation of placement
5. AED
6. Child safety restraints (safety seats and other kinds of child-specific restraints)
7. IV and IO insertion
8. Weight/Length-based tape use

ALS:
9. ET tubes - selection of size, insertion technique, and confirmation of placement
10. Pleural Decompression
11. Manual Defibrillator and synchronized cardioversion

Since many services include standardized classes/courses in their "process" to ensure that providers physically demonstrate the correct use of pediatric-specific equipment, it is important that the survey include information regarding PALS, PEPP, APLS, ENPC, and NRP. All of these courses require physical demonstration of some pediatric-specific equipment skills. It is strongly recommended that use of these courses be acknowledged and allowed when answering the survey questions for PM 3.

There are many States that are allowing services to utilize the National Registry of EMT's Continued Competency Program (CCP) for certification of providers. National Registry specifically requires skill verification by the service training officer/supervisor. It is strongly recommended that the use of the CCP recertification program to verify pediatric-specific equipment skills be acknowledged and allowed when answering the survey questions for PM 3.

Comment: The scoring method is not realistic for small EMS agencies who see very few pediatric patients and who do not have the luxury of field training officers or supervisors monitoring their care. They need a definable process by which they can demonstrate specific skills on specific equipment to someone who can mentor them if their technique is not what it should be; simple and patient-centered, and not dependent upon records of past calls, etc.

Comment: Considering many factors, including the variety of issues surrounding actual EMS provider competency, the proposed performance measure may be insufficient and will not likely afford the MCHB adequate information to evaluate EMS provider competency assurance within EMS organizations, or the journey towards it. The proposed measurement as currently crafted will create a burden on EMS agencies in its collection but may fail to provide effective guidance to enable improvement. We would therefore suggest an alternative or modified performance measure, designed to more comprehensively evaluate the mechanisms in place to assure provider competency in pediatric care.

It should be further noted that the measurement of these additional areas to a high degree of specificity will likely require no more than 10-20 survey questions, significantly less than the amount of information solicited from EMS organizations under previous performance measures.
EMSC has responded to all comments on Measure EMSC 03 with revisions and the following FAQs:
Q: To achieve the skill-checking measure a state would have to reach an ‘8’ on the scale. This score is too high and unrealistic. Can you consider lowering the score for achievement?

A: Yes. After reviewing the measuring scale and supporting evidence, the program has decided that a state/territory would achieve this measure by scoring a ‘6’ or higher on the scale.

Q: The skill-checking measure is too broad. Can it be revised to include the specific pieces of equipment that the program is interested in?

A: No. At one point in the development process, the measure did include specific pieces of life-saving equipment, but as we field-tested the data collection instrument, we learned that there was variability among agencies as to what pieces of equipment were considered out of scope or whether the medical director allowed agencies to use that equipment. As a result, the program determined that knowing whether a process to skill-check exists, rather than which specific pieces of equipment agencies tested on, would help the program understand how prepared agencies are to care for children. We assume that agencies who invest the resources to skill-check will select the pieces of equipment that are most crucial or very rarely used in the field to care for children.

Q: Our state certifies EMS instructors, and only state-certified EMS instructors can teach EMS providers; how does this figure into how EMS agencies will respond to the questions in the skill-checking measure?

A: EMS agencies in states which require certified EMS instructors will need to take state regulations into consideration when they respond to the survey questions which ask about the process their agency uses to skill-check.

Q: How do national courses such as PEPP and PALS and certifications like NREMT CCP fit into the skill-checking measure?

A: As long as it is an in-person course, an agency would respond in a way that reflects the methods used in the course or certification (i.e. simulation, skill-station) and frequency of the skill-checking activities whether the process occurred via a national course or not.

Q: Does an agency have to use or adopt all three of the methods listed (skill-station, simulation, and field encounter) to meet the measure?

A: No, the measure can be achieved by using only two of the three described methods.

Q: Can HRSA consider using the state’s recertification/re-licensure period as the time period for when skills check should occur?

A: At this time, a couple of states recertify/re-license annually, which would make the performance measure target burdensome, while a couple of states recertify/re-license EMS providers every five years, which would allow a very long period of time before a pediatric skills check is required. The comment was considered, but the suggestion may not work well for states with shorter or longer recertification/re-licensure periods.

Change/ addition to wording,Definition added
146 1/6/2016 EMSC National Association of State EMS Officials (NASEMSO), Pediatric Emergency Care Council (PECC)
EMSC 04 Pediatric Medical Emergencies The minimum percent threshold (25%) to meet the goal is arbitrary and not validated. Since EMTALA requires all EDs to be able, at a minimum, to stabilize and transfer patients, some states would not include/designate hospital EDs that only stabilize and transfer pediatric patients in their regionalized system, since that is a baseline standard, although they may recognize higher-level pediatric-capable hospitals. Therefore, many larger states would not meet the 25% threshold, nor could their system(s) support this excessive number.

According to the "National Quality Forum's Evaluating Regionalized Emergency Medical Care Systems Using and Episodes of Care Approach" which is cited by HRSA in its SPROC FOA: "...the framework provides a conceptual model for emphasizing the evaluation of emergency medical care within a population or geographical region, rather than within an individual facility or single part of the system. Although earlier measurement efforts have focused on discrete parts of a system, new models should focus on evaluating the integration of the discrete service units that make up a system, and how the entire system performs. Thus, a major goal of this framework is to provide the context for evaluating the system as a whole, rather than just is component parts."

Nowhere in this statement, nor in the remaining report, does it state that the number or percentage of facilities in the system is relevant. The goal for this Performance Measure should be whether or not the state has a developed system ('yes' or 'no'), along with the continued use of the 'scale' to determine where states are in the process of developing a system.

Comment: Recommend that "statewide or regional standardized system that are able to stabilize and/or manage pediatric medical emergencies" be better defined. Is compliance with the minimum standards set forth by the AAENA/ACEP consensus document (most current version) by emergency departments considered to meed that definition, or is it more appropriate to construct a multi-level recognition/categorization/designation system?
HRSA appreciates the comments received related to the current performance measures to assure systems are prepared to stabilize and manage pediatric medical and traumatic emergencies. The comments will be used to begin the discussion as we develop the next generation of Hospital-Based performance measures. The comments are indeed very helpful and will be shared with additional subject matter experts as we work to build a consensus on what should be the next generation of EMSC hospital-based performance measures. We will keep EMSC stakeholders informed throughout the development. Change/ addition to wording,"Relates to ability to report, and should be taken into consideration when assigning measures further down the road."
147 1/6/2016 EMSC National Association of State EMS Officials (NASEMSO), Pediatric Emergency Care Council (PECC)
EMSC 05 Pediatric traumatic emergencies 1. --- The minimum percent threshold (50%) to meet the goal is, like EMSC Performance Measure 04, arbitrary, unvalidated, and excessive for larger states. For example, a state could have 4% of its hospitals designated as pediatric trauma centers and, with a rationalized system, could determine that this amount provides sufficient geographic and coordinated coverage. A 50% threshold would not necessarily be a coordinated, rationalized, or sustainable system that could be supported, or necessary. As with EMSC Performance Measure 04, the number or percentage of hospitals should not be the evaluative measure.

2. --- The problem here seems to be in not defining "statewide or regional standardized system that recognizes hospitals that are able to stabilize and/or manage pediatric trauma," which makes the current wording and the target percentage appear arbitrary and excessive. We also wonder how the 50% figure was chosen and/or validated, as we would argue that some states with robust trauma systems that do address pediatrics would not be able to meet this 50% metric.

Advocating for "rationalization" of care, then requiring that half of hospitals achieve "recognition" of specialized pediatric trauma capabilities seems to be sending a mixed message. Virginia does not want 50% of hospitals to be designated as trauma centers - it would make no sense. In order to be licensed in some states' code, every hospital must agree to honor statewide trauma triage guidelines that have been developed through the Trauma System Oversight & Management Committee and approved by the Board of Health (which now contain specific pediatric components), and the hospital must transfer or redirect patients meeting those criteria to a designated trauma center. So, the hospitals DO participate in such a system as the performance measure implies - they are not formally designated or recognized like the designated trauma centers (other than by licensure). Unfortunately, we would never be able to achieve the 50% goal stipulation with the way Performance Measure EMSC 05 is currently worded.
HRSA appreciates the comments received related to the current performance measures to assure systems are prepared to stabilize and manage pediatric medical and traumatic emergencies. The comments will be used to begin the discussion as we develop the next generation of Hospital-Based performance measures. The comments are indeed very helpful and will be shared with additional subject matter experts as we work to build a consensus on what should be the next generation of EMSC hospital-based performance measures. We will keep EMSC stakeholders informed throughout the development. Change/ addition to wording,"Relates to ability to report, and should be taken into consideration when assigning measures further down the road."
148 1/6/2016 CSHCN The Center for Comprehensive Care & Diagnosis of Inherited Blood Disorders Santa Ana, CA Not Related to Measure
Comment Overview: In general, we support the proposed revisions. Notable improvements are HRSA’s reducing the number of measures required, Yes/No response options, and willingness to use automated collection techniques. Reduced measures will promote thoughtful choice of specific measures for each grant program, and thereby foster the value of each measure to promote HRSA’s overall purposes for these measures: grantee monitoring, program planning, performance reporting, and demonstrating alignment between MCHB discretionary programs and the MCH Title V Block Grant program. Our comments will specifically address: 1) the necessity and utility of the proposed information collection for the proper performance of the agency’s functions, 2) the accuracy of the estimated burden, 3) ways to enhance the quality, utility, and clarity of the information to be collected, and 4) the use of automated collection techniques or other forms of information technology to minimize the information collection burden.
Noted. This can be taken into consideration when we assign measures later this year. "Relates to ability to report, and should be taken into consideration when assigning measures further down the road."
149 1/6/2016 CSHCN The Center for Comprehensive Care & Diagnosis of Inherited Blood Disorders Santa Ana, CA CB 2 Technical Assistance Ways to enhance the quality, utility and clarity of the information to be collected: In general, we recommend adding definitions for each form and measure’s key terms. As definitions may vary, we recommend HRSA seek guidance from nationally recognized agencies, such as the Institute of Medicine or the Agency for Healthcare Research and Quality, to determine the definitions. Good starts are at CB 2 (technical assistance), which is defined under the Tier 2 measure. But health equity and QI, for example, are core measures that would benefit from definitions of those key terms.
Definition: Health equity is defined as the attainment of the highest level of health for all people. It is the removal of any and all differences (disparities) in health that are avoidable, unfair, and unjust. It requires valuing everyone equally with focused and ongoing societal efforts to address avoidable inequalities, historical and contemporary injustices, and the elimination of health and health care disparities. Definition added
150 1/6/2016 CSHCN The Center for Comprehensive Care & Diagnosis of Inherited Blood Disorders Santa Ana, CA CB 6 Products Automated collection techniques: The proposed measure CB 6 (Page 47, Attachment B), percent of programs supporting the development of informational products, is important but currently burdensome to collect. We applaud your proposal to not require full details of each specific product. However, requiring zero descriptors under Tier 2 eliminates valuable opportunities to examine and potentially use these products. Proposed solutions: allow respondents to optionally input the web addresses of products, and provide a fillable PDF for grantees. Furthermore, Tier 3 for this measure would benefit from a PDF fillable option linked to the above Tier 2 recommendation. Most of this is possible and appropriate in the related Products form, so no changes made to this measure.

No changes neccessary
151 1/6/2016 CSHCN The Center for Comprehensive Care & Diagnosis of Inherited Blood Disorders Santa Ana, CA Core 1,Core 2,Core 3 Grant Impact, Quality Improvement, Health Equity – MCH Outcomes Necessity and utility of the proposed information collection for the proper performance of the agency’s functions: We see both necessity and utility of the newly redesigned proposed measures,
specifically the three core measures required of all grantees [meeting stated aims, quality improvement (QI), and health equity]. QI and health equity data, uniformly collected as proposed, will provide new and valuable information that documents trends in the breadth and depth of these efforts. These data will be useful not only to HRSA, but also to the individual grantees, who could use these data to identify potential partners for future collaborations to advance QI and health equity efforts.
Noted. This can be taken into consideration when we assign measures later this year. "Relates to ability to report, and should be taken into consideration when assigning measures further down the road."
152 1/6/2016 CSHCN The Center for Comprehensive Care & Diagnosis of Inherited Blood Disorders Santa Ana, CA Not Related to Measure
Burden Accuracy: The accuracy of the burden estimate is dependent upon the scope and size of the grant program; hence, it will be difficult to measure and should not be applied uniformly to all programs.
Within a small program, the estimated 41 burden hours per response can be roughly accurate. However, within a larger program, the amount of time needed to review the instructions; to validate and verify information; to train personnel and to be able to respond to a collection of information; and to transmit or otherwise disclose the information can be time consuming. We recommend adding additional requirements such as type of program, size of the population and number of collaborators, in order to accurately determine the burden.
Noted. This can be taken into consideration when we assign measures later this year. "Relates to ability to report, and should be taken into consideration when assigning measures further down the road."
153 1/6/2016 Workforce Development/ Training UAB School of Public Health Birmingham, AL Core 2 Quality Improvement Core 2 PM – Goal 2: Quality Improvement – Because some projects are primarily academic institutions, their QI initiatives might have a student or academic focus. I suggest changing the wording of “Improve client satisfaction” to “Improve client satisfaction/outcomes” as, in some cases, their “clients” are often their students and those projects might be focusing on something besides satisfaction (like graduation rates, MCH competency knowledge/skills, etc.) Yes

Change/ addition to wording
154 1/6/2016 Workforce Development/ Training UAB School of Public Health Birmingham, AL Core 2 Quality Improvement “Table 1: Activity Data Collection Form for Selected Measures (PROPOSED)” (Attachment B: Detail Sheets|51) will be VERY DIFFICULT to report, primarily because it is difficult to know what the difference is between “Providers/Professionals” and “Community Partners”, for instance. Frequently, the “Consumers” of our TA efforts are both Providers/Professionals AND Community Partners, which would make them eligible for all 3 categories. In addition, for academic programs that provide TA to many groups through
individual faculty, collecting this information will be cumbersome.
Comment is about Table 1, not Core 2. Addressed elsewhere Definition added
155 1/6/2016 Workforce Development/ Training UAB School of Public Health Birmingham, AL Training 1 MCH Training Program Family Member/Youth/Community Member participation On the detail sheet for “Family/Youth/Community Engagement in MCH Training Programs”, the word “community members” needs to be made plural in boxes 1, 3, and 4 so that those boxes are consistent with boxes 2 and 5. To clarify, in box 2, “population served”, in some cases, often means “students”. Also, I suggest adding “students/trainees” to the end of box 5 and expanding the word “staff” to “faculty/staff”. As it’s currently written, it’s very service organization oriented. Since the primary focus of many projects is as academic graduate training programs, I would think HRSA would want to know if they've got family members/youth/community members working with their training programs to provide training to their faculty/staff and students/trainees. Community members made plural; expanded wording to faculty/staff; added students/trainees to Box 5 as suggested. Change/ addition to wording,"Grammar/ spelling/ error issue, now fixed"
156 1/6/2016 Workforce Development/ Training UAB School of Public Health Birmingham, AL Training 2 MCH Training Program Cultural Competence On the detail sheets for “Cultural Competence in MCH Training Programs”, I LOVE that boxes 4 and 5 have been expanded to “staff/faculty”, not just “faculty”. No resolution needed No changes neccessary
157 1/6/2016 Workforce Development/ Training UAB School of Public Health Birmingham, AL Training 6 Demonstrate Field Leadership On the data collection forms for “Field Leadership”, both 2 and 5 years after training completion, different language for D. and E. (I. and J.) is used. For D. and I., the term “demonstrating field leadership” is used and, for E. and J., the term “demonstrating MCH leadership” is used. This is confusing! These data collection forms are for the “Field Leadership” goal, so it seems that both should use consistent language (“demonstrating field leadership”). Revised language around field leadership so that it is consistent across the measure Change/ addition to wording
158 1/6/2016 Workforce Development/ Training UAB School of Public Health Birmingham, AL Training 9 Interdisciplinary Practice On the data collection form for “Interdisciplinary Practice”, this is the language used: “The number of long-term trainees who WORK…”. However, on the data collection form for “Field Leadership”, this is the language used: “Number of trainees that HAVE PARTICIPATED…”. The language used for Interdisciplinary Practice needs to be changed to “The number of long-term trainees who HAVE WORKED…” for consistency’s sake.

Also, the language for “Field Leadership” can be translated “Have you done this?”, but the language for “Interdisciplinary Practice” is translated “Are you doing this right now?” If the language remains unchanged, the data collected for Interdisciplinary Practice will not represent what HRSA wants to
know.

On a separate note, if we’re contacting former trainees 10 years after training completion, it’s just not that big of a deal to ascertain their field leadership along with their interdisciplinary practice. Why wouldn’t we collect both at 10 years rather than only collecting interdisciplinary practice at 10 years? Additionally, the required information for interdisciplinary practice is vague. We can glean this information from our graduates, but a clearer explanation for these data points might be helpful in asking the questions.
The specific interdisciplinary skills added to the measure will allow MCHB to demonstrate the specific interdisciplinary skills that trainees exhibit. This list was developed in coordination with the DMCHWD performance measure workgroup. No changes neccessary
159 1/6/2016 Workforce Development/ Training UAB School of Public Health Birmingham, AL Training 13 Policy Development, Implementation, and Evaluation On the sheet for “Category #2: Participation in Policy Change and Translation of Research into Policy”, the wording of the “If yes…” statement in boxes 5 and 6 needs to be made consistent with box 4. Both need to be changed to “If yes, indicate the policy arenas to which they have contributed”. Left alone, they are confusing and grammatically incorrect. Recommendation adopted, change made on Training 13 detail sheet. Change/ addition to wording,"Grammar/ spelling/ error issue, now fixed"
160 1/5/2016 CSHCN American College of Medical Genetics and Genomics

Bethesda, MD CSHCN 1 Family Engagement Add Tracking and Monitoring to Tier 2 to emphasize the importance of data collection around family engagement.

Add regional to the geographic units included in Tier 4. This addition would recognize that some activities can be more efficiently achieved on a regional basis.

While desirable to have racial and ethnic data on family CSHCN leaders, how feasible is it to obtain this information? Perhaps collecting data to show that affected individuals and families are engaged as CSHCN leaders would be easier to report.
Tracking and surveillance was added to domain measures.

Not all sections of measures will be assigned to all grantees.
Change/ addition to wording,"Relates to ability to report, and should be taken into consideration when assigning measures further down the road."
161 1/5/2016 CSHCN American College of Medical Genetics and Genomics

Bethesda, MD CSHCN 2 Access to and Use of Medical Home Table 1 is to be used to report activities. It would be helpful to clarify where local public health activities should be counted. As currently constructed this table has the columns of Community Partners separate from State and National. We recommend that HRSA distinguish between governmental and nongovernmental partnerships.

Tier 4 could be enhanced by including other performance measures, e.g., promoting a framework for medical home, increasing the number of medical homes, or improving care coordination with specialists.
Refer to comment 80 Addressed in other similar or identical comment.
162 1/5/2016 CSHCN American College of Medical Genetics and Genomics

Bethesda, MD LC 1 Adequate Health Insurance Coverage Add Tracking and Monitoring to Tier 2 to emphasize the importance of data collection around health insurance coverage.
Similarly, add Tracking and Monitoring to the LC1 Data Collection form.

In Tier 4, it would be helpful to provide a definition for adequate health insurance coverage.
Comment is the same as 141. Addressed in other similar or identical comment.
163 1/5/2016 CSHCN American College of Medical Genetics and Genomics

Bethesda, MD CB 1 State capacity for advancing the health of MCH populations In Tier 3, the list of State agencies should separate Newborn Screening (NBS) from Genetics as each is an important partner to HRSA/MCHB. Because HRSA has begun to emphasize genetics across the lifespan, it is critical to create distinct categories for NBS and Genetics. Through this ontology, we believe that HRSA will have additional insights about the extent to which genetics is being addressed by its grantees. Same as above. Addressed in other similar or identical comment.
164 1/5/2016 CSHCN American College of Medical Genetics and Genomics

Bethesda, MD CB 2 Technical Assistance Genetics is missing from the list of MCH priorities. Genetics should be added to Tier 3 and to the Data Collection Form for CB2.

As currently constructed the Data Collection Form for CB2 has the columns of Community/Local Partners separate from State or National Partners. We recommend that HRSA distinguish between governmental and nongovernmental partnerships.

The definition of Technical Assistance is well done. We applaud HRSA for recognizing that this is a collaborative activity that can be done on a regional basis.
Yes.

Change/ addition to wording
165 1/5/2016 CSHCN American College of Medical Genetics and Genomics

Bethesda, MD CB 3 Impact Measurement This is an opportunity to ask grantees about the State and national data sources that they are using to assess their activities and impact. It could give HRSA data on the use of the National Survey of Children’s Health, birth defects registries, etc. These data would help support the importance of taxpayers’ investment in these State and national data resources. Same as 135

Addressed in other similar or identical comment.
166 1/5/2016 CSHCN American College of Medical Genetics and Genomics

Bethesda, MD CB 6 Products Tier 3 should also include some measure of use of these products. The NCC/RC system uses number of unique visits and home page visits to measure the use of its Internet resources. Impact factors of publications might be another metric to consider. We are assessing how it is disseminated, not how many people are reached by it.

Addressed in other similar or identical comment.
167 1/5/2016 CSHCN American College of Medical Genetics and Genomics

Bethesda, MD Core 2 Quality Improvement We applaud HRSA for recognizing cross-sectoral collaboration across multiple organizations in Tier 2. We suggest that an additional aim of this type of collaboration might be improved coordination across MCHB funded programs. Same comment as 78. Addressed in other similar or identical comment.
168 1/5/2016 CSHCN American College of Medical Genetics and Genomics

Bethesda, MD Core 2 Quality Improvement Add Tracking and Monitoring as a new row. Data collection and analysis is sufficiently distinct from quality improvement to warrant its own row. As indicated in our comments on CSHCN 1, we recommend that HRSA distinguish between governmental and nongovernmental partnerships in the column headings. Yes to tracking and monitoring, but not in Core 2. Change/ addition to wording
169 12/31/2015 EMSC Virginia EMSC
EMSC 02 Pediatric Emergency Care Coordination In Virginia, "coordination of care" at the EMS agency level is held closely by the Operational Medical Directors (OMD) for the agency, on whom the responsibility of field care is borne by extension of the OMD’s license to practice medicine. We would respectfully suggest that a more suitable role for an EMS provider to accomplish the intent of EMSC Performance Measure 02 would be as a Pediatric "Advocate" for the EMS agency, with some focused but minimal changes in the wording of suggested responsibilities.

75% of Virginia’s EMS agencies are staffed with volunteers, and to many, pushing for this "coordinator of pediatric care" would be seen as requiring an additional unfunded role, one for which many do NOT have an existing member suitably trained, compensated, or even available to undertake it. The description of the role of the "coordinator" uses words like "ensure" and "oversee", which will not be acceptable to the agencies with whom we are nurturing relationships in order to achieve these performance measures. We would come off as being in an "ivory academic tower", especially to our numerous smaller rural providers, and we cannot realistically mandate in regulation that these agencies name someone to assume this role with the current wording. They would immediately enlist legislators to come to their assistance to block us, even if their medical directors did not. Please consider replacing "coordinator" with "advocate".
EMSC has responded to all comments received regarding this measure with revisions and the following FAQ:
Q: Does the PECC need to be on staff at the EMS agency?
A: No. Ideally, the Pediatric Emergency Care Coordinator (PECC) should be a member of the EMS agency and be familiar with the specific day-to-day operations and needs of the agency. Some states/territories utilize county or regional models of emergency care; if there is a designated individual who coordinates pediatric activities for a county or region, that individual could serve as the PECC for one or more individual EMS agencies within the county or region.
Q: Will there be a toolkit available for EMS agencies which provides a job description for a PECC?
A: Yes, the EMS for Children Program resource centers will develop toolkits, fact sheets, and webinars to assist State Partnership Grantees in the implementation of the new performance measures.
Q: Can you add the word ‘injury’ to the PECC role so that it reads ‘promote agency participation in pediatric injury prevention program’?
A: No. As written, the specific function does not exclude injury but rather encompasses all types of prevention programs. Injury prevention is just one type of prevention activity that a PECC could engage in—other prevention programs can include asthma or other childhood illnesses, so the EMS for Children program wants to keep the role more broadly defined. In addition, the EMS for Children program wanted to be consistent with what is recommended in the IOM report, "Emergency Care for Children: Growing Pains" (2006).
Change/ addition to wording; Definition added
170 12/31/2015 EMSC Virginia EMSC
EMSC 03 Use of Pediatric-Specific Equipment The scoring method is not realistic for small EMS agencies who see very few pediatric patients, and who do not have the luxury of field training officers or supervisors monitoring their care. They need a definable process by which they can demonstrate specific skills on specific equipment to someone who can mentor them if their technique is not what it should be; simple and patient-centered, and not dependent upon records of past calls, etc. EMSC has responded to all comments on Measure EMSC 03 with revisions and the following FAQs:
Q: To achieve the skill-checking measure a state would have to reach an ‘8’ on the scale. This score is too high and unrealistic. Can you consider lowering the score for achievement?

A: Yes. After reviewing the measuring scale and supporting evidence, the program has decided that a state/territory would achieve this measure by scoring a ‘6’ or higher on the scale.

Q: The skill-checking measure is too broad. Can it be revised to include the specific pieces of equipment that the program is interested in?

A: No. At one point in the development process, the measure did include specific pieces of life-saving equipment, but as we field-tested the data collection instrument, we learned that there was variability among agencies as to what pieces of equipment were considered out of scope or whether the medical director allowed agencies to use that equipment. As a result, the program determined that knowing whether an agency has a process to skill-check, rather than which specific pieces of equipment it tests on, would help us understand how prepared agencies are to care for children. We assume that agencies that invest the resources to skill-check will select the pieces of equipment that are most crucial or very rarely used in the field to care for children.

Q: Our state certifies EMS instructors, and only state-certified EMS instructors can teach EMS providers. How does this figure into how EMS agencies will respond to the questions in the skill-checking measure?

A: EMS agencies in states which require certified EMS instructors will need to take state regulations into consideration when they respond to the survey questions which ask about the process their agency uses to skill-check.

Q: How do national courses such as PEPP and PALS and certifications like NREMT CCP fit into the skill-checking measure?

A: As long as it is an in-person course, an agency would respond in a way that reflects the methods used in the course or certification (i.e. simulation, skill-station) and the frequency of the skill-checking activities, whether or not the process occurred via a national course.

Q: Does an agency have to use or adopt all three of the methods listed (skill-station, simulation, and field encounter) to meet the measure?

A: No, the measure can be achieved by using only two of the three described methods.

Q: Can HRSA consider using the state’s recertification/re-licensure period as the time period for when skills check should occur?

A: At this time, a couple of states recertify/re-license annually so this would make the performance measure target burdensome. In addition, a couple of states recertify/relicense EMS providers every five years which would include a very long period of time before pediatric skills check would be required. The comment was considered but may not work well for the states with shorter and longer periods of recertification/re-licensure.

Change/ addition to wording; Definition added
171 12/31/2015 EMSC Virginia EMSC
EMSC 04 Pediatric Medical Emergencies Virginia does not have a problem with the goal or the performance measure, but would prefer that "statewide or regional standardized system that are able to stabilize and/or manage pediatric medical emergencies" be better defined.

Is compliance with the minimum standards set forth by the AAENA/ACEP consensus document (most current version) by emergency departments considered to meet that definition, or is it more appropriate to construct a multi-level recognition/ categorization/ designation system?
HRSA appreciates the comments received related to the current performance measures to assure systems are prepared to stabilize and manage pediatric medical and traumatic emergencies. The comments will be used to begin the discussion as we develop the next generation of Hospital-Based performance measures. The comments are indeed very helpful and will be shared with additional subject matter experts as we work to build a consensus on what should be the next generation of EMSC hospital-based performance measures. We will keep EMSC stakeholders informed throughout the development. Change/ addition to wording; Relates to ability to report, and should be taken into consideration when assigning measures further down the road.
172 12/31/2015 EMSC Virginia EMSC
EMSC 05 Pediatric traumatic emergencies The problem here seems to be in not defining "statewide or regional standardized system that recognizes hospitals that are able to stabilize and/or manage pediatric trauma", which makes the current wording and the target percentage appear arbitrary and excessive. We also wonder how the 50% figure was chosen and/or validated, as we would argue that some states with robust trauma systems that do address pediatrics would not be able to meet this 50% metric.
Advocating for "regionalization" of care, then requiring that half of hospitals achieve "recognition" of specialized pediatric trauma capabilities, seems to be sending a mixed message. Virginia does not want 50% of hospitals to be designated as trauma centers—it would make no sense. In order to be licensed now under Virginia code, every hospital must agree to honor statewide trauma triage guidelines that have been developed through the Trauma System Oversight & Management Committee and approved by the Board of Health (which now contain specific pediatric components), and the hospital must transfer or redirect patients meeting those criteria to a designated trauma center. So, the hospitals DO participate in such a system as the performance measure implies—they are just not formally designated or recognized like the designated trauma centers (other than by licensure). Unfortunately, we would never be able to achieve the 50% goal stipulation with the way Performance Measure EMSC 05 is currently worded.
HRSA appreciates the comments received related to the current performance measures to assure systems are prepared to stabilize and manage pediatric medical and traumatic emergencies. The comments will be used to begin the discussion as we develop the next generation of Hospital-Based performance measures. The comments are indeed very helpful and will be shared with additional subject matter experts as we work to build a consensus on what should be the next generation of EMSC hospital-based performance measures. We will keep EMSC stakeholders informed throughout the development. Change/ addition to wording; Relates to ability to report, and should be taken into consideration when assigning measures further down the road.
173 1/5/2016 Workforce Development/ Training Boston LEAH Program Boston, MA AH 1 Adolescent Well Visit I am not sure if AH1 is intended to be a LEAH measure. If the domain is assigned to LEAH, I would recommend requesting data for Tier 1 and 2 only.

The Tier 2 list would benefit from using the same categories as the other 2 PMs in this domain and the inclusion of some other metrics. For example, in addition to peer-reviewed publications, products should include invited reviews, commentaries, chapters and other scholarly works, many of which have significant impact on the field. The category Outreach/Information Dissemination/Education might be split into Education to include Learning collaboratives and CME/CEU/CE and Outreach to include work with professional organizations.

To the last category Referral/, I would add "Access." A new category on Research/Program Development and one on Outcomes such as Chlamydia screening would help capture components of the well visit. Although it might seem simple to know how "many are reached," these data are not available and would require significant funding and new methodology to begin to estimate. Currently, programs do not know how many individuals actually receive information through education or outreach. Similarly for Tier 4, the enrollment should include all teens; all insurers could be encouraged to report this information directly to state MCH programs. Further discussion might be helpful.
Noted. This can be taken into consideration when we assign measures later this year. "Relates to ability to report, and should be taken into consideration when assigning measures further down the road."
174 1/5/2016 Workforce Development/ Training Boston LEAH Program Boston, MA AH 2 Injury Prevention As above, I am not sure if LEAHs will report on any of the elements of AH 2. If helpful to MCHB, reporting on tier1 and 2 would be feasible whereas gathering data for Tier 3 and 4 would require a shift in methodology and significant resources either added or diverted from training.

In Tier 2, I would match to the other 2 PMs in this domain and add "Research/Program Development." For the second section of Tier 2, I would add to "Motor Vehicle traffic" a word such as "accidents" or "Policy" or "DUI". "Traumatic Brain Injury" should include "Concussion", and a category for "Opioids" should be added. "Youth violence" should include "Intimate partner violence" or "Dating violence". The age ranges are different for well visits and injuries but likely related to current data collection systems. If completed by LEAHs, the form on page 30 would need to use "Yes/No" checkboxes but not numbers of those reached (see above).
Change to "Motor Vehicle traffic crashes"
Use "Traumatic Brain Injury, including concussion"
Change to "Prescription Drug Overdose, including Opioids"
Keep just "Youth Violence" as this includes dating violence.
Change/ addition to wording
175 1/5/2016 Workforce Development/ Training Boston LEAH Program Boston, MA AH 3 Screening for Major Depressive Disorder As in the two PMs above, I am not sure if LEAH will be reporting this PM. If helpful to MCHB, the reporting should be restricted to Tier 1 and 2. The Tier 2 list would be similar to the categories in this domain of 3 PMs. On Tier 4, "treatment" needs to be defined and sources of data identified, since state administrative data would be missing services to youth provided under self pay and likely other areas as well. In addition, the wording of the current PM only indicates "screening" and not "treatment" and would need editing if a broader goal is desired. Fix Heading Under Tier 2.

Definition added; Grammar/ spelling/ error issue, now fixed
176 1/5/2016 Workforce Development/ Training Boston LEAH Program Boston, MA CB 1 State capacity for advancing the health of MCH populations The comments for this section are similar to those for the adolescent health domain above. If LEAH is included in this PM, I would restrict to reporting of information for Tier 1 and 2. Tier 2 would benefit from an "Other" category. LEAHs would need significant resources to provide estimates for Tier 3 or Tier 4. Tier 3 and 4 are best answered by an adolescent health information center or developed internally at MCHB. Comment relates to ability to report. No changes necessary
177 1/5/2016 Workforce Development/ Training Boston LEAH Program Boston, MA CB 2 Technical Assistance TA is currently reported and would be easier to report if "check all that apply" were available in both Tiers 2 and 3. As noted in AH1, data on "how many are reached" are not currently available because of the ripple effect. If completed by LEAHs, the table on page 42 would need to use "Yes/No" responses rather than estimates. Do "participants/public" include Youth/Families or Schools? Yes, Tier 2 is Check All that Apply

Change/ addition to wording
178 1/5/2016 Workforce Development/ Training Boston LEAH Program Boston, MA CB 3 Impact Measurement The overall statement is good and helpful for LEAHs using Tier 1 and 2, although the categories in Tier 2 would need additions to be relevant to goals, including leadership positions of trainees, products, CE, etc. If training grants are assigned Tier 3, then there are additional categories to add such as return on investment, focus groups, trainee feedback, qualitative analysis, and "Other". Noted. This can be taken into consideration when we assign measures later this year. "Relates to ability to report, and should be taken into consideration when assigning measures further down the road."
179 1/5/2016 Workforce Development/ Training Boston LEAH Program Boston, MA CB 4 Sustainability Not applicable to training programs. MCH is the only funding source for Adolescent Medicine training and for interdisciplinary training – a very important focus. Noted. This can be taken into consideration when we assign measures later this year. "Relates to ability to report, and should be taken into consideration when assigning measures further down the road."
180 1/5/2016 Workforce Development/ Training Boston LEAH Program Boston, MA CB 5 Scientific Publications I believe that the wording of the PM would benefit from including scholarly contributions – i.e. "the percent of programs supporting scientific discovery and scholarly products" (could also use "scholarly contributions"). Tier 1 would then need to be reworded to match the PM. The Tier 2 phrasing should also be reworded to reflect changes to Products and Publications (see below). I would delete "submitted" and include those published electronically or in print. If an article is e-published as a final product or "epub in advance" or "in press," the required categories on the MCH collection form cannot be used. Other mechanisms for Tier 2 include funding conferences, teaching writing skills, resources for publication charges, time, academic promotion, and mentorship. Similarly, the number "reached" in Tier 3 is unknown. Tier 4 should include "check all that apply" and add websites, professional organizations, books, chapters, reviews, lay organizations, and "Other." Changes were made based on comment provided. Change/ addition to wording
181 1/5/2016 Workforce Development/ Training Boston LEAH Program Boston, MA CB 6 Products For Tier 2, the wording "with grant support" needs to be clarified, or preferably deleted. Most products from training grantees depend upon faculty and trainees who may not be supported directly by the MCH grant but benefit from the role the LEAH funding plays in creating the overall interdisciplinary environment for success. Thus many of the projects and products include indirect funding but not direct MCH funding. To the Tier 2 list, reviews, commentaries, etc. should be added to Reports and monographs as noted under forms (see pages 13-14). Tier 3 "how many are reached" is unknown. Define 'Grant Supported': If they are a co-author, then yes; if it is about their program, then yes; if created within the confines of the MCHB-funded program, then yes.

Use this for CB 5 as well.
Change/ addition to wording; Definition added
182 1/5/2016 Workforce Development/ Training Boston LEAH Program Boston, MA Core 1 Grant Impact Since the PM relates to meeting aims at the "end of the current grant cycle," I interpret this to imply the question needs to be answered once every 5 years. While this interval is simple, I would prefer to also capture progress toward goals and objectives and changes in direction or programs undertaken because of changes in technology, state/federal legislation, payment models, etc. Interim assessment is captured annually in section C of the narrative, so this is perhaps duplicative. This is reported once every 5 years and allows for a view of achievement of aims across programs.

No changes necessary
183 1/5/2016 Workforce Development/ Training Boston LEAH Program Boston, MA Core 2 Quality Improvement Currently QI initiatives are required by hospital accreditation organizations, residency and fellowship Boards, the American Board of Pediatrics for recertification, payors, and others. Thus the requirement would be easily met by clinical programs but may not be applicable to SOPH and research grants. For Tier 2, for both structure and aims, "check all that apply" needs to be an option and "Other" added. Similarly for Tier 3 methodology, "check all that apply" is important. For Tier 4, I would combine to one question: "Are there data to support improvement in population health, clinic or organization metrics and processes as a result of QI activities?" An example could be included in the narrative. Check all that apply has been added. Change/ addition to wording
184 1/5/2016 Workforce Development/ Training Boston LEAH Program Boston, MA Core 3 Health Equity – MCH Outcomes This measure seems particularly applicable to clinical programs. Perhaps for SOPH, the text could add "teaching" – "Are you promoting, facilitating, or teaching about health equity..." If applicable for research grants, further rewording is needed. For Tier 2, MCHB may wish to combine topics such as race/ethnicity, gender/sex/sexual orientation, urban/rural/suburban, etc. "Check all that apply" should be added. Tier 3 – "has the program set goals" – may depend upon the type of program, and similarly for "met goals" the wording suggests these data are measured annually, in contrast to the earlier Core 1 metric which uses the phrase "at end of grant cycle".

Table 1 on page 51 should add a column for Professional Organizations/Universities. Particularly important, this measure needs explanations and examples of these titles so that everyone filling out the forms is using the same set of definitions. For training programs (with the possible exception of LEND programs, which receive more funding), these boxes should be "Yes/No" checkboxes (see page 2). Given the limited resources for LEAH training, I am hopeful that LEAHs would not be asked to record the number of services, referrals or other new data. A full time data coordinator would be needed to accurately record and catalog activities of fellows, faculty and staff and would not add to the fundamental goal to train leaders and augment the MCH workforce!
This is measured annually, and is based on grantee-specific goals.

No changes necessary
185 1/5/2016 Workforce Development/ Training Boston LEAH Program Boston, MA Training 1 MCH Training Program Family Member/Youth/Community Member participation This measure is important for capturing the involvement of youth, families, and community members in training grants. The "Yes/No" format is an improvement. There is likely overlap in the three categories since youth and family members also represent the community in many instances and the overlap could be further acknowledged in the Definition and Significance sections.

For item #4, I wonder if this item should be included only for grants in which there is a line item for "compensation." MCH LEAH grants have been level funded for 20 years and will need to reduce faculty FTEs and trainee stipends further. Perhaps this could be a LEND program item since I believe they have specific funding for this compensation. That said, we do compensate our peer leaders because we want to recruit teens from the local community who would otherwise need to get a job. In addition, labor laws also require compensation if "volunteers" are doing a "job".



Item #5 might be changed to include trainees and faculty – "Train MCH/CSHCN staff, providers, faculty and trainees" – unless the PM is meant only for training state Title V staff.

Noted. This can be taken into consideration when we assign measures later this year. "Relates to ability to report, and should be taken into consideration when assigning measures further down the road."
186 1/5/2016 Workforce Development/ Training Boston LEAH Program Boston, MA Training 2 MCH Training Program Cultural Competence This PM includes 6 "Yes/No" queries, and the shift from scoring is appreciated.

For item #3 Data, the title could be more descriptive if it was changed (from "Data") to "Research and Quality Improvement."

For item #4, I would change the text to "The grantee has programs (or initiatives) to address the cultural and linguistic and gender/sex diversity of faculty, trainees, and staff with the goal of matching the populations served." For the present, despite pipeline programs and other efforts, more time and creative projects and funding are needed for programs to achieve racial/ethnic/gender/sex diversity similar to either population percentages or client percentages.

For item #5 Professional Development, the text should add "trainees" – "…Program staff, faculty and trainees participate…"

For #6 Measure Progress, a standard assessment might be helpful in the future; I would also include other issues related to diversity including LGBTQ, disabilities, health literacy, etc. The title of this item could also change to "Measurement of Progress" to have parallel titles.
The primary focus of cultural and linguistic competence in this measure is around race and ethnicity. No revisions will be made. No changes necessary
187 1/5/2016 Workforce Development/ Training Boston LEAH Program Boston, MA Training 6 Demonstrate Field Leadership The addition of assessing the trajectory of trainees who are 2 years post MCH training makes sense for a 5 year grant cycle. It is important to realize that some LEAH trainees may still be in training and working toward a degree. However, I assume that a student status can still include leadership activities. To the list of "disseminated information…" I would add to the list in parentheses "reviews, commentaries, and chapters" (pages 70 and 72 under #1, bullet 1). If Disseminated information is the same in the other sections, do respondents get credit for more than one category? Given the concerns about "lobbying" versus education, I wonder whether the advocacy activities need more careful phrasing. There is a fine line between educating and trying to influence MCH-related legislation.

On page 71, should the text in item 3, last bullet say "influenced legislation for the benefit of MCH populations" rather than "MCH related legislation"? The biggest issue for this PM is the overlapping definitions, which either could be changed, or perhaps it would be easier to leave them the same and, in the Introduction to the section, let the respondents know that they may be checking off the same text under more than one category.
Added additional examples to "disseminated information" as suggested; Revised language around educating policymakers. Change/ addition to wording; Definition added
188 1/5/2016 Workforce Development/ Training Boston LEAH Program Boston, MA Training 7 Diversity of Long-Term Trainees This measure is fairly straightforward to fill out with one exception: the element "2 or more races" prevents recording the race of the trainee. For example, if a trainee is Black and Asian, he/she will not be included in the Black (or Asian) percentages if placed in the "2 or more" category. This could be fixed if the "2 or more" was a stand-alone "Yes/No" question, but that would produce minor discontinuity in MCH data collection. The race and ethnicity categories reflected in this measure align with the data collected as part of the U.S. Census data and adhere to the 1997 Office of Management and Budget (OMB) standards on race and ethnicity which guide the Census Bureau data collection. The race and ethnicity categories will not be revised for this measure. No changes necessary
189 1/5/2016 Workforce Development/ Training Boston LEAH Program Boston, MA Training 8 Title V Collaboration I am worried that this measure will significantly increase time spent by training project grantees in data collection and ascertainment of State Title V versus MCH-related programs, since funding of various initiatives may come from multiple sources. In addition, each category will need examples of what counts as "activities" to try to promote clarity in definitions. The definitions of TA could also be further explored; it is unclear why needs assessments "of consumers of training program services" are not included, since presumably they are part of the MCH population and may provide valuable feedback on services needed.

To keep the time spent in data collection and entry for projects sustainable, I would favor "Yes/No" responses and examples in the narrative, not adding the number of activities in this next grant cycle.

Definition for clinical activity has been added; other definitions/guidelines will be provided by Project Officers. Definition added
190 1/5/2016 Workforce Development/ Training Boston LEAH Program Boston, MA Training 9 Interdisciplinary Practice I agree that adding follow-up at 2 years makes sense given the 5 year cycles. For the trainees, I assume the list will say "check all that apply." Thus the categories will add up to more than 100%, but the use of the term "at least one…" should be easily translated to numbers and percentages for work in an interdisciplinary manner. The item with the phrase "Utilized that information…" could benefit from rewording since it is not clear what is intended. The next item would benefit from "Promoted" or "Facilitated" decision-making rather than "Established." I would also suggest adding an item that encompasses research, quality improvement, and program development, e.g. "Promoted interdisciplinary collaboration in quality improvement initiatives, research projects, and program development…" Lastly, an item should address technology, such as "Developed methods to communicate with interdisciplinary teams using technology…" or "Clarified utilization of information with interdisciplinary team to promote collaboration…" Yes - former trainee survey will say "Check all that apply"; The list of interdisciplinary skills was developed by the DMCHWD performance measure workgroup and will not be revised at this time. No changes necessary
191 1/5/2016 Workforce Development/ Training Boston LEAH Program Boston, MA Training 10 Diverse Adolescent Involvement (LEAH-specific) I am worried that this measure, by adding "activities", will significantly increase time spent by grantees in data collection and ascertainment of how to count "activities." I would suggest "Yes/No" and examples in the narrative. I would suggest that the word "parent" in the second and fourth items be replaced by "parents/families/guardians" to make the PM more inclusive. Activities column has been deleted. Change/ addition to wording
192 1/5/2016 Workforce Development/ Training Boston LEAH Program Boston, MA Training 12 Work with MCH Populations The trainee graduates answering this survey often do not understand the meaning of "MCH populations", so the definition on page 89 should be included in the survey, and an open-ended question such as "Tell us what you have been doing since graduation" included. The definition of MCH populations does appear in the former trainee survey; individual programs can make modifications/additions to the former trainee survey as they administer it to former trainees. An open-ended question has limited utility to the Bureau due to the time-intensive nature of coding qualitative, open-ended responses. No changes necessary
193 1/5/2016 Workforce Development/ Training Boston LEAH Program Boston, MA Training 13 Policy Development, Implementation, and Evaluation The "Yes/No" format works well. Advocacy versus education should be defined so lobbying is excluded. For item #2 (6th bullet) the word "non-scientific" should be deleted since all writing should have a scientific or evidence-based perspective. Item #2 (8) consider alternative wording such as "track a bill using credible Internet sites…"

Item #3 – because of the variable time in programs (a few months to 3 years), it would be helpful to define timing of post-assessment and 2-3 validated questions.

For #4-6, the local, state, national should be "check all that apply" since often overlapping.

For item #5 I would suggest defining what is meant by "MCH advocacy networks" and making sure it is not lobbying.

For #6 I would add "communicating research findings, program development, QI, qualitative studies and focus groups, etc…" and delete "(both original and non-original)".

Deleted non-scientific from bullet #2; Clarified that item #3 applies only to long-term trainees; Changed wording to "all policy areas" in items #4-6 as suggested; Question #6 was developed specifically by the DMCHWD performance measures workgroup, so no changes are recommended. Change/ addition to wording; Definition added
194 1/5/2016 Workforce Development/ Training Boston LEAH Program Boston, MA General Forms/ ADEs
These forms would benefit from some clarification and definitions of an "activity". I assume that TA activities can have more than one recipient. The forms should also allow the project to check more than one target audience and more than one location. Title V could include all other labels – i.e. TA could be "Title V", "within state", and "national". Should Title V be an additional primary target audience?

List A – we recommend adding "Other" and/or "Emerging Issue" and also "Youth Involvement"

List B – Topics to be considered for inclusion are substance abuse, health disparities, cultural/linguistic competency, faculty development, and case development, as well as "Other" and/or "Emerging Issue"

Recipient of TA/Collaborator – add "Other"?

To address the following question in the narrative, "C. In the past year have you provided technical assistance on emerging issues that are not represented in the topic list above? YES/ NO. If yes, specify the topic(s):__________________________," the data need to be collected through the use of "Other/Emerging Issue" in A. and B.

Are we collecting the total number of recipients for TA activities? As noted above, measurement of "number of people reached" in CB 2 is not possible with current resources and would benefit from further study and discussion. Some activities have 1 person reached, some have 200. For the Target audience, it is important to be able to "check all that apply."

Number of TA encounters, not people reached by TA. Other category was added; otherwise topics are consistent with other measures and MCHB investments. Definition added
195 1/5/2016 Workforce Development/ Training Boston LEAH Program Boston, MA General Forms/ ADEs,Training Forms
The form would benefit from defining an "activity" and providing multiple examples.
For List A: Add "youth" to family involvement.

See above – need to add "Other" and/or "Emerging Issues" options to answer the question on emerging issues.

I actually liked the 5-10 most noteworthy CE and the ability to highlight emerging issues and other activities, but this section could be in the narrative.
No definition will be added. No changes necessary
196 1/5/2016 Workforce Development/ Training Boston LEAH Program Boston, MA General Forms/ ADEs
The title of this section, "In Press peer-reviewed publications in scholarly journals", needs to be renamed as either "Published…" or "Published and in press…" because otherwise there is no section for "published" articles, and the "in press" (i.e. accepted but not published) articles do not have the required data elements of vol/page numbers for this section. In addition, many articles are published electronically first and then in print, others just electronically, and the citations do not include the elements on the data form. I would suggest this category be limited to "published" and the data fields be limited to two – year and "citation" – to deal with the changing landscape of epublications.

Missing entirely from this form are non-peer-reviewed, published and often invited and edited review articles, commentaries, editorials, etc., which can have a huge impact on the field and the clinician and public health workforce. There could be a separate category, or these could be added to "Reports and monographs", or just change the above first category to delete the word "peer-reviewed" and rename it as "Published publications…" or "Published and in press publications". I would favor changing to published articles that include both peer-reviewed and invited articles, but two categories is also fine.

The proposed data entry forms also further split into "primary author… published" and "contributing author… published", a distinction that does not have a good definition, is not currently included in CVs, and could result in duplicative entries for articles in which there are both contributions as primary and contributing authors from the training grantee.

I would delete "Submission(s) of peer-reviewed publications to scholarly journals" since being rejected has the potential to overinflate the number of articles. If this category is important for MCHB, I would favor a simple count – i.e. entry of a number such as "Submissions: 12" – with no further data entered.

It would be beneficial and save time if the "to obtain copies (URL)" field for publications, reports, and posters were optional, deleted, or defaulted to Pubmed or Google or other search engines; alternatively, the field could be optional and the project could enter other information if desired. Books are easily found through Google and Amazon search engines so the field should be optional. Unpublished (and submitted) articles cannot be shared or publishers will not accept the article.

The Conference presentations and posters presented category may be duplicative of CE data collection. Data are counted in both categories via different information.

Web-based products: It would be helpful to add a category that would capture health guides, as they are similar to blogs, podcasts, video, individual products

Press communications: We recommend adding online interviews. This will allow reporting of interviews done for online articles. There is rarely a title for an interview so this field should be optional.

Newsletters: the form for newsletters needs clarification. Is each issue, each year, etc. reported? For example, a newsletter might be weekly or twice a year or some other frequency, and the dates may be 2014 and 2015. Is each newsletter counted, or just entered as two entries? In the reporting period of July 1, 2014 – June 30, 2015, are there 2 entries for the newsletter with different years?

Distance Learning Modules: How should these be differentiated from web-based products/electronic products? Sub-categories are overlapping and some entries could be reported in both categories.

OER reviewed these comments, and revisions were made to include additions as appropriate while maintaining relevance for different types of programs. Change/ addition to wording
197 1/5/2016 Workforce Development/ Training The Mountain States Genetics Regional Collaborative
PIH 3 Newborn Screening Add facilitation of collaboration between states to Tier 2. Members of the MSGRC, specifically those in the Newborn Screening Work group, have previously played a role on specific issues such as emergency preparedness. Outcome measures as suggested in Tier 4 will only be obtainable with state and other partnerships.
While this is certainly important, it would not provide meaningful responses in this structure.
No changes necessary
198 1/5/2016 CSHCN The Mountain States Genetics Regional Collaborative
CSHCN 3,CSHCN 2,CSHCN 1 Transition to Adult Health Care, Access to and Use of Medical Home, Family Engagement Keep Tier 2 points on processes and mechanisms being promoted for considering projects and their outcome measures. Add facilitation of collaboration to Tier 2 measures to emphasize the importance of partnerships with other organizations and MCHB funded programs around consumer engagement, medical home and transition.

Tier 4 points on "training" and measurable links to the medical home should be incorporated into work plans and are attainable measures for both the MSGRC and NCC.

Facilitation of collaboration was added to Tier 2 measures Change/ addition to wording
199 1/5/2016 CSHCN The Mountain States Genetics Regional Collaborative
LC 1 Adequate Health Insurance Coverage Add facilitation of collaboration to Tier 2. Regional collaboratives have partnered with each other and with other organizations to review existing gaps and needs around health insurance coverage. The outcome measures in Tier 4 will be difficult for RCs to measure without such collaboration.

Not being added at this time.

No changes necessary
200 1/5/2016 CSHCN The Mountain States Genetics Regional Collaborative
CB 1 State capacity for advancing the health of MCH populations Tier 2 measures (delivery of training program, support of state strategic planning activities, provide expertise on priority topics and facilitate state level partnerships) could be developed further for RCs.
Noted. This can be taken into consideration when we assign measures later this year. "Relates to ability to report, and should be taken into consideration when assigning measures further down the road."
201 1/5/2016 CSHCN The Mountain States Genetics Regional Collaborative
CB 2 Technical Assistance Technical assistance has been a focal discussion point for NCC and the RCs and is an activity that can be achieved by MSGRC. "Genetics technical assistance" should be added as an additional topic to Tier 3.
Yes.

Change/ addition to wording
202 1/5/2016 CSHCN The Mountain States Genetics Regional Collaborative
CB 6,CB 5,CB 4 Products, Scientific Publications, Sustainability CB 4 (program sustainability) and 5 (production of scientific publications) are feasible measures. Peer reviewed publications should be included in Tier 2 measures. CB6 is currently being accomplished by MSGRC. All of these measures should be considered as a potential focus for the RCs. Noted. This can be taken into consideration when we assign measures later this year. "Relates to ability to report, and should be taken into consideration when assigning measures further down the road."
203 1/5/2016 CSHCN The Mountain States Genetics Regional Collaborative
Core 3,Core 2,Core 1 Health Equity – MCH Outcomes, Quality Improvement, Grant Impact Core 1 (meeting stated aims) and 2 (QI and outcome measurement) are both attainable for grantees. This necessitates more QA and QI to be integrated into the program.

Core 3 (equity) is also appropriate as a goal of ensuring all individuals have improved access to genetic services, information, and expertise. MSGRC has focused on underserved populations in the region in the past.
No resolution needed.

No changes necessary
204 1/5/2016 CSHCN The Mountain States Genetics Regional Collaborative
Training 2,Training 1 MCH Training Program Cultural Competence, MCH Training Program Family Member/Youth/Community Member participation Achievable and currently being sustained through MSGRC’s engagement of consumer advocates and our involvement with and emphasis on underserved populations. This should be considered a strength of the regional collaborative model. Noted. This can be taken into consideration when we assign measures later this year. "Relates to ability to report, and should be taken into consideration when assigning measures further down the road."
205 1/5/2016 CSHCN Family-Led Organization Newark, NJ Not Related to Measure
Comment Area 1: The necessity and utility of the proposed information collection for the proper performance of the agency's functions

In general, we support the stated need and proposed use of the information sought for the proper performance of HRSA and MCHB’s functions. We agree that the use of scale-based measures to convey program impact can tend to be limiting, and we generally support the structure of the proposed DGIS performance measures in providing a more thorough assessment of impact. We believe that this revision will enhance reporting and convey a more accurate picture of the diverse services that DGIS grantees provide.

No resolution needed. No changes necessary
206 1/5/2016 CSHCN Family-Led Organization Newark, NJ Not Related to Measure
Comment Area 2: The accuracy of the estimated burden

We agree with the agency’s estimate of 41 burden hours per respondent for a total of 28,700 burden hours across all reporting discretionary grantees, except for Family to Family Health Information Centers, some of whom do not have sophisticated data collection and reporting systems and for whom this data collection and reporting may therefore take much more than 41 hours. While we support the structural change in how grantees will report compliance with new performance measures, we also believe that the new revisions create a justification for this significant increase in estimated burden hours per grantee. We feel that the increased specificity of the data that each grantee must provide on performance measures, in addition to providing narratives on annual grant reports, is adequate grounds for the increase in estimated burden hours.

However, we note that the data collection required of F2Fs for a grant that is much smaller than the usual MCHB discretionary grant is comparatively much larger and more burdensome than that required of larger grants and grantees that usually have more sophisticated data systems.
Noted. This can be taken into consideration when we assign measures later this year. "Relates to ability to report, and should be taken into consideration when assigning measures further down the road."
207 1/5/2016 CSHCN Family-Led Organization Newark, NJ Not Related to Measure
Comment Area 3: Ways to enhance the quality, utility, and clarity of the information to be collected

In general, we support the overall framework for the updated DGIS Performance Measures. In particular, we support the alignment with already existing Title V Performance Measures in order to create a more cohesive picture of MCHB’s overall purpose and impact. We believe that alignment of priorities across Title V, Home Visiting, Healthy Start, and Healthy People 2020 will not only provide more useful, comparable data across MCHB, but will also provide discretionary grantees with a better project framework for meeting MCHB objectives. However, we do believe that improvements can be made in order to enhance the quality, utility, and clarity of these performance measures to ensure the collection of accurate and useful data.

Specific comments for improvement are attached to specific measures.
No resolution needed. No changes necessary
208 1/5/2016 CSHCN Family-Led Organization Newark, NJ WMH 4 Depression Screening Under "Tier 4: What are the related outcomes?" the current measurement asks for "% of women screened for depression using a validated tool". We suggest that "validated tool" be changed to "evidence-based tool."

Already within the Program Specific Measures, Division of Healthy Start and Perinatal Services, Goal 5: Perinatal Screening, the performance measure states "All HS [Healthy Start] participants should receive a perinatal depression screening using an evidence-based depression tool." This distinction is necessary to ensure that grantees are using screening tools that are consistent with those being used and promoted across all MCHB programs, such as those available through SAMHSA (Substance Abuse and Mental Health Services Administration). Furthermore, we believe the distinction "evidence-based" is important in order to promote screening that combines clinical expertise and scientific evidence with the unique perspectives of patients to ensure that the needs of the population served are being considered and met.

We also note that, although depression screening tools may be validated for certain populations, they may not be validated for other racial/ethnic/linguistic groups. We would encourage the Department to include "culturally and linguistically appropriate evidence-based tool."

Validated would include this. No changes necessary
209 1/5/2016 CSHCN Family-Led Organization Newark, NJ WMH 5 Severe Maternal Mortality/Morbidity Under this measure we suggest that an additional level of assessment be added to account for the specific populations that grantees are targeting in providing training and/or services related to maternal mortality/morbidity.

According to the American Public Health Association, some of the largest disparities in risk of maternal death are by race/ethnicity, maternal age, and income (Policy Statement 201114: Reducing Maternal Mortality as a Human Right). We believe that providing data on the populations targeted by grantees in this measure will provide MCHB with a more accurate picture of the communities at greatest risk for experiencing maternal mortality/ morbidity. We base this suggestion in part on the Improving Pregnancy Outcomes projects of the Statewide Parent Advocacy Network, which target underserved women and connect them with proper preconception, prenatal, and interconception care to reduce infant mortality. The projects, which include a focus not only on improving pregnancy outcomes but also on reducing birth defects and developmental disabilities, specifically target outreach to communities of color, immigrants, low-income, uninsured women, and other communities at risk for poor pregnancy outcomes. We believe collecting data on the types of at-risk communities that grantees target will serve to advance Goal 4 of the Maternal and Child Health Equity Blueprint Draft (p. 15) to increase access to quality MCH care and reduce disparities in access for underserved communities.



Disparities exist across all measures, so unless this was considered across the board, it is not reasonable to apply it to just this measure. No changes necessary
210 1/5/2016 CSHCN Family-Led Organization Newark, NJ CH 3 Developmental Screening Under "Tier 4: What are the related outcomes?" we suggest also adding an outcome that assesses the number of physicians/providers trained to use evidence-based/evidence-informed developmental screening tools that are appropriate for diverse populations, and who are trained to use them with diverse populations, including training to communicate effectively with parents from diverse racial, ethnic, language, and socio-economic backgrounds.
Parents can often be the most reliable source of information when it comes to their children’s development. Evidence-based screening tools that use parent information can help foster systematic communication about a child’s development and create a positive relationship between providers and families (CDC, Developmental Monitoring and Screening for Health Professionals, November 2015). Promoting developmental screening tools and educating physicians on screening and referral creates a more integrated system of care that enhances a family-centered medical home for all children. Ensuring that physicians are trained in effective communication with families from diverse backgrounds will help to reduce the gap in screening, follow-up evaluation when needed, and access to needed services by addressing the cultural barriers that too often lead to children from diverse backgrounds being "lost to follow-up" after screening reveals the need for further evaluation.

It is also important to track actual use of those screening tools, as well as the age, by at least race/ethnicity and language, at which potential problems are identified via screening and at which a diagnosis, if any, is made. This is a critical area of health disparities that can have lifelong consequences.
Missing: Access to and Use of Medical Home
We suggest that a performance measure be added to the Child Health domain to address promoting and/or facilitating medical home access for all children.

A key factor in the evolution of the medical home concept is that it has expanded beyond children and youth with special health care needs to include all children and adults. Promoting medical home access for all children, not just CSHCN, is aligned with Healthy People 2020, MICH Objective 30, to increase the proportion of all children who have medical home access. Furthermore, this performance measure would correspond with the Maternal and Child Health Equity Blueprint Draft goals to increase access to quality MCH care (Goal 4, p. 15) and strengthen MCH systems of care (Goal 5, p. 16). Facilitating medical home access for all children ensures we still reach non-CSHCN populations that would greatly benefit from coordinated, family-centered care. We recommend that this performance measure include a component that continues to allow us to gauge the extent to which CSHCN have access to a medical home within the overall population of children.

Missing: Family Engagement
We suggest that a performance measure be added to the Child Health domain to address promoting and/or facilitating family engagement in children’s health systems.
_x000D_
MCH programs are most successful when they engage the families impacted by the policies, systems, and services they promote at all levels. Including families at the policy and planning level and engaging parent leaders and parent leader organizations has proved to be a successful and efficient strategy in CSHCN systems. Engaging families ensures that MCH services are properly targeted and that resources are not unnecessarily wasted. Family Voices has already begun to explore engagement of family leader organizations regarding non-CSHCN measures within Title V. This measure is aligned with the Healthy People 2020, 10 Essential Public Health Services; as well as the Maternal and Child Health Equity Blueprint Draft goal to strengthen MCH systems of care (Goal 5, p. 16)._x000D_
It is important to note that family engagement is a shared responsibility that also requires partnerships with family leaders, family organizations including Family to Family Health Information Centers and Family Voices State Affiliate Organizations, EI/education-focused parent centers, parent to parent programs, parent advisory councils, Federation of Families for Children ™s Mental Health chapters, community-based organizations and immigrant organizations that serve diverse families, etc._x000D_
Family leaders from diverse backgrounds and family-led organizations can play critical roles in helping health institutions and professionals understand how to more effectively engage, support, and partner with diverse families, including but not limited to families of children with disabilities and special health care needs, limited English proficiency/English language learners, of color, from lower socio-economic backgrounds, diverse religious backgrounds, etc. Family leaders from diverse backgrounds and family-led organizations can also serve as “family cultural brokers,” helping to strengthen connections between health organizations and the children and families they serve.

Family and family organization engagement indicators must be developed and integrated into existing data systems. Further, MCH programs must be encouraged to use data from family organizations such as data from F2Fs, FV SAOs, FFCMH chapters, EI/education-focused parent centers, and parent to parent programs, as well as to work with family organizations to develop, disseminate, and analyze results of surveys, focus groups, and other mechanisms that are most likely to garner diverse family feedback.
While important, there are not sufficient programs funded for this type of work to justify this suggestion. No changes necessary
211 1/5/2016 CSHCN Family-Led Organization Newark, NJ CSHCN 3 Transition to Adult Health Care Transition (CSHCN 3)

Under “Tier 4: What are the related outcomes?” we suggest adding an outcome measurement that focuses on the number of adult/general family doctors who are trained on providing adult health care services to people with disabilities. In addition, providers should be connecting, and documenting connections of, families to transition resources such as those provided by the Centers for Independent Living found at www.ncil.org.

Traditionally, physicians who have expertise in caring for persons with disabilities have practiced primarily in pediatric medicine. One of the major barriers to transition to adult health care for children and youth with special health care needs has been a lack of physician knowledge about transition and an attitude that they are a distinctly separate population as opposed to adults with the condition or characteristic of having a disability (The New Jersey Action Blueprint for Transition to Adult Health Care, p. 13). We believe that this is an important outcome to measure in ensuring that CSHCN are not only ready to transition to adult health systems themselves, but that networks of informed and trained general physicians are ready to serve them to maintain the presence of a medical home through adulthood.

Comment noted, but the suggested outcome will not be included at this time. No changes necessary
212 1/5/2016 CSHCN Family-Led Organization Newark, NJ AH 3 Screening for Major Depressive Disorder We suggest adding Healthy People 2020, MHMD 11.2 – “Increase the proportion of primary care physician office visits where youth aged 12 to 18 years are screened for depression” to the “Benchmark Data Sources” section of this performance measure.

One of the Tier 4 outcomes on this measure examines “% of adolescent well care visits that include screening for MDD.” The Healthy People 2020 objective will provide relevant benchmark data for this particular outcome measure. In addition to evidence-based models such as Teen Screen, the issue of network adequacy must be addressed, including the use of innovative models such as the NJ Children’s Primary Care Psychiatry Collaborative which, along with programs in over 30 other states, addresses specialist shortages by utilizing child psychiatrists in a consultative model with primary care.

Add Healthy People 2020, MHMD 11.2 – “Increase the proportion of primary care physician office visits where youth aged 12 to 18 years are screened for depression” to the “Benchmark Data Sources” section of this performance measure. Change/ addition to wording
213 1/5/2016 CSHCN Family-Led Organization Newark, NJ CB 2 Technical Assistance Under “Tier 2: To whom are you providing TA?” the list is missing “program participants/the public.” The “Significance” portion of this measure as well as the “Data Collection Form for #CB 2” both indicate that the public are to be included in the technical assistance data collection; it therefore should also be listed under Tier 2. Fix: In Tier 2, add “program participants/the public” to the list; it is referenced elsewhere in the measure. Change/ addition to wording
214 1/5/2016 CSHCN Family-Led Organization Newark, NJ Core 2 Quality Improvement Under “Tier 2: QI Initiative,” we encourage the inclusion of a question that asks whether grantees are engaging families/consumers in their quality improvement initiatives. Keeping in mind the goal to align priorities with Title V, we believe it is important to ensure that grantees implementing quality improvement strategies engage program participants, diverse families, and family-led organizations in their process(es). Family/consumer and family-led organization engagement in quality improvement activities and strategies is consistent with, and required by, the Title V Maternal and Child Health Services Block Grant to the States Program Guidance. While valuable, this recommendation will not be implemented at this time. No changes necessary
215 1/5/2016 CSHCN Family-Led Organization Newark, NJ Core 3 Health Equity – MCH Outcomes Under “Tier 2: Please select within which of the following domains your program addresses health equity” there is an exhaustive list of domains. We believe that this list should be edited to include the domains of “Religion,” “Age,” “Mental Health Status,” and “Other” to create a more comprehensive and open-ended list of domains.

Within the Maternal and Child Health Equity Blueprint Draft (p. 4), “health disparities” has been defined to mean those groups who experience “greater obstacles to health based on their racial or ethnic group; religion; socioeconomic status; gender; age; mental health; cognitive, sensory, or physical disability; sexual orientation or gender identity; geographic location; or other characteristics historically linked to discrimination or exclusion.” With this definition in mind, we believe that a performance measure related to health equity should include the aforementioned missing domains, as well as provide an open-ended list for grantees to identify domains not explicitly stated that they may be targeting in their programs based on their own data and population characteristics as well as family/consumer input. Here again network inadequacies must be addressed, as these lead to health disparities and poor outcomes.
Look at using this for health disparities; cross-check with list. Change/ addition to wording
216 1/5/2016 CSHCN Family-Led Organization Newark, NJ Training 1 MCH Training Program Family Member/Youth/Community Member participation Family/Youth/Community Engagement in MCH Training Programs (Training 1)
We strongly support the inclusion of this measure within the MCH Workforce Development program. We believe that family engagement and the creation of parent/youth/community leaders ultimately leads to better overall health outcomes for MCH populations and gives grantees and MCH a more complete understanding of the population they serve—increasing efficiency and effectiveness of MCH programs. This measure should specify that relevant professional development opportunities should be provided to diverse family leaders and family organizations who are a key component of the MCH workforce.
This is a training-specific measure, and therefore not to be assigned to other types of programs. "Relates to ability to report, and should be taken into consideration when assigning measures further down the road."
217 1/5/2016 CSHCN Family-Led Organization Newark, NJ Training 2 MCH Training Program Cultural Competence Cultural Competence in MCH Training Programs (Training 2)

We strongly support the inclusion of this measure within the MCH Workforce Development program. Building a culturally and linguistically competent workforce within MCH is crucial to closing the health equity gap and reducing health disparities in MCH populations. We recommend that this measure clarify that culture is not just race, ethnicity, or language but also involves religion, geography, socio-economic status, etc., as per the definition from the National Center for Cultural Competence:

“Culture is an integrated pattern of human behavior which includes but is not limited to - thought, communication, languages, beliefs, values, practices, customs, courtesies, rituals, manners of interacting, roles, relationships, and expected behaviors of a racial, ethnic, religious, social or political group; includes gender, sexual orientation, etc.”

The primary focus of cultural and linguistic competence in this measure is around race and ethnicity. No revisions will be made at this time. No changes necessary
218 1/5/2016 CSHCN Family-Led Organization Newark, NJ Training 5 MCH Pipeline Program – Work with underserved or vulnerable populations MCH Pipeline Program (Training 5)
We generally support the inclusion of this measure in MCH Workforce Development. We believe it is essential that MCH have a trained workforce that mirrors its targeted populations—culturally, ethnically, linguistically—in providing quality healthcare to vulnerable and underserved communities.

However, we feel that the definition used to identify “vulnerable populations” is too limiting here. The performance measure gives a limited list by clarifying “vulnerable populations” to mean “i.e. Immigrant Populations, Tribal Populations, Migrant Populations, Uninsured Populations, Individuals Who Have Experienced Family Violence, Homeless, Foster Care, HIV/AIDS, etc.” We believe that the use of a broader definition of “vulnerable populations”—such as that given by the CDC—would ensure that populations such as CSHCN and LGBTQ youth would be included in health equity measures. The CDC’s definition includes “race/ethnicity, socio-economic status, geography, gender, age, disability status, risk status related to sex and gender, and among other populations identified as at-risk for health disparities,” which includes populations such as “cancer survivors, immigrants and refugees, incarcerated men & women, persons who use drugs, pregnant women, veterans, etc.” (Centers for Disease Control and Prevention, Minority Health, Other At Risk Populations, February 2014). We feel it is important to have an inclusive definition of “vulnerable populations” in order to ensure gaps in equity are truly addressed.
The primary focus of cultural and linguistic competence in this measure is around race and ethnicity. No revisions will be made at this time. No changes necessary
219 1/5/2016 CSHCN Family-Led Organization Newark, NJ Training 10 Diverse Adolescent Involvement (LEAH-specific) Diverse Adolescent Involvement (Training 10)
We strongly support the inclusion of this measure within the MCH Workforce Development program. We believe that consumers of health care services – children and families – should play a critical role in informing policy and driving program activities that are relevant to the services they consume. Involvement of diverse families and adolescents in the training of future leaders in adolescent health is paramount to ensuring a culturally competent workforce able to serve MCH populations.
Noted. This can be taken into consideration when we assign measures later this year. "Relates to ability to report, and should be taken into consideration when assigning measures further down the road."
220 1/5/2016 CSHCN Family-Led Organization Newark, NJ F2F 1 Provide National Leadership for families with children with special health needs Other comments
Please note that the data collection form for F2F 1 does not include the race/ethnicity category of Native American/American Indian or Alaskan Native, and it should. These groups should not be lumped under “Other.”
Per the recommended comment, the "Native American/American Indian or Alaskan Native" category will be added to the race category within item A1b. Change/ addition to wording, Definition added
221 1/5/2016 Healthy Start Healthy Start - Dallas - Parkland Health and Hospital System Dallas, TX WMH 1 Prenatal Care Because Healthy Start programs serve the highest risk women, they often recruit women who have denied their pregnancies, or not sought prenatal care because of financial or other problems. Healthy Start fills the gap and recruits and serves women who do not qualify for many other home visiting programs (e.g., homeless). No resolution needed, just noting that Healthy Start programs are likely to have far lower rates than the national average for this measure. No changes necessary
222 1/5/2016 Healthy Start Healthy Start - Dallas - Parkland Health and Hospital System Dallas, TX WMH 3 Well Woman Visit/ Preventive Health Care Well woman visits are different from prenatal care. Should there be something identifying whether prenatal visits are counted or whether only non-pregnant women should be included in the denominator? The Healthy Start workgroup recommended the following definition/clarification for this: For purposes of reporting, a prenatal visit or postpartum visit during the twelve-month period would meet the standard. Change/ addition to wording
223 1/5/2016 Healthy Start Healthy Start - Dallas - Parkland Health and Hospital System Dallas, TX WMH 5 Severe Maternal Mortality/Morbidity This would require a good definition of what constitutes 'a woman needing services to address maternal mortality and morbidity'. There should be very good guidance to let programs know the criteria for inclusion in a standardized manner. Measure was removed. Measure or portion referenced was removed.
224 1/5/2016 Healthy Start Healthy Start - Dallas - Parkland Health and Hospital System Dallas, TX PIH 2 Breast Feeding Both denominators should clarify that they include only infants who were born into the program - not infants who may have been recruited into the program after they were born (interconception). Also, the denominator for 6 months breastfeeding should only include infants who have reached 6 months of age - not all infants. Addressed in comment 299.

Addressed in other similar or identical comment.
225 1/5/2016 Healthy Start Healthy Start - Dallas - Parkland Health and Hospital System Dallas, TX PIH 3 Newborn Screening Newborn screening is outside the range of activities performed by Healthy Start programs, which are generally not health care providers. Noted. This can be taken into consideration when we assign measures later this year. "Relates to ability to report, and should be taken into consideration when assigning measures further down the road."
226 1/5/2016 Healthy Start Healthy Start - Dallas - Parkland Health and Hospital System Dallas, TX CH 3 Developmental Screening For Healthy Start, consider reducing the upper age to 24 months since that is as long as children are served in this program. Noted. This can be taken into consideration when we assign measures later this year. "Relates to ability to report, and should be taken into consideration when assigning measures further down the road."
227 1/5/2016 Healthy Start Healthy Start - Dallas - Parkland Health and Hospital System Dallas, TX LC 1 Adequate Health Insurance Coverage The health insurance measure is problematic for programs in states where they did not accept the Medicaid expansion. Interconception women mostly do not qualify for any insurance in these states and there are no options for them. For the measure regarding adequate insurance, it is important that you provide a very clear definition of what "adequate" means - criteria. We were told at a Healthy Start meeting that we could not count Pregnancy Medicaid as insurance since it is temporary, limited to the time the woman is pregnant. Please clarify and allow that to count. Definition has been added.

Definition added
228 1/5/2016 Healthy Start Healthy Start - Dallas - Parkland Health and Hospital System Dallas, TX LC 2 Tobacco and eCigarette Cessation While we appreciate the importance of a smoke-free environment, we will not have control over someone in the household who smokes since we are not serving the family (e.g., grandmothers, aunts, etc.). The measure regarding infants and children who live in households where someone smokes would not be a good one. We would be more amenable to a measure regarding whether the mother with children smokes because we can possibly have an effect on that. Addressed with the addition of the population measure and impact measure, as recommended by the Healthy Start workgroup. No changes necessary
229 1/5/2016 Healthy Start Healthy Start - Dallas - Parkland Health and Hospital System Dallas, TX LC 3 Oral Health Would HRSA provide an oral health risk assessment instrument? Can you please clarify what that would consist of? The infant oral health measure should be rephrased to include 12-24 months since that is the population served by this program. Guidance by project officers upon assigning measures, as necessary. No changes necessary
230 1/4/2016 CSHCN,Workforce Development/ Training Association of University Centers on Disabilities

Silver Spring, MD Training 1 MCH Training Program Family Member/Youth/Community Member participation There are 8 program-specific measures developed by DMCHWD that are relevant to the LENDs; we also understand that some of these will likely be assigned to the autism training resource center. Comments on these are as follows:

Training 1 (Family member/youth/community member participation): The use of “Family members/youth/community members” in this performance measure is confusing. It is not clear whether a program needs to have all of these categories of participants to indicate a YES response for each element, or whether a program just needs one of these groups. We recommend including some clarifying language for this.
Added and/or changed language to clarify instructions. Change/ addition to wording
231 1/4/2016 CSHCN,Workforce Development/ Training Association of University Centers on Disabilities

Silver Spring, MD Training 13 Policy Development, Implementation, and Evaluation Training 13 (Policy): Most of the elements for this measure (2-6) include additional data collection in addition to a YES/NO response. The usefulness of the level of detail being requested for training programs is unclear. In particular, Element 3 requires the documentation of the percentage of trainees reporting increased policy knowledge and increased policy skills. This type of data (increased knowledge and skills) is not collected for any other training area and it is unclear why policy should be singled out for this, adding additional reporting burden to programs. In addition, it is not specified which trainees should be reported (i.e. long-term, medium-term, etc.). We recommend that simple YES/NO responses be required for all of the elements of this performance measure. This measure was developed in coordination with the DMCHWD performance measure workgroup and will not be revised. The pre/post knowledge item was clarified to indicate that it only applies to long-term trainees. No changes necessary
232 1/4/2016 CSHCN,Workforce Development/ Training Association of University Centers on Disabilities

Silver Spring, MD General Forms/ ADEs
Technical Assistance/Collaboration Form: This form includes a new “List B” for grantees to select the topic(s) of technical assistance/collaboration. This new list no longer includes the topics most relevant for LEND training programs and the autism training resource center such as Early Childhood Health/Development, CSHCN/Developmental Disabilities, and Autism. In addition, there is no “Other” category. Given that the funding for LEND training programs and the autism training resource centers is currently authorized under the Autism CARES legislation and MCHB must report specifically on activities related to autism and related developmental disabilities, we suggest including at minimum the topics CSHCN/Developmental Disabilities and Autism. We also suggest adding in an “Other” response selection. DMCHWD has its own CE form with relevant topics; DMCHWD made suggestions for addition to the TA form for topics related to the Autism CARES legislation. Change/ addition to wording
233 1/4/2016 CSHCN,Workforce Development/ Training Association of University Centers on Disabilities

Silver Spring, MD General Forms/ ADEs
Continuing Education Form: The same comments and recommendations made for the Technical Assistance/Collaboration form above would apply to the Continuing Education form. We understand, however, that the DMCHWD may be replacing this with another form that addresses these comments. DMCHWD will have a CE-specific form; therefore this comment is moot for DMCHWD programs.

No changes necessary
234 1/4/2016 CSHCN,Workforce Development/ Training Association of University Centers on Disabilities

Silver Spring, MD Core 1 Grant Impact Core Measure 1 (Grant Impact): LEND training grants and the autism training resource center grant are all currently 5 years in duration. It is not unusual for programs to make some revisions to objectives during the grant period in the process of continuous quality improvement. It is unclear how this will be addressed within this performance measure. We have concerns that with prepopulated objectives from the grant application, some programs that make changes to their objectives may have the appearance of poor performance. If this measure is to be used, we recommend that there be a process whereby programs, in consultation with their project officers if needed, are able to change the prepopulated objectives. This is prepopulated from FOA, not based on individual project objectives.

No changes necessary
235 1/4/2016 CSHCN,Workforce Development/ Training Association of University Centers on Disabilities

Silver Spring, MD Core 2 Quality Improvement Core Measure 2 (Quality Improvement): In Tier 4, the related outcomes listed do not correspond to all of a program’s potentially reported quality improvement aims (Tier 2); therefore, important information could be missing related to positive outcomes of quality improvement efforts.

We recommend that for Tier 4, the related outcomes be expanded to match the aims reported in Tier 2, and that programs only be required to complete Tier 4 outcomes for the aims selected in Tier 2.
Add a third category to Tier 4, Systems Improvement; call it Cross-Sectorial Collaboration. Change/ addition to wording
236 1/4/2016 CSHCN,Workforce Development/ Training Association of University Centers on Disabilities

Silver Spring, MD CB 1 State capacity for advancing the health of MCH populations Capacity Building 1 (State capacity): It is our understanding that this performance measure is intended for programs of a national scope such as resource centers and therefore would not be appropriate for LEND training programs. We understand that it could potentially be assigned to the autism training resource center, however. This performance measure in its current form is very confusing. It is unclear what information goes in the blank that is to be “prepopulated with program focus” in Tier 1. Given that this impacts most of the responses in all of the following Tiers, it is difficult to make specific recommendations on the usefulness and appropriateness of this measure for the autism training resource center. We can comment that many of the Tier 2-4 elements do not appear to apply to the autism training resource center based on its current function, however, and therefore this may not be a useful measure to assign. No resolution needed. No changes necessary
237 1/4/2016 CSHCN,Workforce Development/ Training Association of University Centers on Disabilities

Silver Spring, MD CB 2 Technical Assistance Capacity Building 2 (Technical Assistance): The data required in this measure overlap substantially with the data that LEND training grantees and the autism training resource center already provide in the Technical Assistance/Collaboration form. In addition, particularly for the LEND training programs, we do not feel that the reporting burden for the level of detail requested in this measure is reasonable. We would recommend that this performance measure NOT be assigned to LEND training programs.

If this measure must be assigned in some fashion, we recommend that Tier 3 NOT be assigned and that there be an auto-population of data between this performance measure and the Technical Assistance/Collaboration form so grantees are not entering the same data twice.
Duplication will not be an issue because the data system will autofill. No changes necessary
238 1/4/2016 CSHCN,Workforce Development/ Training Association of University Centers on Disabilities

Silver Spring, MD CB 3 Impact Measurement Capacity Building 3 (Impact Measurement): LEND training programs report impact data on trainees as part of their program-specific performance measures; therefore, it is not clear how useful this measure would be for these programs. We would suggest that this measure NOT be assigned or that only Tiers 1 and 2 be assigned if necessary. This comment relates to ability to report. No changes necessary
239 1/4/2016 CSHCN,Workforce Development/ Training Association of University Centers on Disabilities

Silver Spring, MD CB 5, CB 6 Scientific Publications, Products Capacity Building 5 & 6 (Scientific Publications and Products): The data required in these two measures overlap completely with specific data required for the Products, Publications and Submissions Data Collection Form. If these measures must be assigned, we recommend that there be an auto-population of data between these two performance measures and the Products, Publications and Submissions Data Collection Form so grantees are not entering the same data twice. Will autopopulate, therefore not creating duplication. No changes necessary
240 1/4/2016 CSHCN,Workforce Development/ Training Association of University Centers on Disabilities

Silver Spring, MD CSHCN 2 Access to and Use of Medical Home It is our understanding that some Domain Specific Measures may be assigned to LEND training programs and the autism training resource center. There are only three Domain Specific Measures that we think could be considered for these programs: CSHCN 2 (Medical Home), CSHCN 3 (Transition), and Child Health 3 (Developmental Screening).


CSHCN 2 (Medical Home): Tiers 1 and 2 of this measure could be considered for LEND training programs. The additional data required in Tiers 3 and 4 is not reasonable for these programs as medical home is not a core area of focus.
Noted. This can be taken into consideration when we assign measures later this year. No changes necessary
241 1/4/2016 CSHCN,Workforce Development/ Training Association of University Centers on Disabilities

Silver Spring, MD CSHCN 3 Transition to Adult Health Care CSHCN 3 (Transition): Tiers 1 and 2 of this measure could be considered for LEND training programs. The additional data required in Tiers 3 and 4 is not reasonable for these programs as transition is not a core area of focus for them. Noted. This can be taken into consideration when we assign measures later this year. No changes necessary
242 1/4/2016 CSHCN,Workforce Development/ Training Association of University Centers on Disabilities

Silver Spring, MD CH 3 Developmental Screening Child Health 3 (Developmental Screening): Tiers 1 and 2 of this measure could be considered for LEND training programs. The additional data required in Tiers 3 and 4 is not reasonable for these programs. Noted. This can be taken into consideration when we assign measures later this year. "Relates to ability to report, and should be taken into consideration when assigning measures further down the road."
243 1/5/2016 CSHCN,Workforce Development/ Training American Academy of Pediatrics

Elk Grove Village, IL Not Related to Measure
- The reporting process seems more streamlined compared to earlier processes; projects will only respond to measures that specifically apply to them.

- It is helpful to have the program specific measures assigned by project officers who are most familiar with the projects and can choose the most applicable measures. It may be beneficial to indicate somewhere throughout the measures that it is preferable for organizations that are recipients of cooperative agreements to partner with their project officer to determine what specific measures apply to them.

- Under the Impact Measurement goal and Capacity Building Domain, it would be helpful to better describe some of the tools, such as case reports and qualitative assessment.

- In general for larger national level projects (such as national technical assistance centers) it is often challenging to answer many of the Tier 4 questions about related outcomes due to the difficulty in calculating the target population in such a large catchment area.

- In general, the Academy supports the new tiered response format as it is clear and direct. However, it is suggested that an additional option be given to grantees that respond “No” in Tier 1. Given that MCHB wants to collect useful data, adding an additional question such as "How or why is the measure not applicable to you?" would clarify what is expected in terms of a response and also would help grantees provide more meaningful data to project officers as well as resource centers. There is no guarantee that 100% of grantees assigned a measure will respond "Yes" to it. Knowing why a grantee would say "No" is important from a quality improvement perspective, as well.

- The Core Measures are important for all grantees and the AAP is supportive of retaining them.
- The necessity and utility of the proposed information collection for the proper performance of the agency’s functions is laudable.

- It is difficult to assess the accuracy of the estimated burden in a meaningful manner given the information available for public comment; however, upon review it appears that the estimated burden will be less as compared to what was required from discretionary grantees previously.

- The document and information contained within includes some ways to enhance the quality, utility and clarity of the information to be collected.

- It is difficult to assess from the available information whether or not the use of automated collection techniques or other forms of information technology to minimize the information collection burden will truly decrease the burden related to same for discretionary grantees.
Needs to be handled on a case-by-case basis; if someone responds no, then the PO needs to follow up on that. No changes necessary
244 1/5/2016 CSHCN,Workforce Development/ Training American Academy of Pediatrics

Elk Grove Village, IL PIH 2 Breast Feeding Consider including an additional measure under Tier 4—Percent of premature infants (less than 37 weeks) who exclusively received human milk in the Neonatal Intensive Care Unit to address the increased risk of sepsis and necrotizing enterocolitis. Consider including the following numerator—Premature infants of program participants who were exclusively fed human milk while in the NICU. Consider including the following denominator—Premature infants of program participants. No grants address this specifically, so while the idea is good, this is not reasonable for anyone to report. No changes necessary
245 1/5/2016 CSHCN,Workforce Development/ Training American Academy of Pediatrics

Elk Grove Village, IL PIH 3 Newborn Screening - As written, this measure appears applicable only to state newborn screening and follow up programs. There are numerous discretionary grantees who engage in work/activities focused on various aspects of newborn screening to whom this performance measure may apply (eg, critical congenital heart defects, early hearing detection and intervention, genetics/family history). If these discretionary grants are taken into consideration and required to report information related to this performance measure, much of the information listed in the definition tiers will need to be modified for relevance.
- The Bright Futures National Center (BFNC) is funded through a cooperative agreement with MCHB HRSA. Section 2713 of the ACA (Coverage of Preventive Health Services) recognizes the importance of preventive care for children by including a critical provision to ensure that children enrolled in all individual and group non-grandfathered health care plans receive the preventive care as recommended in the Bright Futures Guidelines (and on the Bright Futures/AAP Periodicity Schedule). Newborn Screening is on the Bright Futures/AAP Periodicity Schedule. Newborn screening is promoted through the BFNC. However, individual screening and follow-up could not be reported through BFNC as it is measured at the community and/or health care provider level.
Noted. This can be taken into consideration when we assign measures later this year.
Full response: The measure does not only apply to state NBS programs; the PO will determine whether the PM is appropriate and, if so, what Tier is relevant to the grant activities. For example, Tiers 1, 2, and 3 will apply to programs that support the NBS process in general, for example, our currently funded NBS Clearinghouse, NBS Data Repository and TA Center, and the Regional Genetic Services Collaboratives. Grants that support implementing NBS at the state level, whether CCHD or Hearing or a newly added condition, could be asked to report on Tier 4 in addition to 1, 2, and 3. Tier 4, timeliness of reporting, applies no matter what the specific condition is.

"Relates to ability to report, and should be taken into consideration when assigning measures further down the road."
246 1/5/2016 CSHCN,Workforce Development/ Training American Academy of Pediatrics

Elk Grove Village, IL CH 1 Well Child Visit - The Bright Futures National Center (BFNC) is funded through a cooperative agreement with MCHB HRSA. Section 2713 of the ACA (Coverage of Preventive Health Services) recognizes the importance of preventive care for children by including a critical provision to ensure that children enrolled in all individual and group non-grandfathered health care plans receive the preventive care as recommended in the Bright Futures Guidelines (and on the Bright Futures/AAP Periodicity Schedule).
- There are 31 recommended child well visits on the Bright Futures/AAP Periodicity Schedule. Child well visits are promoted through the BFNC. However, the % of children enrolled could not be reported through BFNC as it is measured at the community and/or health care provider level.
- Suggest considering ways to incorporate language from Bright Futures into these measures (eg, when “annual screenings” are referenced).
Noted. This can be taken into consideration when we assign measures later this year. "Relates to ability to report, and should be taken into consideration when assigning measures further down the road."
247 1/5/2016 CSHCN,Workforce Development/ Training American Academy of Pediatrics

Elk Grove Village, IL CH 2 Quality of Well Child Visit As noted above, there are 31 recommended child well visits on the Bright Futures/AAP Periodicity Schedule. Child well visits are promoted through Bright Futures. However, measuring the subjective value of “quality” is very difficult. The proposed measure should more clearly define “quality” when gathering the % of providers conducting the recommended well child visits. Definition from Bright Futures should be referenced. Definition added
248 1/5/2016 CSHCN,Workforce Development/ Training American Academy of Pediatrics

Elk Grove Village, IL CH 3 Developmental Screening - Tier 3—Clarification is needed regarding what “# receiving education through outreach” means and how the number related to same should be calculated. The form does not provide adequate information for grantees completing/measuring same in order to ensure consistency in responses among grantees. In addition, it may be difficult to differentiate between different groups, such as consumers versus providers/professionals, especially for ongoing outreach activities, such as newsletters. The resources required to do this in an accurate and methodical manner would be excessive.

- Tier 4—The measure focuses on developmental screening and follow-up, but the outcome (numerator and denominator) only focuses on the completion of developmental screenings. Consider adding another outcome related to developmental screening referrals/follow-up to align with the measure and to ensure that action is being taken when a positive developmental screen is found.

- Developmental screening is on the Bright Futures/AAP Periodicity Schedule (9 month, 18 month, and 30 month well child visits). Developmental screening and surveillance is promoted through Bright Futures for all well child visits. However, individual screening and follow-up could not be reported through BFNC as it is measured at the community and/or health care provider level.
Added clarification on how to tally for each data collection form. Further guidance will also be provided when measures are assigned.

Change/ addition to wording, Definition added
249 1/5/2016 CSHCN,Workforce Development/ Training American Academy of Pediatrics

Elk Grove Village, IL CSHCN 1 Family Engagement - Performance Measure, Goal, Measure and Tier 1—Suggest clarifying the following question: Is the program promoting/facilitating family engagement among FAMILIES of children and youth with special health care needs (and perhaps youth themselves)? As it reads now, the language implies CSHCN engagement.
- Tier 2—Suggest adding an option related to engagement of families in a strategic planning/advisory capacity.
- Tier 3—Suggest clarifying how “# educated/receiving information” is different from “# receiving TA”; also suggest further clarification and guidance is needed in order to adequately and accurately track the “# educated/receiving information”.
- Tier 4—Need clarification regarding what constitutes “teams” in several items included in this tier; need further guidance on where the numerator and denominator information for this tier can be found so that grantees are able to report consistently and in line with how/what other grantees are reporting; catchment area implies that this information may be applicable only to local/community/state grantees, not national grantees and, as such, needs clarification; and guidance is needed to help grantees determine what constitutes “racial and ethnic family and CSHCN leaders”.
Refer to response for Comment 62; Project Officers will provide guidance to grantees about data collection and technical assistance regarding measures.
- Refer to response for Comment 69.
- Project Officers will provide technical assistance to grantees.
Change/ addition to wording, Definition added, "Relates to ability to report, and should be taken into consideration when assigning measures further down the road."
250 1/5/2016 CSHCN,Workforce Development/ Training American Academy of Pediatrics

Elk Grove Village, IL CSHCN 2 Access to and Use of Medical Home - Tier 2—Suggest clarifying “referral/care coordination.” Some programs may be providing direct care coordination services, while others (national technical assistance centers) do not necessarily coordinate care for families but provide resources for others to do so.

- Tier 3—Suggest clarifying “# receiving tracking and monitoring.” Does this refer to tracking and monitoring the number of medical homes in the grant’s catchment area, or monitoring the number of CSHCN who receive care in medical homes, or monitoring in general of grant activities for evaluation purposes?

- Tier 3—Suggest clarifying how to distinguish between what constitutes “# trained” versus “# educated/receiving information”. Need clarification regarding what is meant by “# referred” and “# receiving tracking and monitoring”.

- Tier 4—Suggest clarifying what “direct linkage” means and how to define and measure it.

- Significance—How can a “cultivated partnership” be measured in a quantifiable and meaningful way?
- Referral/care coordination in Tier 2 is for those grantees providing direct services/information about care coordination and/or improving the referral process to facilitate care coordination.
-Tracking and monitoring refers to tracking and monitoring t
Change/ addition to wording, Definition added
251 1/5/2016 CSHCN,Workforce Development/ Training American Academy of Pediatrics

Elk Grove Village, IL CSHCN 3 Transition to Adult Health Care - Tier 2—Consider including youth involvement in designing and implementing grantee activities. If added, this would need a related measure in Tier 3 (eg, “# youth involved”).
- Tier 3—Suggest further clarifying what tools can/should be used to assess readiness (“# assessed for readiness”).
- Consider encouraging grantees to utilize the MCHB-funded Got Transition materials, specifically those focused on the 6 core elements of healthcare transition and related measures, tools, materials and resources.
- Significance—Suggest language that is more appropriate for this age group. Perhaps language that emphasizes youth/young adult involvement in and responsibility for their own health care.
Project Officers will provide technical assistance to grantees regarding the performance measures. No changes necessary
252 1/5/2016 CSHCN,Workforce Development/ Training American Academy of Pediatrics

Elk Grove Village, IL AH 1 Adolescent Well Visit -- Consider including the importance of preparing for the transition to adult health care.
-- There are 11 recommended adolescent well visits on the Bright Futures/AAP Periodicity Schedule. Adolescent well visits are promoted through Bright Futures. Howev
The Adolescent and Young Adult Health National Resource Center (AYAH- NRC) largely focuses on increasing access to and the quality of preventive health services for adolescents and young adults. A large part of its activities revolves around an AYAH-CoIIN, which has developed a set of national strategies. In general, the three national strategies are sub-divided to address the needs of adolescents and of young adults as strategies may differ based on the age group and its developmental needs (short titles: Improve access, improve quality of clinical care, and improve service delivery systems). Each national strategy has a list of proposed specific tactics/approaches. The listed tactics/approaches do not specifically mention transition of care from pediatric to adult care because a different MCHB-supported resource center, the Center for Health Care Transition Improvement (gottransition.org), works to improve transition from pediatric to adult health care through the use of new and innovative strategies for health professionals and youth and families. The two resource centers have a cooperative relationship. "Relates to ability to report, and should be taken into consideration when assigning measures further down the road."
253 1/5/2016 CSHCN,Workforce Development/ Training American Academy of Pediatrics

Elk Grove Village, IL AH 3 Screening for Major Depressive Disorder - Adolescent depression screening is on the Bright Futures/AAP Periodicity Schedule for 11-21 year olds. Depression screening is promoted through Bright Futures for all adolescent well visits. However, individual screening and follow-up could not be reported. To allow reporting from the largest number of programs, changes have been made to allow reporting on any adolescents served, with a place to report age range. Change/ addition to wording
254 1/5/2016 CSHCN,Workforce Development/ Training American Academy of Pediatrics

Elk Grove Village, IL LC 3 Oral Health Consider adding the following measure:
-- Percent of program participants aged 6 months to 5 years who received topical fluoride varnish application during the last year.
-- Numerator: infants and children involved with the program who received topical fluoride varnish application in the reporting year.
-- Denominator: infants and children involved with the program during the reporting year.
-- Consider incorporation of oral health needs and challenges specific to the CYSHCN population.
Not widely reportable, not likely to be valuable in the bigger picture, so not added at this time. "Relates to ability to report, and should be taken into consideration when assigning measures further down the road."
255 1/5/2016 CSHCN,Workforce Development/ Training American Academy of Pediatrics

Elk Grove Village, IL CB 2 Technical Assistance - Tier 2—Suggest adding “Families�.

- Tier 2—Although a definition/description of technical assistance is provided, the definition is so broad and all encompassing that it has the potential for grantees to include far too much related to their work.
Define categories to clarify where 'families' would be included, etc. Definition added
256 1/5/2016 CSHCN,Workforce Development/ Training American Academy of Pediatrics

Elk Grove Village, IL CB 3 Impact Measurement - Tier 1—Suggest clarifying/defining what “impact measurement� means.
- Tier 2—Suggest specifying what a “case report� means and clarifying how this relates to all discretionary grantees.
- Tier 4—Suggest rethinking and reframing this tier a
Need definition for Case Report. Definition added
257 1/5/2016 CSHCN,Workforce Development/ Training American Academy of Pediatrics

Elk Grove Village, IL CB 4 Sustainability Tier 3—Need clarification on what “How many are reached through those activities?” means and what N/A means; this is confusing as presented. N/A: There is no Tier 3 measure for this.

Change/ addition to wording
258 1/5/2016 CSHCN,Workforce Development/ Training American Academy of Pediatrics

Elk Grove Village, IL CB 5 Scientific Publications Tier 4—Is tracking of dissemination vehicles a way to assess outcomes? Also, there are numerous challenges related to tracking the information listed correctly and adequately; doing so potentially requires an inordinate amount of resources and related capacity to monitor/track same. We are assessing how it is disseminated, not how many people are reached by it.

No changes necessary
259 1/5/2016 CSHCN,Workforce Development/ Training American Academy of Pediatrics

Elk Grove Village, IL Core 1 Grant Impact - Will there be an opportunity for grantees to indicate if they have changed an objective during the course of the project or if they have partially met an objective?
- This appears to be a somewhat streamlined approach to what was used in the past; the
Yes, grantees can work with their project officer to ensure that this measure reflects their up-to-date objectives. "Relates to ability to report, and should be taken into consideration when assigning measures further down the road."
260 1/5/2016 CSHCN,Workforce Development/ Training American Academy of Pediatrics

Elk Grove Village, IL Core 2 Quality Improvement - It would be helpful to give some examples of quality improvement initiatives so that grantees have a better sense of what HRSA is attempting to collect information about. Similarly, health equity may bring to mind different concepts for different grantees; therefore, it would be helpful to define health equity.
- Why is the focus on “organizational” quality improvement and what does that mean?
- The aims listed as examples are high level and not specific; it may be challenging for grantees to categorize their quality improvement project aims/measures in the categories listed. Those listed are too specific and also too variable for any type of reliable and consistent grant reporting.
- Tier 4—Why is the focus only on population health and how is that measured/quantified in a meaningful manner given the broad definition of same? Why is the focus on “organizational” improvement as opposed to (or in addition to) individual improvement? Not all quality improvement is organizationally focused.
The focus of this measure is organizational quality improvement because organizations are funded through the various MCHB programs.

Project officers will provide additional guidance and examples when assigning measures.
Change/ addition to wording,"Relates to ability to report, and should be taken into consideration when assigning measures further down the road."
261 1/5/2016 CSHCN,Workforce Development/ Training American Academy of Pediatrics

Elk Grove Village, IL Training 1 MCH Training Program Family Member/Youth/Community Member participation The AAP recommends that measures related to family engagement and cultural competence (Training 1 and Training 2) be modified so that they are applicable to Healthy Tomorrows’ grantees (eg, change the wording from MCH Training Programs to DMCHWD Programs). Training 1 and 2 have been modified to include Healthy Tomorrows grantees Change/ addition to wording
262 1/5/2016 CSHCN,Workforce Development/ Training American Academy of Pediatrics

Elk Grove Village, IL Training 2 MCH Training Program Cultural Competence The AAP recommends that measures related to family engagement and cultural competence (Training 1 and Training 2) be modified so that they are applicable to Healthy Tomorrows’ grantees (eg, change the wording from MCH Training Programs to DMCHWD Programs). Training 1 and 2 have been modified to include Healthy Tomorrows grantees Change/ addition to wording
263 1/5/2016 CSHCN,Workforce Development/ Training American Academy of Pediatrics

Elk Grove Village, IL Training 3 Healthy Tomorrows Title V Collaboration It is very important for Healthy Tomorrows grantees to establish meaningful linkages with Title V and other MCH-related programs as these programs are excellent resources and partners for grantees.

Miscellaneous
• The AAP supports the following four additional measures for HT grantees:
o CB 3 Impact
o CB 4 Sustainability
o CB 6 Products
o CH 1 Well Child Visit
We understand that these additional measures may not be applicable to some HT grantees, but many of our grantees address a wide range of topics and could potentially provide meaningful data with regard to these elements.
Noted. This can be taken into consideration when we assign measures later this year. "Relates to ability to report, and should be taken into consideration when assigning measures further down the road."
264 1/5/2016 CSHCN,Workforce Development/ Training American Academy of Pediatrics

Elk Grove Village, IL HS 2 Medical Home Suggest defining medical home and/or breaking it down into a few measurable characteristics. The numerator, as described, would be incredibly challenging to calculate in a meaningful manner. Changed from Medical Home to Usual Source of Care, and a definition has been added. Definition added, Change/ addition to wording
265 1/5/2016 CSHCN,Workforce Development/ Training American Academy of Pediatrics

Elk Grove Village, IL General Forms/ ADEs
The AAP agrees with the removal of the “Infrastructure Building” category on Form 4. Many projects that do not provide direct care to patients will benefit from the new “Public Health Services and Systems” category. In general, the budget forms appear to be straightforward since there is not a requirement for grantees to break apart expenditures into different categories. Noted. This can be taken into consideration when we assign measures later this year. "Relates to ability to report, and should be taken into consideration when assigning measures further down the road."
266 1/5/2016 CSHCN Indiana's Center of Excellence for Bleeding & Clotting Disorders Indianapolis, CSHCN 1 Family Engagement 1. It will be important to accurately define "family engagement" to achieve the best measure of this activity.
2. Tiers 2 and 3 - Similarly, it will be important to clearly define "technical assistance," "product development," and "quality improvement" initiatives.
3. Tier 4 - Define "CSHCN leaders" and explain the types of community/state/national level teams that are being referenced. What would constitute a family or CSHCN being "trained," and how will "increased knowledge, skill, ability and self-efficacy" be measured by these leaders to serve the population?
Measure has been edited, and additional guidance on reporting will be provided when measures are assigned.

Change/ addition to wording
267 1/5/2016 CSHCN Indiana's Center of Excellence for Bleeding & Clotting Disorders Indianapolis, CSHCN 2 Access to and Use of Medical Home 1. Tier 1 - Define "medical home."
2. Tiers 2 and 3 - Define fields referenced on the "Activity Data Collection Form."
- MCHB-wide decision about common definitions for the fields in Table 1. No changes necessary
268 1/5/2016 CSHCN Indiana's Center of Excellence for Bleeding & Clotting Disorders Indianapolis, CSHCN 3 Transition to Adult Health Care 1. Tiers 2 and 3 - Define fields referenced on the "Activity Data Collection Form."
2. Tier 4 - Define "assessed for readiness" and "deemed ready" to transition.

Regarding the reporting burden, the Federal Register / Vol. 80, No. 215 / Friday, November 6, 2015 references an estimated 41 hours per response. This is likely a reasonable estimate for the IHTC to gather information to complete the report but will be highly dependent on the ease of interpreting and therefore compiling the information requested. Clear definitions for terms used will ease the burden of reporting.
- MCHB-wide decision about common definitions for the fields in Table 1.
- The terms “assessed for readiness” and “deemed ready” refer to language utilized by gottransition.org.
- Transition Readiness can be defined as: Assessing youth’s transition readiness.
Definition added
269 1/5/2016 CSHCN,Workforce Development/ Training Association of Maternal & Child Health Programs
WMH 3 Well Woman Visit/ Preventive Health Care On page 7, regarding the well woman visit, we note that under well child visit there is a measure for quality, yet none is included in the well woman visit and in timely prenatal care. We recommend you consider adding one to be consistent. Suggest no, as there is a fair amount of challenge in defining the quality of well child visit measure. No changes necessary
270 1/5/2016 CSHCN,Workforce Development/ Training Association of Maternal & Child Health Programs
WMH 4 Depression Screening On page 8, regarding depression screening, could you consider broadening to screen for mental health issues in general? For example, by identifying women with anxiety disorders more women might also be identified and receive treatment for substance use disorders. Also, the number of women referred to treatment is important, but we suggest also measuring how many are lost in the system. No changes made in order to keep consistent with HS benchmarks. No changes necessary
271 1/5/2016 CSHCN,Workforce Development/ Training Association of Maternal & Child Health Programs
CH 3 Developmental Screening On page 18, regarding Developmental Screening - under "Grantee Data Sources" it lists NOM#12, which is newborn screening. Please consider that NPM#6 (developmental screening) aligns with the Tier 4 outcome measures and may be the better match. Add NPM 6 as a Grantee Data Source. Definition added
272 1/5/2016 CSHCN,Workforce Development/ Training Association of Maternal & Child Health Programs
CH 4 Injury Prevention On page 20, regarding Injury Prevention – please consider if NPM#7 should also be listed under "Grantee Data Sources." Fix: NPM#7 should also be listed under "Grantee Data Sources." Definition added
273 1/5/2016 CSHCN,Workforce Development/ Training Association of Maternal & Child Health Programs
CSHCN 1 Family Engagement On page 21 regarding Family Engagement: Tier 4: numerator: What does “meaningful” mean? How will this be measured? Will grantees be expected to identify numbers of CYSHCN in catchment area? It’s not clear if this is currently possible. They can measure at state level through NSCH, but will grantees have access to county/community level numbers on CYSHCN as part of NSCH? Same questions as Comment 249
- Definitions will be provided (as shown below) when measures are assigned
- Catchment area refers to the target population identified in the applicant/grantee applications.

------------------------------------------------------------------------------
DGIS Instructions Regarding
TA, Training, Conferences, etc.


Technical Assistance: Response to an individualized request for assistance to help an organization accomplish its mission and/or strategic, organizational needs. In this case, factor in the number of instances of TA provided. TA is the overarching response and may include a series of trainings, focus group discussions, meetings, etc. If a training session is involved as part of the TA, factor in the number of participants under the “training” section.



Training: Teaches a person/group a skill… could include

•Presentations at conference, symposium, seminar, or meeting

•Includes one-on-one instruction to an individual

•Targeted learning objectives



Outreach/Information Sharing: Provide general information to build awareness, educate, or communicate a topic and/or organization’s services to the public or specific group of individuals.



Catchment area refers to the target population identified in the applicant/grantee applications.



HHS Definition - Training has the meaning given to the term in section 4101 of title 5, United States Code. – See also: http://www.hhs.gov/grants/contracts/contract-policies-regulations/conference-policy/index.html#definitions



(d) Mission-related training is training that supports agency goals by improving organizational performance at any appropriate level in the agency, as determined by the head of the agency. This includes training that:



(1) Supports the agency's strategic plan and performance objectives;



(2) Improves an employee's current job performance;



(3) Allows for expansion or enhancement of an employee's current job;



(4) Enables an employee to perform needed or potentially needed duties outside the current job at the same level of responsibility; or



(5) Meets organizational needs in response to human resource plans and re-engineering, downsizing, restructuring, and/or program changes.





•Conference: “a formal meeting in which many people gather in order to talk about ideas or problems related to a particular topic (such as medicine or business) usually for several days, or a formal meeting in which a small number of people talk about something.” (http://www.merriam-webster.com/dictionary/conference)



•Symposium: “a formal meeting at which experts discuss a particular topic” (http://www.merriam-webster.com/dictionary/symposium)



•Seminar: “a meeting in which you receive information on and training in a particular subject, or a class offered to a small group of students at a college or university.” (http://www.merriam-webster.com/dictionary/seminar)



Therefore, (1) meetings and events falling within the plain meaning of conference, symposium, and seminar where attendees travel, and (2) training activities that are considered to be conferences under 5 CFR 410.404, are also considered conferences for the purposes of this policy.





Addressed in other similar or identical comments.
274 1/5/2016 CSHCN,Workforce Development/ Training Association of Maternal & Child Health Programs
CSHCN 2 Access to and Use of Medical Home On page 23 regarding Medical Home; for the Tier 4 numerator, it will potentially be difficult to determine what percent of the target population demonstrates a direct linkage to a coordinated medical home in the community as a direct result of activities conducted by the projects. Tier 4 will not be assigned to all programs, and discussions as to the ability to report should be had with Project Officers as measures are being assigned.

"Relates to ability to report, and should be taken into consideration when assigning measures further down the road."
275 1/5/2016 CSHCN,Workforce Development/ Training Association of Maternal & Child Health Programs
CSHCN 3 Transition to Adult Health Care On Page 26, regarding Transition - similar to above, NPM#6 is listed under "Grantee Data Sources." We suggest considering if NPM#12 would be a better fit. Yes. Change/ addition to wording
276 1/5/2016 CSHCN,Workforce Development/ Training Association of Maternal & Child Health Programs
LC 1 Adequate Health Insurance Coverage On page 33, for the percent of programs promoting and/or facilitating adequate health insurance coverage, consider adding a category under Tier 2 regarding policy work, e.g., engaging stakeholders, developing recommendations, participating in coalitions/work groups, etc. Additionally, this appears to be an area where the measures might not match the activities. "Program participants" are not always the recipient of the TA, especially if we hope that programs are educating not only families/consumers, but other leaders in their states. We suggest further work to bridge this potential gap. Finally, please consider including discussion of the significance for adults here or explicitly state in the DGIS performance measure that kids are the target (as you do in the Title V NPM). Yes. Change/ addition to wording
277 1/5/2016 CSHCN,Workforce Development/ Training Association of Maternal & Child Health Programs
LC 2 Tobacco and eCigarette Cessation On page 36, where data collection tables are first introduced, pregnant women and adolescents are included, but women in general are not. This isn’t reflective of a life course, cross-cutting approach without including preconception, interconception, and postpartum women. We suggest considering a domain for women of childbearing age. Women age 25 and over are included in the budget forms by types of individuals served. Add a category for pre-conception and interconception women. Cross check with other forms/training measures -- verify that these are the current definitions the bureau is using. Change/ addition to wording
278 1/5/2016 CSHCN,Workforce Development/ Training Association of Maternal & Child Health Programs
CB 4 Sustainability On Page 44, regarding the measure of percent of programs providing technical assistance on MCH priority topics, it is unclear what is being counted. Is it the number of people who received TA on how to address this measure in their states/communities? Or is it the number of people served by the organization receiving the TA? Please clarify. TA is intended to count TA encounters, rather than total reach of TA, which would be very challenging to accurately quantify. Definition added
279 1/5/2016 CSHCN,Workforce Development/ Training Association of Maternal & Child Health Programs
CB 6 Products On page 47, regarding the percent of programs supporting the development of informational products and through what means, and related outcomes. To our knowledge, MCHB doesn’t provide TA or standard recommendations on how to use Google Analytics or other resources to determine how resources are being downloaded. While AMCHP does this (in general), it is a challenge for a number of grantees, especially smaller ones. Some additional guidance on means through which these metrics are collected will facilitate comparability across grantees. No metrics are necessary, just dissemination methods. No changes necessary
280 1/5/2016 CSHCN,Workforce Development/ Training Association of Maternal & Child Health Programs
Core 1 Grant Impact On page 48, regarding introduction of the core measures, it is presumed but not clear from the narrative that all grantees will report on these three measures. It would be helpful for that to be clarified, and it would make much more sense to have the core measures appear first in the document. As written, they are buried in the middle and not clearly marked as required for all grantees. Yes. Move Capacity Measures to the front as well. Change/ addition to wording
281 1/5/2016 CSHCN,Workforce Development/ Training Association of Maternal & Child Health Programs
Training 13 Policy Development, Implementation, and Evaluation On page 94, regarding Training for Policy Development, consider adding “Writing an Op-Ed or Letter to the Editor” under possible activities in tier 2. The list of activities corresponds to the policy activities listed in the MCH leadership competencies. This measure was developed with the DMCHWD performance measure workgroup. No changes are planned. No changes necessary
282 1/5/2016 CSHCN,Workforce Development/ Training Association of Maternal & Child Health Programs
HS 8 Father/ Partner Involvement during Pregnancy On page 134, regarding father/partner involvement, we are pleased to see this proposal and are strongly supportive of inclusion of these measures. No Resolution necessary. No changes necessary
283 1/5/2016 CSHCN,Workforce Development/ Training Association of Maternal & Child Health Programs
F2F 1 Provide National Leadership for families with children with special health needs On page 143, under Models of Family Engagement, please consider adding “Children's Hospitals” as a specific entity under section b. Per the recommended comment, "Children's Hospitals" will be added as an optional organization type for item B1b. Change/ addition to wording
284 1/5/2016 CSHCN,Workforce Development/ Training Association of Maternal & Child Health Programs
General Forms/ ADEs
On page 150, where the terms “Direct Health Care, Enabling Services and Population Based Services” are introduced for the first time, we strongly encourage you to include the definitions used in the MCH Block Grant Guidance so there is consistency. Yes, reference has been added to the definitions in MCH Block Grant guidance. Definition added
285 1/5/2016 CSHCN Heartland Genetics Services Collaborative
Not Related to Measure
To ensure access to quality genetic services, the Health Resources and Services Administration (HRSA) Maternal and Child Health Bureau, Genetics Services Branch funds a National Coordinating Center (NCC) at the ACMG, a National Genetic Education and Consumer Network at the Genetic Alliance, and seven Regional Genetic Service Collaboratives (RCs). This NCC/RC system has a mission to develop national infrastructure for public health and clinical providers to address gaps and improve direct and enabling services for families and individuals affected by genetic conditions.

In its national evaluation, the NCC/RC system currently uses HRSA Performance Measure #41 and questions from the National Survey of Children's Health (NSCH) to assess its contributions to Healthy People 2020 objectives. This effort will be strengthened by the addition of several measures in the DGIS, but only if the NSCH retains the question that is currently in field tests that asks whether the respondent "Has a doctor or other health care provider EVER told you that this child has... Genetic or inherited condition (response A16)". The addition of a heritable condition response category will give MCH constituents, other survey users, and the MCH Genetics Services Branch critical information by which to analyze variables in the NSCH to report on DGIS measures. And if the respondent were asked to specify the genetic condition, formatted similarly to A27 Other Mental Health Condition, then even more information would be available to characterize the genetic conditions.

In addition to having national population estimates of children with heritable conditions, we want to make note of two other important factors regarding the proposed DGIS measures. First, many of the Tier 4 measures are not data that could be obtained at a regional level or are already obtained through states entering data into a national database funded by HRSA. Second, the regional genetics collaboratives have had the flexibility to select HRSA priority areas that are important to the constituents in each region. Therefore, for the 2016-2017 grant year, we request flexibility in selecting measures that reflect the work conducted for the past five years on specific HRSA priority areas provided in the grant application guidance.
Noted. This can be taken into consideration when we assign measures later this year. "Relates to ability to report, and should be taken into consideration when assigning measures further down the road."
286 1/6/2016 Healthy Start Healthy Start EPIC Center/ JSI
LC 2 Tobacco and eCigarette Cessation Recommending two measures (for Tier 4): a population-based and a program-impact measure.

--- A participant is considered to have abstained from smoking cigarettes if she has not smoked during her pregnancy.
--- A participant is considered to have stopped smoking cigarettes if she quits smoking during her pregnancy.
--- Smoking includes all tobacco products and e-cigarettes.

The impact measure captures the effect of the Healthy Start program on encouraging the participant to quit smoking during her pregnancy.

--- Population Measure:
Numerator: Number of HS prenatal participants who abstained from smoking cigarettes (including all tobacco products and e-cigarettes) for their entire pregnancy.
Denominator: Number of HS prenatal participants.

-- Impact Measure
Numerator: Number of HS participants who stop smoking cigarettes (including all tobacco products and e-cigarettes) during their pregnancy.
Denominator: Number of HS prenatal participants who smoked at the beginning of their pregnancy.
Yes. Definition added
287 1/6/2016 Healthy Start Healthy Start EPIC Center/ JSI
HS 3 Interconception Planning Recommend the following definitions and rationale:

-- Numerator: Number of HS women participants whose current pregnancy was conceived within 18 months of the previous birth.

-- Denominator: Total number of HS women participants enrolled before the current pregnancy who had a prior pregnancy.



Rationale: The interval between the most recent pregnancy and previous birth is derived from the delivery date of the birth and the date of conception for the most recent pregnancy. Optimal spacing of 18 months applies to both live births and stillbirths.

Consistent with Healthy Start Benchmark #10
Measure has been revised in accordance with this feedback and anticipated Healthy Start data collection efforts.

Change/ addition to wording,Definition added
288 1/6/2016 Healthy Start Healthy Start EPIC Center/ JSI
Suggested Addition, CH 1 Well Child Visit The Healthy Start Benchmark requires reporting on the infant well-child visit, so we recommend that this be extended to ages 0-24 months, or that the same measure be added in the PIH domain. The infant well-child visit will be reported as an HS measure.

Change/ addition to wording
289 1/6/2016 Healthy Start Healthy Start EPIC Center/ JSI
HS 4 Early Elective Delivery Recommend the following definitions and exclusions.
Numerator: Number of HS prenatal participants with elective delivery (i.e., exclude medically necessary delivery) before 39 weeks.
Denominator: Total number of HS prenatal participants enrolled prenatally who gave birth.

A participant is included in the denominator if she is enrolled in the program prior to delivering. Excludes women enrolled only at the time of delivery.


NOTE: EPIC has requested guidance from our MCH OB/GYN expert, as follow-up, on how to identify a medically necessary vs. elective delivery.

Consistent with Healthy Start Benchmark #12
Measure has been revised in accordance with this feedback and anticipated Healthy Start data collection efforts.

Change/ addition to wording,Definition added
290 1/6/2016 Healthy Start Healthy Start EPIC Center/ JSI
HS 5 Perinatal Depression Screening Recommend the following definitions and exclusions.
-- Numerator: Number of women participants who were screened for depression with a standardized tool within reporting year.

-- Denominator: Number of HS women participants.


A participant is considered to have been screened and included in the numerator if a standardized screening tool which is appropriately validated for her circumstances is used. Several screening instruments have been validated for use to assist with systematically identifying patients with depression (a list is provided).



NOTE: EPIC is checking to see if questions included in the screening tool constitute a “validated” tool.

Consistent with one part of Healthy Start Benchmark #13
Measure has been revised in accordance with this feedback and anticipated Healthy Start data collection efforts.

Change/ addition to wording,Definition added
291 1/6/2016 Healthy Start Healthy Start EPIC Center/ JSI
HS 6 Perinatal Depression Follow Up Recommend the following definitions/ clarifications for the measure:
-- Numerator: Number of women participants who had a referral for follow-up services.
-- Denominator: Number of HS women participants who screened positive for depression.
A participant is considered to have been referred for follow-up services and included in the numerator if she is referred to a qualified practitioner for further assessment for depression. Referral can be to either an internal or external provider depending on availability and staffing model.

Consistent with the second part of Healthy Start Benchmark #13
Measure has been revised in accordance with this feedback and anticipated Healthy Start data collection efforts.

Change/ addition to wording,Definition added
292 1/6/2016 Healthy Start Healthy Start EPIC Center/ JSI
HS 7 Intimate Partner Violence Screening Recommendation for definitions:
-- Numerator: Number of HS women participants who received intimate partner violence screening using a standardized screening tool during the reporting year.
-- Denominator: Total number of HS women participants

NOTE: EPIC is checking to see if questions included in the screening tool constitute a “validated” tool.

A list of validated IPV screening tools is included in the Data Dictionary as a reference.

Consistent with Healthy Start Benchmark #15
Measure has been revised in accordance with this feedback and anticipated Healthy Start data collection efforts.

Change/ addition to wording,Definition added
293 1/6/2016 Healthy Start Healthy Start EPIC Center/ JSI
LC 1 Adequate Health Insurance Coverage Recommendation for definitions:
-- Numerator: Number of women and infant HS participants with health insurance as of their last HS contact.
-- Denominator: Number of total women and infant HS participants.
-- Comments: Include instruction in the manual that undocumented participants and participants who do not qualify for a subsidy under the ACA are included in the denominator.

Consistent with Healthy Start Benchmark #1
Yes. Definition added
294 1/6/2016 Healthy Start Healthy Start EPIC Center/ JSI
HS 1 Reproductive Life Plan Recommendations for definitions:

--- Numerator: Number of HS women participants with a documented reproductive life plan.
--- Denominator: Number of HS women participants excluding women who initially enrolled in their first 6 months of pregnancy.

--- Comments (Awaiting clarification from Suz): I am unable to resolve this issue without further discussion. There is no reference in the literature to when to conduct an RLP. The language of the RLP to set “goals for having or not having children” suggests that it would not be applicable for a woman who enrolled pregnant. However, in the event of an early pregnancy, wouldn’t you still want to evaluate the participant’s choice to have and/or keep the baby?

Consistent with Healthy Start Benchmark #3
Measure has been revised in accordance with this feedback and anticipated Healthy Start data collection efforts.

Change/ addition to wording,Definition added
295 1/6/2016 Healthy Start Healthy Start EPIC Center/ JSI Bow, NH WMH 2 Perinatal/ Postpartum Care Recommend the following definitions:
--- Numerator: Number of HS women participants who had a postpartum visit between 4-6 weeks after delivery.

--- Denominator: Total number of HS participants who enrolled before 6 weeks postpartum and who delivered 6 weeks or more prior to the end of the reporting year.

--- Comments: From the most recent ACOG guidelines, dated 2012, 7th Edition: “Postpartum visit is approximately 4-6 weeks after delivery.” I have attached the source document.

Consistent with Healthy Start Benchmark #3
Cross check with current Tier 4. Change/ addition to wording
296 1/6/2016 Healthy Start Healthy Start EPIC Center/ JSI
HS 2 Medical Home Recommend the following definitions:
--- Numerator: Total number of HS women and infant participants that have a medical home as of their last HS contact.
--- Denominator: Total number of women and infant HS participants.
--- Comments: Clarify definition of “medical home”: A participant is considered to have a medical home and included in the numerator if the participant has a regular source of primary care. That is, the participant identifies a regular place where she can go for routine and sick care other than an emergency room. A participant receiving regular prenatal care from a prenatal provider is considered to have a medical home.

Consistent with Healthy Start Benchmark #4

Changed to "usual source of care" with definition provided. Change/ addition to wording,Definition added
297 1/6/2016 Healthy Start Healthy Start EPIC Center/ JSI Bow, NH WMH 3 Well Woman Visit/ Preventive Health Care Recommend the following definitions:
--- Numerator: Number of HS women participants who received a well-woman or preventive visit in the past 12 months (includes prenatal and postpartum visit).
--- Denominator: Total number of HS women participants.
--- Comments: Clarify definition of well-woman/preventive visit: A participant is considered to have a well-woman or preventive visit and included in the numerator if she has a documented health assessment visit where she obtained recommended preventive services that are age and developmentally appropriate within twelve months of her last contact with the Healthy Start Program. For purposes of reporting, a prenatal visit or postpartum visit during the twelve-month period would meet the standard.

Note: Dr. Lu and the MCH measure require an annual well-woman/preventive visit.


Consistent with Healthy Start Benchmark #5
Cross check with current Tier 4. Change/ addition to wording,Definition added
298 1/6/2016 Healthy Start Healthy Start EPIC Center/ JSI
PIH 1 Safe Sleep Recommend the following definitions:
-- Numerator: Number of HS infants (0-12 mo) whose parent reports that they are always placed to sleep following safe sleep practices.
--- Denominator: Number of HS infant participants aged 0 to 12 months old.
--- Comments: Clarify definition of safe sleep practices: A participant is considered to engage in safe sleep practices and included in the numerator if she always follows the American Academy of Pediatrics (AAP) recommendations that babies be placed to sleep: 1) on their backs; 2) on a firm sleep surface free of soft objects or loose bedding; and 3) with no bed-sharing. Note: The requirement is that the baby is placed on his back to sleep. If the baby rolls over onto his stomach after being placed to sleep on his back, the standard is met.

Although safe sleep behaviors are self-reported, programs are encouraged to observe safe sleep practices during home visits, when possible.
Check with Erin/ PIH group -- can this replace what is currently T4, or be in addition to it? Change/ addition to wording
299 1/6/2016 Healthy Start Healthy Start EPIC Center/ JSI
PIH 2 Breast Feeding Recommend the following definitions for the breastfeeding measure:
Recommend several Tier 4 measures, one population and one program impact for each of ever breastfed and breastfed through 6 months--
Population measure for ever breastfed:
--- Numerator: Number of HS infants whose parent reports they were ever breastfed or fed breast milk.
--- Denominator: Total number of HS infants aged 0-24 months.



Program impact measure for ever breastfed:
--- Numerator: Number of HS infants whose parent reports they were ever breastfed or fed breast milk.
--- Denominator: Total number of HS infants 0 to 24 months born to women enrolled prenatally or at the time of birth.



Population measure for breastfed at 6 months:

--- Numerator: Number of HS infants whose parent reports they were breastfed or fed breast milk through 6 months of age.
--- Denominator: Total number of HS infants aged 6-24 months.

Program impact measure for breastfed at 6 months:
--- Numerator: Number of HS infants whose parent reports they were ever breastfed or fed breast milk.


--- Denominator: Total number of HS infants aged 7 mo to 2 years whose mother was enrolled prenatally or at the time of birth

Consistent with Healthy Start Benchmark #7 and #8
Measure revised in accordance with Healthy Start workgroup definitions and 3Ps reconciliation.

Change/ addition to wording,Definition added
303 1/4/2016 CSHCN,Workforce Development/ Training Association of University Centers on Disabilities

Silver Spring, MD Suggested Addition
Missing: Access to and Use of Medical Home

We suggest that a performance measure be added to the Child Health domain to address promoting and/or facilitating medical home access for all children.

A key factor in the evolution of the medical home concept is that it has expanded beyond children and youth with special health care needs to include all children and adults. Promoting medical home access for all children, not just CSHCN, is aligned with Healthy People 2020, MICH Objective 30, to increase the proportion of all children who have medical home access. Furthermore, this performance measure would correspond with the Maternal and Child Health Equity Blueprint Draft goals to increase access to quality MCH care (Goal 4, p. 15) and strengthen MCH systems of care (Goal 5, p. 16). Facilitating medical home access for all children ensures we still reach non-CSHCN populations that would greatly benefit from coordinated, family-centered care. We recommend that this performance measure include a component that continues to allow us to gauge the extent to which CSHCN have access to a medical home within the overall population of children.
Only promoted through CSHCN, and this is included in block grant reporting. No changes necessary
304 1/4/2016 CSHCN,Workforce Development/ Training Association of University Centers on Disabilities

Silver Spring, MD Suggested Addition
Missing: Family Engagement

We suggest that a performance measure be added to the Child Health domain to address promoting and/or facilitating family engagement in children’s health systems.

MCH programs are most successful when they engage the families impacted by the policies, systems, and services they promote at all levels. Including families at the policy and planning level and engaging parent leaders and parent leader organizations has proved to be a successful and efficient strategy in CSHCN systems. Engaging families ensures that MCH services are properly targeted and that resources are not unnecessarily wasted. Family Voices has already begun to explore engagement of family leader organizations regarding non-CSHCN measures within Title V. This measure is aligned with the Healthy People 2020, 10 Essential Public Health Services; as well as the Maternal and Child Health Equity Blueprint Draft goal to strengthen MCH systems of care (Goal 5, p. 16).

It is important to note that family engagement is a shared responsibility that also requires partnerships with family leaders, family organizations including Family to Family Health Information Centers and Family Voices State Affiliate Organizations, EI/education-focused parent centers, parent to parent programs, parent advisory councils, Federation of Families for Children’s Mental Health chapters, community-based organizations and immigrant organizations that serve diverse families, etc.

Family leaders from diverse backgrounds and family-led organizations can play critical roles in helping health institutions and professionals understand how to more effectively engage, support, and partner with diverse families, including but not limited to families of children with disabilities and special health care needs, limited English proficiency/English language learners, of color, from lower socio-economic backgrounds, diverse religious backgrounds, etc. Family leaders from diverse backgrounds and family-led organizations can also serve as “family cultural brokers,” helping to strengthen connections between health organizations and the children and families they serve.

Family and family organization engagement indicators must be developed and integrated into existing data systems. Further, MCH programs must be encouraged to use data from family organizations such as data from F2Fs, FV SAOs, FFCMH chapters, EI/education-focused parent centers, and parent to parent programs, as well as to work with family organizations to develop, disseminate, and analyze results of surveys, focus groups, and other mechanisms that are most likely to garner diverse family feedback.
Only promoted through CSHCN, and this is included in block grant reporting. No changes necessary
305 1/5/2016 CSHCN Family-Led Organization Newark, NJ Suggested Addition
Missing: Access to and Use of Medical Home

We suggest that a performance measure be added to the Women’s/Maternal Health domain to address promoting and/or facilitating medical home access for women before, during, and after pregnancy. The presence of a medical home creates a continuum of care for women across their lifespan—to link preconception care, wellness, and follow-up care later in life. Having integrated care through a medical home is particularly important for low-income women, women who are uninsured, and women who exhibit other factors that make them susceptible to poor pregnancy outcomes (ACOG Women's Medical Home Policy, Principles for a Patient-Centered Medical Home for Women, February 2009). Though the medical home concept originated with children and youth with special health care needs, it has become a universal approach to integrated care for all (Maternal and Child Health Equity Blueprint Draft, p. 8), and we believe it is a crucial measure for ensuring equity and access to quality healthcare for all women. It is particularly important to ensure access to a medical home for those women who, by reason of immigrant or socio-economic status, do not have access to sufficient health insurance coverage.
Only promoted through CSHCN, and this is included in block grant reporting. No changes necessary