Evaluation of the DP18-1801 Healthy Schools Program

OMB: 0920-1302



SUPPORTING STATEMENT FOR THE


EVALUATION OF THE DP18-1801 HEALTHY SCHOOLS PROGRAM


PART A














Submitted by:

Sarah M. Lee, PhD

School Health Branch

Division of Population Health

National Center for Chronic Disease Prevention and Health Promotion

4770 Buford Hwy, NE Mail stop K78

Atlanta, GA 30341
770-488-6162 (voice); 770-488-5964 (fax)

E-mail: [email protected]

Centers for Disease Control and Prevention
Department of Health and Human Services

06/08/2020




BRIEF EXPLANATION OF PART A SUPPORTING STATEMENT (JUSTIFICATION)

  • Goal of the study

The goal of this evaluation is to examine three selected DP18-1801 Healthy Schools Program (DP18-1801) grantees to provide a comprehensive picture of implementation activities, context, successes and challenges, key partnerships, lessons learned, and impact on program outcomes.


  • Intended use of the resulting data

The Centers for Disease Control and Prevention’s (CDC) School Health Branch (SHB) will use the resulting data to (1) inform stakeholders about implementation and impact of the 1801 program within State Education Agencies (SEAs), Local Education Agencies (LEAs), and schools, and (2) improve the quality of programming implemented to enhance nutrition, physical activity, and management of chronic health conditions in schools.


  • Methods to be used to collect data

The evaluation approach is a multisite, longitudinal, embedded case study design, consisting of both process and impact components, focusing on three 1801 state grantees and a subset of their targeted LEAs and schools. Two primary data collection methods will be used: (1) key informant interviews (KII), and (2) Web-based surveys. All data for wave 1 (i.e., fall 2020) will be collected virtually (by phone or web-based survey) due to the COVID-19 outbreak. Wave 2 (2022) may include virtual and/or in-person data collection, pending future circumstances.


  • The subpopulation to be studied

The populations and subpopulations included in this evaluation are SEA, LEA, and school staff.


  • How data will be analyzed

Evaluation data will be analyzed using a combination of descriptive quantitative statistics and qualitative thematic analysis techniques.



List of Attachments

Attachment 1: Authorizing Legislation

Attachment 2: 60-day Federal Register Notice

Attachment 2a: 60-day Federal Register Notice Response to Comments

Attachment 3: Evaluation Questions and Data Sources for Intensive Evaluation

Attachment 4: State Education Agency Key Informant Interview Guide and Participant Consent

Attachment 5: Local Education Agency Key Informant Interview Guide and Participant Consent

Attachment 6: School Key Informant Interview Guide and Participant Consent

Attachment 7: State Education Agency Web-Based Implementation Survey and Participant Consent

Attachment 8: Local Education Agency Web-Based Implementation Survey and Participant Consent

Attachment 9: School Web-Based Implementation Survey and Participant Consent

Attachment 10: ICF Institutional Review Board Approval Memorandum




  1. Justification

A.1 Circumstances Making the Collection of Information Necessary

School-based programs designed to create healthier nutrition environments, improve the quality of physical education and physical activity, and manage chronic health conditions in schools have long-lasting health impacts on students.1,2,3 The Centers for Disease Control and Prevention’s (CDC) School Health Branch (SHB) requests a 3-year OMB approval to conduct a new information collection entitled DP18-1801 Healthy Schools (DP18-1801) Program Evaluation. The DP18-1801 Healthy Schools Program builds upon previous CDC efforts designed to enhance the capacity of state education agencies (SEAs) to adopt and implement evidence-based policies, practices, and programs that support health among the nation’s youth. The purpose of the DP18-1801 Healthy Schools Program is to: (1) increase the number of students who consume nutritious food and beverages (i.e., those aligned with the Dietary Guidelines for Americans); (2) increase the number of students who participate in daily physical education and physical activity; and (3) increase the number of students who can effectively manage their chronic health conditions.

Seventeen SEAs were funded in 2018 to build school health infrastructure and capacity, and to provide professional development, training opportunities, and technical assistance to local education agencies (LEAs) and schools to support the development and implementation of policies and practices with the goal of improving student health and academic achievement. To minimize burden on the award recipients, reporting requirements for funded states are limited.

The 1801 program has a total of eight performance measures. Seven of these measures are collected through surveillance systems: the School Health Profiles survey, conducted in even-numbered years, and the Youth Risk Behavior Survey, conducted in odd-numbered years. These systems capture information about the school health practices and programs that are in place (School Health Profiles) as well as the dietary and physical activity behaviors of students (Youth Risk Behavior Survey). While states are required to support the collection of these data, CDC is responsible for the analysis and reporting of those measures. Additionally, states are responsible for submitting annual evaluation reports to CDC. These reports reflect the evaluation priorities of each state and are not prescriptive in nature. No data are collected or reported by awardees on the implementation processes or strategies that states, districts, and schools are using to improve school nutrition, school-based physical education and activity, and school health services to support students with chronic health conditions. Therefore, the only way to assess implementation of the DP18-1801 Healthy Schools Program is to conduct separate evaluation activities beyond those required of awardees.

CDC is authorized to collect the data described in this request by Sections 301(a) and 317(k)(2) of the Public Health Service Act [42 U.S.C. Sections 241(a) and 247b(k)(2)], as amended. A copy of this enabling legislation is provided in Attachment 1. CDC contracted with ICF (a public health consulting company) to plan and lead this implementation and impact evaluation (hereafter referred to as the evaluation).

This process and impact evaluation has a multisite, longitudinal, embedded case study design, focusing on three 1801 state grantees and a subset of their targeted LEAs and schools. The evaluation will assess implementation of strategies and activities at the state, local, and school levels and their integration across levels; fidelity of implementation; implementation facilitators and barriers; and contributions of national and state level technical assistance (TA) towards program achievements.

A.2 Purpose and Use of Information Collection

The purpose of this evaluation is to inform the Division of Population Health/School Health Branch and its stakeholders about implementation of the DP18-1801 program within SEAs, LEAs, and schools, and to improve the quality of programming implemented to enhance nutrition, physical activity, and management of chronic health conditions in schools.

The specific aims of this evaluation are as follows:

  1. Identify implementation strategies at the SEA, LEA, and school level as well as barriers and facilitators of the DP18-1801 program implementation to inform program improvement.

  2. Identify the extent to which SEAs developed a strong school health infrastructure throughout the state and among LEAs and schools.

  3. Identify the extent to which SEAs and LEAs supported the development and implementation of school-based health policies and practices.

  4. Identify the extent to which SEAs and LEAs provided quality professional development, training, and technical assistance to LEA and school staff.

  5. Identify the extent to which SEAs increased students’ healthful behaviors and improved students’ management of chronic health conditions.

This evaluation will examine implementation strategies; areas of program strength and areas in need of refinement; and barriers and facilitators to implementation among a subset of grantees. In addition, this evaluation will demonstrate the impact of the DP18-1801 program on SEAs’ and LEAs’ provision of training and technical assistance (TA), on schools’ implementation of health policies and practices, and on students’ behaviors. The results will allow the SHB to make recommendations for improving program implementation, to set priorities for future funding and research, and to make policy decisions. SEAs, LEAs, and schools may use the results to improve their programs and practices.

Attachment 3 shows the evaluation questions along with the corresponding data collection methods, respondent types, and year of data collection activities. Data collection will involve SEA, LEA, and school personnel, and will include: (1) key informant interviews (KII) (KII guides in Attachments 4, 5, and 6), and (2) Web-based surveys (Attachments 7, 8, and 9).

Primary data will be collected at two time points from the same cohort of SEAs, LEAs, and schools during the fall semester of 2020 (Program Year 2) and 2022 (Program Year 4). Exhibit 1 summarizes the proposed timing of data collection for each type of data collection activity.


Exhibit 1. Data Collection Method and Timing for Data Collection

Data Collection Method | Program Year 2 | Program Year 3 | Program Year 4
SEA staff key informant interviews (KIIs) | X |  | X
LEA staff KIIs | X |  | X
School staff KIIs | X |  | X
SEA, LEA, and school Implementation Survey | X |  | X

A.3 Use of Improved Information Technology and Burden Reduction

All surveys will be web-based and hosted on SurveyMonkey®. The instruments were developed and tested for electronic administration, which reduces the burden on respondents to securely access, complete (e.g., through appropriately programmed skip patterns), and submit the surveys. To meet the needs of all respondents, hardcopy surveys, consent forms, and self-addressed envelopes will be available upon request. The web-based format will also facilitate rapid tabulation of the data, which in turn will allow interviewers to review survey findings prior to conducting the interviews and avoid duplicative collection of information.

A.4 Efforts to Identify Duplication and Use of Similar Information

The data collection activities proposed in this Information Collection Request do not duplicate existing efforts. Neither the web-based surveys nor the interview guides duplicate other survey efforts or program monitoring activities associated with this or similar programs. There are no existing data collected by SEAs or LEAs that could be used to generate information similar to that collected under this ICR.

A.5 Impact on Small Businesses or other Small Entities

The planned data collection does not involve small businesses or other small entities.

A.6 Consequences of Collecting the Information Less Frequently

This is a one-time evaluation with two waves of data collection. Two time points are necessary to assess change and progress over time in the implementation activities occurring at the state, district, and school levels. If data were collected only once, they would provide only a single snapshot of information, with no indication of how school environments changed over the course of the program. Without this study, CDC would not be able to effectively assess implementation of the DP18-1801 program and its impact on schools and students’ health. CDC would also lack the information needed to implement program improvements and corrective actions.

A.7 Special Circumstances Relating to the Guidelines of 5 CFR 1320.5

There are no special circumstances. The activities outlined in this package fully comply with the guidelines of 5 CFR 1320.5.

A.8 Comments in Response to the Federal Register Notice and Efforts to Consult Outside the Agency

A.8.a Federal Register Announcement

As required by 5 CFR 1320.8(d), CDC published a 60-day Notice in the Federal Register on July 25, 2019 (Vol. 84, No. 143, pages 35863-35864; see Attachment 2). One public comment was received on 8/08/2019 and posted on 8/13/2019. The comment recommended that all information be made public immediately. CDC determined that the comment was not substantive enough to require a formal response, as the evaluation plan is publicly available through the Federal Register Notice (see Attachment 2a).

A.8.b Consultation with Various User Communities and Experts

The DP18-1801 Program Evaluation team consulted with CDC staff when developing the study design and data collection instruments. Exhibit 2 provides information about the CDC subject matter experts.

Exhibit 2. CDC Staff* Consulted for the DP18-1801 Evaluation

Name | Contact Information
Sarah M. Lee, PhD | Phone: 770-488-6126; [email protected]
Seraphine Pitt Barnes, PhD | Phone: 770-488-6115; s[email protected]
Adina Cooper | [email protected]
Melissa Fahrenbruch | [email protected]
Project Officers |
Bridget Borgogna | [email protected]
Chris Kissler | [email protected]
Jyotsna Blackwell | [email protected]
Patricia Patrick | [email protected]
Trevor Newby | [email protected]

* All staff are from CDC’s Division of Population Health, School Health Branch.

A.9 Explanation of Any Payment or Gift to Respondents

Obtaining high response rates is critical to the rigor of the evaluation. To encourage participation, data collection procedures are designed to be low burden, and modest incentives will be offered to LEA and school staff to compensate them for the time and effort required to respond to evaluation activities. Exhibit 3 describes the incentive plan by respondent type and data collection method. ICF will manage and distribute the incentives to participants. No incentives will be provided to SEA participants. Each LEA participant will receive $25.00 (12 units per period) for participation in a key informant interview (KII) and $20.00 (30 units per period) for participation in the web-based survey. Each school staff member will likewise receive $25.00 (54 units per period) for participation in a KII and $20.00 (210 units per period) for participation in the web-based survey. Each school participating in the evaluation (18 units per period) will receive a school supply credit of $100.00. Additionally, all schools that complete the web-based survey by the deadline will be entered into a drawing for a school supply credit of $500.00 (2 units per period).
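As a quick arithmetic check (illustrative only, not part of the evaluation protocol), the per-period and two-period totals in Exhibit 3 follow directly from the unit costs and unit counts listed above; a minimal Python sketch:

```python
# Illustrative check of the Exhibit 3 incentive budget (not part of the evaluation):
# sum of (unit cost x units per period), then doubled for the two collection periods.
incentives = [
    ("LEA key informant interview gift card",     25.00,  12),
    ("LEA implementation survey gift card",       20.00,  30),
    ("School participation supply credit",       100.00,  18),
    ("School key informant interview gift card",  25.00,  54),
    ("School implementation survey gift card",    20.00, 210),
    ("School supply-credit raffle",              500.00,   2),
]

per_period = sum(unit_cost * units for _, unit_cost, units in incentives)
print(f"${per_period:,.2f} per period")        # $9,250.00, matching Exhibit 3
print(f"${per_period * 2:,.2f} both periods")  # $18,500.00
```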

Exhibit 3. Summary of Participation Incentives 

Activity | Incentive | Recipient | Unit Cost | Units Per Period | Total Cost Per Period | Total Cost (Period x 2)
LEA Level |  |  |  |  |  | 
LEA key informant interviews | Gift card | Respondent | $25.00 | 12 | $300.00 | $600.00
LEA Implementation Survey | Gift card | Respondent | $20.00 | 30 | $600.00 | $1,200.00
School Level |  |  |  |  |  | 
School participation | Supply credit | School | $100.00 | 18 | $1,800.00 | $3,600.00
School key informant interviews | Gift card | Respondent | $25.00 | 54 | $1,350.00 | $2,700.00
School Implementation Survey | Gift card | Respondent | $20.00 | 210 | $4,200.00 | $8,400.00
Raffle for supply credit | Supply credit | School | $500.00 | 2 | $1,000.00 | $2,000.00
Total |  |  |  |  | $9,250.00 | $18,500.00

A.10 Protection of the Privacy and Confidentiality of Information Provided by Respondents

This evaluation does not collect sensitive or otherwise personal information from participants. Only the name and work email address of the individuals responding to the web-based surveys and participating in the interviews will be collected. No personal information beyond name and work email address will be collected about the individuals providing programmatic data.

  1. Privacy Act Determination. CDC and the contractor, ICF, do NOT intend to retrieve or file information by personal identifier, so the Privacy Act does not apply. All data will be collected at the institution (SEA, LEA, school) level. Although the name of the contact person submitting survey data and/or participating in interviews is maintained for each responding organization, the contact person provides information about program implementation and no personal information other than his or her official role. The contact person's name and email address will be maintained until the end of data collection and will be used to send the link to the web-based survey and to schedule the interviews. After data collection is complete, names and email addresses will be deleted and replaced by the name of the SEA, LEA, or school for which the respondent works. Responses, all of which pertain to programmatic activities, will be linked only to the name of the institution (i.e., SEA, LEA, or school), never to individual respondents.

  2. Safeguards. The information collection involves the use of web-based data collection methods. The survey website does use cookies, and access to the web-based survey is possible only through a unique link provided to the SEA, LEA, or school staff member who will complete the survey. ICF will maintain information in secure electronic files that will be accessible only to authorized members of the evaluation team. Electronic files will be stored on secure network servers, and access will be restricted to approved team members identified by user ID and password. Respondent names will NOT be linked to any data collected. Names collected for communication via email will be kept separate from the data, in a different password-protected file on the secure server. The same is true for email addresses.

  3. Consent. Consent forms will include the following: 1) the description and purpose of the data collection, 2) the voluntary nature of participation, including the ability to stop/skip questions at any time, 3) the risks and benefits of participation, 4) the gift for participation and 5) the contact information of the principal investigator.

Interview Consent. The interview consent will be emailed prior to the interview, and the interviewer will also read the consent aloud before starting the interview (Attachments 4, 5, and 6). The interviewer will ask the interviewee to respond verbally to indicate whether he/she agrees to participate.

Web-Survey Consent. The survey consent page will appear when the respondent first opens the survey link (Attachments 7, 8, and 9). If the respondent agrees to participate, consent is actively conferred by selecting the “Next” button to start the survey.

  4. Nature of Response. Participation is voluntary, and participants can discontinue participation at any time.

A.11 Institutional Review Board and Justification for Sensitive Questions

ICF’s Institutional Review Board reviewed the description and supporting materials submitted for the DP18-1801 Healthy Schools Program evaluation and determined that the activities qualify for exemption under 45 CFR 46.104(d)(2) (Attachment 10): the evaluation involves only surveys and interviews; identifiers are collected, but adequate privacy protections are in place; and disclosure would not place subjects at risk. No sensitive questions are included in the data collection instruments.

A.12 Estimates of Annualized Burden Hours and Costs

Estimated Annualized Burden Hours

Respondents will participate in surveys and interviews one time in each data collection period (i.e., wave). One staff member from each of the three SEAs, up to 10 LEAs per state (up to 30 total), and up to 210 schools (across all three states) will be recruited to respond to the web-based survey. In addition, a total of nine SEA staff, 12 LEA staff, and 54 school staff will be invited to participate in interviews. The length of each data collection activity was estimated based on internal pilot testing of the instruments. The estimated annualized burden is 398 hours per data collection year (Exhibit 4).
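As an illustrative check (not part of the collection itself), the per-wave total in Exhibit 4 follows from the respondent counts above and the estimated 1.25-hour average burden per response; a minimal Python sketch:

```python
# Illustrative check of the per-wave burden total in Exhibit 4 (not part of the collection):
# each respondent responds once per wave, at an estimated 1.25 hours per response.
respondents_per_wave = {
    "SEA web-survey": 3,
    "SEA key-informant interview": 9,
    "LEA web-survey": 30,
    "LEA key-informant interview": 12,
    "School web-survey": 210,
    "School key-informant interview": 54,
}
AVG_BURDEN_HOURS = 1.25

total_hours = sum(n * AVG_BURDEN_HOURS for n in respondents_per_wave.values())
print(total_hours)         # 397.5
print(round(total_hours))  # 398, the per-wave total reported in Exhibit 4
```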

Exhibit 4. Estimated Annualized Burden Hours

Type of Respondents | Form Name | No. of Respondents | No. of Responses per Respondent | Average Burden Per Response (in hours) | Total Burden (in hours)
2020 Data Collection (1st Wave) |  |  |  |  | 
SEA staff | Web-Survey | 3 | 1 | 1.25 | 3.75
SEA staff | Key-Informant Interview | 9 | 1 | 1.25 | 11.25
LEA staff | Web-Survey | 30 | 1 | 1.25 | 37.5
LEA staff | Key-Informant Interview | 12 | 1 | 1.25 | 15
School staff | Web-Survey | 210 | 1 | 1.25 | 262.5
School staff | Key-Informant Interview | 54 | 1 | 1.25 | 67.5
Total 1st Wave |  |  |  |  | 398
2022 Data Collection (2nd Wave) |  |  |  |  | 
SEA staff | Web-Survey | 3 | 1 | 1.25 | 3.75
SEA staff | Key-Informant Interview | 9 | 1 | 1.25 | 11.25
LEA staff | Web-Survey | 30 | 1 | 1.25 | 37.5
LEA staff | Key-Informant Interview | 12 | 1 | 1.25 | 15
School staff | Web-Survey | 210 | 1 | 1.25 | 262.5
School staff | Key-Informant Interview | 54 | 1 | 1.25 | 67.5
Total 2nd Wave |  |  |  |  | 398
Total Both Waves |  |  |  |  | 796


Annualized Costs to Respondents

Cost estimates for the SEA and LEA respondents are based on the average hourly wage for “Managers, All Other” reported by the Bureau of Labor Statistics for May 2018.1 The resulting estimates are $41.14 an hour for SEA staff and $43.27 an hour for LEA staff. Cost estimates for school respondents are based on the average hourly wage for “Elementary School Teachers” reported by the Bureau of Labor Statistics for May 2018.2 The resulting estimate is $29.90 an hour for school staff. Exhibit 5 presents the calculations for the estimated cost of respondent hours. The annual cost to respondents is estimated to be $12,766.90.

Exhibit 5. Estimated Annual Cost to Respondents

Type of Respondents | Form Name | Total Burden Hours | Average Hourly Wage Rate ($) | Estimated Cost ($)
2020 Data Collection (1st Wave) |  |  |  | 
SEA staff | Web-Survey | 3.75 | 41.1 | 154.1
SEA staff | Key-Informant Interview | 11.25 | 41.1 | 462.4
LEA staff | Web-Survey | 37.5 | 43.3 | 1,623.8
LEA staff | Key-Informant Interview | 15 | 43.3 | 659.5
School staff | Web-Survey | 262.5 | 29.9 | 7,848.8
School staff | Key-Informant Interview | 67.5 | 29.9 | 2,018.3
Total 1st Wave ($) |  |  |  | 12,766.9
2022 Data Collection (2nd Wave) |  |  |  | 
SEA staff | Web-Survey | 3.75 | 41.1 | 154.1
SEA staff | Key-Informant Interview | 11.25 | 41.1 | 462.4
LEA staff | Web-Survey | 37.5 | 43.3 | 1,623.8
LEA staff | Key-Informant Interview | 15 | 43.3 | 659.5
School staff | Web-Survey | 262.5 | 29.9 | 7,848.8
School staff | Key-Informant Interview | 67.5 | 29.9 | 2,018.3
Total 2nd Wave ($) |  |  |  | 12,766.9










A.13 Estimates of Other Total Annual Cost Burden to Respondents or Record Keepers

No capital, start-up, or maintenance costs are involved.

A.14 Annualized Cost to the Government

This evaluation is one component of a larger project funded under Contract No. 200-2014-61102. The portion of the ICF contract that covers this evaluation is $849,482 over 5 years. To estimate the annual cost of the project to the federal government, we subtracted the cost of the base year of the contract, which covers evaluation planning and design (no data collection or analysis takes place in the base year), leaving $747,806 for the remaining portion of the evaluation. We allocated 10% of this remaining contract cost to recruitment, 40% to data collection, 30% to data management and analysis, and 20% to reporting and dissemination, and divided these totals by the 4 remaining years of the contract to arrive at an average annualized cost for each activity (see Exhibit 6). Thus, the annualized contract cost is $186,951.50.

Additional costs will be incurred indirectly by the government for personnel involved in overseeing the study and conducting data analysis. It is estimated that 4 CDC employees will be involved for approximately 20%, 20%, 15%, and 15% of their time, at salaries of $63.38, $53.80, $53.80, and $63.38 per hour, respectively. The direct cost in CDC staff time will be approximately $85,310.32 annually. The total cost of the study over a 36-month period, including the contract cost and federal government personnel cost, is $816,785.40. The annualized cost to the government for the study will be $272,261.80.
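As an illustrative reconstruction of the Exhibit 6 figures (not the official estimate), the Python sketch below assumes a standard 2,080-hour work year for the CDC staff calculation; small differences from the exhibit reflect rounding of the hourly rates.

```python
# Illustrative reconstruction of the Exhibit 6 figures (not the official estimate).
# Assumes a standard 2,080-hour work year for CDC staff; small differences from the
# exhibit reflect rounding of the hourly rates.
remaining_contract = 747_806   # contract cost after removing the base (planning) year
shares = {
    "Recruitment": 0.10,
    "Data collection": 0.40,
    "Data management & analysis": 0.30,
    "Reporting & dissemination": 0.20,
}
OPTION_YEARS = 4
contract_annual = {k: remaining_contract * s / OPTION_YEARS for k, s in shares.items()}
contract_subtotal = sum(contract_annual.values())           # 186,951.50

staff = [(0.20, 63.38), (0.20, 53.80), (0.15, 53.80), (0.15, 63.38)]  # (time share, $/hour)
HOURS_PER_YEAR = 2_080
staff_subtotal = sum(share * HOURS_PER_YEAR * rate for share, rate in staff)  # ~85,307

print(f"{contract_subtotal:,.2f}")                   # 186,951.50
print(f"{contract_subtotal + staff_subtotal:,.2f}")  # ~272,258, vs. 272,261.80 in Exhibit 6
```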

Exhibit 6. Itemized Annual Cost to the Federal Government

Activity | Annualized Cost to the Government ($), averaged across option years 1-4
Contract Costs | 
Recruitment (SEAs, LEAs, schools) | 18,695.15
Data collection | 74,780.60
Data Management & Analysis | 56,085.45
Reporting & Dissemination | 37,390.30
Subtotal | 186,951.50
CDC Staff Cost | 
Federal Employee Time Cost – at 20% | 26,367.09
Federal Employee Time Cost – at 15% | 19,775.32
Federal Employee Time Cost – at 20% | 22,381.66
Federal Employee Time Cost – at 15% | 16,786.25
Subtotal | 85,310.32
Average Annualized Cost | 272,261.80

A.15 Explanation for Program Changes or Adjustments

None.

A.16 Plans for Tabulation and Publication and Project Time Schedule

Tabulation

All data collection activities will result in new data sets that can be used in analyses for each of the instruments and measures administered to respondents. The plan for data preparation and management involves the following:

  • Assessment of nonresponse or missing data

  • Performance of data quality checks: verifying completeness and accuracy, checking validity, and examining summary statistics

  • Implementation of procedures to address any data quality issues

Quantitative Data

SPSS or STATA will be used for all quantitative analyses. ICF will select appropriate quantitative methods on the basis of the research question, type of outcome variable assessed (i.e., nominal, ordinal, interval, or ratio), completeness of the data, and the need to introduce covariates into the analysis. We will analyze performance and surveillance data obtained from baseline to year 5 of the grant to identify changes in reach and outcomes over time, primarily using tests of proportions or generalized estimating equations for dichotomous outcome variables.
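For illustration only, the sketch below shows a two-proportion test of the kind described above applied to a hypothetical dichotomous outcome; the counts are invented, and the evaluation itself will use SPSS or Stata for the actual analyses.

```python
# Hypothetical illustration of a test of proportions for a dichotomous outcome
# (e.g., share of schools reporting a given practice) at wave 1 vs. wave 2.
# Counts are invented; the evaluation will use SPSS or Stata for the actual analyses.
from statsmodels.stats.proportion import proportions_ztest

schools_with_practice = [120, 150]   # hypothetical "yes" counts at waves 1 and 2
schools_responding = [210, 210]      # schools responding in each wave

z_stat, p_value = proportions_ztest(schools_with_practice, schools_responding)
print(f"z = {z_stat:.2f}, p = {p_value:.4f}")
```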

Qualitative Data

Recordings from all interviews will be professionally transcribed and uploaded into MAXQDA qualitative analysis software. ICF will develop a thematic codebook with deductive codes derived from the evaluation questions and the questions in each interview guide. We will apply the thematic codes to all relevant narrative text segments, add relevant inductive codes as they arise, and create code reports when coding is complete. Coding will be conducted by a two-person team after training and practice coding to establish at least 80% inter-coder reliability; the team will then code the data in MAXQDA in preparation for thematic analysis. We will review the coded data to identify themes and patterns related to implementation that are common within and across sites, as well as potential outliers specific to an individual site.
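For illustration only, the sketch below shows one simple way to compute percent agreement between two coders, assuming each coder records whether a given thematic code applies to each transcript segment; the data shown are hypothetical.

```python
# Hypothetical illustration of a simple percent-agreement check between two coders,
# assuming each coder records whether a given thematic code applies to each segment.
coder_a = [1, 0, 1, 1, 0, 1, 0, 0, 1, 1]   # 1 = code applied, 0 = not applied
coder_b = [1, 0, 1, 0, 0, 1, 0, 1, 1, 1]

agreement = sum(a == b for a, b in zip(coder_a, coder_b)) / len(coder_a)
print(f"Inter-coder agreement: {agreement:.0%}")   # 80%, the threshold set for this evaluation
```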

Publication

The results from this evaluation will be synthesized into a project report and a scientific manuscript for publication in a peer-reviewed journal.

Project Time Schedule

A three-year clearance is being requested. Exhibit 7 provides a detailed list of the activities and the time schedule for implementing this evaluation.

Exhibit 7. Project Activities Time Schedule

Activity | Apx. Months After OMB Approval | Optimal Dates
2020 Wave 1 – Program Year 2 |  | 
Recruit SEAs and schedule interviews and survey deployment | Prior to approval* | September 2019
Recruit LEAs and schedule interviews and survey deployment | Upon approval | July 2020
Recruit schools and schedule interviews and survey deployment | 1 month | August 2020
Collect data | 1–4 months | August–November 2020
Clean and analyze data | 3–5 months | October–December 2020
Write Wave 1 report | 5 months | December 2020
2022 Wave 2 – Program Year 4 |  | 
Schedule second interviews and survey deployment with SEAs | 16 months | November 2021
Schedule second interviews and survey deployment with LEAs | 18 months | January 2022
Schedule second interviews and survey deployment with schools | 18 months | January 2022
Collect data | 18–22 months | January–May 2022
Clean and analyze data | 22–24 months | May–July 2022
Write Wave 2 report | 25 months | August 2022
Write final report | N/A** | August 2023
Write manuscript | N/A** | August 2023

* Total number of participants does not exceed 9; ** Outside of the 36-month clearance period

A.17 Reason(s) Display of OMB Expiration Date is Inappropriate

Not applicable. All data collection instruments will display the expiration date for OMB approval of the information collection.

A.18 Exceptions to Certification for Paperwork Reduction Act Submissions

Not applicable. There are no exceptions to the certification.



References

  1. Murray, N. G., Low, B. J., Hollis, C., Cross, A. W., & Davis, S. M. (2007). Coordinated school health programs and academic achievement: A systematic review of the literature. Journal of School Health, 77(9), 589–600. https://doi.org/10.1111/j.1746-1561.2007.00238.x.

  2. Rasberry, C. N., Slade, S., Lohrmann, D. K., & Valois, R. F. (2015). Lessons Learned From the Whole Child and Coordinated School Health Approaches. Journal of School Health, 85(11), 759–765. https://doi.org/10.1111/josh.12307.

  3. Chiang, R. J., Meagher, W., & Slade, S. (2015). How the Whole School, Whole Community, Whole Child Model Works: Creating Greater Alignment, Integration, and Collaboration Between Health and Education. Journal of School Health, 85(11), 775–784. https://doi.org/10.1111/josh.12308.

  4. Bureau of Labor Statistics. (2017, May). Occupational Employment and Wages. http://www.bls.gov/oes/current/oes119151.htm. Accessed February 15, 2019.


1 Bureau of Labor Statistics. Occupational Employment and Wages, May 2018. https://www.bls.gov/oes/current/oes119199.htm. Accessed March, 2019.


2 Bureau of Labor Statistics. Occupational Employment and Wages, May 2018. https://www.bls.gov/oes/current/oes252021.htm. Accessed March, 2019. Annual salary was divided by 2,080 hours which is the standard used by the Bureau of Labor Statistics for calculating hourly wages.


