
School Health “Water Toolkit”: Assessing Awareness, Satisfaction, and Utility


OSTLTS Generic Information Collection Request

OMB No. 0920-0879





Supporting Statement – Section B







Submitted: 8/11/14








Program Official/Project Officer

Sarah Sliwa

School Health Branch/Division of Population Health

National Center for Chronic Disease Prevention and Health Promotion

Centers for Disease Control and Prevention

4770 Buford Hwy, NE, MS E-70, Atlanta, GA 30341

cubicle 6221.4, mailstop F78

Phone: 770-488-0946

Fax: 404-488-5771

[email protected]


Section B – Information Collection Procedures


  1. Respondent Universe and Sampling Methods

Data will be collected through a two-phased assessment administered to the respondent population. The respondent universe for both phases consists of 85 State Department of Public Health staff and 6 State Department of Education staff acting in their official capacities to implement strategies to support healthier school nutrition environments; together, these staff cover all 50 states and the District of Columbia. These staff are State Public Health Actions to Prevent and Control Diabetes, Heart Disease, Obesity and Associated Risk Factors and Promote School Health/1305 (“1305/State Public Health Actions”) grantees funded by the Centers for Disease Control and Prevention (CDC).


The universe includes state health officers, public health information officers, school health coordinators, and directors of chronic disease prevention sections of state health departments, as well as school health coordinators and similar positions based in state health and education departments. Implementing strategies to support physical activity, nutrition, and coordinated management of chronic conditions in school settings is the responsibility of both health department and public school staff. Staffing configurations vary among the 1305/State Public Health Actions grantees: in some states, the primary contact for the implementation of school-based strategies is based in the health department, and in others the main contact is in the Department of Education or Public Instruction. To reliably reach those leading the implementation of activities to improve students’ access to drinking water, the respondent universe includes personnel from both settings.


The universe comprises the individuals responsible for the School Health Strategies, as identified through the primary staff contact list provided by the CDC program officers for 1305/State Public Health Actions: 85 officials from Departments of Health and 6 from Departments of Education or Public Instruction across the 50 states and the District of Columbia. No sampling procedures are required because everyone in the respondent universe will be asked to participate in the assessment.


  2. Procedures for the Collection of Information

Data will be collected through a two-phased assessment administered to the respondent population described above. The online assessments have been developed using Survey Monkey, a commercial off-the-shelf software application that is customizable to include skip patterns, which will ensure that questions are relevant to respondents based on previous responses, and will reduce the response burden.


Awareness and Planning Assessment Instrument (Phase 1: September 2014):

An email notification will be sent to the respondent universe with a link to the online assessment tool and instructions for completing it. The notification email (see Attachment F—Notification Email) will explain:

  • Duration of survey period: The assessment will remain open for three weeks (15 business days) to allow ample time for respondents to complete it.

  • Instructions for completing the survey: Respondents are encouraged to complete the assessment in one sitting, but may complete the assessment in multiple sessions, if necessary.

  • Considerations for respondents’ privacy: The introduction to the survey describes how the data will only be reported in aggregate and without any information that could be used to identify individual respondents.

A reminder email will be sent to non-respondents in the second week (8th business day) of the assessment period (see Attachment H—Reminder Email Week 1), and a final reminder will be sent the day before the end of the assessment period (14th business day) (see Attachment I—Reminder Email Week 2).


Utilization Feedback Tool (Phase 2: April/May 2015):

An email notification will be sent to the same respondent universe from the phase 1 assessment. Respondents will receive a link to the online assessment tool and instructions for completing it. The notification email (see Attachment J—Notification Email) will explain:

  • Duration of survey period: The assessment will remain open for three weeks (15 business days) to allow ample time for respondents to complete it.

  • Instructions for completing the survey: Respondents are encouraged to complete the assessment in one sitting, but may complete the assessment in multiple sessions, if necessary.

  • Considerations for respondents’ privacy: The introduction to the survey describes how the data will only be reported in aggregate and without any information that could be used to identify individual respondents.

A reminder email will be sent to non-respondents in the second week (8th business day) of the assessment period (see Attachment K—Reminder Email Week 1) and a second/final reminder will be sent the day before the assessment ends (14th business day) (see Attachment L—Reminder Email Week 2).


After data collection is complete, data will be exported from Survey Monkey into an Excel spreadsheet. Data will be reviewed for completeness, and simple descriptive statistics (response frequencies) will be computed. Depending on the response distribution, frequencies may be cross-tabulated to identify similarities and differences between respondents based at a department of health and those based at a department of education.
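The planned analysis (response frequencies, optionally cross-tabulated by agency type) can be sketched in a few lines. The sample responses and item names below are hypothetical stand-ins; the actual data would come from the Survey Monkey export:

```python
from collections import Counter

# Hypothetical exported responses: (agency type, answer to an awareness item).
# In practice these would be read from the Survey Monkey export file.
responses = [
    ("health", "yes"), ("health", "no"), ("education", "yes"),
    ("health", "yes"), ("education", "no"),
]

# Simple response frequencies for the item.
freq = Counter(answer for _, answer in responses)

# Cross-tabulation: answer frequencies within each agency type
# (health department vs. education department).
xtab = {}
for agency, answer in responses:
    xtab.setdefault(agency, Counter())[answer] += 1

print(freq)        # overall frequencies
print(dict(xtab))  # frequencies broken out by agency type
```

The same counts could equally be produced in Excel with pivot tables or in a statistics package; the logic (tally overall, then tally within each agency group) is the same.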


  3. Methods to Maximize Response Rates and Deal with Nonresponse

Although participation in the assessment is voluntary, the project team will make every effort to maximize the rate of response. The two tools were designed with particular focus on providing multiple choice options instead of narrative responses, and streamlining questions to allow for skipping questions based on responses to previous questions, thereby minimizing response burden.




Awareness and Planning Assessment Instrument:

Reminder emails (see Attachments H, I) will be sent to those who have not completed the assessment at two points during the assessment period (business days 8 and 14).


Utilization Feedback Tool:

Reminder emails (see Attachments K, L) will be sent to those who have not completed the assessment at two points during the assessment period (business days 8 and 14).


  4. Test of Procedures or Methods to be Undertaken

The Word version of the questionnaire was revised and refined with four individuals at CDC. The online version of the assessment tool was pilot tested by CDC employees who were familiar with the “Water Toolkit” or had experience working with state health department staff in the respondent universe. Pilot test participants were not involved in the development of the instrument, nor are they part of the respondent population. A pilot test questionnaire was developed to standardize feedback received from the pilot test participants. Pilot participants were asked to provide feedback on content (question wording and comprehensiveness of questions), layout (look and feel), and length. In addition, the assessment team members were assigned hypothetical roles to ensure the longest possible navigation pattern. Subsequent tests were conducted to ensure the skip patterns function as intended. The assessment instrument was refined based on feedback received from the pilot test of the entire survey (i.e., with no skip patterns initiated).


The estimates for burden hours are based on pilot tests of the assessment instruments.


Awareness and Planning Assessment Instrument:

In the pilot test, the average time to complete the instrument including time for reviewing instructions, gathering needed information and completing the instrument, was approximately 10 minutes. Based on these results, the estimated time range for actual respondents to complete the instrument is 7-12 minutes. For the purposes of estimating burden hours, the upper limit of this range (i.e., 12 minutes) is used.


Utilization Feedback Tool:

In the pilot test, the average time to complete the instrument, including time for reviewing instructions, gathering needed information, and completing the instrument, was approximately 7.5 minutes. Based on these results, the estimated time range for actual respondents to complete the instrument is 5-11 minutes. For the purposes of estimating burden hours, the upper limit of this range (i.e., 11 minutes) is used.
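As an illustration, total burden hours for each phase can be derived from these upper-limit estimates and the respondent universe of 91 (85 health department plus 6 education department staff). This is an illustrative calculation only; the official burden table would govern:

```python
# Respondent universe: 85 health department + 6 education department staff.
respondents = 85 + 6  # 91

# Upper-limit per-response times from the pilot tests (minutes).
phase1_minutes = respondents * 12  # Awareness and Planning Assessment
phase2_minutes = respondents * 11  # Utilization Feedback Tool

# Convert total minutes to burden hours.
phase1_hours = phase1_minutes / 60  # 18.2 hours
phase2_hours = phase2_minutes / 60  # ~16.7 hours

print(round(phase1_hours, 1), round(phase2_hours, 1))
```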





  5. Individuals Consulted on Statistical Aspects and Individuals Collecting and/or Analyzing Data


Sarah Lee
Health Scientist
School Health Branch
DPH/NCCDPHP
770-488-6126
[email protected]

Sarah Sliwa
ORISE Fellow
School Health Branch
DPH/NCCDPHP
770-488-0946
[email protected]




LIST OF ATTACHMENTS – Section B

Note: Attachments are included as separate files as instructed.


  F. Assessment Instrument Notification Email

  H. Assessment Instrument Reminder Email Week 1

  I. Assessment Instrument Reminder Email Week 2

  J. Feedback Tool Notification Email

  K. Feedback Tool Reminder Email Week 1

  L. Feedback Tool Reminder Email Week 2

