“Assessing Education Agency Staff Perceptions of School Climate and Youth Access to Services”
OMB #0920-1048
Expiration Date: 2/28/2018
REVISION
Supporting Statement Part A
January 26, 2018
Supported by:
Centers for Disease Control and Prevention
Catherine Rasberry, PhD
CDC/OID/NCHHSTP, Health Scientist
(404) 718-8170 (phone)
(404) 718-8045 (fax)
Table of Contents
A. 1 Circumstances Making the Collection of Information Necessary
A. 2 Purpose and Use of Information Collection
A. 3 Use of Improved Information Technology and Burden Reduction
A. 4 Efforts to Identify Duplication and Use of Similar Information
A. 5 Impact on Small Businesses or Other Small Entities
A. 6 Consequences of Collecting the Information Less Frequently
A. 7 Special Circumstances Relating to the Guidelines of 5 CFR 1320.5
A. 8 Comments in Response to the Federal Register Notice and Efforts to Consult Outside the Agency
A. 9 Explanation of Any Payment or Gift to Respondents
A. 10 Assurance of Confidentiality Provided to Respondents
A. 11 Justification for Sensitive Questions
A. 12 Estimates of Annualized Burden Hours and Costs
A. 13 Estimates of Other Annual Cost Burden to Respondents or Record Keepers
A. 14 Annualized Cost to Federal Government
A. 15 Explanation for Program Changes or Adjustments
A. 16 Plans for Tabulation and Publication and Project Time Schedule
A. 17 Reason(s) Display of OMB Expiration Date is Inappropriate
A. 18 Exceptions to Certification for Paperwork Reduction Act Submissions
Exhibit 12.1: Estimated Annualized Burden Hours
Exhibit 12.2: Estimated Annualized Burden Costs
Exhibit 14.1: Annualized Cost to the Government
Exhibit 16.1: Project Time Schedule
List of Attachments
Attachment Number | Document Description
1 | Public Health Service Act Legislation
2 | 60 Day FRN
2a | 60 Day FRN Public Comments and Response
3 | MS Word Data Collection Instrument for Broward County Public Schools
4 | MS Word Data Collection Instrument for Los Angeles Unified School District
5 | MS Word Data Collection Instrument for San Francisco Unified School District
6 | Web-based Data Collection Instrument for Broward County Public Schools
7 | Web-based Data Collection Instrument for Los Angeles Unified School District
8 | Web-based Data Collection Instrument for San Francisco Unified School District
9 | School Climate Index Interview Guide for District-Level Administrators
10 | School Climate Index Interview Guide for School-Level Administrators
11 | School Climate Index Interview Guide for School Staff
12 | Organizations and Individuals Providing Consultation on the Information Collection
13 | Contractor’s IRB Approval for Web-based Data Collection Instrument and Protocol
14 | Contractor’s IRB Approval for School Climate Interview Guide
15 | Consent Statement for Web-based Data Collection Instrument for Broward County Public Schools
16 | Consent Statement for Web-based Data Collection Instrument for Los Angeles Unified School District
17 | Consent Statement for Web-based Data Collection Instrument for San Francisco Unified School District
18 | Consent Statement for School Climate Index Interviews
19 | Example Data Analysis Table Shells
Goal: The goal of the project is to assess HIV and STD prevention efforts occurring in three local education agencies (LEAs) funded by the Centers for Disease Control and Prevention (CDC) under cooperative agreement PS13-1308. The LEAs have conducted a variety of program activities to increase student access to key health services (such as HIV/STD testing) and improve school climate. This information collection will allow CDC and participating LEAs to better understand program activities and make recommendations for improving similar work. This ICR will cover the final data collection for a 5-year project.
Intended use of resulting data: The information collection will be used by LEAs to refine their program strategies and by CDC to revise or refine the strategies it recommends LEAs use in future HIV/STD prevention efforts.
Methods: Data will be collected through 2 separate, but complementary, information collections. The first uses a Web-based instrument to collect information from up to 735 LEA employees across the 3 LEAs funded for strategy 4 of PS13-1308 in order to learn about professional development, referral practices, community linkages/partners, school climate, school policies and practices, and staff comfort levels in helping address the needs of youth at risk. The second data collection uses an interview guide to gather information from up to 44 LEA employees (2 district-level staff and up to 6 school-level employees in each of 7 schools) in 1 LEA (Broward County Public Schools in Fort Lauderdale, Florida) to learn about six domains that can impact school climate.
Subpopulation to be studied: LEA and school staff who work in schools participating in the prevention project funded by CDC’s PS13-1308.
Data analysis: Plans for tabulation and publication of data from this information collection include analyzing data for differences in key outcomes between this final data collection and the baseline and mid-point data collections. Findings will be presented in written reports for the LEAs and, possibly, peer-reviewed journals.
Section A: Justification for Information Collection
Background
The Centers for Disease Control and Prevention (CDC) requests a 1-year OMB approval for the revision of the information collection OMB #0920-1048. The information collection uses 2 separate, but complementary, information collections to assess prevention efforts taking place in three local education agencies (LEAs) funded by CDC under PS13-1308: Promoting Adolescent Health through School-Based HIV/STD Prevention and School-Based Surveillance. This revision to the previous OMB approval is intended to cover the final data collection in the series of data collections covered under the previous ICR (OMB #0920-1048). The data collection will provide data and reports for the funded LEAs and will allow the LEAs to identify areas of the program that are working well and other areas that need additional improvement. In addition, the findings will allow CDC to determine whether changes in key outcomes took place following the implementation of currently recommended strategies and to revise those recommendations if necessary. This revision request involves no changes to the instruments, methods, protocols, or burden estimates per cycle of data collection; however, the annualized burden estimates reflect technical changes from the previous approval due to changes in the number of data collections planned and the length of the clearance period requested.
HIV infections remain disproportionately high among certain subgroups of youth,1,2 and sexual risk behaviors associated with HIV, other sexually transmitted diseases (STDs), and pregnancy often emerge in adolescence. For example, 2011 Youth Risk Behavior Surveillance System (YRBSS) data revealed that 41.2% of U.S. high school students reported having had sex, and among those who had sex in the previous three months, 43.1% reported not having used a condom during last sexual intercourse.3 Given the disproportionate risk for HIV among some youth, it is important to find ways to reach these youth with interventions that decrease sexual risk behaviors and increase health-promoting behaviors such as routine HIV testing. Schools provide one opportunity for this.
CDC awarded funds to implement PS13-1308: Promoting Adolescent Health through School-Based HIV/STD Prevention and School-Based Surveillance in order to build the capacity of state and local education agencies and support the efforts of national, non-governmental organizations (NGOs) to help schools develop and implement sustainable adolescent-focused program activities. Within that cooperative agreement, three local education agencies and one national NGO were funded to conduct prevention work focused on reaching youth at the highest risk for HIV/STD. Program activities focus on increasing students’ access (particularly among those at highest risk for HIV and STDs) to key health services and improving school climate. Activities include implementation of referral systems for connecting youth to key services, programs and professional development to improve school climate, social marketing campaigns that address key outcomes, and education of staff on existing policies related to access to care or the school environment.
The assessments included in the initial information collection request were designed to provide baseline and mid-point measurements of several more proximal project outcomes of interest. The current proposed revision to the information collection request is designed to cover the final data collections for the program to provide end-point data on outcomes such as: number of students referred to existing community services, school staff knowledge of community organizations that offer services to youth, and school climate. Results of assessments will identify possible program effects and be used to inform development and refinement of program strategies and approaches for these districts and possibly for similar districts in future projects.
Consistent with the procedures approved under the initial information collection request, data will be collected from LEA employees through 2 separate, but complementary, information collections. The first information collection will involve collecting information from a total of up to 735 LEA employees across 3 LEAs (the 3 LEAs funded for strategy 4 of cooperative agreement PS13-1308) through a Web-based instrument. The instrument will include items that ask education agency staff about professional development, referral practices, community linkages/partners, school climate, school policies and practices, and staff comfort levels in helping address the health needs of youth. This data collection will be the third and final in a series of 3 data collections used to evaluate the 5-year program. Baseline and mid-point data were collected under the previously approved ICR.
The second information collection will be conducted in only 1 LEA (Broward County Public Schools in Fort Lauderdale, Florida) and is designed to provide an in-depth assessment of one LEA as a way to supplement the Web-based data collection with more detailed information. This information collection will involve in-person interviews with up to 44 LEA employees (2 district-level employees, and up to 6 school-level employees in each of 7 schools) to learn about six domains that can impact school climate: policy, practice, programs, professional development, place, and pedagogy. To allow for flexibility in scheduling, interview guides have been constructed to also allow for small group interviews functioning as focus groups. These information collection activities are proposed only for Broward County Public Schools (BCPS) because BCPS expressed interest in receiving assistance to conduct more detailed evaluation activities to examine the activities they implement through the cooperative agreement. This data collection will be the second and final in a series of 2 data collections used to evaluate the school climate impact of the 5-year program. Baseline data were collected under the previously approved ICR.
CDC is authorized to collect the data described in this request by Section 301 of the Public Health Service Act (42 USC 241). A copy of this enabling legislation is provided in Attachment 1. In addition to this legislation, this data collection also supports initiatives such as Healthy People 2020, which provides national health objectives and outlines a comprehensive plan for health promotion and disease prevention in the United States. Of the Healthy People 2020 objectives, 31 align specifically with PS13-1308 activities related to reducing HIV infection, other STDs, and pregnancy among adolescents.
The Privacy Act does not apply because no individually identifiable information will be collected. Any personally identifiable information (PII) provided by the local education agency for the purposes of participant recruitment will remain completely separate from the information gathered through the Web-based instrument and the in-person interviews/focus groups and will be kept secure by the project team.
The information collection system consists of (1) Web-based questionnaires (see Attachments 3-8) and (2) in-person interview/focus group guides (see Attachments 9-11). Both are explained in detail below. These instruments have not changed from those approved under the previous ICR.
Web-based Instrument
The information collection system consists of a Web-based instrument designed to assess education agency staff’s experiences with and perceptions of professional development, referral practices, community linkages and partners, school climate, school policies and practices, and staff comfort levels in helping address the needs of youth. The project team created 3 versions of the Web-based instrument, one for each LEA participating in the project. Each instrument contains the same set of 42 core questions plus an additional set of 4-5 supplemental questions requested by the LEAs. The instrument for Broward County Public Schools includes a maximum of 47 questions, the instrument for Los Angeles Unified School District a maximum of 46 questions, and the instrument for San Francisco Unified School District a maximum of 47 questions. Although the number of questions differs slightly across the 3 instruments (47 questions for 2 LEAs and 46 questions for 1 LEA), the response burden is consistent across all instruments.
The information collection instrument will be administered as a Web-based instrument which uses programmed automatic skip patterns, so it is expected that not all respondents will be presented with all questions. (As a note, skip patterns are noted in the Microsoft Word versions of the instrument for the reviewers of this information collection request, but are not visible to respondents in the Web-based versions that they will use.) Two questions are open ended and all other questions are multiple choice. For several of the multiple-choice questions, respondents have the opportunity to add another response option under the “Other, please specify” answer choice. The instruments will be distributed and data will be collected using the Web-based data collection tool, SurveyMonkey®. The Web-based instruments will collect information on the following:
Respondent’s work characteristics. This information will be collected with 3 multiple-choice questions that ask the name of the respondent’s school, his/her role at the school (e.g., teacher, administrator), and the length of service in his/her role at the agency.
Perceptions of students’ experiences at school related to safety, absenteeism, and bullying. This information will be collected with 15 questions—all multiple choice.
Staff perceptions of and experiences with a referral system for connecting youth to existing services in the community. This information will be collected with 18 questions (16 multiple choice and 2 open-ended). Respondents will be asked about their awareness of a referral protocol in the school as well as the estimated number of referrals they have made to connect youth to existing services. Respondents will also be asked about their awareness of services in the community for youth. The two open-ended questions will allow respondents to write in the names of the organizations they know of in their community that help to serve youth.
Professional development and policies. This information will be collected through 6 questions that assess the topics on which staff have received professional development (2 questions) and the policies they report having at their school (4 questions). All 6 questions are multiple choice.
Supplemental LEA-specific topics of interest. This information will be collected in additional questions on each instrument. On the Web-based data collection instrument for Broward County Public Schools, 5 questions address use of the school climate-related support guide, and staff awareness of a specific policy and curriculum. On the Web-based data collection instrument for Los Angeles Unified School District, 4 questions address referrals and professional development. On the Web-based data collection instrument for San Francisco Unified School District, 5 questions address interactions with school wellness centers and experiences with certain types of bullying. All items are multiple choice.
The information collection instrument was originally pilot tested by 5 individuals (3 teachers or former teachers and 2 members of the project team). Feedback from this group was used to refine questions as needed, ensure accurate programming and skip patterns and establish the estimated time required to complete the information collection instrument. Use of the data collection instrument under the previously approved ICR went as planned and provided support for continued use with the same instrument and methods.
Interview Guide
The information collection system consists of in-person interviews/focus groups designed to assess education agency staff’s experiences with and perceptions of six school-related domains that can impact or influence health and school climate among students: policy, practice, programs, professional development, place (the physical environment of a school), and pedagogy.
The complete interview guide is divided into 3 sections (in essence, 3 separate interview guides), one for each respondent type (i.e., district-level administrator, school-level administrator, and school staff). These sections (3 distinct interview guides) are provided in Attachments 9, 10, and 11 of this package. Each of the 3 sections (or guides) is further divided into 6 domains of interest: (1) policy (part A is district-level policy and part B is school-level policy); (2) practice; (3) programs; (4) professional development (part A is district-level professional development and part B is school-level professional development); (5) place; and (6) pedagogy. In addition, each section of the interview guide includes primary questions and secondary questions. The primary questions are the most essential and will be priority questions for the interviewers. If there is time remaining in the interview/focus group, the interviewer will ask some or all of the secondary questions. For district administrators, there are 10 primary questions and 8 secondary questions; these will be split between 2 respondents. For school administrators, there are 16 primary questions and 9 secondary questions; these will be split between the 2 respondents per school (in 7 schools). For school staff, there are 13 primary questions and 12 secondary questions; these will be asked during group interviews/focus groups with up to 4 respondents per school (in 7 schools). All questions are open-ended.
The interview guide (as a whole, which includes all three sections) will collect information on the following:
Policy. There are 27 questions (7 primary and 5 secondary questions for district administrators; 4 primary and 2 secondary questions for school administrators; 4 primary and 5 secondary questions for school staff).
Practice. There are 23 questions (7 primary and 5 secondary questions for school administrators; 6 primary and 5 secondary questions for school staff).
Programs. There are 3 questions (1 primary and 2 secondary questions for school staff).
Professional development. There are 10 questions (3 primary and 3 secondary questions for district administrators; 2 primary questions for school administrators; 2 primary questions for school staff).
Place. There are 2 questions (2 primary questions for school administrators).
Pedagogy. There are 3 questions (1 primary and 2 secondary questions for school administrators).
The instrument will be administered as a series of in-person interviews/focus groups with up to 44 different education agency employees from the district and 7 schools. The information collection instrument (interview guide) was originally pilot tested with 3 individuals with evaluation and/or school experience. Feedback from this group was used to refine questions as needed and establish the estimated time required to complete the interview/focus group segments. Use of the data collection instrument under the previously approved ICR went as planned and provided support for continued use with the same instrument and methods.
Data gathered from the Web-based data collection instruments and interviews/focus groups will allow the funded education agencies to assess their program activities conducted under PS13-1308. This ICR represents one of three information collections being used to evaluate PS13-1308, which is scheduled to run through July 2018. This ICR covers data collections from school staff; the other two ICRs cover data collection from students in a participating LEA (OMB #0920-1035) and from community partners—specifically, health and wellness center partners—that are working with the funded LEAs (OMB #0920-1084). In combination, the three ICRs allow a robust assessment of outcomes from PS13-1308 from the perspectives of multiple stakeholders—students, school staff, and community partners. Under the revision of this ICR, end-point data from school staff will be collected for analysis with the baseline and mid-point data collected in previous years. Findings will allow the LEAs (and CDC) to determine whether their activities are impacting outcomes such as school climate and staff referrals to existing services for youth.
Data collected under the previously approved ICR have been summarized and provided in reports for the participating LEAs. They have used this information to identify existing strengths, weaknesses, and gaps to shape program development and refine future activities. The previously collected baseline and mid-point data will be used for comparison with the data collected under this proposed ICR to determine program impact. Analysis of data from the Web-based data collection instrument will include frequencies, tests for differences among types of school staff, and comparisons across data collection points in years to come. Analysis of interview/focus group data will involve iterative code development, use of qualitative data analysis software (such as ATLAS.ti or MAXQDA), and identification of major themes within the data. The findings from this information collection also have practical utility to the government because they can impact both the activities used by the CDC-funded LEAs and the strategies and approaches CDC recommends for use in schools.
Without this final data collection, the LEAs would be unable to fully determine if their program activities had the desired impact. In addition, without collecting this data, CDC would have little evidence on several of the innovative strategies being used to enhance prevention efforts in schools.
Web-based Instrument
Questionnaire data will be collected via Web-based instruments allowing respondents to complete and submit their responses electronically. This method was chosen to reduce the overall burden on respondents. The information collection instrument was designed to collect the minimum information necessary for the purposes of this project through the use of automated skip patterns (i.e., limited to a maximum of 47 questions and a possibility of skipping up to 17 questions). In addition, the Web-based administration allows respondents to easily access the data collection instrument at a time and location that is most convenient for them.
This aspect of the information collection system involves using a Web-based information collection instrument. Respondents will be sent a link directing them to the online instrument only (i.e., not a website). No website content will be directed at children.
Interview Guide
Interviews/focus groups will be conducted in-person by trained interviewers. Interviews/focus groups will be audio-taped with permission of the respondents. This may help reduce the amount of time required of participants because the interviewer will not have to pause for note-taking.
Web-based Instruments
In preparation for collection of data from school staff in all three LEAs, the project team reviewed the literature for any existing instruments or data collection activities that asked about professional development, referral practices, community linkages/partners, school climate, school policies and practices, and staff comfort levels in helping address the health needs of youth. Most of these topics were not covered by existing instruments or data collections. There was no instrument or data collection that collected all of the information we sought to collect. For this reason, the project team developed the school staff data collection instruments (one tailored to each district). Where possible, we adapted questions from existing questionnaires. For example, professional development and policy questions were adapted from CDC’s School Health Policies and Practices Study questionnaire. The newly developed questionnaire will allow the project team to collect the relevant data from the specific schools that are participating in this project. There is no other source of information that can provide the relevant data specifically from the project’s participating schools.
Interview Guide
In preparation for collection of data from school staff in one LEA through in-person interviews/focus groups, the project team reviewed the literature for any existing instrument or data collection activities that provide in-depth information about the six domains that can impact school climate. To do this more effectively, the project team partnered with representatives from Hetrick-Martin Institute (HMI). HMI staff had conducted similar work in the past, so the interview guide used in this project was based on existing interview guides. Although HMI had conducted similar work in the past, it was not conducted in the LEA participating in this project. In addition, the review of the literature enabled further development of the guide to ensure that it reflected a broad range of elements that could impact school climate. There is no other source of information that can provide the relevant in-depth information on school climate from the schools participating in this project.
No small businesses or other small entities will be involved in or impacted by this data collection.
This ICR covers a one-time data collection in 2018 (the 2017-2018 school year). The previously approved version of this ICR covered data collections in 2014-2015 and the 2016-2017 school years. These time points align with the initiation of program activities, an approximate mid-way point, and the end point (2018) of program activities funded under PS13-1308. Because we are not collecting personal information from respondents, no attempt will be (or could be) made to include the same participants in each collection; the samples are drawn independently. There are no technical or legal obstacles to reduce the burden.
There would be a number of consequences to collecting the data less frequently. First, the assessment was designed to use the fewest data collections needed to achieve project goals. The first data collection was essential to provide a clean baseline for the assessment and an accurate picture of staff perceptions and experiences prior to the initiation of program activities and strategies. Without this first data collection, it would have been impossible to determine whether the program had any impact. Furthermore, this initial baseline data collection provided critical information that LEA staff used to determine the most appropriate focus of their activities; it allowed them to identify areas of greatest need and incorporate that information into program planning. The second data collection (in the 2016-2017 school year) was essential for good public health practice. One of the key purposes of the assessment was not simply to know whether activities worked but to be able to make mid-course corrections to improve the likelihood that future activities could have even greater impact. The mid-program data collection allowed LEAs to assess impact midway through the program and to make improvements based on strengths or weaknesses revealed by the data. Finally, the 2018 data collection is essential for determining the full impact of the funded activities and allows time for program activities to actually result in changes in school staff experiences (particularly those related to making referrals). Without these three data collection points, the LEAs and CDC would not be able to achieve both goals of improving program activities and assessing their impact.
There are no special circumstances with this information collection package. This request fully complies with the regulation 5 CFR 1320.5 and will be voluntary.
As required by 5 CFR 1320.8(d), a 60-day Notice was published in the Federal Register on August 17, 2017, Vol 82, No. 158, pages 39,125-39,127 (see Attachment 2). Four public comments were received. The comments and responses are documented in Attachment 2a.
The local education agencies involved in this information collection were consulted to discuss all aspects of the data collection. They provided extensive feedback on the availability of existing data, other data collections in their LEAs and the frequency of data collection for this project. In addition, CDC contractors provided extensive input into the clarity of instructions, reporting formats, and the data elements that will be reported. LEA staff also reviewed and approved this information.
These consultations took place in 2013 and 2014. A list of organizations and individuals consulted is provided in Attachment 12. There were no major problems that arose during the consultation, and all issues raised were resolved.
Web-based Instruments
For the web-based instrument aspect of this data collection, no payment, gift, or incentive will be provided to the respondents for their participation.
Interview Guide
Tokens of appreciation for data collection participation are an important tool used in studies and are particularly important for the population in this information collection. Educators (including teachers, principals, school counselors, school nurses, and other school staff) work within extremely regimented schedules that offer little room for flexibility or variation in the way they spend time during their work days. In the study team’s extensive experience working with schools and school staff, we have consistently heard that time is extremely hard to come by for school staff. In our experience, the lack of time for school personnel is such a substantial concern for school administrators that local education agencies often restrict the commitments they allow school personnel to make for tasks such as data collection. A study funded by the U.S. Department of Education helped document some of the time constraints faced by school staff. In that study of middle school teachers, researchers identified a number of time-related challenges, two of which were “feeling overwhelmed” and “lack of discretionary time”.4 Discretionary time, in that study, was defined as “the time when teachers are free from scheduled responsibilities and can decide what to do,” and the study found that true discretionary time for teachers was rare. Administrators typically set teachers’ schedules, and the majority of teachers’ time was spent with students. Even “free time” was often spent on set responsibilities such as team meetings, parent conferences, student meetings, supervising lunch rooms, and moving students from one place to another.4 It is precisely this lack of discretionary time that can make achieving high response rates among educators a challenge.
In this particular data collection, it is expected that many school staff will need to participate outside of their regular work hours, which produces an additional burden for them that threatens to impact response rates. Other researchers have found that providing incentives to school staff such as school counselors5 and school principals6 increases their likelihood of participation.
Given the considerations outlined above and the estimated burden of the in-person interviews/focus groups, gifts to respondents in the form of $25 Visa gift cards are proposed for participants. These interviews/focus groups are not part of the participants’ job duties and are completely voluntary in nature. Furthermore, interviews/focus groups are estimated to last 1 hour or 1.5 hours (depending on the role of the individuals participating), which represents an extremely large block of time for personnel working in schools. Scheduling time for these interviews/focus groups will require a substantial commitment from school staff: not only will it be an activity in addition to their regular duties, but it is likely many interviews/focus groups will take place outside of regular working hours. In addition, in spite of attempts to minimize burden on respondents, participants may have to leave their regular work buildings to get to the interview/focus group location, representing an additional expenditure of effort on the part of the participants.
A unique aspect of this proposed interview/focus group data collection is that it is not only the overall sample size that is critical to the quality of the research, but also the participation of the specific individuals who have been invited to participate. Interviews/focus groups will be requested from specific school staff based on their roles and their ability to speak to different elements of the school environment; therefore, in order to accurately describe the school climate, it will be essential to gain participation from those particular individuals. One of the challenges for the study team is that research shows that in data collection with education staff, “there is likely to be a strong association with nonresponse and the survey topics of interest”.6 Educators who have interest in the topic of the survey or who participated in the program being assessed may be more likely to participate in interviews or focus groups, introducing a bias and limiting our ability to get a true picture of the school climate. Given that the topic of the proposed information collection includes questions on the school climate, a topic which is likely to have widely varying levels of appeal to school staff, and the fact that by this round of data collection some, but not all, staff may have participated in program activities, the study team believes that the potential for bias from interest in the topic is a particular concern for this data collection. The use of incentives can help minimize bias resulting from variations in interest in the topic by helping to motivate the staff members recruited for interviews/focus groups to actually make the commitment of time necessary for participating. Krueger and Casey (2009) note that the gift helps emphasize to participants that the assessment is important, which in turn will make them more inclined to make time to participate.
More specifically, the incentive “serves to protect the promised time slot from being preempted.”7 In this data collection, the use of incentives is expected to minimize bias related to interest in the topic and, therefore, increase the quality and accuracy of data collected.
It is for these reasons that the study team is proposing to offer $25 gifts to interview/focus group participants. Both Goldenkoff (2004)8 and Quinn Patton (2002)9 support the use of incentives. We expect $25 to be sufficient to improve participation rates, and it is consistent with amounts cited in the literature on response rates; for example, in a 2008 article, Cantor, O’Hare, & O’Connor state that “a number of studies have found that promised incentives of $15-$35 increase response rates.”10 Although this amount is slightly under the more standard incentive of $40 for focus groups, we believe it is appropriate given that most interviews/focus groups will be conducted on the campuses where participants work. IRB approval of the study included review and approval of the $25 gift. In addition, the research office of each participating LEA has approved the data collection with these gifts included as part of the protocol. This gift is consistent with that approved and used under the previously approved ICR.
Information collection procedures were reviewed by the NCHHSTP Information System Security Officer (ISSO) and the CDC Privacy Officer to assess this package for applicability of 5 U.S.C. § 552a, and it was determined that the Privacy Act does not apply to this information collection because no individually identifiable information is being collected. We anticipate no adverse impact of the proposed data collection on respondents’ privacy because no individually identifiable information will be collected as part of this information collection. Any PII that was provided for the purposes of participant recruitment will remain separate from the information gathered through the Web-based instrument and the interviews/focus groups, and will be kept secure by the project team.
Web-based Instrument
For the Web-based questionnaire, no sensitive information is being collected and no individually identifiable information will be recorded or stored as part of the questionnaire or database. PII will be used only to administer the questionnaire and will be stored separately with no information that links members to their responses.
Interview Guide
For the in-person interviews/focus groups, no sensitive information is being collected and no individually identifiable information will be recorded or stored as part of the interviews/focus groups or database. All notes and/or recordings will be kept separate from the names of participants. Responses will only be reported in aggregate due to the small sample size. Reports will focus on overall climate at the schools rather than individual responses. The interview/focus group participants' names will not be associated with specific quotes or comments. In addition, all reports will be written in a way in which no comments will be attributed to any one person.
Voluntary collection. Participants will be informed that providing the information for this data collection is voluntary.
Consent.
Web-based instrument
School staff will receive a consent form (see Attachments 15-17) that provides information about the Web-based data collection instrument and informs them that participation is completely voluntary and they may choose not to participate at any time.
Interview Guide
School staff will receive a consent form (see Attachment 18) that provides information about the in-person interviews/focus groups and informs them that participation is completely voluntary and they may choose not to participate at any time.
Safeguards and security.
Web-based Instrument
For the Web-based questionnaire, no sensitive information is being collected and no individually identifiable information will be recorded or stored as part of the questionnaire or database. Staff names and contact information, provided by the LEA, will be used only to administer the questionnaire and will be stored separately with no information that links members to their responses. The questionnaire will be offered through a secure Website, SurveyMonkey®, that uses Secure Socket Layer technology to protect user information. Once data are transmitted to the Contractor over a secure and encrypted connection, data will be accessible only to the contractor's study team staff in possession of the password necessary to access and download the data. Only approved project staff will have access to the data. CDC contractors will assist the LEAs with data analyses and provide all findings to each LEA in aggregate. No responses will be traceable back to their source. Summaries of findings may also be shared with other stakeholders (e.g., the other LEAs and the NGO funded under strategy 4 of PS13-1308, CDC staff) and researchers in the field, once appropriate permissions and clearances have been secured from both CDC and the LEAs.
Interview Guide
For the in-person interviews/focus groups, no sensitive or individually identifiable information is being collected in the interviews/focus groups. All notes and/or recordings will be kept separate from the names of participants. Responses will only be reported in aggregate due to the small sample size. Reports will focus on overall climate at the schools rather than individual responses. The participants' names will not be associated with specific quotes or comments. In addition, all reports will be written in a way in which no comments will be attributed to any one person. All transcribers will be asked to sign privacy agreements and team members will be trained on security requirements. During data collection in the field, interviewers will maintain data collection materials in their possession or in secured storage at all times. All documents associated with the study will be collected and stored in a password-protected electronic file on a secure network accessible only by the Contractor's study team.
System of records. A system of records is not being created under the Privacy Act.
The proposed Web-based data collection and in-person interview/focus groups data collection protocols have been reviewed and approved by the existing contractor’s IRB (see Attachment 13 and Attachment 14). In addition, the protocols have been reviewed and approved by the research offices of the participating school districts.
Sensitive Questions
No information of a personal or sensitive nature will be collected.
This revision request involves no changes to the instruments, methods, protocols, or burden estimates per respondent or per cycle of data collection; however, the annualized burden estimates reflect technical changes from the previous approval due to changes in the number of data collections planned and the length of the clearance period requested.
Web-based Instrument
The average time to complete the instrument including time for reviewing instructions, gathering needed information and completing the instrument, was estimated to range from 20 to 25 minutes. For the purposes of estimating burden hours, the upper limit of this range (i.e., 25 minutes) is used.
We estimate that 245 school staff in each of the three school districts (Broward County, Los Angeles, and San Francisco) will respond one time, taking 25 minutes each, for a burden of 102 hours per district and a total of 306 burden hours for the Web-based instrument.
Interview Guide
The estimated time range for actual respondents to complete the interview/focus group is as follows: 1 hour for the district level administrators; 1 hour for school level administrators; and 1.5 hours for school staff participating in a group interview/focus group. For the purposes of estimating burden hours, these are the values used.
Two (2) District-level Administrators will provide 1 response each, with each response taking 1 hour. This creates a total of 2 burden hours for the School Climate Index Interview.
Fourteen (14) School-level Administrators will provide 1 response each, with each response taking 1 hour. This creates a total of 14 burden hours for the School Climate Index Interview.
Twenty-eight (28) School Staff will provide 1 response each, with each response taking 1.5 hours. This creates a total of 42 burden hours for the School Climate Index Interview.
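The burden arithmetic above can be checked with a short script (illustrative only; not part of the ICR materials). All figures are taken directly from the text.

```python
# Verify the annualized burden-hour estimates described above.

WEB_RESPONDENTS_PER_LEA = 245   # school staff per district
WEB_MINUTES = 25                # upper bound of the 20-25 minute estimate
N_LEAS = 3

# Web-based instrument: 245 respondents x 25/60 hours, rounded per district
web_per_lea = round(WEB_RESPONDENTS_PER_LEA * WEB_MINUTES / 60)   # 102
web_total = web_per_lea * N_LEAS                                  # 306

# Interviews/focus groups: respondents x hours per response
interview_total = 2 * 1 + 14 * 1 + int(28 * 1.5)                  # 2 + 14 + 42

total_burden = web_total + interview_total
print(total_burden)  # 364
```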
Table A.12-1 Estimated Annualized Burden to Respondents
Respondents | Form Name | Number of Respondents | Number of Responses per Respondent | Average Burden per Response (in hours) | Total Burden (in hours)
School staff | Web-based instrument for Broward County Public Schools (Att 3 & Att 6 web) | 245 | 1 | 25/60 | 102
School staff | Web-based instrument for Los Angeles Unified School District (Att 4 & Att 7 web) | 245 | 1 | 25/60 | 102
School staff | Web-based instrument for San Francisco Unified School District (Att 5 & Att 8 web) | 245 | 1 | 25/60 | 102
District-level Administrators | School Climate Index Interview Guide for District-level Administrators (Att 9) | 2 | 1 | 1 | 2
School-level Administrators | School Climate Index Interview Guide for School-level Administrators (Att 10) | 14 | 1 | 1 | 14
School Staff | School Climate Index Interview Guide for School Staff (Att 11) | 28 | 1 | 1.5 | 42
Total | | | | | 364
Annualizing this collection over one year results in an estimated annualized burden of 364 hours.
Annualized cost.
Table A.12-2 provides estimates of the annualized cost to respondents for the collection of data.
Web-based Instrument
Estimates for the average hourly wage for respondents are based on Department of Labor (DOL) data from May 2016 providing national industry-specific occupational employment and wage estimates (http://www.bls.gov/oes/current/naics4_999200.htm). Based on DOL data, the average hourly rate is $43.82 for education administrators and $25.84 for educational, guidance, school, and vocational school counselors. In addition, the DOL data report the average salary of secondary school teachers as $60,920. To compute an hourly rate, we estimated that teachers work approximately 9 months annually (1,560 hours), which equals approximately $39.05 an hour. (If anything, this may slightly over-estimate the cost burden for teachers, as many may work more than 40-hour weeks or slightly more than 9 months a year.) Given that all three of these types of staff will be included in the information collection, we averaged these three rates and arrived at an average hourly wage of $36.24. This rate is applied to all 735 respondents (245 in each of 3 LEAs) from whom data will be collected during the one-year clearance period. Table A.12-2 shows estimated burden and cost information.
Interview Guide
Estimates for the average hourly wage for respondents are based on Department of Labor (DOL) data from May 2016 providing national industry-specific occupational employment and wage estimates (http://www.bls.gov/oes/current/naics4_999200.htm). Based on DOL data, the average hourly rate is $43.82 for education administrators and $25.84 for educational, guidance, school, and vocational school counselors. In addition, the DOL data report the average salary of secondary school teachers as $60,920. To compute an hourly rate, we estimated that teachers work approximately 9 months annually (1,560 hours), which equals approximately $39.05 an hour for teachers. (If anything, this may slightly over-estimate the cost burden for teachers, as many may work more than 40-hour weeks or slightly more than 9 months a year.) Given that the school staff participating in the group interviews/focus groups will include both counselors and teachers, we averaged these two rates and arrived at an average hourly wage of $32.45. This rate is applied to the 28 school staff respondents (4 in each of the 7 schools) from whom data will be collected during the one-year clearance period. Table A.12-2 shows estimated burden and cost information.
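The wage averaging described above can be reproduced with a short script (illustrative only; not part of the ICR materials). Wage figures are the May 2016 DOL values cited in the text.

```python
# Reproduce the hourly-wage averages used for respondent cost estimates.

ADMIN_HOURLY = 43.82       # education administrators
COUNSELOR_HOURLY = 25.84   # school counselors
TEACHER_SALARY = 60920     # secondary school teachers, annual
TEACHER_HOURS = 1560       # ~9 months at 40 hours/week

teacher_hourly = TEACHER_SALARY / TEACHER_HOURS            # ~39.05

# Web-based instrument: average across all three staff types
web_rate = (ADMIN_HOURLY + COUNSELOR_HOURLY + teacher_hourly) / 3

# Interview guide, school staff: average of counselor and teacher rates
interview_rate = (COUNSELOR_HOURLY + teacher_hourly) / 2

print(round(web_rate, 2))        # 36.24
print(round(interview_rate, 2))  # 32.45
```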
Table A.12-2 Annualized Costs to Respondents
Type of Respondent | Form Name | Burden Hours | Average Hourly Wage Rate | Respondent Costs
School staff | Web-based instrument for Broward County Public Schools | 102 | $36.24 | $3,696.48
School staff | Web-based instrument for Los Angeles Unified School District | 102 | $36.24 | $3,696.48
School staff | Web-based instrument for San Francisco Unified School District | 102 | $36.24 | $3,696.48
District-level Administrators | School Climate Index Interview Guide for District-level Administrators | 2 | $43.82 | $87.64
School-level Administrators | School Climate Index Interview Guide for School-level Administrators | 14 | $43.82 | $613.48
School Staff | School Climate Index Interview Guide for School Staff | 42 | $32.45 | $1,362.90
Total | | | | $13,153.46
There will be no direct costs to the respondents or record keepers other than their time to participate in each information collection.
Costs will be incurred by the government in personnel time for overseeing the project. CDC time and effort for overseeing the contractor’s assistance with data collection and answering questions posed by the contractor and funded agencies are estimated at 4% for each of two GS-14 (step 6) Atlanta-based CDC employees and 1% for a GS-14 (step 10) Atlanta-based senior CDC employee per year for the three years of the project. The grade and step levels were determined based on the experience levels of the staff currently proposed to work on the project. The senior-level employee supervises the two GS-14 employees. The average annual cost to the federal government for oversight and project management is $11,312 (Table A.14-1).
The contractor’s costs are based on estimates provided by the contractor who helped plan the data collection activities. With the expected period of performance, the annual cost to the federal government from contractor and other expenses is estimated to be approximately $42,500 (Table A.14-1). This is the cost estimated based on the current funding level of the contractor at approximately $850,000 per year and the percentage of the contractor’s effort that is anticipated for this specific data collection. It is estimated this data collection will take approximately 5% of the contractor’s effort in the year data collection takes place. This includes the estimated cost of coordination with DASH, providing assistance to the LEA for data collection and processing, and support for analysis and reporting.
The total annualized cost to the government, including direct costs to the federal government and contractor expenses is $53,812.
Table A.14-1. Annualized and Total Costs to the Federal Government
Expense Type | Expense Explanation | Annual Costs (dollars)
Direct Cost to the Federal Government | |
CDC employee oversight for project | 1 CDC Senior Health Scientist at 1% time (GS-14, step 10) | $1,383
CDC oversight of contractor and project | 2 CDC Health Scientists at 4% time each (GS-14, step 6) | $9,929
Subtotal, Direct Costs to the Government per year | | $11,312
Contractor and Other Expenses | |
Assistance with data collection, processing, and preliminary analysis | Labor and other direct costs for supporting data collection, processing, and analysis | $42,500
Subtotal, Contract and Other Expenses per year | | $42,500
Total of all annualized expenses | | $53,812
This request revises the approval period of the original ICR to allow for the final data collection in the planned assessment and to make technical changes to the burden estimates based on calculations for only one round of data collection over a one-year period (instead of two rounds of data collection over a three-year period in the last ICR). The initial ICR for this project included mention that this revision would be requested. Data collection instruments, consent forms, and protocols are not being changed for this final administration. The estimated burden per respondent and per data collection has not changed from the originally approved ICR.
Plans for tabulation and publication of data from this information collection include analyzing data for differences in key outcomes between baseline and follow-up data collections and publishing these findings in written reports for the LEAs and, possibly, peer-reviewed journals. Baseline data have been shared in written reports for the LEAs, and mid-point data reports are currently being prepared.
Analysis Plan
Web-based Instrument
Data analysis will begin within two weeks after completion of data collection for the Web-based instrument. Data will be analyzed using both descriptive and inferential statistics. As relevant (where multiple items were developed to measure a larger construct such as school climate), data reduction techniques/factor analyses and item correlation analyses will be used to develop scales from within the questionnaire items, as needed. Descriptive statistics of data will assist in data cleaning, generating additional hypotheses, and summarizing the perceptions and experience of school staff, particularly related to school climate and referring students to existing community services they may need. Data will be analyzed in aggregate when possible, and also by LEA.
We plan to pool the data from school staff from across all 3 LEAs. After the first data collection, we used descriptive statistics to examine current perceptions and experiences of school staff and provided data reports to each LEA. Following the data collection covered under this ICR, data will be analyzed to identify changes from baseline and mid-point data collections to the final data collection. In particular, we will analyze for changes in: the number of students staff referred to existing community services to meet their needs, staff perceptions of absenteeism, and staff perceptions of school climate. The samples will be successive independent samples (non-linked). The project team will use t-tests and chi-square tests to examine for differences between baseline and follow up data, and may use regressions to model changes among the staff as a function of time. Additional post hoc analyses may be conducted to meet emerging needs or interests of the programs involved. A few example table shells are provided in Attachment 19: Example Data Analysis Table Shells.
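The planned baseline-to-follow-up comparisons can be sketched as follows (illustrative only, with made-up data; outcome values and variable names are hypothetical and not drawn from the actual instrument). In practice, p-values would be obtained from the t and chi-square distributions via a statistical package; this standard-library sketch computes only the test statistics.

```python
# Baseline vs. follow-up comparisons for successive independent
# (non-linked) samples, using only the Python standard library.
from statistics import mean, variance

# Hypothetical school-climate scale scores from two independent rounds
baseline = [3.1, 3.4, 2.9, 3.8, 3.2, 3.5, 3.0, 3.6]
followup = [3.6, 3.9, 3.4, 4.1, 3.8, 3.7, 3.5, 4.0]

# Independent-samples t statistic (pooled variance)
n1, n2 = len(baseline), len(followup)
sp2 = ((n1 - 1) * variance(baseline) + (n2 - 1) * variance(followup)) / (n1 + n2 - 2)
t_stat = (mean(followup) - mean(baseline)) / (sp2 * (1 / n1 + 1 / n2)) ** 0.5

# Chi-square statistic for a hypothetical 2x2 table: staff who referred
# at least one student to community services (yes/no), by round
table = [[40, 60],   # baseline: referred, not referred
         [55, 45]]   # follow-up
row = [sum(r) for r in table]
col = [sum(c) for c in zip(*table)]
total = sum(row)
chi2 = sum((table[i][j] - row[i] * col[j] / total) ** 2 / (row[i] * col[j] / total)
           for i in range(2) for j in range(2))

print(f"t = {t_stat:.2f}, chi-square = {chi2:.2f}")
```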
Findings from the data will be summarized into written reports for all three LEAs and may be shared with other stakeholders through mechanisms such as presentations, executive summaries, or peer-reviewed articles. Findings will be used to improve the program and to help CDC better understand which types of program strategies to recommend to other schools and LEAs.
Interview Guide
Upon completion of the interviews/focus groups, all recorded interviews/focus groups will be transcribed and the transcripts will be provided to the project team. The qualitative interview/focus group data analysis will include iterative code development, establishment of intercoder reliability, single coding of full transcripts using ATLAS.ti 7, MAXQDA, or similar software, and qualitative analysis of coded data. A team of multiple coders will be used to code the interview/focus group data. To establish intercoder reliability, team members will select numerous segments of text from two randomly selected interview/focus group transcripts, and team members will apply the most relevant primary code to each segment of text. The consistent use of these codes will be analyzed for intercoder reliability. The coding team will meet to review any discrepancies and will continue the process until an acceptable level of intercoder reliability is reached. Then, each transcript will be coded by one coding team member for analysis. The team will later systematically analyze the coded transcripts to identify common themes that emerge. Data will be analyzed across schools to provide a general picture of school climate in the project schools.
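One common way to quantify the intercoder reliability step described above is percent agreement together with Cohen's kappa, computed over the primary codes two coders assign to the same transcript segments. The sketch below is illustrative only; the code labels are made up and the protocol does not specify a particular reliability statistic.

```python
# Percent agreement and Cohen's kappa for two coders' primary codes
# applied to the same (hypothetical) transcript segments.
from collections import Counter

coder_a = ["climate", "referral", "climate", "training", "climate", "referral"]
coder_b = ["climate", "referral", "training", "training", "climate", "climate"]

n = len(coder_a)
observed = sum(a == b for a, b in zip(coder_a, coder_b)) / n  # percent agreement

# Expected chance agreement, from each coder's marginal code frequencies
freq_a, freq_b = Counter(coder_a), Counter(coder_b)
expected = sum(freq_a[c] * freq_b[c] for c in freq_a) / n**2

kappa = (observed - expected) / (1 - expected)
print(f"agreement = {observed:.2f}, kappa = {kappa:.2f}")
```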
Findings from the data collected under the previously approved ICR have been summarized at a high level for use by LEA staff in program improvement, and findings that include the data collected under this ICR will be summarized into written reports for the LEA and may be shared with other stakeholders through mechanisms such as presentations, executive summaries, or peer-reviewed articles. Findings will be used to improve the effectiveness of LEA activities and to help CDC better understand which types of program strategies to recommend to other schools and LEAs.
Project Time Schedule
Baseline data were collected in the 2014-2015 school year, and mid-point data were collected in the 2016-2017 school year. The final round of data (covered under this ICR) will be collected in the 2017-2018 school year. These data, and differences between baseline and follow-up data, are likely to be analyzed, summarized, and shared through unpublished or published reports in 2018.
A one year clearance is being requested.
Figure A.16-1: DASH Project Time Schedule
Activity | Time Schedule
First round of data collection |
Design Web-based information collection instrument | Complete
Design interview guide | Complete
Develop data collection protocol, instructions, and analysis | Complete
Pilot test Web-based information collection instruments | Complete
Pilot test interview guide tool | Complete
Prepare OMB package | Complete
Collect e-mail addresses for participating staff | 1-2 months prior to anticipated OMB approval
Receive OMB approval | TBD
Administer Web-based data collection instrument | 0-2 months after OMB approval
Conduct School Climate Index interviews/focus groups | 0-2 months after OMB approval
Clean quantitative Web-based instrument data | 2-3 months after OMB approval
Analyze quantitative data from Web-based instrument | 4-6 months after OMB approval
Transcribe interviews/focus groups | 2-3 months after OMB approval
Determine intercoder reliability for qualitative data analysis of interviews/focus groups | 4 months after OMB approval
Code and analyze interview/focus group data | 5-7 months after OMB approval
Writing (and revising) of baseline data summaries, reports, and/or manuscripts (for both Web-based instrument and interview/focus group data) | 7-12 months after OMB approval
The CDC contractor, with the review and approval of the CDC staff and the LEAs, will develop specific reports for the LEAs to use for program improvement and communication with the LEAs’ stakeholders. CDC will use the LEAs’ assessment findings during the project period to establish key recommendations for partners on program impact, sustainability, and continued program improvement.
The display of the OMB expiration date is not inappropriate. All data collection instruments will display the expiration date for OMB approval of the information collection. We are requesting no exemption.
There are no exceptions to the certification. These activities comply with the requirements in 5 CFR 1320.9.
References
[1] Centers for Disease Control and Prevention. HIV among gay and bisexual men. 2013. Accessed March 3, 2014. http://www.cdc.gov/hiv/pdf/risk_gender_238900B_HIV_Gay_Bisexual_MSM_FS_final.pdf
[2] Centers for Disease Control and Prevention. Estimated HIV incidence in the United States, 2007-2010. HIV Surveillance Supplemental Report. 2012;17(4).
[3] Centers for Disease Control and Prevention. Youth Risk Behavior Surveillance—United States, 2015. MMWR Surveill Summ. 2016;65(6):1-174.
[4] Collinson V, Cook TF. “I don’t have enough time”: Teachers’ interpretations of time as a key to learning and school change. J Educ Admin. 2001;39(3):226-281.
[5] Bauman S. Improving survey response rates of school counselors: Comparing the use of incentives. J Sch Couns. 2007;5(3). Accessed January 23, 2015. http://www.jsc.montana.edu/articles/v5n3.pdf
[6] Jacob RT, Jacob B. Prenotification, incentives, and survey modality: An experimental test of methods to increase survey response rates of school principals. J Res Educ Eff. 2012;5:401-418. doi:10.1080/19345747.2012.698375
[7] Krueger RA, Casey MA. Focus Groups: A Practical Guide for Applied Research. Thousand Oaks, CA: Sage; 2009.
[8] Goldenkoff R. Using focus groups. In: Wholey JS, Hatry HP, Newcomer KE, eds. Handbook of Practical Program Evaluation. 2nd ed. San Francisco, CA: Jossey-Bass; 2004:340-362.
[9] Quinn Patton M. Qualitative Research and Evaluation Methods. 3rd ed. Thousand Oaks, CA: Sage; 2002.
[10] Cantor D, O’Hare B, O’Connor K. The use of monetary incentives to reduce non-response in random digit dial telephone surveys. In: Lepkowski JM, Tucker C, Brick JM, Leeuw ED, Japec L, Lavrakas PJ, Link MW, Sangster RL, eds. Advances in Telephone Survey Methodology. New York, NY: Wiley; 2008:471-489.