OMB: 0920-1048

“Assessing Education Agency Staff Perceptions of School Climate and Youth Access to Services”





OMB #0920-new


Supporting Statement Part A





September 2, 2014




Supported by:



Catherine Rasberry, PhD

Division of Adolescent and School Health

Centers for Disease Control and Prevention

CDC/OID/NCHHSTP, Health Scientist

(404) 718-8170

[email protected]


Elana Morris, MPH

CDC/OID/NCHHSTP, Health Scientist

(404) 718-8193

[email protected]








List of Attachments

  1. Public Health Service Act Legislation
  2. 60-Day Federal Register Notice (FRN)
  3. MS Word Data Collection Instrument for Broward County Public Schools
  4. MS Word Data Collection Instrument for Los Angeles Unified School District
  5. MS Word Data Collection Instrument for San Francisco Unified School District
  6. Web-based Data Collection Instrument for Broward County Public Schools
  7. Web-based Data Collection Instrument for Los Angeles Unified School District
  8. Web-based Data Collection Instrument for San Francisco Unified School District
  9. School Climate Index Interview Guide for District-Level Administrators
  10. School Climate Index Interview Guide for School-Level Administrators
  11. School Climate Index Interview Guide for School Staff
  12. Organizations and Individuals Providing Consultation on the Information Collection
  13. Contractor’s IRB Approval for Web-based Data Collection Instrument and Protocol
  14. Contractor’s IRB Approval for School Climate Interview Guide
  15. Consent Statement for Web-based Data Collection Instrument for Broward County Public Schools
  16. Consent Statement for Web-based Data Collection Instrument for Los Angeles Unified School District
  17. Consent Statement for Web-based Data Collection Instrument for San Francisco Unified School District
  18. Consent Statement for School Climate Index Interviews
  19. Example Data Analysis Table Shells
  20. Public Comments on 60-Day FRN and Response



Section A: Justification for Information Collection


A.1. Circumstances Making the Collection of Information Necessary

Background

The Centers for Disease Control and Prevention (CDC) requests a 3-year OMB approval to conduct a new information collection entitled, “Assessing Education Agency Staff Perceptions of School Climate and Youth Access to Services.” The request uses 2 separate, but complementary, information collections to assess HIV and STD prevention efforts taking place in three local education agencies (LEAs) funded by CDC’s Division of Adolescent and School Health (DASH) under strategy 4 (School-Centered HIV/STD Prevention for Young Men Who Have Sex with Men [YMSM]) of PS13-1308: Promoting Adolescent Health through School-Based HIV/STD Prevention and School-Based Surveillance. This data collection will provide data and reports for the funded LEAs, allowing them to identify areas of the program that are working well and areas that need additional improvement. In addition, the findings will allow CDC to determine whether changes in key outcomes took place following implementation of currently recommended strategies and to revise those recommendations if necessary.

HIV infections remain high among young men who have sex with men.1 The estimated number of new HIV infections increased between 2008 and 2010, both overall and among YMSM ages 13 to 24.2 Furthermore, sexual risk behaviors associated with HIV, other sexually transmitted diseases (STDs), and pregnancy often emerge in adolescence. For example, 2011 Youth Risk Behavior Surveillance System (YRBSS) data revealed that 47.4% of U.S. high school students reported having had sex, and among those who had sex in the previous three months, 39.8% reported not having used a condom at last sexual intercourse.3 In addition, 2001-2009 YRBSS data revealed that high school students identifying as gay, lesbian, or bisexual and those reporting sexual contact with both males and females were more likely to engage in sexual risk-taking behaviors than heterosexual students.4

Given the disproportionate risk for HIV among YMSM ages 13-24, it is important to find ways to reach the younger youth in this range (i.e., ages 13-19) to decrease sexual risk behaviors and increase health-promoting behaviors such as routine HIV testing. Schools provide one opportunity for this. Because schools enroll more than 22 million teens (ages 14-19)5 and often have existing health and social services infrastructure, schools and their staff members are well-positioned to connect youth to a wide range of needed services, including housing assistance, support groups, and sexual health services such as HIV testing. As a result, CDC’s DASH has focused a number of HIV and STD prevention efforts on strategies that can be implemented in, or centered around, schools.

However, conducting HIV and STD prevention work (particularly work that is designed to specifically meet the needs of YMSM), can be challenging. School is not always a welcoming environment for lesbian, gay, bisexual, transgender, and questioning (LGBTQ) youth.6 Harassment, bullying, and verbal and physical assault are often reported, and such unsupportive environments and victimization among LGBT youth are associated with a variety of negative outcomes, including truancy,7 substance use,7-8 poor mental health,9-10 HIV and STD risk,8-9 and even suicide.11


CDC’s Division of Adolescent and School Health (DASH) in the National Center for HIV/AIDS, Viral Hepatitis, STD, and TB Prevention (NCHHSTP) awarded funds to implement PS13-1308: Promoting Adolescent Health through School-Based HIV/STD Prevention and School-Based Surveillance in order to build the capacity of state and local education agencies and support the efforts of national, non-governmental organizations (NGOs) to help priority school districts (districts) and schools develop and implement sustainable adolescent-focused program activities. Within that cooperative agreement, three local education agencies and one national, non-governmental organization were funded under strategy 4 for specific HIV and sexually transmitted disease (STD) prevention work focused on reaching 13-19 year old black and Latino young men who have sex with men (YMSM). In this project, YMSM are defined to include young men who report sexual activity with other males, attraction to other males, or who identify as gay or bisexual. The project’s distal goals are to:

  • Increase the number of teen YMSM who are tested and treated for HIV/STDs;

  • Decrease sexual risk behaviors among teen YMSM; and

  • Reduce rates of absenteeism and school dropout among teen YMSM.

Program activities will focus on increasing students’ access (particularly YMSM’s access) to key sexual health services and making the school environment safer and more supportive for YMSM. Activities are likely to include implementation of a referral system for connecting youth to key services, programs and professional development to improve school climate, social marketing campaigns that address key outcomes, and education of staff on existing policies related to access to care or the school environment.

The assessments included in this information collection request are designed to provide a baseline measurement of several more proximal project outcomes of interest, including: the number of teen YMSM who received referrals from school staff for HIV or STD testing, school staff knowledge of community organizations that offer services to YMSM, and school climate for YMSM. As a note, all questions in the assessments ask school staff about policies, practices, experiences, or partnerships that have particular relevance for 13-19 year old black and Latino YMSM, but these same policies, practices, experiences, and partnerships may have relevance for other students as well (for example, we ask about bullying in school as an aspect of school climate, and although bullying can impact all students, it has very specific and important relevance for black and Latino YMSM). These assessments and our analysis of the information collection will focus primarily on the relevance for 13-19 year old black and Latino YMSM. Furthermore, the 3 LEAs funded under the cooperative agreement have chosen to focus program activities in high schools (and not junior high schools); as a result, the proposed data collection focuses on collecting information from high school staff. Results of this baseline assessment will be used to inform development and refinement of program strategies and approaches.

Data will be collected from LEA employees through 2 separate, but complementary, information collections. The first information collection will involve collecting information from a total of up to 735 LEA employees across 3 LEAs (the 3 LEAs funded for strategy 4 of cooperative agreement PS13-1308) through a Web-based instrument. The instrument will include items that ask education agency staff about professional development, referral practices, community linkages/partners, school climate for LGBTQ youth, school policies and practices, and staff comfort levels in helping address the health needs of YMSM.

The second information collection will be conducted in only 1 LEA (Broward County Public Schools in Fort Lauderdale, Florida) and is designed to provide an in-depth assessment of one LEA as a way to supplement the Web-based data collection with more detailed information. This information collection will involve in-person interviews with up to 44 LEA employees (2 district-level employees, and up to 6 school-level employees in each of 7 schools) to learn about six domains that can impact school climate: policy, practice, programs, professional development, place, and pedagogy. To allow for flexibility in scheduling, interview guides have been constructed to also allow for small group interviews functioning as focus groups. These information collection activities are proposed only for Broward County Public Schools (BCPS) because BCPS expressed interest in receiving assistance to conduct more detailed evaluation activities to examine the activities they implement through the cooperative agreement.

CDC is authorized to collect the data described in this request by Section 301 of the Public Health Service Act (42 USC 241). A copy of this enabling legislation is provided in Attachment 1. In addition to this legislation, there are several national initiatives and programs that this data collection would serve to support, including but not limited to:

  • Healthy People 2020, which provides national health objectives and outlines a comprehensive plan for health promotion and disease prevention in the United States. Of the Healthy People 2020 objectives, 31 align specifically with PS13-1308 activities related to reducing HIV infection, other STDs, and pregnancy among adolescents.

  • The National Prevention Strategy (NPS) calls for “medically accurate, developmentally appropriate, and evidence-based sexual health education.” The NPS encourages the involvement of parents in educating their children about sexual health, the provision of sexual and reproductive health services, and the reduction of intimate partner violence.6

  • The U.S. Department of Health and Human Services’ (DHHS) Teen Pregnancy Prevention Initiative supports the replication of teen pregnancy prevention (TPP) programs that have been shown to be effective through rigorous research as well as the testing of new, innovative program activities to combat teen pregnancy.7

  • The NCHHSTP program imperative calls for Program Collaboration and Service Integration (PCSI) to provide improved integration of HIV, viral hepatitis, STD, and TB prevention and treatment services at the user level.8

  • CDC Winnable Battles, including prevention of HIV infection and TPP, have been chosen by CDC based on the magnitude of the health problems and the ability to make significant progress in improving outcomes. These are public health priorities with large-scale impact on health with known, effective strategies to address them.9


A.1.1 Privacy Impact Assessment

The Privacy Act does not apply because no individually identifiable information will be collected.

Any PII that is provided by the local education agency for the purposes of participant recruitment will remain completely separate from the information gathered through the Web-based instrument and the in-person interviews/focus groups, and will be kept confidential by the project team.


Overview of the Information Collection System

The information collection system consists of (1) Web-based questionnaires (see Attachments 3, 4, and 5 for the MS Word version of the information collection instruments and Attachments 6, 7, and 8 for the Web version of the information collection instruments) and (2) in-person interview/focus group guides (see Attachment 9: School Climate Index Interview Guide for District-level Administrators; Attachment 10: School Climate Index Interview Guide for School-level Administrators; and Attachment 11: School Climate Index Interview Guide for School Staff). Both are explained in detail below.


Web-based Instrument

The information collection system consists of a Web-based instrument (see Attachments 3, 4, and 5 for the MS Word versions of the information collection instruments and Attachments 6, 7, and 8 for the Web versions) designed to assess education agency staff members’ experiences with and perceptions of professional development, referral practices, community linkages and partners, school climate for LGBTQ youth, school policies and practices, and staff comfort levels in helping address the needs of YMSM. The instrument was pilot tested by 5 individuals (3 teachers or former teachers and 2 members of the project team). Feedback from this group was used to refine questions as needed, ensure accurate programming and skip patterns, and establish the estimated time required to complete the instrument.


Interview Guide

The information collection system consists of in-person interviews/focus groups (see Attachment 9: School Climate Index Interview Guide for District-level Administrators; Attachment 10: School Climate Index Interview Guide for School-level Administrators; and Attachment 11: School Climate Index Interview Guide for School Staff) designed to assess education agency staff members’ experiences with and perceptions of six school-related domains that can impact school climate: policy, practice, programs, professional development, place, and pedagogy. The interview guide is divided into distinct segments and will be administered as a series of in-person interviews/focus groups with up to 44 different education agency employees from the district and 7 schools. The interview guide was pilot tested with 3 individuals with evaluation and/or school experience. Feedback from this group was used to refine questions as needed and establish the estimated time required to complete the interview/focus group segments.

Items of Information to be collected


Web-based Instrument

The project team created 3 versions of the Web-based instrument—one version for each LEA participating in the project. Each instrument contains the same set of 42 core questions, plus an additional set of 4-5 supplemental questions requested by the LEAs. The instrument for Broward County Public Schools includes a maximum of 47 questions (see Attachment 6: Web-based Data Collection Instrument for Broward County Public Schools). The instrument for Los Angeles Unified School District contains a maximum of 46 questions (see Attachment 7: Web-based Data Collection Instrument for Los Angeles Unified School District). The instrument for San Francisco Unified School District contains a maximum of 47 questions (see Attachment 8: Web-based Data Collection Instrument for San Francisco Unified School District). Although the number of questions differs slightly across the 3 instruments (47 questions for 2 LEAs and 46 questions for 1 LEA), the response burden is consistent across all instruments.

The instruments include programmed automatic skip patterns, so not all respondents will be presented with all questions. (Skip patterns are indicated in the Microsoft Word versions of the instruments for reviewers of this information collection request, but are not visible to respondents in the Web-based versions they will use.) Two questions are open-ended and all other questions are multiple choice. For several of the multiple-choice questions, respondents have the opportunity to add another response option under the “Other, please specify” answer choice. The instruments will be distributed and data will be collected using the Web-based data collection tool, SurveyMonkey®. The Web-based instruments will collect information on the following:

  1. Respondent’s work characteristics. This information will be collected with 3 questions that ask the name of the respondent’s school, his/her role at the school (e.g., teacher, administrator), and the length of service in his/her role at the agency (in multiple-choice format).

  2. Perceptions of students’ experiences at school related to safety, absenteeism, and bullying. This information will be collected with 15 questions—all multiple choice.

  3. Staff perceptions of and experiences with a referral system for connecting youth to services. This information will be collected with 18 questions (16 multiple choice and 2 open-ended). Respondents will be asked about their awareness of a referral protocol in the school as well as the estimated number of referrals they have made to connect youth to services. Respondents will also be asked about their awareness of services in the community for YMSM in particular. The two open-ended questions will allow respondents to write in the names of the organizations they know of in their community that help to serve YMSM.

  4. Professional development and policies. This information will be collected through 6 questions that assess the topics on which staff have received professional development (2 questions) and the policies they report having at their school (4 questions). All 6 questions are multiple choice.

  5. Supplemental LEA-specific topics of interest. This information will be collected in additional questions on each instrument. On the Web-based data collection instrument for Broward County Public Schools, 5 questions address use of the LGBTQ critical support guide, and staff awareness of a specific policy and curriculum. On the Web-based data collection instrument for Los Angeles Unified School District, 4 questions address referrals and professional development. On the Web-based data collection instrument for San Francisco Unified School District, 5 questions address interactions with school wellness centers and experiences with certain types of bullying. All items are multiple choice.
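As an illustration of how the automated skip patterns described above might route respondents, the sketch below uses a hypothetical question order and a single hypothetical skip rule; the actual question wording and skip logic are defined in Attachments 6-8, not here.

```python
# Illustrative sketch only: hypothetical question IDs and one example skip rule.
# The real instruments' skip logic is programmed in the Web-based tool.

ORDER = [
    "Q10_aware_of_referral_protocol",
    "Q11_protocol_use",
    "Q12_referrals_made",
    "Q13_referral_barriers",
    "Q14_referral_followup",
    "Q15_community_services",
]

# Example rule: a respondent unaware of a referral protocol skips the
# follow-up questions about how that protocol is used.
SKIP_RULES = {
    ("Q10_aware_of_referral_protocol", "No"): "Q15_community_services",
}

def next_question(current_id, answer):
    """Return the next question ID given the current question and its answer."""
    if (current_id, answer) in SKIP_RULES:
        return SKIP_RULES[(current_id, answer)]
    # Default: proceed to the next question in sequence.
    return ORDER[ORDER.index(current_id) + 1]
```

Routing of this kind is why respondents may answer as few as 30 of the maximum 47 questions, which keeps the burden estimate conservative.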


Interview/Focus Group Guide

The complete interview guide is divided into 3 sections (in essence, 3 separate interview guides)—one for each respondent type (i.e., district-level administrator, school-level administrator, and school staff). These sections (3 distinct interview guides) are provided in attachments 9, 10, and 11 of this package. Each of these 3 sections (or guides) is further divided into 6 domains of interest—(1) policy (part A is district-level policy and part B is school-level policy); (2) practice; (3) programs; (4) professional development (part A is district-level professional development and part B is school-level professional development); (5) place; and (6) pedagogy. In addition, each section of the interview guide includes primary questions and secondary questions. The primary questions are the most essential and will be priority questions for the interviewers. If there is time remaining in the interview/focus group, the interviewer will ask some or all of the secondary questions. For district administrators, there are 10 primary questions and 8 secondary questions; these will be split between 2 respondents. For school administrators, there are 16 primary questions and 9 secondary questions; these will be split between the 2 respondents per school (in 7 schools). For school staff, there will be 13 primary questions and 12 secondary questions; these will be asked during group interviews/focus groups with up to 4 respondents per school (in 7 schools). Questions are open-ended.

The interview guide (as a whole, which includes all three sections) will collect information on the following:

  1. Policy. There are 27 questions (7 primary and 5 secondary questions for district administrators; 4 primary and 2 secondary questions for school administrators; 4 primary and 5 secondary questions for school staff)

  2. Practice. There are 23 questions (7 primary and 5 secondary questions for school administrators; 6 primary and 5 secondary questions for school staff)

  3. Programs. There are 3 questions (1 primary and 2 secondary questions for school staff)

  4. Professional development. There are 10 questions (3 primary and 3 secondary questions for district administrators; 2 primary questions for school administrators; 2 primary questions for school staff)

  5. Place. There are 2 questions (2 primary questions for school administrators)

  6. Pedagogy. There are 3 questions (1 primary and 2 secondary questions for school administrators)


Identification of Website(s) and Website Content Directed at Children Under 13 Years of Age

One aspect of the information collection system involves using a Web-based information collection instrument. Respondents will be sent a link directing them to the online instrument only (i.e., not a website). No website content will be directed at children.



A.2. Purpose and Use of the Information Collection

Data gathered from the Web-based data collection instruments and interviews/focus groups will allow the funded education agencies to assess their program activities conducted under PS13-1308. It will allow them to ensure their activities are helping improve HIV/STD prevention practices and services in schools, and to determine if their activities are impacting outcomes such as school climate, particularly for LGBTQ youth, and staff referrals for youth to receive important HIV and STD prevention health services. This supports a major public health goal of reducing disparities in HIV/STD infection experienced by adolescent YMSM.

Data collected through the Web-based data collection instruments and the interviews/focus groups will be analyzed by the project team to identify existing strengths, weaknesses, and gaps that can inform program development and refinement for future activities, and it also will be analyzed to provide a descriptive baseline assessment that can be used for comparison as future data are collected to determine program impact. Analysis of data from the Web-based data collection instrument may include frequencies, tests for differences among types of school staff, or comparisons across data collection points in years to come. Analysis of interview/focus group data will likely involve iterative code development, use of qualitative data analysis software (such as ATLAS.ti), and identification of major themes within the data. The findings from this information collection also have practical utility to the government because they can impact both the activities used by the CDC-funded LEAs and the strategies and approaches CDC recommends for use in schools.
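As an illustration only, the kind of descriptive analysis outlined above (frequencies, plus a test statistic for differences among types of school staff) could be sketched as follows. The staff roles and responses here are invented for the example; a real analysis would apply standard statistical software to the collected, de-identified responses.

```python
# Illustrative sketch: frequencies and a chi-square statistic for a
# role-by-awareness table. All data below are hypothetical.
from collections import Counter

# Hypothetical responses: (staff_role, aware_of_referral_protocol)
responses = [
    ("teacher", "Yes"), ("teacher", "No"), ("teacher", "Yes"),
    ("administrator", "Yes"), ("administrator", "Yes"),
    ("counselor", "Yes"), ("counselor", "No"),
]

# Frequencies of awareness overall
freq = Counter(answer for _, answer in responses)

# Chi-square statistic for the role x awareness contingency table
# (stdlib only; a real analysis would also compute a p-value)
roles = sorted({r for r, _ in responses})
answers = sorted({a for _, a in responses})
obs = {(r, a): 0 for r in roles for a in answers}
for r, a in responses:
    obs[(r, a)] += 1
n = len(responses)
row = {r: sum(obs[(r, a)] for a in answers) for r in roles}
col = {a: sum(obs[(r, a)] for r in roles) for a in answers}
chi_sq = sum(
    (obs[(r, a)] - row[r] * col[a] / n) ** 2 / (row[r] * col[a] / n)
    for r in roles for a in answers
)
```

The same tabulation generalizes to any multiple-choice item crossed with staff role, matching the table shells in Attachment 19.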

Without this data collection, the LEAs would be unable to determine if their program activities had the desired impact on the school climate and staff referrals for students to receive important HIV and STD prevention health services. In addition, without collecting this data, CDC would have little evidence on several of the new and innovative strategies that are being used to enhance HIV and STD prevention efforts in schools.


A.2.1 Privacy Impact Assessment


How information will be shared and for what purpose


Web-based Instrument

For the Web-based questionnaire, no sensitive information is being collected and no individually identifiable information will be collected, recorded, or stored as part of the questionnaire or information collection database. Although the LEA will provide a list of staff names and contact information for inviting staff to participate in the survey, this information will be used only to administer the questionnaire and will be stored separately with no information that links members to their responses. The questionnaire will be offered through a secure Website, SurveyMonkey®, that uses Secure Socket Layer technology to protect user information. Once data are transmitted to the Contractor over a secure and encrypted connection, data will be accessible only to the contractor's study team staff in possession of the password necessary to access and download the data. Only approved project staff will have access to the data. CDC contractors will assist the LEAs with data analyses and provide all findings to each LEA in aggregate. No responses will be traceable back to their source. Summaries of findings (not the raw data) may also be shared with other stakeholders (e.g., the other LEAs and the NGO funded under strategy 4 of PS13-1308, CDC staff) and researchers in the field, once appropriate permissions and clearances have been secured from both CDC and the LEAs.


Interview Guide

For the in-person interviews/focus groups, no sensitive information is being collected. Although the LEA will provide a list of staff names and contact information for inviting staff to participate in the interviews/focus groups, this information will be used only to set up interviews/focus groups; all notes and/or recordings will be kept separate from the names of participants. Responses will only be reported in aggregate due to the small sample size. Reports will focus on overall climate at the schools rather than individual responses. The participants' names will not be associated with specific quotes or comments. In addition, all reports will be written in a way in which no comments will be attributed to any one person. All transcribers will be asked to sign confidentiality agreements and team members will be trained on security requirements. During data collection in the field, interviewers will maintain data collection materials in their possession or in secured storage at all times. All documents associated with the study will be collected and stored in a password-protected electronic file on a secure network accessible only by the Contractor's study team.


Impact of the proposed collection on respondents’ privacy

We anticipate no adverse impact of the proposed data collection on respondents’ privacy because no individually identifiable information will be collected as part of this information collection. Any PII that was provided for the purposes of participant recruitment will remain separate from the information gathered through the Web-based instrument and the interviews/focus groups, and will be kept confidential by the project team.


Web-based Instrument

For the Web-based questionnaire, no sensitive information is being collected and no individually identifiable information will be recorded or stored as part of the questionnaire or database. PII will be used only to administer the questionnaire and will be stored separately with no information that links members to their responses.


Interview Guide

For the in-person interviews/focus groups, no sensitive information is being collected and no individually identifiable information will be recorded or stored as part of the interviews/focus groups or database. All notes and/or recordings will be kept separate from the names of participants. Responses will only be reported in aggregate due to the small sample size. Reports will focus on overall climate at the schools rather than individual responses. The interview/focus groups participants' names will not be associated with specific quotes or comments. In addition, all reports will be written in a way in which no comments will be attributed to any one person.


A.3. Use of Improved Information Technology and Burden Reduction


Web-based Instrument

Questionnaire data will be collected via Web-based instruments allowing respondents to complete and submit their responses electronically. This method was chosen to reduce the overall burden on respondents. The information collection instrument was designed to collect the minimum information necessary for the purposes of this project through the use of automated skip patterns (i.e., limited to a maximum of 47 questions and a possibility of skipping up to 17 questions). In addition, the Web-based administration allows respondents to easily access the data collection instrument at a time and location that is most convenient for them.


Interview Guide

Interviews/focus groups will be conducted in-person by trained interviewers. Interviews/focus groups will be audio-taped with permission of the respondents. This may help reduce the amount of time required of participants because the interviewer will not have to pause for note-taking.

A.4. Efforts to Identify Duplication and Use of Similar Information


Web-based Instruments

In preparation for collection of data from school staff in all three LEAs, the project team reviewed the literature for any existing instruments or data collection activities that asked about professional development, referral practices, community linkages/partners, school climate for LGBTQ youth, school policies and practices, and staff comfort levels in helping address the health needs of YMSM. Most of these topics were not covered by existing instruments or data collections. There was no instrument or data collection that collected all of the information we sought to collect. For this reason, the project team developed the school staff data collection instruments (one tailored to each district). Where possible, we adapted questions from existing questionnaires. For example, many of the school climate questions were adapted from surveys from the Gay, Lesbian, and Straight Education Network and professional development and policy questions were adapted from CDC’s School Health Policies and Practices Study questionnaire. The newly developed questionnaire will allow the project team to collect the relevant data from the specific schools that are participating in this project. There is no other source of information that can provide the relevant data specifically from the project’s participating schools.


Interview Guide

In preparation for collection of data from school staff in one LEA through in-person interviews/focus groups, the project team reviewed the literature for any existing instrument or data collection activities that provide in-depth information about the six domains that can impact school climate. To do this more effectively, the project team partnered with representatives from Hetrick-Martin Institute (HMI). HMI staff had conducted similar work in the past, so the interview guide used in this project was based on existing interview guides. Although HMI had conducted similar work in the past, it was not conducted in the LEA participating in this project. In addition, the review of the literature enabled further development of the guide to ensure that it reflected a broad range of elements that could impact school climate. There is no other source of information that can provide the relevant in-depth information on school climate from the schools participating in this project.


5. Impact on Small Businesses or Other Small Entities

No small businesses or other small entities will be involved in or impacted by this data collection.



6. Consequences of Collecting the Information Less Frequently

This information collection is scheduled to occur in 2015, 2016, and 2018. These time points align with the initiation of program activities (baseline data collection in 2015) and the mid-point (2016) and end point (2018) of program activities funded under PS13-1308. The first two data collections (2015 and 2016) will be covered by this ICR. Because we are not collecting personal information from respondents, no attempt will be (or could be) made to include the same participants in each collection; the samples will be drawn independently. (We anticipate requesting an extension of this ICR to cover the final data collection in 2018.) There are no technical or legal obstacles to reducing the burden.

Collecting the data less frequently would have a number of consequences. First, the assessment was designed to use the fewest data collections needed to achieve project goals. The first data collection is essential to provide a clean baseline for the assessment and an accurate picture of staff perceptions and experiences prior to initiation of program activities and strategies; without it, it would be impossible to determine whether the program had any impact. This baseline data collection also provides critical information that LEA staff can use to determine the most appropriate focus of their activities, allowing them to identify the areas of greatest need to incorporate into program planning. The second data collection (in 2016) is essential for good public health practice. A key purpose of the assessment is not simply to know whether activities worked but to make mid-course corrections that improve the likelihood that future activities will have even greater impact; the mid-program data collection will allow the LEA to assess impact midway through the program and to make improvements based on strengths or weaknesses revealed by the 2016 data. Finally, the 2018 data collection is essential for determining the full impact of the funded activities and will allow time for program activities to result in changes in school staff experiences (particularly those related to making referrals). Without these three data collection points, the LEAs and CDC would not be able to achieve both goals of improving program activities and assessing their impact.


7. Special Circumstances Relating to the Guidelines of 5 CFR 1320.5

There are no special circumstances with this information collection package. This request fully complies with the regulation 5 CFR 1320.5, and participation will be voluntary.


8. Comments in Response to the Federal Register Notice and Efforts to Consult Outside the Agency

A. As required by 5 CFR 1320.8(d), a 60-day Notice was published in the Federal Register on June 16, 2014 (Vol. 79, No. 114, pages 33924-33925; see Attachment 2). One non-substantive public comment was received within the 60-day comment period, and the standard CDC response was sent (see Attachment 20). In addition, one substantive public comment arrived outside the 60-day comment period; however, the agency provided a response as a courtesy (see Attachment 20).

B. The local education agencies involved in this information collection were consulted to discuss all aspects of the data collection. They provided extensive feedback on the availability of existing data, other data collections in their LEAs, and the frequency of data collection for this project. In addition, CDC contractors provided extensive input on the clarity of instructions, reporting formats, and the data elements that will be reported. LEA staff also reviewed and approved this information.

These consultations took place in 2013 and 2014. A list of organizations and individuals consulted is provided in Attachment 12: Organizations and Individuals Providing Consultation on the Information Collection. There were no major problems that arose during the consultation, and all issues raised were resolved.


9. Explanation of Any Payment or Gift to Respondents


Web-based Instruments

For the web-based instrument aspect of this data collection, no payment, gift, or incentive will be provided to the respondents for their participation.


Interview Guide

Tokens of appreciation for data collection participation are an important tool used in studies and are particularly important for the population in this information collection. Educators (including teachers, principals, school counselors, school nurses, and other school staff) work within extremely regimented schedules that offer little room for flexibility or variation in the way they spend time during their work days. In the study team's extensive experience working with schools and school staff, we have consistently heard that time is extremely hard to come by for school staff. In our experience, the lack of time for school personnel is such a substantial concern for school administrators that local education agencies often restrict the commitments they allow school personnel to make for tasks such as data collection. A study funded by the U.S. Department of Education helped document some of the time constraints faced by school staff. In that study of middle school teachers, researchers identified a number of time-related challenges, two of which were "feeling overwhelmed" and "lack of discretionary time."12 Discretionary time, in that study, was defined as "the time when teachers are free from scheduled responsibilities and can decide what to do," and the study found that true discretionary time for teachers was rare. Administrators typically set teachers' schedules, and the majority of teachers' time was spent with students. Even "free time" was often filled with set responsibilities such as team meetings, parent conferences, student meetings, supervising lunch rooms, and moving students from one place to another.12 It is precisely this lack of discretionary time that can make achieving high response rates among educators a challenge.
In this particular data collection, it is expected that many school staff will need to participate outside of their regular work hours, an additional burden that threatens to reduce response rates. Other researchers have found that providing incentives for school staff such as school counselors13 and school principals14 has increased their likelihood of participation.

Given the considerations outlined above and the estimated burden of the in-person interviews/focus groups, gifts to respondents in the form of $25 Visa gift cards are proposed for participants. These interviews/focus groups are not part of the participants' job duties and are completely voluntary. Furthermore, interviews/focus groups are estimated to last 1 hour or 1.5 hours (depending on the role of the individuals participating), which represents an extremely large block of time for personnel working in schools. Scheduling time for these interviews/focus groups will require a substantial commitment from school staff: not only will the activity be in addition to their regular duties, but it is likely that many interviews/focus groups will take place outside of regular working hours. In addition, despite attempts to minimize burden on respondents, participants may have to leave their regular work buildings to get to the interview/focus group location, representing an additional expenditure of effort on the part of the participants.

A unique aspect of this proposed interview/focus group data collection is that not only the overall sample size is critical to the quality of the research, but also the participation of the specific individuals who have been invited. Interviews/focus groups will be requested from specific school staff based on their roles and their ability to speak to different elements of the school environment; therefore, in order to accurately describe the school climate, it will be essential to gain participation from those particular individuals. One challenge for the study team is that research shows that in data collection with education staff, "there is likely to be a strong association with nonresponse and the survey topics of interest."14 Educators who have interest in the topic of the survey, or who participated in the program being assessed, may be more likely to participate in interviews or focus groups, introducing bias and limiting our ability to get a true picture of the school climate. Given that the topic of the proposed information collection includes questions on the school climate specifically for LGBTQ youth, a topic likely to have widely varying levels of appeal to school staff, and that by the planned second round of data collection some, but not all, staff may have participated in program activities, the study team believes the potential for bias from interest in the topic is a particular concern for this data collection. The use of incentives can help minimize bias resulting from variations in interest in the topic by helping to motivate the staff members recruited for interviews/focus groups to make the commitment of time necessary to participate. Krueger and Casey (2009) note that the gift helps emphasize to participants that the assessment is important, which in turn makes them more inclined to make time to participate.
More specifically, the incentive "serves to protect the promised time slot from being preempted."15 In this data collection, the use of incentives is expected to minimize bias related to interest in the topic and, therefore, to increase the quality and accuracy of data collected.

It is for these reasons that the study team proposes to offer $25 gifts to interview/focus group participants. Both Goldenkoff (2004)16 and Quinn Patton (2002)17 support the use of incentives. We expect $25 to be sufficient to improve participation rates, and it is consistent with amounts cited in the literature on response rates; for example, in a 2008 article, Cantor, O'Hare, & O'Connor state that "a number of studies have found that promised incentives of $15-$35 increase response rates."18 Although this amount is slightly under the more standard incentive of $40 for focus groups, we believe it is appropriate given that most interviews/focus groups will be conducted on the campuses where participants work. IRB approval of the study included review and approval of the $25 gift. In addition, the research office of each participating LEA has approved the data collection with these gifts included as part of the protocol.


10. Assurance of Confidentiality Provided to Respondents

No individually identifiable information is being collected. CDC staff have reviewed this information collection request and determined that the Privacy Act does not apply.

IRB approval

The proposed Web-based data collection and in-person interview/focus groups data collection protocols have been reviewed and approved by the existing contractor’s IRB (see Attachment 13 and Attachment 14). In addition, the protocols have been reviewed and approved by the research offices of the participating school districts.


10.1 Privacy Impact Assessment Information

A. Voluntary collection. Participants will be informed that providing the information for this data collection is voluntary.


B. Consent.


Web-based instrument

School staff will receive a consent form (see Attachments 15-17) that provides information about the Web-based data collection instrument and informs them that participation is completely voluntary and that they may withdraw at any time.


Interview Guide

School staff will receive a consent form (see Attachment 18) that provides information about the in-person interviews/focus groups and informs them that participation is completely voluntary and that they may withdraw at any time.


C. Safeguards and security.


Web-based Instrument

For the Web-based questionnaire, no sensitive information is being collected, and no individually identifiable information will be recorded or stored as part of the questionnaire or database. Staff names and contact information, provided by the LEAs, will be used only to administer the questionnaire and will be stored separately, with no information linking staff to their responses. The questionnaire will be offered through a secure website, SurveyMonkey®, which uses Secure Sockets Layer (SSL) technology to protect user information. Once data are transmitted to the Contractor over a secure and encrypted connection, they will be accessible only to the Contractor's study team staff in possession of the password necessary to access and download the data; only approved project staff will have access to the data. CDC contractors will assist the LEAs with data analyses and provide all findings to each LEA in aggregate. No responses will be traceable back to their source. Summaries of findings (not the raw data) may also be shared with other stakeholders (e.g., the other LEAs, the NGO funded under strategy 4 of PS13-1308, and CDC staff) and researchers in the field, once appropriate permissions and clearances have been secured from both CDC and the LEAs.

Interview Guide

For the in-person interviews/focus groups, no sensitive or individually identifiable information is being collected. All notes and/or recordings will be kept separate from the names of participants. Because of the small sample size, responses will be reported only in aggregate, and reports will focus on the overall climate at the schools rather than on individual responses. Participants' names will not be associated with specific quotes or comments, and all reports will be written so that no comment can be attributed to any one person. All transcribers will be asked to sign confidentiality agreements, and team members will be trained on security requirements. During data collection in the field, interviewers will keep data collection materials in their possession or in secured storage at all times. All documents associated with the study will be collected and stored in a password-protected electronic file on a secure network accessible only by the Contractor's study team.

D. System of records. A system of records is not being created under the Privacy Act.


11. Justification for Sensitive Questions

No information of a personal or sensitive nature will be collected.


12. Estimates of Annualized Burden Hours and Costs

Burden hours.

Table A.12-1 provides estimates of burden for the data collection.


Web-based Instrument

The estimate of burden hours is based on a pilot test of the Web-based instrument by 5 individuals working in public health and/or education. In the pilot test, the total time (including reviewing instructions, gathering needed information, and completing the instrument) ranged from 20 to 25 minutes. Based on these results, the estimated time for actual respondents to complete the instrument is 20-25 minutes; for the purposes of estimating burden hours, the upper limit of this range (25 minutes) is used.


Interview Guide

The estimate of burden hours is based on a pilot test of segments of the interview guide by 3 individuals with evaluation and/or school experience. In the pilot test, the total time (including reviewing instructions, gathering needed information, and completing the interview/focus group) ranged from 60 to 90 minutes. Based on these results, the estimated times for actual respondents are as follows: 1 hour for district-level administrators; 1 hour for school-level administrators; and 1.5 hours for school staff participating in a group interview/focus group. These are the values used for estimating burden hours.

Table A.12-1: Estimated Annualized Burden to Respondents

| Respondents | Form Name | Number of Respondents | Number of Responses per Respondent | Average Burden per Response (in hours) | Total Burden (in hours) |
|---|---|---|---|---|---|
| School staff | Web-based instrument for Broward County Public Schools (Att 3 & Att 6 web) | 163 | 1 | 25/60 | 68 |
| School staff | Web-based instrument for Los Angeles Unified School District (Att 4 & Att 7 web) | 163 | 1 | 25/60 | 68 |
| School staff | Web-based instrument for San Francisco Unified School District (Att 5 & Att 8 web) | 163 | 1 | 25/60 | 68 |
| District-level Administrators | School Climate Index Interview Guide for District-Level Administrators (Att 9) | 1 | 1 | 1 | 1 |
| School-level Administrators | School Climate Index Interview Guide for School-Level Administrators (Att 10) | 9 | 1 | 1 | 9 |
| School Staff | School Climate Index Interview Guide for School Staff (Att 11) | 19 | 1 | 1.5 | 29 |
| Total | | | | | 243 |

Annualizing this collection over three years results in an estimated annualized burden of 243 hours.
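The totals in Table A.12-1 can be reproduced with a short calculation. This is a consistency check only; respondent counts and per-response times are taken directly from the table, and half-hours are rounded up as the table does.

```python
# Consistency check for Table A.12-1: total burden = sum over rows of
# (number of respondents x average burden per response), rounded to whole hours.
rows = [
    (163, 25 / 60),  # Broward County web-based instrument
    (163, 25 / 60),  # Los Angeles web-based instrument
    (163, 25 / 60),  # San Francisco web-based instrument
    (1, 1.0),        # district-level administrator interview
    (9, 1.0),        # school-level administrator interviews
    (19, 1.5),       # school staff group interviews/focus groups
]
# int(x + 0.5) rounds halves up (e.g., 28.5 -> 29), matching the table.
row_totals = [int(n * hours + 0.5) for n, hours in rows]
print(row_totals)       # [68, 68, 68, 1, 9, 29]
print(sum(row_totals))  # 243
```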

Annualized cost.

Table A.12-2 provides estimates of the annualized cost to respondents for the collection of data.


Web-based Instrument

Estimates of the average hourly wage for respondents are based on Department of Labor (DOL) data from May 2013 providing national industry-specific occupational employment and wage estimates (http://www.bls.gov/oes/current/naics4_999200.htm). Based on DOL data, the average hourly rate is $43.59 for education administrators and $25.42 for educational, guidance, school, and vocational school counselors. The DOL data also report an average salary of $55,970 for secondary school teachers. To compute an hourly rate, we estimated that the average teacher works approximately 9 months annually (1,560 hours), which equals approximately $35.88 an hour. (If anything, this may slightly over-estimate the cost burden for teachers, as many may work more than 40-hour weeks or slightly more than 9 months a year.) Given that all of these types of staff will be included in the information collection, we averaged these three rates and arrived at an average hourly wage of $34.96. This rate is applied to all 735 respondents (245 in each of 3 LEAs) from whom data will be collected in 2 of the 3 years (an annualized estimate of 489 respondents per year, or 163 per LEA, over 3 years). Table A.12-2 shows estimated burden and cost information.
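The teacher hourly rate and the blended $34.96 rate follow directly from the quoted DOL figures; the following worked check uses only the dollar values stated above.

```python
# Hourly rates and salary cited from DOL May 2013 data.
admin_rate = 43.59      # education administrators, $/hour
counselor_rate = 25.42  # school counselors, $/hour
teacher_salary = 55970  # average secondary school teacher salary, $/year
teacher_hours = 1560    # approximately 9 months of full-time work

teacher_rate = teacher_salary / teacher_hours
average_rate = (admin_rate + counselor_rate + teacher_rate) / 3
print(round(teacher_rate, 2))  # 35.88
print(round(average_rate, 2))  # 34.96
```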


Interview Guide

Estimates of the average hourly wage for respondents are based on Department of Labor (DOL) data from May 2013 providing national industry-specific occupational employment and wage estimates (http://www.bls.gov/oes/current/naics4_999200.htm). Based on DOL data, the average hourly rate for education administrators is $43.59. This rate is applied to the 16 respondents who are district-level (n=2) or school-level (n=14) administrators, from whom data will be collected in 2 of the 3 years (an annualized estimate of 1 district-level administrator and 9 school-level administrators per year over 3 years). Based on DOL data, the average hourly rate for educational, guidance, school, and vocational school counselors is $25.42, and the average salary of secondary school teachers is $55,970. To compute an hourly rate for teachers, we estimated that the average teacher works approximately 9 months annually (1,560 hours), which equals approximately $35.88 an hour. (If anything, this may slightly over-estimate the cost burden for teachers, as many may work more than 40-hour weeks or slightly more than 9 months a year.) Given that both of these types of staff will be included in the information collection, we averaged these two rates and arrived at an average hourly wage of $30.65. This rate is applied to the 28 respondents (4 in each of the 7 schools) who are school staff, from whom data will be collected in 2 of the 3 years (an annualized estimate of 19 school staff respondents per year over 3 years). Table A.12-2 shows estimated burden and cost information.
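Similarly, the $30.65 blended rate for school staff in the interviews/focus groups, and the corresponding $874 row of Table A.12-2, can be reproduced from the stated figures (19 respondents at 1.5 hours each).

```python
# Rates cited from DOL May 2013 data.
counselor_rate = 25.42       # school counselors, $/hour
teacher_rate = 55970 / 1560  # secondary school teachers, ~ $35.88/hour

blended_rate = (counselor_rate + teacher_rate) / 2
print(round(blended_rate, 2))  # 30.65

# Table A.12-2 school staff row: 19 respondents x 1.5 hours x $30.65,
# rounded to the nearest dollar (int(x + 0.5) rounds halves up).
print(int(19 * 1.5 * 30.65 + 0.5))  # 874
```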


Table A.12-2: Annualized Costs to Respondents

| Respondent | Form Name | Number of Respondents | Number of Responses per Respondent | Average Burden per Response (in hours) | Average Hourly Wage Rate | Total Cost |
|---|---|---|---|---|---|---|
| School staff | Web-based instrument for Broward County Public Schools | 163 | 1 | 25/60 | $34.96 | $2,377 |
| School staff | Web-based instrument for Los Angeles Unified School District | 163 | 1 | 25/60 | $34.96 | $2,377 |
| School staff | Web-based instrument for San Francisco Unified School District | 163 | 1 | 25/60 | $34.96 | $2,377 |
| District-level Administrators | School Climate Index Interview Guide for District-Level Administrators | 1 | 1 | 1 | $43.59 | $44 |
| School-level Administrators | School Climate Index Interview Guide for School-Level Administrators | 9 | 1 | 1 | $43.59 | $392 |
| School Staff | School Climate Index Interview Guide for School Staff | 19 | 1 | 1.5 | $30.65 | $874 |
| Total | | | | | | $8,441 |



13. Estimates of Other Annual Cost Burden to Respondents or Record Keepers

There will be no direct costs to the respondents or record keepers other than their time to participate in each information collection.


14. Annualized Cost to Federal Government

Costs will be incurred by the government in personnel time for overseeing the project. CDC time and effort for overseeing the contractor's assistance with data collection and answering questions posed by the contractor and funded agencies is estimated at 6% of time per year for each of two GS-13 (step 7) level Atlanta-based CDC employees and 3% of time per year for one GS-14 (step 8) level Atlanta-based senior CDC employee over the three years of the project. The grade and step levels were determined based on the experience levels of the staff currently proposed to work on the project; the senior-level employee supervises the two GS-13-level employees. The average annual cost to the federal government for oversight and project management is $16,211 (Table A.14-1).

The contractor's costs are based on estimates provided by the contractor that helped plan the data collection activities. Given the expected period of performance, the annual cost to the federal government for contractor and other expenses is estimated at approximately $42,000 (Table A.14-1). This estimate is based on the contractor's current funding level of approximately $600,000 per year and the percentage of the contractor's effort anticipated for this specific data collection: approximately 7% in each year that data collection takes place. This includes the estimated cost of coordination with DASH, assistance to the LEAs with data collection and processing, and support for analysis and reporting.

The total annualized cost to the government, including direct costs to the federal government and contractor expenses, is $58,211.
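The annualized totals can be verified directly from the component figures stated above and in Table A.14-1.

```python
# Annualized federal cost components from Table A.14-1.
senior_scientist = 3776    # GS-14 Senior Health Scientist at 3% time
health_scientists = 12435  # two GS-13 Health Scientists at 6% time each
direct_costs = senior_scientist + health_scientists

# Approximately 7% of the contractor's ~$600,000/year funding level.
contractor_costs = int(600_000 * 0.07)

print(direct_costs)                     # 16211
print(contractor_costs)                 # 42000
print(direct_costs + contractor_costs)  # 58211
```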


Table A.14-1: Annualized and Total Costs to the Federal Government

| Expense Type | Expense Explanation | Annual Costs (dollars) |
|---|---|---|
| Direct Costs to the Federal Government | | |
| CDC employee oversight for project | 1 CDC Senior Health Scientist at 3% time (GS-14) | $3,776 |
| CDC oversight of contractor and project | 2 CDC Health Scientists at 6% time each (GS-13) | $12,435 |
| Subtotal, Direct Costs to the Government per year | | $16,211 |
| Contractor and Other Expenses | | |
| Assistance with data collection, processing, and preliminary analysis | Labor and other direct costs for supporting data collection, processing, and analysis | $42,000 |
| Subtotal, Contractor and Other Expenses per year | | $42,000 |
| Total of all annualized expenses | | $58,211 |


15. Explanation for Program Changes or Adjustments

This is a new information collection.


16. Plans for Tabulation and Publication and Project Time Schedule

Current plans for tabulation and publication of data from this information collection include analyzing the data for differences in key outcomes between the baseline and follow-up data collections and publishing these findings in written reports for the LEAs and, possibly, in peer-reviewed journals. In addition, basic analyses of baseline (2015) data will be shared in written reports for the LEAs and may also be shared through published reports for other stakeholders.


Analysis Plan


Web-based Instrument

Data analysis will begin within two weeks after completion of data collection for the Web-based instrument. Data will be analyzed using both descriptive and inferential statistics. Where relevant (i.e., where multiple items were developed to measure a larger construct such as school climate), data reduction techniques/factor analyses and item correlation analyses will be used to develop scales from the questionnaire items. Descriptive statistics will assist in data cleaning, generating additional hypotheses, and summarizing the perceptions and experiences of school staff, particularly related to school climate and referring students for services. Data will be analyzed in aggregate when possible, and also by LEA.

We plan to pool the data from school staff from across all 3 LEAs. After the first data collection, we will use descriptive statistics to examine current perceptions and experiences of school staff. In follow-up data collections, data will be analyzed to identify changes from baseline to follow up. In particular, we will analyze for changes in: the number of students staff referred for services, staff perceptions of absenteeism, and staff perceptions of school climate. The samples will be successive independent samples (non-linked). The project team will use t-tests and chi-square tests to examine for differences between baseline and follow up data, and may use regressions to model changes among the staff as a function of time. Additional post hoc analyses may be conducted to meet emerging needs or interests of the programs involved. A few example table shells are provided in Attachment 19: Example Data Analysis Table Shells.
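As a sketch of the baseline-versus-follow-up comparison described above, a two-sample t-statistic can be computed on independent (non-linked) samples. The referral counts below are purely hypothetical, and the project team's actual analysis may use different software and tests; this illustrates only the shape of the comparison.

```python
import math
import statistics as st

def welch_t(sample_a, sample_b):
    """Welch's t-statistic for two independent samples (no pairing is
    possible because baseline and follow-up samples are drawn independently)."""
    mean_diff = st.mean(sample_a) - st.mean(sample_b)
    se = math.sqrt(st.variance(sample_a) / len(sample_a)
                   + st.variance(sample_b) / len(sample_b))
    return mean_diff / se

# Hypothetical counts of student referrals reported by six staff members
# at baseline and by six different staff members at follow-up.
baseline = [0, 1, 1, 2, 0, 1]
followup = [2, 3, 1, 4, 2, 3]
t = welch_t(followup, baseline)
print(round(t, 2))  # 3.16: the follow-up mean exceeds the baseline mean
```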


Findings from the data will be summarized into written reports for all three LEAs and may be shared with other stakeholders through mechanisms such as presentations, executive summaries, or peer-reviewed articles. Findings will be used to improve the program and to help CDC better understand which types of program strategies to recommend to other schools and LEAs.


Interview Guide

Upon completion of the interviews/focus groups, all recorded interviews/focus groups will be transcribed and the transcripts provided to the project team. The qualitative analysis will include iterative code development, establishment of intercoder reliability, single coding of full transcripts using ATLAS.ti 7 (or similar software), and qualitative analysis of the coded data. A team of multiple coders will code the interview/focus group data. To establish intercoder reliability, the team will select numerous segments of text from two randomly selected transcripts, and each team member will independently apply the most relevant primary code to each segment. Agreement in the application of these codes will be assessed for intercoder reliability; the coding team will meet to review any discrepancies and will continue the process until an acceptable level of intercoder reliability is reached. Each transcript will then be coded by one coding team member for analysis. The team will later systematically analyze the coded transcripts to identify common themes. Data will be analyzed across schools to provide a general picture of school climate in the project schools.
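The intercoder-reliability step described above is commonly quantified with an agreement statistic such as Cohen's kappa. The protocol does not name a specific index, so the following sketch (with hypothetical codes applied by two coders to ten text segments) is illustrative only.

```python
from collections import Counter

def cohens_kappa(coder_a, coder_b):
    """Cohen's kappa: chance-corrected agreement between two coders who
    each assigned one primary code to the same set of text segments."""
    assert len(coder_a) == len(coder_b)
    n = len(coder_a)
    observed = sum(a == b for a, b in zip(coder_a, coder_b)) / n
    counts_a, counts_b = Counter(coder_a), Counter(coder_b)
    expected = sum(counts_a[c] * counts_b[c] for c in counts_a) / (n * n)
    return (observed - expected) / (1 - expected)

# Hypothetical primary codes ("climate", "policy", "referral") from two coders.
a = ["climate", "policy", "policy", "referral", "climate",
     "policy", "referral", "climate", "policy", "referral"]
b = ["climate", "policy", "referral", "referral", "climate",
     "policy", "referral", "policy", "policy", "referral"]
print(round(cohens_kappa(a, b), 2))  # 0.7
```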

Findings from the data will be summarized into written reports for the LEA (and possibly for the participating schools) and may be shared with other stakeholders through mechanisms such as presentations, executive summaries, or peer-reviewed articles. Findings will be used to improve the program and to help CDC better understand which types of program strategies to recommend to other schools and LEAs.


Project Time Schedule

Baseline data are scheduled to be collected in early 2015 and are likely to be analyzed, summarized, and reported (through unpublished or published reports) in 2015. The first round of follow-up data (collected at the mid-point of the program) will be gathered in spring 2016. These data, and differences between baseline and follow-up data, are likely to be analyzed, summarized, and shared through unpublished or published reports in 2016 and 2017.

A three-year clearance is being requested. At the end of this period, we anticipate seeking an extension of the approval for a second round of follow-up data to be collected at the end of the program's 5-year funding cycle.


Figure A.16-1: DASH Project Time Schedule

| Activity | Time Schedule |
|---|---|
| First round of data collection | |
| Design Web-based information collection instrument | Complete |
| Design interview guide | Complete |
| Develop data collection protocol, instructions, and analysis | Complete |
| Pilot test Web-based information collection instruments | Complete |
| Pilot test interview guide | Complete |
| Prepare OMB package | Complete |
| Collect e-mail addresses for participating staff | 1-2 months prior to anticipated OMB approval |
| Receive OMB approval | TBD |
| Administer Web-based data collection instrument | 0-2 months after OMB approval |
| Conduct School Climate Index interviews/focus groups | 0-2 months after OMB approval |
| Clean quantitative Web-based instrument data | 2-3 months after OMB approval |
| Analyze quantitative data from Web-based instrument | 4-6 months after OMB approval |
| Transcribe interviews/focus groups | 2-3 months after OMB approval |
| Determine intercoder reliability for qualitative data analysis of interviews/focus groups | 4 months after OMB approval |
| Code and analyze interview/focus group data | 5-7 months after OMB approval |
| Write (and revise) baseline data summaries, reports, and/or manuscripts (for both Web-based instrument and interview/focus group data) | 7-12 months after OMB approval |
| Second round of data collection | |
| Collect e-mail addresses for participating staff | 16-17 months after OMB approval |
| Administer Web-based data collection instrument | 18-20 months after OMB approval |
| Conduct School Climate Index interviews/focus groups | 18-20 months after OMB approval |
| Clean quantitative Web-based instrument data | 20-21 months after OMB approval |
| Analyze quantitative data from Web-based instrument | 22-24 months after OMB approval |
| Transcribe interviews/focus groups | 20-21 months after OMB approval |
| Determine intercoder reliability for qualitative data analysis of interviews/focus groups | 22 months after OMB approval |
| Code and analyze interview/focus group data | 23-25 months after OMB approval |
| Write (and revise) follow-up data summaries, reports, and/or manuscripts (for both Web-based instrument and interview/focus group data) | 25-36 months after OMB approval |


The CDC contractor, with the review and approval of the CDC staff and the LEAs, will develop specific reports for the LEAs to use for program improvement and communication with the LEAs’ stakeholders. CDC will use the LEAs’ assessment findings during the project period to establish key recommendations for partners on program impact, sustainability, and continued program improvement.


17. Reason(s) Display of OMB Expiration Date is Inappropriate

The display of the OMB expiration date is not inappropriate. All data collection instruments will display the expiration date of OMB approval for the information collection. No exemption is requested.


18. Exceptions to Certification for Paperwork Reduction Act Submissions

There are no exceptions to the certification. These activities comply with the requirements in 5 CFR 1320.9.

References

[1] Centers for Disease Control and Prevention. HIV among gay and bisexual men. 2013. Accessed March 3, 2014. http://www.cdc.gov/hiv/pdf/risk_gender_238900B_HIV_Gay_Bisexual_MSM_FS_final.pdf

[2] Centers for Disease Control and Prevention. Estimated HIV incidence in the United States, 2007-2010. HIV Surveillance Supplemental Report. 2012;17(4).

[3] Centers for Disease Control and Prevention. Youth Risk Behavior Surveillance--United States, 2011. MMWR Surveill Summ. 2012;61(4):1-168.

[4] Centers for Disease Control and Prevention. Sexual identity, sex of sexual contacts, and health-risk behaviors among students in grades 9-12--Youth Risk Behavior Surveillance, selected sites, United States, 2001-2009. MMWR. 2011;60(SS-7):1-133.

[5] United States Census Bureau. School enrollment: CPS October 2012--detailed tables. September 3, 2013. Accessed March 7, 2014. http://www.census.gov/hhes/school/

[6] Bontempo DE, D’Augelli AR. Effects of at-school victimization and sexual orientation on lesbian, gay, or bisexual youths’ health risk behavior. J Adolesc Health. 2002;30(5):364-374.

[7] Birkett M, Espelage DL, Koenig B. LGB and questioning students in schools: The moderating effects of homophobic bullying and school climate on negative outcomes. J Youth Adolesc. 2009;38(7):989-1000. doi: 10.1007/s10964-008-9389-1

[8] Garofalo R, Wolf RC, Kessel S, Palfrey J, DuRant RH. The association between health risk behaviors and sexual orientation among a school-based sample of adolescents. Pediatrics. 1998;101(5):895-902.

[9] Russell ST, Ryan C, Toomey RB, Diaz RM, Sanchez J. Lesbian, gay, bisexual, and transgender adolescent school victimization: Implications for young adult health and adjustment. J Sch Health. 2011;81(5):223-230. doi: 10.1111/j.1746-1561.2011.00583.x

[10] Toomey RB, Ryan C, Diaz RM, Card NA, Russell ST. Gender-nonconforming lesbian, gay, bisexual, and transgender youth: School victimization and young adult psychosocial adjustment. Dev Psychol. 2010;46(6):1580-1589. doi: 10.1037/a0020705

[11] Hatzenbuehler ML. The social environment and suicide attempts in lesbian, gay, and bisexual youth. Pediatrics. 2011;127(5):896-903. doi: 10.1542/peds.2010-3020

[12] Collinson V, Cook TF. “I don’t have enough time”: Teachers’ interpretations of time as a key to learning and school change. J Educ Admin. 2001;39(3):226-281.

[13] Bauman S. Improving survey response rates of school counselors: Comparing the use of incentives. J Sch Couns. 2007;5(3). Retrieved January 23, 2015, from http://www.jsc.montana.edu/articles/v5n3.pdf

[14] Jacob RT, Jacob B. Prenotification, incentives, and survey modality: An experimental test of methods to increase survey response rates of school principals. J Res Educ Eff. 2012;5:401-418. doi: 10.1080/19345747.2012.698375

[15] Krueger RA, Casey MA. Focus Groups: A Practical Guide for Applied Research. Thousand Oaks, CA: Sage; 2009.

[16] Goldenkoff R. Using focus groups. In: Wholey JS, Hatry HP, Newcomer KE, eds. Handbook of Practical Program Evaluation. 2nd ed. San Francisco, CA: Jossey-Bass; 2004:340-362.

[17] Patton MQ. Qualitative Research and Evaluation Methods. 3rd ed. Thousand Oaks, CA: Sage; 2002.

[18] Cantor D, O’Hare B, O’Connor K. The use of monetary incentives to reduce non-response in random digit dial telephone surveys. In: Lepkowski JM, Tucker C, Brick JM, de Leeuw ED, Japec L, Lavrakas PJ, Link MW, Sangster RL, eds. Advances in Telephone Survey Methodology. New York, NY: Wiley; 2008:471-489.






