The Safe Schools/Healthy Students (SS/HS) Initiative
National Evaluation
Supporting Statement
A. Justification
1. Circumstances of Information Collection
The Substance Abuse and Mental Health Services Administration (SAMHSA) Center for Mental Health Services (CMHS) is seeking Office of Management and Budget (OMB) approval for seven instruments associated with the evaluation of the Safe Schools/Healthy Students (SS/HS) Initiative:
Year 1 Site Visit protocol
Group Interview protocol
Project Director Interview protocol
Partnership Inventory
Project-Level Survey
School-Level Survey
Staff School Climate Survey
The above instruments (with the exception of the Staff School Climate Survey) are currently in use without OMB approval. The program became aware of the need for OMB approval of the instruments currently in use during a July 20, 2007, telephone conversation with OMB officials.
The SS/HS Initiative is authorized under the Safe and Drug-Free Schools and Communities Act (20 U.S.C. 7131), Public Health Service Act (42 U.S.C. 290hh), and Juvenile Justice and Delinquency Prevention Act (42 U.S.C. 5614[b][4][e] and 5781 et seq.). The initiative is an unprecedented collaborative grant program supported by three Federal departments—U.S. Departments of Health and Human Services, Education, and Justice.
The SS/HS Initiative provides funds for grantees—Local Education Agencies (LEAs)—to use, and sometimes develop, state-of-the-art knowledge about which programs and practices work best to foster resilience, promote safe and healthy environments where America’s children can learn and develop, and prevent violence and substance use among our Nation’s youth, schools, and communities. Contributing to the uniqueness of this initiative is the requirement that each grantee establish a collaborative partnership among the LEA and representatives from the local law enforcement, mental health services, and juvenile justice communities.
The SS/HS evaluation, conducted under the authority of the Secretary of Health and Human Services, is mandated under 42 U.S.C. 290hh(f):
The Secretary shall conduct an evaluation of each project carried out under this section and shall disseminate the results of such evaluations to appropriate public and private entities.
The Federal Evaluation Workgroup for the SS/HS Initiative1 has determined that local education agencies (LEAs) receiving the grant must meet the requirements of the Government Performance and Results Act of 1993 (P.L. 103–62), or “GPRA.” GPRA requires all Federal agencies to set program performance targets and report annually on the degree to which the previous year’s targets were met. Agencies are expected to evaluate their programs regularly and to use results of these evaluations to explain their successes and failures and justify requests for funding. To meet the GPRA requirements, SAMHSA must collect performance data (i.e., “GPRA data”) from grantees (OMB No. 1890-0004, pending at OMB).
In addition to these required outcome data, the Workgroup has determined that this mandate requires collection of detailed process information on planning and implementation of grant activities at each grantee site, particularly with respect to the collaborative partnerships. To ensure the greatest usefulness and generalizability of the SS/HS evaluation, it is critical to determine the factors leading to both positive and negative results. A careful examination of the following will enable clear communication to the public of the effectiveness of the Initiative:
Conditions existing at the onset of projects
The impact of the partnerships
How and why interventions, policies, and strategies were selected and implemented
The impact of a wide variety of potential intervening events and intermediate outcomes
2. Purpose and Use of Information
The three Federal agencies collaborating on the Initiative share an expectation that LEAs and communities nationwide will benefit from the documented experiences of the grantees. A letter to introduce the National Evaluation Team (NET) was mailed to all grantees from the Director of the Division of Prevention, Traumatic Stress and Special Programs at SAMHSA (see Attachment 1). The NET has developed instruments to collect detailed quantitative and qualitative data on the grant activities and the local collaborative process. These instruments are administered to the local Project Director of the grant, the local project evaluator, one representative of each local organization that formally partners in the administration of the grant activities, one representative of each school receiving services through the grant, and instructional and administrative staff at targeted schools. These instruments are:
A site visit protocol administered during the first year of the grant (see Attachment 2)
Group telephone interviews with project leadership at each grantee site (see Attachment 3)
Project Director interviews at each grantee site (see Attachment 4)
A Partnership Inventory, completed by a representative of local grant partners (see Attachment 5)
Project-Level Survey completed by Project Directors (see Attachment 6)
School-Level Survey completed by a representative of each participating school (see Attachment 7)
Staff School Climate Survey completed by staff at all participating schools (see Attachment 8)
All these instruments, with the exception of the Staff School Climate Survey, collect information concerning process components of grants. The Staff School Climate Survey provides additional outcome information that extends beyond required GPRA outcome measures.
2A. Process Evaluation Overview
An integral part of the SS/HS Initiative is process evaluation information, to be collected annually via surveys and interviews of grantee and LEA representatives. Instituted by Congress following the murderous assaults at Columbine High School in Colorado, the SS/HS Initiative is designed to provide LEAs, including school districts and multidistrict regional consortia, with grant funding for up to five years to simultaneously address activities in the following areas:
Security
Educational reform
The review and updating of school policies
Alcohol and drug abuse prevention and early intervention services
Mental illness prevention and treatment services
Early childhood development and psychosocial services2
The specific activities to be conducted at each site and the mode and means of interagency collaboration and partnership at the local level are at the discretion of each grantee. However, representatives from the local law enforcement, mental health services, and juvenile justice communities are “required partners” for each grant.
2B. Process Evaluation Instruments
1. Site Visit Protocol: Purpose and Use of Information. Information provided by grantees in their grant application will be organized with the help of a protocol for a site visit to be conducted shortly after award of the grant. Specific content of questions during the site visit will vary, depending on the contents of the grant application. The protocol provides a comprehensive set of subtopics presented in question form within each of five general topical areas:
Planning for the SS/HS project
Current status of project implementation
School–community partnership structure, composition, and functioning
Enhancing interagency services
Status of the local evaluation
Data expectations for the Year 1 site visit are shown in Table 1, which displays the informant or informants most likely to be able to address each data requirement. These data will be used to address several central evaluation questions:
Which LEA-level SS/HS Initiative activities are expected to help, or have helped, the LEAs achieve identified near-term outcomes?
Which LEA-level SS/HS Initiative activities are expected to enable the LEAs to sustain their programs or activities after the funding ends?
Which LEA-level SS/HS Initiative activities are expected to improve, or have improved, average scores on GPRA measures?
Which LEA-level system changes (near-term outcomes) are expected to facilitate, or have facilitated, the partnership’s ability to achieve its long-term outcomes?
Which LEA-level system changes are expected to lead to improved sustainability?
Do outcomes achieved following implementation of the SS/HS Initiative differ on the basis of the following:
Geographic category of LEA (urban, suburban, rural, tribal)
Grade levels (early elementary, elementary, middle, high, etc.)
Type of collaborative structure in the LEA
Membership of the local SS/HS Initiative partnership in the LEA (e.g., diverse membership vs. membership limited to core partners)
Local assessment of enhancement of preventive services
Perceived characteristics of the local partnership’s functioning
Table 1: Data Collection Expectations for Year 1 Site Visit
| Topical Area/Required Items | Project Director | Local Evaluator | Required Partners |
|---|---|---|---|
| Who participated in grant planning | √ | √ | (√) |
| How target populations were identified | √ | √ | (√) |
| How existing services/resources were assessed | √ | √ | (√) |
| What gaps were identified | √ | √ | (√) |
| Programs, strategies, activities to be implemented (new and existing) | √ | √ | (√) |
| Target population | √ | √ | (√) |
| If evidence-based programs used, source of effectiveness data | √ | √ | (√) |
| Responsibility for implementation | √ | √ | (√) |
| Status of implementation | √ | √ | (√) |
| Changes in key partner representation | √ | √ | √ |
| Barriers to involvement | √ | √ | √ |
| Structure, leadership, membership, operation | √ | √ | √ |
| Collaborative history, purpose, and prior successes | √ | √ | √ |
| Sustainability after SS/HS | √ | √ | √ |
| New structures/systems established through the grant | √ | √ | (√) |
| Changes in agency functioning | √ | √ | (√) |
| Strategies and how they will be monitored | √ | √ | (√) |
2. Group Interview of the Local SS/HS Partnership: Purpose and Use of Information. The group interviews of the local SS/HS partnerships are designed to assess the status of the following:
The SS/HS project as a whole
Required Partner involvement
Structure and functioning of the community partnership
Efforts to enhance service integration and systems change
Local evaluation activities
Perceived impact of the SS/HS project
Since these data reflect different perspectives of participants, information from the various sources must be collected, analyzed, and synthesized. The interviews also enable the NET to update information provided by the grantee as part of the grant application and during the initial site visits. Key informants may include (in addition to the Project Director) the local evaluator, required partners from each site, and representatives from other local organizations (e.g., alcohol and drug prevention or treatment agencies, after-school programs, early childhood programs). The NET will consult with Federal Project Officers and the local Project Directors in deciding which partners/organizations will serve as key informants in the interviews. Since these data represent ongoing local grant conditions, they cannot be obtained from existing sources or from other instruments.
3. Project Director Interview: Purpose and Use of Information. An individual telephone interview of the local SS/HS Project Director at all sites will follow the group interview and will be used primarily to assess each partner’s contribution to the core elements of collaborative functioning.
Data from the telephone interview will be used to:
Update information about the programs, strategies, and activities the sites intend to implement
Explore partners’ involvement in the project
Investigate the role of the community partnership in the local project
Secure information regarding the site’s perspective on the impact of the SS/HS project on students, families, and the community
Assess collaborative functioning
This information will be used to refine project classifications, examine changes in the number and types of evidence-based practices being implemented, and document the number and type of new service structures or systems sites plan to implement through the grant. These data also will be used to address the following evaluation questions:
Which LEA-level SS/HS Initiative activities are expected to help, or have helped, the LEAs achieve identified near-term outcomes?
Which LEA-level SS/HS Initiative activities are expected to enable the LEAs to sustain their programs or activities after the funding ends?
Which LEA-level SS/HS Initiative activities are expected to improve, or have improved, average scores on GPRA measures?
Which LEA-level system changes (near-term outcomes) are expected to facilitate, or have facilitated, the partnership’s ability to achieve its long-term outcomes?
Which LEA-level system changes are expected to lead to improved sustainability?
6. The School-Level Survey: Purpose and Use of Information. The School-Level Survey, completed by the SS/HS coordinator in each targeted school, is designed specifically to indicate whether and how project-level SS/HS-related policies, interventions, and strategies are consistently diffused to the individual schools. Specific questions addressed in this survey include:
Which partners are the schools collaborating with, and are they satisfied with the results of the collaboration?
Which SS/HS Initiative activities are implemented in the school? Who is involved in the implementation process?
2C. Summary of Process Evaluation Information. Core data expectations for the process evaluation are presented in Table 2. The evaluation questions identified by the members of the Federal Evaluation Workgroup are presented in the column on the left. The specific survey instruments and/or interview protocols that address these questions appear in the column on the right.
Table 2: Process Evaluation Questions and Data Sources
| Evaluation Question | Data Source |
|---|---|
| 8. Do outcomes achieved following implementation of the SS/HS Initiative differ on the basis of the following: (a) geographic category of LEA (urban, suburban, rural, tribal); (b) grade levels (early elementary, elementary, middle, high, etc.); (c) type of collaborative structure in the LEA; (d) membership of the local SS/HS Initiative partnership in the LEA (e.g., diverse membership vs. membership limited to core partners); (e) local assessment of enhancement of preventive services; (f) perceived characteristics of the local partnership’s functioning | (a) Grant applications/Site Visit Protocol; (b) GPRA or Staff School Climate Survey; (c) Project-Level Survey; (d) Project-Level Survey; (e) Project-Level Survey; (f) Partnership Inventory |
Prior to fielding the Project-Level and School-Level Surveys, an e-mail and/or letter will be sent to LEAs and SS/HS school coordinators to explain the purpose of the survey and provide information on how to complete the surveys. The e-mail and/or letter will provide names, e-mail addresses, telephone numbers, and fax numbers for the NET contact(s) to ensure respondents have contact information if they need to clarify survey-related questions. The e-mail and/or letter will also explain the options available for completing and returning the survey (paper vs. electronic). A similar process is used for the Partnership Inventory, which will be sent out prior to conducting the Group Interview.
Designated NET staff responsible for the surveys will call or e-mail the respondents after distribution to ensure responses are received in a timely fashion. The NET also plans additional follow-up efforts to track any respondents who fail to submit their completed surveys after the initial follow-up. Sample scripts for these instruments are included in Attachments 9, 10, and 11.
2D. School Climate Overview
In addition to the instruments for the process evaluation described above, a Staff School Climate Survey that measures particular aspects of project outcomes will be administered annually to instructional and administrative staff at targeted schools of each LEA that has been awarded an SS/HS Initiative grant. An introductory letter for this survey is included in Attachment 12.
The Federal Evaluation Workgroup has determined that the most important effects of the SS/HS Initiative are likely to be observed in changes in what has been termed “school climate.” At the conceptual level, school climate is the perception of the school as an environment in which to grow and learn. For example, a perceived improvement in youth violence and substance use, and hence a safer environment, would be seen as a positive result of the Initiative. The Federal partners administering the SS/HS Initiative have directed evaluators to identify or develop a consistent instrument with proven psychometric integrity for measuring multiple dimensions of school climate.
2E. Staff School Climate Survey: Purpose and Use of Information.
After extensive research, the NET identified the staff version of the California Healthy Kids Survey (CHKS), developed by WestEd for the State of California and available in the public domain, as ideal for this purpose. The CHKS initially was created to meet Federal requirements for the assessment of teacher perceptions of the incidence and prevalence of drug use and violence in the schools. First administered by the state of California in 2004–2005, this staff version of the CHKS was designed to assess seven components of school climate from school staff perspectives:
Student risk/problem behaviors such as substance use, violence, and truancy; the extent of the problem they pose for the school; and the sufficiency of efforts to reduce them
Availability of health and counseling services
Staff and student safety
Nature, communication, and enforcement of school rules and policies
Academic standards and priorities, and learning supports and barriers
Staff-student relationships, school connectedness, and staff supportive relationships
Parent involvement
The CHKS for this data collection consists of 43 items. One item asks about the respondent’s role at the school, and another asks how long the respondent has been in his or her current position. The remaining 41 items address the seven components of school climate listed above. While these components represent the content outline of the instrument, subsequent psychometric analyses have yielded an array of scales and subscales with sufficient reliability to be used in the analysis of school-to-school variation and change over time. These scales, subscales, and obtained reliabilities4 are as follows:
Table 3. California Healthy Kids Survey (CHKS) Reliability
| Scale | No. of Items | Coefficient Alpha |
|---|---|---|
| Positive Learning and Working Environment | 24 | 0.91 |
| - Staff/student relationships | 6 | 0.93 |
| - Student behaviors that facilitate learning | 9 | 0.83 |
| - School-level norms and standards | 9 | 0.72 |
| Staff and Student Safety | 9 | 0.88 |
| - Perception of student violent behavior | 7 | 0.87 |
| - Perception of school as safe place | 2 | 0.89 |
| Clear, Consistent Communication and Enforcement of Policies | 2 | 0.83 |
| Adequate Health/Counseling Services | 3 | 0.65 |
| Perception of Problems With ATOD Use | 3 | 0.94 |
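The reliability statistics in Table 3 are Cronbach’s coefficient alpha values. As background, the sketch below shows how coefficient alpha is computed from a matrix of item responses; the 3-item, 5-respondent data set is entirely hypothetical and is not drawn from the CHKS.

```python
def cronbach_alpha(items):
    """Cronbach's alpha for a scale.

    items: list of per-item response lists, one inner list per item,
    each containing one response per respondent.
    """
    k = len(items)                       # number of items on the scale
    n = len(items[0])                    # number of respondents

    def var(xs):                         # population variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    sum_item_vars = sum(var(item) for item in items)
    # Each respondent's total score across all items on the scale
    totals = [sum(item[i] for item in items) for i in range(n)]
    return (k / (k - 1)) * (1 - sum_item_vars / var(totals))

# Hypothetical 1-5 ratings: 3 items answered by 5 staff members
scale = [
    [4, 5, 3, 2, 4],
    [4, 4, 3, 2, 5],
    [5, 5, 2, 3, 4],
]
alpha = cronbach_alpha(scale)            # about 0.886 for this toy data
```

Higher inter-item consistency drives alpha toward 1; values such as the 0.91 reported for Positive Learning and Working Environment indicate that the items move together closely enough to treat the scale as a single measure.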
3. Use of Information Technology
School-Level and Project-Level Survey responses are collected via Web-based methods in which a school staff member uses a computer to log in to a Web site with a unique login code. This login process enables the survey participant to enter information directly in an electronically coded format, reducing the burden on both the participants and the administrators of the survey. It also enables respondents to complete the survey at a time convenient for them, enhancing the response rate and reducing the perceived burden. Staff School Climate Survey respondents follow a similar procedure, although unique login codes are assigned to the respondent’s school rather than to each participant.
In the event that completing a survey via Internet is inconvenient for school staff members, the survey can be completed in paper form from a copy e-mailed by the NET or downloaded from the Internet.
4. Efforts to Identify Duplication
The survey instruments and interview protocols used to collect data for the process evaluation of the SS/HS Initiative are unique to this initiative. They are tailored to the knowledge of the grant operations of each set of stakeholders and focus solely on information pertinent to the planning and implementation of the grant activities. The match of instrument to stakeholder is depicted in Table 4. The evaluation will ask each key stakeholder to complete one quantitative survey instrument and qualitative interview protocol per year.
Table 4. Expectations for Stakeholder Involvement in Process Evaluation
| Stakeholder Category | Nature of Process Data | Instrument |
|---|---|---|
| Local Project Director/Evaluator | Quantitative | Project-Level Survey |
| Local Project Director/Evaluator | Qualitative | (1) Site Visit Protocol (first year only); (2) Project Director Interview; (3) Group Interview Protocol |
| Representative of Schools Receiving Grant Services | Quantitative | School-Level Survey |
| Representative of Other Local Grant Partner Organizations | Quantitative | Partnership Inventory |
| Representative of Other Local Grant Partner Organizations | Qualitative | Group Interview Protocol |
The Staff School Climate Survey planned for use in outcome evaluation is an instrument with proven psychometric validity and reliability, as validated by pilot testing by the State of California. California schools already complete the instrument to meet data collection requirements for State and Federal educational funding.
Twenty percent of the 2005 cohort of grant recipients are California LEAs that already are required by the State government to conduct the survey biennially. In those years when they already are conducting the survey, the California LEAs will not be asked to complete the instrument a second time to comply with the needs of the SS/HS Initiative. In effect, grantees in California will be required only to increase the frequency of survey operations from biennially to annually.
Several other school districts and LEAs that have received or applied for SS/HS Initiative grants also use instruments purported to measure school climate. Detailed review of these locally developed instruments found that (1) they lack empirical confirmation of reliability or validity, and/or (2) they are far less comprehensive in their coverage of the multiple dimensions of school climate. On this basis, the NET has determined that similar information is not available from existing data collections. In fact, several grantees outside of California have informed the U.S. Department of Education that they voluntarily intend to replace their locally developed school climate assessments with SAMHSA’s Staff School Climate Survey.
5. Involvement of Small Entities
The collection of information applies only to selected individuals in LEAs that have received a specific Federal grant award or that are subgrantees of such school districts. This data collection does not significantly involve small businesses or entities.
6. Consequences if Information is Collected Less Frequently
The collection of information for the evaluation of the grant is scheduled to occur on an annual basis during the 3 years of the grant. Annual data collection encourages the accuracy of up-to-date information on recent grant planning and implementation activities and reduces the possibility of confusing linear change over time resulting from the initiative with short-term change caused by a local event or trend, such as a single, highly publicized act of violence. It also takes into consideration the likelihood of turnover among representatives of key local stakeholders in the grant.
Failure to collect the information on this annual schedule also would prevent the Federal partners from meeting their obligations under GPRA and PART (Program Assessment Rating Tool) to report to Congress on the outcome and impact of the SS/HS Initiative. Collecting the information only twice (i.e., at baseline and at the end of the grant) would also deprive the grantees of the opportunity to review interim results on aspects of school climate and take corrective action, if necessary, before the end of the grant.
7. Consistency With the Guidelines in 5 CFR 1320.5(d)(2)
The data collection is consistent with the guidelines in 5 CFR 1320.5(d)(2).
8. Consultation Outside the Agency
The notice required by 5 CFR 1320.8(d) was published in the Federal Register on October 1, 2007 (72 FR 55794). No comments were received in response to this notice.
SAMHSA relied on informal input from the grantees themselves as a form of outside consultation. Neither the agency nor its contractors recorded the names of individual local evaluators among the grant recipients who provided comments on the proposed content and use of the survey instruments. Additionally, SAMHSA reviewed all recordkeeping and survey forms nominated by the 40 LEAs that received initial SS/HS grant awards in FY 2005 to facilitate its internal evaluation of grantee performance. Although several grantees proposed to independently collect annual surveys that include elements of the Staff School Climate Survey, the instruments proposed for this purpose had not been tested for reliability and validity and lacked the comprehensive definition of “school climate” that underlies the CHKS.
9. Payment to Respondents
There will be no payments or gifts given or offered to respondents.
10. Assurance of Confidentiality
The evaluation team, which has extensive experience conducting national-level evaluations, will implement a comprehensive plan and diligently follow it to ensure the security of all data collected in the SS/HS National Evaluation. All project staff who access any program data will generally be researchers who have in-depth knowledge of data protection requirements. In addition, they will have been trained in data security policies (i.e., staff responsibilities for securing hard-copy materials and computer workstations, shredding discarded copies of documents, protecting information collected, etc.). These policies will be shared across the project team and reinforced through training as needed.
For all data collection instruments and data files, there will be no individually identifiable information: No personal names will be directly attached to any hard- or soft-copy form or file. No one except selected project staff will have access to any information that could be used to identify specific individuals. In addition, the NET will utilize an Information Resource Management System (IRMS) for the purposes of collecting and storing evaluation data. To protect the security of this system, the IRMS will be deployed to a facility that will provide monitoring and support, backup heat and cooling, redundant access to the Internet, and space for providing redundant resources to assure high availability. The facility will ensure the highest level of security, both physical and virtual. The contractor for the evaluation will be responsible for supporting and maintaining the system (hardware and software) through the period of performance. Appropriate safeguards will be used to protect data from improper disclosure in accordance with applicable portions of the U.S. Department of Health and Human Services Automated Information Systems Security Program Handbook, as well as the standards set forth by the National Institute of Standards and Technology. The relevant policies and procedures for each instrument are summarized below.
Security and Protection for Site Visit Data. Project staff will assign unique ID numbers to each grantee organization. Each SS/HS site will have a record in the IRMS developed for the SS/HS National Evaluation. The record will be the repository of all information and data for a specific site. Immediately after the initial site visit, site team members will enter the raw data (e.g., notes from interviews or observations and written materials provided by the sites) into the site record. Any names associated with the information collected (other than contact information) will be removed. Identities of individuals supplying information during the site visit will be anonymous in the database.
Security and Protection for Group Interview Data. These interviews are recorded digitally (with permission of the participants) and stored on the system and may include individually identifiable information. However, these records are used solely for backup for staff entering data; they do not require the names of individuals. These digital records are erased when data entry and initial quality checks are completed.
Security and Protection for Project Director Interview Data. This interview, completed by the Project Director for each SS/HS grant site, is also recorded (with permission). Again, while the names of the Project Directors will be known, their individual responses will not be shared by anyone other than NET staff. The recordings will be used for backup and erased when data entry and initial quality checks are completed.
Security and Protection for Partnership Inventory Data. This Inventory is e-mailed to designated representatives of local partnering organizations. The instrument does not ask for the name of the respondent, who is identified only by the sector (i.e., School District, Law Enforcement, Mental Health Services, or Juvenile Justice). When responses are received, data are immediately sent to the database, thereby ensuring the names of individual respondents will not be linked to their responses.
Security and Protection for Project Level Survey Data. This Web-based survey is completed by the Project Director for each SS/HS grant site. While the names of the Project Directors will be known, their individual survey responses will not be shared by anyone outside of NET staff. All data will be reported in aggregate form, except with written permission of the individual Project Director.
Security and Protection for School-Level Survey Data. This Web-based survey will be completed anonymously by an individual chosen by the Project Director to represent each participating school involved with the grant. Thus, the identity of the respondents will not be known to anyone other than the Project Directors, although their positions at the schools will be recorded. Project Directors will have responsibility for tracking respondents to ensure required completion rates.
Security and Protection for the Staff School Climate Survey Data. This survey will be completed by all staff at each participating school. Rather than generating and tracking an estimated 100,000+ unique respondent codes, the following procedures will be used:
Each school will be assigned a unique code number.
Each school will designate a project liaison to be responsible for identifying individual respondents. The liaison will tell the NET how many staff are eligible for the survey.
As each staff member completes the survey, he or she will be issued a token “completion certificate,” which will also be sent to the liaison. The liaison will be able to identify who has and has not completed the survey, although neither the liaison nor NET staff will be able to link survey responses to individuals.
Periodically, the liaison (and NET staff) will match the number of completed surveys to the number of completion certificates; a mismatch will indicate that a respondent has completed more than one survey.
The liaison will be responsible for ensuring required completion rates.
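The count-matching step above can be sketched as follows. This is purely an illustration of the logic, not part of the actual system: the function name, the assumption that the liaison counts distinct certificate holders, and all numbers are hypothetical.

```python
def check_counts(surveys_submitted, distinct_certificate_holders, eligible_staff):
    """Compare anonymous survey submissions for one school code against
    the liaison's count of staff who hold completion certificates.

    Returns (duplicate_suspected, completion_rate)."""
    # If more surveys were submitted than distinct staff hold certificates,
    # someone may have completed the survey more than once.
    duplicate_suspected = surveys_submitted != distinct_certificate_holders
    completion_rate = distinct_certificate_holders / eligible_staff
    return duplicate_suspected, completion_rate

# Hypothetical school with 40 eligible staff: 36 surveys are on file for
# its code, but only 35 staff members hold certificates.
dup, rate = check_counts(surveys_submitted=36,
                         distinct_certificate_holders=35,
                         eligible_staff=40)
```

Because survey responses carry only the school code, this check flags a possible duplicate without ever linking a response to an individual, which is the point of the certificate design.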
Summary. All data collection instruments and data collection procedures have been carefully constructed to avoid any potential issues with data that may raise confidentiality concerns. All physical documents containing program data (e.g., faxes, hand-written surveys) will be stored in a secure central location by NET staff charged with their safekeeping. For any sensitive data stored electronically, user IDs and passwords will be required for access. Both the Web server and database will reside in a staffed data center and will have firewall protection. The database will not allow anonymous connections, and account information will be encrypted. If a security incident occurs, proper incident response procedures will be followed. Supervisors are responsible for ensuring that all project staff observe all security requirements and receive appropriate security training. Reports and publications from these data will be limited to aggregate data analysis that fully protects the identity of individual participants. No data will be stored with identifying respondent information.
11. Questions of a Sensitive Nature
Respondents will not be asked any questions of a personally sensitive nature. The subject matter of the interview and survey questions is limited to key stakeholders' perceptions of grant planning and implementation activities and to school employees' perceptions of student behavior, substance use, violence, and safety.
12. Estimates of Annualized Hour Burden
Table 5 presents the estimated hour burden of this collection of information, based on field tests of the proposed protocols and instruments.
Table 5. Elements of Annualized Hour Cost Burden of Data Collection*
Instrument Description | Anticipated Number of Respondents | Responses per Respondent | Average Hours per Response | Total Annual Hour Burden | Hourly Rate | Total Annual Cost
Site Visit Protocol | 425 | 1 | 3 | 1,275 | $28.82 | $36,746
Group Interview | 425 | 1 | 1.5 | 638 | $28.82 | $18,373
Project Director Interview | 85 | 1 | 0.75 | 64 | $23.80 | $1,517
Partnership Inventory | 340 | 1 | 0.25 | 85 | $23.80 | $2,023
Project-Level Survey | 85 | 1 | 0.75 | 64 | $23.80 | $1,517
School-Level Survey | 2,500 | 1 | 0.75 | 1,875 | $28.47 | $53,381
Staff School Climate Survey | 106,250 | 1 | 0.117 | 12,431 | $25.00 | $310,781
Total | 106,675 | | | 16,431 | | $424,338
* Number of respondents based on 59 current grantees from the 2005 and 2006 cohorts. The NET anticipates an additional 26 grantees from the 2007 cohort for a total of 85. Estimates based on an average of 25 schools per grant. Average hours per response are based on field tests.
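The arithmetic behind Table 5 can be reproduced as a cross-check (burden = respondents x responses x hours per response; cost = burden x hourly rate). The sketch below simply restates the table's figures; small differences in the totals reflect rounding of fractional hours in individual rows:

```python
# Illustrative recomputation of Table 5 (figures copied from the table above).
rows = [
    # (instrument, respondents, responses_each, hours_per_response, hourly_rate)
    ("Site Visit Protocol",            425, 1, 3.0,   28.82),
    ("Group Interview",                425, 1, 1.5,   28.82),
    ("Project Director Interview",      85, 1, 0.75,  23.80),
    ("Partnership Inventory",          340, 1, 0.25,  23.80),
    ("Project-Level Survey",            85, 1, 0.75,  23.80),
    ("School-Level Survey",           2500, 1, 0.75,  28.47),
    ("Staff School Climate Survey", 106250, 1, 0.117, 25.00),
]

# Annual hour burden and annual dollar cost, summed across instruments.
total_burden = sum(n * r * h for _, n, r, h, _rate in rows)
total_cost = sum(n * r * h * rate for _, n, r, h, rate in rows)
# Rounded totals agree with the table: 16,431 hours and $424,338.
```

Note that the table's respondent total (106,675) is an unduplicated count of individuals, so it is smaller than the sum of the respondent column across instruments.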
13. Estimates of Annualized Cost Burden to Respondents
The National Education Association estimates a nationwide average salary for instructional staff in public schools of $49,429, or slightly less than $25 per hour. Rates for the School-Level Survey were based on an average salary of $59,224, or about $28.47 per hour; this salary was derived from a weighted average of the distribution of occupations that participated in the 2005 administration. Rates for the Project-Level Survey and the Project Director Interview were based on a nationwide average salary of $49,500, or $23.80 per hour, for Social and Community Service Managers. The rates for the initial site visit and the group telephone interview were based on average nationwide salaries of common representatives from the required partners, the project director, and the local evaluator; the average annual salary across these five entities is approximately $59,944. The rate for the group interview is based on the average annual salary of $61,280, or $29.46 per hour, for the representatives of the required partners. The Partnership Inventory rate is derived from the same salaries as the site visit and group telephone interview, minus the local evaluator. As a result, the estimated total annual cost burden to respondents, expressed as the equivalent of their full-time salaries, is $424,338.
14. Estimates of Annualized Cost to the Government
The annual cost to the Government of the proposed data collection consists of 20 percent of the Government Project Officer's time and 80 percent of a competitive contract awarded for the conduct of the SS/HS evaluation to MANILA Consulting Group, Inc. by the Department of Health and Human Services, Substance Abuse and Mental Health Services Administration. The estimated annual cost of these expenses is $3,220,000.
15. Change in Burden
This is a new project.
16. Time Schedule, Publication and Analysis Plans
16a. Time Schedule
The time schedule for implementing and using the proposed instruments is summarized in Table 6. A 3-year clearance is requested for this project.
Table 6. Time Schedule for SS/HS Evaluation Instruments
Tasks | Dates
OMB approval | 2008
Initial data collection | As soon as OMB approval is received
Final data collection | 2010
Data analysis | Ongoing
16b. Publication Plans
As noted earlier in the initial discussion of the requirement for this data collection effort, 42 U.S.C. 290hh emphasizes publication and dissemination of the results of information derived from the evaluation of each project funded by the SS/HS Initiative grant program. The evaluation contract for the SS/HS Initiative grant program anticipates that aggregate results from the evaluation will be incorporated in text and charts of the following publications, planned for completion and distribution in 2009 and 2010:
A glossy Executive Summary of the multiyear evaluation of the SS/HS Initiative grant program
A folder with highlights, brief methods summary, and case study examples of success
A very brief summary with anecdotal highlights, suitable for an eighth-grade reading level
The three agencies sponsoring the SS/HS Initiative (the U.S. Department of Health and Human Services, the U.S. Department of Education, and the U.S. Department of Justice) may also choose to incorporate the aggregate results from the Staff School Climate Survey in journal articles, scholarly presentations, and congressional testimony referring to the outcomes of the SS/HS grant program.
16c. Analysis Plan
The NET, in collaboration with the Federal Evaluation Workgroup, has examined the question of what is expected to happen in a community as a direct result of the SS/HS Initiative. The answer can be examined in terms of five types of activities: (1) partnership functioning, (2) school involvement in decision making, (3) training and technical assistance, (4) leadership, and (5) project advocacy.
Partnership functioning. Partnership functioning often begins prior to grant award as part of the means to prepare for the grant application. It may then continue as the improvements assisted by the grant award are implemented, assessed, and sustained. The NET expects to find partnership functioning in grantee communities reflected in some or all of the following areas:
Assessing needs of targeted populations and the type of help that will make a significant difference. Partnership functioning could help to identify categories of youth served independently by multiple providers or encourage the sharing of perspectives on youth needs from several disciplines.
Searching for best practice solutions. Partnership functioning may help expand the search by considering programs, services, and system changes from multiple fields; for example, mental health, law enforcement, juvenile justice, “character education,” prevention science.
Analyzing implementation requirements of prospective solutions to determine whether they may be appropriate for the community’s resource base. This may help partners match requirements and community resources they had not previously considered.
Selecting, implementing, and supporting best practice solutions. This may range from a process of distributing lead responsibility for services in each of the six SS/HS Initiative topical elements among the specific partners to negotiating joint implementation and funding of specific programs and services by several partners.
Monitoring and tracking performance to determine if services are being implemented as intended and if the anticipated benefits are occurring. If not, performance tracking should indicate whether program adjustments might be needed or if the partners need to reexamine the link between identified needs and initially planned solutions.
Recommending policy changes that require action by elected officials or by organizations outside the local community, such as State boards or charitable foundations.
Planning future activities, including sustaining best practice solutions.
At a minimum, each core partner (LEA, juvenile justice, mental health, law enforcement) will be asked separately about the substantive contributions or participation their organization has made to each of the seven collaborative functions. Levels of consensus or lack of consensus will be noted in the site visit write-up in preparation for the analysis of collaborative functioning.
Observed variation will enable the NET to address how different patterns of collaborative functioning relate to near-term outcomes in terms of system changes and long-term behavior changes. In particular, it may be important to distinguish among the effects on outcomes when community partners engage in collaborative functioning before grant award, after grant award, and both before and after the grant award. The findings may provide useful information on both the timing and the activities of collaborative functioning that are related to systems change.
School involvement in decisionmaking. At the most basic level, the recipients of grants are intended to implement locally appropriate interventions at the school level. Information obtained on the number and types of activities individual schools implement as a result of the grants will allow the NET to address both short-term outcomes (e.g., Did the grant enable schools to do what was intended?) and more programmatic issues, such as the role of the local schools themselves in the grant administrative process.
Training and technical assistance. In addition to changes within the community system, recipients of the SS/HS grant receive potentially valuable advice, consultation, and assistance. Support is furnished by outside technical expertise and by the Federal Project Officers and the three Federal agencies that jointly operate the SS/HS Initiative. Training and technical assistance provided through SS/HS grants will be targeted to help sites achieve their desired near- and long-term outcomes.
In some cases, a grantee may view the assistance as informative and timely, while in other cases the help may be seen as redundant or inapplicable. Members of the NET have found that training and technical assistance provided on previous projects was potentially very useful but not delivered at the stage in grant implementation when it would provide the greatest practical benefit to the grantee. The Project-Level Survey will enable the evaluation team to compare the perceived usefulness of training and technical assistance across multiple grantee communities. This comparison will enable the NET to examine the relationship of the perceived value of training and technical assistance to the attainment of near- and long-term outcomes. For example, the NET will be able to evaluate whether outside assistance may partially compensate for a relatively low level of collaborative functioning among some grantees. It is also possible that improved performance is associated more with sites that seek training and technical assistance than with sites that do not seek it.
Leadership. The SS/HS Initiative provides an opportunity for the exercise of continuing, high-level leadership on the interconnected elements of youth development. In some educational systems, this leadership may be a significant innovation that contributes to the achievement of the desired system-level outcomes. In either case, the effectiveness of leadership may affect the attainment of near- and long-term outcomes. Effective leadership helps the members of the SS/HS partnership perform responsibilities as part of a coherent body of policy and program interventions designed to reinforce positive effects on youth behavior and the school environment. Less effective leadership may result in a piecemeal approach in which partners act independently with limited common purpose.
Project advocacy. In some jurisdictions, one or more individuals may serve as the community advocates for the SS/HS Initiative. Advocates can provide inspiration and commitment to use the grant for the purpose of sustaining integrated youth development outcomes. Effective advocacy is believed to help the SS/HS partnership to maintain a coherent body of policy and program interventions designed to reinforce positive effects on youth behavior and the school environment.
The near-term results of the SS/HS Initiative grant activities consist of four closely related changes believed to be related to improved school climate, sustainability, and other long-term outcomes: (1) comprehensive policies and practices, (2) implementation of enhanced services, (3) improved coordination among the partners, and (4) school-level organizational change.
Comprehensive policies and practices. Policies and practices adopted as a result of the local SS/HS Initiative are expected to be comprehensive in scope (the topics they address) and in diffusion (influencing youth development at the community level and in individual schools). The NET plans to use the number of new or expanded activities, as indicated in the annual Project-Level Survey, as a proxy for implementation of policies and practices. For example, instituting an evidence-based parenting curriculum may yield significant improvements in school readiness among preschool youth. However, a community that institutes training for child care providers, policies to screen for preschool youth at risk, and referral procedures as well as the parenting curriculum is making a larger investment of time and effort in early childhood development than a community that institutes the parenting curriculum alone.
Collection of survey data measuring diffusion of changes to the school level provides the NET with an opportunity to explore whether long-term outcomes in a given program element are affected by the diffusion of innovation to individual schools. The primary source of this information is the School-Level Survey, which asks each school’s designated SS/HS coordinator to report on the service delivery changes they observe occurring within the school.
Implementation of enhanced services. This refers to implemented activities that are more effective and/or more widespread than previous practices. The NET will incorporate local staff assessments as one measure of whether services are enhanced in each SS/HS program element. Enhanced services will be operationalized with reference to the following questions:
Do staff members, including personnel responsible for school-level oversight, perceive that services enhanced by the grant meet the targeted population’s needs?
Do innovations generated by the grant reflect evidence-based solutions?
Do innovations in services and practices generated by the grant cover a substantial part of the targeted population?
As there is some subjectivity involved in these types of self-reported assessments, the NET will use site visits and interviews as opportunities to validate and cross-reference responses. Specifically, the protocols will include probes to determine whether service innovations are being actively monitored in some objective way. This can help provide some assurance that the local assessments are based (at least in part) on objective data.
Improved coordination among the partners. This refers to the hypothesis that agencies in a grantee community's system of youth development services will collaborate more in planning, implementing, monitoring, and sustaining activities. Improved coordination among the agencies responsible for addressing the problems of the community's youth is another type of system change that constitutes a potential long-term outcome. Though difficult to measure directly, improved coordination may affect long-term outcomes, as reflected in changes in youth perceptions and behavior and in school climate.
The NET will use the Project-Level Survey and the Partnership Inventory to assess whether interagency coordination has improved within the community and monitor several promising indicators of collaboration. The results of this evaluation will help the Federal partners assess whether enhanced collaboration is generated by the SS/HS grant requirements. Ultimately, this will assist in determining whether a culture of collaboration is associated with improvements in measures of violence, substance use, and access to mental health services among youth, in overall school climate, and in perceived sustainability of effort following the end of the grant period.
School-level organizational change. This refers to the development, in some LEAs, of an organizational structure at the level of individual schools that parallels some of the decisionmaking or consultative functions of the SS/HS partnership at the LEA level.
Improved school climate is viewed as having a role both as a mediator and as an outcome of the SS/HS Initiative. Results from the Staff School Climate Survey in the form of overall scores and subscale scores for perceptions of student risk/problem behaviors, student and staff safety, and availability of health and counseling services will be used to address a key issue of the cross-site evaluation: Does school climate improve over time, and is such change related to variation in grantee characteristics? In effect, as with GPRA measures, results from the Staff School Climate Survey will be used as quantifiable evidence of improvement in the intended outcomes of the SS/HS Initiative and as a means of exploring potential relationships between these outcomes and relevant characteristics of the grantees and their schools. In addition, the results will be a means to provide feedback to grantees to improve programs and to build the infrastructure and capacity needed to ensure sustained improvements. Comparisons will be made both among grantee communities and among categories of grantees with similar operational characteristics.
17. Display of Expiration Date
The expiration date will be displayed.
18. Exceptions to Certification Statement
There are no exceptions to the certification statement. The certifications are included in this submission.
B. Collections of Information Employing Statistical Methods
1. Respondent Universe and Sampling Methods
The universe of respondents varies for each of the instruments used in data collection for the evaluation of the SS/HS Initiative. The proposed number of respondents appears in Table 7. The evaluation will collect data from all possible respondents or from the respondents identified as relevant by the local Project Director of each grantee; “sampling” per se will not be utilized.
Table 7. Instrument Description and Number of Respondents
Instrument Description | Anticipated Number of Respondents
Site Visit Protocol | 425
Group Interview | 425
Project Director Interview | 85
Partnership Inventory | 340
Project-Level Survey | 85
School-Level Survey | 2,500
Staff School Climate Survey | 106,250
Site Visit Protocol: Respondent Universe and Sampling Methods. There are 85 grantees in the 2005 (40), 2006 (19), and 2007 (26) cohorts. During the site visit, data are collected from five individuals per site: the Project Director, the local evaluator, and representatives from the three Required Partners. Thus, there is a universe of 425 potential and anticipated respondents. No sampling will be done from this universe.
Group Interview: Respondent Universe and Sampling Methods. Similarly, the one group at each site consists of the five individuals mentioned above. There is a universe of 425 potential and anticipated respondents. No sampling will be done from this universe.
Project Director Interview: Respondent Universe and Sampling Methods. Respondents for this interview are the 85 Project Directors. No sampling will be done from this universe.
Partnership Inventory: Respondent Universe and Sampling Methods. Respondents to this Inventory are the 85 Project Directors and the 255 representatives from the Required Partners (three from each of 85 sites = 255). The total universe of respondents is 340; no sampling will be done from this universe.
Project-Level Survey: Respondent Universe and Sampling Methods. Respondents for this interview are the 85 Project Directors. No sampling will be done from this universe.
School-Level Survey: Respondent Universe and Sampling Methods. Respondents for this survey are one selected individual from each school involved in each of the grantee LEAs. We anticipate a universe of 2,500 participating schools; no sampling will be done from this universe.
Staff School Climate Survey: Respondent Universe and Sampling Methods. All staff members in all targeted schools will be requested to participate in the Staff School Climate Survey data collection annually. "Staff members" include instructional, administrative, and support staff to ensure that data include a variety of potential perspectives on school climate. In fact, in many smaller school systems, there appears to be no practical distinction between instructional and administrative staff because instructional staff are tasked with administrative responsibilities. Given 85 grantees, an average of 25 schools per grantee, and an estimated 50 staff members per school, the result is an estimated 106,250 eligible respondents (85 x 25 x 50 = 106,250).
The collection of data from all eligible participants is more cost-efficient than drawing a sample large enough to measure differences in perceptions of school climate among schools within the same school district. Most commonly, there are fewer than 100 staff members in an individual school. A sample that would provide school climate scores representative of the overall school at the .05 level of accuracy would require completed data collection from approximately 81 percent of the staff in most schools. WestEd's previous experience with the survey instrument strongly suggests that the staff members who are most aware of the school climate in the schools where they are employed, and therefore most likely to provide informed responses to the Staff School Climate Survey, are also those who feel strongly connected to the school as an institution and thus most likely to respond to the invitation to participate. School staff who do not feel strongly connected to the school, such as visiting instructional aides and part-time groundskeepers, are less likely to provide meaningful, informed responses and therefore add less value to the data collection process.
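The approximately 81 percent figure is consistent with the standard finite-population sample-size calculation. The sketch below is illustrative only, assuming 95 percent confidence (z = 1.96), a margin of error of .05, and maximum variance (p = 0.5):

```python
import math

def required_sample(pop_size, z=1.96, margin=0.05, p=0.5):
    """Finite-population sample size for estimating a proportion."""
    # Infinite-population sample size (Cochran's formula).
    n0 = (z ** 2) * p * (1 - p) / margin ** 2
    # Finite-population correction for a small school staff.
    n = n0 / (1 + (n0 - 1) / pop_size)
    return math.ceil(n)

# For a school with 100 staff, roughly 80 completed surveys would be needed,
# i.e., about 80 percent of staff, consistent with the figure cited above.
n = required_sample(100)
```

For smaller schools the required fraction rises further (e.g., about 90 percent of a 50-person staff), which is why a census of all staff is the more practical design.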
Within the LEAs, the number of schools ranges from 1 to over 300. Schools are of all educational levels (e.g., elementary schools, middle schools, high schools, specialized education centers). Within schools, school staff members and administrators are the basic unit of assessment for the Staff School Climate Survey.
2. Information Collection Procedures
The LEAs that apply for SS/HS Initiative grant support are committed by the terms of the SS/HS grant to participate actively in the evaluation of the SS/HS Initiative. In addition, the terms of the grant require preparation of a Memorandum of Understanding with, at a minimum, the local juvenile justice system, law enforcement agency, and mental health service provider that includes the commitment of these partners to participate actively in the evaluation. This data collection is characterized as an elaboration of that requirement, and its successful implementation is a true partnership between the grantees and the Federal grantors.
The Project Director (or designee) of each grantee will be responsible for coordinating data collection activities with the National Evaluation Team (NET). This coordination includes the following responsibilities:
Ensure the participation of the local Project Director, and when appropriate, the local evaluator in the Project Director interview and the Project-Level Survey.
Identify and recruit a knowledgeable, designated contact for each school in the LEA receiving services directly or indirectly from the grant, and ensure the designated contact completes the School-Level Survey.
Inform the principals or other appropriate administrative directors of each school of the critical importance of maximum staff participation in the annual Staff School Climate Survey, and provide updates on the level of participation obtained for each iteration of the survey among the school staff, at least until 75 percent participation is achieved.
Annually collect and transmit statistics to the NET on the number of staff members at each school who are eligible to participate in the Staff School Climate Survey in order to accurately measure completion rates.
Identify one representative from each agency that has completed a Memorandum of Understanding for this grant or otherwise can be described as an “essential” participant in the grant planning and implementation process.
Facilitate the participation of the representatives from these agencies in data collection through the Group Interview and the Partnership Inventory and in any activities to clarify information during a site visit or through an individual telephone interview.
The NET has organized data collection staff into site liaison teams that are responsible for coordination on the part of the Federal evaluation effort. Each site liaison team is responsible for data collection liaison with a specific grantee, including scheduling data collection activities, the tracking of data collection responses, and the completion of data collection tasks.
Site Visit Protocol: Information Collection Procedures. Since grantees differ on many specific aspects of their grants, the initial site visit protocols serve mainly to guide data collection. Prior to the site visit, preliminary discussions are conducted to arrange logistics, determine participants, and set a schedule. The initial meeting is with the Project Director of the grant. The information collected from this meeting is structured by the protocol and recorded (with permission). Occasionally, the Project Director involves other individuals who have specific knowledge concerning the grant. The next interview is with the local evaluator, followed by meetings with the required partners. Again, these interviews are guided discussions, individualized for the specific site. Following the interviews and meetings, discussions are transcribed and data are entered into the NET database.
Group Interview: Information Collection Procedures. These interviews are conducted via telephone. Participants are informed via e-mail of the expected agenda and discussion topics prior to the conference. The interview is conducted as a focus group: The site lead (a member of the NET) introduces the session; participants are asked to volunteer information when relevant. A script is used to guide the discussion. Follow-up activities are discussed. Following the discussion, the site lead summarizes and clarifies any issues. The discussions are recorded (with permission of the participants). Discussions are transcribed and data are entered into the NET database.
Project Director Interview: Information Collection Procedures. These interviews are conducted via telephone. Participants are informed via e-mail of the expected agenda and discussion topics prior to the call. The interview is conducted by the site lead (a member of the NET), who introduces the session; participants are asked to volunteer information. A script is used to guide the discussion. For some components of the script, Project Directors are read a series of Likert-scale–type questions, provided to them prior to the call. They are also given a page of definitions of core functions of the collaborations; each question addresses these core functions. If needed, the definitions of these core functions are clarified. Follow-up activities are discussed. Following the discussion, the site lead summarizes and clarifies any issues. The discussions are recorded (with permission of the participants). Discussions are transcribed and data are entered into the NET database.
Partnership Inventory: Information Collection Procedures. This survey will be administered online through a subcontractor’s Web site. Respondents will be provided telephone technical assistance and the unique survey login/password for their grant. Respondents will be given approximately 2 weeks to complete the survey. Follow-up will be via e-mail.
Project-Level Survey: Information Collection Procedures. This survey will be administered online through a subcontractor’s Web site. Project Directors will be provided telephone technical assistance and the unique survey login/password for each school. Respondents will be given approximately 2 weeks to complete the survey. Follow-up with Project Directors will be via e-mail.
School-Level Survey: Information Collection Procedures. This survey will be administered online through a subcontractor’s Web site. Each participating school will be provided telephone technical assistance and their unique survey login/password. Respondents will be given approximately 2 weeks to complete each survey. Follow-up with sites on response rates will be via e-mail.
Staff School Climate Survey: Information Collection Procedures. This survey will be administered online through a subcontractor’s Web site. Sites will be provided telephone technical assistance and the survey login/password for each school. Each school will be assigned a unique code number. Each school will designate a project liaison who will be responsible for tracking individual respondents. The liaison will tell the NET how many staff are eligible for the survey. As each staff member completes the survey, he or she will be issued a token “completion certificate,” which will also be sent to the liaison. The liaison will be able to track who has and has not completed the survey, although neither the liaison nor NET staff will be able to link survey responses to individuals. Periodically, the liaison (and NET staff) will match the number of completed surveys to the number of completion certifications; mismatches will indicate if any respondent has completed more than one survey. The liaison will be responsible for ensuring required completion rates.
The lead contractor for the cross-site evaluation will provide oversight to its subcontractors and assume responsibility for successful implementation of the data collection.
3. Methods To Maximize Response Rates
Since the site visits and the Group Interviews are conducted either in person or as a single session, there is no issue regarding response rates.
Project Director Interview: Methods To Maximize Response Rates. Completion of the Project Director Interview is an inherent responsibility of the Project Director of each LEA that receives an SS/HS grant. Completion rates are monitored by the NET site liaisons. When completion has been delayed beyond 3 weeks from schedule, the site liaison will inform the Federal Project Officer for the grantee and will send an e-mail requesting a scheduled telephone conversation with the Project Director and the site's local evaluator to identify and resolve any barriers or issues that may be delaying response at the site.
Partnership Inventory: Methods To Maximize Response Rates. Completion rates are monitored by the NET site liaisons. E-mail reminders will be sent to nonrespondents after approximately 2 weeks. When completion has been delayed beyond 3 weeks from schedule, the site liaison will inform the grantee Project Director.
Project-Level Survey: Methods To Maximize Response Rates. Similarly, completion of the Project-Level Survey is an inherent role of the Project Director of each LEA that receives an SS/HS grant. Completion rates are monitored by the NET site liaisons. When completion has been delayed beyond 3 weeks from schedule, the site liaison will inform the Federal Project Officer for the grantee and will send an e-mail requesting a scheduled telephone conversation with the Project Director and the site’s local evaluator to identify and resolve any barriers or issues that may be delaying response at the site.
School-Level Survey: Methods To Maximize Response Rates. A similar process will be in place for identification and resolution of barriers or issues that may be delaying completion of responses by the designated contact at each school responsible for completion of the School-Level Survey. Completion rates are monitored initially by the Project Director and the NET site liaisons. When completion has been delayed beyond 3 weeks from schedule, the site liaison will inform the Federal Project Officer for the grantee and will send an e-mail requesting a scheduled telephone conversation with the Project Director and the site’s local evaluator to identify and resolve any barriers or issues that may be delaying response at the school.
Staff School Climate Survey: Methods To Maximize Response Rates. The Project Director for each grantee will contact school staff to encourage participation in the survey and, if necessary, work with school superintendents and school staff representatives. The NET will also work with the Federal Project Officers overseeing each site to encourage sites to return surveys in a timely manner. As noted earlier, grantees vary from a single isolated tribal school to a countywide metropolitan school district composed of hundreds of schools and thousands of staff members. A grantee’s structure and context will influence the specific methods used to maximize response rates. To encourage the desired 75 percent response rate, the designated site liaisons from the NET will be instructed to review the response rate achieved at each school—based on the number of responses obtained with the school’s unique access code—3 weeks after the data collection instrument has been made available to potential respondents.
Where an LEA has not achieved at least 60 percent complete response after 3 weeks, the NET site liaison will inform the Federal Project Officer for the grantee and will send an e-mail requesting a scheduled telephone conversation with the Project Director and the site’s local evaluator. The purpose of this phone call is to identify any barriers or issues that may be delaying response at the site (see Attachment 10). Possible delays might include unexpected school closings, testing or certification schedules, and lack of adequate explanation of the value of completion of the survey to the work of the potential respondents. During the scheduled conversation on barriers to completion, the site liaison and the grantee personnel will reach agreement on locally tailored methods to address the issues that appear to be delaying completion of the data collection effort.
A second progress review of completion rates will be conducted 6 weeks after the launch of the data collection instrument. The target for this milestone will be the expected 75 percent completion rate. Any LEAs reporting less than 75 percent at the 6-week point will receive an e-mail requesting a scheduled telephone conference on how to overcome any barriers inhibiting achievement of the completion rate (see Attachment 10).
A high response rate is anticipated from all schools targeted by the grant because of the expected high recognition of the survey’s value in supporting and sustaining SS/HS activities after the grant ends. In many LEAs, this value is enhanced by the fact that the SS/HS grant is the first discretionary grant received by the area’s schools.
4. Test of Procedures
All of the data collection instruments developed uniquely for assessment of the planning and implementation processes of the SS/HS grants (with the exception of the Staff School Climate Survey) have been in use without OMB approval for more than 1 year. Their development included focus groups composed of previous grantee staff, who reviewed the content, language, and format of each instrument. The original draft instruments were modified to incorporate suggestions made by focus group participants and to clarify language that participants found problematic.
The CHKS, which serves as the basis for the Staff School Climate Survey, has been fully tested for validity and reliability among school staff populations by WestEd under contract with the State of California and has been in use in that State since 2004. Several other States also have adopted the survey. The wording of specific items in the Staff School Climate Survey is unchanged from the CHKS.
The internal consistency of the subscales generated from the Staff School Climate Survey items has been tested using Cronbach’s alpha, a coefficient of reliability. When the average interitem correlation is low, alpha is low; as the average interitem correlation increases, alpha increases as well. High interitem correlations indicate that the items are measuring the same underlying construct, which is what is meant by “high” or “good” reliability.
In the case of the subscales of the Staff School Climate Survey, the three-item scale of perceptions of health and counseling services received the lowest alpha: 0.65. This value is at the low end of acceptable interitem reliability. All the other subscales received results ranging from 0.72 to 0.94—well within the acceptable range for interitem reliability. The instrument will use these subscales to track changes in school climate as measured by changes in the aggregate perception of members of the school staff in schools receiving benefits of targeted Federal funding from the SS/HS Initiative.
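As a concrete illustration of the reliability statistic described above, the following Python sketch computes Cronbach’s alpha for a single subscale. The item scores shown are hypothetical and are not drawn from the SS/HS survey data.

```python
# Illustrative sketch: computing Cronbach's alpha for one subscale.
# The response data below are hypothetical, not SS/HS survey results.

def cronbach_alpha(items):
    """Cronbach's alpha for a subscale.

    items: one list per item, each holding that item's score for
    every respondent (all inner lists have the same length).
    """
    def variance(xs):  # sample variance (n - 1 denominator)
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    k = len(items)                       # number of items in the subscale
    n = len(items[0])                    # number of respondents
    sum_item_vars = sum(variance(item) for item in items)
    totals = [sum(item[r] for item in items) for r in range(n)]
    return (k / (k - 1)) * (1 - sum_item_vars / variance(totals))

# Hypothetical 3-item subscale scored by 5 respondents
items = [
    [1, 2, 3, 4, 5],
    [2, 3, 4, 4, 5],
    [1, 2, 4, 5, 5],
]
alpha = cronbach_alpha(items)
```

Because the three hypothetical items move together across respondents, the resulting alpha is high; a subscale whose items correlated weakly would yield a value nearer the 0.65 floor noted above.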
5. Statistical Consultants
Danyelle Mannix, Special Expert
Division of Prevention, Traumatic Stress and Special Programs
Center for Mental Health Services, SAMHSA
U.S. Department of Health and Human Services
One Choke Cherry Road
Room 6-1085
Rockville, MD 20857
Phone: 240-276-1879
Fax: 240-276-1870
E-Mail: [email protected]
Andrew Rose, Ph.D.
MANILA Consulting Group, Inc.
6707 Old Dominion Drive, Suite 315
McLean, VA 22101
Phone: 571-633-9797
Fax: 571-633-0335
E-Mail: [email protected]
List of Attachments (Submitted Separately)
Attachment 1: Letter of Introduction of the SS/HS Cross Site Evaluation
Attachment 2: Data Collection Instrument—Site Visit Protocol
Attachment 3: Data Collection Instrument—Group Interview
Attachment 4: Data Collection Instrument—Project Director Interview
Attachment 5: Data Collection Instrument—Partnership Inventory
Attachment 6: Data Collection Instrument—Project-Level Survey
Attachment 7: Data Collection Instrument—School-Level Survey
Attachment 8: Data Collection Instrument—Staff School Climate Survey
Attachment 9: Project-Level Survey—Sample E-Mail
Attachment 10: School-Level Survey—Sample E-Mail
Attachment 11: Partnership Inventory—Sample E-Mail
Attachment 12: Draft Letter of Introduction for the Staff Climate Survey
Attachment 13: Draft Suggested E-Mail Requesting Conference on Survey Completion Rate
1 The Federal Evaluation Workgroup is composed of officials representing two agencies:
U.S. Department of Education, Office of Safe and Drug-Free Schools
U.S. Department of Health and Human Services, Substance Abuse and Mental Health Services Administration, Center for Mental Health Services
2 42 U.S.C. 290(hh)(b). As of FY 2005, the terms of the grants issued under this authority have been limited to 3 years in duration.
3 Note: A check mark in parentheses (√) indicates respondent’s ability to answer will depend on his or her level or type of involvement in the activity.
4 Cronbach alpha reliabilities computed for each scale based on the 2004–2005 administration of the survey in California schools (N=18,000–20,000, depending on the scale).
5 These salary estimates were calculated from data obtained from the Department of Labor’s Occupational Information Network (O*NET) Web site http://online.onetcenter.org on July 30, 2007, and are based on 2005 median wages.