
OMB: 0930-0347


Multi-Site Evaluation of the Safe Schools/Healthy Students (SS/HS) State Program


Supporting Statement


B. Statistical Methods

1. Respondent Universe and Sampling Methods

The respondent universe and sampling methods are described for each of the data collection instruments below. Table 7 shows the number of respondents that are expected to participate in each data collection activity.



Table 7. Number of Respondents by Data Collection Activity

Data Collection Activity                                    Number of Respondents
State KIIs (PCP)                                            14
District KIIs (PCP)                                         63
State Collaborator Survey (PCP)                             208
District Collaborator Survey (PCP)                          624
State Collaboration Indicator Data Instrument (PCP)         7
District Collaboration Indicator Data Instrument (PCP)      21
KIIs (Implementation)                                       56
School-Level Survey (Implementation)                        2,100


State Key Informant Interview (PCP Study): To identify respondents for the PCP KIIs, each state project team will be asked to supply a contact list of persons in the following roles: project coordinators and other key staff, representatives of the SMT member agencies, evaluation staff, and other key informants identified by the SS/HS project team. Two respondents will be interviewed per state/tribal site.

District Key Informant Interview (PCP Study): In each LEA, 3 respondents will be chosen to participate in the district KIIs, for a total of 63 interviews conducted annually across the 21 LEA sites. Respondents will include key LEA staff and informed representatives of CMT agencies. The MSE team intends to interview the same individuals each year, although the number and identity of respondents may change with staff turnover or changes in the composition of the agency collaboration.

State Collaborator Survey (PCP Study): The number of respondents per state will vary according to the size of the SMT, the size of the state coalition network, and the number of individuals involved. The team will attempt to identify at least 208 individuals in total, approximately 29 per state, to participate in the state collaborator survey.

District Collaborator Survey (PCP Study): Each state SS/HS project team will be asked to supply a contact list of LEA staff and other community representatives directly involved in meetings, communications, or tasks related to the SS/HS grant. Up to 624 respondents will be targeted to complete the survey annually. Again, staff will aim to contact the same respondents for each administration.

State Collaboration Indicator Data Instrument (PCP Study): All seven state SS/HS grantees will be tasked with completing the collaboration indicator instrument on an ongoing basis. Respondents for the indicator data instrument will be project evaluators and/or program staff.

District Collaboration Indicator Data Instrument (PCP Study): Respondents for the District Collaboration Indicator Data Instrument will be project evaluators and/or program staff with knowledge of community-level activities. Each of the 21 LEA grantees will be required to complete the inventory.


Key Informant Interview (Implementation Study): The evaluation team expects to conduct 2 interviews per grantee per interview administration. A total of 14 interviews will be completed each year across the state/tribal grantees, and a total of 42 interviews will be conducted annually with respondents at the LEA/district level.
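
For illustration only, these interview counts and the total of 56 shown in Table 7 follow directly from the grantee counts described in this section (7 state/tribal grantees and 21 LEA grantees); the variable names in the sketch below are ours, not part of the data collection.

```python
# Illustrative arithmetic tying the Implementation Study interview counts to
# the grantee counts described above (assumed: 7 state/tribal grantees,
# 21 LEA grantees, 2 interviews per grantee per administration).
state_grantees = 7
lea_grantees = 21
interviews_per_grantee = 2

state_interviews = state_grantees * interviews_per_grantee     # 14 per year
district_interviews = lea_grantees * interviews_per_grantee    # 42 per year
total_interviews = state_interviews + district_interviews      # 56, as in Table 7
print(state_interviews, district_interviews, total_interviews)
```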


School-Level Survey (Implementation Study): The school-level survey will be conducted in all schools identified by each local education agency (LEA) SS/HS Project Coordinator as participating in the SS/HS State Program. Schools are selected by the LEAs, not by the MSE team. The survey population does not include students; it is defined as the school staff and other stakeholders (counselors, school resource officers, teachers, parents, and community members) who are directly involved in administering, coordinating, advising on, making decisions about, or delivering services within the participating schools. These are the persons in the best position to inform the MSE about the implementation of the program in their school. For this survey population, a modified census sample will be the most efficient and appropriate design: in each school, we will gather data from all individuals in the respondent categories unless there are more than 20 such individuals, in which case we will randomly select 10. With an 80% response rate, we anticipate an overall returned sample of approximately 2,100. This selection rule is sketched below.
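
A minimal sketch of the modified census selection rule follows. The roster, respondent labels, random seed, and fielded count in this sketch are hypothetical and for illustration only; they are not drawn from the study data.

```python
import random

def select_respondents(roster, cap=20, subsample=10, seed=None):
    """Modified census rule: survey everyone on a school's roster of eligible
    respondents unless the roster exceeds `cap`, in which case draw a simple
    random subsample of `subsample` individuals."""
    rng = random.Random(seed)
    if len(roster) <= cap:
        return list(roster)               # census of all eligible respondents
    return rng.sample(roster, subsample)  # random subsample for large rosters

# Hypothetical roster of 24 eligible staff members at one school.
roster = [f"respondent_{i}" for i in range(1, 25)]
print(len(select_respondents(roster, seed=1)))  # 10, since the roster exceeds 20

# Illustrative returned-sample arithmetic under the assumed 80% response rate.
fielded = 2625                      # hypothetical number of surveys fielded
print(round(fielded * 0.80))        # approximately 2,100 completed surveys
```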


Since the lowest unit of analysis in the study is the school, and the schools constitute the program population rather than a sample, generalizability to larger populations is not an issue at that level of analysis. This is a descriptive implementation evaluation, and we do not propose to draw inferences to other populations or schools. Variance-based precision estimates will be computed for the parameter estimates in each school as a quality check and to flag methodological caveats when necessary. Other internal sample analyses across schools, such as estimation of differences among observer groups (e.g., principals, parents, teachers, counselors, SROs), will be conducted for quality checks and possible analytic interpretation.

Because neither of the proposed samples is statistically representative of a larger population, and the samples at each unit of observation fall into very low-N nests (school within LEA, LEA within state, state within program), we do not anticipate using random coefficient or other hierarchical regression models, nor do we anticipate using intraclass correlation as a statistical adjustment. Aggregation to higher levels of analysis (e.g., schools to LEAs) will consist of averages of lower-unit averages; for example, school performance parameters are averages of key informant survey responses to performance report scales, and LEA parameters are averages of those school averages. Decisions on issues such as the relative desirability of weighted or unweighted averages will be made once the configuration of the data (e.g., variance characteristics) is better understood; an illustrative sketch of this aggregation appears below. Analyses will focus on comparative identification of lower-unit performance, both static and over time, as it is affected by higher-level context: for example, what portion of differences in school responses is attributable to differences among the schools themselves, to shared characteristics of the LEA context relevant to the SS/HS State Program (e.g., support, involvement), or to the state context.
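
The sketch below illustrates the aggregation of school averages to LEA averages under both the unweighted and the respondent-count-weighted approach. The school means, respondent counts, and LEA labels are hypothetical; as noted above, the actual weighting decision will be made once the variance characteristics of the data are known.

```python
from statistics import mean

# Hypothetical school-level results: each school's performance parameter is the
# average of its key informant survey responses on a performance report scale.
schools = [
    {"lea": "LEA_A", "school_mean": 3.2, "n_respondents": 8},
    {"lea": "LEA_A", "school_mean": 2.8, "n_respondents": 5},
    {"lea": "LEA_B", "school_mean": 3.6, "n_respondents": 10},
    {"lea": "LEA_B", "school_mean": 3.0, "n_respondents": 4},
]

def lea_averages(records, weighted=False):
    """Aggregate school means to LEA means: a simple average of school averages
    (unweighted) or an average weighted by each school's respondent count."""
    by_lea = {}
    for rec in records:
        by_lea.setdefault(rec["lea"], []).append(rec)
    result = {}
    for lea, recs in by_lea.items():
        if weighted:
            total_n = sum(r["n_respondents"] for r in recs)
            result[lea] = sum(r["school_mean"] * r["n_respondents"] for r in recs) / total_n
        else:
            result[lea] = mean(r["school_mean"] for r in recs)
    return result

print(lea_averages(schools))                 # unweighted averages of school averages
print(lea_averages(schools, weighted=True))  # averages weighted by respondent counts
```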




2. Procedures for Collection of Information

Staff from the state and LEA project teams will be instrumental in identifying respondents for the MSE data collection activities.

For the PCP State and District KIIs and the Implementation Study KIIs, program staff will be asked to supply a contact list of key informants who collaborate with the SS/HS program in roles appropriate for the data collection activity (see Attachment I for the KII recruitment script). The MSE team will make the final selection of respondents. Project directors and/or evaluators will be asked to contact tentative respondents to request their participation; the MSE team will then send an e-mail invitation to schedule an interview, with follow-up telephone calls made to nonresponsive parties if necessary. The interviews will be conducted over the phone and will last approximately an hour each. Prior to beginning the interview, each respondent will provide verbal consent. The interviews consist of a series of open-ended questions. Interviews will be audio recorded with the respondents' permission and transcribed for analysis; if a respondent declines recording, the interviewer will take notes instead.


Each SS/HS project team will be asked to identify agency contacts and create a contact list for the State Collaborator and District Collaborator Surveys. Once the list is finalized, project staff will be asked to inform selected participants and encourage their participation. The MSE team will send an e-mail invitation to potential respondents and follow up with a telephone call if necessary. The e-mail invitation will include a link to the web-based survey. Before initiating the survey, respondents will consent by checking yes on the consent screen. For subsequent administrations of the survey, the grantee will be asked to confirm that previous respondents are still involved in the same capacity and to provide updated contact information as necessary. To enhance the response rate, the MSE team will also employ Dillman's Tailored Design Method, following up with nonresponders one week after the initial contact and again three to seven weeks thereafter (Dillman, 2007).


The program updates provided in the State and District Collaboration Indicator Data Instruments will be reported quarterly through the SHEDS. The MSE team will train program staff to complete the form and will monitor completion. Each grantee will be provided, via e-mail, a unique username and password to log in to the web-based data entry form. No individually identifying information will be provided when completing the inventory; logging in and completing the inventory will constitute consent. Because many of the data elements will be collected on an ongoing basis, several on-demand reports will be available to provide real-time information on key elements of interest to SAMHSA.


After schools in each LEA are identified to participate in the School-Level Survey, program staff will supply contact information for up to 10 respondents per school. The survey link will be provided via e-mail to respondents along with their unique survey login and password (see Attachment H.2). Respondents will be given approximately 4 weeks to complete each survey, and follow-up with nonresponders will be via e-mail. Respondents will be asked to consent to the survey on an initial screen prior to launching the survey.



3. Methods to Maximize Response Rates

Efforts to maximize response rates across all activities will include ongoing training to identify procedures that improve participation of specific sites in all aspects of the evaluation. The timeline for data collection will be staggered to decrease burden on project teams and instrument respondents, and the administration window will be long enough to obtain maximum participation. In addition, activity-specific steps, described below, will be taken to increase response rates.


Methods to maximize response rates for the qualitative interviews (i.e., the Planning, Collaboration and Partnership State KIIs, the District KIIs, and the Implementation Study KIIs) include obtaining buy-in from key program stakeholders, providing flexibility in scheduling, and following up with nonresponders by phone and e-mail. In addition, local program staff will be asked to supply contact information for respondents, which will yield more accurate information and thus increase response rates. The MSE team will contact respondents at least three times to invite them to participate in the interviews. If any identified respondents remain nonresponsive, the MSE team will request that local program staff identify replacement respondents.


For the web-based activities, including the State Collaborator Survey, District Collaborator Survey, State Collaboration Indicator Data Instrument, District Collaboration Indicator Data Instrument, and School-Level Survey, training will be provided to help individuals complete the instruments, and a helpdesk will be available to assist respondents and troubleshoot any issues. Local program staff will also be asked to supply contact information for respondents, and respondents will receive reminder e-mails to complete the web-based surveys. Additionally, project management and evaluators will receive updates about periodic required reporting deadlines through calls with MSE contacts and government project officers. The MSE team will also use Dillman's Tailored Design Method to track and follow up with respondents (Dillman, 2007); this method is designed to achieve high return rates for mail and Internet surveys and will be employed for all of the surveys conducted as part of the MSE. If the response rate for any survey falls below 80%, the MSE team will conduct a nonresponse bias estimate using respondent data that are available for the full sample frame, as sketched below.
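
A minimal sketch of one form such a nonresponse bias check could take is shown below: a comparison of frame composition between respondents and the full sampling frame. The exact method is not prescribed in this statement, and the frame records, roles, and response flags in the sketch are hypothetical.

```python
from collections import Counter

# Hypothetical sampling frame in which each person's role is known and a flag
# records whether the person responded to the survey.
frame = [
    {"id": 1, "role": "teacher",   "responded": True},
    {"id": 2, "role": "counselor", "responded": False},
    {"id": 3, "role": "teacher",   "responded": True},
    {"id": 4, "role": "principal", "responded": True},
    {"id": 5, "role": "counselor", "responded": False},
]

def role_proportions(records):
    """Share of each role within a set of frame records."""
    counts = Counter(r["role"] for r in records)
    total = sum(counts.values())
    return {role: n / total for role, n in counts.items()}

frame_props = role_proportions(frame)
resp_props = role_proportions([r for r in frame if r["responded"]])

# Large composition differences flag groups that are under- or over-represented
# among respondents relative to the full frame.
for role, frame_share in frame_props.items():
    resp_share = resp_props.get(role, 0.0)
    print(f"{role}: frame={frame_share:.2f} respondents={resp_share:.2f} "
          f"difference={resp_share - frame_share:+.2f}")
```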


4. Tests of Procedures

As new measures were developed, standard instrument development procedures were used, including review of the literature, item development, and content review by experts in the field. The EAP will also review the instruments to ensure the content, language, and methodologies are appropriate. Some of the data collection activities include items from previously tested and fielded surveys; the majority of the items in the School-Level Survey come from reliable and valid measures used in previously developed instruments, including the EBPAS, MHSIS, SMHQAQ, and SMHCI.


Web-based instruments will undergo usability testing prior to fielding. Usability testing refers to pilot testing of the interface for administering questionnaires to determine the most efficient and understandable presentation. Typically, this is completed with a prototype and modifications are made before final fielding.



5. Statistical Consultants

The multi-site evaluator has full responsibility for the development of the overall statistical design, and assumes oversight responsibility for data collection and analysis. Training and monitoring of data collection will be provided by the MSE team. The individuals responsible for overseeing data collection and analysis are:


Christine M. Walrath, PhD

ICF Macro, Inc.

116 John Street, Fl. 8

New York, NY 10038

(212) 941-5555


The following individuals will serve as statistical consultants to this project:


Christine M. Walrath, PhD

ICF Macro, Inc.

40 Wall Street, 34th Floor

New York, NY 10005

(212) 941-5555


Lucas Godoy Garraza, MA

ICF Macro, Inc.

40 Wall Street, 34th Floor

New York, NY 10005

(212) 941-5555


The agency staff person responsible for receiving and approving contract deliverables is:


Melanie Brown, MPH, MA

Social Science Analyst

SAMHSA/DPTSSP

1 Choke Cherry Road

Room 6-1008

Rockville, MD 20857

Phone: (240) 276-1909










References

Aarons, G. A. (2004). Mental health provider attitudes toward adoption of evidence-based practice: The evidence-based practice attitude scale (EBPAS). Mental Health Services Research, 6(2), 61–74.

Aarons, G. A. (2006). Transformational and transactional leadership: Association with attitudes toward evidence-based practice. Psychiatric Services, 57(8), 1162–1168.

Adelman, H. S., & Taylor, L. (2006). The implementation guide to student learning supports in the classroom and schoolwide: New directions for addressing barriers to learning. Thousand Oaks, CA: Corwin Press, Sage.

Atkins, M., McKay, M., Arvanitis, P., London, L., Madison, S., Costigan, C., et al. (1998). An ecological model for school-based mental health services for urban low-income aggressive children. The Journal of Behavioral Health Services and Research, 5, 64–75.

Atkins, M., Hoagwood, K., Kutash, K., & Seidman, E. (2010). Toward the integration of education and mental health in schools. Administration and Policy in Mental Health, 37, 40–47.

Brandes, U., & Pich, C. (2011). Explorative visualization of citation patterns in social network research. Journal of Social Structure, 12(8).

Bruns, E. (2008). Measuring wraparound fidelity: The Resource Guide to Wraparound. Portland, OR: National Wraparound Initiative, Research and Training Center on Family Support and Children’s Mental Health, Portland State University.

Burton, D. L., Hanson, A., Levin, B., & Massey, O. T. (2012). School mental health. In M. Shally-Jensen (Ed.), Mental health care issues in America. Santa Barbara, CA: ABC-CLIO.

Co-Occurring Center for Excellence. (2006). Treatment, volume 1: Understanding evidence-based practices for co-occurring disorders. Rockville MD: Substance Abuse and Mental Health Services Administration, Center for Mental Health Services, Center for Substance Abuse Treatment.

Denzin, N. K. (1978). The research act: A theoretical introduction to sociological method. New York: McGraw-Hill.

Dillman, D. A. (2007). Mail and internet surveys: The tailored design method. 2007 update with new internet, visual and mixed mode guide. Hoboken, NJ: John Wiley and Sons.

Eisenhardt, K. (1989). Building theories from case study research. Academy of Management Review, 14(4), 532–550.

Gonzales, J., Ringeisen, H., & Chambers, D. (2002). The tangled and thorny path of science to practice: Tensions in interpreting and applying “evidence.” Clinical Psychology: Science and Practice, 9(2), 204–209.

Green, L. W. (2008). Making research relevant: If it is an evidence-based practice, where’s the practice-based evidence? Family Practice—An International Journal, 25, 20–24.

Greenhalgh, T., Robert, G., MacFarlane, F., Bate, P., & Kyriakidou, O. (2004). Diffusion of innovation in service organizations: Systematic review and recommendations. The Milbank Quarterly, 82(4), 581–629.

Hoagwood, K., Burns, B. J., Kiser, L., Ringeisen, B., & Schoenwald, S. K. (2001). Evidence-based practice in child and adolescent mental health services. Psychiatric Services, 52(9), 1179–1189.

Kutash, K., Duchnowski, A. J., & Green, A. L. (2011). School-based mental health programs for students who have emotional disturbances: Academic and social-emotional outcomes. School Mental Health, 3(4), 191–208.

Lehman, W., Greener J., & Simpson, D. (2002). Assessing organizational readiness for change. Journal of Substance Abuse Treatment, 22(4), 197–209.

LeCompte, M. D., & Schensul, J. J. (1999). Designing and conducting ethnographic research. Walnut Creek, CA: AltaMira.

Massey, O. T., Armstrong, K., Boroughs, M., Henson, K., & McCash, L. (2005). Mental health services in schools: A qualitative analysis of challenges to implementation, operation and sustainability. Psychology in the Schools, 42, 361–372.

Mellin, E. A., Bronstein, L., Anderson-Butcher, D., Amorose, A. J., Ball, A., & Green, J. (2010). Measuring interprofessional team collaboration in expanded school mental health: Model refinement and scale development. Journal of Interprofessional Care, 24(5), 514–523.

Merrill, M. L., Taylor, N. L., Martin, A. J., Maxim, L. A., D’Ambrosio, R., Gabriel, R. M., et al. (2012). A mixed-method exploration of functioning in Safe Schools/Healthy Students partnerships. Evaluation and Program Planning, 35(2), 280–286.

Miles, M., & Huberman, A.M. (1994). Qualitative Data Analysis. Thousand Oaks, CA: Sage Publications.

Owens, P. L., Hoagwood, K., Horwitz, S. M., Leaf, P. J., Poduska, J. M., Kellam, S. G., et al. (2002). Barriers to children’s mental health services. Journal of the American Academy of Child & Adolescent Psychiatry, 41(6), 731–738.

Penuel, W., Riel, M., Krause, A., & Frank, K. (2009). Analyzing teachers’ professional interactions in a school as social capital: A social network approach. Teachers College Record, 111(1), 124–163.

Prochaska, J.M., Prochaska, J.O., & Levesque, D.A. (2001). A transtheoretical approach to changing organizations. Administration and Policy in Mental Health, 28(4), 247–261.

Proctor, E. K., Knudson, K. J., Fedoravicius, N., Hovmand, P., Rosen, A., & Perron, B. (2007) Implementation of evidence-based practice in community behavioral health: Agency director perspectives. Administration and Policy in Mental Health and Mental Health Services Research, 34(5), 479–488.

Rones, M., & Hoagwood, K. (2000). School-based mental health services: A research review. Clinical Child and Family Psychology Review, 3, 223–241.

Sackett, D. L., Rosenberg, W. M. C., Gray, J. A. M., Haynes, R. B., & Richardson, W. S. (1996). Evidence-based medicine: What it is and what it isn’t. British Medical Journal, 312, 71–72.

Taylor, L., & Adelman, H. S. (2000). Toward ending the marginalization and fragmentation of mental health in schools. Journal of School Health, 70, 210–215.

Urban, J., & Trochim, W. (2009). The role of evaluation in research-practice integration: Working toward the golden spike. American Journal of Evaluation, 30(4), 538–553.

U.S. Department of Health and Human Services. (1999). Mental health: A report of the Surgeon General. Rockville, MD: U.S. Department of Health and Human Services, Substance Abuse and Mental Health Services Administration, Center for Mental Health Services, National Institutes of Health, National Institute of Mental Health.

Wandersman, A., Duffy, J., Flaspohler, P., Noonan, R., Lubell, K., Stillman, L., et al. (2008). Bridging the gap between prevention research and practice: The interactive systems framework for dissemination. American Journal of Community Psychology, 41, 171–181.

Weist, M.D., Sander, M. A., Walrath, C., Link, B., Nabors, L., Adelsheim, S., et al. (2005). Developing principles for best practice in expanded school mental health. Journal of Youth and Adolescence, 34(1), 7–13.

Yampolskaya, S., Massey, O. T., & Greenbaum, P. E. (2006). At-risk high school students in the “Gaining Early Awareness and Readiness” Program (GEAR UP): Academic and behavioral outcomes. The Journal of Primary Prevention, 27(5), 457–475.




List of Attachments

Attachment A

State Key Informant Interviews (PCP)

Attachment B

District Key Informant Interviews (PCP)

Attachment C

State Collaborator Survey (PCP)

Attachment D

District Collaborator Survey (PCP)

Attachment E

State Collaboration Indicator Data Instrument (PCP)

Attachment F

District Collaboration Indicator Data Instrument (PCP)

Attachment G

Key Informant Interviews (Implementation)

Attachment H.1

School Level Survey (Implementation)

Attachment H.2

School Level Survey Recruitment Script (Implementation)

Attachment I

Key Informant Interviews Recruitment Script



