Table of Contents
Background and Study Rationale
Sample and Recruitment Plans
Assurance of Confidentiality
Background and Study Rationale

The School Climate Surveys (SCLS) are a suite of survey instruments being developed for schools, school districts, and states by the U.S. Department of Education’s National Center for Education Statistics (NCES). This national effort extends current activities that measure school climate, including the state-level efforts of Safe and Supportive Schools (S3) grantees, which were awarded funds in 2010 by the Department of Education’s Office of Safe and Healthy Students (OSHS) to improve school climate. Through the SCLS, schools nationwide will have access to survey instruments and a survey platform that allow for the collection and reporting of school climate data across stakeholders at the local level. The surveys can be used to produce school-, district-, and state-level scores on various indicators of school climate from the perspectives of students, teachers and staff, principals, and parents and guardians. NCES will also provide benchmark data, collected from a nationally representative sample of schools across the United States, to facilitate comparisons between school climate scores at the local and national levels.
As part of the SCLS item development process, a portion of the survey items will be tested with 120 target respondents through cognitive interviews and usability testing in the summer of 2014, before the items are administered to a larger sample in the pilot and national benchmark studies. This document describes the types of testing we plan to conduct, the number of items to be tested, the sample and recruitment of participants, the data collection process, the estimated hourly burden, and the cost of the testing.
We will concurrently conduct two complementary types of testing: cognitive interviews and usability testing. The cognitive interviews will focus on the content of questionnaires, while the usability interviews will focus on the functionality of the survey platform. They will be conducted separately with different participants.
In cognitive interviews, an interviewer uses a structured protocol in a one-on-one interview that draws on methods from cognitive science. The cognitive interviews will investigate the cognitive processes that respondents use to answer survey questions. In particular, these interviews will identify problems of ambiguity or misunderstanding in question wording. We are particularly intent on ensuring that all items included in the final surveys are easily understood by respondents and that respondents’ interpretations are consistently aligned with the concepts being measured.
We plan to use two cognitive interviewing methods: think-aloud interviewing and verbal probing. With think-aloud interviewing, respondents are explicitly instructed to think aloud (i.e., describe what they are thinking) as they work through items. With verbal probing, the interviewer asks probing questions, as necessary, to clarify points that are not evident from the think-aloud process or to explore additional issues that have been identified a priori as being of particular interest; probes may be administered concurrently, as the respondent works through the items, or retrospectively, after the items have been completed. Cognitive interview studies produce qualitative data in the form of verbalizations made by participants during the think-aloud phase and in response to interviewer probes.
Usability testing will explore the interaction that respondents have with the prototype of the SCLS platform. This phase will take place after the development team has created basic working interaction elements for users to test. The actual tasks that participants will be asked to perform will be determined by the specific interaction elements or features created for the platform (e.g., toggling the questionnaire language from the English version to the Spanish version; re-accessing the survey after taking a break). In addition to the testing of the data collection functions by the different respondent groups, a group of school principals will be asked to test the reporting functions of the platform, as they are the target audience for these functions. For this group, usability testing tasks may include generating a summary survey results report, exporting a survey report, or locating item-level response frequencies. Users’ success or difficulty in completing assigned tasks will be analyzed to determine which information or control elements are missing or insufficient for successful task completion. After each set of tasks is completed, respondents will be asked to answer some questions on the ease of using the features or navigating the platform. They will also be asked for any comments they have about the task they just completed. All comments will be recorded and analyzed to help guide the development of the SCLS platform.
Some observations of behavior in the cognitive interviews and usability testing may also be noted by interviewers as supplementary information. Behavioral observations may include such things as nonverbal indicators of affect, suggesting emotional states such as frustration or engagement, as well as interactions with the task, such as ineffectual or repeated actions suggestive of misunderstanding or usability issues. Because we do not plan to have a second observer present during interviews or testing, behavioral observations will only be made if nonverbal indicators of affect are clearly demonstrated or noted by the interviewers.
Based on our review of existing school climate surveys and our conversations with the state and district personnel1 involved in administering these or similar surveys, we have concluded that the student, instructional staff, and non-instructional staff surveys should each take about 15-20 minutes (about 70-80 items) in order to keep respondent burden appropriate and survey administration smooth. Given the low response rates observed in existing school climate parent surveys, we have decided to keep the parent survey to around 5-8 minutes (about 20-30 items) in the hope of increasing the number of parents who respond. Building upon existing school climate surveys and the recommendations of the SCLS Technical Review Panel (TRP), we have selected the items needed for the four questionnaires to collect sufficient information about each of the three domains and 13 topical areas (figure 1). Extra items2 have been selected so that we have the flexibility to drop low-performing items before the pilot test and/or the national study.
The cognitive interviews will include three categories of items: (1) new items that have not been fielded previously; (2) items that have been fielded but for which validation information is not available; and (3) items from validated scales that have been revised based on the TRP’s recommendations. In instances where many items have been changed in the same way (e.g., reversing the order of “agree-disagree” response options), we will test a representative subsample of the items in order to minimize respondent burden. Table 1 below summarizes the number of items we plan to include in the cognitive interviews for each questionnaire. Because of the large number of items that require cognitive interviews, we will split the items into two or three groups by domain or topical area. We expect to interview at least five potential respondents for each item to explore their understanding of the items and answer options. Based on the interview results, we will consider dropping items that are confusing to respondents or making minor wording changes that improve item clarity without requiring another round of cognitive interviews.
Figure 1. SCLS model of school climate
Table 1. Number of items in cognitive interviews, by SCLS instrument
|  | Student | Teacher/Instructional staff | Principal/Non-instructional staff | Parent |
| --- | --- | --- | --- | --- |
| New items | 2 | 2 | 10 | 5 |
| Extant items, validation information unavailable | 48 | 55 | 72 | 32 |
| Extant items, from validated scales | 17 | 24 | 24 | 3 |
| Total items | 67 | 81 | 106 | 40 |
Sample and Recruitment Plans

NCES has contracted with the American Institutes for Research (AIR) to conduct the cognitive laboratory testing in July and August of 2014. Participants will include the target respondents of each of the four questionnaires: middle and high school students,3 teachers and instructional staff, principals and non-instructional staff, and parents of middle and high school students. They will be recruited from the District of Columbia and San Mateo, California, metropolitan areas (both near AIR offices) to maximize scheduling and interviewing efficiency and flexibility. To ensure that the Spanish translations of the student and parent surveys are understandable to Spanish speakers in the southern part of the United States, AIR will also attempt to test these translations in that region.
AIR will recruit participants representing a range of characteristics (including urban and suburban areas, students from a mix of grades, and staff serving in a variety of school roles). We will ensure that both principals and professional staff are included in the sample for testing the principal/non-instructional staff questionnaire, and that both middle and high schools are covered in the samples for the student and parent questionnaires. Note that although the sample will include a mix of characteristics, the results will not explicitly measure differences by these characteristics.
We will use multiple outreach methods and resources, such as marketing research companies, newspaper/internet ads, and contacts with schools and community organizations (e.g., libraries and summer or afterschool programs), to recruit participants. Paper flyers, e-mails, and phone calls will be used to contact potential participants. Interested participants will be screened with a scripted screener to ensure that they meet the criteria for participation in the interviews. During this communication, the parent or guardian of any interested minor will be informed about the objectives, purpose, and participation requirements of the data collection effort, as well as the activities it entails. Only after AIR has obtained written consent from the parent or guardian will the student be allowed to participate in a cognitive laboratory testing session. We will try to recruit parent-student pairs during the screening. For selected participants, AIR will confirm the interview date, time, and location via e-mail, letter, or telephone. Principals, non-instructional staff, teachers, students, and the parents or legal guardians of participating students will complete consent forms at the time of the interview.
At least five people will provide feedback on each item or task. Based on the groups of people needed to test all items in the cognitive interviews and the different functions in usability testing, we plan to recruit 58 people for the cognitive interviews for the four questionnaires and 27 people for usability testing of the data collection and reporting functions. Because there will also be a Spanish-language version of the student and parent questionnaires, an additional 25 people will be recruited for cognitive interviews and an additional 10 people for usability testing. Table 2 summarizes the total number of participants needed for cognitive laboratory testing.
Table 2. Sample size: cognitive interviews and usability testing
|  | Student (English) | Student (Spanish) | Teacher/Instructional staff | Principal/Non-instructional staff | Parent (English) | Parent (Spanish) | Total |
| --- | --- | --- | --- | --- | --- | --- | --- |
| Cognitive interviews | 15 | 15 | 15 | 18 | 10 | 10 | 83 |
| Usability testing | 5 | 5 | 5 | 12 | 5 | 5 | 37 |
| Total | 20 | 20 | 20 | 30 | 15 | 15 | 120 |
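As a check on these figures, the counts in table 2 reconcile with the recruitment totals described above; the arithmetic below restates only numbers already reported:

\begin{align*}
\text{Cognitive interviews (English)} &= 15 + 15 + 18 + 10 = 58\\
\text{Cognitive interviews (Spanish)} &= 15 + 10 = 25\\
\text{Usability testing (English)} &= 5 + 5 + 12 + 5 = 27\\
\text{Usability testing (Spanish)} &= 5 + 5 = 10\\
\text{Total participants} &= 58 + 25 + 27 + 10 = 120
\end{align*}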
The cognitive laboratory testing will take place in a range of locations and/or facilities. In most cases, participants will be invited to AIR offices, but depending on scheduling and participants, some testing may take place in schools. In all cases, an appropriate environment, such as a quiet room, will be used to conduct the interviews.
Participants will first be welcomed, introduced to the interviewer and the observer (if an in-room observer is present), and told that they are there to help us understand how people answer school climate survey items. All participants will be reassured that their participation is voluntary and that their answers may be used only for research purposes and may not be disclosed, or used, in identifiable form for any other purpose except as required by law [Education Sciences Reform Act of 2002, 20 U.S.C. §9573]. Interviewers will explain the think-aloud process and conduct a practice session with a sample question.
The think-aloud component of the cognitive interviews will use either (1) a concurrent think-aloud method, in which the participant verbalizes his or her thoughts while working through a questionnaire, or (2) a retrospective think-aloud method, in which participants work through a questionnaire silently and then discuss their thoughts about the item content while working through it a second time.
The cognitive interviews will also include a verbal probing component conducted after completion of the think-aloud portion for a given questionnaire. The verbal probes will include a combination of pre-planned item-specific questions, identified before the session as important, and ad hoc questions that the interviewer identifies as important from observations during the interview, such as clarifications or expansions on points raised by the participant. To minimize the burden on the participant, efforts will be made to limit the number of verbal probes used in any one session. The protocols will contain generic prompts to be applied flexibly by the interviewer to facilitate and encourage participants to verbalize their thoughts, for example: “I see you’re looking at the answer. What are you thinking?” or “Are there any questions or words that seem confusing here?” Observers will take notes on what participants say, and the sessions will be audio-recorded.
A very similar process will be used for the usability testing. Instead of asking participants to focus on the content of the survey items, interviewers will give an overview of the survey platform and then ask participants to complete certain tasks. Participants can use either a concurrent or a retrospective think-aloud method to let interviewers know what they are thinking while working on a task, as well as about any confusion they may have. During the think-aloud process, interviewers will not provide any assistance to the participants or answer any questions directly related to the tasks. If a participant cannot move forward for a prolonged period of time or asks for help, the interviewer will respond only with a generic prompt, such as “Is there anything else you can do to help you move forward?” or “What do you think should happen/be available here?” After the think-aloud activity is completed, the interviewer will conduct a verbal probing component on the ease of navigating the platform and participants’ experiences with the features the platform provides. Interviewers can use probing questions such as “Did you find it difficult to switch from the English version to the Spanish version?” or “Are any of the buttons/links on this page confusing?” As in the cognitive interviews, observers will take notes on what participants say, and the sessions will be audio-recorded.
For the testing of the data collection instruments and platform, the key unit of analysis is the item (for the cognitive interviews) or the task (for the usability testing). Items and tasks will be analyzed across participants.
The types of data collected about the tasks or items will include:
- think-aloud verbal reports;
- responses to generic questions prompting participants to think out loud;
- responses to targeted questions specific to an item or task; and
- additional comments volunteered by participants or behavioral observations noted by interviewers.
The general analysis approach will be to compile the different types of data to facilitate identification of patterns of responses or issues for specific items or tasks. This overall approach will help to ensure that the data are analyzed in a thorough and systematic way that enhances the identification of problems with items or tasks and provides recommendations for addressing them.
Assurance of Confidentiality

Participants will be notified that their participation is voluntary and that their answers may be used only for research purposes and may not be disclosed, or used, in identifiable form for any other purpose except as required by law [Education Sciences Reform Act of 2002, 20 U.S.C. §9573].
Written consent will be obtained from participants who are over the age of 18 and from the parents or legal guardians of students who are under the age of 18. Participants will be assigned a unique identifier (ID), created solely for data file management and used to keep all participant materials together. The participant ID will not be linked to the participant’s name in any way. The consent forms, which include the participant’s name, will be separated from the participant interview files, secured for the duration of the study, and destroyed after the final report is completed.
The interviews will be audio-recorded. The only identification included in the audio files will be the participant ID. The recorded files will be secured for the duration of the study, with access limited to key AIR project staff, and will be destroyed after the final report is submitted.
The estimated burden for recruitment assumes attrition throughout the process.4 The initial contact and response is estimated at 3 minutes, or 0.05 hours. The follow-up phone call to screen participants and/or answer any questions that they (or their parents or legal guardians) have is estimated at 9 minutes, or 0.15 hours. The follow-up to confirm participation is estimated at 3 minutes, or 0.05 hours. All interviews will be scheduled for no more than 60 minutes. Table 3 details the estimated burden for the SCLS cognitive laboratory testing.
Table 3. Estimate of hourly burden for recruitment and participation in SCLS cognitive laboratory testing
| Type | Hours per respondent | Number of respondents | Total hours |
| --- | --- | --- | --- |
| Recruitment |  |  |  |
| Initial contact | 0.05 | 430 | 22 |
| Follow-up via phone or e-mail | 0.15 | 215 | 32 |
| Confirmation | 0.05 | 150 | 8 |
| Subtotal |  | 430 | 62 |
| Participation |  |  |  |
| Students | 1 | 40 | 40 |
| Parents | 1 | 30 | 30 |
| Teachers/Instructional staff | 1 | 20 | 20 |
| Principals/Non-instructional staff | 1 | 30 | 30 |
| Subtotal |  | 120 | 120 |
| Total burden (915 responses) |  | 430 | 182 |
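For reference, the recruitment counts and hour totals in table 3 follow from the per-contact time estimates above and the attrition assumptions in footnote 4 (hours rounded to whole numbers):

\begin{align*}
\text{Follow-up contacts} &= 430 \times (1 - 0.50) = 215\\
\text{Confirmations} &\approx 215 \times (1 - 0.30) \approx 150\\
\text{Participants} &= 150 \times (1 - 0.20) = 120\\
\text{Recruitment hours} &\approx 430(0.05) + 215(0.15) + 150(0.05) \approx 22 + 32 + 8 = 62\\
\text{Total responses} &= 430 + 215 + 150 + 120 = 915\\
\text{Total burden hours} &= 62 + 120 = 182
\end{align*}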
Marketing research companies, such as Shugoll, charge a management fee, in addition to a recruitment fee, for every participant obtained through their networks.
To encourage participation, and to thank participants for their time and effort, all participants will be offered an incentive if their participation does not occur in a school building during normal school hours. Parents, teachers, and staff will be offered $40 for their participation. Students will be offered $25 for their participation, and parents or legal guardians who are not themselves participating in the testing but who bring their student to and from the testing location will also receive $25 for the time and cost of providing transportation for the student participant.
The estimated costs of the cognitive laboratory testing activities in this submittal are described in table 4.
Table 4. Estimated costs of cognitive testing activities
| Activity | Provider | Estimated cost |
| --- | --- | --- |
| Cognitive interviews: design, prepare for, and conduct cognitive interviews (including recruitment, incentive costs, data collection, analysis, and reporting) | AIR | $102,040 |
| Usability testing: design, prepare for, and conduct usability testing (including recruitment, incentive costs, data collection, and reporting) | AIR/Sanametrix | $103,600 |
| Total |  | $205,640 |
Table 5 depicts the high-level schedule for the various activities. Each activity includes recruitment, data collection, analyses, and reporting. The commencement of activities is contingent upon OMB approval.
Table 5. High-level schedule of milestones
| Activity | Dates |
| --- | --- |
| Study design | March – April 2014 |
| Recruitment | May – July 2014 |
| Cognitive laboratory testing | June – August 2014 |
| Cognitive laboratory testing report | September – October 2014 |
1 These include the nine district personnel we interviewed while preparing the position paper on data collection tools, the members of the Technology Committee of the NCES Winter Forum, and the representatives of the four S3 grantees.
2 About 40 to 50 percent more items have been selected for each questionnaire, in anticipation that low-performing items will be dropped during scale construction and validation.
3 The SCLS instruments are being developed to measure school climate in middle and high schools.
4 Assumed approximate attrition rates for direct participant recruitment are 50 percent from initial contact to follow-up, 30 percent from follow-up to confirmation, and 20 percent from confirmation to participation.