
National Center for Education Statistics





Volume I

Supporting Statement





2017-2018 National Teacher and Principal Survey (NTPS)

Cognitive Interviews





OMB# 1850-0803 v.147



Attachments:

Attachment I – Recruitment Advertisements

Attachment II – Recruitment Protocol and Eligibility Screener

Attachment III – Sample Consent Form

Attachment IV – Cognitive Interview Protocol and Questionnaire Items






October 30, 2015



Submittal-Related Information

The following materials are being submitted under the National Center for Education Statistics (NCES) generic clearance agreement (OMB# 1850-0803) which provides for NCES to improve methodologies, question types, and/or delivery methods of its survey and assessment instruments by conducting field tests, focus groups, and cognitive interviews.

The request for approval described in this memorandum covers recruiting and screening activities and cognitive testing of questionnaire items for new modules that will be added to the 2017-18 National Teacher and Principal Survey (NTPS). This package provides information about the plans and procedures for conducting the cognitive testing and presents the question wording to be tested in the cognitive interviews.

Background

The National Center for Education Statistics (NCES) conducts the National Teacher and Principal Survey (NTPS), the redesign of the Schools and Staffing Survey (SASS). Since it began in 1987, SASS has been the U.S. Department of Education’s (ED’s) primary source of information on the teacher and principal labor market and on what is happening in K-12 schools from teachers’ and administrators’ perspectives. In 2012, NCES initiated the redesign of SASS. NCES’s vision for the redesigned survey is a highly flexible, timely collection that is integrated with other ED data. To that end, the NTPS includes modular content that can be easily swapped in and out of the survey, is fielded more frequently than SASS was, and provides data users with relevant extant data rather than burdening respondents with questions that have been asked in other ED collections. The NTPS is being fielded for the first time in 2015-16.

The content of the 2011-12 SASS formed the basis of the NTPS content, though some questions were shifted to different questionnaire instruments or will be answered through the use of extant data sources. The teacher instruments remained relatively similar to previous teacher instruments. The principal instruments, however, include items that were formerly fielded in separate school- and district-level questionnaires. The NTPS schoolwide instrument contains new and edited items from the 2011-12 SASS, though several items measuring school characteristics that were part of the 2011-12 SASS will be provided to NTPS data users from EDFacts, the Civil Rights Data Collection (CRDC), or the Private School Universe Survey (PSS) and will no longer be collected directly in the NTPS.

Each NTPS instrument has core sections with content that is included in every survey administration. In addition, the instruments include rotating modules that will be fielded in selected rounds. Although survey administrators will treat the modules as distinct sections from a procedural perspective, respondents will receive instruments in which the items are grouped in a logical order organized for the greatest clarity.

Cognitive interviews will be conducted to evaluate new items for several rotating NTPS modules on the topics of educator evaluation, professional development, classroom organization, and instructional time. The cognitive interviews will enable the team to identify problems with question wording, organization, and the order of the questions in the instruments.

Study Design, Context, and Respondent Burden

Three instruments are being prepared for cognitive testing:

  • Teacher Questionnaire,

  • Principal Questionnaire, and

  • School Staff Questionnaire.

Exhibit 1 summarizes the number of interviews to be conducted within each round, as well as the recruitment burden. In Round 1, we are developing and testing two versions of some of the new questions for each of the three instruments. Existing questions have only been slightly modified and will not necessarily be tested against a new version. Half of the respondents will receive version 1 of a new question first, then version 2; the other half will receive version 2 first, then version 1. During Round 2, only one version of the new question items will be presented. For the School Staff Questionnaire, we will test only one version in both rounds, since the proposed questions for that instrument are only modifications of questions used in the 2011-12 SASS.

To test these instruments, we will recruit teachers, principals, and school staff members who would be responsible for filling out the School Staff Questionnaire, representing both primary and secondary schools. We will strive to recruit urban, suburban, and rural schools and educators from a variety of backgrounds who range in age and years of experience, and to select schools that vary in size and demographic characteristics. In particular, we will attempt to recruit school staff from small and large schools in order to test the effect of school size on responses (e.g., a principal in a small school vs. a teacher’s aide in a large district). A maximum of 272 interviews will be completed; our goal is to obtain 108 interviews with teachers, 108 interviews with principals, and 56 interviews with school staff. In the first round, the 60-minute interview will allow time to complete a subset of the core sections of the NTPS survey and then time for think-aloud responses to in-depth probes of new question items. Based on prior experience, and to account for ineligible respondents and cancellations, we anticipate needing to recruit approximately 577 individuals to yield the 272 desired interviews.
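For reference, these targets sum as shown in Exhibit 1: 108 teacher + 108 principal + 56 school staff interviews = 272 completed interviews, and 141 teacher + 324 principal + 112 school staff recruitment contacts = 577 individuals to be recruited.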

Exhibit 1. Estimated Respondent Burden by Round of Cognitive Interviews, Instrument, and School Characteristic

Response Type | School Type | Time Estimate (minutes) | Participants, Round 1 | Participants, Round 2 | Total # of respondents | Total # of responses | Total burden hours
Teacher recruitment | all | 10 | 94 | 47 | 141 | 141 | 24
Principal recruitment | all | 10 | 216 | 108 | 324 | 324 | 54
School staff recruitment | all | 10 | 64 | 48 | 112 | 112 | 19
Subtotal (recruitment) | | | | | 577 | 577 | 97
Teacher Questionnaire | Primary | 60 | 36 | 18 | 54 | 54 | 54
Teacher Questionnaire | Secondary | 60 | 36 | 18 | 54 | 54 | 54
Subtotal (Teacher Questionnaire) | | | | | -* | 108 | 108
Principal Questionnaire | Primary | 60 | 36 | 18 | 54 | 54 | 54
Principal Questionnaire | Secondary | 60 | 36 | 18 | 54 | 54 | 54
Subtotal (Principal Questionnaire) | | | | | -* | 108 | 108
School Staff Questionnaire | Primary | 60 | 16 | 12 | 28 | 28 | 28
School Staff Questionnaire | Secondary | 60 | 16 | 12 | 28 | 28 | 28
Subtotal (School Staff Questionnaire) | | | | | -* | 56 | 56
Total Burden | | | | | 577* | 849 | 369

* Duplicate counts of the same respondents are not included in the total number of respondents.
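For reference, the burden-hour entries in Exhibit 1 appear to follow from multiplying each row’s total number of respondents by its time estimate and rounding to the nearest hour. For example, teacher recruitment is 141 respondents × 10 minutes = 1,410 minutes ≈ 24 hours, and each 60-minute Teacher Questionnaire row is 54 respondents × 60 minutes = 54 hours. The grand total is 97 + 108 + 108 + 56 = 369 burden hours.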

Cognitive interview respondents will be recruited in the Washington, DC area and through educational networks in the United States. Because education is heavily influenced by state policy, we will recruit teachers and principals from a diversity of states. In the D.C. region we will recruit from schools in Maryland, Virginia, West Virginia, and the District of Columbia. In the other areas, we will attempt to recruit teachers and principals through educator networks and conferences.

Respondents will be recruited through educational website ads, listservs, personal networks, and other professional networks. Characteristics for which we will attempt to recruit at least one teacher or principal include:

  • Teachers with alternative certification;

  • Teachers and principals from urban, suburban, and rural schools; and

  • Teachers who are full- or part-time.

See Attachment I for the language to be used in recruitment advertisements and Attachment II for the sample recruitment script and eligibility screener.

To reduce travel time and costs, we will attempt to conduct as many interviews as possible in the greater D.C. region, and we will schedule multiple interviews for the same day when possible. Respondents will receive materials by email, or on paper if their interview is scheduled in person, reminding them of their interview time and providing any other directions or information they need.

Researchers from the U.S. Census Bureau and contractors will administer the cognitive interviews. Between five and ten staff members will be trained to conduct the interviews to allow the greatest possible flexibility in interview scheduling. Interviewers will vary somewhat in their level of experience; however, all will be required to demonstrate a strong understanding of the interview protocol before beginning interviews. With respondent permission, the interviews will be audio-recorded so that responses may be reviewed as needed during analysis. The Center for Survey Measurement (CSM) staff will conduct interviewer training, supervise staff, and monitor data quality throughout the field period.

We have planned two rounds of cognitive interviewing. During both rounds, respondents will be given a subset of the current NTPS core sections relevant to their population (teacher, principal, or school staff). After completing these sections, respondents will be trained on the think-aloud technique. They will then complete the questionnaire independently while thinking aloud. The interviewer will observe the respondent during survey completion, probing spontaneously on any questions the respondent raises or any problems with particular items. After the respondent has completed the survey, the interviewer will review the instrument with the respondent. Throughout the interview, the interviewer will use scripted probes (see Attachment IV for details) and will also probe spontaneously about additional issues that arise for the respondent. The sets of questions to be cognitively tested in Round 1 will include a subset of items from the NTPS instruments’ core sections, plus one of two different versions of new or problematic items; respondents will be randomly assigned to determine which version of the new items they see first. These items will be tested iteratively in subsets within Round 1, with the protocol and question wording adapted based on the results of the previous subsets. Once these items have been tested and refined, Round 2 will include only one revised wording for each question.

Cognitive interviewing techniques allow researchers to evaluate potential response error and to help ensure that the survey provides valid data. In general, the goal of cognitive testing is to assess respondents’ comprehension of the questionnaire items, including question intent and the meaning of specific words and phrases in each item. Data from cognitive interviews can identify potentially problematic questions that are not understood as intended. This testing can also examine respondents’ retrieval of relevant information from memory, the decision processes involved in answering a question, and questions that are difficult to answer due to cognitive burden. Further, respondents’ ability to recall information, such as the content of their professional development, or to estimate information, such as time spent teaching each subject, can be tested.

The cognitive interviews will assess issues such as:

  • The respondent’s understanding of terms in the survey

  • How confident the respondent is in their response

  • How the respondent remembered the information they provided in factual questions

  • Whether the respondent found a response choice that fit their answer

  • How easy or difficult it is to answer a question

  • Issues with sensitive questions

  • Consistency of answers within the questionnaire and in comparison to the expected range of answers


The cognitive interview protocol is provided in Attachment IV.

Assurance of Confidentiality

Cognitive interview participants will be informed that their participation is voluntary and their responses may be used only for research purposes and may not be disclosed, or used, in identifiable form for any other purpose except as required by law (Education Sciences Reform Act of 2002; 20 U.S. Code, Section 9573). Participants will also be advised that the interview will be recorded and that the audio recording will only be reviewed for the purposes of report writing. Lastly, participants will be advised that direct quotes may be used in research papers and professional presentations, but names will never be attributed to anything a respondent says.

Participants will be assigned a unique respondent identifier (ID), which will be created solely for data file management and used to keep each respondent’s materials together. The respondent ID will not be linked to the respondent’s identity in any form. The signed consent forms will be kept separately from the interview files in a locked cabinet for the duration of the study and will be destroyed after the final report is released. A sample consent form is provided in Attachment III.

Estimate of Costs for Recruiting and Paying Respondents

The teacher and school staff respondents will be offered $40 for participating, to help motivate their participation and to thank them for their time, effort, and inconvenience. Because principals are a small population that is difficult to recruit, they will be offered $75 to participate. The incentives are also deemed necessary to ensure a diverse sample. Respondents will be paid in cash after completing the interview.

Estimate of Cost Burden

There are no direct costs to participants.

Cost to the Federal Government

The cost of conducting the cognitive interviews will be $200,000 under the NCES contract with CSM at the U.S. Census Bureau, which includes participant incentives of $40-$75 for each of the 272 interviews.

Project Schedule

Activity | Schedule
Recruit participants through networks and advertisements | November 2015-July 2016
Conduct Round 1 cognitive testing | November 2015-April 2016
Iterative revisions to item wording | January 2016-April 2016
Conduct Round 2 cognitive testing | May 2016-July 2016
Analysis and Final Recommendations | July 2016-November 2016
Final Wording for Questionnaire | January 2017


