NCES System Clearance for Cognitive,
Pilot, and Field Test Studies 2019-2022
OMB# 1850-0803 v.248
Supporting Statement Part A
Prepared by:
National Center for Education Statistics
U.S. Department of Education
Washington, DC
March 2019
Part A JUSTIFICATION
A.1 Importance of Information
A.2 Purposes and Uses of the Data
A.3 Improved Information Technology (Reduction of Burden)
A.4 Efforts to Identify Duplication
A.5 Minimizing Burden for Small Institutions
A.6 Frequency of Data Collection
A.7 Special Circumstances
A.8 Consultations Outside NCES
A.9 Payments or Gifts to Respondents
A.10 Assurance of Confidentiality
A.11 Justification for Sensitive Questions
A.12 Estimates of Burden
A.13 Total Annual Cost Burden
A.14 Annualized Cost to Federal Government
A.15 Program Changes or Adjustments
A.16 Plans for Tabulation and Publication
A.17 Display OMB Expiration Date
A.18 Exceptions to Certification Statement
PART B COLLECTION OF INFORMATION EMPLOYING STATISTICAL METHODS
B.1 Respondent Universe and Response Rates
B.2 Procedures for Collection of Information
B.3 Maximizing Response Rates
B.4 Tests of Procedures
B.5 Individuals Consulted on Statistical Design
Appendix A NCES Authorization Legislation
1. Importance of Information
This is a request for a 3-year renewal of the generic clearance for the National Center for Education Statistics (NCES) that will allow it to continue to develop, test, and improve its survey and assessment instruments and methodologies. The procedures used for this purpose include, but are not limited to, experiments with levels of incentives for study participants, tests of various types of survey operations, focus groups, cognitive laboratory activities, pilot testing, exploratory interviews, experiments with questionnaire design, and usability testing of electronic data collection instruments.
This generic testing clearance is a valuable vehicle for evaluating questionnaires/assessments and various data collection procedures. It has allowed NCES to take advantage of a variety of methods to identify questionnaire/assessment and procedural problems, suggest solutions, and measure the relative effectiveness of alternative solutions. Through the use of these techniques, employed routinely in the testing phase of NCES surveys, questionnaires and assessments have been simplified for respondents, respondent burden has been reduced, and the quality of the questionnaires and assessments used in continuing and one-time surveys and assessments has been improved. As a result, the quality of the data collected through these surveys has improved as well.
During the three-year generic clearance, NCES will provide periodic reports on the testing activities, which, in addition to methods used in the past, may include expanded field tests involving split sample questionnaire experiments in multiple panels and usability testing of electronic data collection instruments. These activities will focus on the testing of items and on research about incentives (cash and non-cash), mode (telephone, paper and pencil, computer-based, mail-out and mail-in, etc.), and other questionnaire and assessment methodologies.
This request for clearance describes the scope of the activities that might be covered, and NCES requests the same conditions that were included in the previous clearance agreement for Cognitive, Pilot, and Field Test Studies (OMB# 1850-0803 v.153 and 194), approved on July 8, 2016, with a change request approved on April 13, 2017. This generic system clearance will go through the usual two Federal Register review periods, after which NCES requests that OMB review and clear proposed studies within a two-week period, with no Federal Register notice period required under the generic clearance. This clearance is similar to the testing clearances held by the Census Bureau, the Bureau of Labor Statistics, and the National Science Foundation, which allow those statistical agencies to develop, redesign, and test data collection instruments and procedures in a timely manner.
Some of the programs that have submitted developmental studies under this clearance in the last three years include the Early Childhood Longitudinal Studies (ECLS), Middle Grades Longitudinal Study (MGLS), High School and Beyond 2020 (HS&B:20), International Early Learning Study (IELS), Trends in International Mathematics and Science Study (TIMSS), International Computer and Information Literacy Study (ICILS), National Assessment of Educational Progress (NAEP), National Household Education Surveys (NHES), Adult Training and Education Survey (ATES), Quick Response Information System (QRIS) studies, Teaching and Learning International Survey (TALIS), National Teacher and Principal Survey (NTPS), School Survey on Crime and Safety (SSOCS), School Crime Supplement to the National Crime Victimization Survey (SCS), ED School Climate Surveys (EDSCLS), and a variety of postsecondary survey activities, including the National Postsecondary Student Aid Study (NPSAS), Baccalaureate and Beyond Longitudinal Study (B&B), and Integrated Postsecondary Education Data System (IPEDS). We anticipate that other NCES programs will also be able to use this clearance for developmental projects.
The specific methods proposed for coverage by this clearance are described below. Also outlined are the procedures currently in place for keeping OMB informed about the identity of the surveys and the nature of the research activities being conducted.
The methods proposed for use in questionnaire and assessment development are as follows:
Field or pilot test. For the purposes of this clearance, we define field tests as data collection efforts conducted among either purposive or statistically representative samples, for which evaluation of the questionnaire and/or procedures is the main objective, and from which only research and development (R&D) and methodological reports may be published; no statistical reports or data sets will be published based on these collections. Field tests are an essential component of this clearance package because they serve as the vehicle for investigating basic item properties, such as reliability, validity, and difficulty, as well as the feasibility of methods for standardized administration (e.g., computerized administration) of forms. Under this clearance a variety of surveys will be tested, and the exact nature of the surveys and the samples is undetermined at present. However, given the smaller scale of the tests, we expect that some will not involve representative samples. In these cases, samples will essentially be convenience samples, limited to specific geographic locations, and may involve expired rotation groups of a current survey or blocks known to have specific aggregate demographic characteristics. The needs of the particular sample will vary based on the content of the survey being tested, but the selection of sample cases will not be completely arbitrary in any instance.
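As a purely illustrative aid, the minimal Python sketch below shows one way the basic item properties named above, such as difficulty and internal-consistency reliability, might be computed from scored field test responses; the score matrix, sample size, and item count are entirely hypothetical and do not describe any NCES scoring procedure.

    # Illustrative sketch only: hypothetical scored field test responses,
    # one row per respondent and one column per item (1 = correct, 0 = incorrect).
    scores = [
        [1, 1, 1, 1],
        [1, 1, 1, 0],
        [1, 1, 0, 0],
        [0, 1, 0, 0],
        [0, 0, 0, 0],
        [1, 1, 1, 1],
    ]

    n_items = len(scores[0])
    n_respondents = len(scores)

    # Item difficulty: the proportion of respondents answering each item correctly.
    difficulty = [sum(row[j] for row in scores) / n_respondents for j in range(n_items)]

    def variance(values):
        # Sample variance (n - 1 denominator).
        mean = sum(values) / len(values)
        return sum((v - mean) ** 2 for v in values) / (len(values) - 1)

    # Cronbach's alpha: a common internal-consistency reliability estimate.
    item_variances = [variance([row[j] for row in scores]) for j in range(n_items)]
    total_scores = [sum(row) for row in scores]
    alpha = (n_items / (n_items - 1)) * (1 - sum(item_variances) / variance(total_scores))

    print("Item difficulties:", [round(d, 2) for d in difficulty])
    print("Cronbach's alpha: %.2f" % alpha)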
Behavior coding. This method involves applying a standardized coding scheme to the completion of an interview or questionnaire, either by a coder using a tape-recording of the interview or by a "live" observer at the time of the interview. The coding scheme is designed to identify situations that occur during the interview that reflect problems with the questionnaire. For example, if respondents frequently interrupt the interviewer before the question is completed, the question may be too long. If respondents frequently give inadequate answers, this suggests there are other problems with the question. Quantitative data derived from this type of standardized coding scheme can provide valuable information to identify problem areas in a questionnaire, and research has demonstrated that this is a more objective and reliable method of identifying problems than the traditional interviewer debriefing, which is typically the sole tool used to evaluate the results of a traditional field test (New Techniques for Pretesting Survey Questions by Cannell, Kalton, Oksenberg, Bischoping, and Fowler, 1989).
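As a hedged illustration of how quantitative data might be derived from such a coding scheme, the Python sketch below tallies behavior codes by question and flags questions whose problem rate exceeds a chosen cutoff; the question labels, code names, observations, and flagging threshold are all hypothetical.

    # Illustrative sketch only: hypothetical behavior codes recorded during
    # monitored interviews, one (question, code) pair per question administration.
    from collections import defaultdict

    observations = [
        ("Q1", "adequate_answer"), ("Q1", "interruption"), ("Q1", "adequate_answer"),
        ("Q2", "request_clarification"), ("Q2", "inadequate_answer"), ("Q2", "adequate_answer"),
        ("Q3", "adequate_answer"), ("Q3", "adequate_answer"), ("Q3", "adequate_answer"),
    ]

    PROBLEM_CODES = {"interruption", "request_clarification", "inadequate_answer"}
    FLAG_THRESHOLD = 0.15  # hypothetical cutoff for flagging a question for review

    totals = defaultdict(int)
    problems = defaultdict(int)
    for question, code in observations:
        totals[question] += 1
        if code in PROBLEM_CODES:
            problems[question] += 1

    for question in sorted(totals):
        rate = problems[question] / totals[question]
        status = "REVIEW" if rate > FLAG_THRESHOLD else "ok"
        print(f"{question}: problem rate {rate:.0%} ({status})")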
Interviewer debriefing. This method employs the knowledge of the employees who have the closest contact with the respondents. In conjunction with other methods, we plan to use this method in our field tests to collect information about how interviewers react to the survey instruments.
Exploratory interviews. These may be conducted with individuals to understand a topical area and may be used in the very early stages of developing a new survey. They may cover discussions related to administrative records (e.g. what types of records, where, and in what format), subject matter, definitions, etc. Exploratory interviews may also be used in evaluating whether there are sufficient issues related to an existing data collection to consider a redesign.
Respondent debriefing questionnaire. In this method, standardized debriefing questionnaires are administered to respondents who have participated in a field test. The debriefing form is administered at the end of the questionnaire being tested, and contains questions that probe how respondents interpret the questions and whether they have problems in completing the survey/questionnaire. This structured approach to debriefing enables quantitative analysis of data from a representative sample of respondents, to learn whether respondents can answer the questions, and whether they interpret them in the manner intended by the questionnaire designers.
Follow-up interviews (or reinterviews). This involves re-interviewing or re-assessing a sample of respondents after the completion of a survey or assessment. Responses given in the reinterview are compared with the respondents’ initial responses for consistency. In this way, reinterviews provide data for studies of test-retest reliability and other measures of the quality of the data collected. In turn, this information aids in the development of more reliable measures.
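For illustration only, the short Python sketch below summarizes hypothetical reinterview data by computing percent agreement and the gross difference rate between original and reinterview answers to a single categorical question; the responses shown are invented for the example.

    # Illustrative sketch only: hypothetical original and reinterview responses
    # to the same categorical question for a small reinterview sample.
    original = ["yes", "no", "yes", "yes", "no", "yes", "no", "yes"]
    reinterview = ["yes", "no", "no", "yes", "no", "yes", "yes", "yes"]

    pairs = list(zip(original, reinterview))
    agreements = sum(1 for a, b in pairs if a == b)

    # Gross difference rate: the share of cases whose answer changed between interviews.
    gross_difference_rate = 1 - agreements / len(pairs)

    print(f"Percent agreement: {agreements / len(pairs):.0%}")
    print(f"Gross difference rate: {gross_difference_rate:.0%}")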
Split sample experiments. This involves testing alternative versions of questionnaires and other collection methods, such as mailing packages and incentive treatments, at least some of which have been designed to address problems identified in draft questionnaires or questionnaires from previous survey waves. The use of multiple questionnaires, randomly assigned to permit statistical comparisons, is the critical component here; data collection can include mail, telephone, Internet, personal visit interviews, or group sessions at which self-administered questionnaires are completed. Comparison of revised questionnaires against a control version, preferably, or against each other, facilitates statistical evaluation of the performance of alternative versions of the questionnaire. Split sample tests that incorporate questionnaire design experiments are likely to have a large sample size (e.g. several hundred cases per panel) to enable the detection of statistically significant differences, and facilitate methodological experiments that can extend questionnaire design knowledge more generally for use in a variety of data collection instruments.
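As a hedged sketch of the kind of statistical comparison such split sample experiments support, the Python example below applies a two-proportion z-test to compare an outcome rate (for example, item response) between a control questionnaire panel and an alternative version; the panel sizes and counts are hypothetical.

    # Illustrative sketch only: hypothetical counts from a split sample experiment
    # comparing a control questionnaire panel with an alternative version.
    import math

    control_successes, control_n = 312, 400      # e.g., respondents answering the item
    alternate_successes, alternate_n = 348, 400

    p1 = control_successes / control_n
    p2 = alternate_successes / alternate_n
    pooled = (control_successes + alternate_successes) / (control_n + alternate_n)

    # Two-proportion z-test for the difference between panels.
    se = math.sqrt(pooled * (1 - pooled) * (1 / control_n + 1 / alternate_n))
    z = (p2 - p1) / se

    # Two-sided p-value from the normal approximation.
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))

    print(f"Control rate: {p1:.1%}, alternative rate: {p2:.1%}, z = {z:.2f}, p = {p_value:.4f}")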
Cognitive and usability interviews. This method involves intensive, one-on-one interviews in which the respondent is typically asked to "think aloud" as he or she answers survey questions. A number of different techniques may be involved, including asking respondents to paraphrase questions, probing questions asked to determine how respondents came up with their answers, and so on. The objective is to identify problems of ambiguity or misunderstanding, or other difficulties respondents have answering questions. This is frequently the first stage in revising a questionnaire.
Focus groups. This method involves group sessions guided by a moderator, who follows a topical outline containing questions or topics focused on a particular issue, rather than adhering to a standardized questionnaire. Focus groups are useful for surfacing and exploring issues which people may feel some hesitation about discussing (e.g., confidentiality concerns).
Procedures for Clearance
Before a testing activity is undertaken, NCES will provide OMB with a memo describing the study to be conducted and a copy of the questionnaires and debriefing materials to be used. Depending on the stage of questionnaire development, this may be a printed questionnaire, a set of prototype items showing each item type to be used and the range of topics to be covered by the questionnaire, or an interview script. When split sample experiments are conducted, either in small group sessions or as part of a field test, the different versions of the questionnaire to be used will be provided. For a test of alternative procedures, the description and rationale for the procedures will be submitted. A brief description of the planned field activity will also be provided. NCES requests that OMB raise comments on substantive issues within 10 working days of receipt.
Data collection for this project is authorized under the legislation authorizing the questionnaire being tested. In most cases, this is the Education Sciences Reform Act (ESRA 2002, 20 U.S.C. §9543), although other legislation, such as Title 13 or 15, may apply for surveys conducted in concert with other Federal agencies. A copy of ESRA 2002 (20 U.S.C. §9543) is attached. We do not yet know what other titles will be referenced, since we do not know which survey questionnaires will be tested during the course of the clearance. If a statute other than ESRA applies, the relevant authorizing statute will be specified.
2. Needs and Uses
The information collected in this program of developing and testing questionnaires will be used by staff from NCES and sponsoring agencies to evaluate and improve the quality of surveys and assessments before they are conducted. None of the data collected under this clearance will be published for its own sake. Because the questionnaires being tested under this clearance are still in the process of development, the data that result from these collections are not considered official statistics of NCES or other Federal agencies. Data will not be made public except when included in research reports prepared for sponsors inside and outside of NCES. The results may also be prepared for presentations on survey methodology at professional meetings or for publication on the NCES website and in professional journals.
Information quality is an integral part of the pre‑dissemination review by NCES (fully described in NCES’s Statistical Standards and IES Style Guide, both of which can be found at http://nces.ed.gov/statprog/standards.asp). Information quality is also integral to the information collections conducted by NCES and is incorporated into the clearance process required by the Paperwork Reduction Act (PRA).
During the past three years this generic testing clearance has been used for:
1850-0803 v.162 NCES Confidentiality Pledges Cog Labs
1850-0803 v.163 NHES 2017 Web Test
1850-0803 v.164 NAEP 2019 Science Items Pretesting
1850-0803 v.165 B&B:08/18 Cognitive and Usability Testing
1850-0803 v.166 FRSS 108: Career & Technical Education in Districts Pretest
1850-0803 v.167 TALIS 2018 Principals Focus Group
1850-0803 v.168 B&B:16/17 Cognitive & Usability Testing
1850-0803 v.169 eNAEP Pretesting Study 2016 Updated (revised v.159)
1850-0803 v.170 NAEP SAIL Pretesting Activities - English Updated (revised v.106)
1850-0803 v.171 SSOCS 2018 Cognitive Interviews 2016
1850-0803 v.172 NAEP Middle School Transcript Study Pilot
1850-0803 v.173 MGLS:2017 Student Assessment Usability Study
1850-0803 v.174 NAEP Oral Reading Fluency Pilot Study 2017
1850-0803 v.175 NAEP Science Questionnaire Cog Labs 2017
1850-0803 v.176 NAEP SAIL Collaboration in ELA Study 2017
1850-0803 v.177 IPEDS 2017 Time Use and Burden Cog Labs Round 1
1850-0803 v.178 EDSCLS 2016 Additional Items Cog Labs -Set 2 Round 2
1850-0803 v.179 NAEP Survey and Cognitive Items Pretesting for 2017 & 2018 Pilots (revised v.155)
1850-0803 v.180 NAEP 2017 Feasibility Study of Middle School Transcript Study (MSTS)
1850-0803 v.181 NAEP DBA Usability Study 2017-18
1850-0803 v.182 NHES 2017 Web Test Updated (revised v.163)
1850-0803 v.183 NCER-NPSAS Grant - Financial Aid Nudges Focus Groups
1850-0803 v.184 NCER- NPSAS Grant Study – CSFA 2017 Focus Groups
1850-0803 v.185 SSOCS 2018 Principals Focus Groups
1850-0803 v.186 NHES:2019 Virtual Education Parents Focus Groups
1850-0803 v.187 NHES 2017 Web Test Debriefing Interviews
1850-0803 v.188 NHES 2019 Low Response Parents Focus Groups
1850-0803 v.189 NTPS 2017-18 Portal Usability Testing
1850-0803 v.190 IELS 2018 Items Trial
1850-0803 v.191 NCER-NPSAS Grant Study – CSFA 2017 Cog Labs
1850-0803 v.192 IPEDS 2017 Time Use and Burden Cog Labs Round 1
1850-0803 v.193 NAEP SBTs Pretesting for 2019 Pilot 4 & 8 Grade
1850-0803 v.194 NCES Cognitive, Pilot, and Field Test Studies System Clearance - Confidentiality Pledge Change Request 83C
1850-0803 v.195 NHES 2019 ATES Cog Labs
1850-0803 v.196 NAEP Grade 4 Writing Prompts Study
1850-0803 v.197 NAEP Grade 8 Social Sciences IICs Pretesting
1850-0803 v.198 NHES 2019 Early Childhood Cog Labs
1850-0803 v.199 NAEP TEL and eNAEP Pretesting Study
1850-0803 v.200 NHES 2019 Types of Schooling Cognitive Interviews
1850-0803 v.201 NAEP SES Indicator Items Development Studies
1850-0803 v.202 FRSS 109: Teachers’ Use of DLR for Teaching – Feasibility Calls
1850-0803 v.203 NAEP 2018 Assessment Delivery Study
1850-0803 v.204 B&B:08/18 Cognitive & Usability Testing
1850-0803 v.205 NHES ATES Spanish Language Cognitive Interviews
1850-0803 v.206 TALIS 2018 Video Feasibility Study
1850-0803 v.207 ICILS 2018 Pretest
1850-0803 v.208 SSOCS 2018 Usability Testing
1850-0803 v.209 NAEP 2018 Field Trial
1850-0803 v.210 NAEP 2018 Assessment Delivery Study - Revised (revised v.203)
1850-0803 v.211 NAEP TEL and eNAEP Pretesting Study - Revised (revised v.199)
1850-0803 v.212 NHES 2019 Types of Schooling Cognitive Interviews - Revised (revised v.200)
1850-0803 v.213 NCVS SCS:19 Cognitive Interviews
1850-0803 v.214 IPEDS 2020 CIP Codes Survey
1850-0803 v.215 NHES 2019 Early Childhood Cog Labs - Revised (revised v.198)
1850-0803 v.216 NHES 2019 Types of Schooling Cognitive Interviews - Revision 2 (revised v.212)
1850-0803 v.217 B&B:16/17 SOGI Items Probing
1850-0803 v.218 NTPS 2019-20 Cognitive Interviews
1850-0803 v.219 NAEP 2021 Items Pretesting
1850-0803 v.220 NCES Confidentiality Pledges Mail Experiment
1850-0803 v.221 ICILS 2018 Pretest - Updated (revised v.207)
1850-0803 v.222 NAEP Reading SBT Tryouts
1850-0803 v.223 NCVS SCS:19 Spanish Cognitive Interviews
1850-0803 v.224 NAEP Grade 8 Social Sciences IICs Pretesting - Round 2
1850-0803 v.225 NHES 2019 Spanish Materials Focus Groups
1850-0803 v.226 FRSS 109 Teacher Use of IT Pretest
1850-0803 v.227 ATES 2021 Cognitive Interviews 1st Round
1850-0803 v.228 ATES 2021 Cognitive Interviews 1st Round Update
1850-0803 v.229 NHES 2019 Spanish Materials Focus Groups Round 2 - Recruitment
1850-0803 v.230 NHES 2019 Screener Cognitive Interviews - Recruitment
1850-0803 v.231 NHES 2019 Screener Cognitive Interviews
1850-0803 v.232 NHES 2019 Spanish Materials Focus Groups Round 2
1850-0803 v.233 eNAEP 2019 Pretesting
1850-0803 v.234 NHES 2019 Web Usability Testing
1850-0803 v.235 NTPS 2019-20 Teacher Focus Groups
1850-0803 v.236 NHES 2019 Web Usability Testing Spanish Materials
1850-0803 v.237 NTPS 2019-20 Teacher Focus Groups Update (revised v.235)
1850-0803 v.238 TIMSS 2019 Pretesting
1850-0803 v.239 NAEP 2022 Economics Pretesting
1850-0803 v.240 HS&B:20 BS FT Cognitive Testing Round 1
1850-0803 v.241 NAEP 2022 Social Science & Economics Questionnaire Cog Labs
1850-0803 v.242 B&B:16/20 Tryouts and Focus Groups
1850-0803 v.243 NPSAS 2019-20 Pretesting
1850-0803 v.244 FRSS 110: Educational Technology in Public Schools Feasibility Calls
1850-0803 v.245 NAEP 2022 TEL Pretesting
1850-0803 v.246 ECLS-K:2023 Preschool Parent Focus Groups
1850-0803 v.247 NPSAS 2019-20 Online Pretesting Round 2
3. Use of Information Technology
When the survey or assessment being tested employs automated methods for its data collection, the research conducted under this submission will also use automated data collection techniques. This clearance offers NCES the opportunity to try innovative technologies that would reduce burden and increase the use of information technology.
4. Efforts to Identify Duplication
Research under this clearance does not duplicate any other questionnaire design work being done by NCES or other Federal agencies. Instead, its purpose is to stimulate additional research that would not otherwise be done due to time constraints. When appropriate, this research involves collaboration with staff from other federal and non-federal agencies. Additionally, to the extent possible, NCES makes use of existing information, including the results of previous evaluations of survey data; however, such information is typically not sufficient to refine survey questionnaires without conducting additional research.
5. Minimizing Burden
This research will be designed as relatively small-scale data collection efforts so as to minimize the amount of burden required to improve questionnaires and procedures, test new ideas, and refine or improve upon positive or unclear results from other tests. The results of the research conducted under this clearance are expected to improve the methods and instruments utilized in full scale studies and thereby improve information quality while minimizing burden to respondents.
6. Consequences of Less Frequent Collection
Without questionnaire development testing, the quality of the data collected in full surveys would suffer.
7. Special Circumstances
There are no special circumstances.
8. Consultations Outside the Agency
Consultation with staff from other Federal agencies that sponsor surveys conducted by NCES will occur in conjunction with testing individual surveys. Consultation with staff from other Federal laboratory facilities may also occur as part of joint research efforts. These consultations will include discussions concerning potential response problems, clarity of questions and instructions, and other aspects of respondent burden. Additional efforts to consult with potential respondents to obtain their views on the availability of data, clarity of instructions, etc., may be undertaken as part of the testing that is conducted under this clearance.
9. Paying Respondents
Respondents for activities conducted in the laboratory (e.g. cognitive interviews and focus groups) under this clearance may receive compensation for travel and participation. This practice has proven necessary and effective in recruiting subjects to participate in such research, and is also employed by the other federal cognitive laboratories. Research on incentives that may be conducted under this clearance may also involve nonmonetary incentives. The Office of Management and Budget (OMB) has noted that effectiveness of such incentives is a worthwhile research topic. If incentives need to be proposed for any research activity under this clearance, justification will be provided and NCES will work closely with OMB on the incentive strategy to be employed. NCES will typically propose incentives at the level approved by the Office of Management and Budget for cognitive laboratories and focus groups. If a higher level incentive is proposed for approval, a meaningful justification will be provided.
10. Assurance of Confidentiality
If the collection is under the authority of the Education Sciences Reform Act of 2002 (ESRA 2002), all respondents who participate in research under this clearance will be informed that their participation is voluntary and that all of the information they provide may be used only for statistical purposes and may not be disclosed, or used, in identifiable form for any other purpose except as required by law (20 U.S.C. §9573 and 6 U.S.C. §151). For personal visit and telephone interviews, this information will be conveyed verbally by the interviewer, and in personal visit interviews respondents will also receive this information in writing. For self-administered questionnaires, the information will be included in the mailing package, either as part of the communication materials or on the questionnaire or instructions. For Internet-based data collections, this information will be displayed prominently and in a format that allows the respondent to print it out. All participants in cognitive research will be required to sign a written notification concerning the voluntary and confidential nature of their participation. NCES will also inform respondents in writing of the collection's OMB number. No participant direct identifiers will be maintained as part of research under this generic clearance.
11. Justification for Sensitive Questions
Most of the questions that are included on NCES questionnaires are not of a sensitive nature and should not pose a problem to respondents. However, it is possible that some potentially sensitive questions may be included in questionnaires that are tested under this clearance. One of the purposes of the testing is to identify such questions, determine sources of sensitivity, and alleviate them insofar as possible before the actual survey is administered.
12. Estimate of Hour Burden
We estimate that the number of people involved in our exploratory, field test, pilot, cognitive, and focus group work will be at most 200,000 per year, the vast majority of whom will be contacted as part of screening and recruitment activities preceding the actual research. With screening and recruitment activities included in the burden calculations, we estimate the average burden at approximately 0.4 hours per person, or about 80,000 hours annually. The total estimated respondent burden is 240,000 hours for the 3-year period beginning on the date of OMB approval in 2019:
Time Period | Respondents | Responses | Respondent burden (hours)
2019 - 2020 | 200,000 | 200,000 | 80,000
2020 - 2021 | 200,000 | 200,000 | 80,000
2021 - 2022 | 200,000 | 200,000 | 80,000
Total | 600,000 | 600,000 | 240,000
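For clarity, the minimal sketch below simply restates the arithmetic behind the estimates above: roughly 200,000 respondents per year at an average of 0.4 hours each, over three years.

    # Arithmetic behind the burden estimates above (figures from this section).
    respondents_per_year = 200_000
    hours_per_respondent = 0.4   # average, including screening and recruitment
    years = 3

    annual_burden_hours = respondents_per_year * hours_per_respondent  # 80,000
    total_burden_hours = annual_burden_hours * years                   # 240,000

    print(f"Annual burden: {annual_burden_hours:,.0f} hours")
    print(f"Total 3-year burden: {total_burden_hours:,.0f} hours")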
A variety of forms will be used in conducting the research under this clearance, and the exact number of different forms, length of each form, and number of subjects/respondents per form are unknown at this time. However, we can project that our activities will likely include testing items, data collection modes, and incentive payment levels, in the form of “hothouse” field tests, expanded field tests including split sample questionnaire experiments in multiple panels, cognitive labs, exploratory interviews, reinterviews, and focus groups among students, teachers, parents, and other types of respondents.
13. Estimate of Cost Burden
There is typically no cost to respondents for participating in the research conducted under this clearance, except for the time required to complete the questionnaire.
14. Cost to Federal Government
There is no way to anticipate the actual number of participants, length of interview, and/or mode of data collection for the surveys to be conducted under this clearance. Thus, it is impossible to estimate in advance the cost to the Federal Government. Costs will be covered by divisions conducting the research from their data collection budgets. We will include information about costs in the individual submissions.
15. Reason for Change in Burden
No change to the estimated respondent burden is being requested.
16. Project Schedule
This research program is for questionnaire and procedure development purposes. Data tabulations will be used to evaluate the results of questionnaire and methods testing. The information collected in this effort will not be the subject of population estimates or other statistics in NCES reports; however, it may be published in research and development reports or be included as a methodological appendix or footnote in a report containing data from a larger data collection effort. The results of this research may be prepared for presentation at professional meetings or for publication on the NCES website and in professional journals. Due to the nature of this clearance, there is no definite or tentative time schedule for individual testing activities at this point. We expect work to continue more or less continuously throughout the duration of the clearance.
17. Request to Not Display Expiration Date
No exemption is requested.
18. Exceptions to the Certification
There are no exceptions to the certification.