Vol I -- Supporting Statement for NAEP cog lab

System Clearance for Cognitive, Pilot and Field Test Studies

OMB: 1850-0803





National Assessment of Educational Progress





Volume I

Supporting Statement



Request for Clearance for Cognitive Interview Study of NAEP Core Background Questions for

Students, Teachers, and School Administrators


OMB# 1850-0803 v.40

(Generic Clearance for Cognitive, Pilot and Field Test Studies)





January 4, 2011



Volume I: Supporting Statement




  1. Submittal-Related Information

This material is being submitted under the generic National Center for Education Statistics (NCES) clearance agreement (OMB #1850-0803). This generic clearance provides for NCES to conduct various procedures (such as field tests and cognitive interviews) to test new methodologies, question types, or delivery methods to improve survey and assessment instruments.



  2. Background and Study Rationale

As required by the National Assessment Governing Board (Governing Board), the National Center for Education Statistics (NCES) of the U.S. Department of Education will conduct nationwide assessments at grades 4, 8, and 12 for the National Assessment of Educational Progress (NAEP). In addition to assessing subject-area achievement, NAEP collects questionnaire data to provide context for the reporting and interpretation of assessment results. These questionnaire data come from three respondent types: students, teachers, and school administrators. NAEP questionnaires serve to fulfill the reporting requirements of federal legislation1 and to provide context for reporting student performance. Questions that are not subject-specific are given to respondents associated with all subject-area assessments and are referred to as core background questions. This study covers only core background questions.


In 2010, the Governing Board recommended additions, revisions, and deletions to the existing background questions. The goal has been to update the NAEP background questionnaires by developing new questions on emerging topics associated with academic achievement and by revising outdated questions. Question development was based on the Governing Board's comments and on literature reviews that identified key question categories related to student achievement. In 2012, NCES will pilot test these new and revised NAEP background questions with a large, national sample of students, teachers, and school administrators. Prior to the 2012 pilot, NCES will conduct cognitive interviews on these questions.


This cognitive interview study investigates the cognitive processes that respondents use to answer survey questions. We are particularly interested in respondents’ ability to comprehend the question and provide a valid response. Our goal is to identify question problems and limitations prior to the 2012 pilot. Early identification of question problems, prior to administering the questions to a large number of respondents, will increase the quality of the questionnaire by reducing potentially confusing language or improving response categories.


Cognitive interviews are one of several forms of survey question pretesting to identify question problems. In cognitive interviews, an interviewer uses a structured protocol in a one-on-one interview using two methods: think-aloud interviewing and verbal probing techniques. With think-aloud interviewing, respondents are explicitly instructed to “think aloud” (i.e., describe what they are thinking) as they figure out their answers to the survey questions. The interviewer reads each question to the respondent, and then records the cognitive processes that the respondent uses in arriving at an answer to the question. With verbal probing techniques, the interviewer asks probing questions, as necessary, to clarify points that are not evident from the “think aloud” process. These probes might include, for example, asking the respondent to rephrase the question in their own words or assess whether the response categories are relevant.


Cognitive interview studies are largely observational. Although the sample will include a mix of student characteristics, it does not explicitly measure differences by those characteristics. The largely qualitative data collected will consist of mainly verbal reports in response to probes or from think-aloud tasks, in addition to volunteered comments. The objective is to identify and correct problems of ambiguity or misunderstanding, or other difficulties respondents have answering questions. The result should be a set of questionnaires that are easier to understand and therefore less burdensome for respondents while also yielding more accurate information.



  3. Study Design and Context

Sampling Plan

Existing research and practice have not yielded a methodological or practical consensus regarding the minimum or optimal sample size necessary to provide valid results for non-cognitive question development.2 Several researchers have confirmed the standard of five as the minimum number of participants per subgroup for analysis (i.e., cell) for the purposes of exploratory cognitive interviewing for question development.3 Although a sample size of five per cell will likely identify major problems with a question, more is better, up to approximately fifteen per cell.4


We plan to interview fifteen students for each student question because most of the student questions being tested relate to family demographics and socioeconomic status information, about which students may have limited knowledge. In addition, for questions that are the same across grades, we plan to treat each grade level as a different subgroup for analysis because we expect significant differences in cognitive development and family information knowledge between students at different grade levels.


We are confident that teachers and school administrators have both the knowledge and cognitive ability to answer the questions that are being tested. Therefore, we are using the standard of five participants at each grade level for which background questions were developed (i.e., grades 4 and 8 for teachers and grades 4, 8, and 12 for school administrators).



Student Recruitment and Sample Characteristics

UserWorks, a subcontractor, will recruit student participants for this study via phone and email (see Appendices F, H, I, and J) through its Washington, D.C. metropolitan area participant database of volunteers and through other available lists. (This is the same procedure used to recruit participants for the cognitive laboratory interviews conducted for NCES earlier this year on the audio/visual stimuli developed for the new writing assessment.) Parents of students expressing interest in participation will be contacted for parental permission. Potential student participants will be screened by telephone (see Appendices H, I, and J). The screening process is designed to yield a sample of 50 participants meeting the following criteria:

  • 20 fourth-grade students

  • 15 eighth-grade students

  • 15 twelfth-grade students

  • Mix of race (Black, White, Asian)

  • Mix of Hispanic ethnicity

  • Mix of socioeconomic background (based on parental education)

  • Mix of urban/suburban/rural as it relates to neighborhood questions


Teacher and School Administrator Recruitment and Sample Characteristics

ETS, the NAEP contractor for question development, will recruit teacher and school administrator participants for the study via phone and email (see Appendices G and K) through our data collection services group. The screening process is designed to yield 10 teachers and 15 school administrators meeting the following criteria:

  • 5 fourth-grade teachers

  • 5 eighth-grade teachers

  • 5 fourth-grade school administrators

  • 5 eighth-grade school administrators

  • 5 twelfth-grade school administrators

  • Mix of school size

  • Mix of public school type (traditional and charter)


Question Overview

Questions to be tested in the study are similar in both content and question type to existing NAEP background questions. Student questionnaires collect information on students' demographic characteristics, classroom experiences, and educational support. Teacher questionnaires gather data on teacher training and instructional practices. School questionnaires gather information on school policies and characteristics. The following table displays the number of background questions being evaluated. For school administrators, there are matrix questions that include a single question stem and multiple sub-questions. In the table, the number of sub-questions is shown in parentheses. Volume II contains the cognitive interview protocols, which include all the student, teacher, and school background questions being evaluated in this study.


Question Burden: Number of Questions

                                      Grade 4    Grade 8    Grade 12
Background: Students                     11          9           8
Background: Teachers                     13         14          n/a
Background: School Administrators     15 (55)    16 (57)     19 (66)



  4. Data Collection Process

ETS will conduct the teacher and school administrator interviews, while Abt Associates (see section 5) will conduct the student interviews. Both ETS and Abt Associates will ensure that qualified interviewers are available to conduct the interviews. Interviewers will be trained on the cognitive interviewing techniques of the protocol. Interview protocols are based on the generic protocol structure described in Volume II. The interviews will focus on how students, teachers, and school administrators answer questions about themselves, their classroom experiences, and their schools.


Cognitive Interview Process

Participants will first be welcomed, introduced to the interviewer and the observer (if an in-room observer is present), and told that the purpose of the session is to learn how people answer survey questions.


Participants will be reassured that their participation is voluntary and that their answers may be used only for statistical purposes and may not be disclosed, or used, in identifiable form for any other purpose except as required by law (Education Sciences Reform Act of 2002, 20 U.S.C. §9573). Interviewers will explain the think-aloud process and conduct a practice question, after which participants will answer the questions verbally.


Interviewers will use several different cognitive interviewing techniques, including general think-aloud and question-specific probes, observation, and debriefing questions. Volume II of this submission includes the cognitive interview protocols to be used in the study.


Units of Analysis

The key unit of analysis is the question. Questions will be analyzed across participants within grade and across grades, where applicable.


The types of data collected about the questions will include

  • think-aloud verbal reports;

  • behavioral coding (e.g., errors in reading question);

  • responses to generic probes;

  • responses to question-specific probes;

  • additional volunteered participant comments; and

  • debriefing questions.


A coding frame will be developed for the responses to think-aloud questions and other verbal reports. The frame will be designed to code the verbal responses and to identify problems associated with question comprehension, memory retrieval, judgment/estimation processes, and response processes. The draft coding frame will be modified and supplemented based on review of the initial cognitive interviews.


Analysis Plan

The general analysis approach will be to compile the different types of data in spreadsheets and other formats to facilitate identification of patterns of responses for specific questions, for example, patterns of counts of verbal report codes and of responses to probes or debriefing questions.
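As an illustration of this compilation step, the per-question tally of verbal report codes could be sketched as follows. This is a hypothetical example: the question labels and problem codes below are invented for illustration and are not the study's actual coding frame.

```python
from collections import Counter

# Hypothetical coded observations, one (question, problem code) pair
# per verbal report. Labels are illustrative only.
coded_reports = [
    ("Q1", "comprehension"),
    ("Q1", "comprehension"),
    ("Q1", "retrieval"),
    ("Q2", "response_mapping"),
    ("Q2", "comprehension"),
]

# Count occurrences of each (question, code) pair across participants.
counts = Counter(coded_reports)

# A simple pattern report: which problem codes cluster on which question.
for (question, code), n in sorted(counts.items()):
    print(f"{question}: {code} x{n}")
```

Tallies of this kind, alongside responses to probes and debriefing questions, make it easy to see when a single problem type dominates a given question.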


Each type of data for a question will be examined both independently and in conjunction with item-specific features (e.g., sentence complexity, item type, and number of response choices) for the question in order to determine whether a feature or an effect of a question is observed across multiple measures and/or across administrations of the question.


This approach will ensure that the data are analyzed in a way that is thorough and that will enhance identification of problems with questions and provide recommendations for fixing those problems.


A more detailed analysis plan will be developed as part of the project. Final reports of the study findings will be submitted to OMB along with the proposed questionnaires for pilot testing in 2012.



  5. Consultations Outside the Agency

Educational Testing Service (ETS)

ETS serves as the Item Development (ID) contractor on the NAEP project, developing cognitive and background items for NAEP assessments.



Abt Associates

Abt Associates is a large, established for-profit government and business research and consulting firm. Abt Associates is working as a subcontractor for ETS on this project. Abt Associates provides expert development, testing, and refinement of data collection tools to ensure their reliability and validity. Located in Bethesda, Maryland, Abt Associates has a Cognitive Testing Laboratory facility that offers a range of cognitive interviewing and usability testing services.


Johnny Blair, a Principal Scientist at Abt Associates, is a specialist in survey research design, cognitive interviewing, and usability testing. He has extensive experience in survey methodology, statistics, demography, instrument design, data analysis, and evaluation. He has published extensively on issues such as cognitive interview data quality, optimizing cognitive pretest sample sizes, and customizing cognitive interview protocols.


UserWorks

UserWorks is a Maryland-based company that specializes in recruiting for focus groups, cognitive interviews, and other research. UserWorks is a subcontractor of Abt Associates. The two organizations have worked together regularly for several years.



  6. Assurance of Confidentiality

NCES has policies and procedures that ensure privacy, security, and confidentiality, in compliance with the Confidential Information Protection provisions of Title V, Subtitle A, Public Law 107-347 and Education Sciences Reform Act of 2002 (20 U.S.C. §9573). This legislation ensures that security and confidentiality policies and procedures of all NCES studies, including the NAEP project, are in compliance with the Privacy Act of 1974 and its amendments, NCES confidentiality procedures, and the Department of Education ADP Security Manual.


Participation is voluntary. Written consent will be obtained from legal guardians of minor students, students age 18 or older, teachers, and school administrators before interviews are conducted (see Appendices A, B, and C for the legal guardian, adult student, and teacher/school personnel consent forms, respectively).


Participants will be assigned a unique student identifier (ID), which will be created solely for data file management and used to keep all participant materials together. The participant ID will not be linked to the participant name in any way or form. The consent forms, which include the participant name, will be separated from the participant interview files and kept in a locked cabinet for the duration of the study and will be destroyed after the final report is released.


The interviews will be recorded. The only identification included on the files will be the participant ID. The audio and video files will be secured in a locked cabinet for the duration of the study and will be destroyed after the final report is released.



  7. Justification for Sensitive Questions

Throughout the interview protocol development process, effort has been made to avoid asking for information that might be considered sensitive or offensive. Reviewers have identified and eliminated potential bias in questions. In addition, the background question development process includes sensitivity reviews before use in assessments. Some of the new questions being evaluated are socioeconomic questions, which may be considered sensitive. As specified in the Governing Board's Background Information Framework for NAEP, current law (Section 411 of Public Law 107-110, paragraph (2), subsection (G)) requires every NAEP assessment to collect data on students' socioeconomic status (SES). However, the Framework also acknowledges that "there has been considerable concern over many years about the quality of SES measures in NAEP, both for reporting to the public and for analysis by researchers." To address this concern, the Governing Board has requested that NAEP conduct studies aimed at developing a reliable and valid composite index of SES. In response to these directives, an SES expert panel composed of leading researchers5 identified topic areas, which motivated the development of an enhanced set of SES questions that are being evaluated as part of this cognitive interview study.



  8. Estimate of Hourly Burden

A two-stage recruitment effort will be conducted via email and phone. Initial contact and response via email is estimated at 3 minutes (0.05 hours). The follow-up phone call to screen student participants is estimated at 9 minutes (0.15 hours) per family. The follow-up phone call or letter to confirm participation is estimated at 3 minutes (0.05 hours).


Fourth-grade student interviews will be limited to 60 minutes (1 hour). Eighth- and twelfth-grade student interviews are expected to take 60 minutes but will be limited to 90 minutes (1.5 hours). Teacher interviews are estimated to take no more than 90 minutes. School administrator interviews are estimated to take no more than 120 minutes.


Respondent                                     Hours per respondent   Number of respondents   Total Hours

Parent and Student Recruitment
  Initial contact                                      0.05                    600                 30
  Follow-up via phone                                  0.15                    120                 18
  Confirmation via phone                               0.05                     78                  4

Teacher and School Administrator Recruitment
  Initial contact                                      0.05                     50                  3
  Confirmation via phone                               0.05                     36                  2

Interviews
  Grade 4 Students                                     1                        20                 20
  Grade 8 Students                                     1.5                      15                 23
  Grade 12 Students                                    1.5                      15                 23
  Teachers                                             1.5                      10                 15
  School Administrators                                2                        15                 30

Total Burden Hours                                                                               168
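The row arithmetic above can be checked with a short script. This is an illustrative sketch, not part of the submission; the activity labels are paraphrased from the table. Working in whole minutes avoids floating-point rounding issues, and rounding each row up to the next whole hour, which the table appears to do, reproduces the totals:

```python
# Each entry: (activity, minutes per respondent, number of respondents).
# Labels are paraphrased from the burden table above.
rows = [
    ("Initial contact (parent/student)",     3, 600),
    ("Follow-up via phone",                  9, 120),
    ("Confirmation via phone",               3,  78),
    ("Initial contact (teacher/admin)",      3,  50),
    ("Confirmation (teacher/admin)",         3,  36),
    ("Grade 4 student interviews",          60,  20),
    ("Grade 8 student interviews",          90,  15),
    ("Grade 12 student interviews",         90,  15),
    ("Teacher interviews",                  90,  10),
    ("School administrator interviews",    120,  15),
]

def total_burden(rows):
    # Integer ceiling division: (minutes + 59) // 60 rounds each row's
    # partial hours up to the next whole hour before summing.
    return sum((mins * n + 59) // 60 for _, mins, n in rows)

print(total_burden(rows))  # → 168
```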



  9. Estimate of Costs for Recruiting and Paying Respondents

Because the study will take place outside of regular academic school hours, a monetary incentive is offered to ensure participation and motivation. This practice has proven effective in recruiting respondents for similar research. Each participating teacher and school principal will receive a $40 gift card in compensation for time and effort. Each participating student will receive a $25 gift card in compensation for time and effort. In addition, we are offering a $25 gift card per parent to remunerate them for their time and to help offset the travel/transportation costs of bringing the participating student to and from the cognitive laboratory site. These amounts are consistent with previous similar studies. Generic gift cards that can be used anywhere credit cards are accepted are the recommended incentive because low-income participants do not always have bank accounts, and check-cashing outlets can impose burdensome fees.



  10. Cost to Federal Government

The following table provides the overall project cost estimates:


Activity                                                               Provider         Estimated Cost

Design, preparation, and conduct of student cognitive interviews
(including recruitment, incentive costs, data collection,
analysis, and reporting)                                               Abt Associates   $165,000

Design, preparation, and conduct of teacher and school
administrator cognitive interviews (including recruitment,
incentive costs, data collection, analysis, and reporting)             ETS              $60,000

Total                                                                                   $225,000



  11. Schedule

The following table provides the schedule of milestones and deliverables:


Activity                                    Dates

Submission to OMB                           mid-December 2010
Recruit participants                        January 2011 (subsequent to OMB clearance)
Data collection, preparation, and coding    February 2011
Data analysis                               early March 2011
Final study report                          end of March 2011


1 Public Law 107-279, Education Sciences Reform Act of 2002 (ESRA), Sec. 303, National Assessment of Educational Progress (20 USC 9622).

2 See Almond, P. J., Cameto, R., Johnstone, C. J., Laitusis, C., Lazarus, S., Nagle, K., Parker, C. E., Roach, A. T., & Sato, E. (2009). White paper: Cognitive interview methods in reading test design and development for alternate assessments based on modified academic achievement standards (AA-MAS). Dover, NH: Measured Progress and Menlo Park, CA: SRI International.

3 See Van Someren, M. W., Barnard, Y. F., & Sandberg, J. A. C. (1994). The think-aloud method: A practical guide to modeling cognitive processes. San Diego, CA: Academic Press.

4 See Willis, G. (2005). Cognitive Interviewing: A Tool for Improving Questionnaire Design. Thousand Oaks, CA: Sage.

5 The current members of the SES panel are Chris Chapman, NCES; Chuck Cowan, Analytic Focus, LLC; Robert Hauser, University of Wisconsin-Madison; Robert Kominski, Census Bureau; Hank Levin, Columbia University; Sam Lucas, University of California, Berkeley; Stephen Morgan, Cornell University; Margaret Beale Spencer, University of Chicago.
